
Nyt

Moderator · Content Count: 4,505 · Days Won: 7

Posts posted by Nyt


  1. On 1/18/2021 at 10:49 PM, WebMaximus said:

    Amazing result!

    I've had the same problem with my new 970 Evo Plus 2 TB getting very hot.

    Today I bought this product myself, but one thing that puzzles me is that there is some space between the 970 and the thermal pad at the bottom of the cooler.

    That makes me wonder what the reason is to have the thermal pad there in the first place, when there's no contact between the backside of the SSD and the thermal pad.

    Did you notice the same thing, @Nyt?


    I didn't really notice the gap between the back of the SSD and the thermal pad on the bottom.

    Luckily, everything that needs cooling (chips and controller) is on the top.


  2. 8 hours ago, Hecubus said:

    Got this mounted on my ROG Strix ROG-490.

    How do I get it to run? When I boot it starts off, but by the time it gets into Windows it stops. I don't know where to find the controls for it, in AI Suite 3 or in the BIOS.

    There is no dedicated header on my Z390-E motherboard for a VRM fan, so I extended the cable and plugged it into another available fan header (meant for an M.2 fan).
    This way I was able to see it in the Fan Xpert 4 tab in AI Suite and set a custom curve. 

    I see that some Strix Z490 boards have a fan header labeled for a VRM fan, but I also see some reports on the internet that the fan doesn't seem to turn on.

    My advice would be to monitor the motherboard's VRM temperature sensor with software like HWiNFO while stress testing, to see whether the VRMs are in a safe range and whether the VRM fan only turns on at a certain VRM temperature. Monitoring the VRM and PCH temps in HWiNFO caused microstutter in gaming for me, so be sure to disable those sensors once you're done stress testing.

    Otherwise, if you have another fan header open that you're not planning to use, you can plug it into that one and control it in AI Suite as normal.


  3. It's a third person game and it feels great in third person view, but there is a first person camera option.

    It's a huge cowboy game with a great story and memorable characters, set in one of the best open-world environments out there.

    Don't expect to play it at max settings though; it's one of the worst-optimized games on PC. I would suggest using the Hardware Unboxed optimized settings, which is what I use (though I maxed the grass Level of Detail slider and the Geometry Level of Detail slider for better views), and the game looks fantastic while being super smooth (80-100 fps at 1440p on an RTX 3080).

     


  4. Great looking game. Smooth at 1440p Ultra on my 3080.

    I disabled RT and DLSS though. RT is a massive performance killer and DLSS (even on Quality at 1440p) has too much shimmering on so many edges in the city, and I even noticed some lights freaking out along with some horrible mirror reflections. 


  5. Preloaded, new driver installed, just waiting now.

    But I'm undecided on whether to use raytracing or not.
    On the one hand, there are some impressive screenshots with RT on Ultra. But with my 3080 that will mean 60 fps at 1440p with DLSS Quality, possibly lower when driving or when it rains.
    On the other hand, with RT off at 1440p with DLSS Quality the framerate should be significantly higher, and I really like playing first-person games at 70 fps or more.

    Will have to see for myself and decide. 


  6. 2 hours ago, Braegnok said:

    Delayed again. :rofl:

    Release date has been pushed back by 21 days, and will now "allegedly" release on December 10. 

    Fool's gold,.. Cyberpunk isn't real. 

     

    I will forgive the developers if there's great RTX and DLSS 2.0 optimization because Cyberpunk 2077 really needs it.

    Games usually include one or two elements like raytraced shadows or raytraced reflections, but Cyberpunk 2077 has raytraced diffuse illumination, raytraced reflections, raytraced AO and raytraced shadows. And all of this in an open world! It's going to be like Crysis 1 all over again lol. 


    Very useful guide!

    Especially the part about the CRC check on the memory, which means a bad memory overclock will probably cause performance loss first, instead of artifacting or crashing like on previous cards.

    The review mentions using the Unigine Heaven benchmark and I definitely found it very useful, especially for testing the memory overclock.

    When I was testing my memory overclock in the Unigine Heaven benchmark at 2560x1440 with max settings, this is what I found:
    Memory offset | Average FPS | Score
    +0            | 127.3       | 3206
    +250          | 127.7       | 3216
    +400          | 128.0       | 3225
    +500          | 128.3       | 3231
    +600          | 128.4       | 3236
    +700          | 128.6       | 3240
    +800          | 128.8       | 3244
    +900          | 128.9       | 3248
    +1000         | 129.1       | 3253
    +1250         | 129.6       | 3264
    +1499         | 116.7       | 2940
    +1500         | 117.3       | 2955

    The memory on my MSI RTX 3080 Gaming X Trio seems to be very good: all the way up to +1250 I still found performance gains in Heaven.
    Only after that point did I run into the performance regression associated with the CRC check on the memory, and it's quite a severe drop.
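    The sweep above can also be checked programmatically. Here's a minimal sketch (a hypothetical helper, using the numbers from my table) that flags the first offset where average FPS falls off, i.e. where the memory error correction starts costing performance:

```python
# Heaven 1440p results from the sweep above: (memory offset in MHz, average FPS)
results = [
    (0, 127.3), (250, 127.7), (400, 128.0), (500, 128.3),
    (600, 128.4), (700, 128.6), (800, 128.8), (900, 128.9),
    (1000, 129.1), (1250, 129.6), (1499, 116.7), (1500, 117.3),
]

def first_regression(results, tolerance=1.0):
    """Return the first offset whose FPS drops more than `tolerance`
    below the best FPS seen so far (error correction kicking in),
    or None if the scaling never regresses."""
    best = float("-inf")
    for offset, fps in results:
        if fps < best - tolerance:
            return offset
        best = max(best, fps)
    return None

print(first_regression(results))  # 1499
```

    On my data this picks out +1499, matching where the severe drop shows up in the table.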


  8. 15 hours ago, Braegnok said:

    You should be good to go for a long time after 2022 with your hardware Nyt.

    Have you received your new RTX 3080 card yet?

    Looks like you're ready for Cyberpunk. And I'm dead in the water, sitting on the bench.


    I have received it. Really fast, a big improvement over my 1080 Ti.
    And it is stable at +75 MHz core and +1000 MHz memory (MSI RTX 3080 Gaming X Trio, if anyone else was wondering).

     


  9. I will be sticking with my 5 GHz i9-9900K.

    I don't see much value in buying a new AM4 motherboard and a 5900X for an unnoticeable gain at 1440p and a few more cores that games don't seem to be using effectively yet.

    [Chart: relative gaming performance at 2560x1440]

    For those on compatible AM4 motherboards it's a very tempting choice though, as it seems to offer much better performance than the Ryzen 3000 series.


    I will look ahead to AM5 and LGA1700 in late 2021, but more likely 2022, which should adopt DDR5 RAM.


  10. I think NVIDIA has the advantage with raytracing (and DLSS) as well as drivers.

    But AMD is very impressive at the moment in both GPU and CPU performance; what a massive leap in one generation.

    Now imagine when they bring out the new AM5 socket with 5nm Zen 4 CPUs, DDR5 RAM and RDNA3 by the end of 2022.
    I think hardware is going to see a huge leap in performance in the next few years now that we have proper competition.


  11. Also happy to hear DLSS 2.0 will be in Cyberpunk 2077 and Watch Dogs 3. It's some fantastic tech and I hope it's included in most new releases. 

    The AMD event for the RX6000 series is coming up soon. From the teasers in the Ryzen conference, it looks like the biggest Navi card will be matching the 3080 (well, in some games at least). 

    Will be interesting to see how they handle Raytracing, because NVIDIA looks like they're winning with their RT cores and DLSS 2.0.


  12. On 9/26/2020 at 9:32 PM, Braegnok said:

    Don't worry Nvidia will fix issue. :yes:

    Get ready for the latest Nvidia GeForce Graphics Driver,.. includes automatic performance tuning, downclocking for RTX 3080 3090 GPUs.  


    You're joking, but NVIDIA actually brought out 456.55 WHQL a few hours ago, and there seems to be very positive feedback from RTX 3080 users.
    Sounds like the issues have been solved, with minimal (if any?) performance loss :)


  13. 4 hours ago, Braegnok said:

    Jay tore apart several new 30-series cards,.. the MSI Gaming X trio cards have the same cap layout as FE card,.. 

     


    It's definitely only 1 on the 3080. The MSI 3090 is the card with 2 MLCC groups.

    I see EVGA has acknowledged the issue and is making changes to their cards: https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx


  14. Oh boy. I hope my MSI Gaming X Trio 3080 is fine and doesn't have any CTD issues, because if mine is faulty then I'll be facing a 1.5-2 month delay to get a different one, since there's a huge backorder list in South Africa :( 

    "RTX 3080 Crash to Desktop Problems Likely Connected to AIB-Designed Capacitor Choice"
    Source : https://www.techpowerup.com/272591/rtx-3080-crash-to-desktop-problems-likely-connected-to-aib-designed-capacitor-choice


  15. 11 hours ago, ir_cow said:

    Yeah I learned my lesson. Don't make videos hours before launch. Kinda rushed. Oh well, I'll replace it eventually. 

    The MSI card review is up. I have a feeling the silicon is binned higher. At stock it beats out the FE overclocked. Not that the FE I had did well in that category anyway. It does have a 100 MHz base boost over the FE at 1815 MHz.

    Not sure you can get much more out of a video card without a waterblock or an AIB card that allows a 400+ watt power target.

    In past generations, the Gaming X Trio, Gigabyte Aorus Extreme and EVGA FTW3 have always been the ones with the highest power limits. There is still a chance we'll see one with 400-watt support. Otherwise, shunt-mod it is!

    I managed to preorder an MSI Gaming X Trio 3080 in SA, so now it's just a couple of weeks until it arrives.

    Checked the review out afterwards and it's a really nice card. Slightly disappointed it only has a 102% power limit of 350 W, given that it uses 3 x 8-pins.
    But as you mention in the review, it doesn't make a whole lot of difference on air cooling.


     


  16. I've also been thinking about it.

    At the moment there's not enough of the large GA102 die to go around; it's already in short supply because it's used in both the 3080 and the 3090.

    In past generations the Ti was a cut-down 102 die, and this gen the 3080 already is a cut-down GA102 die. I think we're actually getting what was meant to be the 3080 Ti as the 3080 at $699, thanks to AMD providing competition.

    It will be interesting to see what Big Navi does.
    If it comes with 16 GB, that would mean it's either 256-bit (starved for bandwidth) or 512-bit (very expensive).
    It's more likely it comes with 12 GB, which would mean it's 384-bit: a good balance of cost and performance, and quite an advantage over the 8 GB 256-bit 3070.
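    The capacity-to-bus-width reasoning above follows from each GDDR6 chip having a 32-bit interface, so the bus width is just the chip count times 32. A quick sketch (hypothetical helper, assuming 1 GB or 2 GB chip densities):

```python
# Bus width follows from chip count: each GDDR6 chip has a 32-bit interface.
def bus_width_bits(total_gb, chip_gb):
    """Bus width in bits for a card with `total_gb` of VRAM
    built from chips of `chip_gb` capacity each."""
    chips = total_gb // chip_gb
    return chips * 32

print(bus_width_bits(16, 2))  # 256 (8 x 2 GB chips, bandwidth-starved)
print(bus_width_bits(16, 1))  # 512 (16 x 1 GB chips, very expensive)
print(bus_width_bits(12, 1))  # 384 (12 x 1 GB chips, the likely middle ground)
```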

    If the rumors are true and Big Navi's performance lands between the 3070 and 3080, then I wouldn't be surprised to see a 16 GB 3070 Super with GDDR6X instead of GDDR6 to counter AMD. But that would eat into 10 GB 3080 sales, so NVIDIA would need to consider launching a 20 GB 3080 too.


    So in summary, this is what I think will happen:
    - RTX 3000 series launches as usual.
    - The 12 GB or 16 GB RDNA2 flagship card's performance fits between the 3070 and 3080.
    - Early 2021, NVIDIA counters with a 16 GB 3070 Super with 19 Gbps GDDR6X instead of GDDR6, on the larger GA103 die instead of GA104 - faster than Big Navi but slower than the 3080.
    - Mid 2021, NVIDIA launches a 20 GB 3080 Super with higher clocks and faster GDDR6X (Micron did say they're aiming for 21-23 Gbps for 2021, so that would be similar to the Super upgrade of Turing).

     


  17. 9 hours ago, ir_cow said:

    Ouch. How can anyone afford that much? I would fly to america and pick up a computer and fly back. It seems like it would be cheaper.

    It seems like it's just the new Ampere cards that are really overpriced in South Africa at launch.

    For the same money as a South African RTX 3080, I could build a PC that costs about $1200 on Newegg (Ryzen 5 3600, Gigabyte B550, 16 GB G.Skill DDR4-3200, XFX 5700 XT, Sabrent 1 TB Rocket Q NVMe, with a Fractal case, 600 W PSU, Corsair HS35 headset, Cooler Master MM-710 mouse, keyboard, and a 24-inch Dell 1080p/75 Hz IPS monitor).

    I guess I'm just paying the early adopter's tax to get Ampere in the launch window :happy:


  18. Just saw the South African pricing for these cards.

    When converted to USD, the South African RTX 3080 is $1500 and the South African RTX 3090 is $2400. Thanks, COVID shipping rates and greedy South African suppliers.


    Decided on the MSI Gaming X Trio RTX 3080.
    Bought another 8-pin cable from Corsair, so I've got 3 x 8-pin ready.
    Also bought a Cooler Master ELV8 to stop any possible sagging (although now I see MSI includes something similar in the box).

    Keen to get my hands on one and undervolt + overclock it to get it a bit faster and cooler/quieter.
