Everything posted by Nyt

  1. Impressed by what I saw, but I wish the 3090 was a bit cheaper. The VRAM gap between a 10GB 3080 and a 24GB 3090 is just too big. I would've liked something with more VRAM than a 1080 Ti/2080 Ti for around $999 to $1199.
  2. They already showed the power connector placement in an official NVIDIA YouTube video.
  3. HWiNFO Vcore and VID readings. Fixed voltage in BIOS at 1.285v. The maximum VID recorded was 1.296v, but it was mostly 1.279v under ASUS Realbench or Cinebench stress testing. So at most 0.011v higher than what I set in BIOS. To be honest, I didn't know too much about LLC. Watching Buildzoid's video about it on the Gamers Nexus channel, it seems like medium is not so bad, but the "Extreme" setting on most boards is often way too high and dangerous.
  4. So the 12-pin connector has been leaked. It will feature on some of the new RTX 3000 cards, probably Founders Edition only. Source: https://videocardz.com/newz/seasonic-confirms-nvidia-rtx-30-series-12-pin-adapter
  5. This is interesting, so I did a quick test. On my 9900K with my Strix Z390-E at 1.3v core with LLC at the lowest setting (level 1), I can't even boot into Windows without a BSOD. At 1.285v with LLC 6 (max is 7), I'm 100% stable in Cinebench, Realbench, hours of gaming, basically everything. HWiNFO shows the load voltage around 1.279v, which is close to what I set in BIOS. Without LLC I'd probably have to increase the voltage a lot.
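
    To put rough numbers on the vdroop behaviour described above, here is a minimal Python sketch. The load current and the per-level loadline resistances are illustrative guesses, not measurements from this board (ASUS doesn't publish per-level LLC resistance):

        # Rough model of vdroop and load-line calibration (LLC).
        # All numbers are illustrative assumptions, not measurements.

        def load_voltage(v_set, i_load, r_mohm):
            """Steady-state CPU voltage under load: V = Vset - I * R."""
            return v_set - i_load * (r_mohm / 1000.0)

        i_load = 150.0                            # assumed stress-test current draw (amps)
        llc = {1: 1.6, 4: 0.8, 6: 0.2, 7: 0.0}    # assumed LLC level -> loadline mOhm

        for level, r in llc.items():
            print(f"LLC {level}: ~{load_voltage(1.285, i_load, r):.3f} V under load")

        # LLC 1: ~1.045 V  (far below the 1.285 V set point -> BSOD)
        # LLC 6: ~1.255 V  (close to the set point -> stable)

    With those assumed values, the lowest LLC level lets the voltage sag far below what's set in BIOS, which matches why the 1.3v/LLC 1 combination wouldn't boot while 1.285v/LLC 6 held steady.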
  6. Not long now until we find out the final specs. Some leaked pictures of a new RTX card, presumably the RTX 3090: a huge triple-slot card. Source: https://videocardz.com/newz/nvidia-geforce-rtx-3090-graphics-card-pictured
  7. Z490 uses the LGA1200 socket and is only compatible with Intel 10th gen CPUs. The i9 9900K uses the LGA1151 socket, which is what a Z390 board has. Are you sure about the model numbers? Is it a 10900K and Z490 instead? Or a 9900K and Z390 instead?
  8. I don't believe most of the tech rumour YouTubers out there. They just stand in a circle and report each other's rumours every week. One week Big Navi is easily beating NVIDIA, the next week it's only on par with the 3080. What I think is most believable at the moment:
    - All NVIDIA gaming cards based on Samsung 8nm.
    - Much better raytracing performance than the 2000 series.
    - Hotter, noisier and more power hungry than the 2000 series.
    - RTX 3090: GA102, 5248 cores, 12GB GDDR6 at 18Gbps (possibly 24GB instead), 384-bit
    - RTX 3080: GA102, 4352 cores, 10GB GDDR6 at 18Gbps (possibly 20GB instead), 320-bit
    - RTX 3070: GA103, 3072 cores, 8GB GDDR6 (possibly 16GB instead), 256-bit
  9. Updated with the latest info from NVIDIA. All this focus on "21" suggests that the new cards may be the RTX 2100 series and not the 3000 series as some rumors suggested. We will find out more on the 1st.
  10. Link: OCC NVIDIA Reveals Next Generation RTX 30 Series GPUs Coverage. I've personally been very excited for the new RTX gaming cards from NVIDIA based on the Ampere architecture, and it seems like an unveiling is right around the corner. NVIDIA just updated their Twitter profile with a new profile picture and this new banner, and put a countdown on their website: https://www.nvidia.com/en-us/geforce/special-event/ They have also just announced this: "NVIDIA will broadcast a GeForce Special Event, featuring an address by founder and CEO Jensen Huang, on Sept. 1, at 9 a.m. Pacific time. During the event, Huang will highlight the company’s latest innovations in gaming and graphics. Tune in at https://www.nvidia.com/en-us/geforce/special-event/." Mark your calendars, I think we'll be seeing the new NVIDIA Ampere gaming graphics cards unveiled on the 1st!
  11. G-SYNC Compatible and G-SYNC aren't the same. G-SYNC Compatible is basically FreeSync for NVIDIA on certain monitors tested and validated by NVIDIA; this was enabled in early 2019. G-SYNC monitors have the original G-SYNC technology with a hardware-based G-SYNC module. The monitor you linked is not listed as validated and tested by NVIDIA; you can find the list here if you filter for G-SYNC Compatible: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/ One thing to note is that you can actually enable G-SYNC compatibility in the control panel for any FreeSync monitor, not just those validated by NVIDIA. However, there is no guarantee it works without issues (blanking, cutouts, flicker, etc.) if you enable it unofficially. I purchased a Dell monitor that was not tested by NVIDIA, but pcmonitors.info, displayninja.com and rtings.com all found it to work fine with G-SYNC compatibility enabled. I currently have it enabled with no issues, but I looked for reputable reviews beforehand to be sure.
  12. I've noticed some motherboards skimp on VRM cooling. Hardware Unboxed on YouTube is usually a good source for spotting motherboards with high VRM temps that you should avoid. There were also a few other aspects that were important when I was considering motherboards before upgrading:
    - M.2 slots supporting NVMe
    - USB 3.0 ports
    - WiFi capability
    - And RGB, of course
  13. SLI and Crossfire are dead: hardly any game support, and worse frametimes. You're better off saving that money for a 3080 Ti instead. Rumours put it at 350W at stock, with more than 400W overclocked.
  14. Nyt's next gen PC

    Coming back to this, it turns out the memory was faulty, which was one of the reasons it didn't overclock at all. I RMA'd it and the supplier replaced it with Corsair Vengeance RGB Pro 3200CL16, which turned out to be Samsung B-die. I managed to take it from 3200MHz 16-19-19-36 2T to 3400MHz 17-21-21-39 1T at 1.4v, which is not great but not terrible, since it's a little more bandwidth at a similar latency. Now I'm focusing on ensuring the CPU and memory overclocks are rock solid stable. After that, all that's left is waiting for Ampere so I can get closer to 144FPS in Assassin's Creed Valhalla and enjoy some raytracing and DLSS 2.0 in Cyberpunk 2077.
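
    A quick way to sanity-check that "similar latency" claim: first-word latency is the CAS latency counted at the real memory clock, and the clock in MHz is half the MT/s rating, so latency in ns = CL x 2000 / MT/s. A minimal sketch:

        # First-word latency: CL cycles at the real memory clock (MT/s / 2).
        def latency_ns(cl, mts):
            return cl * 2000.0 / mts

        print(latency_ns(16, 3200))  # 10.0 ns for 3200 CL16
        print(latency_ns(17, 3400))  # 10.0 ns for 3400 CL17

    Both profiles land at exactly 10.0ns, so the 3400MHz overclock really is extra bandwidth at the same effective latency.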
  15. Custom watercooling does require maintenance, but All-In-One (AIO) watercoolers like the ones from Corsair don't require any maintenance beyond what you need for an air cooler (e.g. blowing out dust). The only downside is that pump life is hit or miss: some last a few months, whilst some last years. I'm a big fan of AIOs, particularly Corsair's, mainly because they're much smaller than large CPU coolers, which makes it much easier to get to the RAM or reach headers. But if you want air cooling, I'd also recommend the NH-D15 mentioned above, particularly the Chromax black edition, which looks gorgeous.
  16. Igor's Lab also did some research into this new connector. Source: https://www.igorslab.de/en/what-is-dran-an-nvidias-new-12-pin-power-supply-no-panic-but-it-is-override-igorslab-investigative/
  17. I highly doubt it would require a new PSU. If there is a new plug, they'd probably just supply the necessary adapters in the box. I just hope the cooling is up to scratch. I don't mind the higher power draw but I don't like hot noisy cards.
  18. I can't blame your buddy for wanting to stick with NVIDIA at the moment. Raytracing and DLSS 2.0 are the future, and the 5700XT is going to age very quickly. @damian swap that 5700XT for an RTX 2070 Super!
  19. That's one of the reasons I like Realbench. One of the tests uses Handbrake H.264 video compression, and at the same time it runs two other tests: single-threaded GIMP image editing, and OpenGL Luxmark on the GPU.
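
    The core idea there, running an all-core load and a single-threaded load concurrently, can be mimicked in a few lines of Python. This is only a toy stand-in for illustration; it is nothing like the actual Realbench workloads:

        import multiprocessing as mp
        import time

        def busy(seconds):
            # Pure CPU spin for the given duration.
            end = time.time() + seconds
            x = 0
            while time.time() < end:
                x += 1

        if __name__ == "__main__":
            # One worker per core (stand-in for the all-core Handbrake job),
            # plus one extra process (stand-in for the single-threaded GIMP job).
            procs = [mp.Process(target=busy, args=(60,)) for _ in range(mp.cpu_count() + 1)]
            for p in procs:
                p.start()
            for p in procs:
                p.join()

    Mixing loads like this stresses the scheduler and power delivery differently than a single uniform workload, which is why Realbench catches instability that Cinebench alone can miss.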
  20. Nice list! Definitely going to try a few of the tools mentioned above. I recently figured out how useful MemTest86 and HCI MemTest are whilst troubleshooting some random game crashes. After a bit of troubleshooting (testing two sticks, one stick, then the other stick), it turned out one of my sticks was faulty and was causing errors. For my CPU overclock I found that 30 minutes of Cinebench was useful as a quick test, and then a 1hr Realbench run gives me a good idea of stability as well as how hot things get. For my GPU overclock I found that running Unigine Valley at 4K was useful: you normally see any artifacts, can dial the clocks back a bit, and leave it running. I never found much use for Furmark/Kombustor as the clocks drop way down and it didn't seem realistic to me. Do Linpack Xtreme and Prime95 reach crazy temps because of AVX? I remember certain versions of Prime95 used to hit 100C within a few seconds, which is crazy. I also find Battlefield V is one of the more intensive games out there; if you can play a few hours without crashes, your CPU/GPU is pretty stable for gaming.
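
    For anyone curious what tools like HCI MemTest are doing under the hood, the core idea is write-a-pattern, read-it-back, compare. A toy Python sketch of that idea (illustrative only; real testers cycle many specialised patterns and can exercise far more of physical memory than a userspace script):

        import os

        size = 128 * 1024 * 1024                # test a 128MB buffer per pass
        passes = 4

        for p in range(passes):
            pattern = os.urandom(4096)          # fresh random 4KB stamp each pass
            buf = bytearray(pattern * (size // len(pattern)))
            # Read back and verify every 4KB chunk matches what was written.
            for off in range(0, size, 4096):
                if buf[off:off + 4096] != pattern:
                    print(f"Mismatch at offset {off} on pass {p}")
            print(f"Pass {p} complete")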
  21. Hello everyone. My new Cryorig Frostbit M.2 cooler has finally arrived. Here are some pictures, my thoughts and temperature results. The drive being used is a Samsung 970 Evo Plus 1TB PCIe 3.0 NVMe on an ASUS Strix Z390-E Gaming motherboard.

    Here is what comes in the box. Before, the SSD was using the stock ASUS heatsink in the M2_2 slot; after, the SSD sits in the M2_1 slot with the Frostbit fitted (ignore the dust).

    And the important bit, temperature results after running CrystalDiskMark 7 (9 passes for each test, 64GB size), first with the ASUS heatsink and its thermal pad applied correctly (tape removed first), then with the Cryorig Frostbit M.2 heatsink cooler:

    Drive Temperature 1 is what I'm assuming to be the temperature of the NAND chips; this is the value reported in Samsung Magician. It dropped from 65C to 56C, a nice 9C drop, even whilst being moved to an area of the case that is warmer.

    Drive Temperature 2 is a value only shown in HWiNFO. This is what I'm assuming to be the controller of the NVMe, the hottest part. It dropped from 96C to 62C, a whopping 34C. I believe this is because the stock ASUS heatsink and its thermal pad were not making enough contact with the controller of my 970 Evo Plus. On this 970 Evo Plus model the controller is slightly shorter than the taller NAND chips, and because the stock ASUS heatsink has very little mounting pressure, there is somewhat of an air gap between the pad and the controller, resulting in the high temperatures we see. The Cryorig Frostbit has much higher mounting pressure and compresses the thermal pads tightly onto the controller and NAND chips. I did remove the plastic film covering the factory ASUS thermal pad, so it's not that film causing the high temperatures.

    Overall I'm very pleased with it, and it's a welcome addition to my PC for around $40.

    Some additional thoughts about installation:
    - I could not install this cooler in the M2_2 slot on my motherboard. The cooler was simply too wide and hit the PCIe slot clip seen in the before picture. I ended up moving it to the M2_1 slot, which is actually a warmer area as there is more hot GPU air there.
    - The instructions, well, there aren't any. At least not in the box. Scan the QR code and you can download a one-page picture guide, but the best installation tip is the "Cryorig Frostbit Unboxing & Montage [GERMAN]" video on YouTube. I don't speak German, but seeing the steps made it easier.
    - It is quite tricky to align the thermal pads. Take your time with this.
    - It does come with a tube of Cryorig CP7 thermal paste, but you can use your own. I used some Cooler Master MasterGel Maker, which has a higher thermal conductivity than the CP7.
    - The picture guide shows two blobs of thermal paste where the bottom of the copper heatpipe fits into the NVMe heatsink groove. I used more so that the area was well covered.
    - I also added thermal paste on the upper heatsink, which fits and screws in over the top of the copper heatpipe. So basically I made sure that wherever the copper heatpipe made contact with metal there was sufficient thermal paste.
    - I also left the Samsung stickers on my NVMe for warranty purposes.
  22. I think 4K is going to be the maximum viable resolution for at least 3 years. 4K/60 is firmly 2080 Ti territory, with everything else struggling unless you compromise on settings. With the move towards greater levels of fidelity as games develop alongside the new PS5 and Xbox Series X, and the fact that raytracing is going to become far more popular, I think we will be stuck at 1440p/144 and 4K/60 on the next generation of NVIDIA Ampere and AMD RDNA 2 cards. It will also be interesting to see how the new consoles shape the future of gaming. If developers target 4K/30 as a baseline whilst fully utilizing the PS5 hardware (basically a 3.5GHz 8-core Ryzen and a 5700XT, with currently unmatched IO capabilities), then gamers who still wish to maintain 60fps and above will have to look at much more powerful hardware.
  23. Maybe it is overkill, but even with a bunch of fans my case gets hot quickly. When gaming at 21C/70F ambient, my Strix GTX 1080 Ti reaches about 68C at 50% fan speed and vents all that hot air into the case. That raises the M.2 drive's temperature from 40C at idle to 50C, even though the drive isn't being used: the OS is currently on a SATA SSD and the game runs off another SATA SSD. I figure that when I install the OS on the M.2 the temperature will go up a bit more, and when new cross-generation games that are also on PS5/Xbox Series X start using the extra speed of the NVMe it could get even hotter, so there should be some benefit from using the Frostbit (sadly the only M.2 cooler available in South Africa; no EK M.2 cooler or similar lower-profile coolers).
  24. M.2 drives, and especially NVMe drives, benefit from having a heatsink fitted or some direct airflow over them. I ran some CrystalDiskMark tests on my Samsung 970 Evo Plus 1TB with just the ASUS motherboard M.2 heatsink (attached via the supplied thermal pad) and the drive reached 62C according to HWMonitor and Samsung Magician. I let it cool for a while and ran it again; it reached 65C and the write speeds decreased, so I do think there might be some thermal throttling going on. I've actually got a Cryorig Frostbit M.2 cooler coming in the mail, so I'll be doing a temperature test on that soon. I couldn't find any reviews on it, only Computex announcements from back in 2018, but it looks good on paper, so it will be interesting to see how it performs.
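
    If you want to log drive temperatures during a run like that, here is a hedged sketch using psutil on Linux. It assumes psutil.sensors_temperatures() exposes an "nvme" sensor on your system, which depends on the kernel and drivers (the call is not available on Windows):

        import time
        import psutil  # sensors_temperatures() is implemented on Linux/FreeBSD only

        # Sample the NVMe temperature once a second for a minute while a
        # CrystalDiskMark-style workload runs, flagging readings around the
        # ~65C region where the write speeds appeared to drop.
        for _ in range(60):
            for t in psutil.sensors_temperatures().get("nvme", []):
                note = "  <- possible throttling range" if t.current >= 65 else ""
                print(f"{t.label or 'nvme'}: {t.current:.0f}C{note}")
            time.sleep(1)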
  25. Hello everyone. I was curious what effect the optional 40mm fan had over the VRM area of the ASUS Strix Z390-E Gaming motherboard. Other motherboard models in the ASUS Strix range, such as the Z390-F, Z390-H and Z390-I, lack the required bracket to mount the fan. This is how it looks with it installed.

    For the keen eye that spotted the single-channel RAM: the other DIMM was discovered to be faulty, so I'm finishing off today with the good DIMM before both DIMMs are sent back. Also ignore the slightly messy cable management and dust; I'll have a chance to sort that out while the RAM is being RMA'd.

    Due to my love of fans and cooling, all motherboard fan headers were occupied except the WaterPump+ header, so that is what I opted to use for the VRM fan.

    Test system is an Intel 9900K at 5GHz, 1.285v LLC6, 1.25v VCCIO and 1.2v VCCSA. CPU cooler is a Corsair H105 with push/pull fans (an AIO, so there's not as much air moving over the VRM area as with an air cooler).

    Corsair 780T case setup:
    - Intake: 2x 120mm bottom, 2x 120mm front radiator
    - Exhaust: 3x 120mm top, 1x 140mm rear
    All the 120mm fans were around 1200rpm, with the radiator fans at 1400rpm.

    Result with the fan turned off: a maximum VRM temperature reading of 59C in HWiNFO after a 15 minute Cinebench run at 21C ambient. The CPU peaked at roughly 80C during this test.

    Result with the fan at roughly 80-90% speed (4800-5500rpm): a maximum VRM temperature reading of 53C in HWiNFO after a 15 minute Cinebench run at 21C ambient.

    The VRM fan was relatively quiet under load and could not be heard above the case fans. At idle at 70% speed, the fan noise was not noticeable above the case fans at 950-1000rpm.

    Summary:
    - Roughly a 5-6C drop in VRM temperature during stress testing; however, the VRM temperature was well within spec in either test.
    - With sufficient case airflow, the VRMs on an ASUS Strix Z390-E Gaming should remain cool without the optional VRM fan, even with a high ambient temperature.
    - However, for those with limited case airflow or who may be pushing higher-voltage overclocks, it is worth considering.

    Thanks for reading.