Posts posted by ir_cow

  1. Well, my personal opinion on this subject is: the higher the resolution, the better. I have an Acer Predator 27" 165Hz 2K monitor, and when the DisplayPort broke and I went back to 60Hz, I couldn't really tell the difference. Sure, it was less "smooth," but it didn't change my ability to get kills. I'd much rather have a higher resolution than an uber refresh rate. But I'm also not a competitive gamer. I play COD, Battlefield 5, and Destiny 2 mostly. None require much skill to be considered good.

    If you are playing Valorant, CS:GO, or Apex Legends, it might make more sense to go all out on the increased refresh rate.

  2. 2 hours ago, Braegnok said:

    The Crucial modules run OK in Z390 but much better in a Z490 system. Are you going to be running them in your X570? The results would be interesting. :geek:


    I'm going to try both. I think my AMD 3800X tops out around 4600 with two sticks without more SoC voltage. I can run the same Trident Z Neo kit at 4700 without much trouble on the MSI Z490 Unify. I haven't had time to play around with it much, so it could very well be a timings issue.

  3. The man hates Windows :). Actually, Linux can do a lot of things, and NVIDIA does put out new drivers every few months.

    As for compatibility, I never had a problem getting Ubuntu to install, though I only used it for server stuff while working at an NBC station and didn't try to play games at all. During install you can choose the option to install all the third-party plug-ins along with the OS itself.

  4. 3 hours ago, Gremlin said:

    What is AIB??


    Add-In Board. It's a fancy way of saying any company that makes the hardware but isn't the parent company. Also, the FE review date has been moved to the 16th. Word on the street is that it's because of COVID shipping issues for some.

  5. I think it's something like: for every 1°C below 35°C, it gains 10MHz of boost. At a certain point you can't go higher without more voltage.

    As for running cards at 85°C, from my experience it is true they wear out in 2-3 years. But normal gaming is a lot less stress than what mining does. Plus, you can triple that time because "normal" gaming sessions are no more than 8 hours per day. This means even with an abusive overclock and high temps, a gamer will have the video card for 6+ years. At that point it has lost all value.
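    The back-of-the-envelope math above can be sketched out in a few lines. This is just an illustration of the claims in the post, not NVIDIA's actual GPU Boost algorithm: the 10MHz-per-degree figure, the 35°C baseline, and the 8-hour gaming day are all assumptions taken from the post.

    ```python
    # Rough sketch of the boost/lifetime arithmetic claimed above.
    # Assumptions (from the post, not official NVIDIA behavior):
    #   - ~10 MHz of extra boost per 1 C below a 35 C baseline
    #   - cards mined on 24/7 wear out in ~2-3 years
    #   - a "normal" gaming session is at most 8 hours per day

    def extra_boost_mhz(temp_c, baseline_c=35.0, mhz_per_deg=10.0):
        """Estimated extra boost clock for running cooler than the baseline."""
        return max(0.0, (baseline_c - temp_c) * mhz_per_deg)

    def gaming_lifetime_years(mining_lifetime_years, hours_per_day=8.0):
        """Scale a 24/7 mining lifetime to a part-time gaming duty cycle."""
        return mining_lifetime_years * (24.0 / hours_per_day)

    print(extra_boost_mhz(25))       # 10 C below 35 C -> 100.0 MHz extra
    print(gaming_lifetime_years(2))  # 2 years of mining -> 6.0 years of gaming
    ```

    So by this math a card that dies after 2 years of mining lasts about 6 years of 8-hours-a-day gaming, which matches the "6+ years" in the post.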

  6. 1 hour ago, dling said:

    So if you guys see my signature: am I already bottlenecked with my current rig? Would I be bottlenecked with the 3080 Ti?

    Can't see the sig. I am also the one person here who could give you an answer but can't due to NDA. I can't even comment on this :/ I do cover this in my review, as it is a big question rolling around the internet right now.

    Also, the 3080 Ti isn't one of the launch cards for this year. Are you asking about the 2080 Ti vs the 3080?

  7. Nyt, you have a good point. I didn't think of the competition. The only reason to shove 16GB into these cards, besides HD texture packs, is to match AMD. People look at the big numbers, but not the actual performance. Basic market strategy. Heck, when testing the Titan Xp, I still haven't used 12GB in any game. The ones that come close are only at 4K, and they sure don't run well lol.

    By the time 16, 20, or 24GB is used, gamers will have moved on to 16K gaming. Remember when the 780 6GB came out? How well does that fare in modern games...

  8. I don't really see a reason to go above the 10GB mark for 99% of users. The only exception is HD texture packs at 4K, or 8K gaming.

    If you aren't doing either, there's no reason to go out of your way and pay more for the same thing. But the 3070 Ti seems fishy. I think someone mistyped or left it as a placeholder.

    So let's see: Final Fantasy 15, Skyrim, Far Cry 5, Borderlands 2, and Monster Hunter World are the only games I can think of that released HD texture packs. Would I spend $150-200 more for extra memory above 10GB? Not me, but if those are the games you love to play, it might be worth the cost.

  9. WOW!

    I managed to get this 3600 Trident Z Neo kit to 4700 CL19 yesterday. I was thinking of getting Ballistix MAX 4000 since those seem to reach 5000 also.

    Let me know how it goes. When I ran 4700 on the i9-10900K my VCCIO voltage was 1.67v. I think the safe limit is 1.5v for daily use...

    Amazon has the 4000 for $135 right now. Very tempting..

  10. 1 hour ago, road-runner said:

    Doesn't take much to check email and look at Facebook lol.. Should have sold them long ago along with other cards... I got a bunch of older ones from years ago too, going back to the NVIDIA GTX 570, and what was the Eyefinity ATI card, the 5870 or something? Got 4 of those, and some 7970s.


    I would buy some older cards for my collection, not sure if you want to sell them. They're not worth the time to benchmark these days. Maybe for a special chart for WoW when the new expansion comes out in October.

  11. I would trust BIOS > HWiNFO > other software. However, idle doesn't really matter much; it's under load that counts. I know the 3600 comes with a different fan than the 3700 and up. It produces less heat, so it doesn't need as beefy a cooler.

    Lots of CPU coolers range from $30 to $150. It really depends on what you are trying to do and what temps you think are acceptable. I'm pretty sure 3rd-gen Ryzen is just fine operating at 80°C 24/7.

  12. Thanks! I am planning on getting a waterblock at some point. I haven't decided between the FE and an AIB card, and I'm not sure on the brand yet. Honestly, probably whatever is easiest, since I don't care about the looks or size of the waterblock. Just thermal function.

    The EVGA Hydro Copper is a good choice. The only reason I don't buy pre-installed-block SKUs is because I want to sell the card later on. I have horrible luck selling cards without an air cooler.

  13. Yeah, this time no "reference" PCB exists, just reference components, and AIB partners have to make their own PCBs. Personally, the whole 3000 series looks so nice I wouldn't want to take it apart for a waterblock.

    It will be much easier with partner PCBs. As seen in the past, NVIDIA uses a billion screws.

    But this also means every block will be a limited run due to low compatibility. The Asus Strix and MSI Gaming waterblocks are like $75 more for the 2080 Ti..

  14. Oh yeah, DLSS 1.0 is broken. On top of it being locked per video card, it doesn't always work anyway. The 2060 Super and 2070 Super only support 2K DLSS, the 2060 only works at 1080p, and the 2080 Super is 2K and 4K. NVIDIA has this segregation for whatever reason. You cannot use DLSS at 1080p on any of the Super cards.

    Battlefield 5 will not use DLSS on ANY of the cards at 4K. It worked last year; not sure what update broke it.

    So far Control and MechWarrior 5 are the only 2 games I have that support DLSS 2.0. It's less broken, but when DLSS is enabled for Control, the internal render drops to 2K no matter what settings I pick.

    This is pre-Ampere of course, with the newest drivers. I'll find out soon enough if the trend continues. Basically, of the 4 DLSS games I've played, all have different issues with what resolutions will work.
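    The per-card restrictions described above could be written down as a simple lookup table. To be clear, the entries below are just the anecdotal, per-game combinations reported in this post for DLSS 1.0, not an official NVIDIA support matrix, and the card/resolution names are my own labels:

    ```python
    # DLSS 1.0 card/resolution combos as reported in the post above.
    # Anecdotal observations, NOT an official NVIDIA support matrix.
    DLSS_SUPPORT = {
        "RTX 2060":       {"1080p"},
        "RTX 2060 Super": {"2K"},
        "RTX 2070 Super": {"2K"},
        "RTX 2080 Super": {"2K", "4K"},
    }

    def dlss_available(card, resolution):
        """Check whether DLSS was observed working for a card/resolution combo."""
        return resolution in DLSS_SUPPORT.get(card, set())

    print(dlss_available("RTX 2060", "1080p"))        # True
    print(dlss_available("RTX 2080 Super", "1080p"))  # False: no Super card ran DLSS at 1080p
    ```

    Laid out this way, the segregation is easy to see: 1080p support exists only on the plain 2060, and 4K only on the 2080 Super.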

  15. Any reason you want to see the DLC over the main game and its built-in benchmark? I know the DLC has "improved" RTX features; I'm not sure what is different, or if the raytracing is really different.

    I already have the Ultra, Extreme, and RTX presets benchmarked as part of the main game (for previous cards). I do have the DLC, but haven't played it.

    I believe the only difference between Extreme and Ultra is 200% internal rendering.
