Posts posted by ir_cow

  1. 1 hour ago, road-runner said:

    Doesn't take much to check email and look at Facebook lol. Should have sold them long ago along with other cards... I got a bunch of older ones from years ago too, going back to the NVIDIA GTX 570, and what was the Eyefinity ATI card, the 5870 or something? Got 4 of those, plus some 7970s.


    I would buy some older cards for my collection, if you want to sell them. They're not worth the time to benchmark these days. Maybe for a special WoW chart when the new expansion comes out in October.

  2. I would trust BIOS > HWiNFO > other software. However, idle temps don't really matter much; it's under load that counts. I know the 3600 comes with a different fan than the 3700 and up. It produces less heat, so it doesn't need as beefy a cooler.

    There are lots of CPU coolers ranging from $30 to $150. It really depends on what you are trying to do and what temps you think are acceptable. I'm pretty sure 3rd-gen Ryzen is just fine operating at 80°C 24/7.

  3. Thanks! I am planning on getting a waterblock at some point. Haven't decided on the FE or an AIB card, and not sure of the brand yet. Honestly, probably whatever is easiest, since I don't care about the looks or size of the waterblock, just thermal performance.

    The EVGA Hydro Copper is a good choice. The only reason I don't buy pre-installed-block SKUs is that I want to sell the card later on. I have horrible luck selling cards without an air cooler.

  4. Yeah, this time no "reference" PCB exists, just reference components, and AIB partners have to make their own PCBs. Personally, the whole 3000 series looks so nice I wouldn't want to take it apart for a waterblock.

    It will be much easier with partner PCBs. As seen in the past, NVIDIA uses a billion screws.

    But this also means every block will be a limited run due to low compatibility. The ASUS Strix and MSI Gaming waterblocks are like $75 more for the 2080 Ti.

  5. Oh yeah, DLSS 1.0 is broken. On top of being locked per video card, it doesn't always work anyway. For example, the 2060 SUPER and 2070 SUPER only support DLSS at 2K, the plain 2060 only works at 1080p, and the 2080 SUPER does 2K and 4K. NVIDIA has this segmentation for whatever reason. You cannot use DLSS at 1080p on any of the SUPER cards.

    Battlefield V will not use DLSS at 4K on ANY of the cards. It worked last year; not sure what update broke it.

    So far, Control and MechWarrior 5 are the only two games I have that support DLSS 2.0. It's less broken, but when DLSS is enabled in Control, the internal render resolution drops to 2K no matter what settings I pick.

    This is pre-Ampere, of course, with the newest drivers. I'll find out soon enough if the trend continues. Basically, of the four DLSS games I've played, all have different issues with which resolutions will work.
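    The per-card resolution restrictions described above can be sketched as a simple lookup table. This is purely illustrative: the card-to-resolution mapping reflects my reading of the post (2060 SUPER/2070 SUPER at 1440p only, plain 2060 at 1080p only, 2080 SUPER at 1440p and 4K), and the function name is hypothetical.

```python
# Hypothetical mapping of the DLSS 1.0 resolution segmentation described in
# the post. The dict is an assumption based on the post's wording, not
# official NVIDIA documentation.
DLSS_SUPPORT = {
    "RTX 2060": {"1080p"},
    "RTX 2060 SUPER": {"1440p"},
    "RTX 2070 SUPER": {"1440p"},
    "RTX 2080 SUPER": {"1440p", "4K"},
}

def can_use_dlss(card: str, resolution: str) -> bool:
    """Return True if the given card is listed as supporting DLSS at the resolution."""
    return resolution in DLSS_SUPPORT.get(card, set())
```

    Note how the table captures the oddity the post complains about: none of the SUPER cards allow DLSS at 1080p.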

  6. Any reason you want to see the DLC over the main game and its built-in benchmark? I know the DLC has "improved" RTX features, but I'm not sure what is different or whether the ray tracing is really different.

    I already have Ultra, Extreme and RTX presets benchmarked as part of the main game (for previous cards). I do have the DLC, but haven't played it.

    I believe the only difference between Extreme and Ultra is 200% internal rendering.

  7. Benchmarks for sure. I have 13 games lined up already. If you have a certain game request, I'll try to get it in, depending on time.

    HDR, well, I don't have a monitor or TV for that.

    Acer sent me a 4K 144 Hz Predator last year, and let's just say it didn't work out well. Nothing but trouble. Windows coming back from sleep mode would be blown out; I could barely see the mouse. Games usually switched HDR back and forth during loading screens, plus other random stuff that was more minor. I was not impressed with G-Sync HDR 400. I never figured it out before I had to return it. They sent me two, both with very similar problems.

    I'll retry it at some point in the future, but honestly, other than watching movies via VLC, it was a horrible experience.

  8. Yeah, after the NVIDIA announcement the internet has gone silent. It might not seem like a lot is going on, but this launch is going to be epic if what NVIDIA has said about the stats holds true. Most of us reviewers are under NDA, which is necessary if you ever hope to receive samples for a launch review.

    Anything good to watch on TV?

  9. I really want AMD to come out with something big to bring prices down on the NVIDIA side. At least this time around NVIDIA is "affordable": 2080 Ti performance for $499 is really good. Anyone playing at 1080p or 1440p will be set for years to come. Anyone who must play at 4K max settings can get a 3080, and those who want to dabble in 8K, or maybe play Control (RTX on) without DLSS, will be happy. That game brings the 2080 SUPER down to 14 FPS. Unplayable. I can only imagine what Cyberpunk 2077 will do at 4K (RTX on). It's the new Crysis benchmark for the next 5 years.

  10. Well, AMD's rumored Big Navi is only 80 CUs, so double the 5700 XT. It will probably end up slightly above the 3070, but it had better be at the same price. Unfortunately, that seems to be AMD's top dog. It might have been scrapped in favor of RDNA2, or it is the same but with the improved IPC of RDNA2. A 10% increase will still put it well below the 3080.

    As for the reference PCB: I think this time around all AIB partners will use their own designs, because the workaround for years has been to slap a blower fan on the OEM and cheap units and sell them for $100-200 less. Now, with this mini PCB, it is unlikely any current air cooler will fit without a different PCB or drastic rework. So EVGA, which often uses the reference PCB for the Ultra and SC lineup, will be forced to either use the FTW PCB for everything or keep the NVIDIA cooler.

    It costs a lot of money to re-tool a cooler. It is cheaper to slap a GPU on an empty PCB like MSI does: the same cooler for multiple series. Technically a different PCB for each, but the size is usually about the same within a sub-set, meaning the same cooler and fans can be used across a lot of the products. It also keeps the branding consistent.
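    The Big Navi guess above is just back-of-the-envelope math: double the 5700 XT's 40 CUs, times an assumed ~10% RDNA2 IPC uplift. A minimal sketch of that arithmetic, with the caveat that real GPUs never scale linearly with CU count:

```python
# Rough scaling estimate from the post. The 10% IPC uplift is the post's own
# speculation; perfect CU scaling is an optimistic simplification.
RX_5700XT_CUS = 40
BIG_NAVI_CUS = 80
IPC_UPLIFT = 1.10  # assumed 10% generational improvement

def relative_performance(baseline: float = 1.0) -> float:
    """Estimated performance relative to a 5700 XT, assuming linear CU scaling."""
    cu_scaling = BIG_NAVI_CUS / RX_5700XT_CUS  # 2.0x
    return baseline * cu_scaling * IPC_UPLIFT  # roughly 2.2x under these assumptions
```

    Even under this optimistic 2.2x figure, the post's conclusion holds that it lands nearer the 3070 than the 3080.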

  11. It looks like Sept 17th for the 3080 and the 24th for the 3090. Did anyone catch the 12-pin connector in the middle of the card when the 3070 was shown (31:10)? I didn't think about it until now, but the PCB is really small, so that connector is actually dead center. It's going to drive people crazy for cable management.

    Edit: NVIDIA also says the 3070 is faster than the 2080 Ti for only $499. Wow. It looks like anyone with an RTX Titan or 2080 Ti isn't going to be able to sell it used for very much here shortly.


  12. I don't know much about the inner workings of the SoC, but it is well documented that if you raise the SoC voltage to 1.2 V (or above), PCIe 4.0 is disabled. My ASUS X570 Hero tells me this when I do, which is nice, as not all vendors do.

    I think the only reason to set it that high is for world records, if you are trying to run DDR4-5000 and above. This is because the VDDG CCD, VDDG IOD, and CLDO VDDG voltages are all part of the memory controller, which is tied to the SoC voltage. So you cannot have, say, VDDG higher than the SoC. Getting DDR4-6000 is probably impossible without 1.2 V SoC.

    The only reason to raise the SoC on its own that high is if you are trying to get a higher FCLK, though without LN2 cooling the limit is pretty much 1900 MHz, and that can be achieved with 1.125 V.
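    The voltage relationships above (VDDG rails capped by SoC voltage, and the 1.2 V PCIe 4.0 cutoff) can be sketched as a small sanity check. This is a hypothetical helper, not a real BIOS or vendor tool, and the 1.2 V behavior varies by board.

```python
# Sketch of the voltage rules described in the post: the VDDG rails (part of
# the memory controller) cannot exceed the SoC voltage, and SoC >= 1.2 V
# disables PCIe 4.0 on boards like the ASUS X570 Hero. Purely illustrative.
def validate_voltages(soc: float, vddg_ccd: float, vddg_iod: float) -> list[str]:
    """Return warnings for settings that violate the relationships in the post."""
    warnings = []
    if vddg_ccd > soc:
        warnings.append("VDDG CCD must not exceed SoC voltage")
    if vddg_iod > soc:
        warnings.append("VDDG IOD must not exceed SoC voltage")
    if soc >= 1.2:
        warnings.append("SoC >= 1.2 V disables PCIe 4.0 on some boards")
    return warnings
```

    For example, a 1.1 V SoC with 1.05 V VDDG rails passes cleanly, while pushing VDDG above the SoC voltage trips the constraint.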
