
ir_cow

Reviewer
  • Content Count

    6,908
  • Joined

  • Last visited

  • Days Won

    10

1 Follower

About ir_cow

  • Rank
    I Am A Cow...
  • Birthday July 2

OCC

Contact Methods

  • AIM
    The1stMoonCow
  • Skype
    Isaiah333
  • ICQ
    0

Profile Information

  • Gender
    Male
  • Location
    Olympia, WA
  • Interests
    Photography, Computers

Social

  • Twitter
    @IsaiahRossiter

Gaming

  • Steam
    TheMasterCow
  • Xbox Live
    XboxCow
  • PSN
    Evilestcow

Recent Profile Visitors

46,266 profile views
  1. Check out https://whattomine.com/ to see what cards are best for mining. It's what I used when I was into it. Honestly though, it's waaay more profitable to buy coins when the price is low than to mine them yourself.
  2. Ohhh! But enabling 6 more ray tracing cores (80 total) isn't going to bring it any closer to the 3080.
  3. Indeed, the RTX 3060 Ti will be the absolute best buy if it's under $300 AND the "leaked" benchmarks are correct and it comes out sometime this year. That is, until AMD puts out a 6600 or something. As for AMD... well, unless you can get your hands on an AMD 5600X, that Intel 10700K will wipe the floor in gaming compared to the last-gen Ryzen (3000). It also depends on the game, resolution, and graphical settings. Only when you are CPU bound, which is primarily only 1920x1080 and below these days, does it have a major impact.
  4. I personally have not flashed any SUPERs, though the process is the same as the OG cards. Because the card is a custom PCB, I'm not sure which vBIOS can be used as a replacement. Usually I suggest EVGA since it has a fan all-stop, but that is for the regular cards...
  5. I don't see anything wrong with the build. I'm sure if you look hard enough you'll find better deals on the GPU. The RTX 2060 isn't a bad choice, but the RX 5700 and 5600 XT often fall within the same $300 price point, which gives you more options.
  6. First off, I did suggest a CMOS reset, if you actually read the replies. Second, I can't help someone if no one has done it before on these forums; we are just guessing at how to fix it. The guide I wrote was for desktop GPUs... if someone references it and doesn't use it properly, that's on them. I'll stick to my original statement: it shouldn't work because it is a different GPU SKU. People say it does, but I don't suggest it. "Unlocking" the power limit on a laptop is a horrible idea. For example, the RTX 2070 mobile has an 80 watt TDP. Unlocking it to 200+ spells disaster to me. Has anyone done a VRM analysis? I highly doubt it will be the same 12-16 phase design. Not only is 200+ watts on a laptop GPU a bad idea (cooling-wise), but that VRM will get toasty quick. I think people who do this flash are asking for a laptop that overheats.
  7. Nice! The Trio I reviewed was able to reach +800, but after a few hours of benchmarks it started to crash the computer, so I settled on +400. You seem to have a very nice card. I know Micron has 21.5Gbps memory; not sure why NVIDIA didn't use it and stuck with 19Gbps. Maybe due to heat issues?
  8. Hmm, I didn't look closely at the Dark. I thought it was just a cosmetic refresh. Changing the phases to 90 will certainly help XOC overclockers.
  9. That's what makes it fun! If I could overclock a $100 i3 chip to the extreme on LN2, I would. But it's money I do not have to spend.
  10. Dungeons 3: Thanks for the heads up!
  11. Igor is pretty knowledgeable on this stuff. That is disappointing because I figured all of the CPUs would hit this. But this also means that they can all reach 1900, which was a big struggle before. I am curious to see the difference between 3800 CL14 and 4000 CL15. Probably non-existent in real-world apps (see the quick latency math after this list).
  12. Ordered the 5800X from B&H. I'll see what 4.6GHz does with an FCLK of 2000. It will probably match the 10900K for 1080p gaming.
  13. I doubt I'll be getting an RX 6000 review sample. It looks like there are no AIB cards this time, and AMD doesn't usually send anything to OCC. I'm just buying a CPU to replace my personal 1920X Threadripper system. Maybe I'll swap out the 3800X for it in benchmarks if the gains are as good as AMD claims.
  14. Anyone else going to get the 4th gen Ryzen? I have a feeling the performance increase for games is only going to show at 1080p, where the CPU struggles to keep up with high-end graphics cards. Besides that, a 15-20% IPC increase across the board isn't bad. Though what it will come down to, for me at least, is whether the overclock is higher than 4.3GHz (all-core) and whether the new FCLK with 4000 memory will have more impact than, say, 3800, which is the limit for gen 3.
  15. Ray tracing is part of DX12U and Vulkan, so both series have an "equal" reference starting point. How AMD Navi 2 will perform will be interesting to see. As for DLSS, that is an NVIDIA-only thing.
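A quick back-of-the-envelope sketch of the 3800 CL14 vs 4000 CL15 comparison from post 11, using the common first-word latency rule of thumb (CAS latency divided by half the transfer rate). These are illustrative numbers, not measured results:

    # Approximate DRAM first-word latency in nanoseconds:
    # latency_ns = CL / (transfer_rate / 2) * 1000 = 2000 * CL / transfer_rate_MTs
    def first_word_latency_ns(cl, mt_per_s):
        return 2000.0 * cl / mt_per_s

    print(first_word_latency_ns(14, 3800))  # DDR4-3800 CL14 -> ~7.37 ns
    print(first_word_latency_ns(15, 4000))  # DDR4-4000 CL15 -> ~7.50 ns

The two kits land roughly 0.1 ns apart on this measure, which is why any real-world difference is likely within noise.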