About ir_cow

  • Rank
    I Am A Cow...
  • Birthday July 2


Profile Information

  • Location
    Olympia, WA
  • Interests
    Photography, Computers


Recent Profile Visitors

43,553 profile views
  1. Are you aware that Steve Jobs passed away a number of years ago? I think he would have approved of the cheese grater over the trash can, though.
  2. ir_cow

    i7 9700kf ram compatibility

    I would also run Memtest86 to make sure the memory is stable. Just because Windows doesn't crash doesn't mean the memory isn't producing errors.
  3. It shouldn't. You only need one CPU core per WU, and the bandwidth of PCIe 2.0 x8 is enough. Glad it works for you!
  4. The A10 is an APU and is a bit slower than the FX-6300 listed in the requirements. That might be why it is running poorly.
  5. ir_cow

    PCIe 4.0 X 16 vs 3.0 X 16

    This is a little late, but the card has to support PCIe 4.0 for it to run at that speed. Only the RX 5700 series is PCIe 4.0 right now, not that it actually makes a difference (yet). Anyway, 5 million PPD is awesome! Too bad FoldingCoin is dead; it would have been nice two years ago.
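To put rough numbers on the PCIe generations mentioned above, here is a quick back-of-the-envelope sketch (my own arithmetic, not from the original post; these are theoretical link rates, not measured throughput):

```python
# Rough usable PCIe bandwidth per link, accounting for line encoding.
# PCIe 1.x/2.0 use 8b/10b encoding; 3.0/4.0 use 128b/130b.
def pcie_bandwidth_gbps(gen, lanes):
    """Approximate usable bandwidth in GB/s for a PCIe link."""
    gt_per_s = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}[gen]
    encoding = 8 / 10 if gen <= 2 else 128 / 130
    # GT/s * encoding efficiency / 8 bits per byte = GB/s per lane
    return gt_per_s * encoding / 8 * lanes

print(f"PCIe 2.0 x8:  {pcie_bandwidth_gbps(2, 8):.2f} GB/s")   # 4.00
print(f"PCIe 3.0 x16: {pcie_bandwidth_gbps(3, 16):.2f} GB/s")  # 15.75
print(f"PCIe 4.0 x16: {pcie_bandwidth_gbps(4, 16):.2f} GB/s")  # 31.51
```

Each generation roughly doubles the previous one, which is why PCIe 4.0 x8 has about the same bandwidth as PCIe 3.0 x16.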
  6. ir_cow

    Ryzen Build 2019

    I'm with Braegnok. Make sure to update the BIOS to the newest version. I had issues posting at 3200 originally (on an MSI board) and the XMP profile didn't work. Now 4266 is possible with the same kit if I set it manually, and of course XMP works. I have the WiFi version of the HERO right now for testing; 3200 CL14 boots no problem after updating the BIOS. I haven't done much with it so far besides checking if it boots. Edit: If you have more issues, try manually setting the voltage and sub-timings. I know you did already, but sub-timings are important for booting with these Ryzen chips. Send me a PM and I'll see what I can do to help you out.
  7. ir_cow

    i7 9700kf ram compatibility

    Yes, if you are willing to tweak the timings and voltages. It has been a while since I ran fast memory on an Intel system, but I believe above 3600 it stops being XMP plug-and-play for the most part. If you raise the DRAM voltage above 1.4 V, I think the memory controller voltage needs to go up as well. I'm sure someone else has more recent experience; a quick Google search tells me it won't be that easy.
  8. ir_cow

    Ryzen Build 2019

    I cannot complain about free performance on the GPU, but it's going to sound like a jet engine, lol. Pretty much whatever clocks you can get out of 1.3 V is about all you can do for the CPU overclocking portion. Of course, more cores means more heat. What I found from my testing is that under sustained load above 1.3 V (aka Prime95), most AIO coolers end up thermally running away; it will creep up to 100°C and above if you let it run long enough. 115°C is the safety limit, but I really wouldn't stay above 90°C for long periods of time. The good news is you have fewer cores on the 3600X, so less heat. The bad news is that all Ryzens top out around 4.5 GHz (all cores) without LN2. I cannot go above 4.3 GHz without 1.375 V on this 3800X, and the AIO cooler can't handle it, so 4.3 GHz is my soft limit with an AIO cooler. With some liquid metal and an open-loop solution I could reach 4.5 GHz... maybe. Anyway, enjoy that new system!
  9. ir_cow

    Ryzen Build 2019

    Is this everything you bought already? Overall, good choices. I agree that not having a clear-CMOS button or debug lights can be annoying, but unless you are into overclocking (which Ryzen CPUs can't really do), it isn't a big deal. I would say the 5700 soft mod is silly; it seems like all it does is raise the clock ceiling from 1800 MHz toward the 2100 MHz the XT has, and that blower cooler is going to keep the clocks down anyway. A 100°C TJmax is normal.
  10. ir_cow

    Good upgrade?

    For gaming, the 3700X is actually faster, especially if you manage to get the FCLK to 1900 and run the memory at a 1:1 ratio. A 9900K will beat out any AMD CPU in most games, but it really depends on the resolution and video card: while that CPU is faster, if the video card can't keep up, it's kind of a waste of money to spend more for the same FPS. Personally, I would grab a 3700X and a 3600 CL16 DDR4 kit and call it a day. You could play around with it and run 3800, but it's a gamble. I managed to push this 3200 kit to 4266 CL18, but benchmarks were better at 3800 CL16.
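For anyone unfamiliar with the 1:1 ratio mentioned above: DDR4 transfers twice per clock, so the real memory clock is half the rated transfer rate, and 1:1 means the Infinity Fabric clock (FCLK) matches that memory clock. A tiny sketch (my own helper, purely illustrative):

```python
def fclk_for_1to1(ddr4_rate):
    """FCLK (MHz) needed for a 1:1 ratio with a DDR4 kit.

    DDR4 is double data rate: the memory clock (MCLK) is half the
    rated MT/s, and 1:1 operation means FCLK == MCLK.
    """
    return ddr4_rate // 2

print(fclk_for_1to1(3800))  # 1900 -> the FCLK target mentioned above
print(fclk_for_1to1(3600))  # 1800 -> an easier target for most Ryzen 3000 chips
```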
  11. I would wait for the next generation of video cards. A 1080 Ti matches the RTX 2080, so the only upgrade path is a 2080 Ti. At $1300 a pop, it isn't much of an upgrade for 4K gaming, and games that don't support SLI are going to struggle at 4K max settings. So let's say a 25-30% increase for $2600 just for the video cards alone...
  12. Hello, I have some upsetting news and I am not sure how to respond in a civil and respectful way. It has come to my attention that Gamers Nexus has deemed the XFX THICC cards not worth the money and possibly the worst AIB card available. This is not a quote, but my summary of the videos. Someone watched these videos and sent me an email regarding it. This person (whom I will not name) made assumptions and decided "I am a sell-out" and that my review results are "faked" to give the brand a good image. That is what I want to talk about.

Instead of ignoring this person, I watched the Gamers Nexus review and found it surprising that the performance was indeed on the lower end. It also had me questioning my own testing methods. After thinking about it for a while, I wanted to address this publicly instead of privately. After all, OCC is a community site and feedback is always welcome, so please do give it!

First, I think it should be clear by now to anyone who has been around for a while that we do not fake reviews. Does anyone remember the XFX case from 2014 or so? Yeah, that was garbage, and I said so in my review. I think one of the assumptions is that since we get review samples directly from XFX, a good review of the product is expected. While the overall review is that staff member's personal opinion, it is generally the same conclusion anyone else reviewing it would reach. We look for flaws with an attention to detail that may or may not be important to some, but a deal-breaker for others. That is what a review is, not a show-and-tell.

The only explanation I can think of for why Gamers Nexus has such a negative review based on performance may be a retail sample versus wherever my sample came from. Mine, I believe, was from the first batch off the assembly line sent to reviewers, sealed and given no special care (besides shipping across the ocean). No release date was provided initially because of AMD's lack of supply (noted in the news that week).

When I first received the card, I was admittedly put off by the plastic shell, and the benchmarks put it on the so-so level. But before I said anything, the XFX PR rep gave me a new vBIOS that fixed the fan curve, which was supposed to be the final revision. It did fix it, and it also gave a performance boost overall. The vBIOS is on TechPowerUp. So in a nutshell, I suspect Gamers Nexus got the first batch of retail cards with the old vBIOS on it. If that isn't the case, well, I can't explain the performance (or lack thereof); because of the completely different testing setups, I cannot give a real answer.

On that note, I have NEVER been pressured by XFX to give a good review, and everything I said is in my own words. While the plastic shell might be off-putting to some and cheapen the feel, to others it looks great. I don't care much either way, as I am someone who is always after performance rather than looks, but it was something I questioned myself about for the review. Ultimately the performance of the card was good, so I let the readers decide whether the plastic shell is a personal issue or not.

GN also pointed out that the heatsink was aluminum, which I did know about but never mentioned in my review, mainly because no cooling issues were present and I had no reason to talk about it. This is something I will have to cover in future reviews, since I agree it is important to know the internal cooling solution. GN also talked about the high TJmax, and they have a point. My sample was still well under 100°C, and once again the performance and clock speeds were very good. That is also something I need to start including in my reviews, since the old GPU temperature is no longer what is used for AMD cards; it is now the TJmax sensor.

In light of all of this, I did change my OCC award to Silver and noted the aluminum heatsink, mainly because of the cooler itself. The price may be a bit high compared to the competition, but sales and rebates happen all the time. The price isn't outrageous at MSRP, considering that even with its "flaws" I found it much better than the reference blower cooler.

To conclude, for whoever is reading this: I hope you understand that not all reviews are equal, and that is why reading multiple reviews is best practice. I do it all the time myself. This is just an extreme case of differences. Am I defending XFX? No, I am defending my review of the sample I received. I hope XFX reaches out to GN and helps them out.
  13. First, I think SSDs are way more reliable than mechanical drives; it's just that the price per gigabyte doesn't compare well. Anyway, I pulled up the ASRock manual: http://asrock.pc.cdn.bitgravity.com/Manual/RAID/Z77 Extreme4/English.pdf . Page 13 explains how to set up the RAID via the RAID BIOS, which is the only way I ever knew how to do it besides software mode inside Windows. The manual is confusing about which ports are which, but it looks like you have three sets: 2x SATA 6 Gb/s from the CPU (grey), 2x SATA 6 Gb/s from the ASMedia ASM1061 (grey), and 4x SATA 3 Gb/s from the chipset (black). If you do set up RAID from the RAID BIOS, you will need drivers for Windows to recognize the array during installation. For Linux, I have never set up RAID as the primary OS drive, so I do not know. Considering Linux Mint is bare-bones, I doubt it includes the drivers, and ASRock only lists Windows drivers for everything... The easiest thing to do is grab a single drive for the OS and set up the RAID for storage; then you won't need to have drivers on hand for the OS install. I think Ubuntu will auto-download drivers as it installs. I never had to put in a disk for it.
  14. ir_cow

    Comcast Modem Replacement

    I am using the ARRIS SURFboard SB6190 paired with a NETGEAR Nighthawk X6 Smart WiFi Router (R8000), running Blast! 250 Mbps. I get the rated speed on Ethernet and 17.8 MB/s on AC WiFi. I only paid $120 for the modem 3 years ago, and the router was $50 from Craigslist. Both are extremely solid; I never have to restart either of them. I used to rent an ARRIS from Comcast, and it was so cheap it would drop signal every few hours. Comcast of course blamed me, but it was in fact them all along, as I suspected. Another thing to note: the SB6190 has the Intel Puma chip, which people hate because of its bad "ping". That has long since been fixed in an update pushed by the internet providers, even before I bought mine. No offense to Netgear, but every router I've owned from them besides this Nighthawk has had major issues. Usually it would freeze every few days, and eventually the 5 GHz band would just stop working. I have a few lying in a box; they were always so cheap ($50) that I kept buying them. To be fair, my ASUS AC-1900 ($200) also stopped working on the 5 GHz band, and the same with my Linksys. It seems super common across all companies for me. Not sure why it happens to me; maybe no one else realizes their 5 GHz band isn't working.
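Since the plan speed above is quoted in megabits and the Wi-Fi figure in megabytes, a quick conversion helps compare the two (my own arithmetic, not from the post):

```python
def mbps_to_mb_per_s(mbps):
    """Convert megabits per second to megabytes per second (8 bits per byte)."""
    return mbps / 8

print(mbps_to_mb_per_s(250))  # 31.25 -> wired ceiling of a 250 Mbps plan, in MB/s
# so 17.8 MB/s over AC Wi-Fi works out to roughly 142 Mbps,
# a bit over half the rated plan speed
print(17.8 * 8)  # 142.4
```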
  15. ir_cow

    PCIe 4.0 X 16 vs 3.0 X 16

    Yeah, no need to spend unnecessary money. Try x16 PCIe 3.0 and compare the PPD to x8. It's a hassle, but it will give a better understanding of how little bandwidth Folding@home actually uses compared to gaming. Things do change, and the FoldingCore is always updating. What are they on now, Core22? Maybe it uses more bandwidth than before... I don't know.