Everything posted by ir_cow

  1. Looks like AGESA COMBO PI V2 for AMD now supports Resizable BAR on NVIDIA GPUs. I'm not sure if NVIDIA also needs to enable it in the drivers; I suspect so. The previous AGESA 1.0.9 for X570 does not work with NVIDIA BAR support, even though it does for the RX 6800-6900 cards. Having issues getting any memory above 3600 to boot. Time to play with the voltages again... Edit: Figured it out after only 3 hours! (insert sarcasm) It looks like the newest AGESA requires a bump in memory voltage. What was once bootable at 1.4V now takes 1.45V, and getting it to pass Memtest86 now takes 1.5V. No wonder so many people are reporting that their previously working memory is no longer booting. AMD did something that messed with the memory...
  2. I just finished testing AC Valhalla. This is an unbelievable increase. Now I must do some more A/B testing on other cards and games to see if they are affected as well. I am 99% sure these are accurate; I did these tests back to back with multiple runs. Just hard to believe. So far NVIDIA has not said anything supporting this feature, at least not on any current card. If they do... well, I will have a lot of retesting to do.

     6800XT AC Valhalla BAR Testing:
     1920x1080 BAR Disabled: 121 FPS
     1920x1080 BAR Enabled: 140 FPS
     2560x1440 BAR Disabled: 97 FPS
     2560x1440 BAR Enabled: 110 FPS

     MSI UNIFY Z490 BIOS
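The uplift in those runs works out to roughly 13-16%. A quick sanity check of the percentages, using the FPS numbers from the runs above:

```python
def pct_uplift(off_fps, on_fps):
    """Percent FPS gain from enabling Resizable BAR, given before/after averages."""
    return round((on_fps / off_fps - 1) * 100, 1)

print(pct_uplift(121, 140))  # 1080p: 15.7 (% faster with BAR on)
print(pct_uplift(97, 110))   # 1440p: 13.4 (% faster with BAR on)
```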
  3. On the MSI Z490 Unify, after updating the BIOS it's either Disabled (default) or Enabled. I'll take a screenshot when I am home. Haven't tried it on the ASUS X570 Hero yet; I have to reinstall Windows since CSM must be disabled, and I'm running off a SATA drive so it fails to boot with CSM off. Apparently a Windows reinstall with it disabled is the only fix.
  4. I was looking to see how I can tell if BAR is enabled in Windows besides running benchmarks and comparing. Well, someone figured it out. After you enable it in the BIOS, boot into Windows and follow these steps:
     Go to Device Manager
     Under Display adapters, right-click on your GPU
     Click Properties
     Go to the Resources tab and make sure that "Large Memory Range" is listed.
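On Linux the equivalent check can be scripted by parsing `lspci -v` output: with Resizable BAR active, one of the GPU's prefetchable memory regions covers (roughly) the whole VRAM instead of the classic 256MB window. A minimal sketch, assuming 8GB of VRAM; the sample strings are illustrative, not real captures:

```python
import re

def resizable_bar_active(lspci_text, vram_gib=8):
    """Return True if any prefetchable BAR is at least half the VRAM size.

    With Resizable BAR off, the prefetchable window is typically 256M;
    with it on, it covers (most of) the VRAM, e.g. [size=8G].
    """
    sizes = re.findall(r"prefetchable\) \[size=(\d+)([MG])\]", lspci_text)
    for num, unit in sizes:
        gib = int(num) if unit == "G" else int(num) / 1024
        if gib >= vram_gib / 2:
            return True
    return False

# Illustrative lspci -v excerpts (hypothetical, not real captures):
bar_on = "Memory at 7c00000000 (64-bit, prefetchable) [size=8G]"
bar_off = "Memory at e0000000 (64-bit, prefetchable) [size=256M]"
print(resizable_bar_active(bar_on), resizable_bar_active(bar_off))  # True False
```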
  5. I didn't see whether the OP used the "Try It" memory option in the BIOS. Download the DRAM Calculator for Ryzen and plug those numbers in; you will get better results, I think.
  6. That is really bizarre, that the same memory is no longer stable at such low frequencies. You should be able to go up to 3600 without even touching the SoC voltage, keeping that 1600 MHz FCLK matched to the memory frequency, so it is unlikely to be a problem with the CPU. I've run into problems where, after I update the BIOS, all my saved profiles no longer work. Have you tried resetting the motherboard BIOS to defaults and starting over? You can do this in the BIOS or just reset the CMOS. Might be worth a shot. But here you say that the memory is OC'd to 3200, so what is the stock XMP profile, the 2133 you listed? If that is the case, you might just need to raise the memory voltage to 1.35V, as the default 1.2V might not be enough to reach that frequency stably. Maybe it was set on Auto voltage before and has now changed back to defaults with the new BIOS update for this 5600X CPU. Edit: I'm pretty sure that memory is Hynix A-die, in which case 3200 CL16 might be the upper limit of those ICs without doing some major tweaking. I don't think those respond well to higher voltages like 1.5V either.
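For reference, the 1:1 relationship is simple: DDR4 transfers twice per clock, so the matching FCLK is half the rated MT/s. A quick illustration:

```python
def fclk_for_1to1(mem_mt_s):
    """FCLK in MHz needed to run 1:1 with DDR4 rated at mem_mt_s MT/s."""
    # DDR4 transfers twice per clock, so memory clock = MT/s / 2.
    return mem_mt_s // 2

print(fclk_for_1to1(3200))  # 1600 MHz, matching the 3200 OC above
print(fclk_for_1to1(3600))  # 1800 MHz needed for 1:1 at 3600
```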
  7. I'm just the baby here, but it's getting close to 18 years now for me on OCC. I didn't know Verran well, and my memory is a bit fuzzy. However, I do remember him being very helpful when I asked stupid questions about computers.
  8. I have it. Honestly, a bit disappointed. All these reviews said it was faster than a 9900K or 10900K. I've tested it against 20 games now. So far only Civilization 6 beats Intel, and that is it. I would say 10 are still behind in frame rate (10+ FPS), 9 match it, and 1, as I said, beats it. Got it to 4.6 GHz with 2000 FCLK 1:1. 4.7 GHz seems to be the limit for 99% of us unless you're willing to pump 1.4V into it. So my 10900K is still faster in everything, cost less, and has better memory support. It was a good improvement over the 3800X I had previously, but while that CPU had good value for gaming, the 5600 and 5800 do not compared to Intel right now, and scalper prices make it even worse. This is just my opinion from the data I have personally collected. I can build both for the same price, same memory. So Intel is the winner when it comes to gaming, at least at this price point. The 5950X has other qualities that make it a better pick over Intel, but it's not the same price, or even close.
  9. Yeah, the forums were booming in 2009. Lots of old members are still lurking in the shadows.
  10. Cool looking case! Also, Chris once again does a great job of covering everything about this case, top to bottom.
  11. I've been playing with a 1920X and RTX 3070. Some scenes crawled even with DLSS enabled, so I turned ray tracing down to Medium. On a side note, I am just about 9 hours into the story and Keanu Reeves has finally appeared to start the main storyline. Either I'm really slow at completing the main missions (that's all I've been doing) or the marketing lied to me, haha! All the trailers showed him front and center. I'm not complaining, though reviews say the main story is only 15 hours long. That would mean he is only in about 1/3rd of the storyline, or I am really just that slow.
  12. I ended up watching the video fully. I think the title is correct, but the video came out when RDR2 was only a few weeks old. I think the Vulkan API is much better than DX12 for that game. The optimizations it recommends aren't half bad. I am just unsure if those numbers are accurate now.
  13. I would say you're okay, though the headroom is lower than I'd like. It also depends on the graphics card. You didn't list it, but from the picture I would say it's an RTX 3070 FE. So 4.8 GHz all-core @ 1.25V (probably) would net you ~250 watts. Add 240 for the graphics card and 100 for everything else. That puts you at ~600W max load.
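The arithmetic behind that estimate can be sketched as a simple component power budget. The wattage figures are the ballpark numbers above, not measurements, and the 650W PSU size is an assumption for illustration:

```python
def psu_headroom(psu_watts, *component_watts):
    """Return (total estimated draw, spare watts) for a worst-case load."""
    total = sum(component_watts)
    return total, psu_watts - total

# CPU ~250 W, RTX 3070 ~240 W, everything else ~100 W, on a hypothetical 650 W unit.
total, spare = psu_headroom(650, 250, 240, 100)
print(total, spare)  # 590 60
```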
  14. You just have to use the Vulkan API. I run all my benchmarks at max settings. It's graphically demanding and light on the CPU. Edit: That video is outdated. Most, if not all, of the performance issues have been solved in patches and driver updates.
  15. It's mainly single player, but it has an online mode that is half-baked (for now). Think GTA 5 but as a western.
  16. None of the things in that article will do anything. GTA 5 is very CPU-intensive. I suggest starting with low settings and dialing them up as you play; see what works best. If you're on Intel's iGPU, it might never be playable above 1280x720.
  17. You can get all sorts of pad sizes off Amazon and Newegg, though only so many thickness choices: 0.5mm, 1mm, and 1.5mm. https://www.newegg.com/p/2S7-009N-00010?Description=thermal pads&cm_re=thermal_pads-_-2S7-009N-00010-_-Product I had the GTX 1070 version with the same cooler and PCB; I believe it was 1mm. I had the same problem when I went to put the cooler back on after running a waterblock. It's been over 4 years since I did it, so honestly I really don't remember.
  18. What year is this, 1998? You're talking about DDR-266 and DDR-400 here. FSB and DDR overclocking was before my time. Maybe someone else can help, but knowing the CPU might help a bit too.
  19. The Microcenter guy is wrong about the memory controller. It is actually on the CPU, so the only thing the motherboard has control over is voltage stability and signal integrity. Some extreme overclockers have been getting 5000+ on the memory with a B550 chipset, but you will find it's all with the new Zen 3 CPUs. Just marketing to show off that B550 isn't just a budget product; I believe the world records are still on X570. In either case, all those really high records aren't for daily use, or even 16GB. Most are a single 8GB DIMM with half of it disabled. For memory, 3600 speed is plug and play, though my limited experience with 4 DIMMs vs 2 tells me you are going to have a hard time overclocking. However, if you are just running at stock, you probably won't have many issues. That is also where the quality of the motherboard comes into play a lot more: more DIMMs means you need better voltage regulation and PCB signal integrity.
  20. @wildman2 thanks! Apparently I can't use Google very well, lol. Well, in that case the MSI Unify is my first choice over the Gigabyte. I just like MSI's BIOS better and have had better memory compatibility with it. Overall, whichever of the two is cheaper.
  21. New Build: Yes, "Boost" is up to 5 GHz, so 50x is correct. However, that's for one core only. A lot of motherboards will apply it all-core by default, which is technically wrong since it's not in Intel's spec. Second, the package power limit is around 250 watts, I believe. Either you need to lower the voltage or overclock it manually.
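The 50x figure is just the multiplier against the 100 MHz base clock (BCLK); a one-liner to sanity-check it:

```python
def core_clock_ghz(multiplier, bclk_mhz=100.0):
    """Effective core clock = multiplier x base clock (BCLK)."""
    return multiplier * bclk_mhz / 1000

print(core_clock_ghz(50))  # 5.0 GHz single-core boost
```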
  22. But that's only going to tell you the idle voltage. Still, it's a start.
  23. Ohh, we are going waaay back to the AMD X2 days. Well, you said the system died, so it could be a number of things, but the good news is that used replacement parts on eBay are really cheap. The only thing that might be a problem to replace is the power supply, since the standard has changed and shifted from 3.3V / 5V to 12V. So let's start with what it does: how did it die? Does it not power on, not boot into Linux, does the system crash, etc.?
  24. Glad to see your build coming to life. For motherboards, given what you will be using it for (hands-off, 99% uptime), the best thing you can do is get a board with a good VRM so it doesn't overheat. The 5900X will use around 80-120 amps as it boosts up and down; my 5800X is a constant 85A because I turned boost off and leave it at 4.6 GHz all-core, and the 5900X has a few extra cores. Anyway, of the three choices I would say the Aorus Master, simply because it uses 12x50A (600 amps) IR3556 power stages. The ASUS has the same 12x50 config but uses Vishay power stages, which seem to be more budget oriented. Finally, the MSI Unify... I couldn't find any review that actually looked at the VRM part numbers. That's a no-go for me.

     Next is the memory. It is true that having your memory in sync with the FCLK dramatically lowers latency: 2:1 can be as high as ~80 ns versus ~64 ns at the optimal frequency and FCLK. But it gets more complicated from there, because timings also change the latency. I haven't looked into anything but games in my investigations, but in those at least, inducing latency with higher timings had a lower impact than a 2:1 FCLK ratio. With that bit of information, I would take, for example, 4000 memory at CL19 over 3600 CL14 IF the FCLK was still 1:1. Of course this 2000 MHz FCLK can only be achieved on the new Zen 3 CPUs (5000 series). I have only played around with it a little bit, but so far it seems exactly the same as previous CPUs. If you want the highest FCLK you must raise the SoC voltage; I usually sit at 1.125V, but a lot of people do not want to play around with voltages and stability testing.

     I have "heard" that 4 DIMMs perform better than 2, even on dual-channel CPUs like Ryzen. However, I have not tested this in a scientific way. The only thing I can say for sure is that the more DIMMs you have, the more stability issues arise at higher frequencies. I had a super hard time getting 4x8GB 3800 stable, whereas 2x16GB 3800 had no problems. Same amount of memory, same timings. On a side note, booting into Windows does not count, people! You must validate with Memtest86.

     Since you are using Chrome, I would aim for a 3600 64GB (2x32GB) kit. I'm sure 32GB is enough, but Chrome does eat a lot of memory depending on the web page. Don't worry about 4000 / 2000. You are aiming for maximum uptime. You will theoretically take a hit in latency, but I don't know how much for your application.
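For the timings side of that trade-off, the CAS-only first-word latency is easy to compute: latency (ns) = 2000 x CL / MT rate. Note this is only one component, and the FCLK-ratio penalty described above comes on top of it. A quick comparison of the two kits mentioned:

```python
def first_word_latency_ns(mt_rate, cas):
    """CAS-only first-word latency in ns for DDR memory at mt_rate MT/s.

    One clock period is 2000 / mt_rate ns (two transfers per clock),
    and the CAS latency is counted in clocks.
    """
    return round(2000 * cas / mt_rate, 2)

print(first_word_latency_ns(3600, 14))  # 7.78 ns
print(first_word_latency_ns(4000, 19))  # 9.5 ns
```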