ir_cow

Reviewer
  • Content Count: 6,647
  • Joined
  • Last visited
  • Days Won: 10
Everything posted by ir_cow

  1. ir_cow

    PS5

    Technically the cables only have to be certified for 2.1 (48 Gbit/s). They still have the same number of pins (19), which is why they are all backwards compatible. I have some HDMI 1.3 (10.2 Gbit/s) cables that are well shielded and were built to last. They display 4K @ 60, which shouldn't be possible if they were truly limited to 10.2 Gbit/s. It is really about the quality of the cable, which is why companies replaced the version number in the marketing with bandwidth instead. Easy to repackage and resell when the new specs come out. I'm waiting for AVRs and pre-amps to support HDMI 2.1. Marantz said in 2018 that they will offer a video/HDMI replacement board for the AV8805 once everything becomes certified; the projection is 2021. Everyone else in the game, like Emotiva, Outlaw, Anthem, etc., hasn't even talked about 2.1 implementations. This tells me it will be 2021 at the earliest before people with home theaters can enjoy true 4K HDR (Dolby Vision) without compression, or 8K with surround sound. Yes, eARC works, but it's kind of funky (lip-sync issues) and doesn't support anything above basic Dolby 5.1. Forget about Dolby Atmos.
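    To put numbers on the cable question, here is a minimal back-of-envelope sketch (Python; the helper name is mine) of uncompressed HDMI video bitrates. It ignores blanking intervals, so real figures run a bit higher, but it shows why a well-built 10.2 Gbit/s-class cable can physically carry 4K60 at reduced chroma:

        # Rough uncompressed video bitrate in Gbit/s, including the 25%
        # TMDS 8b/10b encoding overhead; blanking intervals are ignored.
        def video_gbps(width, height, hz, bpc=8, chroma="4:4:4"):
            bits_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma] * bpc
            return width * height * hz * bits_per_pixel * 1.25 / 1e9

        print(video_gbps(3840, 2160, 60, 8, "4:2:0"))  # ~7.5  -> fits under 10.2 Gbit/s
        print(video_gbps(3840, 2160, 60, 8, "4:4:4"))  # ~14.9 -> needs HDMI 2.0's 18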
  2. Just to be clear: if you wipe the drive, you lose all the data. But yes, you can wipe it without issue. I have an MX100 500 drive that I have wiped a few times now.
  3. You can fold on PCIe 2.0 x8 without much of a loss for a 1080 Ti. Instead of a million PPD per card, it was more like 800K. Of course it is very much WU dependent. I was doing it with my old Intel 3960X last year: 4 cards at x8/x8/x8/x8 PCIe 2.0. PCIe 3.0 x1 is really bad, and 2.0 x4 isn't much better.
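    For reference, a minimal sketch of the per-link math (the per-lane figures are the standard published ones; the helper name is mine):

        # Approximate usable PCIe bandwidth in GB/s per lane, after
        # encoding overhead: 1.x = 0.25, 2.0 = 0.5, 3.0 ~= 0.985.
        PER_LANE_GBS = {1: 0.25, 2: 0.5, 3: 0.985}

        def link_gbs(gen, lanes):
            return PER_LANE_GBS[gen] * lanes

        print(link_gbs(2, 8))   # 4.0 GB/s  -- folds at nearly full speed
        print(link_gbs(2, 4))   # 2.0 GB/s  -- marginal
        print(link_gbs(3, 1))   # ~1.0 GB/s -- noticeably starves the card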
  4. Yeah, but if you go above 101-102 MHz it will break PCIe and many other things. Only extreme overclockers have done 104. It is much easier to use the CPU multiplier these days.
  5. Welcome back!! I was trying to beat road runner for years, never got close.
  6. Hmm, you might be onto something. The PCIe bus clock should always be 100 MHz by default. Some manufacturers will have it at 99.7, others at 100.5. This would indeed slightly change the clock speed. As an example, 100.6 × 44.5 = 4.477 GHz. But this wouldn't explain the 4.149 GHz unless it's something like 100.4 × 41 = 4.116 GHz. That would mean the CPU boost multiplier is dropping when an overclock is enabled. My only thought on this is that the voltage goes up, the CPU heats up, and the CPU drops its boost clocks. It takes about 1.3 V to sustain 4.3 GHz in Prime95, and even good AIO coolers hit 90°C+. I still don't think this has anything to do with the video card unless it's pumping hot air all over the CPU heatsink. The Vega cards are hot cookies and take a lot of power (200-300 watts). Heck, my Red Devil Vega 64 draws 360 watts alone when overclocked.
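    The arithmetic in one place, as a tiny sketch (the function name is mine):

        # Effective core clock = base clock (BCLK) x CPU multiplier.
        def core_ghz(bclk_mhz, multiplier):
            return bclk_mhz * multiplier / 1000

        print(core_ghz(100.0, 44.5))  # 4.450 GHz nominal
        print(core_ghz(100.6, 44.5))  # ~4.477 GHz, BCLK running slightly high
        print(core_ghz(100.4, 41.0))  # ~4.116 GHz, boost multiplier dropped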
  7. That is really weird. I haven't experienced this. I have noticed weird things like a 3.6 GHz base clock instead of 3.9 GHz depending on the drivers I install. I honestly think AMD drivers are always broken in some way. Looks like you found a new bug.
  8. If you're talking about the Infinity Fabric clock on Ryzen CPUs, I don't think it does. The whole FSB thing is an Intel thing. PCIe has its own clock, separate from the CPU.
  9. I can't see how a video card is related to a CPU overclock. Maybe the OC was reset in the process of installing a new video card and drivers?
  10. All the Pros and Ultras for the RX 5600 XT from XFX use the same 7nm Navi silicon at this time, and no matter what brand you buy, it will be the same. However, DX12 Ultimate (ray tracing support) was just announced. I don't think any game supports it yet, and I don't know how well the RX 5600 or 5700 cards will handle it, or at what resolution. Once I find a game that supports it, I will benchmark it across all the Navi cards I have. Edit: From the news, it points to RDNA 2 (aka unreleased AMD video cards) "fully" supporting DXR 1.1. I don't actually know if it's an extra compute unit stuck onto the GPU, like NVIDIA did with the RT cores for RTX, or whether it is just more powerful and RDNA 1 (gen 1) will support it too after a driver update. My guess from looking at the DXR 1.1 specs is that AMD has to have added something to the GPU to make it compliant. Maybe not all the DX12-U features will work. We just don't know yet.
  11. I don't get the whole passkey thing. It's not even tied to your username. I have changed my username a bunch of times and still use the same passkey. It works fine and I get all the "bonus points" that come with using a passkey. Edit: Here is a passkey you can use: 45beeee6abe55804d064da197ff3f5f4
  12. Different subject; this one is about whether folding is bad for a GPU. The identical title doesn't help.
  13. I don't think so. Other than having a video card that can be assigned "big WUs", it's random. Though maybe there is some truth to that statement: I noticed the deadline for many COVID projects is 12-24 hours, when usually a WU has 3-7 days before it expires. So it's possible only video cards that can get it done in time are assigned the work unit. My RTX 2080 Super has about a 1-hour turnaround. I really don't know, just guessing.
  14. I believe the 2nd slot is PCIe 3.0 only, not 4.0. I haven't heard of needing to assign the Gen to an M.2 slot unless you have limited PCIe lanes. Both the MSI X570 Gaming and ASUS X570 Crosshair booted the first M.2 slot in Gen 4.0 mode; I didn't have to touch it in the BIOS. Edit: The Sabrent Rocket, Corsair MP600 and Gigabyte AORUS all use the same Phison PS5016-E16 controller, and I know the Sabrent and Corsair use Toshiba 96-layer TLC NAND, making them essentially the same drive with a different sticker. If I think about it enough, I believe only Samsung and Seagate use in-house controllers right now, at least for the PCIe 4.0 ones.
  15. Folding is better than mining, for sure. The longest I left a card folding was a year (GTX 1080 Ti); it seemed to be okay afterwards. But just like with any computer component, heat and overvoltage will always shorten the lifespan.
  16. It's been going in and out all week. Every time the client fails an attempt to talk to the server, the retry time doubles. Mine has been saying 14 hours until the next attempt. If I just restart the client, it resets to zero.
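    That doubling is classic exponential backoff. Here is a toy model of the behavior described (an illustration, not the actual FAHClient logic; the starting delay is an assumption):

        # Each failed attempt to reach the work server doubles the wait;
        # restarting the client resets the failure count to zero.
        def retry_delay_minutes(failures, base_min=1):
            return base_min * (2 ** failures)

        for n in range(11):
            print(n, retry_delay_minutes(n), "min")  # hits 1024 min (~17 h) at n=10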
  17. ir_cow

    PS5

    HDMI 1.4b supports 4K @ 60Hz at chroma 4:2:2. Most TVs sold in the last two years have HDMI 2.0, and at the end of 2019 Samsung and LG started to put 2.1 on the high-end ones. If you want HDR @ 60Hz, you need 2.0 or higher. But the best will be HDMI 2.1 if you want a "real" HDR (chroma 4:4:4) image with Dolby Vision, which is only just now arriving on TVs under $3,000. Personally, I am waiting for 60" 8K TVs to drop below $2,000; then I will replace my AVR and TV all at once. https://en.wikipedia.org/wiki/HDMI Not sure why HDMI doesn't work from your PC. The only thing I can think of is that the HDMI connection is not HDCP compliant somehow.
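    A quick version check, under rough assumptions (published HDMI data rates after encoding overhead; 10 bits per channel for HDR; blanking ignored):

        # Usable HDMI data rate in Gbit/s (after 8b/10b or 16b/18b encoding).
        DATA_RATE = {"1.4": 8.16, "2.0": 14.4, "2.1": 42.6}

        # 4K60 HDR at full 4:4:4 chroma, 10 bits per channel:
        need = 3840 * 2160 * 60 * 30 / 1e9  # ~14.9 Gbit/s
        for ver, rate in DATA_RATE.items():
            print(ver, "ok" if rate >= need else "too slow")  # only 2.1 clears it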
  18. New projects added for COVID. No information about them on the folding forums.
  19. This is good news! However, I want to help, and right now it takes 20 minutes to get a new WU.
  20. In the web browser you can set which project to work on, but if you go into the "Advanced Controls" from the Windows toolbar you can change the settings under Configuration (Configure > Advanced tab). However, for COVID-19, the Folding@home team said on Twitter that the default "ANY" project setting will prioritize it over everything else, so leaving the default configuration is best for now. Edit: It sucks that you cannot set it to only COVID-19. One of my cards is working on 11763... I guess it all has to get done eventually, and only so many WUs are available at one time. So instead of sitting idle, your compute power can be used for other serious projects too.
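    For anyone who edits the client by hand instead, this is roughly what the relevant part of FAHClient's config.xml looks like (a hedged sketch; the values are placeholders, and as noted above there is no COVID-only project switch):

        <config>
          <!-- identity: username, team, and the bonus-points passkey -->
          <user value="YourName"/>
          <team value="0"/>
          <passkey value="your32characterpasskey"/>

          <!-- how hard to fold -->
          <power value="full"/>

          <!-- one folding slot per compute device -->
          <slot id="0" type="CPU"/>
          <slot id="1" type="GPU"/>
        </config>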
  21. Hey people, even though some of the world's supercomputers are working on COVID-19, Folding@home has launched its own projects. https://foldingathome.org/2020/03/10/covid19-update/

    Folding@home download link: https://foldingathome.org/start-folding/

    Time to start folding again. Make sure to be working on the following projects; this will help researchers better understand the virus.

    11741: Coronavirus SARS-CoV-2 (COVID-19 causing virus) receptor binding domain in complex with human receptor ACE2. atoms: 165550, credit: 15396
    11742: Coronavirus SARS-CoV-2 (COVID-19 causing virus) protease in complex with an inhibitor. atoms: 62227, credit: 9405
    11743: Coronavirus SARS-CoV-2 (COVID-19 causing virus) protease – potential drug target. atoms: 62180, credit: 9405
    11744: Coronavirus SARS-CoV (SARS causing virus) receptor binding domain trapped by a SARS-CoV S230 antibody. atoms: 109578, credit: 7608
    11745: Coronavirus SARS-CoV (SARS causing virus) receptor binding domain mutated to the SARS-CoV-2 (COVID-19 causing virus) trapped by a SARS-CoV S230 antibody. atoms: 110370, credit: 7685
    11746: Coronavirus SARS-CoV-2 (COVID-19 causing virus) receptor binding domain in complex with human receptor ACE2 (alternative structure to 11741). atoms: 182699, credit: 16615

    Update 3/16/20: New projects 11759, 11760, 11761, 11762, 11763, 11764.

    How to set up Folding@home:

    Edit: Folding@home servers are currently overloaded. If you are not getting new Work Units, this might be why. Please post questions ONLY related to folding; we don't need to spread panic and false information.
  22. Hmm, I guess it makes sense that the 3950X didn't include a cooler. The 3800X gets toasty, so I can only imagine what 8 extra cores would do.
  23. The included Wraith cooler is okay to hold you over.
  24. Okay, so I went ahead and popped my F4-3733C17Q-64GTZKK Trident Z DDR4-3733 CL17-19-19-39 1.35V 64GB (4x16GB) kit back in to see what I could do. As I suspected, XMP does not work because this kit is 3733; I already knew it would not boot, but I tried anyway. I took a mixture of DRAM Calculator for Ryzen settings and the limits of what MY system could do. Lots of trial and error: about 3 hours of playing with the timings and watching Memtest86 fail. But I finally achieved 3200 CL14. I could not get it to go above 3400. Your best bet is to get a 3200 64GB kit that is Samsung B-die. I have no experience with Micron E-die, so I don't know the timings or voltages needed for the DIMMs themselves. The SOC voltage should be about the same though, since that is the memory controller, not the DIMMs. Maybe someone can shed some light on what these G.Skill Neo 64GB (2x32GB) F4-3200C16D-64GTZN kits are. The fewer DIMMs you have, the better from a plug-and-play standpoint. Those have to be dual-rank, but it would be 2 slots instead of 4. However, I have no clue how low you can go on the timings; I suspect not much lower, honestly. If you just want a plug-and-play kit, the G.Skill Neo ones are a good choice. I do see they have a 3200 CL16-18-18-38 kit. But if you want the best deal, find 3600 CL16 for cheaper and downclock it to 3200. Edit: It looks like most 64GB Neo kits are Hynix memory, if not all of them; it's hard to track down each part number. So don't expect CL14 unless it's Samsung B-die. Samsung kits seem to be almost double the price, which is a bit much I think. You're not buying a Ryzen with 64GB for gaming, so CL16 is a much better buy (see the latency math after the settings below). The takeaway here is: don't waste money on a CL14 kit for $600 when I see so many kits for $280 @ 3200 CL16, even if they are all Hynix ICs. Also, don't expect any kit to boot @ 3600 with 4x16GB either. I'm not saying that if I can't, no one can, but you won't find many people running above 3200 with a 64GB kit. I had mine set to CL20 and it still wouldn't POST above 3400. I think 4 DIMMs is too much for the memory controller (without applying unsafe voltages); I don't suggest pushing the SOC voltage past 1.125 V long term. HERE are my best settings. You will want to start off at CL16 (16-17-16-16-35-50-36) and work your way down. The most important parts I've found when dealing with high frequencies or high DIMM counts are the SOC voltages and RTT settings.
    CPU Settings:
      Memory Frequency = [DDR4-3200 MHz]
      FCLK Frequency = [1600 MHz]

    CPU / SOC / Memory Voltages:
      CPU Core Voltage = [Manual mode]
      - CPU Core Voltage Override = [1.325] (unrelated to memory overclocking)
      CPU SOC Voltage = [Manual mode]
      - VDDSOC Voltage Override = [1.10000]
      DRAM Voltage = [1.400]
      VDDG CCD Voltage = 1.025
      VDDG IOD Voltage = 1.025
      CLDO VDDG Voltage = 0.900 (some BIOSes just read it as 900)
      1.8V PLL Voltage = [Auto]
      1.05V SB Voltage = [Auto]

    Memory Timings:
      DRAM CAS# Latency = [14]
      Trcdrd = [14]
      Trcdwr = [14]
      DRAM RAS# PRE Time = [14]
      DRAM RAS# ACT Time = [30]
      Trc = [44]
      TrrdS = [4]
      TrrdL = [6]
      Tfaw = [34]
      TwtrS = [4]
      TwtrL = [12]
      Twr = [12]
      Trcpage = [Auto] (reads 0)
      TrdrdScl = [4]
      TwrwrScl = [4]
      Trfc = [500]
      Trfc2 = [469]
      Trfc4 = [289]
      Tcwl = [14]
      Trtp = [12]
      Trdwr = [8]
      Twrrd = [4]
      TwrwrSc = [1]
      TwrwrSd = [7]
      TwrwrDd = [7]
      TrdrdSc = [1]
      TrdrdSd = [5]
      TrdrdDd = [5]
      Tcke = [9]
      ProcODT = [48 ohm]
      Cmd2T = [1T]
      Gear Down Mode = [Enabled]
      Power Down Enable = [Enabled]
      RttNom = [RZQ/7]
      RttWr = [RZQ/3]
      RttPark = [RZQ/1]
      MemAddrCmdSetup = greyed out (AUTO)
      MemCsOdtSetup = greyed out (AUTO)
      MemCkeSetup = greyed out (AUTO)
      MemCadBusClkDrvStren = [24.0 Ohm]
      MemCadBusAddrCmdDrvStren = [24.0 Ohm]
      MemCadBusCsOdtDrvStren = [24.0 Ohm]
      MemCadBusCkeDrvStren = [24.0 Ohm]
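    On the CL14 vs CL16 value question above, here is a minimal sketch of the true-latency math (first-word CAS latency; the helper name is mine):

        # First-word CAS latency in ns = 2000 * CL / transfer rate (MT/s).
        def cas_ns(mts, cl):
            return 2000 * cl / mts

        for mts, cl in [(3200, 14), (3200, 16), (3600, 16), (3733, 17)]:
            print(f"DDR4-{mts} CL{cl}: {cas_ns(mts, cl):.2f} ns")
        # DDR4-3200 CL14: 8.75 ns   (B-die money)
        # DDR4-3200 CL16: 10.00 ns
        # DDR4-3600 CL16: 8.89 ns   (the cheaper buy, nearly as quick)
        # DDR4-3733 CL17: 9.11 ns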