About Arboleda

  1. I've a few things to add. First, to the poster above... A wise student listens to his experienced mentor. But a good mentor is able and willing to answer the golden question: "why". If I had to name one area where people on this forum could use a little improvement, it's the willingness to answer the question "why". The student learns nothing when you say "listen to me you f---ing n00b. Buy a high-end power supply or you'll suffer the consequences. Just do it." Some students (we who seek to learn are all students) will be satisfied with that answer. But what did you teach them? When they help a friend build a killer DFI rig and the friend asks "why do I need the 520W power supply? And why doesn't the $20 520W power supply at CompUSA cut it?", what does your student say? He says "Beats me, the guys at dfi-street said your rig will blow up if you don't." That's just not the pattern a mentor-student relationship should follow. To those who have been around the block and are here answering questions from newer users: don't you want to be a good mentor?

Unfortunately I'm one of the really bone-headed "students" in the game of life who's skeptical of any answer that doesn't teach me "why". In fact, I was so annoyed with the whole "buy a 480 watt PSU or your computer will melt down" line that I bought a good quality 380 watt PSU, and it worked fine. Ummm, until last night. Hehe, here's where I stop giving constructive criticism to those arguing "shut up and do what we say" (which not all of you are doing; in fact, the "why" has now been answered in this thread for those who are looking for it) and transition to arguing in defense of the 480 watt PSU recommendation.

See... I started out with a Venice 3000+ and Radeon X700 Pro running on my SeaSonic S12-380. Granted, I also had liquid cooling parts (high-end fans on the radiator, a pump, etc.) as well as a SCSI controller and a 15K drive in addition to my 7200 RPM drive and normal system fans.
So I argued, a little bit, the side that no one can tell me a 480W A-List PSU is *needed*. My system was OC'd to 2700 MHz and rock solid. I even put it on a Kill-A-Watt meter and observed that no matter what I ran, I couldn't get the PSU to draw more than 250 watts or so at the wall (which means the system was only drawing about 200 watts). Ha, I said, proof that I don't *need* a 480 watt PSU.

Since then, I succumbed to the temptation of better parts, and I've started gaming on my PC at times, whereas I used to use it only for my profession, software development. I picked up an X2 3800+ and pushed it until it ran at 2800 MHz rock solid (woot). Mind you, this was still on my SeaSonic 380W PSU, and at that point I was still Prime-stable, game-stable, and benchmark-stable. (See my post in the OCDB thread... err, no you can't, because AG erased it. Grrr...)

Then I saw the killer mod you can do on the brand-new GTO2 cards by Sapphire and had to have one (I'm into gaming now, after all). This card takes an auxiliary power connector, so it clearly draws higher wattage. I started with AquaMark to see if it was stable. Sure enough, it errored out after a few minutes. Tried again; errored out. Piss, not cool. Can you guess what the problem is at this point? I tried rebooting and got nothing but 4 LEDs, boot after boot. I put back in a 6600GT card that I have on loan from a friend. Still no boot. 20 minutes later I came back and it booted with the 6600GT. Thank goodness.

Hmmm, I hadn't put the system on the Kill-A-Watt since upgrading to the X2 3800+. So I did, ran Prime95 on both cores, and wham, I was drawing over 300 watts (with the 6600GT, mind you, not the GTO2). That GTO2 under AquaMark with heavy CPU usage may have been enough to push me past the abilities of my 380 watt PSU. I may have even killed the new GTO2, who knows. I won't know if the GTO2 survived until I get my OCZ PowerStream 520 from FedEx tomorrow.
Yep, the student has learned from the mentor. I didn't understand "why" in the beginning, so I got burnt by stubbornness. When other students come along and want to understand why, let's give them real answers. Maybe they can avoid burning themselves.
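The Kill-A-Watt numbers above are wall (AC) draw, not the DC load the PSU actually delivers, which is why 250 W at the wall only means roughly 200 W of system load. Here's a minimal sketch of that arithmetic, assuming about 80% PSU efficiency (an assumed figure typical for supplies of that era; check your unit's rating for the real number):

```python
def dc_load_watts(wall_watts, efficiency=0.80):
    """Estimate the DC load a PSU delivers given its wall (AC) draw.
    efficiency=0.80 is an assumption, not a measured spec."""
    return wall_watts * efficiency

def wall_draw_watts(dc_watts, efficiency=0.80):
    """Inverse: wall draw required to deliver a given DC load."""
    return dc_watts / efficiency

# 250 W at the wall is only ~200 W of actual DC load:
print(round(dc_load_watts(250)))    # 200
# A 380 W PSU running flat out pulls ~475 W at the wall:
print(round(wall_draw_watts(380)))  # 475
```

So a Kill-A-Watt reading creeping past 300 W means the DC load is closing in on the 380 W unit's rating faster than the wall number alone suggests.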
  2. You may have already done this but bear with me... On my rigs I've followed Thunda's recommendations of separately finding out the max CPU and max memory performance. To find out the max CPU performance, you drop that memory onto a much lower divider (maybe even 1:2). After you know the max stable overclock of the CPU, you can drop the CPU multiplier to something low, reset the HTT to 200, and restore the memory divider to 1:1. Then go about increasing the HTT until you get to points where you have to loosen the timings to keep getting the HTT higher. Stop when you don't want to continue to loosen. You now know your max CPU clockrate and your max memory clockrate. You sound like you're on top of things - have you already done this kind of stuff?
  3. You might be surprised how easy and addictive the overclocking can be. Honestly, I can't imagine going to the work of assembling a kick-butt DFI system and then not getting to the fun part of seeing how far the CPU, Memory, and Video Card can go in terms of overclocking. Remember, the people on these forums are not into silly overclocking where it's just barely stable enough to boot and run a benchmark that you can brag to all your friends about (while your system is blue-screening in the meantime). We're into STABLE overclocks that you can run rock solid day in day out for years. My Venice 3000+ overclocked from 1.8GHz to 2.7GHz. 2.7 GHz is between the FX-55 ($800) and the FX-57 ($1100). Man was I stoked! Then my second DFI rig took an X2-3800 clocked at 2 GHz up to 2.8 GHz rock solid. Man is that stuff fun!
  4. No... Let's take an example... Say you take a Venice 3000+ clocked at 1.8 GHz and run it in your sweet DFI mobo. Runs great at stock speeds, right? Now, after you've won the lottery, go blow $1100 on an FX-57 chip clocked at 2.8 GHz. Swap out your Venice 3000+ for that FX-57. Check the manual... Does it say you need to increase any voltages? Nope, you're all set to run. So if an FX-57 can run 2.8 GHz just fine without increased LDT and chipset voltages, why would your Venice 3000+ overclocked to 2.6 GHz need increased LDT/chipset voltages? It doesn't! The only reason you overvolt your processor is that you're squeezing extra clockrate out of it at the expense of heat. Your motherboard, on the other hand, is designed to handle slow or fast processors without the need to mess around with LDT or chipset voltage. Take a gander over to the overclocking database thread. You'll notice many posts where they get an awesome overclock out of the CPU without overvolting the LDT or chipset. Does this help?
  5. Wow, I can't believe my entry was removed. I can't say it any better than Psylenced. I put a lot of time into putting that entry together, and it got removed because I summed up my benchmarks in text *in addition* to linking screenshots of them!? Wow. And you complain and moan that more people don't post to the OCDB after pulling a stunt like that? You didn't even PM me or simply remove the offending text summary. I thought I was being helpful by summarizing the benchmarks in text. Hell, I've had trouble reading the poor screenshots in other posts, so I figured I was being nice by summarizing. You gotta realize that, while I may be reading too much into this, it's a slap in the face. I've tried to be real helpful since I was introduced to the DFI scene a few months back. This is downright insulting.
  6. I've got an X2 3800+ that runs 2800 MHz rock solid on liquid cooling. I'm just about to receive an SI-120 air cooler that I was going to try on it out of curiosity. You're welcome to make an offer, but it'd have to be dang good, because 2800 MHz rocks!
  7. LDT and chipset don't need any extra volts, to my understanding. Think about it this way... If you were to plunk down $1100 for an FX-57 running at 2.8 GHz, would you have to increase your LDT/chipset voltages to run it? Nope... Overvolting these components is just going to add to higher temps.
  8. You may realize this already but you'll find quite a bit (more) discussion about heater cores on xtremesystems.org's forums.
  9. You can use whatever combination works. 10x270 is the same as 9x300 in terms of CPU performance. Sometimes you might pick one over the other (such as 10x270 if your RAM is happier at 270; of course you can do 9x300 and put the RAM on a divider...). I'm a little confused about the question. Which BIOS do you mean? Not to my knowledge. My X2 3800+ (stock 2000 MHz) is overclocked to 2800 MHz with an entry in the OCDB, and I never touched the LDT voltage. Some people overvolt it just because they can, figuring that maybe it will help, but there's just no reason to do so from what I've read.
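The "10x270 equals 9x300" point is just multiplication, but it's worth seeing laid out, since picking between equivalent combinations comes down to which HTT your RAM tolerates:

```python
def cpu_mhz(multiplier, htt):
    """Final CPU clock is simply multiplier x HTT."""
    return multiplier * htt

# Two different knob settings, identical CPU speed:
print(cpu_mhz(10, 270), cpu_mhz(9, 300))  # 2700 2700
```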
  10. I love the 704-2BT (Bit Toe) BIOS on my X2-3800+ with OCZ TCCD5 memory. YMMV but you might consider giving it a try.
  11. Methinks you may not fully understand what you're talking about here. First, I'm a software developer and have a good feel for the issues involved. Second, it's undeniable that the AMD driver fixes the problem. Before installing the AMD X2 processor driver on my X2 3800+ (overclocked to 2800 MHz, woot), I tried three games and all three had problems. Symptoms: Earth 2160: every 2-3 seconds there was about a 0.5 second stretch in which no frames were drawn. It looked like the system had the hiccups! Doom 3: same exact issue as Earth 2160. FarCry: when I would move forward, say down a 100 foot tunnel, it would skip as I moved. The frames were drawn the whole time, but one moment I'd be fluidly moving from the 100' to the 90' point, and then all of a sudden I was at the 80' point moving fluidly toward the 70' point. Stop moving and everything looked fine. After installing the X2 driver from AMD, *all these issues were fixed*. It's not voodoo; I didn't do anything else except install the driver and try again. Now, that said, the driver's implementation is probably kludgy in the sense that I believe it lets most apps run with default dual-core affinity and somehow detects games known to have problems, setting their affinity to one core or the other. How a CPU driver could know that a game is running, I'm not sure. But there's no questioning my observation.
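For the curious, the single-core-affinity workaround the driver (probably) applies is the same thing you can do by hand with Task Manager's "Set Affinity" on Windows. Here's a minimal Linux-only sketch of the concept using Python's `os.sched_setaffinity`; it only illustrates the end effect, not how AMD's Windows driver actually hooks into the system:

```python
import os

def pin_to_core(core, pid=0):
    """Restrict a process to a single core -- the same end effect the
    AMD X2 driver (or Task Manager's "Set Affinity") has for games
    that misbehave on dual-core. pid=0 means the calling process."""
    os.sched_setaffinity(pid, {core})
    return os.sched_getaffinity(pid)

# After pinning, the scheduler reports exactly one allowed core:
print(pin_to_core(0))  # {0}
```

Once pinned, both of the game's threads time-slice on one core, which sidesteps the timing bugs that show up when they run on two cores with drifting timestamp counters.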
  12. The OCDB thread doesn't have any entries at all for the Venice 3500+ to my knowledge. As of a few weeks ago, here's how some of the other chips have done (stock + gain = final, with % gain over stock):

      Chip (entries)            Minimum              Average              Maximum
      ClawHammer 3500+ (1)      2200+550=2750 (25%)  2200+550=2750 (25%)  2200+550=2750 (25%)
      ClawHammer FX-55 (3)      2600+106=2706 (4%)   2600+167=2767 (6%)   2600+200=2800 (8%)
      San Diego 3700+ (6)       2200+440=2640 (20%)  2200+557=2757 (25%)  2200+700=2900 (32%)
      San Diego 4000+ (7)       2400+350=2750 (15%)  2400+446=2846 (19%)  2400+504=2904 (21%)
      Toledo X2 4400+ (6)       2200+400=2600 (18%)  2200+437=2637 (20%)  2200+500=2700 (23%)
      Venice 3000+ (8)          1800+657=2457 (37%)  1800+792=2592 (44%)  1800+954=2754 (53%)
      Venice 3200+ (7)          2000+500=2500 (25%)  2000+643=2643 (32%)  2000+700=2700 (35%)
      Winchester 3000+ (1)      1800+540=2340 (30%)  1800+540=2340 (30%)  1800+540=2340 (30%)
      Winchester 3200+ (5)      2000+200=2200 (10%)  2000+455=2455 (23%)  2000+600=2600 (30%)
      Winchester 3500+ (2)      2200+440=2640 (20%)  2200+440=2640 (20%)  2200+440=2640 (20%)
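The summary stats above are easy to reproduce for any chip: take each entry's final clock, subtract the stock clock, and report min/average/max with the percent gain. A small sketch (the entry list below is hypothetical, just to show the shape of the calculation):

```python
def oc_stats(stock_mhz, final_clocks):
    """Summarize OCDB entries as stock + gain = final with % gain,
    matching the min/avg/max format of the table above."""
    gains = sorted(f - stock_mhz for f in final_clocks)
    pct = lambda g: round(100 * g / stock_mhz)
    lo, hi = gains[0], gains[-1]
    avg = sum(gains) // len(gains)
    return {"min": (stock_mhz + lo, pct(lo)),
            "avg": (stock_mhz + avg, pct(avg)),
            "max": (stock_mhz + hi, pct(hi))}

# Hypothetical Venice 3200+ entries (final clocks in MHz):
print(oc_stats(2000, [2500, 2700, 2650]))
# {'min': (2500, 25), 'avg': (2616, 31), 'max': (2700, 35)}
```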
  13. My sig has a link to my 3800's entry in the OCDB. It's rock solid at 2800 MHz so you might jot down the stepping.
  14. Angry_Games has a thread about memory clockrate versus latencies. He feels very strongly that CPU MHz is top priority, always. I'd go with the higher CPU MHz.
  15. If you're mainly a gamer, pick up a Venice 3000+ or Venice 3200+ for under $200, overclock the daylights out of it, and then upgrade 1-2 years from now when you can get an X2 5000+ for under $200. Seriously, in your shoes that's probably what I'd do.