dual 3870 or 1 3870 X2?



I'm wondering if I should dish out around 100 bucks more to get two 3870s or just buy one of the X2s. Any suggestions?

 

I always choose a single card over a pair; the performance seems to be about the same between them in this case anyhow.



Wouldn't the performance in CrossFire be lower because he won't get two x16 PCI-E slots, but one x16 and one x4?
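For a rough feel of what that lane gap means on paper, here's a quick back-of-the-envelope calculation. It assumes PCIe 1.x (roughly 250 MB/s per lane per direction); these are theoretical peaks, and the real-world framerate cost of an x4 slot is usually much smaller than the raw numbers suggest:

```python
# Theoretical PCIe 1.x bandwidth: 2.5 GT/s per lane with 8b/10b
# encoding works out to ~250 MB/s per lane per direction.
PCIE1_MB_PER_LANE = 250

for lanes in (16, 4):
    gbs = lanes * PCIE1_MB_PER_LANE / 1000
    print(f"x{lanes}: ~{gbs:.1f} GB/s per direction")

# x16: ~4.0 GB/s per direction
# x4:  ~1.0 GB/s per direction
```

So the x4 slot has a quarter of the bandwidth on paper, but how much of that a game actually notices depends on how much traffic it pushes over the bus.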


Do you play at resolutions higher than 1600x1200 (1680x1050 widescreen)?

 

If not, don't waste your money. A single card can easily handle up to 1600x1200 (I've got an 8800GTS 320MB that runs 4xAA etc. on all games except Crysis without any trouble at up to 1600x1200). If you absolutely have to have 4xAA or 8xAA plus 16xAF with every single graphics option maxed out, and you play at a minimum of 1600x1200 (or higher), then a second card is a decent idea, especially if you are using a 24" widescreen or better (1920x1200).

 

But again, a single card is the better (and cheaper) solution (make sure you get a card with at least 512MB of video memory on it!), and as Suedenim says, you can always add a second card down the road for Xfire on your ATI Xfire-enabled chipset.


I do believe that, heat- and performance-wise, dual 3870s will be better than one 3870 X2.

Also, you could overclock them much further than the 3870 X2, on stock cooling that is.

 

Perhaps, but unless he bought an overpowered beast of a power supply for those 1900s, he'd have to buy another one to run a pair of 3870s. I believe they have two power plugs on them. The chances of running an X2 on the existing power supply (whatever it may be) are higher.
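To put some rough numbers on that (the board-power figures below are approximate values from memory, and the rest-of-system draw is a placeholder guess, so treat this as illustrative only):

```python
# Illustrative PSU load estimate. TDP figures are approximate
# (HD 3870 ~105 W each, HD 3870 X2 ~195 W), and the rest-of-system
# number is an assumed placeholder, not a measurement.
REST_OF_SYSTEM_W = 250   # CPU, motherboard, drives, fans (assumed)
HD3870_W = 105
HD3870_X2_W = 195

crossfire_w = REST_OF_SYSTEM_W + 2 * HD3870_W
x2_w = REST_OF_SYSTEM_W + HD3870_X2_W

print(f"2x HD 3870:    ~{crossfire_w} W total system draw")
print(f"1x HD 3870 X2: ~{x2_w} W total system draw")
```

The total wattage ends up in the same ballpark either way; the bigger practical question is whether the existing unit has enough PCI-E power connectors to feed two cards.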


You wouldn't really need to overclock an X2, though.

 

Umm.. why not?

 

And if he plans to keep this setup for a while, then overclocking will be important to keep up with newer games (to a certain extent).

And overclocking is fun (that's why I bought these HD2900 Pros; I could just as easily have bought 2x 8800GTS, but these HD2900s are way more fun, lol).



Um... is this a serious question? If it is, you should Google around and search this forum, etc., because the answer is so obvious that it surprises me that you (a long-time member of this forum) would even ask such a thing...

 

Overclocking is NOT important. It adds very little value or performance in the end: 3 fps? 5 fps? Considering the damage you do to the hardware, the shortening of its lifespan, and the instability it causes (not all games will play nicely with even a little overclock; you should know this already), I would say that overclocking is NOT important and ends up being a detriment to your end goal. Considering the level of overclock necessary to achieve any REAL performance gains, you can see where this point comes from.

 

A 'little' overclocking, like using the ATI CP or NV CP to auto-clock, isn't really part of the consideration; neither of those programs will overclock the GPU/memory beyond what is acceptable or safe.

 

I imagine you are talking about overclocking beyond that, which is where I refer you back to the previous paragraph about instability, shortened lifespan, etc.

 

 

As for fun: playing the games = fun. Overclocking isn't really fun (to me); it is just a task that one does. It might be fun for you and some others, but I'm going to venture a guess that for the majority it isn't really part of the equation (either no overclocking, or just the 'automatic' clocking in the ATI/NV control panels).

 

But here again, a dual-GPU card needing an overclock? This is why I asked in the first place whether you were serious, because you already know the answer to this... but just in case, I'll present a scenario for you, and you answer:

 

You have a 2GHz single-core CPU. You overclock it to 2.6GHz to get more performance in most apps.

 

You have a 2GHz dual-core CPU. Is it necessary to overclock now?

 

Answer: no, for a couple of reasons.

 

A second core picks up workload that no amount (or only extreme amounts) of overclocking can make up for on a single core. Even in today's games, CPU speed isn't really a factor; it's all about the GPU's robustness.
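As a quick sanity check on that, here's a toy Amdahl's-law comparison. The 80% parallel fraction is an assumed figure for illustration, not a measurement from any real game:

```python
# Toy Amdahl's-law comparison: overclocking one core vs. adding a
# second core. 'p' is the fraction of the workload that can run in
# parallel; the 0.8 used below is an assumption for illustration.
def speedup_from_clock(old_ghz: float, new_ghz: float) -> float:
    return new_ghz / old_ghz

def speedup_from_cores(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

print(f"2.0 -> 2.6 GHz overclock: {speedup_from_clock(2.0, 2.6):.2f}x")
print(f"Second core at p = 0.8:   {speedup_from_cores(0.8, 2):.2f}x")
# Prints 1.30x vs. 1.67x: the second core wins without touching the
# clocks, although if p is small (a poorly threaded game) it doesn't.
```

Of course, this only holds for software that actually makes use of the second core.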

 

 

So let's talk about a game scenario with GPUs, then. You have a single GPU; you overclock the core and the memory and get a few percentage points of framerate increase, until you hit the big resolutions (typically 1920x1200, which is right about where a second GPU becomes important). Then you don't get much of an increase at all.

 

So you add a second GPU, and now the game doesn't really run any faster, but you are able to introduce more 'candy' like AA, higher AF, more realistic shadows, etc., something that no amount of overclocking can make up for on a single GPU...
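The same arithmetic makes the point. This is a toy model: it assumes framerate scales linearly with core clock when GPU-bound, and the ~75% CrossFire scaling efficiency and the overclock figure are illustrative assumptions, not benchmark results:

```python
# Toy GPU-bound framerate model. Assumes fps scales linearly with core
# clock when the GPU is the limit; the 75% CrossFire scaling figure
# and the numbers below are illustrative assumptions.
base_fps = 30.0               # assumed GPU-bound baseline at 1920x1200
stock_mhz, oc_mhz = 775, 840  # HD 3870 stock core, hypothetical overclock

oc_fps = base_fps * (oc_mhz / stock_mhz)   # overclock the single card
cf_fps = base_fps * (1 + 0.75)             # add a second GPU at ~75% scaling

print(f"Overclocked single GPU: {oc_fps:.1f} fps (+{oc_fps - base_fps:.1f})")
print(f"Second GPU added:       {cf_fps:.1f} fps (+{cf_fps - base_fps:.1f})")
# Roughly 32.5 fps vs. 52.5 fps: headroom you can spend on AA/AF
# instead of raw framerate, which an overclock can't buy you.
```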

 

 

See where this is going?

 

 

So no, overclocking will NOT be important for keeping up with new games. The only thing important in that respect is keeping up with technology or adding a second CPU/GPU to help with the workload. Overclocking is a moot factor, as you can see.

 

Also keep in mind: what is fun for you isn't always fun for everyone else (most of us just want to play that new killer game instead of messing with our system for three weeks to try to get 2 extra fps).


True.

But overclocking is not completely worthless. And dual cores don't always work out the way some would like to think; many games don't scale well at all.

 

And anyway, have you ever opened Task Manager while playing, say, Crysis? CPU usage shot up to 100% within a matter of seconds on my old 2.0GHz Opteron. Then I overclocked it to 2.8GHz, and the CPU usage hovers at 90-95%.

This is at a resolution of 1440x900 with my old 7950GT.

A negligible difference, I know, but it's still there.

 

Overclocking video cards can be rewarding if you know how to do it right and know how to circumvent or eliminate the various bottlenecks within the GPU.

Example: on my old 7950GT, the GPU went all the way up to 700MHz, but the memory only up to 821MHz. The benchmark scores increased, but the FPS in certain games decreased. So I reduced the core to 680MHz and the memory to 800MHz; the benchmarks dropped maybe a couple of points, but the games' FPS increased.

 

Now look at my HD2900 Pros: because of the wide memory interface, the memory doesn't need any higher clocks. But GPU speed is a BIG bottleneck on these cards (believe me, I've spent days gathering this info on various forums/sites). So the first course of action for me will be to overclock the snot out of the GPU (no volt mods, though).
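The bandwidth arithmetic backs that up. Bus widths here are quoted from memory (256-bit on the 7950GT, 512-bit on the HD 2900 Pro), so double-check them before relying on this:

```python
# Peak memory bandwidth = memory clock x 2 (DDR) x bus width in bytes.
# Bus widths quoted from memory: 7950GT 256-bit, HD 2900 Pro 512-bit.
def bandwidth_gb_s(mem_clock_mhz: float, bus_bits: int) -> float:
    return mem_clock_mhz * 2 * (bus_bits / 8) / 1000

print(f"7950GT  @ 800 MHz, 256-bit: {bandwidth_gb_s(800, 256):.1f} GB/s")
print(f"2900Pro @ 800 MHz, 512-bit: {bandwidth_gb_s(800, 512):.1f} GB/s")
# ~51.2 GB/s vs. ~102.4 GB/s: at the same memory clock, the 512-bit
# bus delivers double the bandwidth, so the core runs out of steam
# long before the memory does.
```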

 

Does it decrease the lifespan? Probably, but without volt mods and with no added heat, it won't make a big difference.

 

:angel:

