
4GHz Overclocked Core i7 930 Bundle deal



Yeah, I know.

 

I'm just thinking that losing 2 percent of the performance would basically mean I was losing quite a lot of the added benefit of my overclocked graphics card.

 

It's running at 925MHz instead of 850MHz.

 

http://www.overclockersclub.com/reviews/sa...oxic_2gb/16.htm

 

According to this very website, it outperforms the GTX 480 in most games.

 

The performance increase of the Toxic vs. the normal HD5870 in Crysis Warhead is about 4 percent. So if I lose 2 percent of the total performance, that is quite a chunk of my overclock benefit being lost.
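To put rough numbers on the worry, here is a quick illustrative Python sketch (the 100 FPS baseline is made up, not a benchmark result):

```python
# Back-of-the-envelope sketch; the 100 FPS baseline is hypothetical.
baseline_fps = 100.0                 # plain HD5870 on a full x16 slot
toxic_fps = baseline_fps * 1.04      # Toxic is ~4% faster (Crysis Warhead)
toxic_x8_fps = toxic_fps * 0.98      # assume x8 costs ~2% of total performance

overclock_gain = toxic_fps - baseline_fps   # 4.00 FPS
slot_penalty = toxic_fps - toxic_x8_fps     # 2.08 FPS

print(f"Overclock gain: {overclock_gain:.2f} FPS")
print(f"x8 slot penalty: {slot_penalty:.2f} FPS")
print(f"Fraction of the gain eaten: {slot_penalty / overclock_gain:.0%}")
```

On those made-up numbers, roughly half the Toxic's factory overclock would be eaten by the slower slot.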

 

 

I think I have a plan though. I'll get the Gigabyte motherboard and keep just the one HD5870 for now. I'll see if I can get it working with the 9500GT and check how that performs in games vs. CPU-based PhysX. If there's not much difference, then I'll just get rid of the 9500GT and get a second HD5870 later on.

 

Metro 2033 is the game that worries me. I could always complete that using the 9500GT then get rid of the card afterwards.

 

 

Understandable, but the 2% you lose is on the total, so think of it this way:

 

850MHz x 2% loss = 17MHz lost

925MHz x 2% loss = 18.5MHz lost

 

So your OC is losing a TOTAL of 1.5MHz on your overclock, which is TINY compared to the cost difference for an x16 CF mobo!


x16 vs. x8 yields the same performance. The cards never saturate the bandwidth.

 

[Image: 16xvs8x-2.png, a chart comparing x16 vs. x8 frame rates]

 

From what I read, the new ATI and Nvidia cards still do not use the total bandwidth. Even if there is a 2% loss, it's tiny: if x16 does 100 FPS, x8 will do 98 FPS. And if your monitor's refresh rate is 75Hz, anything over that is water under the bridge. People brag about high frame rates, but a 75Hz monitor only ever displays 75 FPS. LOL. Your FPS counter can say 500, but the monitor is only showing... 75 FPS.
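A trivial Python sketch of that cap (purely illustrative numbers):

```python
# A monitor only ever displays as many frames as its refresh rate allows.
def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second actually shown, whatever the GPU renders."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(500.0, 75.0))  # 75.0 -- the other 425 FPS are never seen
print(displayed_fps(98.0, 75.0))   # 75.0 -- the x16 vs. x8 gap vanishes here
print(displayed_fps(60.0, 75.0))   # 60.0 -- below refresh, the GPU is the limit
```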

 

Understandable, but the 2% you lose is on the total, so think of it this way:

 

850MHz x 2% loss = 17MHz lost

925MHz x 2% loss = 18.5MHz lost

 

So your OC is losing a TOTAL of 1.5MHz on your overclock, which is TINY compared to the cost difference for an x16 CF mobo!

 

 

Ahhh. It doesn't work like this...


Math always works! The main reason I love math is that it can be proven every single time. But Dr., you are basically making the same point I am with the x16 vs. x8 argument and the performance loss, and although you are not going to lose literally 1.5MHz on the clock, I believe the equation helped the OP understand that he wasn't losing the performance he thought he was on an x8 CF mobo!
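To spell out "the 2% is on the total" in the right units, here is a quick illustrative Python sketch (the 60 FPS figure is hypothetical): the narrower slot costs frame rate, never core clock.

```python
# Sketch: the ~2% x8 penalty comes off the total frame rate, not off the
# core clock. The 60 FPS figure is hypothetical.
clock_mhz = 925.0        # Toxic core clock; slot width never touches this
fps_x16 = 60.0           # hypothetical frame rate on a full x16 slot
fps_x8 = fps_x16 * 0.98  # ~2% total performance loss on x8

print(f"Core clock: {clock_mhz} MHz on either slot")
print(f"Frame rate: {fps_x16:.1f} FPS (x16) vs {fps_x8:.1f} FPS (x8)")
```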


Math always works! The main reason I love math is that it can be proven every single time. But Dr., you are basically making the same point I am with the x16 vs. x8 argument and the performance loss, and although you are not going to lose literally 1.5MHz on the clock, I believe the equation helped the OP understand that he wasn't losing the performance he thought he was on an x8 CF mobo!

Agreed! Nice! :thumbs-up:


Agreed! Nice! :thumbs-up:

 

Hm. Point taken.

 

Actually my monitor can go up to 100Hz, but still, yes, it does seem to be a tiny loss.

 

I think I'll go ahead and get the bundle then.

 

God knows if I'll actually be able to get the PhysX card to work, though! It sounds very complicated. Some people claim you have to stick resistors in the VGA socket or some such weirdness. Other people say it's unnecessary.

 

Any idea what the truth of the matter is?

 

Oh, and by the way, when one of your x16 slots turns into an x8 slot and you have a 5870 in each slot, what happens to the performance of the card in the remaining x16 slot? Does it drop to match the x8 card? It's not a huge deal if it does, but is that what happens?


Yeah, unfortunately, no matter what each slot runs at individually, if the mobo specs say x8 in CrossFire, then that is what you get with a two-card configuration. For example, if a mobo has 2 PCIe slots, 1 at x16 and 1 at x8, and you run a card in each in CF, they will both run at x8, matching one another!

 

I hope this answers your question. But no matter what config the mobo has for the PCIe slots, you will see a noticeable difference when running CF vs. just 1 card!
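For context on why the x8 hit is so small, assuming the slots run PCIe 2.0 (which is what an i7 930 / X58 board provides), the raw per-direction bandwidth works out like this (illustrative Python):

```python
# PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding, i.e. 500 MB/s per
# lane per direction. Assumes the board's slots run PCIe 2.0.
def pcie2_gbs(lanes: int) -> float:
    """GB/s per direction for a PCIe 2.0 link of the given width."""
    return lanes * 0.5

for lanes in (16, 8):
    print(f"x{lanes}: {pcie2_gbs(lanes):.1f} GB/s per direction")
# x16: 8.0 GB/s, x8: 4.0 GB/s -- and the cards rarely saturate even x8,
# which is why benchmarks show only a small difference.
```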


If your monitor's refresh rate is 75Hz, anything over that is water under the bridge. People brag about high frame rates, but a 75Hz monitor only ever displays 75 FPS. LOL. Your FPS counter can say 500, but the monitor is only showing... 75 FPS.

 

It's not all about what your monitor is running at. Some games behave differently at, say, 500 FPS vs. 1000 FPS, whether you can see it or not. It also depends on whether they're talking about 75 average FPS or something else. There is always plenty of room for improvement until your minimum FPS is over 75 (if you have a 75Hz monitor).
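To illustrate the minimum-FPS point, a small Python sketch with made-up per-scene numbers:

```python
# Why headroom above the refresh rate can still matter: with these
# hypothetical per-scene frame rates, the *minimum* dips below a 75Hz
# refresh even though the average is far above it.
scene_fps = [140, 110, 95, 62, 130]  # hypothetical samples
average = sum(scene_fps) / len(scene_fps)
minimum = min(scene_fps)

print(f"Average: {average:.0f} FPS, minimum: {minimum} FPS")
if minimum < 75:
    print("Faster hardware still helps: the worst scene drops below 75Hz")
```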


Just checking, but do games ever demand GPU-based PhysX to activate certain options, or is it simply a matter of better frame rates if you use the GPU rather than the CPU?

 

Not that I've noticed with my setup.


Not that I've noticed with my setup.

 

Hm, ok, thanks.

 

It was mostly just Metro 2033 that was worrying me.

 

That game is notoriously fickle. Not to mention it supposedly runs like crap no matter what hardware you have. Then again, I won't be using any 3D or anything like that, so maybe it'll be ok.

