Can you compare performance by MHz between chipset makers?


FataLFlaw

I remember a few years ago when AMD first started using their model-numbering system so that consumers could better compare their chips with Intel's. Whether or not this was just a marketing tactic, the argument was that you cannot compare performance based solely on clock speed.

 

Does this same logic apply to video cards? Given two cards with the same amount of memory, will the one with the higher clock speed perform better? I hope this doesn't turn into a fanboy war between ATI and Nvidia; I'm just attempting to understand this better.


No, you can't. The Intel vs. AMD benchmarks boiled down to "Intel wins on desktop work, AMD wins on gaming".

 

The same can (or could; I haven't kept up to date on this) be said about ATI & Nvidia: "ATI wins on Direct3D, Nvidia on OpenGL". Remember the reviews using Half-Life 2 and Quake 4? ;)

 

So no, to me MHz doesn't say it all. Utilization of the technology is more important...
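That point can be sketched with a toy model: what matters is clock rate times work done per clock, not clock rate alone. The numbers below are made up for illustration, not real chip specs.

```python
# Toy model: performance = clock rate x work done per cycle.
# All figures are hypothetical, purely to illustrate the argument.

def throughput(clock_mhz, work_per_clock):
    """Operations per second = clock rate (Hz) x operations per cycle."""
    return clock_mhz * 1e6 * work_per_clock

chip_a = throughput(3000, 1.0)  # higher clock, less work per cycle
chip_b = throughput(2000, 2.0)  # lower clock, more work per cycle

print(chip_b > chip_a)  # the chip with fewer MHz comes out ahead
```

So a 2000 MHz part doing twice the work per cycle beats a 3000 MHz part, which is exactly why the raw MHz number can't settle the comparison on its own.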

 

edit: CPDMF beat me to it... LoL


hehe, some megahertz are tiny... while others are HUGE! :D

 

I would take a look at AnandTech or the infamous Tom's Hardware Guide to see who is who in different game tests... and even then it's difficult to understand the results, or actually believe them.


At one point in my life I always thought of nVidia as the "safe" choice with great performance, while ATI was only for those willing to tinker forever with their rigs. Well, even though I am constantly tinkering with my rig, I have found that is NOT the case for either. I have had driver and hardware problems with nVidia chipsets that are supposedly exclusive to ATI, and I have had "tinkerless" (oh, that's a funny word) success with ATI.

 

In the long run, the difference in performance that the average user (98%+ of us) will see between the two platforms is so nominal that I cannot see the worth of clamoring over which platform to actually end up with.

 

AMD or Intel, ATI or nVidia, Norton or McAfee, Windows or Mac (ugh, go RH), air cooling or H2O, plastic or paper. There are infinite battles to partake in, and you simply need to choose which side you're on. :) Both sides win constantly, so it isn't that hard of a decision. ;)

 

I personally do not dislike nVidia, nor do I fail to understand why anyone would go with them. Rather, I love the ATI chipsets so much right now that I am on their side. Purely personal preference.

 

And on a side note about speeds: when I OC'd my board's memory to 850 MHz (1700 effective), I noticed my 3DMark06 scores went down compared to the 825 (1650) setting. As I understand it, this is inherently due to the memory chips' architecture. I am performing far better in 3DMark06 at my current 909 (1818) setting than at anything higher. The same goes for the core settings. Basically, I believe this negates the argument that a higher clock speed means a better card; so many other factors have an effect on performance.
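One common explanation for that kind of result (a guess on my part, not something from the post above) is that pushing the memory clock higher can force looser timings, i.e. more cycles per access, so each access actually takes longer. A toy sketch with made-up figures:

```python
# Toy model: a higher memory clock can still mean slower accesses if the
# chips need looser timings (more cycles per access) to stay stable.
# The clock/timing figures below are hypothetical, for illustration only.

def access_time_ns(clock_mhz, cas_cycles):
    """Time for one access in ns = cycles needed / clock rate."""
    return cas_cycles / (clock_mhz * 1e6) * 1e9

tight = access_time_ns(825, 4)   # 825 MHz with tight timings
loose = access_time_ns(850, 5)   # 850 MHz, but forced to looser timings

print(loose > tight)  # higher clock, yet slower per access
```

Which would line up with the benchmark scores dropping at the "faster" memory setting.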

 

boo

