
C2D vs 4400+ A must read!


-CsA-TAZ


I agree with you completely, Happy ;)

 

I really wish I had my hands on a Crossfire/FX-60 system so I could post some comparisons. Or at the very least an Athlon 64 X2 that's runnin' the same speed I am. 'Course, with my RAM the way it is now, good luck on that to anyone thinking of trying ;)

 

I do definitely agree with you that SLI increases are more noticeable in games. It's like the links I posted to another thread earlier this week, I think, which showed Crossfire vs. SLI (X1900s vs. 7900GTXs); granted, they were run with an FX-57, but SLI whipped butt in FEAR at 1920x1200. The only resolution where I believe they got around the same performance was 1024x768... and seriously, who the hell games at that resolution anymore?

 

Even my friend with his lowly 3200+, 1GB of RAM, and 6800XT games at 1280x1024 or 1600x1200, LOL. 'Course, he plays mostly Doom 3, although I recently got him into FEAR, which is kickin' his (and his computer's) butt.

 

This is another reason I wish Oblivion had a darn test built into it like FEAR does, because I think it'd show how truly crippling modern graphics are. I mean, plain and simple, look at my specs: this is no lightweight system, yet if I turn EVERYTHING up (even grass, which I never use 'cuz I hate not being able to see stuff I drop on the ground), when it's raining and I'm outdoors, my frame rate occasionally drops into the teens.
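If you ever want rough numbers from a game with no built-in test, one crude substitute is logging frame times (FRAPS can dump them) and crunching the min/avg yourself. A minimal sketch, assuming a hypothetical log format of one frame time in milliseconds per line (adapt the parsing to whatever your logging tool actually writes):

```python
# Minimal sketch: turn a frame-time log into FPS numbers.
# Assumes a hypothetical plain-text log with one frame time in
# milliseconds per line.
import sys

def fps_stats(path):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    fps = sorted(1000.0 / ms for ms in frame_ms if ms > 0)
    return len(fps), fps[0], sum(fps) / len(fps), fps[-1]

if __name__ == "__main__":
    frames, lo, avg, hi = fps_stats(sys.argv[1])
    print(f"{frames} frames: min {lo:.1f} / avg {avg:.1f} / max {hi:.1f} FPS")
```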

 

People always say, "Well, ATI can do HDR and AA at the same time, SLI can't"... okay, explain this to me:

 

If ATI Crossfire ALREADY performs WORSE than SLI and I'm gettin' frame rates in the mid-teens...

 

WHY would you want to make the cards do MORE and drop your frame rate even further? Seriously?

 

That's my two cents.

 

Also, I do agree with you that the whole master card thing sucks, although I'll give ATI credit on one point: you can mix and match cards (to a certain extent), whereas with SLI you can't even mix a 7800GTX 512MB with a 7800GTX 256MB (you can pair some cards with different memory sizes, but those two in particular are far more different on the NVIDIA evolutionary scale than they'd like you to believe). That's nice, and it cuts down on cost: you could start with an X1800 and an X1900 master card, then buy another X1900 later, at least that's the way I understand it.

 

And ATI has overclocking potential whereas Nvidia manufacturers lately have been putting out cards already maxed out on speed (unless you volt mod).

 

But when it comes down to it, I want raw, pure, outta the box performance. And I have to say I'm very happy with what I've gotten outta these 7900GTXs, and I was also very happy with my 7800GTX 512MBs before I sold them to get my 7900s.

 

I can't wait to see what the 8800s can do; I'm sure they'll own anything out there, although I do think the 7950GX2s will at least keep pace with them in some things (two graphics cores versus one, you'd expect no less).

 

And to be honest, I just thought of something today... You know how Intel always used to put out these insane speeds for processors (3.4, 3.6, 3.8)... why all of a sudden did they stop that? I think my crazy mind may've come up with the answer. What if they didn't? What if (this is gonna sound very AMD-prejudiced) they got tired of AMD beating them with lower clock speeds, so they decided, "You know what, we're gonna put out a 3.8GHz processor, but sell it set to 2.4GHz, so people think they're gettin' a great deal"?

 

LMAO, that popped into my head today, funny huh? Talk about conspiracy theories. To be honest, if I were Intel, I'd've done just that, LOL. Either way, Conroes ARE better than anything else currently on the market, in real-world use and everywhere else... but, to be blunt, what do you expect? Socket 939 (I'm not mentioning AM2 because I believe it's the biggest worthless update in recent computer-world history, for reasons I won't go into here) is how old? 2 years? Granted, Socket 775, which the Core 2 Duo fits in, is equally as old, but they redesigned the chip itself from the ground up. If AMD had done the same thing, I'm sure they'd have a great product right now too; unfortunately they decided to be stupid and go for AM2/DDR2 (I call this stupid because I've seen benchmarks from a stock FX-62 vs. an FX-60 clocked to 2.8GHz. There's almost no difference in performance, and seeing as the RAM they were testing was DDR2-800 in the FX-62 system vs. DDR400 in the FX-60 system... well, I think you see my point).
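For anyone who wants to sanity-check the memory side of that comparison themselves, a crude trick is timing a big buffer copy. This is only a rough sketch, not a proper memory benchmark (dedicated tools are far more rigorous), but it shows gross differences between setups:

```python
# Crude memory-bandwidth sanity check: time a big buffer copy.
import time

SIZE_MB = 256
buf = bytearray(SIZE_MB * 1024 * 1024)

start = time.perf_counter()
copy = bytes(buf)  # one full read pass plus one full write pass
elapsed = time.perf_counter() - start

# 2x because the copy both reads and writes SIZE_MB megabytes.
print(f"copied {SIZE_MB} MB in {elapsed:.3f}s "
      f"(~{2 * SIZE_MB / elapsed:.0f} MB/s effective)")
```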

 

I hope they've learned their lesson and they'll come out ahead when quad cores hit. We'll see. Me? I'll be waiting for AMD's regardless. And then, IF Intel is still on top after AMD's quad cores hit, then MAYBE I'll CONSIDER getting an Intel.

 

Until then, me and my FX-60/Opty 165 will be playin' Halo (and FEAR and Oblivion as soon as I get 2 more sticks of this PC4800 so I can have 2GB of RAM again), and enjoying every moment of it ;)

 

::steps off the proverbial soap box after taking a deep closing breath:: Haven't typed that much in a single post in a loooooooooooong time.


Well, I got to say that Crossfire... ain't all everyone wishes it was.

 

It rocks real nice in 3D benchmarks... but I'm honestly beginning to hate it in real games, since I play a good amount of games and in every game so far I get better performance with a single card.

 

But as you have often said, you need to be running 1600x1200 before it starts to show its true potential.

 

NWN2 is a prime example... it runs like butt with Crossfire on (6.10, the latest drivers), but disable Crossfire and a single card runs it like a champ (and then there's momma's rig with its lowly 7800GT, and it seems to run it better than my rig for some reason).

 

Hmm, I always find this with Aquamark3; my 7800GT always did as well as or better in this benchie than any of my ATi cards!

 

 

I mean, if Crossfire works like a champ and shows scores of 40k in 3DMark03 and 160k in Aquamark, but won't run my NWN any faster than my gf's 7800GT... what kind of results are those?

 

True!

 

And if you are going to compare stuff, you might want to show people some results, not just type them. If you want examples, look at the top stickies in the AMD - Overclocking section.

 

LOL, OK, OK, will do, but like I have stated below, I have jumped the gun and dropped a boll*** :sad: :eek: :sad:

 

Hey CsA, why don't you try clockin' that Duo around the same as the X2 (2.6-ish) and see what the difference is... I'd love to see those results. Very tempted to have a comparison with Old Guy even though I know I'll lose :) LOL

 

Hmm, well, it may have been premature of me to throw those results up there, as it now appears to be a problem on my end: no matter what I set my clock speeds to, the benchies always hit around the same score, so it is now obvious that I'm doing something wrong in my BIOS, and I apologise for jumping the gun on this. However, at stock out of the box, the 3DMark scores are only a little better than my AMD @ 2.4, which only scores 300 points less in 3DMark06 than the C2D @ 2.4. So once I get to the bottom of this problem I will report back and eat some humble pie! :sad: :O


@taz: and perhaps you should use more CPU-intensive applications when testing two CPUs at the same speed (forget about points in 3DMark, or check the CPU scores, not the blend)...

 

Try encoding/ripping/rendering, and to get a clearer picture you need to test big files. Encoding 8MB of data will give you a difference of a fraction of a second, but try 800MB of data >>> you will see the pure computing power of the C2Ds!
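To make that concrete, here's a minimal sketch of the small-versus-big workload point, using stdlib zlib at max compression as a stand-in CPU-bound job (an assumption on my part; a real test would run your actual encoder on real files):

```python
# Sketch: the same CPU-bound job at a small and a big size, to show
# why tiny workloads hide CPU differences.
import time
import zlib

def time_compress(size_mb):
    data = b"sample benchmark data " * (size_mb * 1024 * 1024 // 22)
    start = time.perf_counter()
    zlib.compress(data, 9)  # level 9 = maximum CPU effort
    return time.perf_counter() - start

for size in (8, 800):
    print(f"{size:4d} MB compressed in {time_compress(size):.2f}s")
```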


Happy, perhaps you could post about your experiences with Crossfire over in the video cards forum? I've never really heard of many problems with it and would like to know more before I invest the $ in getting a Crossfire setup.

 

And I'm sure others would be interested in reading it as well, as I can't be the only person looking to do a Crossfire setup now that the C2D chips are out.



 

Well... I'm hesitant to go on a rant, because others seem to enjoy their Crossfire setups just dandy...

 

And I finally got a couple of 7950GTs for SLI, so I'm gonna really do some testing to see exactly how good SLI is these days (because previously I only had 6600GTs for SLI).

 

I'm just in a foul mood so don't pay any attention to me.



 

Cuh, there's you ranting on and complaining about your top-end system when I'd give my left leg for it (OK, probably not...).

 

I'd be happy to take that second X1900XT off your hands, believe me ;)



 

Yea, I hear what you're saying, but I had it running @ 3.6 today and it was still not much better than @ 2.4. I will try some other benchies, like the built-in one in FEAR, etc. There are a few settings in my BIOS that I'm unsure about; maybe you can point me in the right direction to another site/forum where I can get some more info on this mobo! Thx!

 

P.S. What monitoring software do you use for temps etc.?


Taz: I am using Core Temp (as can be seen in the screenshots in my sig). I want to believe the readings are correct. In the worst case, we know that C2Ds don't throttle until the 80s, heh.
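If anyone would rather log the readings from a script instead of watching a GUI, here's a sketch using psutil's sensors_temperatures(); the assumption is a platform that exposes sensors to psutil (e.g. Linux), so on Windows, Core Temp / Everest stay the practical choice:

```python
# Sketch: polling temperature sensors with psutil (Linux-style
# platforms only; not a replacement for Core Temp on Windows).
import time
import psutil

for _ in range(5):
    for chip, readings in psutil.sensors_temperatures().items():
        for r in readings:
            print(f"{chip}/{r.label or 'sensor'}: {r.current:.1f} C")
    time.sleep(2)
```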

 

And for BIOS settings, perhaps my other thread in this section might help you. Otherwise, Google, my friend, is your best bet.

 

Furthermore, I know that there are at least 2 big threads on XS (but currently I am not in love with XS...). VR-Zone, pcmoddingmalaysia, and bleedinedge might also help...


Try Everest too, though in some cases it might not read correctly (I'm using a P5B right now and apparently I'm pushing 2.7V through my processor :rolleyes:).

 

And yeah, I highly recommend doing a DivX conversion to see it shine. If you haven't done it before, try AutoGordianKnot (I think that's what it's called, it's been a while), as it's simple. ;)
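If you want the conversion timed from a script rather than clicked through a GUI, here's a sketch that uses ffmpeg as a scriptable stand-in (an assumption on my part, since AutoGordianKnot itself is GUI-only; input.avi is a placeholder name):

```python
# Sketch: timing a video encode from a script. Needs ffmpeg on the
# PATH; ffmpeg is a stand-in, not what the poster actually used.
import subprocess
import time

start = time.perf_counter()
subprocess.run(
    ["ffmpeg", "-y", "-i", "input.avi", "-c:v", "libx264", "out.mp4"],
    check=True,
)
print(f"encode finished in {time.perf_counter() - start:.1f}s")
```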

