8800 series comparisons in games


red930


I was impressed that a 7900GTX SLI or X1900XT/X1950XTX Crossfire setup could keep up with the 8800GTX.

 

And then there's the question of just how well Crossfire actually runs in the games in question. You don't hear these reviewers talking about sitting down for hours to actually play these games to get these benchmarks.

 

I've got to say I'm very disappointed in Crossfire after having used it for many, many months (very nice X1900XTs as per my sig, on 3 different boards now).

 

Between weak drivers, bloated drivers, the bloated .NET requirement, and weak game support... I can't wait to go back to SLI (or a single 8800, lol).

 

I remember reading all these great benchmarks about SLI when it was still in its early days, thinking "wow, they're getting great fps!" and then reality struck me: SLI really kinda sucked for a long time, until the drivers and the game support matured.

 

I mean, I could run all the benches and the timedemos in the games (just like I do right now in Crossfire) no problem, and they are impressive.

 

But actually sit down for two hours or more, as if you were REALLY playing the game (i.e. like I do when I actually... play a game), and you quickly tire of the nonsense Crossfire brings. Not to mention the extra 150MB or so it loads at startup just for .NET + CCC (you do have the option not to use CCC, but then you can't do Crossfire...).

 

SLI used to be just as bad except for the bloatware part.

 

Now SLI actually works quite well, and only idiot game devs like EA refuse to test it thoroughly (or to allow widescreen, etc., without hacking up some config file).

 

I pray that ATI will abandon this BS with .NET and let us run Crossfire without the bloated drivers... but mostly I just hope ATI fixes their drivers and gets more game devs to come on board, because if I'd had to pay for these two X1900XT Crossfire cards... oh man, I would be PISSED OFF (just like I would have been if I'd actually had to buy 6600GT SLI cards when they first came out, and those were nowhere near as expensive as my X1900XTs).

 

/end-rant


Yep, I agree with you.

 

Of course, I never experienced the "games don't support it" thing with my SLI, because when I first got SLI (when I bought my second 7800GTX 512MB back in the day, 750 freakin' bucks) the game I got to play on the setup was F.E.A.R. ;) which loooooooves SLI, LOL.

 

Even with my ol' 7800 512 setup, that game's scaling with SLI was unmistakable (with an FX-55, and even more so with an Opteron 165 @ 2.4/2.65GHz).

 

As I've been saying a lot lately, I wish someone would send me their 939/Crossfire board and X1900s to compare with my setup. That would be a real put-up-or-shut-up situation (for both me and them), since I'd be running both with the same processor and RAM and seeing which really performs better. (I have no doubt I'd win; I've seen real-world gaming numbers from many games comparing X1900 Crossfire and 7900 SLI, and SLI wins almost every time. Very, VERY few times does it lose, and then it's by a nose hair.)

 

Although in reality that argument (for me at least) is now COMPLETELY obsolete, as the 8800 wipes the floor with anything out on the market, NVIDIA, ATI or otherwise. Any game in which the performance increase is marginal comes down to either A: sketchy driver support, B: the game not really supporting the card, or C: the game just being plain broken.


I agree that SLI was awful at first. I ran two 7800s... eVGA's highest-clocked ones... in SLI... but had numerous problems with games. Had to set up configs for each one, got a lot of artifacts, and worse. Haven't run it since. The 8800 should eliminate any need until I do my next CPU build in 6 months or so...


It's all about Vista too; a new graphics card means a new operating system.

 

This isn't true. Yes, the 8800 is DX10 silicon, and yes, if you want to run DX10 natively when you make the move to Vista, then you need a DX10 card.

 

However, for a start, the Vista Aero GUI uses DX9, and any DX9 card will run Vista and DX10 software just fine. I've tested it enough on my X1800 to know this.

As far as the 8800 goes, don't think of it as a necessity for Vista; that's what marketing wants you to think. Instead, think of it as a really, really fast DX9 card which also future-proofs you for DX10 games 8 months down the line.


This shouldn't come as a surprise. There's more difference between the two than clock speeds. The difference in benchmarks will be influenced by in-game settings and resolutions.


This isn't true. Yes, the 8800 is DX10 silicon, and yes, if you want to run DX10 natively when you make the move to Vista, then you need a DX10 card.

 

However, for a start, the Vista Aero GUI uses DX9, and any DX9 card will run Vista and DX10 software just fine. I've tested it enough on my X1800 to know this.

As far as the 8800 goes, don't think of it as a necessity for Vista; that's what marketing wants you to think. Instead, think of it as a really, really fast DX9 card which also future-proofs you for DX10 games 8 months down the line.

 

8 months? What part of someone's butt did you pull that out of? LOL J/K

 

But seriously, DX10 is not 8 months down the road; it's more like... 2.


Guest RohypnoL
Here's a new twist on an overclocked 8800GTS. Even overclocked to 647/942, the GTS still couldn't keep up with a stock GTX in F.E.A.R. This is way different from the earlier reviews they did on the GTS.

 

http://www.firingsquad.com/hardware/nvidia...cking/page9.asp

 

That's because the GTX has 32 more shaders than the GTS (128 on the GTX vs. only 96 on the GTS), and the GTX's shaders are also clocked higher.
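Some quick napkin math on that, for anyone curious why the core overclock alone can't close the gap. This is only a sketch: I'm using the commonly quoted stock shader clocks (~1350MHz on the GTX, ~1200MHz on the GTS) and assuming the 647MHz core OC pulls the GTS shader domain up proportionally, to roughly 1550MHz, which may not hold exactly on every card:

```python
# Rough shader-throughput comparison for the 8800GTX vs an overclocked 8800GTS.
# Stock shader clocks are the commonly quoted figures; the OC'd GTS shader clock
# is an estimate that assumes the shader domain scales with the 647MHz core OC.

def shader_throughput(shader_count, shader_mhz):
    """Relative throughput in shader-count x MHz (arbitrary units)."""
    return shader_count * shader_mhz

gtx_stock = shader_throughput(128, 1350)  # 8800GTX: 128 stream processors @ ~1350MHz
gts_stock = shader_throughput(96, 1200)   # 8800GTS: 96 stream processors @ ~1200MHz
gts_oc    = shader_throughput(96, 1550)   # 8800GTS @ 647 core, shader domain ~1550MHz (estimate)

print(f"GTX stock: {gtx_stock}")
print(f"GTS stock: {gts_stock} ({gts_stock / gtx_stock:.0%} of GTX)")
print(f"GTS OC:    {gts_oc} ({gts_oc / gtx_stock:.0%} of GTX)")
# Even the heavily OC'd GTS only reaches about 86% of the stock GTX's raw shader
# throughput, before counting the GTX's extra ROPs and wider 384-bit memory bus.
```

If F.E.A.R. is leaning on the shaders (which it certainly seems to), that lines up with the numbers in that link.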


It's not like the old days, boys.

 

The GT (GTS now) and the GTX are not two peas in a pod like they have been in the past. It's not like you can OC a GTS to GTX speeds and keep up with it; they are NOT the same card.

 

A 7800GT, OCed enough, will keep up with a 7800GTX or pass it. I think NVIDIA got sick of this "problem" and decided to solve it with this latest series.

 

Frankly I don't blame them.
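To put some rough (and I do mean rough) numbers on why the old GT-catches-GTX trick worked on the 7800s but not here, this is the same kind of napkin math, using the commonly quoted stock specs; treat it as a sketch, not a benchmark:

```python
# Why an overclocked 7800GT could catch a 7800GTX, but an overclocked 8800GTS
# can't catch an 8800GTX. Unit counts and clocks are the commonly quoted stock
# specs; this is napkin math, not a benchmark.

def extra_clock_needed(units_small, clock_small, units_big, clock_big):
    """Extra clock (as a fraction) the smaller card needs to match the bigger card's raw units x clock."""
    return (units_big * clock_big) / (units_small * clock_small) - 1.0

# 7800GT: 20 pixel pipelines @ 400MHz vs 7800GTX: 24 pipelines @ 430MHz
gap_7800 = extra_clock_needed(20, 400, 24, 430)

# 8800GTS: 96 stream processors @ ~1200MHz vs 8800GTX: 128 @ ~1350MHz
gap_8800 = extra_clock_needed(96, 1200, 128, 1350)

print(f"7800GT needs ~{gap_7800:.0%} more core clock to match a stock 7800GTX")     # ~29%
print(f"8800GTS needs ~{gap_8800:.0%} more shader clock to match a stock 8800GTX")  # ~50%
# ~29% on a 7800GT core (400 -> ~516MHz) was at least within reach on a good card;
# ~50% on the GTS shader domain isn't, and that's before the GTX's extra ROPs and
# 384-bit bus even come into play.
```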

