
R600 < 8800gtx :0



My initial disappointment came from reading the [H] review, which was negative. On some level, I realize that the drivers are immature and will improve over time. Reading the tech reviews of the GPU architecture, it sounds very promising; ATI/AMD have taken a somewhat different route than nVidia. What I did find disappointing was the claim that the GPU uses more power and generates more heat because of current leakage.
[H] would have had a hard time finding a less meaningful method for testing a high-end card.

 

3DMark, ShaderMark, Quake 4: know what all these have in common? They are not good enough for [H] to report in normal card testing, but somehow they are good enough for power testing. How does that make sense? Doing a hot-box test is a real straw-man ploy for a high-end card. 55-degree case temps?!? My processor doesn't get that hot. Maybe that would be a good test for an HTPC with only one fan, but oddly enough they don't do this to the video cards that are more likely to be used in those conditions. I guess it is only important when it proves the point you want to make.

 

Here is Tom's quote:

 

For starters, R600 drew less overall system power than GeForce 8800GTX in the game tests.

 

What does this say? Not much; the difference in power use was smaller than the margin of error of the testing equipment. Throw in testing variables and the results mean jack. Tom's realized this and, sadly enough, [H] did not.
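To make that concrete, here is a tiny sketch of the reasoning, using hypothetical wattages and a hypothetical ±2% meter accuracy, since neither review gives the actual figures:

```python
# Hypothetical numbers for illustration only; the reviews don't publish their meter's accuracy.
def within_error(reading_a_w, reading_b_w, meter_accuracy=0.02):
    """True if the measured difference fits inside the two readings' combined error bands."""
    combined_error = (reading_a_w + reading_b_w) * meter_accuracy
    return abs(reading_a_w - reading_b_w) <= combined_error

# e.g. 318 W vs 326 W at the wall, measured with a +/-2% power meter
print(within_error(318, 326))  # True: the 8 W gap is smaller than the ~12.9 W combined uncertainty
```

If that returns True, the two systems are indistinguishable on that test, which is the point.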

 

I am not a Tom's fan; usually their tests tend to favor Nvidia and Intel, though it looks like they are trying to be fair here.

 

I am not saying this card is the best thing in the world, because it isn't. It is nearly impossible to say where it will go right now; the old 8800s were a disappointment too, but they were a lot more expensive.


Guest LookBackX2

The two commonalities between all the reviews I've read, and just about everything everyone has commented on, are these: the drivers are immature, and it's six months late.

 

Take just about any review of an ATI card release since the X800, save for the X19xx cards, replace the model numbers with 2900, and it will read the same. It just seems like déjà vu.

 

Personally, I think ATI cards have leaned toward better image quality, while Nvidia cards have retained higher frame rates. Both companies have different approaches to architecture. As long as I get a steady 60 fps, since I don't like my screen to tear constantly, the card has performed as I would like on the fps front, and it seems any mid- or high-end card is going to do that for me. What I would like to read is a review of image quality, and the actual change in quality realized by changing the settings, especially since each company goes about creating eye candy a little differently than the other.

 

What does raw performance really matter if 90% of the users have vertical sync on?
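As a rough illustration of why, here is a minimal sketch assuming classic double-buffered vsync on a 60 Hz panel (an assumption on my part; triple buffering and adaptive vsync behave differently):

```python
import math

def vsync_fps(raw_fps, refresh_hz=60):
    """Frame rate actually delivered with double-buffered vsync:
    anything above the refresh rate is capped, and anything below
    snaps down to the next whole divisor of the refresh rate."""
    if raw_fps >= refresh_hz:
        return float(refresh_hz)
    return refresh_hz / math.ceil(refresh_hz / raw_fps)

for raw in (45, 59, 75, 120):
    print(raw, "->", vsync_fps(raw))   # 45 -> 30.0, 59 -> 30.0, 75 -> 60.0, 120 -> 60.0
```

With vsync on, a card pushing 120 fps and a card pushing 75 fps both show the user exactly 60.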

 

Image quality and a steady frame rate have more impact on product enjoyment for most end users than raw fps ever will.


I bought one at release and it works fine, even with early drivers. I am using a 21" Sceptre LCD at 1680x1050, and it runs all my DX9 games at max settings just fine, although I don't have FEAR or Quake 4, so I don't know how well it can push some of the newest games. It does great with Doom 3 and Oblivion. Go to techpowerup http://www.techpowerup.com/reviewdb/Video%...s/ATI/HD%202900 and you'll see not all the reviews are totally negative. Plus the bundle with my Sapphire version was good: it had Valve's Black Box with three free games via Steam (Half-Life 2: Episode Two, Team Fortress 2, and Portal; I haven't used it yet) and Cyberlink PowerDVD (which I already have, though). I got it from Newegg in a combo deal with a Logitech G5 mouse for $419. If you take the bundle and the mouse into account, it really wasn't priced too badly. It has internal CrossFire, so the dongle is gone. Overall I am happy with the purchase, but it does use a different architecture than Nvidia, so we will have to wait for DX10 games to see if it's a total flop. I wish it used GDDR4, but at least it has a 512-bit memory interface.
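As a rough back-of-the-envelope sketch of why that 512-bit bus still matters even on GDDR3 (the effective memory clocks below are approximate figures assumed for illustration, not from the post):

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth = bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

print(round(bandwidth_gb_s(512, 1650), 1))  # HD 2900 XT, 512-bit GDDR3 @ ~1650 MHz effective: ~105.6 GB/s
print(round(bandwidth_gb_s(384, 1800), 1))  # 8800 GTX,   384-bit GDDR3 @ ~1800 MHz effective: ~86.4 GB/s
```

The wider bus buys the bandwidth that faster GDDR4 would otherwise have to provide.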

