
2900XT can't take the top spot



[H] did use the most current drivers at the time of their review, which was 8.374, but 8.38 came out over the weekend. Maybe they will update their article with it, but like I and many others have said before, the drivers are still immature. I believe a few cards from both NV and ATI had zero driver support for a month or two when they launched new or as a refresh. At least be happy that there are some drivers for these new cards and that ATI is working on putting out better ones, instead of just saying "use driver version x for now."

 

I hope I don't sound like an ATI fanboy, 'cause I'm not. :D



Whoa, whoa. Calm down there, guy; no need for name-calling.

 

I should have made it more obvious that I would have gotten two for CrossFire.

 

lol

 

I thought about getting two, then decided that was just not necessary ;)

 

I also thought about ordering 23, since I could afford 23 of them, but then realized I just didn't want to, heh.

 

Don't worry, I have the same thoughts as you and everyone else... I'm not self-righteous or perfect by any means. I just get a little irritated at guys who go overboard by ordering 50 of something, not to give away to friends to enjoy, but to make money at a company's expense (not that I'm any better for ordering a single one to enjoy at the company's expense for their error).

 

I almost ordered two, one for me and one for my best buddy, but the honest truth is that they will send us an email claiming it was an error: "so sorry, nice try" ;)

 

But I had to try for at least one!

 

I came to the conclusion yesterday that the 2900XT will be fixed with drivers. How long that is going to take is what I'm not sure about. Right now I am wondering if I should trust my instincts and go with the 2900XT or play it safe and go with the 8800GTS 640. I've never owned an ATI card before, but there is just something about this one (maybe it's the sexy design... just kidding). The Valve Black Box thing is a big plus for me too, 'cause I plan on buying that when it comes out.

 

If you've got to have a killer card NOW, get the 8800GTS. I have the 320MB version, and it plays all the way up to 1600x1200 with 8xAA/16xAF enabled in the HL2-engine games, and 4xAA/16xAF in things like FEAR, BF2, CoH, etc.

 

However, if you REALLY want those Valve games (and you know you do, we all do!), and you're already in need of a vid card upgrade, then get the 2900XT and enjoy it, along with the sometimes-hassle, sometimes-treasure of maturing drivers and hopefully-quick releases ;)


[H] did use the most current drivers at the time of their review, which was 8.374, but 8.38 came out over the weekend. Maybe they will update their article with it, but like I and many others have said before, the drivers are still immature. I believe a few cards from both NV and ATI had zero driver support for a month or two when they launched new or as a refresh. At least be happy that there are some drivers for these new cards and that ATI is working on putting out better ones, instead of just saying "use driver version x for now."

 

I hope I don't sound like an ATI fanboy, 'cause I'm not. :D

 

Nah, I know you ain't a fanboy. I do think you are correct... the driving demand for new and better tech/performance pushes these companies to release products before the drivers/software are mature, just to gain the upper hand and hold it for as long as possible before it shifts back to "the other guy."


Guest JustinSane
However, if you REALLY want those Valve games (and you know you do, we all do!), and you're already in need of a vid card upgrade, then get the 2900XT and enjoy it, along with the sometimes-hassle, sometimes-treasure of maturing drivers and hopefully-quick releases ;)

 

Yeah, I am in need of a new card, so anything new is gonna kick this 7800 GT's butt. I have a good feeling about the 2900XT. Thanks for the reply.


[H] had the latest drivers.

 

 

If they had them, they did not use them; they were already behind by two driver revisions by launch date.

 

Second question: how did they not notice that the FPS is higher in some games with AA on than off? That would make me really want to redo my tests!

 

I would second AG's statement: go back and look at the FPS of the 8800 series at launch. Even though they don't have good Vista drivers yet, the performance is worlds better with updated drivers and patched games compared to the first scores.

 

One thing to note: power usage is going to be driver-controlled too. Once the card stops spinning its wheels doing work it doesn't put to good use, power draw can go down. I noted that a couple of reviews using the more recent drivers showed less power than the 8800s at idle; I don't know if that is chance or what. Many review sites did not even bother to publish the driver versions they used, and that can make a 50% or more difference in scores.


Kinc, not resting after setting the 3DMark05 single-card world record using the 2900XT cooled by cascade, attempted to do the same in 3DMark03 using liquid nitrogen.

 

LN2 at atmospheric pressure with no load sits at -196C. Under the load of the 2900XT, the best that could be done was a run at -70C.

 

The mousepot used was a rev 1. Jason has done several revisions since, so lower temps may be obtainable. But this clearly shows the cooling problem AMD/ATI has faced in trying to get this card to market.

 

http://www.nordichardware.com/news,6331.html


Guest SPQQKY
Kinc, not resting after setting the 3DMark05 single-card world record using the 2900XT cooled by cascade, attempted to do the same in 3DMark03 using liquid nitrogen.

 

LN2 at atmospheric pressure with no load sits at -196C. Under the load of the 2900XT, the best that could be done was a run at -70C.

 

The mousepot used was a rev 1. Jason has done several revisions since, so lower temps may be obtainable. But this clearly shows the cooling problem AMD/ATI has faced in trying to get this card to market.

 

http://www.nordichardware.com/news,6331.html

 

Yeah, I guess you could hear the humming between the hot core and the LN2-filled pot.

Since this has turned into a flame-SPQQKY thread, I guess I will say no more. But you know, guys, I'm not the only one disappointed in this card, and I never said I was an nVidia fanboy. I only posted what I found on the net, and reviews and end users agreed the game performance (not benchmark performance) el suxored. If only I were the Great Karnak, I could have dispelled all that I read and foreseen the future of this card; then I wouldn't need to read all that stuff (it just goes to show reading is for losers, wait for the ATi movie). I guess the hype that has surrounded this thing for sooooooo long had me expecting much more from it.

I guess we can speculate on new drivers; hell, we speculated about the card itself, and now, with my all-knowing Karnak powers, I see in the future... well, just more BS.

So I say let's just stop developing hardware and let the software catch up. We had 32-bit CPUs for about 7 years before we had a 32-bit OS. We have hardware that is not being used to its full potential because the software development just isn't there. I mean, look at some games out now: terrible, if you ask me. Some are such system hogs that even the latest hardware can't keep up, and they don't look all that great, whereas other games look great and run fluidly.


There is nothing wrong with the 2900XT. So far it has shown itself to be an excellent upper mid-range card. After all, Kinc got a 75% overclock out of it. What his run illustrates is that the technology is levels above the chip itself.

 

If I were nVidia, the 2900-series cards aren't what would concern me. Look closely at these extreme overclocks. The technology is solid; it's the silicon that is weak. A couple of chip revisions to get current leakage down and address one or two other issues, and we could be seeing some cards that will astound everyone.


I came to the conclusion yesterday that the 2900XT will be fixed with drivers. How long that is going to take is what I'm not sure about. Right now I am wondering if I should trust my instincts and go with the 2900XT or play it safe and go with the 8800GTS 640.
One thing drivers won't fix is that insane power consumption :beer


The silicon shrink will certainly help with the power, which leaves more room for upping the volts and the clocks, but that will apply to NVidia's chips when they get shrunk too.

 

Also, the XT is not the "flagship" card; that's the XTX, which has yet to sail out of the OEM fog, if it ever will.

 

As to hardware not being utilised: I would say that only applies to CPU-related processing, as games aren't inherently multithreadable and the development tools don't fully support it. GPUs, on the other hand, can easily be swamped (fully utilised) by the simple act of using a bigger monitor or adding another one.

 

So at what point will there be a resolution high enough that the eye can't distinguish between it and reality (assuming you are sat at a reasonable distance without a magnifying glass)?


Yeah, I guess you could hear the humming between the hot core and the LN2-filled pot.

Since this has turned into a flame-SPQQKY thread, I guess I will say no more. But you know, guys, I'm not the only one disappointed in this card, and I never said I was an nVidia fanboy. I only posted what I found on the net, and reviews and end users agreed the game performance (not benchmark performance) el suxored. If only I were the Great Karnak, I could have dispelled all that I read and foreseen the future of this card; then I wouldn't need to read all that stuff (it just goes to show reading is for losers, wait for the ATi movie). I guess the hype that has surrounded this thing for sooooooo long had me expecting much more from it.

I guess we can speculate on new drivers; hell, we speculated about the card itself, and now, with my all-knowing Karnak powers, I see in the future... well, just more BS.

So I say let's just stop developing hardware and let the software catch up. We had 32-bit CPUs for about 7 years before we had a 32-bit OS. We have hardware that is not being used to its full potential because the software development just isn't there. I mean, look at some games out now: terrible, if you ask me. Some are such system hogs that even the latest hardware can't keep up, and they don't look all that great, whereas other games look great and run fluidly.

Nah, it isn't a "flame SPQQKY" thread at all. It's more of the same you've always heard from me, which is basically "think about it from more than just your own opinion's angle."

 

There are forces at work in these pieces of technology that even the best of us don't understand, and the majority really don't even understand the "why" of something. They have been fed review-site speculation nonsense for years, so when tech sites throw up some benchmarks and piss and moan about how it doesn't beat something else, the majority go railing on forums about how disappointed they are that their expectations were not met.

 

Seems only a few of us, like Praz, see it for the much bigger picture that it is. There's so much technology inside that new GPU that simply won't be utilized for probably years to come, and by the time that tech is being used as intended, it will already have been replaced, 2 or 3 generations on, by newer technology that can do everything the old tech did better than ever, but is saddled with, as stated, even newer and better features that are still waiting for software to catch up and use them fully.

 

It's a game of catch-up. You get hyped by the sites on the internet doing the hyping (and honestly, it seems most are paid mouthpieces... you are granted exclusives or given plenty of test hardware, so you've got to make as much noise as possible to generate publicity). But almost every article I've read on this card, with the exception of Anandtech's Derek Wilson "Calling a Spade a Spade", does nothing but piss and moan about how we all were expecting some 8800 killer and got this... power-draining monstrosity instead.

 

Well, I can say I am slightly disappointed that it isn't an instant 8800 killer (not because I hate NV, but because an 8800 killer would drive NV to release something even better, or possibly even just fix the drivers for their 8800s, lol).

 

But I actually take the time to read about the new technology. And as I've described endlessly, I know that this is mostly MS's attempt to force everyone to Vista with their DX10. If you really peruse the articles surrounding NV's G80 and ATI's (AMD's) R600, you'll realize that these GPUs were simply not designed for Windows XP and DX9.

 

They are truly next-gen GPUs, and they are built around a new DX10 API that is pretty darn different from previous DX iterations. Problem is, these GPUs still have to work on 90% of all the potential customer rigs that might run them, because 90% are still on Windows XP, and will be for a good while.

 

The features in the new ATI card and the NV G80s, if you really stop to think about them, are pretty incredible. These cards are over-engineered from day one, and they are starting to take on much more of a role in how your games are rendered. Gone are the days when you had to have a fast CPU AND a fast GPU; for almost forever, games relied a hell of a lot more on the CPU than the GPU, though you still needed a decent GPU.

 

Then came the days of offloading most of the work to the GPU. FEAR is a perfect example of this... drop any CPU from a 1.6GHz Sempron to a 3GHz Core into a rig with an 8800GTS and you will experience almost exactly the same framerates.

 

These new DX10 cards are going to do a lot more once software (games) starts being programmed with DX10 in mind first and DX9 as a "compatibility" mode. Again, 90% of your customers don't have Vista. Selling only to the 10% who have Vista, and of those only the ones who also have a DX10 card, cuts that number to maybe 7% or lower. Bad odds for making money, no matter how awesome your DX10 game is.
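(Just to make that market math concrete, here's a tiny back-of-envelope sketch in Python; the percentages are the same rough guesses as in the paragraph above, not real market data.)

# Rough sketch of the addressable-market argument above.
# Both fractions are assumed/illustrative, not measured figures.
vista_share = 0.10        # assumed share of gamers running Vista
dx10_card_share = 0.70    # assumed share of those Vista users who also own a DX10 card

dx10_ready = vista_share * dx10_card_share
print(f"DX10-ready slice of the market: {dx10_ready:.0%}")  # ~7%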

 

Soon the cards will be able to run most of the physics, offloading even more work from the CPU (which could then be used for better AI or any number of tasks, which will hopefully utilize multi-core CPUs better than they do now). The 2900XT can already pass audio through the card and out an HDMI port, so there's another function.

 

I guess what I am saying is that we aren't flaming YOU or anyone else who is unhappy with the new ATI offering. What I am saying is that it is, however, your own fault for being so disappointed that you have let it burn to the surface. But instead of getting angry at even that statement, step back and look at the big picture objectively. Dual-cores were great when they came out, but the reality was (and pretty much still is) that they don't actually do a whole lot better than single-core CPUs, and sometimes they fare much, much worse (in apps and such that are mystified by more than one CPU core, lol).

 

Again, at launch the 8800 cards performed very poorly compared to how they do today with mature drivers and game patches (or even minor DX updates).

 

Once you really study the new DX10 API, though, and then really study the technology in the G80 and R600 GPUs, you should see pretty clearly why they might not perform so well on software that wasn't built for the features within those GPUs.

 

As for the power consumption, well, it's true that it will go down with better revisions and a shrink or two, but it probably won't go down all that much, because once you get a couple of revisions and a 65nm die shrink, guess what? The companies will say, "Hey, now we can offer these GPUs at 50% faster stock speed, since that puts them right back where they were at release in terms of power consumption... but they'll be 50% faster!"
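(For anyone curious how that "shrink the die, then spend the savings on clocks" trade works out, here's a hedged little sketch: dynamic power scales roughly with capacitance times voltage squared times frequency, and every number below is made up purely for illustration.)

# Very rough dynamic-power model: P ~ C * V^2 * f.
def dynamic_power(cap, volts, freq_ghz):
    return cap * volts ** 2 * freq_ghz

p_80nm = dynamic_power(1.0, 1.2, 0.74)                  # hypothetical card at 740MHz
p_65nm_same_clock = dynamic_power(0.6, 1.2, 0.74)       # assumed ~40% capacitance saving from the shrink
p_65nm_reclocked = dynamic_power(0.6, 1.2, 0.74 / 0.6)  # clocks raised until power matches the old part

print(p_80nm, p_65nm_same_clock, p_65nm_reclocked)
# With these made-up numbers, the shrunk chip clocks about two-thirds higher at the original power budget.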

 

And by then we will have already bought into this new PSU scheme for 8-pin PCI-E 2.0 power, so it won't really matter all that much, and there will also be plenty of cooling options to replace the 747 turbine that comes factory stock.

 

As I sit here typing this, I am reminded of a time during the CPU days when Intel changed their processor architecture, and it really pissed a lot of people off, because they were so caught up in the numbers that they didn't think about the big picture.

 

There was a time when CPUs started moving away from raw clock speed on, say, a 32-bit internal bus, toward just a wider internal bus (say 64-bit, 128-bit, whatever).

 

Everyone pissed and moaned about the Pentium 3, if I remember correctly. They thought (and they "proved" it with benchmarks, of course, in a close-minded way) that releasing a CPU with a wider bus but no clock increase was harming performance. And with their benchmarks, they showed it beyond doubt. But they never focused on the most important aspect. They didn't realize that with a wider bus path, the core didn't have to work as hard. Once software was optimized for it (which the benchmarks of course were not, since it was new technology), it could actually do more work per cycle, because it could be fed more data over that bigger bus.

 

It's a concept I've always remembered. Think about it now, too: clock speeds from Intel came down, but the CPUs became much more efficient. So did the A64s. A 1.8GHz A64 running in 32-bit mode could run circles around a 2.2GHz Barton Athlon XP. A 2.13GHz Core 2 can run circles around a 3GHz Pentium 4 and even a Pentium D.
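(If it helps, here's a minimal sketch of that "clock times work-per-cycle" idea; the IPC numbers are invented for illustration, not benchmarks.)

# Rough relative performance: clock frequency times instructions per clock (IPC).
def relative_perf(clock_ghz, ipc):
    return clock_ghz * ipc

barton_2_2 = relative_perf(2.2, 1.0)   # older core, baseline IPC
a64_1_8 = relative_perf(1.8, 1.4)      # newer core, assumed ~40% higher IPC

print(f"Barton 2.2GHz: {barton_2_2:.2f}")
print(f"A64 1.8GHz:    {a64_1_8:.2f}")  # comes out ahead despite the lower clock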

 

It is really the same with GPUs. Once the benchmarks were updated and optimized for the new CPU architecture, everyone shut up about the change in clock speed (and the CPU companies had a hard time of it, because they'd hyped clock speed for YEARS, if you remember), because the CPUs showed that with optimized software, the new architecture was MUCH better and more efficient. Once we have optimized native DX10 apps and optimized drivers, I think you'll see a lot of changes, and all of the negatives you thought about the ATI card, and possibly any NV card, will be forgotten.

 

The game industry has moved away from busting out raw framerate and toward image quality. This is a good thing. Who cares if you can get 340fps at 1280x1024... can you do 70fps @ 1280x1024 with 8xAA/16xAF/HDR/everything maxed?

 

Gone are the days of needing to play at 800x600 or 1024x768 so the game will run properly (though it will look like butt). Now we all want 1440x900, 1680x1050, 1920x1200, but more importantly, we want it with all the AA and AF and HDR and eye candy maxed out. We don't care about getting 6500fps, as long as we can turn every single detail up to max, so you can spot the sneer on that jerk's face through your sniper scope from 600 yards away. You want to be able to almost read the license plate on the car you are chasing. We want to see forests of trees with shitloads of leaves swaying in the wind, birds fluttering to branches, shadows hiding potential enemies, etc.

 

This, at least, is something MS realized and has shot for in their infinite but sometimes corrupt wisdom. That is why they developed this new DX10 API (again, read up on it and what it will do for gaming as it evolves).

 

 

And finally, here's something to keep in mind:

 

"by the time full DX10 games come around, we should be on our second-to-fourth generation of hardware, drivers, and API"

 

What that essentially means is that by the time the software catches up, the hardware will be even more able to make us gasp in surprise at the realism, the beauty, etc.

 

Sometimes the bigger picture is harder to grasp when your emotions (wants/needs, lol) are involved. Instant gratification is today's societal negative. Patience truly is a virtue, but if you don't exercise it often, it becomes like your legs and belly... flabby, lazy, and unsightly.

 

 

;)


So at what point will there be a resolution high enough that the eye can't distinguish between it and reality (assuming you are sat at a reasonable distance without a magnifying glass)?

 

That's about 125 million pixels for the average person, so a few years yet. I just hope it's in my lifetime, though ;):)
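(One way to land near that ballpark figure: assume roughly a 180 by 120 degree field of view and detail resolvable down to about 0.8 arcminutes; both numbers are loose assumptions for illustration, not established figures.)

# Back-of-envelope pixel count for "indistinguishable from reality".
fov_h_deg, fov_v_deg = 180, 120   # assumed binocular field of view
acuity_arcmin = 0.8               # assumed finest resolvable detail

pixels_h = fov_h_deg * 60 / acuity_arcmin
pixels_v = fov_v_deg * 60 / acuity_arcmin
print(f"{pixels_h * pixels_v / 1e6:.0f} million pixels")  # ~122 million with these assumptions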

