
Ati Trilinear Cheating Uncovered


Bosco

Recommended Posts

Neither do I! Their optimizations seem legitimate. It's not like you can notice the difference in image quality if they're not rendering the stuff you can't see!


It is in the sense that Nvidia was given crap for the same thing (I know that technically it isn't, just like brown and black-and-white cows standing together.) In a sense it may not be cheating, but it is an attempt to raise scores by cutting quality, however small. Nvidia and ATI are now equal in the renderer's doghouse in my mind. Is it a big deal? No, it's just competition. But tell that to those who've raised hell about it.


What it comes down to is business image... If, later on down the line, ATI comes out with some super vidcard, people might speculate that said card's scores were attained through foul play, like the case that is occurring now. Even though it doesn't affect the pretty pixels much (assuming they do "mess around with it"), it gives the buyer a sense of being "0\/\//\/3d" :ph34r::P

 

I personally don't care so long as it can OC and push those frames at ludacris speed (a cookie for the first person to guess where that term came from ;) )

 

Sorry to go off-track for a minute, but speaking of which, what is a STABLE clock rate for an All-In-Wonder 8500 128MB? (Radeon 8500 clock rates would also be helpful.)


Guest ThermalGuru

Cheat or no cheat, if it improves my card....I'm all for it.

 

I think the ludacris term is derived from Latin by a scientist named Cris, translated down from Luden's Cough Drops....originally meaning to have a bad cold and cough your brains out looking ludacris.

Edited by ThermalGuru


Well hey! Can I have that cookie for saying "Spaceballs"?!

(at least I think that's what you're talking about; another cookie for the Latin stuff)

But I agree with your sentiments, Kurosen, which is why they're even in my book now. Still, if it runs great, looks great, and comes at a great price, I would buy one even if it meant I was "owned" (j/k). I just want a card that doesn't have to cheat to have good performance. Oh well.


It is in the sense that Nvidia was given crap for the same thing (I know that technically it isn't, just like brown and black-and-white cows standing together.)  In a sense it may not be cheating, but it is an attempt to raise scores by cutting quality, however small.  Nvidia and ATI are now equal in the renderer's doghouse in my mind.  Is it a big deal?  No, it's just competition.  But tell that to those who've raised hell about it.

I don't really think it's the same thing at all though. What the ATI guy said made perfect sense to me. Why render something 16x when it can be done in 2x and look the same? It's only comparable to the NVidia SNAFU in the sense that it erodes consumer confidence. You're dead-on about that one. :)
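To make the "do it in 2x instead of 16x" point concrete, here's a rough sketch of how an adaptive trilinear filter can skip work without a visible difference. This is NOT ATI's actual driver logic: sampleBilinear(), the flat-grey stand-in texture, and the 0.05 threshold are all placeholders made up for illustration. Classic trilinear always blends two mipmap levels; the adaptive path drops the second fetch whenever the sample sits almost exactly on one level.

```cpp
#include <cmath>
#include <cstdio>

struct Colour { float r, g, b; };

// Stand-in for a real bilinear texture fetch: returns a flat grey that gets
// darker at coarser mip levels, just so the example runs end to end.
Colour sampleBilinear(int mipLevel, float u, float v) {
    (void)u; (void)v;
    float shade = 1.0f / (1.0f + mipLevel);
    return { shade, shade, shade };
}

// Blend two colours by t (0 = a, 1 = b).
Colour lerpColour(const Colour& a, const Colour& b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Full trilinear: two bilinear fetches plus a blend, every time.
Colour trilinear(float lod, float u, float v) {
    int   base = static_cast<int>(std::floor(lod));
    float frac = lod - base;                     // position between mip levels
    return lerpColour(sampleBilinear(base, u, v),
                      sampleBilinear(base + 1, u, v), frac);
}

// Adaptive shortcut: when the sample sits almost exactly on one mip level,
// a single bilinear fetch is visually identical, so the second fetch and the
// blend are skipped. THRESHOLD is a made-up number for illustration.
Colour adaptiveTrilinear(float lod, float u, float v) {
    const float THRESHOLD = 0.05f;
    int   base = static_cast<int>(std::floor(lod));
    float frac = lod - base;
    if (frac < THRESHOLD)        return sampleBilinear(base, u, v);
    if (frac > 1.0f - THRESHOLD) return sampleBilinear(base + 1, u, v);
    return trilinear(lod, u, v);
}

int main() {
    // Just past a mip boundary, the two paths differ by an invisible amount.
    Colour full = trilinear(2.02f, 0.5f, 0.5f);
    Colour fast = adaptiveTrilinear(2.02f, 0.5f, 0.5f);
    std::printf("full %.4f  fast %.4f\n", full.r, fast.r);
    return 0;
}
```

In this toy case the two paths differ by less than one 8-bit shade step, which is why screenshots look identical even though the "optimized" path does roughly half the texture work near a mip boundary.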


I'm not sure if it's true, but I read that Nvidia's optimizations were specifically for benchmarks (i.e. 3DMark03 and AquaMark3) and not for games.

 

All I know is that after seeing the ShaderMark comparisons between the two, I prefer ATI. In Far Cry benchmarks the X800 XT and 6800 Ultra are neck and neck. But when you compare lighting effects between the two manufacturers, the Nvidia cards are very blocky. I've seen this example in more than one review (plus there are many complaints about this, and about low framerates, from Nvidia owners on the Far Cry forums; it's ironic because the game is a "The way it's meant to be played" title). So I'm wondering: if Nvidia cards were forced to run higher-precision shaders, what would their actual shader performance be? (There's a rough sketch of what I mean by precision below the links.)

 

shader quality comparisons

 

performance benchmarks
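On that precision question, here's a toy sketch (nothing to do with Far Cry's real shaders or either vendor's drivers) of why dropping precision in a lighting calculation shows up as banded, blocky highlights: the same Blinn-Phong-style specular term is evaluated once in full 32-bit float and once with the input snapped to a coarse step, standing in for a low-precision shader path. The quantize() helper and the 1/256 step are assumptions made up for the illustration.

```cpp
#include <cmath>
#include <cstdio>

// Crudely emulate a low-precision shader register by snapping to a fixed step.
float quantize(float x, float step = 1.0f / 256.0f) {
    return std::round(x / step) * step;
}

// Blinn-Phong-style specular term at full 32-bit precision.
float specularFull(float nDotH, float shininess) {
    return std::pow(nDotH, shininess);
}

// Same math, but with the input quantized first.
float specularLowPrecision(float nDotH, float shininess) {
    return std::pow(quantize(nDotH), shininess);
}

int main() {
    // Sweep across the middle of a highlight: the full-precision term falls
    // off smoothly, while the quantized one moves in visible steps; on screen
    // that reads as banded, "blocky" lighting.
    for (float nDotH = 0.980f; nDotH <= 1.0001f; nDotH += 0.002f) {
        std::printf("N.H %.3f  full %.4f  low %.4f\n",
                    nDotH, specularFull(nDotH, 64.0f),
                    specularLowPrecision(nDotH, 64.0f));
    }
    return 0;
}
```

On the real hardware of that generation the gap came from partial-precision (FP16/FX12) versus full-precision (FP24/FP32) shader paths rather than a hand-rolled quantize step, but the visible symptom is the same kind of banding.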


Guest ThermalGuru

I've talked about this well before this post. This trilinear filtering approach is also why ATI chose not to jump on the Pixel Shader 3.0 bandwagon. If you haven't already switched over from Nvidia to ATI by now, this is just one more reason why you should. You will continue to see others on here who are much happier with ATI's way of gaining quality over Nvidia's way. Performance will always be pretty close between the two, as it has always been, but you can't top ATI's quality and never will with an Nvidia card, because Nvidia always wants to go off on some weird tangent from the "norm" in game design. This info is all over the internet; all you need to do is start looking into game designer forums versus computer hardware forums. That would only make sense, since the game developers are what you guys are really interested in. Well, at least you should be, because it doesn't make much sense to get the fastest card and only be able to show off its FPS in a benchmark. You should demand quality along with it.

 

For that, I give you ATI. I know these things because I do 3D modeling and I'm into graphics. Have been since the '80s. It's what pays the bills. But it doesn't take experience to realize this; you just need to research, research, and then, when you are done, research some more across the internet. All the facts are laid out for you someplace. Try to stay away from the ones promoted by ATI and Nvidia; eventually you will find what I'm saying is true from people who are into game development and other graphics gurus like myself.

 

I vote we try to get Overclockers Club to do an interview with somebody in game development from someplace like id Software or Activision. Both are being promoted by Nvidia, but I would love to see an "official" statement on what the designers are actually using in their computers to create the games.

 

<@ I ate your cookie dude...sorry.

Edited by ThermalGuru

