Ati Trilinear Cheating Uncovered


10 replies to this topic

#1 Bosco

    OCC Boss

  • Senior Admin
  • 32274 posts
  • Gender:Male
  • Location:Canada

Posted 20 May 2004 - 09:36 AM

The article can be found here

Main Gaming Rig
Intel 3960X
MSI X79A-GD65 8D
16GB of Corsair Vengeance
NVIDIA 780TI's in SLI
Corsair Force 3 GT 240GB SSD
Coolermaster 932 Case
Noctua D14 CPU Cooler
Thermaltake Toughpower XT Platinum 1275 Watts
3 X 24" LCD's
Donating to OCC :::: OCC Site Rules :::: OCC Reviews
RIP Verran and Nemo gone but never will be forgotten.


#2 r_target

    forgotten but not gone

  • Folding Member
  • 5947 posts

Posted 20 May 2004 - 07:05 PM

I don't see what the big deal is about this.
Core2 [email protected]|Abit IP35 Pro|Mushkin XP2 6400 2x1GB|EVGA 8800GTS Superclock|Douglas SBD-3

#3 rein

    Member

  • Members
  • 192 posts
  • Location:CANADA

Posted 20 May 2004 - 11:13 PM

Neither do I! Their optimizations seem legitimate. It's not like you can notice a difference in image quality if they're not rendering the stuff you can't see!

#4 the11ama

    Formerly known as a rabbit

  • Members
  • 1058 posts
  • Gender:Male
  • Location:IN

Posted 21 May 2004 - 07:39 AM

It is in the sense that Nvidia was given crap for the same thing (I know that technically it isn't, just like brown and black-and-white cows standing together.) In a sense it may not be cheating, but it is an attempt to raise scores by cutting quality, however small. Nvidia and ATI are now equal in the renderer's doghouse in my mind. Is it a big deal? No, it's just competition. But tell that to those who've raised hell about it.

#5 kurosen

    DTTL!!!

  • Folding Member
  • 1446 posts
  • Gender:Male
  • Location:Brooklyn, NY

Posted 21 May 2004 - 07:54 AM

What it comes down to is business image... If later on down the line, ATI comes out with some super vidcard, people might speculate that said card's scores were attained through foul play, like the case that is occurring now. Even though it doesn't affect the pretty pixels much (assuming they do "mess around with it"), it gives the buyer a sense of being "0\/\//\/3d" :ph34r: :P

I personally don't care so long as it can OC and push those frames at ludacris speed (a cookie for the first person to guess where that term came from ;) )

Sorry to go off track for a minute, but speaking of which, what is a STABLE clock rate for an All-In-Wonder 8500 128MB (Radeon 8500 clock rates will also be helpful)?

#6 Guest_ThermalGuru_*

  • Guests

Posted 21 May 2004 - 07:55 AM

Cheat or no cheat, if it improves my card....I'm all for it.

I think the ludacris term is derived from Latin by a scientist named Cris, translated down from Luden's Cough Drops....originally meaning to have a bad cold and cough your brains out looking ludacris.

Edited by ThermalGuru, 21 May 2004 - 08:07 AM.


#7 the11ama

    Formerly known as a rabbit

  • Members
  • 1058 posts
  • Gender:Male
  • Location:IN

Posted 21 May 2004 - 08:46 AM

Well hey! Can I have that cookie for saying "Spaceballs"?!
(at least I think that's what you're talking about; another cookie for the Latin stuff)
But I agree with your sentiments, Kurosen, which is why they're even in my book now. But if it runs great, looks great, and at a great price, I would buy one even if it meant I was "owned" (j/k). I just want a card that doesn't have to cheat to have good performance. Oh well.

#8 kurosen

    DTTL!!!

  • Folding Member
  • 1446 posts
  • Gender:Male
  • Location:Brooklyn, NY

Posted 21 May 2004 - 10:49 AM

Here's your cookie, a rabbit, it was Spaceballs :P

@

#9 r_target

    forgotten but not gone

  • Folding Member
  • 5947 posts

Posted 21 May 2004 - 12:11 PM

the11ama said:
"It is in the sense that Nvidia was given crap for the same thing (I know that technically it isn't, just like brown and black-and-white cows standing together.) In a sense it may not be cheating, but it is an attempt to raise scores by cutting quality, however small. Nvidia and ATI are now equal in the renderer's doghouse in my mind. Is it a big deal? No, it's just competition. But tell that to those who've raised hell about it."

I don't really think it's the same thing at all, though. What the ATI guy said made perfect sense to me. Why render something 16x when it can be done in 2x and look the same? It's only comparable to the NVidia SNAFU in the sense that it erodes consumer confidence. You're dead-on about that one. :)
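The "render in 2x when it looks the same as 16x" idea can be sketched roughly: full trilinear filtering always blends the two nearest mipmap levels, while an adaptive scheme skips the blend when the two levels are nearly identical anyway. The sketch below is a simplified, hypothetical illustration (1-D grayscale "mips", a made-up threshold), not ATI's actual driver logic:

```python
# Sketch of adaptive trilinear filtering: blend two mip levels only
# when they actually differ enough to matter. Simplified to 1-D
# grayscale "mips"; the threshold value is hypothetical.

def lerp(a, b, t):
    # Linear interpolation between a and b by fraction t.
    return a + (b - a) * t

def full_trilinear(mip_lo, mip_hi, frac, i):
    # Always pay for the blend between the two nearest mip levels.
    return lerp(mip_lo[i], mip_hi[i], frac)

def adaptive_trilinear(mip_lo, mip_hi, frac, i, threshold=2):
    # If the two mip samples are nearly identical, skip the blend
    # (cheaper) -- the result is visually indistinguishable.
    if abs(mip_lo[i] - mip_hi[i]) <= threshold:
        return mip_lo[i]
    return lerp(mip_lo[i], mip_hi[i], frac)

mip0 = [100, 120, 140, 160]   # finer mip level
mip1 = [101, 119, 180, 90]    # coarser mip level (upsampled)

for i in range(4):
    print(i, full_trilinear(mip0, mip1, 0.5, i),
             adaptive_trilinear(mip0, mip1, 0.5, i))
```

Where the levels differ sharply (the 140-vs-180 texel), both paths produce the identical blended value; where they barely differ, the adaptive path returns the single cheaper sample, which is why smooth textures look the same either way.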
Core2 [email protected]|Abit IP35 Pro|Mushkin XP2 6400 2x1GB|EVGA 8800GTS Superclock|Douglas SBD-3

#10 rein

    Member

  • Members
  • 192 posts
  • Location:CANADA

Posted 21 May 2004 - 04:48 PM

I'm not sure if it's true, but I read that Nvidia's optimizations were specifically for benchmarks (i.e. 3DMark2003 and AquaMark3) and not for games. I don't know if it's true or not, though.

All I know is, after seeing the ShaderMark comparisons between the two, I prefer ATI. In Far Cry benchmarks the X800 XT and 6800 Ultra are neck and neck. But when you compare lighting effects between the two manufacturers, the Nvidia cards are very blocky. I've seen this example in more than one review (plus there are many complaints about this and about low framerates from Nvidia owners on the Far Cry forums; it's ironic because the game is a "The way it's meant to be played" title). So I'm wondering: if Nvidia cards were forced to run higher-precision shaders, what would their actual shader performance be?
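A rough way to see why lower shader precision shows up as "blocky" lighting: quantize a smooth gradient to fewer bits and count the distinct shades that survive. The bit widths below are illustrative only, not the actual floating-point formats the cards used:

```python
# Illustration of precision vs. banding: a smooth lighting gradient
# quantized to fewer bits produces visible steps ("blockiness").
# Bit widths are made up for demonstration, not real GPU formats.

def quantize(x, bits):
    # Snap x (in [0, 1]) to one of 2**bits evenly spaced levels.
    levels = (1 << bits) - 1
    return round(x * levels) / levels

gradient = [i / 100 for i in range(101)]      # smooth light falloff
hi = [quantize(v, 16) for v in gradient]      # high precision
lo = [quantize(v, 4) for v in gradient]       # low precision

print(len(set(hi)), "distinct shades at high precision")
print(len(set(lo)), "distinct shades at low precision")
```

At 4 bits the 101-step gradient collapses to 16 distinct shades, i.e. visible bands; at 16 bits every step stays distinct.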

shader quality comparisons

performance benchmarks

#11 Guest_ThermalGuru_*

  • Guests

Posted 21 May 2004 - 06:51 PM

I've talked about this in the past, well before this post. This trilinear filtering is also why ATI chose not to jump on the Pixel Shader 3.0 bandwagon. If you have not already switched over from Nvidia to ATI by now, this is just one more reason why you should. You will continue to see others on here who are much happier with ATI's way of gaining quality over Nvidia's way. Performance will always be pretty close between the two, as it has always been, but you can't top ATI's quality and never will with an Nvidia card, because Nvidia always wants to go off on some weird tangent from the "norm" in game design. This info is all over the internet; all you need to do is start looking into game designer forums versus computer hardware forums. That would only make sense, since the game developers are what you guys are really interested in. Well, at least you should be, because it doesn't make much sense to get the fastest card and only be able to show off its FPS in a benchmark. You should demand quality along with it.

For that, I give you ATI. I know these things because I do 3D modeling and I'm into graphics. Have been since the '80s. It's what pays the bills. But it doesn't take experience to realize this; you just need to research, research, and then, when you are done, research some more across the internet. All the facts are laid out for you someplace. Try to stay away from sources promoted by ATI and Nvidia; eventually you will find what I'm saying is true from people who are into game development and other graphics gurus like myself.

I vote we try to get Overclockers Club to do an interview with somebody in game development from someplace like id Software or Activision. They are both being promoted by Nvidia, but I would love to see an "official" statement on what the designers are actually using in their computers to create the games.

<@ I ate your cookie dude...sorry.

Edited by ThermalGuru, 21 May 2004 - 07:00 PM.