ATI Trilinear Cheating Uncovered
Posted 20 May 2004 - 09:36 AM
Posted 20 May 2004 - 11:13 PM
Posted 21 May 2004 - 07:39 AM
Posted 21 May 2004 - 07:54 AM
I personally don't care so long as it can OC and push those frames at ludacris speed (a cookie for the first person to guess where that term came from).
Sorry to go off track for a minute, but speaking of which, what is a STABLE clock rate for an All-In-Wonder 8500 128MB? (Radeon 8500 clock rates would also be helpful.)
Posted 21 May 2004 - 07:55 AM
I think the ludacris term is derived from Latin by a scientist named Cris, translated down from Luden's Cough Drops....originally meaning to have a bad cold and cough your brains out looking ludacris.
Edited by ThermalGuru, 21 May 2004 - 08:07 AM.
Posted 21 May 2004 - 08:46 AM
(at least I think that's what you're talking about, another cookie for the latin stuff)
But I agree with your sentiments, Kurosen, which is why they're now even in my book. But if it runs great, looks great, and comes at a great price, I would buy one even if it meant I was "owned" (j/k). I just want a card that doesn't have to cheat to have good performance. Oh well.
Posted 21 May 2004 - 12:11 PM
I don't really think it's the same thing at all, though. What the ATI guy said made perfect sense to me. Why render something at 16x when it can be done at 2x and look the same? It's only comparable to the Nvidia SNAFU in the sense that it erodes consumer confidence. You're dead-on about that one.
It is in the sense that Nvidia was given crap for the same thing (I know that technically it isn't the same, just like brown cows and black-and-white cows standing together). In a sense it may not be cheating, but it is an attempt to raise scores by cutting quality, however small. Nvidia and ATI are now equal in the renderer's doghouse in my mind. Is it a big deal? No, it's just competition. But tell that to those who've raised hell about it.
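The "render at 2x instead of 16x" point above is about adaptive trilinear filtering: full trilinear blends the two nearest mipmap levels for every pixel, while the optimization narrows the blend band so most pixels sample only one level. A minimal illustrative sketch of that trade-off (this assumes nothing about ATI's or Nvidia's actual driver internals; `blend_band` is a made-up parameter for the example):

```python
import math

def trilinear_weight(lod, blend_band=1.0):
    """Blend factor between mip levels floor(lod) and floor(lod) + 1.

    blend_band=1.0 is full trilinear: the blend varies linearly across
    the whole level. A smaller band snaps more pixels to a single mip
    level, saving texture bandwidth at the risk of visible transitions.
    """
    frac = lod - math.floor(lod)
    # Center a (possibly narrower) blend region on the level boundary.
    lo = 0.5 - blend_band / 2
    hi = 0.5 + blend_band / 2
    if frac <= lo:
        return 0.0   # sample only the finer mip level
    if frac >= hi:
        return 1.0   # sample only the coarser mip level
    return (frac - lo) / (hi - lo)

# Full trilinear: this pixel blends 25% of the coarser level.
assert trilinear_weight(3.25, blend_band=1.0) == 0.25
# Reduced band: the same pixel now uses a single mip level (bilinear-only).
assert trilinear_weight(3.25, blend_band=0.3) == 0.0
```

If the narrowed band picks the same level the full blend would have weighted most heavily, the output is often visually indistinguishable, which is the defense being made in this thread; the controversy is that it was done silently.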
Posted 21 May 2004 - 04:48 PM
All I know is, after seeing the ShaderMark comparisons between the two, I prefer ATI. In Far Cry benchmarks the X800 XT and the 6800 Ultra are neck and neck. But when you compare lighting effects between the two manufacturers, the Nvidia cards are very blocky. I've seen this example in more than one review (plus there are many complaints about this, and about low framerates, from Nvidia owners on the Far Cry forums; it's ironic because the game is a "The Way It's Meant to Be Played" title). So I'm wondering: if Nvidia cards were forced to run higher-quality shaders, what would their actual shader performance be?
shader quality comparisons
Posted 21 May 2004 - 06:51 PM
For that, I give you ATI. I know these things because I do 3D modeling and I'm into graphics. Have been since the '80s. It's what pays the bills. But it doesn't take experience to realize this; you just need to research, research, and then, when you are done, research some more across the internet. All the facts are laid out for you someplace. Try to stay away from sources promoted by ATI and Nvidia; eventually you will find what I'm saying is true, from people who are into game development and other graphics gurus like myself.
I vote we try to get Overclockers Club to do an interview with somebody in game development from someplace like id Software or Activision. Both are promoted by Nvidia, but I would love to see an "official" statement on what the designers are actually using in their computers to create the games.
<@ I ate your cookie dude...sorry.
Edited by ThermalGuru, 21 May 2004 - 07:00 PM.