6800GT or X800Pro?



Originally posted by CrimeDog

Kamel, what exactly is your "specialty?" I'd hope it's not video cards since you don't know about 1/2 the market.

 

that was bad terminology on my part; i just follow them, i'm not necessarily specialized in them.

 

-Neither card has a real advantage over the other. Nvidia has Doom 3 (for now) but ATI will have HL2. Time will balance these out. Both will play this game extremely well.

-Both series are extremely overclockable.

 

admittedly, i follow nvidia more than ati, because i use linux often and ati's linux support is poor (well, sortof... that gets into technicalities, but basically nvidia's linux support is easily 10x better than ati's). however, i do know that you cannot judge performance in a game that has not been released. speculation does suggest that the ati cards will push ahead of nvidia's, but judging from other directx applications, nvidia has the lead. generally, ati ends up with the "win" because it creeps ahead once anti-aliasing is applied. that is confirmed by hundreds and hundreds of benchmarks, not just doom3.

 

 

If you know what you're looking for you can buy an X800Pro with VIVO and convert it to a full-fledged XT with a simple bios flash. They're the same card (the dead giveaways are the yellow power connector and the 1.6ns ram.)

 

the same goes for the 6800gt: it can be flashed to a 6800ultra. this has been confirmed by a great number of people, and the 6800gt can even be flashed to the 6800ue. that usually requires better-than-stock cooling, along with possibly a step up in voltage, but it has been done.

 

Apparently some MSI 6800nu cards can be flashed to 16 pipelines. This hasn't been as widely confirmed as on the ATI side, but it's a fairly new find.

 

this has only been confirmed with some MSI cards because they are the only ones tested with the new, not-yet-released version of RivaTuner. according to the developer himself, this should be possible on all 6800nu's, with approximately the same success rate as the 9800se with the L-shaped ram (not 100%, but still damn good).

 

All the top 3dmark scores are with an x800xt.

 

lol, there are so many flaws in that comment that i will simply act as if it just slipped out and you don't honestly consider synthetic benchmarks a true representation of real-world speed.

 

The ati cards aren't furnaces like the nvidias (they're being compared to prescotts; they need two molex connectors for a reason.)

 

lol, ok dude. operating temperature means nothing when you can run a card at 85C and be 100% stable and artifact-free. the temperature diode is also located on-die; if the same were true for processors, they would have readings just as high.

 

As for the companies themselves... Nvidia has a history of cheating every chance it gets. Numerous drivers have been rejected by the community, and the latest ones have a checkbox that gets you about 4-5k more 3d03 points. ATI introduced a "smart shader" that Nvidia questions, even though it doesn't affect IQ to the naked eye. ATI recently sent out a press release and a message to Futuremark to disqualify their 4.6? drivers. This was completely unexpected because no one had noticed anything wrong with the drivers, but apparently they had a small cheat in 3dMark.

 

lmfao, ok, give me links and proof. ati has been known for its optimisations too. do you think ati's don't affect image quality? you're really something. i don't know what kind of fanATic world you've been living in, but the image quality is affected, and has been proven to be affected by a great deal, in unreal tournament 2004 and far cry. you know, you can only ride the 9700pro wave for so long. it's nvidia's turn in the seat, and that's more than obvious... just as it was ati's turn in the seat with the 9800pro vs the 5950u.

 

ATI's IQ has always been a teensy bit better than Nvidia.

 

LOL, this is worse than the 3dmark comment. i think i will just write it off as "you're full of ." on this one. sorry, not trying to be rude, but that comment was 100% FUD.

 

GT's are commonly available, as are X800pros. Nvidia is much more available in B&M stores, though. The ultra has been available at newegg for about $80 over MSRP (but may be backordered now.) Everyone's still impatiently waiting for the XT's release, though; estimates from ATI say september-october.

 

ok, what's that supposed to mean :confused:

 

If I were to buy right now (I don't need to) I'd get a x800pro vivo, or a GT, whichever fell into my lap for the best price.

 

eh, i am not going to say you're a bad person for coming to ati's defense or anything, and i do have respect for ati as a company. all i am saying is that the nvidia card is the better choice this time around. your first sentence was degrading and implied that i knew nothing because you knew a thing or two... that makes no sense to me. you proved nothing i said wrong, as none of it is... so i don't see how your second FUD comment, that i "don't know about 1/2 the market," applies.


Originally posted by deathadder

I usually play games like Battlefield, Far Cry, and demos of pretty much all the FPS's out there. I really just want to use this thread as a place for specs, reviews, first-hand impressions, that sort of thing (not an ATI vs NVIDIA thread), just to give the users of this forum a central location for data regarding these two cards and how to go about getting more out of whichever card someone decides to get. Thanks for the info and please keep it coming!!

 

 

 

Haven't played much of anything since my card did done blowed up!!

 

i thought i was being informative :confused:

 

discussions and debates usually lead to facts about each card, as long as they stay constructive. i try to keep a light heart about it all and not take too seriously what others say when they mess up or whatever. i don't take offense at comments, but if i see someone posting false or misleading information i will definitely set them straight on it, for the sake of others reading this, not my own.


Kamel, i was not posting to taunt you, i was posting for deathadder (except the first sentence.)

 

Also, for most of your objections I literally said "wtf" aloud.

 

I searched the orb (for 3d01, 35-40k; for 3d03, 15-20k) and for 3d01 it was ALL ati; for 3d03 (approved drivers) there was ONE freaking 6800 (go LardArse!). I know it's a synthetic benchmark, but that doesn't make my simple statement less true. All I said was the ATI cards are dominating the orb.

 

Every single review I have ever seen has said ATI has *slightly* better IQ than nvidia.

 

The 6800 series produces a lot more heat than the x800 series. It's not a temp comparison, it's a HEAT comparison. Have you seen the coolers?

Originally posted by Kamel

though the 6800 has fewer mhz, it has more power and tons more transistors, so it likes a lot of juice from the PSU. at about half the transistors, the x800 series is better if you are concerned about power consumption.

It draws more power and doesn't make more heat? C'mon that's just common sense.

 

Why would you bother flashing a GT to an Ultra? They're the same card, except the GT has lower clock speeds (and slower ram.) If you flash to the EE bios you get a bump in Vgpu.

 

You completely missed it when nvidia blatantly cheated with driver optimizations? Have you been living under a rock? You say you're an expert?????? Check this out:

http://www.futuremark.com/companyinfo/3dma...udit_report.pdf

 

I'm sorry, the last thing I mean to do is offend you, however from reading your comments I would highly suggest that no one takes advice from you.

 

I want to restate my conclusion of the two cards: They both perform fantastically! Get whichever you prefer.


no, don't take this as me trying to go head to head on the subject. i'm not.

 

i never said it doesn't make more heat.

 

the 6800gt and the 6800ultra have the same speed ram; the difference only exists between the 6800 and the 6800gt, where the 6800 has 2.6ns ram and the gt has 2.0ns ram (the new GDDR3, which the 6800gt and up have).
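for reference, a ram chip's ns rating maps straight to the clock it's rated for (rated MHz ≈ 1000 / cycle time in ns, doubled for the effective DDR rate). a quick back-of-the-envelope sketch:

```python
# Convert a memory chip's ns rating to its rated clock.
# Rated clock (MHz) = 1000 / cycle_time_ns; DDR transfers twice per cycle.

def rated_clock_mhz(cycle_time_ns: float) -> float:
    """Core clock the chips are rated for, in MHz."""
    return 1000.0 / cycle_time_ns

def effective_ddr_mhz(cycle_time_ns: float) -> float:
    """Effective (double-data-rate) speed, in MHz."""
    return 2 * rated_clock_mhz(cycle_time_ns)

for ns in (2.6, 2.0, 1.6):
    print(f"{ns}ns ram -> ~{rated_clock_mhz(ns):.0f} MHz core, "
          f"~{effective_ddr_mhz(ns):.0f} MHz effective")
```

so the gt's 2.0ns chips are rated for roughly 500 MHz core (1000 MHz effective), versus roughly 385 MHz for the plain 6800's 2.6ns parts, which is where the gt's ram headroom comes from.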

 

i didn't say i was an expert; i actually disclaimed that, if you remember correctly.

that was bad terminology on my part; i just follow them, i'm not necessarily specialized in them.
so please stop putting words in my mouth, thanks.

 

yes, and ati also blatantly cheated with their driver optimisations.. have you been living under a rock??? what do you call "application specific optimisations"? try using an older ati driver, rename the 3dmark exe, and see what kind of 3dmark score you get then (these optimisations still apply in the new drivers, but the application-specific detection is smarter and will catch the change in filename).
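to illustrate the renamed-exe trick: the crudest application-specific optimisations key off the executable's filename, so renaming the benchmark defeats the detection and the score drops. a toy sketch of that kind of check (the filenames and profile strings here are made up for illustration; real drivers also fingerprint shaders and other data, which is the "smarter" detection i mentioned):

```python
# Toy model of filename-based app detection in a driver, the kind of
# check that renaming the benchmark executable defeats. Illustrative only.
import ntpath  # handles Windows-style paths on any OS

OPTIMIZED_APPS = {"3dmark03.exe", "3dmark2001se.exe"}  # hypothetical list

def select_profile(exe_path: str) -> str:
    """Pick a driver profile from the executable name alone."""
    name = ntpath.basename(exe_path).lower()
    return "app-specific" if name in OPTIMIZED_APPS else "generic"

print(select_profile(r"C:\Program Files\3DMark03\3DMark03.exe"))  # app-specific
print(select_profile(r"C:\Program Files\3DMark03\renamed.exe"))   # generic
```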

 

you also completely contradict yourself in the line "I'm sorry, the last thing i mean to do is offend you, however from reading your comments i would highly suggest that no one takes advice from you" <-- the only thing i have to say to that is that i am highly offended, and that's exactly what you set out to do. you have not proven anything i've said wrong, and you have twisted my words around to your liking. i don't appreciate that, and i would like some facts and links to back up your wonderful thoughts and ideas, buddy.

 

from what i've read, i don't think anyone should take advice from you, because you only think you know what you're talking about. you have provided no proof and no links. i am willing to provide a great quantity of links and proof for everything i've said and back every bit of it up; you, however, are talking out of your butt.


I'd go for the 6800GT. It's clearly the better deal. My 6800GT oc'd to Ultra speeds right out of the box; i can go higher but i'm not gonna push it too hard for now. This thing soars in Doom3, i can play it on ultra quality just fine. The X800PRO is not worth it compared to the 6800GT. 99.99% of all 6800GT's can reach Ultra speeds, while very few X800PRO's can be successfully softmodded and OC'd to an X800XT, and some softmodded PRO's die a couple of days after the softmod.

 

The power and heat are not a big deal either. A 9800XT idles at higher temps than my OC'd 6800GT does. My PSU rails haven't really budged either, coming from an OC'd, softmodded 9500NP. The card is a little big, though, and can be a tight fit in some cases, but you shouldn't worry.

 

The 6800GT can almost guarantee Ultra speeds, while the X800PRO cannot. At stock the X800PRO also loses.


Originally posted by deathadder

Thanks so far for the info. I have actually been looking at the 6800 GT a little more favorably than the x800pro, especially the BFG card because of its good overclocks and lifetime warranty. I hope Viper John puts some input here; he definitely seems to know his shiznit about video cards/drivers and such.

 

Well the cards are pretty evenly matched... even in Doom3, if the Cat 4.9's are used and the Interaction.vfp is edited (or simply replaced with the one in the link below) to uncripple the Radeon. Both cards will easily run right up and sit on Doom3's 60fps cap.

http://esprit.campus.luth.se/~humus/temp/d...rmanceTweak.rar

The x800Pro is about 45 bucks or so cheaper... $360 versus $405, but admittedly I haven't searched all that hard on GT pricing.

The X800Pro's have been stock OC'ing like cats with their butt ends on fire and have a decided edge over the NV40 there. The worst x800Pro i've had OC'ed 560/560 stock. Have one here right now that OC's 574/580 stone stock with the Pro bios. A flash to the XT-PE bios (still 12 pipe though) usually adds another 10 to 13.5Mhz or so to the stock Pro memory OC.

The x800pro's have been modding out well too. The 560/560 card above came in at 633/621 clean and had no trouble cracking 13,000 in 3DM03 running 12 pipes in my DFI UI running 253x11 (which is bottlenecking the card).

If we can ever get around ATI's multiple layers of hardware/software protection that prevent the last quad of pipes from being enabled, the X800Pro's will clean NV's clock hands down.

Lastly, a lifetime warranty sounds good, but in reality it is somewhat pointless and only increases what you pay for the card. It is only worthwhile if you plan to make the card your last vid card purchase for several years, and that is HIGHLY unlikely. It will be obsolete in 12 to 18 months and you will be shopping for the next latest and greatest by then.

You will be very hard-pressed to go wrong whichever way you choose to go.

 

Viper


Originally posted by Kamel

http://www.anandtech.com/video/showdoc.aspx?i=2146&p=3

 

^^ lol, the 6800gt pisses all over the X800XTPE, a $600 card, and that's without taking into consideration its insane overclocking abilities.

 

it kinda makes me feel bad for ati, but not really... the fanATics have been going nuts ever since the release of the 9700pro, it's about time nvidia got a decent break.

 

 

Those tests were run before the Cat 4.9's and before the Interaction.vfp tweaks were found, so the x800's in them were running crippled.

 

Viper


Originally posted by ViperJohn

Those tests were run before the Cat 4.9's and before the Interaction.vfp tweaks were found, so the x800's in them were running crippled.

 

Viper

 

I don't think this is a fair assessment to make. ATi is allowed to make driver updates to enhance compatibility or performance with certain software, but tweaks such as the interaction.vfp skew results. Rather, compare ATI's Doom 3 performance to NVIDIA's merely with their newest driver versions, as the interaction.vfp does not make for an "apples to apples" comparison. I can only imagine that if such a tweak were made to further NVIDIA's Doom 3 performance, everyone would be crying about more "driver optimizations" on NVIDIA's part. Furthermore, the interaction.vfp does not net a significant gain in performance and has been proven to create artifacting and unnecessary texture tearing.

 

I think that both the 6800 series and x800 series are great cards, each with their strengths and weaknesses, but in terms of Doom 3 NVIDIA has won this round hands down. This is something that fanboys on both sides should accept. I am positive that ATI will put out its fair share of wins within the next several months as well. Furthermore, people should understand that comparisons between cards should be made at stock and from as even a baseline as possible. If a website such as HardOCP or Tom's were to compare a 3.0c with a 3200+, they should do so with both at 200 fsb and stock mhz, not with the 3200+ running at 250fsb and the intel left at 200. It simply doesn't make for accurate conclusions.

 

deception``


Originally posted by deception``

I don't think this is a fair assessment to make. ATi is allowed to make driver updates to enhance compatibility or performance with certain software, but tweaks such as the interaction.vfp skew results. [...] comparisons between cards should be made at stock and from as even a baseline as possible. [...]

 

It is more than fair when the code of the game is written in a way that cripples a given video card, be it ATI or NV. You will probably see a game patch in the not too distant future that addresses the settings directly. That is what tuning and patching a new game is all about, and it works both ways. If the game's card-detection algorithms were better it would do this for you now, but that is not the game developers' fault. If they kept tuning and changing code with every piece of new hardware that came out, the game would never get released. They have to cut off development/code changes at some point and go into actual production. The simple fact is the game's code was finalized/locked down, and production/pressing of game disks started, long before either card was available and any meaningful testing could be done. There are probably many more tweaks to be found that will make either card run better.

 

Skewing the results is a moot point, as the TimeDemo does that for you. When running TimeDemo, Doom3's max FPS cap (60fps) is disabled and it shows frame rates that you will never see in the game itself with either card. The interaction.vfp works better in the game than in the TimeDemo, BTW.

 

If you are getting texture tearing, that is because the frame rate is running higher than the monitor's refresh rate, which proves the card is running faster.

 

Viper


Originally posted by ViperJohn

Those tests were run before the Cat 4.9's and before the Interaction.vfp tweaks were found, so the x800's in them were running crippled.

 

Viper

 

that patch affects the IQ of the anisotropic filtering. it isn't too bad, admittedly, but be aware that it comes at a small expense of IQ.

 

i totally agree, though; it is hard to go wrong with either card... you definitely won't get ripped off, i just think the 6800gt is a better choice.

 

the 6800gt's overclock very well too; with automatic overclocking, the average 6800gt can get around 425/1170, which is even higher than ultra speeds. of course, it can reach up to 75C under load, and even higher on some cards, but that doesn't affect stability at all. it also does not add very much to the overall temps of your case (feeling for ambient heat from my friend's 6800gt with the "feeler rod" [that's what i named it], i couldn't notice any; it felt like about 80F).

 

as for the image quality differences, i don't want to be the pot calling the kettle black... here are links pointing out the differences between nvidia and ati as far as optimizations go, including the "humus tweak" you posted in your response. like i said, it comes at the expense of IQ, though admittedly not very much. take a look at the comparisons and see for yourself what you think; you don't need someone else telling you whether the IQ is good or not.

 

on one last note... the tweak did "improve" the speed, but doom3 still favors nvidia cards after the tweak; the gap just isn't as wide as it used to be.
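for the curious, the humus tweak swaps doom3's specular lookup textures for direct arithmetic in the fragment program, which runs faster on ati hardware but rounds slightly differently, hence the small IQ differences in the comparison shots below. a rough python illustration of that tradeoff (the exponent and table size here are made-up numbers, purely for illustration, not doom3's actual values):

```python
# Illustrative only: approximating pow(x, n) for specular lighting with a
# small lookup table (as a texture read would) vs computing it directly.
SPECULAR_EXP = 16      # assumed specular exponent
TABLE_SIZE = 64        # assumed "texture" resolution

# Precompute the lookup table, one entry per texel.
table = [(i / (TABLE_SIZE - 1)) ** SPECULAR_EXP for i in range(TABLE_SIZE)]

def specular_lookup(x: float) -> float:
    """Nearest-texel lookup, like a dependent texture read."""
    i = round(x * (TABLE_SIZE - 1))
    return table[i]

def specular_math(x: float) -> float:
    """Direct arithmetic, like the tweaked fragment program."""
    return x ** SPECULAR_EXP

# The two agree only to within the table's quantization error, which is
# the kind of subtle pixel difference the comparison shots pick up.
worst = max(abs(specular_lookup(i / 256) - specular_math(i / 256))
            for i in range(257))
print(f"worst-case difference: {worst:.4f}")
```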

 

the doom3 tweak

http://www.skenegroup.net/tertsi/doom3/ati...t_vs_humus.html

http://www.skenegroup.net/tertsi/doom3/ati..._vs_humus2.html

 

the "optimization" that ati made (the "smart shader" that nvidia questions, even though it supposedly doesn't affect IQ to the naked eye)... if your eyes are clothed, please make them naked now:

 

http://www.skenegroup.net/tertsi/doom3/ati...ia_vs_ati1.html

http://www.skenegroup.net/tertsi/doom3/ati...ia_vs_ati2.html

http://www.skenegroup.net/tertsi/ut2003/ati_optimizations/

http://www.skenegroup.net/tertsi/ut2003/at...dia_vs_ati.html

http://www.skenegroup.net/tertsi/doom3/ati.../47_vs_491.html

http://www.skenegroup.net/tertsi/doom3/ati.../47_vs_492.html

