ATi vs. nVidia


Nerm


ATI fans, read and weep. B)

Clicky

Read and weep? Are you kidding? That's like comparing a TNT2 to a 9800 XT. They are two completely different generations of hardware. Like everyone else said, you can't compare apples with oranges.

 

We'll see who wins when ATI releases their new shiznit.

 

Just remember not to let fanboyism get in the way of decision making. If nVidia turns out to be faster, buy it; if ATI does, buy them. That's the nature of capitalism.


I'm going to hold off on praising nVidia for their card. For all we know it could be a total flop compared to ATI's new card.

You couldn't be more right! Hopefully ATI's new card rips up the competition :D

 

It will take a hell of a card from nvidia to sway me away from ATI.

 

 

 

Anyone seen any news on the new ATI card? The only thing I know is it's apparently called the "X800 XT", and I got that from Paranoid.


http://theinquirer.org/?article=15377

http://theinquirer.org/?article=15248

http://theinquirer.org/?article=15229

http://www.xbitlabs.com/news/video/display...0413124046.html

http://www.xbitlabs.com/news/video/display...0408100025.html

http://www.xbitlabs.com/news/video/display...0406161139.html

 

basically the following is what is known:

 

X800 SE - R420, 8 Pipelines, ??? MB 800 MHz 128-bit DDR, 450 MHz core clock, AGP

X800 Pro - R420, 12 Pipelines, 256 MB/512 MB 1000 MHz 256-bit DDR/GDDR3, 500 MHz core clock, AGP

X800 XT - R420, 16 Pipelines, 256 MB/512 MB 1200 MHz 256-bit DDR/GDDR3, 600 MHz core clock, AGP

X880 Pro - R423, 12 Pipelines, 256 MB/512 MB 1000 MHz 256-bit DDR/GDDR3, 500 MHz core clock, PCI Express

X880 XT - R423, 16 Pipelines, 256 MB/512 MB 1200 MHz 256-bit DDR/GDDR3, 600 MHz core clock, PCI Express

 

the following may or may not be true:

 

X600 - RV380, 4 Pipelines, 128 MB/256 MB 600 MHz 128-bit DDR, PCI Express, ??? MHz core clock

X300 - RV370, 4 Pipelines, ??? MB ??? MHz 64/128-bit DDR, PCI Express, ??? MHz core clock
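
For anyone wondering what those memory clocks translate into, here's a rough back-of-the-envelope sketch (my own arithmetic, not from the articles) of theoretical memory bandwidth, assuming the MHz figures listed above are effective DDR data rates:

[code]
# Rough sketch: theoretical memory bandwidth from the rumoured specs above.
# Assumes the listed MHz figures are effective (post-DDR) data rates.
#   bandwidth (GB/s) = effective clock (MHz) * bus width (bits) / 8 / 1000

rumoured_cards = {
    "X800 SE":  (800, 128),   # (effective memory MHz, bus width in bits)
    "X800 Pro": (1000, 256),
    "X800 XT":  (1200, 256),
}

for name, (mem_mhz, bus_bits) in rumoured_cards.items():
    gb_per_s = mem_mhz * (bus_bits / 8) / 1000   # MB/s -> GB/s
    print(f"{name}: ~{gb_per_s:.1f} GB/s")

# X800 SE  -> ~12.8 GB/s
# X800 Pro -> ~32.0 GB/s
# X800 XT  -> ~38.4 GB/s
[/code]

For comparison, if I remember the clocks right, a 9800 XT's 256-bit bus at roughly 730 MHz effective works out to around 23 GB/s, so the X800 XT rumour would be a big jump.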



512 MB? Holy crap... soon, our video cards will have more RAM than most people's computers!


I am disappointed in the core frequencies of the upcoming cards, to say the least. I would have liked to see the high-end cards sporting 1000 MHz rather than more RAM we don't even use. I mean, come on, the 9800 XT and the 5950 Ultra don't even utilize all of the 256 MB they have, so why make a card that has twice that but only slightly faster clock speeds?


Guest Ballz2TheWallz

The nVidia card is clocked at 400 MHz, and it beats the living **** out of higher-clocked cards. As AMD showed, CLOCK ISN'T EVERYTHING; the nVidia NV40 chip is supposedly more powerful than most P4s.

 

Another example: the 9800 XT is clocked at 412 MHz as opposed to the 5950's 475 MHz, and in most cases it has much better performance.



I am disappointed in the core frequencies of the upcoming cards, to say the least. I would have liked to see the high-end cards sporting 1000 MHz rather than more RAM we don't even use. I mean, come on, the 9800 XT and the 5950 Ultra don't even utilize all of the 256 MB they have, so why make a card that has twice that but only slightly faster clock speeds?

I agree. We all know they have very low-voltage CPUs running at 1 GHz, and jeez, you can get a 2.8 GHz CPU for 150 bones. I think they could put one on a PCB, add some RAM, and make one hell of a card. Of course heat would be an issue, but I know they could be using faster GPUs and still make a healthy profit, especially when a top-of-the-line card runs $500+.

 

it's called holding out and making as much as you can on as little as possible ;)


Yup, it's not about the core frequencies anymore, it's about how much the card can render. Think back to the FX 5800 Ultra and how crap it was compared to a 9800. nVidia have always used brute-force MHz to win, but they couldn't do it with the FX series. Under DirectX 9 it's all about how the vertex shaders perform and how many pixel pipes the card has. The new NV40 has 16 pipes in a 16x2 layout, or 32 in a theoretical 32x0 layout, whereas an FX 5950 is 4x2 or 8x0.

Also, I can't remember where I read it, but I've heard the new NV40 is the only card to be hardware compatible with DirectX 9.0c when it's released with Service Pack 2. This is because it can handle the FP32 128-bit floating-point colour accuracy required by DX9c (32 bits per channel across four channels = 128 bits per pixel), as against the R420's FP24 (DX9b). If this is true it could be the downfall for ATI, as the card can't push FP32. Effectively the FX 5950 is DirectX 9.0c hardware compatible as well, but performs very badly under FP32. Only time will tell. It will be great once both Half-Life 2 and Doom 3 are running on the R420 and NV40, because then we'll be able to see the differences. Also, Far Cry is being upgraded to Pixel Shader 3.0 soon (DX9c), which will add to the eye candy! :D
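
To put rough numbers on the pipes-vs-clock point, here's a quick sketch (my own arithmetic, using the clocks and pipe counts mentioned in this thread, so treat the NV40 and X800 figures as rumours) of theoretical pixel fill rate, i.e. pipelines times core clock:

[code]
# Quick sketch: theoretical pixel fill rate = pixel pipelines * core clock (MHz).
# Pipe counts and clocks are the ones quoted in this thread (rumours for the
# unreleased cards); real performance also depends on memory bandwidth,
# shader throughput, drivers, etc.

cards = {
    "Radeon 9800 XT":        (8, 412),   # (pixel pipelines, core clock in MHz)
    "GeForce FX 5950 Ultra": (4, 475),   # 4x2 layout -> 4 pixel pipes
    "NV40 (rumoured)":       (16, 400),
    "X800 XT (rumoured)":    (16, 600),
}

for name, (pipes, core_mhz) in cards.items():
    mpixels_per_s = pipes * core_mhz
    print(f"{name}: {pipes} x {core_mhz} MHz = {mpixels_per_s} Mpixels/s")

# 9800 XT:  3296 Mpixels/s    FX 5950 Ultra: 1900 Mpixels/s
# NV40:     6400 Mpixels/s    X800 XT:       9600 Mpixels/s
[/code]

Which is exactly why a 9800 XT can beat a higher-clocked 5950: it simply pushes more pixels per clock.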


nVidia promises a 50% increase with DirectX 9.1, but when it comes out ATI will gain an increase too, so nVidia sucks! I hate their cards. If I ever had the chance alone with an nVidia card, you would never be able to tell it was a video card. I'm a diehard ATI fan.

 

ATI Ownz! Don't forget that!



The NV40 will OWNZ the ATI card. I have always been a diehard nVidia fan EXCEPT for the FX series. Let's pretend that never existed and then talk about which company is better. :P

