NVIDIA GF100 Architecture Details


Nemo

:blink: What? You seem to think I'm anti-nVidia or something...

 

I can tell you this: Nobody knows anything concrete performance-wise (other than "good") and the numbers being thrown around mean absolutely dick. I'll take the same attitude I did with all of the 5-series hype and wait for actual performance numbers. Till then it's all just idle speculation without much merit. :lol: Sure it'll be fast but until we see real numbers, real power consumption, and real prices it doesn't mean a thing.

 

That was a joke.

 

And the numbers in the topic I posted were 100% real, from beta drivers at a very early stage. But we still don't have real power or pricing figures (which we all fear are gonna be high). It is still "numbers that mean something," and you can choose to see them or not.



But it is "numbers that mean something," and you can choose to see them or not.

With zero information on clocks, these numbers definitely don't mean anything. I'll wait on ccokeman to give us a review and actual performance results :)


Hmm, the technology they're implementing seems to be pretty advanced, and I can't wait to see actual benchmarks, and even how far cards based on this architecture are able to overclock. Though I wonder if it's arriving too late, as AMD has already captured a good part of the market with their value and performance cards.

Good review, ccokeman, very informative. I'm glad I looked it over. I've seen some of these vids in various places on the web, but this is the first time I've seen it all compiled for conclusions...

I'd like a firm release date though... B:)


With zero information on clocks, these numbers definitely don't mean anything. I'll wait on ccokeman to give us a review and actual performance results :)

 

I'm actually not too concerned with the data, what little there is. What is there is real: not final clocks or wattage, but the fps and such are real. Those numbers don't mean anything to you or whoever else decides they're meaningless, but this is real data. It's not the whole picture, but it is useful and meaningful in what little it represents.

 

The GF100 driver team has their work cut out for them. This card takes a lot of effort: to get the most out of it, much time and energy must be spent finding the most efficient use of the ultra-flexible design. Seemingly endless, totally new ways of processing the data are being born. It's crazy.

 

The amount of data available is small but not meaningless. The fps and other data that was viewed is real, not made up. That doesn't mean all games will show such improvements. The GF100 has a lot going for it, but nothing has been designed around it to date. So the drivers are going to have to make the most of this card, and you may see a few anomalies here and there. Some games may not fit well with it no matter the effort; we will soon know. But for now there is some data out on a few particular cases that have been witnessed. You can bet that upon release, in those cases, the card will perform at least that well, if not better.

Edited by ocre


I'm actually not too concerned with the data, what little there is. What is there is real.

Do you realize you just contradicted yourself in the premise of your argument? :lol:

 

The numbers that have been posted mean nothing.

 

The GF100 driver team has their work cut out for them. This card takes a lot of effort: to get the most out of it, much time and energy must be spent finding the most efficient use of the ultra-flexible design. Seemingly endless, totally new ways of processing the data are being born. It's crazy.

On the GPGPU side of things I agree. In terms of rendering, the GF100 is little more than a souped-up GT200 chip (much like the RV870 is a souped-up RV770). The architecture is largely similar in terms of pushing polygons and shading; it's the computing side of the GF100 that's going to take the most work to make efficient, both in terms of driver work and in terms of recoding existing programs to run efficiently.

 

The fps and other data that was viewed is real, not made up.

How do you know that? :unsure:

Edited by Waco


Am I the only one who thinks the GF100 code name is a stupid idea, even if it is only a codename and not a final product name? Most of us have been expecting Fermi to be the GT300 series. And the everyday consumer who doesn't read up on tech will figure the 100 series is lower than the 200 series, and therefore buy the 200 series, because newer products have higher numbers (i.e. the 9 series came after the 8, which came after the 7, etc.). Also, why go with GF when NVIDIA has traditionally stuck with GT, GTS, GTX?

 

I'd also love to see the real-world performance, and I can't wait to see what pricing will be like, and whether ATI will respond with price cuts or new cards.


Am I the only one who thinks the GF100 code name is a stupid idea, even if it is only a codename and not a final product name? Most of us have been expecting Fermi to be the GT300 series. And the everyday consumer who doesn't read up on tech will figure the 100 series is lower than the 200 series, and therefore buy the 200 series, because newer products have higher numbers (i.e. the 9 series came after the 8, which came after the 7, etc.). Also, why go with GF when NVIDIA has traditionally stuck with GT, GTS, GTX?

 

I'd also love to see the real-world performance, and I can't wait to see what pricing will be like, and whether ATI will respond with price cuts or new cards.

GF100 is the name of the chip. The card will likely be the GTX 360/380.


Wait, so this thing is pretty much a quad-core GPU??

I like all the side technologies they're working on, just hope the hardware is under $1K.

 

I'm with Waco in that I don't get excited until I see some more concrete stuff from multiple sources.


Wait, so this thing is pretty much a quad-core GPU??

Well, kinda.

 

It's 4 "cores," each with 128 CUDA cores (so sorta like a 128-core GT200b with upgrades for GPGPU apps and DX11, as far as I understand it). Designing it like that allows them to cut it down pretty easily.
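A quick sanity check of the arithmetic above: 4 clusters of 128 CUDA cores puts the full chip at 512, and disabling whole clusters is what makes cut-down parts easy. This is just a sketch of the layout as described in the post; the variable names are illustrative, not NVIDIA's terminology.

```python
# Core-count arithmetic for the 4-cluster layout described above.
# Figures come from the post; names here are illustrative only.
clusters = 4                  # the 4 "cores" (processing clusters)
cuda_cores_per_cluster = 128  # CUDA cores in each cluster

full_chip = clusters * cuda_cores_per_cluster
print(full_chip)  # full chip: 512 CUDA cores

# Cutting the design down by disabling whole clusters yields the
# smaller salvage configurations ("cut it down pretty easily"):
for enabled in range(clusters, 0, -1):
    print(enabled, "clusters ->", enabled * cuda_cores_per_cluster, "CUDA cores")
```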

Edited by Waco

