
G80 Taped out (Good Info)



They're powerhouses in any case, whether they're running DX10 or DX9, so don't dismiss them on that premise. I don't know about G80, but R600 will boost DX9 performance by quite a bit.


Never said it won't, but let's face it, graphics aren't going anywhere besides DX10. A 7800 can run everything nowadays, a 7900/X1900 even more so, and as for resolution, uber-high res has SLI/CF...

No real point in a speed advancement here unless you want to use higher AA...

I mean, faster is always better, duh, but it won't give us much for now. That's my take, at least.


Here:

The firm has decided it doesn't need a unified shader architecture for its upcoming G80 chip. Instead, it reckons you will be fine with twice as many pixel shaders as geometry and vertex shaders.

 

As we understand it, if an Nvidia DX10 chip ends up with 32 pixel shaders, the same chip will have 16 shaders able to process geometry instancing or vertex information.

 

ATI's R600 and its unified shaders work a bit differently. Let's assume ATI hardware has 64 unified shaders. That means ATI can process at most 64 shader operations per clock, in whatever mix the workload demands: it might be 50 pixel plus 14 vertex and geometry operations per clock, or 40 vertex, 10 pixel and 14 geometry operations per clock. Any ratio that adds up to 64 will do.
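The ratio arithmetic above can be sketched as a toy model. To be clear, this is only an illustration of the article's hypothetical unit counts (64 unified vs. a 32/16 fixed split); real GPU schedulers are far more complex, and the function names are mine:

```python
# Toy model of per-clock shader allocation (illustrative only; the unit
# counts are the article's hypothetical figures, not confirmed specs).

UNIFIED_POOL = 64                              # hypothetical R600-style pool
FIXED_CAPS = {"pixel": 32, "vertex_geom": 16}  # hypothetical G80-style split

def unified_ok(pixel, vertex, geometry, pool=UNIFIED_POOL):
    """A unified design accepts any non-negative mix that fits in the shared pool."""
    return min(pixel, vertex, geometry) >= 0 and pixel + vertex + geometry <= pool

def fixed_ok(pixel, vertex, geometry, caps=FIXED_CAPS):
    """A fixed design caps pixel work and vertex/geometry work separately."""
    return pixel <= caps["pixel"] and vertex + geometry <= caps["vertex_geom"]

# Both of the article's example mixes fit a 64-unit unified pool:
print(unified_ok(50, 14, 0))   # 50 pixel + 14 vertex/geometry -> True
print(unified_ok(10, 40, 14))  # 10 pixel, 40 vertex, 14 geometry -> True
# ...but the vertex-heavy mix blows past a fixed 16-unit vertex/geometry cap:
print(fixed_ok(10, 40, 14))    # -> False
```

The point of the sketch: the fixed design wastes units whenever the frame's workload doesn't match the hard-wired ratio, while the unified design only cares about the total.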

 

The Nvidia chip is limited to 32 pixel and 16 vertex and geometry operations per clock, which might be a winning ratio, but it is still too early to say. We don't know who will win the next-generation hardware game and whose approach is better: ATI's unified shaders or Nvidia's two-to-one ratio.

 

DirectX 10 actually doesn't care how you do your shaders, since you talk to an abstraction layer and the hardware can handle its pixel, vertex and geometry data the way it wants. It serves the information up to DirectX 10, which processes it in the end.

 

Nvidia's G80 is fully DirectX 10 and Shader Model 4.0 capable, but it won't be unified, according to our sources.


Quote: "Never said it won't, but let's face it, graphics aren't going anywhere besides DX10. A 7800 can run everything nowadays, a 7900/X1900 even more so, and as for resolution, uber-high res has SLI/CF... No real point in a speed advancement here unless you want to use higher AA... I mean, faster is always better, duh, but it won't give us much for now. That's my take, at least."

 

Not really. For one, you don't need Vista for DX10. Officially you will, unofficially you won't.

The DDK, for example, installs DX10, and that installs on previous versions of Windows.

 

 

As for the second point: the newest cards, I don't know. But I do know, for example, that if you cranked the details all the way up in Far Cry (I mean all the way, including from the driver), on the third level you got 3 fps on a 6600GT at 640x480x32. No AA, and I never use AF because the quality is worse.

There are cards out there probably 4 times faster in GPU power, and cards with just over 2x the memory bandwidth. Still, you'd need a card that can process 100 times faster in the GPU, or something like that.

Don't take this as just cranking up the details in the NV control panel and the Far Cry menus...

 

I mean, that game is freaking old, and I couldn't run it at max detail. Well, I could, but not on all levels. I can only imagine how much power the newest Far Cry and UT will need, especially UT because of the texture resolutions they use.

You may be able to run it, but I can't without jacking the details down to somewhat normal levels in some areas, and yuck lol.

 

 

The best way to put it is: for the majority of people, you're right. For some people though, we need more power, lots more.

 

 

Mack27

Is there any word on the TMUs per pixel pipeline?


a. To be honest, I don't care about unofficial support. I presume MS won't let us run it any other way without Vista, and even if it does, it will suck.

b. Ah, why are you comparing a 6600? I mean, what, you meant that in two years the price equivalent of a 6600 could run Far Cry at 32xAA? So what? By then you'll say Crysis is running at 6 fps... that wasn't a good comparison, or I didn't understand it at all :) - it is 2 AM :L

I can run Far Cry at 1280 with everything at ultra with 4xAA, or HDR at level 7, at more than playable FPS... so no, I don't see a reason for a faster card, to be honest. Maybe a bit more, but that's just so I'll get not only 50 but 60 fps to match my vsync.

