
GTX 600s might release in April



10% faster? I think I'll skip this one; I can just OC my cards and get a lot more performance. That 2304 CUDA core GK110 looks interesting, but probably not worth upgrading my cards for.
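Rough numbers on the OC point, assuming performance scales roughly with core clock (optimistic, but fine as a sanity check; the 1125MHz figure is just a commonly reported air overclock, not a guarantee): a stock 7970 runs 925MHz, so a typical OC is already well past a 10% uplift.

```python
# Back-of-envelope: OC headroom vs. a rumored 10% gen-on-gen uplift.
# Assumes performance scales roughly linearly with core clock, which
# is optimistic but close enough for a rough comparison.
REFERENCE_MHZ = 925    # stock HD 7970 core clock
TYPICAL_OC_MHZ = 1125  # a common air-cooled overclock (illustrative assumption)

oc_uplift = (TYPICAL_OC_MHZ - REFERENCE_MHZ) / REFERENCE_MHZ
print(f"OC uplift: {oc_uplift:.0%} vs. the rumored 10% for the new card")
# -> OC uplift: 22% vs. the rumored 10% for the new card
```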

I'm just going to keep my 28nm cards; they max out everything I want, and I might just WC them. I'll wait for the next 4608 CUDA core, lower-nm card in some years to come :P

Nyt, don't lie to yourself. You won't be able to keep those cards until a new gen is released. We all know that :happy:


Nyt, don't lie to yourself. You won't be able to keep those cards until a new gen is released. We all know that :happy:

From memory, these cards and his current rig config etc. are supposed to last till 2015 for him or something, with no more upgrades.


From memory, these cards and his current rig config etc. are supposed to last till 2015 for him or something, with no more upgrades.

I think three years is manageable, if you're alright playing games at mid graphics near the end. Which I have a hard time believing Nyt would be :P


From memory, these cards and his current rig config etc. are supposed to last till 2015 for him or something, with no more upgrades.

 

Theoretically, yes. Practically for him: no.


Just a question, Nyt: you have a single 1920x1080 monitor/TV, right?

 

I know this may sound ironic coming from me, but why do you need dual 7970s? Are you running 3D? 'Cause that may explain it in some more intensive games. Just wondering why so much power is needed for a single screen.


Nyt, don't lie to yourself. You won't be able to keep those cards until a new gen is released. We all know that :happy:

 

Only until something more worthwhile like 15nm or below

 

Just a question, Nyt: you have a single 1920x1080 monitor/TV, right?

 

I know this may sound ironic coming from me, but why do you need dual 7970s? Are you running 3D? 'Cause that may explain it in some more intensive games. Just wondering why so much power is needed for a single screen.

 

Single 1920x1080 TV that is 100Hz, and I need dual 7970s in order to play super-intense games like Metro 2033 and The Witcher 2 with ubersampling at an FPS which is not laggy for me :P

 

From memory, these cards and his current rig config etc. are supposed to last till 2015 for him or something, with no more upgrades.

 

Well, at least until 2014, so that's like 1.5-2 years, which I think is easily achievable since these cards support DX11.1 for Windows 8 anyway, so I don't have to upgrade for a new DX :P

 

I'd probably upgrade to the next die-shrink cards of at least 15nm or below (else it's pointless really), and then maybe a Haswell or the next enthusiast platform after LGA2011 in 2-3 years, since CPUs can last much longer than GPUs; CPUs hardly affect FPS. Heck, I could've stayed with my 4GHz i7 920 and still enjoyed games :lol:


Only until something more worthwhile like 15nm or below

 

 

 

Single 1920x1080 TV that is 100Hz, and I need dual 7970s in order to play super-intense games like Metro 2033 and The Witcher 2 with ubersampling at an FPS which is not laggy for me :P

 

 

 

Well, at least until 2014, so that's like 1.5-2 years, which I think is easily achievable since these cards support DX11.1 for Windows 8 anyway, so I don't have to upgrade for a new DX :P

 

I'd probably upgrade to the next die-shrink cards of at least 15nm or below (else it's pointless really), and then maybe a Haswell or the next enthusiast platform after LGA2011 in 2-3 years, since CPUs can last much longer than GPUs; CPUs hardly affect FPS. Heck, I could've stayed with my 4GHz i7 920 and still enjoyed games :lol:

Nyt, why did you bother mentioning 100Hz? That's just the TV's refresh spec, and it doesn't put more strain on the GPU :P

 

If you could've stayed with the 920, why didn't you? :P


Single 1920x1080 TV that is 100Hz, and I need dual 7970s in order to play super-intense games like Metro 2033 and The Witcher 2 with ubersampling at an FPS which is not laggy for me :P

 

The bolded part is the most important part of your whole post. :P


One problem with that demo, though, is that it isn't apples-to-apples. The link states they used FXAA instead of MSAA to increase performance. I'm not sure whether the difference between the AA methods could account for such an impressive result, but it's still inappropriate to say one 680 matches three 580s, because there is currently no strong evidence supporting that.

My point is that the information in that link doesn't necessarily show the 680 is that far ahead of the 580.
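Back-of-envelope on why the AA swap matters (illustrative numbers only, not from the demo): 4x MSAA has to store and resolve four coverage samples per pixel, while FXAA is just one extra pass over the finished 1080p frame, so swapping one for the other skews any performance comparison.

```python
# Illustrative only: why an FXAA-vs-MSAA demo isn't apples-to-apples.
# 4x MSAA stores and resolves 4 coverage samples per pixel, while FXAA
# is a single post-process pass over the final image.
WIDTH, HEIGHT = 1920, 1080
MSAA_SAMPLES = 4

pixels = WIDTH * HEIGHT
msaa_samples = pixels * MSAA_SAMPLES  # framebuffer samples to store/resolve
fxaa_reads = pixels                   # one filtering pass over the frame

print(f"4x MSAA: {msaa_samples / 1e6:.1f}M samples; "
      f"FXAA pass: {fxaa_reads / 1e6:.1f}M pixels")
# -> 4x MSAA: 8.3M samples; FXAA pass: 2.1M pixels
```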

 

Now, if there really are going to be three times as many shaders, I want to see some PPD!

 

IMO FXAA looks far better than MSAA and uses a ton less resources, which sucks because AMD can't do that.

 

So you have PhysX, and now a better FXAA that NVIDIA can do, and do well I might add. There is some other stuff they have coming that is going to boost their cards' performance, which is going to push AMD further and further down the line.

 

I can confirm OCC has two Kepler cards and I have one of them. All I can say is that what you are seeing from the demo is not far off the mark.


IMO FXAA looks far better than MSAA and uses a ton less resources, which sucks because AMD can't do that.

FXAA works with all cards, actually - it's just a post-processing filter. Skyrim, for example, allows FXAA on both AMD and nVidia cards.
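A minimal sketch of the idea, not the real FXAA shader: estimate luminance on the finished frame, find high-contrast pixels, and blend them toward their neighbours. Since it only ever sees the final image, any vendor's GPU can run it, which is the point. (The `toy_post_aa` name and the 0.1 threshold are made up for illustration.)

```python
import numpy as np

def toy_post_aa(frame: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Toy luma-based post-process AA in the spirit of FXAA (much simplified).

    frame: float RGB image, shape (H, W, 3), values in [0, 1].
    """
    # Perceptual luminance estimate (FXAA also operates on luma).
    luma = frame @ np.array([0.299, 0.587, 0.114])

    # Local contrast: how much each pixel differs from its 4 neighbours.
    pad = np.pad(luma, 1, mode="edge")
    neighbours = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                  pad[1:-1, :-2] + pad[1:-1, 2:]) / 4
    edge = np.abs(luma - neighbours) > threshold

    # Blend edge pixels toward their neighbourhood average to soften jaggies.
    pad_rgb = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
    avg = (pad_rgb[:-2, 1:-1] + pad_rgb[2:, 1:-1] +
           pad_rgb[1:-1, :-2] + pad_rgb[1:-1, 2:]) / 4
    out = frame.copy()
    out[edge] = 0.5 * frame[edge] + 0.5 * avg[edge]
    return out
```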

