El_Capitan Posted March 9, 2012 May I ask where? http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600094002%20600085533&IsNodeId=1&name=GeForce%20GTX%20580%20%28Fermi%29 Not to mention people selling used ones for ~$370.
Prunes Posted March 9, 2012 10% faster? I think I'll skip this one; I can just OC my cards and get a lot more performance. That 2304-CUDA-core GK110 looks interesting, but it's probably not worth upgrading my cards for. I'm just going to keep my 28nm cards, they max out everything I want, and I might just water-cool them. I'll wait for the next 4608-CUDA-core, lower-nm card in some years to come. Nyt, don't lie to yourself. You won't be able to keep those cards until a new gen is released. We all know that.
Stonerboy779 Posted March 9, 2012 Nyt, don't lie to yourself. You won't be able to keep those cards until a new gen is released. We all know that. From memory, these cards and his current rig config etc. are supposed to last him till 2015 or something, with no more upgrades.
NikoDG Posted March 9, 2012 From memory, these cards and his current rig config etc. are supposed to last him till 2015 or something, with no more upgrades. I think three years is manageable if you're alright playing games at medium graphics near the end, which I have a hard time believing Nyt would be.
d6bmg Posted March 9, 2012 From memory, these cards and his current rig config etc. are supposed to last him till 2015 or something, with no more upgrades. Theoretically, yes. Practically, for him: no.
Muchoman1 Posted March 9, 2012 Just a question, Nyt: you have a single 1920x1080 monitor/TV, right? I know this may sound ironic coming from me, but why do you need dual 7970s? Are you running 3D? Because that may explain it in some more intensive games. Just wondering why so much power is needed for a single screen.
Nyt Posted March 9, 2012 Nyt, don't lie to yourself. You won't be able to keep those cards until a new gen is released. We all know that. Only until something more worthwhile, like 15nm or below. Just a question, Nyt: you have a single 1920x1080 monitor/TV, right? I know this may sound ironic coming from me, but why do you need dual 7970s? Are you running 3D? Because that may explain it in some more intensive games. Just wondering why so much power is needed for a single screen. Single 1920x1080 TV that is 100Hz, and I need dual 7970s in order to play super-intense games like Metro 2033 and The Witcher 2 with ubersampling at an FPS that is not laggy for me. From memory, these cards and his current rig config etc. are supposed to last him till 2015 or something, with no more upgrades. Well, at least until 2014, so that's like 1.5-2 years, which I think is easily achievable since these cards support DX11.1 for Windows 8 anyway, so I don't have to upgrade for a new DX. I'd probably upgrade to the next die-shrink cards of at least 15nm or below (else it's pointless, really), and then maybe Haswell or the next enthusiast platform after LGA2011 in 2-3 years, since a CPU can last much longer than GPUs, since CPUs hardly affect FPS; heck, I could've stayed with my 4GHz i7 920 and still enjoyed games.
Stonerboy779 Posted March 9, 2012 Only until something more worthwhile, like 15nm or below. Single 1920x1080 TV that is 100Hz, and I need dual 7970s in order to play super-intense games like Metro 2033 and The Witcher 2 with ubersampling at an FPS that is not laggy for me. Well, at least until 2014, so that's like 1.5-2 years, which I think is easily achievable since these cards support DX11.1 for Windows 8 anyway, so I don't have to upgrade for a new DX. I'd probably upgrade to the next die-shrink cards of at least 15nm or below (else it's pointless, really), and then maybe Haswell or the next enthusiast platform after LGA2011 in 2-3 years, since a CPU can last much longer than GPUs, since CPUs hardly affect FPS; heck, I could've stayed with my 4GHz i7 920 and still enjoyed games. Nyt, why did you bother mentioning 100Hz? That's the TV's internal, interpolated refresh rate, not the rate the GPU actually renders at, so it doesn't put any more strain on the GPU. If you could've stayed with the 920, why didn't you?
d6bmg Posted March 10, 2012 Single 1920x1080 TV that is 100Hz, and I need dual 7970s in order to play super-intense games like Metro 2033 and The Witcher 2 with ubersampling at an FPS that is not laggy for me. The bolded part is the most important part of your whole post.
Nyt Posted March 10, 2012 If you could've stayed with the 920, why didn't you? E-peen, of course.
Bosco Posted March 11, 2012 One problem with that demo, though, is it isn't apples-to-apples. The link states they used FXAA instead of MSAA to increase performance. I'm not sure if the difference between the AA methods could lead to such an impressive comparison, but it is still inappropriate to say one 680 matches three 580s, because there is currently no strong evidence supporting that. My point is, the information in that link doesn't necessarily say the 680 is that superior to the 580. Now, if there really are going to be three times as many shaders, I want to see some PPD! IMO, FXAA looks far better than MSAA and uses far fewer resources, which sucks because AMD can't do that. So you have PhysX, and now a better FXAA that NVIDIA can do, and do well I might add; there is some other stuff they have coming that is going to boost their cards' performance, which is going to push AMD further and further down the line. I can confirm OCC has two Kepler cards and I have one of them. All I can say is what you are seeing from the demo is not far off the mark.
Waco Posted March 11, 2012 IMO, FXAA looks far better than MSAA and uses far fewer resources, which sucks because AMD can't do that. FXAA actually works with all cards; it's just a post-processing filter. Skyrim, for example, allows FXAA on both AMD and NVIDIA cards.
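That "post-processing filter" point is the whole reason FXAA is vendor-agnostic: unlike MSAA, it never touches sub-pixel coverage samples, it just reads the finished frame and smooths high-contrast edges. A rough sketch of that idea in Python/NumPy (heavily simplified: real FXAA does directional edge searches in a shader, this toy version only blends pixels whose local luma contrast exceeds a threshold; `fxaa_like` and its `threshold` parameter are illustrative names, not anything from the actual FXAA source):

```python
import numpy as np

def luma(rgb):
    # Perceptual luminance: the single channel FXAA-style filters
    # use for edge detection.
    return rgb @ np.array([0.299, 0.587, 0.114])

def fxaa_like(img, threshold=0.1):
    """Toy single-pass edge smoother on an (H, W, 3) float image in [0, 1].

    Where the luma contrast across a pixel's 4-neighborhood exceeds
    `threshold`, blend the pixel 50/50 with its neighborhood average.
    """
    l = luma(img)
    # Pad with edge values so border pixels have four neighbors too.
    lp = np.pad(l, 1, mode="edge")
    up, down = lp[:-2, 1:-1], lp[2:, 1:-1]
    left, right = lp[1:-1, :-2], lp[1:-1, 2:]
    contrast = (np.maximum.reduce([l, up, down, left, right])
                - np.minimum.reduce([l, up, down, left, right]))
    edge = contrast > threshold

    ip = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    avg = (ip[:-2, 1:-1] + ip[2:, 1:-1]
           + ip[1:-1, :-2] + ip[1:-1, 2:]) / 4.0

    out = img.copy()
    out[edge] = 0.5 * img[edge] + 0.5 * avg[edge]
    return out
```

Because it only needs the rendered frame as input, a pass like this runs identically on any GPU (or, as here, on the CPU), which is why games could expose FXAA on both AMD and NVIDIA hardware.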