
NVIDIA GeForce 337.50 Beta Drivers Released; Ran some tests on my own


Crow47


 

 

Weeney, you just described a bottleneck xD It doesn't matter how many cores the game uses or whatever; it's still a bottleneck.

 

If the GPU isn't being maxed out, and you don't already have 1000 FPS or a frame limiter in place, it's almost certainly a CPU bottleneck ;)

You were saying the CPU was the bottleneck, while the game is to blame.

It is the other way around :P

So a game that needs a faster CPU to run faster...is bottlenecked by itself?

 

I don't get your line of thought. :lol:
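As an aside, the rule of thumb quoted above (GPU not maxed out and no frame cap in place, so suspect the CPU) can be sketched roughly like this in Python. The 95% utilization threshold and the sample readings are made-up illustrations, not taken from any particular monitoring tool:

```python
# Rough sketch of the "is it a CPU bottleneck?" rule of thumb. The threshold
# and the sample readings are illustrative assumptions only.
def likely_bottleneck(gpu_util_pct, fps, fps_cap=None):
    """Return a best guess at what is limiting the frame rate."""
    if fps_cap is not None and fps >= fps_cap:
        return "frame limiter / vsync (capped on purpose)"
    if gpu_util_pct >= 95:
        return "GPU (it is already maxed out)"
    return "CPU, or the per-frame work the game asks of it"

print(likely_bottleneck(gpu_util_pct=70, fps=48))              # CPU-side limit
print(likely_bottleneck(gpu_util_pct=99, fps=44))              # GPU limit
print(likely_bottleneck(gpu_util_pct=60, fps=60, fps_cap=60))  # frame cap
```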



 

 


 

 

That's what I'm saying! :lol:

 

I installed Tomb Raider last night. I haven't had the chance to do a proper before/after comparison yet, but my initial impression was that there had been a slight improvement. As soon as I get a chance I'll try to quantify that impression with some results.


 


 

wat :lol:

 

I don't think you understood. I don't remember saying the game is to blame :P CPU-bound games have a lot of AI, physics, and other calculations to get through, which isn't really the game's fault. If the CPU can't process all of that fast enough, it holds back the GPU, which sits waiting for the CPU to send over the processed data so it can render the scene. And apparently the API is eating up CPU time as well, which slows things down further. There are some not-so-well-coded games out there, but they're not that bad, since most of them are PC-only (RTS titles, etc.).
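To picture the "GPU waiting on the CPU" situation described above, here is a deliberately oversimplified, single-threaded frame-loop sketch in Python. Real engines overlap CPU and GPU work across frames, and the millisecond costs below are invented purely for illustration:

```python
# Oversimplified frame loop: the GPU can only render what the CPU has already
# prepared, so heavy CPU-side work (AI, physics, API/driver overhead) caps the
# frame rate even when the GPU itself is fast. All timings are invented.
import time

CPU_WORK_MS = 20.0   # assumed AI + physics + API overhead per frame
GPU_WORK_MS = 8.0    # assumed GPU render time per frame

def run_frames(n=5):
    for frame in range(n):
        start = time.perf_counter()
        time.sleep(CPU_WORK_MS / 1000)   # CPU prepares the frame
        time.sleep(GPU_WORK_MS / 1000)   # GPU renders what it was handed
        frame_ms = (time.perf_counter() - start) * 1000
        print(f"frame {frame}: {frame_ms:.1f} ms  (~{1000 / frame_ms:.0f} FPS)")

run_frames()
# The GPU only needs ~8 ms, yet each frame takes ~28 ms (roughly 35 FPS),
# because the GPU sits idle while the CPU grinds through its 20 ms of work.
```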

Edited by DanTheGamer11


 

 


 

 

Once again, I'm pretty sure you just described a CPU bottleneck :P


 

 

 


 

I know I did :P What's your point? I was just explaining it in more detail.

Edited by DanTheGamer11


Seeing that there is a huge lack of benchmarks for cards a few years old, I decided to follow your lead! Here are my results for my GTX 560 Ti SLI setup, in the same cut-and-paste format with my results added in:

 

All at stock GPU clocks (880 MHz); the CPU is at 4.6 GHz.

Batman: Arkham Origins: 47 FPS (all settings maxed at my monitor's native resolution of 2560x1440, V-sync off)

 

Metro: Last Light: 44.67 FPS (1920x1080, DX11, PhysX on, SSAA off, AAA, DOF on)

 

After installing the new GeForce Experience and driver (337.50), followed by a reboot, I reran each benchmark.

 

Batman: Arkham Origins: 47 FPS (all settings maxed at my monitor's native resolution of 2560x1440, V-sync off)

 

Metro: Last Light: 44.67 FPS (1920x1080, DX11, PhysX on, SSAA off, AAA, DOF on)

 

Absolutely no change for me in the final average results. However, in the min/max frame rates I did see a slight bump in the maximum frames recorded: Metro went from (min/max) 5/218 to 5/225, and Batman: Arkham Origins went from 22/65 to 22/69.
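For what it's worth, here is a quick Python snippet that puts those before/after figures into relative terms (the numbers are the ones quoted in this post):

```python
# Before/after comparison of the quoted 337.50 results, in relative terms.
results = {
    "Metro: Last Light (avg FPS)":      (44.67, 44.67),
    "Metro: Last Light (max FPS)":      (218, 225),
    "Batman: Arkham Origins (avg FPS)": (47, 47),
    "Batman: Arkham Origins (max FPS)": (65, 69),
}

for name, (before, after) in results.items():
    change = (after - before) / before * 100
    print(f"{name}: {before} -> {after} ({change:+.1f}%)")

# The averages are flat, while the max frame rates move by roughly 3-6%,
# which matches the "slight bump in the max frames" observation.
```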

 

Perhaps there would be a bigger increase in newer games? It seems that both of these titles hardly needed a driver update, so minimal gains were perhaps to be expected...

I haven't tried Last Light (one of the many games in my Steam backlog), but for Arkham Origins, are you running it with all the DX11 features enabled? I ask because they're very demanding and should probably only all be on if you're running a high-end card. Turning off even a couple would probably put you over the 60 FPS mark. Did you check what GFE claims is the optimal setting?

 

My initial testing on my GTX 770 (non-beta driver) averaged 60 FPS with everything maxed, but with all the DX11 features turned off it jumped up to 76 FPS. Turning just the "snow" feature back on, it dropped to 69 FPS.

 

Obviously this won't impact the gains (or lack thereof) of the new driver, but just thought I'd mention it.


 


 

 

What resolution are you running at, just out of curiosity? FPS can be somewhat subjective; at least for me, I'm usually OK with anything in the 40-50 FPS range if it means I can set everything to max. Overall fidelity vs. smoothness is a tradeoff I can accept. Can't speak for you or others, though.

 

I will say that while the benchmark gave me an average of 51 FPS, during actual gameplay I'd estimate I'm above 60 FPS a good 60-70% of the time. Naturally, the benchmark is designed to stress the system and isn't always representative of what the majority of gameplay will be like.


 

 


 

I normally run at 1680x1050, but did the bench at 1920x1080 since that's the norm.

 

My point stands regardless of resolution, though: DX11 is resource intensive no matter what you're running. A 16 FPS difference is rather large... but again, that was before the beta driver, where DX11 performance was supposedly boosted. But yes, I too noticed that actual gameplay seemed to perform better than the benchmark, which is something I find in most games anyway.


 

 

 


 

 

Ah, OK, I was starting to wonder why my SLI setup with ~2600 CUDA cores was performing slightly worse than your 1536 (yeah, I know about SLI scaling and optimization, etc.).

 

You're absolutely right, DX11 is certainly resource intensive. 16 FPS is a big difference, but going from 60 to 44 is a smaller relative drop than going from 44 to 28. It's all relative and subjective; just depends on what you like :)
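One way to make that "relative" point concrete is to look at frame times instead of FPS. A short Python sketch, using only the FPS figures mentioned here:

```python
# The same 16 FPS drop costs far more milliseconds per frame at the low end
# than at the high end, which is why 44 -> 28 feels much worse than 60 -> 44.
def frame_time_ms(fps):
    return 1000.0 / fps

for hi, lo in [(60, 44), (44, 28)]:
    added = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{hi} -> {lo} FPS: {frame_time_ms(hi):.1f} ms -> {frame_time_ms(lo):.1f} ms "
          f"(+{added:.1f} ms per frame)")

# 60 -> 44 FPS adds about 6 ms per frame; 44 -> 28 FPS adds about 13 ms,
# so the second drop is roughly twice as costly despite the same FPS delta.
```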


 

 

 

 


 

Well, actually, I'm talking about my 16 FPS difference from 60 to 76... which matters even less :lol:

