
nVidia Geforce 337.50 Beta Drivers Released; Ran some tests on my own



#25 Waco

Waco

    Lab Rat 2

  • Members
  • 16508 posts
  • Gender:Male
  • Location:Los Alamos, NM

Posted 09 April 2014 - 04:26 AM


Weeney, you just described a bottleneck xD It doesn't matter how many cores the game uses or whatever, it's still a bottleneck.

If the gpu isn't being maxed out and you don't already have 1000fps or a frame limiter in place it's pretty sure to be a cpu bottleneck ;)


You were saying the CPU was the bottleneck, while the game is to blame.
It is the other way around :P
So a game that needs a faster CPU to run faster...is bottlenecked by itself?

I don't get your line of thought. :lol:

Tolerance is a sign of weakness.


#26 Crow47

Crow47

    Certified Geek

  • Members
  • 1324 posts
  • Gender:Male

Posted 09 April 2014 - 04:31 AM

 

 


That's what I'm saying! :lol:

 

I installed Tomb Raider last night, and I didn't have the chance to do a before/after comparison, but my initial impressions were that there had been a slight improvement. As soon as I have a chance I'll try to quantify my impressions with some results. 


CrowcompX

Gigabyte X99 UD3P - i7-5820k 4.3GHz - 16GB DDR4 2400 HyperX - Asus Strix 1070  - 500GB m.2 Samsung 850 Evo -

750 Watt eVGA Supernova G2 - Corsair H80i - In Win 303 Black - Win 10 Pro x64 - LG 34" Ultrawide w/Freesync


 


#27 DanTheGamer11

DanTheGamer11

    Proud noob

  • Members
  • 4403 posts
  • Gender:Male

Posted 09 April 2014 - 04:51 AM

 


wat :lol:

 

I don't think you understood. I don't remember saying the game is to blame :P Games that are CPU-bound have lots of AI, physics, and other calculations to be done, which isn't really the game's fault, and if the CPU can't process it fast enough it holds back the GPU, which is waiting on the CPU to send it the processed info so it can render the scene. And apparently the API is using up CPU time as well, which slows it further. There are some not-so-well-coded games too, but they're not that bad since most are PC-only (RTS etc.)
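To put some rough numbers on that idea (purely illustrative, nothing measured with this driver): the GPU can only render a frame once the CPU has finished the game logic and issued the draw calls, so whichever stage is slower sets the frame rate. A minimal sketch in Python, with made-up timings:

```python
# Illustrative only: why a CPU-bound game caps FPS even when the GPU has headroom.
# Assumes the usual pipelined setup (the CPU prepares frame N+1 while the GPU
# renders frame N), so throughput is limited by the slower of the two stages.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the CPU needs cpu_ms and the GPU needs gpu_ms per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# The GPU could do ~125 FPS (8 ms/frame), but 20 ms of simulation + API/driver
# work per frame caps the game at 50 FPS and leaves the GPU partly idle.
print(fps(cpu_ms=20.0, gpu_ms=8.0))  # 50.0   (CPU-bound)
print(fps(cpu_ms=12.0, gpu_ms=8.0))  # ~83.3  (less CPU/driver overhead)
print(fps(cpu_ms=6.0,  gpu_ms=8.0))  # 125.0  (now GPU-bound)
```

If the new driver really does trim the API/driver overhead as advertised, that only shrinks the CPU side of the equation, which is why it would help most in CPU-bound games and do little when the GPU is already maxed out.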


Edited by DanTheGamer11, 09 April 2014 - 04:53 AM.

Processor AMD Phenom II X6 1055T @3.2Ghz
Cooler Cooler Master Hyper 212 Evo
Memory 8GB 1.48V 1600Mhz DDR3 Crucial Ballistix Tactical
Motherboard Asrock N68C-GS FX
Graphics XFX HD 7950
Storage 1TB WD Caviar Blue HDD, 240GB Kingston Hyper X SSD, Thanks OCC & Kingston, makes a big difference  :)
Optical Disk Drive Sony Optiarc DVD/CD ReWriter
Power Supply Cooler Master GX 650W Bronze
Case Cooler Master CM690, Akasa 4 port USB front panel (MOAR USBs  :woot:)


#28 WarWeeny

WarWeeny

    Charizard on the streets, snorlax in the sheets

  • Members
  • 1673 posts
  • Gender:Male
  • Location:leeuwarden / netherlands

Posted 09 April 2014 - 05:04 AM

Pfff, CLEARLY ( :rolleyes: ), I am way too smart for you guys to understand such logic!  :whistling:


You can get the weeny out of the war, but you cannot get the war out of the weeny

Rest in peace my good old gtx 480, you deserved it

 

Thinks he has a weeny worthy of war.

 


#29 Crow47

Crow47

    Certified Geek

  • Members
  • 1324 posts
  • Gender:Male

Posted 09 April 2014 - 05:26 AM

 

 


Once again, I'm pretty sure you just described a CPU bottleneck :P


CrowcompX

Gigabyte X99 UD3P - i7-5820k 4.3GHz - 16GB DDR4 2400 HyperX - Asus Strix 1070  - 500GB m.2 Samsung 850 Evo -

750 Watt eVGA Supernova G2 - Corsair H80i - In Win 303 Black - Win 10 Pro x64 - LG 34" Ultrawide w/Freesync


 


#30 DanTheGamer11

DanTheGamer11

    Proud noob

  • Members
  • 4403 posts
  • Gender:Male

Posted 09 April 2014 - 05:27 AM

 

 

 


I know I did :P What's your point? I was just explaining it in more detail.


Edited by DanTheGamer11, 09 April 2014 - 05:28 AM.

Processor AMD Phenom II X6 1055T @3.2Ghz
Cooler Cooler Master Hyper 212 Evo
Memory 8GB 1.48V 1600Mhz DDR3 Crucial Ballistix Tactical
Motherboard Asrock N68C-GS FX
Graphics XFX HD 7950
Storage 1TB WD Caviar Blue HDD, 240GB Kingston Hyper X SSD, Thanks OCC & Kingston, makes a big difference  :)
Optical Disk Drive Sony Optiarc DVD/CD ReWriter
Power Supply Cooler Master GX 650W Bronze
Case Cooler Master CM690, Akasa 4 port USB front panel (MOAR USBs  :woot:)


#31 DanTheGamer11

DanTheGamer11

    Proud noob

  • Members
  • 4403 posts
  • Gender:Male

Posted 09 April 2014 - 06:27 AM

Planetside 2 would be a good test :)


Processor AMD Phenom II X6 1055T @3.2Ghz
Cooler Cooler Master Hyper 212 Evo
Memory 8GB 1.48V 1600Mhz DDR3 Crucial Ballistix Tactical
Motherboard Asrock N68C-GS FX
Graphics XFX HD 7950
Storage 1TB WD Caviar Blue HDD, 240GB Kingston Hyper X SSD, Thanks OCC & Kingston, makes a big difference  :)
Optical Disk Drive Sony Optiarc DVD/CD ReWriter
Power Supply Cooler Master GX 650W Bronze
Case Cooler Master CM690, Akasa 4 port USB front panel (MOAR USBs  :woot:)


#32 ClayMeow

ClayMeow

    Member Title Exceeds Member Title Character Limit

  • Review Editor
  • 19463 posts
  • Gender:Male
  • Location:New York, NY

Posted 09 April 2014 - 06:34 AM

Seeing that there is a huge lack of benchmarks for cards a few years old, I decided to follow your lead!  Here are my results for my GTX 560 Ti SLI setup, with a little cut and paste and my results added in:

 

All at stock GPU clocks (880MHz); the CPU is at 4.6GHz.

Batman: Arkham Origins: 47 (All settings set to max at my monitor's native resolution of 2560x1440, Vsync off)

Metro: Last Light: 44.67 FPS (1920x1080, DX11, Physx On, SSAA Off - AAA - DOF On)

After installing the new Geforce Experience and driver (337.50), followed by a reboot, I reran each benchmark.

Batman: Arkham Origins: 47 (All settings set to max at my monitor's native resolution of 2560x1440, Vsync off)

Metro: Last Light: 44.67 (1920x1080, DX11, Physx On, SSAA Off - AAA - DOF On)

 

Absolutely no change for me in the final average results...however, in the min/max frame rates I did see a slight bump in the max frames recorded.  Metro went from (min/max) 5/218 to 5/225 and Batman Origins went from 22/65 to 22/69.  

 

Perhaps there would be a better increase in newer games?  It seems that both of these titles hardly needed a driver update, so minimal gains were perhaps to be expected...

I haven't tried Last Light (one of the many games in my Steam backlog), but for Arkham Origins, are you running it with all the DX11 features? I say this because they're very demanding and should probably all be on only if you're running a high-end card. Turning off even a couple would probably put you over the 60 mark. Did you check to see what GFE claims is the optimal setting?

 

My initial testing on my GTX 770 (non-Beta driver) was 60FPS (avg) with everything max, but with all DX11 turned off, it jumped up to 76FPS. Turning just the "snow" on, it dropped to 69FPS.

 

Obviously this won't impact the gains (or lack thereof) of the new driver, but just thought I'd mention it.


- Steam - Raptr - Xfire - GameSpot - Kongregate - IGN Blog -
- ASUS P8Z77-V Deluxe - i7 3770k - NVIDIA GTX 980 Ti - 16GB DDR3-1866MHz - Acer Predator X34 -
::: Follow OCC on Facebook ::: OCC Xfire Clan ::: OCC Steam Community Group ::: OCC Rules :::

::: OCC E3 2013 Awards ::: CES 2013 ::: COMPUTEX 2010 :::

::: NVIDIA SHIELD Android TV Review ::: Acer Predator X34 Review :::


#33 Crow47

Crow47

    Certified Geek

  • Members
  • 1324 posts
  • Gender:Male

Posted 09 April 2014 - 06:53 AM

 


What resolution are you running at, just out of curiosity? FPS can be somewhat subjective; at least for me, I'm usually OK with anything in the 40-50 fps range if it means I can set everything on max. Overall fidelity vs. smoothness is a tradeoff I can accept. Can't speak for you or others, however.

 

I will say, while the benchmark gave me an average of 51 fps, during actual gameplay I'd say I'm above 60 fps a good 60-70% of the time. Naturally, the benchmark is designed to be stressful to the system and isn't always representative of what the majority of gameplay will be like.


CrowcompX

Gigabyte X99 UD3P - i7-5820k 4.3GHz - 16GB DDR4 2400 HyperX - Asus Strix 1070  - 500GB m.2 Samsung 850 Evo -

750 Watt eVGA Supernova G2 - Corsair H80i - In Win 303 Black - Win 10 Pro x64 - LG 34" Ultrawide w/Freesync


 


#34 ClayMeow

ClayMeow

    Member Title Exceeds Member Title Character Limit

  • Review Editor
  • 19463 posts
  • Gender:Male
  • Location:New York, NY

Posted 09 April 2014 - 07:44 AM

 

 


I normally run at 1680x1050, but did the bench at 1920x1080 since that's the norm.

 

My point stands regardless of resolution though - DX11 is resource intensive no matter what you're running. A 16FPS difference is rather large...but again, that was before the beta driver, where DX11 performance was supposedly boosted. But yes, I too noticed that actual gameplay seemed to perform better than the benchmark, which I find in most games anyway.


- Steam - Raptr - Xfire - GameSpot - Kongregate - IGN Blog -
- ASUS P8Z77-V Deluxe - i7 3770k - NVIDIA GTX 980 Ti - 16GB DDR3-1866MHz - Acer Predator X34 -
::: Follow OCC on Facebook ::: OCC Xfire Clan ::: OCC Steam Community Group ::: OCC Rules :::

::: OCC E3 2013 Awards ::: CES 2013 ::: COMPUTEX 2010 :::

::: NVIDIA SHIELD Android TV Review ::: Acer Predator X34 Review :::


#35 Crow47

Crow47

    Certified Geek

  • Members
  • 1324 posts
  • Gender:Male

Posted 09 April 2014 - 07:46 AM

 

 

 


Ah, OK, I was starting to wonder why my SLI setup with ~2600 CUDA cores was performing slightly worse than your 1536 (yeah, I know about SLI scaling and optimization, etc.).

 

You're absolutely right, DX11 is certainly resource intensive. 16 FPS is a big difference, but it's less of a difference between 60 and 44 than it is from 44 to 28. It's all relative and subjective. Just depends on what you like :)


CrowcompX

Gigabyte X99 UD3P - i7-5820k 4.3GHz - 16GB DDR4 2400 HyperX - Asus Strix 1070  - 500GB m.2 Samsung 850 Evo -

750 Watt eVGA Supernova G2 - Corsair H80i - In Win 303 Black - Win 10 Pro x64 - LG 34" Ultrawide w/Freesync


 


#36 ClayMeow

ClayMeow

    Member Title Exceeds Member Title Character Limit

  • Review Editor
  • 19463 posts
  • Gender:Male
  • Location:New York, NY

Posted 09 April 2014 - 07:50 AM

 

 

 

 


Well actually I'm talking about my 16FPS difference from 60 to 76...which is even less important :lol:
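In frame-time terms both points check out (back-of-the-envelope numbers only, nothing measured here): the same 16 FPS gap costs more milliseconds per frame the lower the starting frame rate is.

```python
# Rough frame-time arithmetic (illustrative): a fixed FPS gap matters less
# at high frame rates because each frame is already so short.
def frame_time_ms(fps):
    return 1000.0 / fps

for hi, lo in [(76, 60), (60, 44), (44, 28)]:
    delta = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{hi} -> {lo} FPS adds {delta:.1f} ms per frame")

# 76 -> 60 FPS adds 3.5 ms per frame
# 60 -> 44 FPS adds 6.1 ms per frame
# 44 -> 28 FPS adds 13.0 ms per frame
```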


- Steam - Raptr - Xfire - GameSpot - Kongregate - IGN Blog -
- ASUS P8Z77-V Deluxe - i7 3770k - NVIDIA GTX 980 Ti - 16GB DDR3-1866MHz - Acer Predator X34 -
::: Follow OCC on Facebook ::: OCC Xfire Clan ::: OCC Steam Community Group ::: OCC Rules :::

::: OCC E3 2013 Awards ::: CES 2013 ::: COMPUTEX 2010 :::

::: NVIDIA SHIELD Android TV Review ::: Acer Predator X34 Review :::