
NVIDIA 'The Way It's Meant to Be Played' 2013 Event


ClayMeow


Awesome! My question stumped them :lol: I had seen the November release date mentioned, which is why I was curious. It would be weird if it launched during the bundle period but wasn't included, since the 780 and Titan, the cards above and below it, are. As you said though, we'll know more later.


As far as I know, having a frame rate that matches the monitor's refresh rate perfectly, say 60 fps on a 60 Hz panel, should provide the smoothest experience. If it goes above that, frames are wasted and might never be shown. If your GPU is capable enough, G-Sync should allow for a much smoother experience, because every frame will be displayed, so even minuscule changes are shown, giving a more realistic look.

Not entirely sure that's how it works, but I think it is.
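
A minimal sketch of the "wasted frames" part of that (the simulation and its numbers are just my own illustration, nothing official): on a fixed-refresh monitor, only the most recently completed frame is scanned out at each refresh, so anything finished in between never appears on screen.

```python
# Rough model: a fixed-refresh monitor samples the latest completed frame at
# every refresh, so frames finished in between are never shown ("wasted").

def wasted_frames(gpu_fps: float, refresh_hz: float, seconds: float = 1.0) -> int:
    """Count rendered frames that are never scanned out by the display."""
    frame_done_times = [i / gpu_fps for i in range(int(gpu_fps * seconds))]
    refresh_times = [i / refresh_hz for i in range(int(refresh_hz * seconds))]

    shown = set()
    for t in refresh_times:
        # The latest frame completed at or before this refresh is what gets displayed.
        ready = [i for i, ft in enumerate(frame_done_times) if ft <= t]
        if ready:
            shown.add(ready[-1])
    return len(frame_done_times) - len(shown)

print(wasted_frames(60, 60))    # 0  -> every frame is shown once
print(wasted_frames(90, 60))    # 30 -> a third of the frames never appear
print(wasted_frames(144, 60))   # 84 -> most of the extra work is thrown away
```

The exact counts depend on where the frames land relative to the refresh ticks, but the point stands: anything above the refresh rate is work the monitor can't show.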


Well, the first monitor to support G-Sync is the ASUS VG248QE, but it does require the G-Sync module to be modded in. It's capable of a 144 Hz 2D refresh rate, so you're not going to be hitting the cap much with it. Chances are, then, that all future monitors with the module integrated will have refresh rates higher than 60. But if the framerate were to surpass the maximum refresh rate, my guess would be that the module will just throw out those frames. That or the GPU is still informed of the maximum refresh rate, and told not to go above that; it's just a higher rate than the current standard of 60.
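
Here's a tiny sketch of that second guess (purely hypothetical, since the actual module behaviour isn't confirmed here): if frames finish faster than the panel's maximum refresh rate allows, the earliest the next one can be presented is one maximum-rate interval after the previous scan-out.

```python
# Hypothetical cap behaviour: frames can't be presented more often than the
# panel's maximum refresh rate, so anything faster is held back (or dropped).

MAX_HZ = 144
MIN_PRESENT_INTERVAL = 1.0 / MAX_HZ  # ~6.94 ms between scan-outs

def next_present_time(last_present: float, frame_ready: float) -> float:
    """Earliest time a newly finished frame can be scanned out."""
    return max(frame_ready, last_present + MIN_PRESENT_INTERVAL)

# A GPU pumping out frames every 5 ms (200 fps) still gets presented at 144 Hz.
last = 0.0
for ready in [0.005 * i for i in range(1, 6)]:
    last = next_present_time(last, ready)
    print(f"frame ready at {ready*1000:.1f} ms -> presented at {last*1000:.2f} ms")
```

Either way, the effective cap is the panel's maximum refresh rate.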


That or the GPU is still informed of the maximum refresh rate, and told not to go above that; it's just a higher rate than the current standard of 60.

^This is what it does.

 

All G-Sync monitors will have a max refresh rate, which for now is 144 Hz. There is also a minimum, but it's well below playable at that point (I think it was like 15 or something).

 

Basically, in layman's terms: whatever frame rate your Kepler GPU is producing, it communicates with the G-Sync module in the supported monitor and tells it to match that rate 1:1. So the drawing of the frames by the GPU and the scanning (or refreshing) of your monitor are always in sync. As such, you get the benefit of v-sync (no tearing), but with none of the stutter, because the monitor is keeping up with the GPU.

 

Another way of looking at it is:

Current monitors = locked refresh rate

G-sync monitors = variable refresh rate

 

Hope that explains how it works. If you're not familiar with how GPUs and monitors work together, it's hard to grasp. That's why NVIDIA is going to have an uphill battle - it's not something most people can fully appreciate until they see it in person. Once you see it, it becomes clear how beneficial this is.
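
If it helps, here's a simplified model of that 1:1 matching (my own sketch, not NVIDIA's implementation - the frame times are made up for illustration): a fixed 60 Hz panel snaps every frame onto its own refresh grid, while a variable-refresh panel scans out each frame as soon as the GPU finishes it, as long as it stays under the maximum rate.

```python
import math

MAX_HZ = 144                 # max refresh rate mentioned above
MIN_INTERVAL = 1.0 / MAX_HZ  # scan-outs can't be closer together than this

def gsync_present(frame_times):
    """Variable refresh: each frame is scanned out as soon as it's finished,
    no sooner than MIN_INTERVAL after the previous scan-out. (Below the panel's
    minimum refresh rate it would re-show the last frame; that case is skipped.)"""
    out, last = [], float("-inf")
    for t in frame_times:
        last = max(t, last + MIN_INTERVAL)
        out.append(last)
    return out

def vsync_present(frame_times, hz=60):
    """Fixed refresh with v-sync: each frame waits for the next free refresh tick."""
    out, last_tick = [], -1
    for t in frame_times:
        tick = max(math.ceil(t * hz), last_tick + 1)
        out.append(tick / hz)
        last_tick = tick
    return out

# Frames finishing at uneven intervals, like a real game (times in seconds).
frames = [0.016, 0.030, 0.052, 0.063, 0.085]
print([round(t * 1000, 1) for t in gsync_present(frames)])  # [16.0, 30.0, 52.0, 63.0, 85.0]
print([round(t * 1000, 1) for t in vsync_present(frames)])  # [16.7, 33.3, 66.7, 83.3, 100.0]
```

The variable-refresh timestamps track the GPU exactly, while the 60 Hz ones get pushed onto a 16.7 ms grid with uneven gaps between them - that unevenness is the stutter.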


Easy way to imagine it:

 

You have 10 buckets and 4 apples. Put them evenly into the buckets. The problem? You can't space them evenly. Frames coming into the monitor are the same - they need to be evenly spaced or you perceive the variances as stutter. This problem is worse when your framerate (number of apples) varies continuously (as games do). Your eyes pick up on the frame-to-frame variation a lot more than you'd expect.

 

 

G-Sync takes the 4 apples and puts them in 4 buckets. Perfectly even. If you have 6 apples...6 buckets. There's no mismatch or stutter because the frames are delivered at exactly the rate that the GPU is producing them without any interpolation into buckets.

 

 

Another obvious example is 24 FPS movies on a 60 Hz screen. The judder from frame to frame is because the 24 frames don't divide evenly into 60. Now make that more complicated and constantly change the framerate from 20-30 FPS. No modern monitor can display that smoothly. Sure, monitors with higher refresh rates make it less noticeable, but it's still there.
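
The 24-on-60 arithmetic is easy to spell out (this little cadence calculation is just my own illustration): each film frame should be held for 1/24 s, but a 60 Hz panel can only hold a frame for whole multiples of 1/60 s, so frames end up alternating between 2 and 3 refreshes.

```python
REFRESH_HZ = 60
FPS = 24

refreshes_per_frame = []
shown = 0
for frame in range(1, FPS + 1):
    # Refresh tick by which the next film frame is due, rounded down to a whole tick.
    target = frame * REFRESH_HZ // FPS
    refreshes_per_frame.append(target - shown)
    shown = target

print(refreshes_per_frame)       # [2, 3, 2, 3, ...] -> the 3:2 cadence
print(sum(refreshes_per_frame))  # 60 -> one second of refreshes covers the 24 frames
```

Half the frames sit on screen 50% longer than the others, and that alternation is the judder. On a variable-refresh display, every frame would simply be held for 1/24 s.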



Spoken like a true engineer :lol:. Nice apple analogy :)

