
Framerate



Heya... I was hoping that some of the video fanatics here could help me out with a little bit of confusion:

 

If your brain can only register a frame rate of ~30 to 45 frames/sec, then what is the logic in spending the big bucks for a high-end video card that spits out 60-80+ fps? If a mid-range card provides all of the features (DX9, pixel shader, etc.) at a consistent 30 fps, then why spend close to $400 for a 9700/5900?

 

Also, besides the 'because you can' reason, is overclocking a GPU worth it? Is getting 185 fps in a benchmark any more noticeable than getting 180 fps?
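For a sense of scale, the 185-vs-180 fps question can be put in per-frame terms. This is just a quick sketch (Python), using the numbers from the question above:

```python
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

# Frame times at the rates mentioned in the question.
for fps in (30, 60, 180, 185):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):.2f} ms per frame")

# Going from 180 to 185 fps shaves roughly 0.15 ms off each frame,
# a difference far smaller than the 30 -> 60 fps jump people argue about.
delta = frame_time_ms(180) - frame_time_ms(185)
print(f"180 -> 185 fps saves {delta:.3f} ms per frame")
```

At 30 fps each frame is on screen for about 33 ms; at 180 fps it's about 5.6 ms, so a 5 fps gain up there barely moves the needle.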

 

Is there something that I'm neglecting here?

 

Thx! ;)


Well, good point I guess. I would just like to correct your pricing there: a 9700 Pro is only about $250. That's still more than I can spend, so I'm stuck on a 32MB GeForce 2 :P, works great for me.


Because the faster the card, the longer it will last. A card that can get 30 fps in current games is fine for now, but in a year it will be running games at 10 fps. Then you have to dump some more cash for a new card. However, if you have a card that can handle current games at 100+ fps, then you'll be fine for several years before you have to upgrade again. Yeah, the mid-range card costs a lot less than a top-of-the-line card, but you'll end up buying several of them as opposed to just one.
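The cost argument above can be roughed out with numbers. This is only a sketch; the prices and lifespans here are assumptions picked for illustration, not figures anyone in the thread quoted:

```python
# Hypothetical scenario: one $400 high-end card that stays playable
# for ~4 years, versus $200 mid-range cards replaced every ~2 years.
high_end_cost = 400
high_end_lifespan_years = 4

mid_range_cost = 200
mid_range_lifespan_years = 2

horizon_years = 4  # how long we want to keep gaming

# Number of mid-range cards needed to cover the same span.
mid_range_cards = horizon_years // mid_range_lifespan_years
mid_range_total = mid_range_cost * mid_range_cards

print(f"High-end over {horizon_years} years:  ${high_end_cost}")
print(f"Mid-range over {horizon_years} years: ${mid_range_total} "
      f"({mid_range_cards} cards)")
```

With these assumed numbers the totals come out even, so the argument really hinges on how long each card actually stays playable, which is exactly what the thread goes on to dispute.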


Umm, your brain can register 60 fps. I know this because I did stop-motion animation videos for fun, and anything less than 60 seemed jerky. Aside from that, the 9700 and 5900 are both no longer $400. The 5950 Ultra can be had for that, and the 9800 XT for a little more if you look.


Because the faster the card, the longer it will last. A card that can get 30 fps in current games is fine for now, but in a year it will be running games at 10 fps. Then you have to dump some more cash for a new card. However, if you have a card that can handle current games at 100+ fps, then you'll be fine for several years before you have to upgrade again. Yeah, the mid-range card costs a lot less than a top-of-the-line card, but you'll end up buying several of them as opposed to just one.

What you said doesn't make sense. Using your same argument, one could deduce the opposite: that you might as well wait until the prices come down before purchasing a card, thus buying a lower-end card or a generation-old card. If one's video card is producing 35 or 60 fps (whichever you want to say) with all current games, then wouldn't it be wiser to wait a couple of months (or longer) to purchase a better card at $100-150, rather than spend $250+ now?


OK, if someone doesn't agree with me on frame rate, then I would like someone to do the following. Make one short stop-motion animation video, at least 15 seconds long, and make it in two versions: a 30 frames per second and a 60 frames per second version. After you do that, post it on here, and it will be clearly visible that the 30 fps version is jerky. Why do you think movies have so many frames per second?


Naw, the Air Force did a study a while back showing that the human eye/brain can comprehend up to 220 fps in some circumstances. The myth about anything over 30fps is just that, a myth.

 

Here's an article which quotes some material from the Air Force study; I can't find the original study.

 

http://amo.net/NT/02-21-01FPS.html

Edited by GTSticky


Anything more than 24 fps is seen as fluid motion by a human. If you go to the theater, only 24 frames pass through the projector lens every second. Check out the fps of your favorite DVD, DivX, etc., and it probably doesn't go over 25-26. As for the stop-motion stuff: if you have too much of a gap between frames, it's going to be jerky no matter how many fps the playback is. Think Gumby.

 

What the brain can "comprehend," whatever that means, doesn't matter... 24 fps is all you need for fluid motion.

 

Now, BACK to the topic of the thread: I want to play a game with all the detail turned up, and as long as it isn't jerky I'm happy.

 

Aside from comparing/testing settings, benchmarks are only for bragging rights, just like 180 fps.


What you said doesn't make sense. Using your same argument, one could deduce the opposite: that you might as well wait until the prices come down before purchasing a card, thus buying a lower-end card or a generation-old card. If one's video card is producing 35 or 60 fps (whichever you want to say) with all current games, then wouldn't it be wiser to wait a couple of months (or longer) to purchase a better card at $100-150, rather than spend $250+ now?

Huh? Your little theory still lines up with my post: if you waited, then you'd be buying a mid-range card instead of a top-level card. And you're right, it would be cheaper, but you would have lost all the time from when the card was released until you bought it (because a video card is only useful for a finite amount of time). And card prices don't drop nearly as fast as you mentioned: a $250 card doesn't drop to $100 over the course of a couple of months.

