
General FPS question

Actually, just to clear some stuff up, movies aren't shown at 24 fps. Originally, when the projector was invented, movies were filmed around 60 fps and played back at 60 fps. However, it was quickly discovered that making movies of any length would be incredibly expensive if everything was shot at 60 fps. Frame rates went down until it was eventually settled that a film could be shot at 24 fps without dropping action. However, when played back at 24 fps from a projector, there is still a flicker to films. To correct this, each frame of a motion picture is doubled and then played back by the projectionist at 48 frames per second to create the smooth action you see in the theatre. With games, though, anything over 30 fps seems to be fine... But just a little history I thought I'd throw into the mix... :)
If you mean "wasted" as in not shown, then yes. The frames are going to the screen faster than it can draw them, so it only draws some (I think...). However, when lots of stuff comes onto the screen, as in a heavy battle, it is the computer that can't keep up, and so its fps will drop. The frame rate falls and fewer frames are sent every second. If your computer normally ran at 50 fps when nothing was going on, and then dropped down to 30, you would notice the slower screen updates. However, at 90 fps dropping down to 70, you really won't notice anything because it is WAY too fast.

To sum up, yes, the extra frames are not shown on screen. However, the high frame rate is used to offset the lower frame rates of intense battle scenes and such. This just makes the game look better because it isn't jerking around....

Hope this settles it...
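The frame-dropping described above can be sketched as a toy simulation (this is just an illustration of the sampling idea, not how any real graphics driver works): a 60 Hz monitor shows whichever frame finished most recently, so some of a 90 fps stream never reaches the screen.

```python
# Toy sketch: which rendered frames actually get displayed when the
# game renders faster than the monitor refreshes (no vsync)?

def displayed_frames(render_fps, refresh_hz, seconds=1):
    """Return indices of rendered frames that actually reach the screen."""
    shown = []
    for r in range(int(refresh_hz * seconds)):
        t = r / refresh_hz            # time of this monitor refresh
        latest = int(t * render_fps)  # last frame completed by that time
        if not shown or shown[-1] != latest:
            shown.append(latest)      # a repeated frame is only counted once
    return shown

shown = displayed_frames(render_fps=90, refresh_hz=60)
print(len(shown), "of 90 rendered frames displayed;", 90 - len(shown), "never shown")
```

With these numbers, a third of the rendered frames are "wasted" in the not-shown sense, which matches the point above: the excess only matters as headroom for when the frame rate dips.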
This is a question that will never die. If you really want to find out, play Counter-Strike with your fps limit set to 24, then set it to 100 and play again, then tell me you don't see a difference.
The fps rate is specific to you. You don't actually see in frame rates; your brain tricks you into thinking you're seeing fluidly. Kind of like racing: at 160 mph a driver's brain oc's, making responses faster and seeing ahead farther. The longer he drives at this speed, the more accustomed to it he becomes. After slowing down, he would tend to be too responsive for 30 mph, for a while.
It's like seeing a stop sign in the distance: you don't actually read it, your brain fills in the letters before you can actually read them, but you're not aware of it. We are not aware of just how much our brain fills in for us most of the time. One of the few times you will notice this is reading a sign from a distance; getting closer, you find it says something else, your brain having filled in the details for you, incorrectly (remind you of a P4?).
Now, with that said, frame rates vary, and so does your mind. If you can tell a difference, frame rates matter, whatever they are. Is more than your monitor can handle a waste? Sure, but only if it remains that high, which it usually doesn't.
Did you know you see with only one side of your brain and talk with the other?
And the human eye can definitely see more than 30fps. I can tell the difference myself, between 60 and 90.
Also, high fps is critical, because many of those same graphics cards that hit 99fps in CS will also be running around 30 in action. And I hate to go below 30. It's a pretty big handicap against those getting 70fps in that same fight.
BlakeN (Jul 09, 2001 12:33 p.m.):
I don't have any scientific proof, but if I have vsync on I only get about 40-50 fps, and if it's off I get around 120 (this is in Counter-Strike). Whenever I reinstall drivers I always forget to turn vsync back off, and I can always tell.

As far as maximum refresh rates for monitors go, you'll just have to look at the specs. My ViewSonic will go up to about 180 MHz, but I don't know at what resolution.

Half Life has a max framerate of 100fps :)

You have to set this in the console, the default fps cap is 72.

fps_max is the command

So fps_max 100 would set the cap to... you guessed it! 100!
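If you don't want to retype the command every session, a line like the following in an autoexec.cfg (in the Half-Life valve folder on a default install; check your own setup) should apply it at every launch:

```
// autoexec.cfg - runs each time Half-Life starts
fps_max 100
```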
Valid (Jul 10, 2001 12:40 a.m.):
I feel kind of proud of myself for getting a decent string going. I also feel kind of weird because besides the initial post, I didn't say anything. So basically, I just want to ask this question to cement things in my mind. Frames-per-second pushed out by the graphics card that are in excess of the monitor's refresh rating are wasted, are they not?

Do you really need to ask us this?
Depends. Marketing warps scientific words a lot nowadays. I basically wanted to make sure hertz in ol' monitor land still meant "per-seconds." Physics/Comp Sci major seeing a lot of word "raping" going on lately.
While you guys are sitting here jibbering and jabbering about FPS, and movies, and games, and eyesight, and refresh rates, and resolutions, I am sitting here wondering what I can throw in here to further confuse everyone.

And here it is.

dot pitch!

Take that!