
How many FPS is smooth to you?


How many FPS for a game to be considered Smooth?

  • 25-35

    Votes: 129 10.2%
  • 35-45

    Votes: 378 29.9%
  • 55-65

    Votes: 403 31.9%
  • 65-85

    Votes: 222 17.6%
  • 90+

    Votes: 132 10.4%

  • Total voters
    1,264
65ish is smooth... but I like it really high. There is an article somewhere on why more is better (ie 100+), but it's late.

it feels better with more.
 
But ... it also depends on what you are used to.
For example, my lil brother is barely able to play UT2K4, but he still manages to get a fair amount of frags with a framerate between 10 and 15, go figure. Choppy as hell for me when I look at his screen, but he wonders why I don't find that playable. :)

For me ... well, as long as the FPS is above 25 (SP) or 35 (MP) and (more importantly) stable! I'd rather tweak my system for a low, stable framerate than have big swings.
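The point about stability over raw framerate comes down to frame *times*, not FPS averages. A quick sketch (the numbers below are made up for illustration, not from anyone's benchmark) shows how a run that averages more FPS can still hitch worse than a locked 30:

```python
# Illustration with invented numbers: why a stable low framerate can feel
# smoother than a higher but uneven one. You perceive individual frame
# times; occasional long frames stand out even when the average FPS is higher.

def frame_stats(frame_times_ms):
    """Return (average FPS, worst single frame time in ms)."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    return 1000.0 / avg_ms, max(frame_times_ms)

stable_30 = [33.3] * 10                 # locked 30 FPS, every frame identical
spiky_run = [15.0] * 8 + [60.0, 58.0]   # averages ~42 FPS, but with visible hitches

print(frame_stats(stable_30))   # every frame takes 33.3 ms
print(frame_stats(spiky_run))   # higher average FPS, yet 60 ms worst-case frames
```

The second run "wins" on the FPS counter but delivers individual frames four times longer than its best ones, which is exactly the big-difference choppiness being described.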
 
As mentioned above, anything above 25 FPS isn't noticeable
in any way to the human eye. Just do a google on the fact ...

I play with Vsync on at all times if possible, so I don't see more than
85 FPS in games, multiplayer or single player, but just to make things
interesting I do toggle Vsync every now and then to see those
UT99 FPS shoot up to near-impossible, inconceivably high numbers :D ...

As always though with game FPS - the higher the better, in my opinion ...
 
My computer can handle pretty much any game I throw at it right now. It's latency that kills in FPS games. As long as I stay above 25-30 fps I'm happy. Usually I'm a lot higher than that, but I don't really notice the dips as long as it stays around 30. It's when my ping spikes from 10-30 to 100+ that I see a difference.
 
M4D said:
As mentioned above, anything above 25 FPS isn't noticeable
in any way to the human eye. Just do a google on the fact ...

In fact, you should do some googling of your own and you'd find a significant number of truly scientific studies that say otherwise.

Back to the subject at hand, I voted for 35-45. Uber-fluid FPS are always nice, but I never had the money (until recently) to keep up with the newest available hardware. A P2-450, 512MB of PC100 RAM and a GF3 Ti200 went a VERY long way for me, even to the point of UT2K3... I don't need an uber-ton of framerate to be happy with a game.

Which is a good thing, because even on my rig now, I can't play FarCry with those kinds of framerates anyway :p
 
35-45 for me is what I'd consider the "smoothness threshold".

My own UT2K3/UT2K4 FPS scale:
25FPS - Game is slightly choppy while running directly forward
30FPS - Game is slightly choppy while doing normal turns (to change direction)
35FPS - Game is slightly choppy while doing quick turns (to face your attacker)
40FPS - Game is slightly choppy while doing REALLY quick turns (such as busy sniping, and somebody attacks you from behind [I usually forget to zoom out, causing MASSIVE pixel deltas in each frame])
45FPS - Game is smooth for nearly anything you can do.


The more the merrier, as I like to run at 50FPS (even though I'm perfectly happy at 35) while playing. The more frames the computer can squeeze in to make abrupt changes in motion smoother, the better :D
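JigPu's "pixel deltas" point is simple arithmetic: at a fixed turn speed, the angular change per frame grows as the framerate drops, so each frame differs more from the last. A sketch with hypothetical turn speeds (the 360 deg/s figure below is invented for illustration):

```python
# Hypothetical numbers, just to show why quick turns expose low FPS:
# the camera rotates a fixed amount per second, so fewer frames per
# second means a bigger jump in view angle between consecutive frames.

def degrees_per_frame(turn_speed_deg_per_s, fps):
    """Angular change between two consecutive frames at a steady turn rate."""
    return turn_speed_deg_per_s / fps

# e.g. a 180-degree snap turn completed in half a second = 360 deg/s
for fps in (25, 35, 45, 90):
    print(fps, degrees_per_frame(360, fps))
```

At 25 FPS that quick turn jumps 14.4 degrees between frames; at 90 FPS, only 4. That gap between consecutive frames is what reads as choppiness during fast turns even when forward movement looks fine.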

JigPu
 
I have to be over 50 or else it just seems too choppy. I believe I have very good eyesight, because I could easily tell the difference between 75 and 80-85 FPS when I increased my resolution and Vsync lowered my max FPS in CS to 75. I usually run 1024x768, so I get a constant 85 FPS. Even in UT2K4 I still get 50-70 FPS with mostly everything set to highest. Guess this 9800SE is not that bad after all. ;)
 
For me, a decent number of FPS is at least 75. That's to match your monitor. 85 is even better. Getting more FPS in a game than your monitor can display at that resolution causes tearing. And don't give me that "your eyes can't see more than 30 fps" bull****. I can clearly tell the difference between 40 and 100 FPS.
Anything below 50 FPS and things just get a little choppy.
 
I usually will change a vid card if my FPS drops below 12-14 in whatever game I happen to be playing at any given point in time, unless I come across a deal I can't pass up. My last upgrade was free (GF4 Ti-4600). When this thing can't handle more than 12-14 FPS, I'll buy something new. Everyone is different. Can I see the difference between 14 FPS and 100 FPS? Of course! Some people can't play with low framerates, and some people can. I'm one that can. Low framerates don't affect how well I play; they just make me have to predict rather than react in games. Having good eyesight has nothing to do with it. It's what your brain can compensate for that makes the difference.
 
Valk said:
Since the human eye can't see more than 25 images in one second, I'm comfortable with my fps at 25-30. It's really kind of ridiculous to have more, since you cannot see the minute detail changes at the higher frame rate. If you turn the camera swiftly, your eye naturally blurs the image, since it cannot track the change of image locations quickly enough to redraw it crisp and clear.

I agree on the fact that we can't see more than 25 different images per second, but if you try and play Quake 3, for example, at 25fps, it's VERY jerky, which makes me wonder what the differences are between TVs and computers when displaying motion.
 
TVs run at 60 interlaced fields per second, which weave into a full 30 frames per second and produce a bit of motion blur. And in most cases, the actual source media for your TV (DVDs, VHS tapes, actual over-the-air production video) has motion blur in it too (ever pause a high-speed action sequence on a DVD? Notice how it's blurry?)

Computers update the screen one entire frame at a time, which gives no sense of blur. Combined with full-frame rasterizers that don't blur either, you technically get 60 completely different frames per second. They aren't blurred together; they are all discrete in their motion. Thus, you can perceive the difference more easily on a monitor than on a TV.
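The field/frame arithmetic above (NTSC-style numbers, as described in the post) can be written out in a couple of lines:

```python
# Sketch of the interlaced-vs-progressive arithmetic described above.
# NTSC-style interlaced TV delivers 60 half-resolution fields per second
# (odd scanlines, then even); two fields weave into one full frame.

FIELDS_PER_SECOND = 60
full_tv_frames = FIELDS_PER_SECOND // 2   # 30 complete (and motion-blurred) frames

# A monitor, or an HDTV in 480p/720p, redraws whole discrete frames instead:
PROGRESSIVE_FPS = 60

print(full_tv_frames)     # full interlaced-TV frames per second
print(PROGRESSIVE_FPS)    # discrete progressive frames, no blur between them
```

Same 60 updates per second either way; the difference is whether each update is half a blurred frame or an entire sharp one, which is why 25-30 fps looks fine on TV and jerky in Quake 3.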

What's really interesting is watching a normal TV compared to an HDTV running in a non-interlaced video mode (480p or 720p). The demonstration I saw was Pirates of the Caribbean on DVD, both copies playing at the same time index on two identical DVD players plugged into two identical TVs. One was running in "standard" mode, one was in HD 480p mode...

The HD mode was incredibly smooth, it could almost make you motion sick how smoothly the skeletons were crawling over the ship deck in the final fight scene. That's because HDTV in non-interlaced mode is actually running at a true 60fps, and you can tell the difference quite noticeably.
 
MetalStorm said:


I agree on the fallacy that we can't see more than 25 different images per second, but if you try and play Quake 3, for example, at 25fps, it's VERY jerky, which makes me wonder what the differences are between TVs and computers when displaying motion.
 