
How many FPS is smooth to you?


How many FPS for a game to be considered Smooth?

  • 25-35: 129 votes (10.2%)
  • 35-45: 378 votes (29.9%)
  • 55-65: 403 votes (31.9%)
  • 65-85: 222 votes (17.6%)
  • 90+: 132 votes (10.4%)
  • Total voters: 1,264
The question might really be what the minimum FPS is, since FPS fluctuates during a game. For example, my system will often run at 75 FPS and at other times drop down to 35, which is too low. So in that sense I would prefer 100 FPS on average so that it never dips below 60 FPS.
 
Minimum FPS is the determining factor of "smoothness". Average FPS means nothing. A minimum FPS of 30 will seem smooth. You can't tell the difference between 50fps and 120fps. You are lying to yourself.
 
Correct me if I'm wrong, but the human eye will perceive "smooth" at 32 frames per second. So anything after that is just icing on the cake.
 
The human eye can capture 18 frames per second; however, because of misalignments and so on, even 18 seems jittery. The average "smooth" rate is ~24-26fps. This, however, does not count limitations brought on by the game. A game can be running at 24fps, but if your computer's speed is not sufficient, those 24 frames will still appear jerky.
 
Flasher702 said:
Minimum FPS is the determining factor of "smoothness". Average FPS means nothing. A minimum FPS of 30 will seem smooth. You can't tell the difference between 50fps and 120fps. You are lying to yourself.

Are you guys sure? I know I can see a difference. I'm so used to 50Hz gaming that as soon as I play in 60Hz mode it's faster and smoother :shrug:
 
I don't understand why this is still an issue for those who think that you cannot detect smoothness past 30fps or 60fps, or whatever it is you think.

I guarantee that the majority of us here can tell the difference between vsync-enabled 85fps and 100fps. I can. Don't tell me I can't... I CAN! I have even done a 100fps vs. 120fps comparison with my friends and I caught it. That was with vsync enabled.

The human eye is rated in a different manner. Synchronizations between your eye and a monitor are in no way perfect. That is why you can see the difference (in my opinion).

THE FACT STILL REMAINS THAT WHEN IT COMES TO COMPUTER GAMING YOUR EYE CAN AND WILL SEE THE DIFFERENCES IN FPS!!! Stop arguing about this, and go test yourself. No one here is lying.

The difference between 100fps and 120fps may not be noticeable to everyone, but I did this with my friends numerous times.

What some of you are talking about may not apply to computer gaming and synchronizations between your eye and a computer monitor.
 
In a game, when it says the FPS is XXXfps, that is an average. If it displays 1 frame in the first 0.25 seconds (4fps) and 99 frames in the next 0.75 seconds (132fps), it will tell you 100fps. However, it's not going to look smooth at all. Average FPS in a game is not an accurate way to measure "smoothness". Minimum FPS is. There isn't a standard way to measure minimum FPS, but you get the idea.

(The following are from my memory and Googling; if you can correct them or quote a more accurate source, please do so.)

minimum "video": 14fps
Movie theatre: 24fps (and most prime-time TV)
NTSC video tape: 30fps
480i DVD (480 scan lines): 30fps (60 interlaced fields per second)
progressive-scan 480p DVD: 60fps (480 lines of vertical resolution)
HDTV (1080 scan lines): 60fps (the maximum HDTV standard; no one broadcasts this, and not all HDTVs can even display it)

Almost everyone can tell the difference between 24fps and 30fps if you mention to them that there is a difference. Most people can barely tell the difference between 30fps and 60fps even if you prompt them; some can't tell at all. So, if you're saying that you need at least 100fps or it seems choppy, realize that you're using at least 100 *average* FPS as a benchmark to avoid dropping your minimum FPS below a certain point at which it negatively affects your reaction time (probably about 50fps).
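
To put rough numbers on it, here's a quick Python sketch I threw together (my own illustration; the frame times are made up, not pulled from any real game):

Code:
# A one-second window: 1 slow frame, then 99 fast ones (illustrative numbers).
frame_times = [0.25] + [0.75 / 99] * 99   # seconds per frame

total_time = sum(frame_times)              # ~1.0 second
average_fps = len(frame_times) / total_time
minimum_fps = 1.0 / max(frame_times)       # rate implied by the slowest frame

print(f"average: {average_fps:.0f} fps")   # 100 fps -- what the counter reports
print(f"minimum: {minimum_fps:.0f} fps")   # 4 fps -- the hitch you actually see

The counter happily reports 100fps for that second even though one frame hung around for a quarter of a second.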
 
I like to have the game run in the 35-45 range on average, as long as it doesn't dip below 25fps. I like to turn on as much as possible; there's no need to run at 80fps.
 
@Flasher702

Yes, but I'm talking about VSYNC ENABLED. It's already all in sync with the monitor's refresh rate. Also, since I do have a powerful enough card (X850 XT @ PE speeds), my game WILL RUN AT 100fps minimum and maximum, I'm almost sure. If my monitor's refresh rate is 100Hz at 1152x864 (what I used to play at on my old CRT), and the game says 100fps constant, then that means the minimum is also 100fps. Why? Simply because I'm already capped at 100fps, meaning the maximum can't rise above that, so if it displays 100fps, there is no low minimum being averaged out by a higher maximum; thus, I really am at a solid 100fps average, with vsync enabled.

Anything other than a video game is a whole different story; I don't know how I would do there if FPS came into consideration. They probably work entirely differently.

My 21" crt gave me a CONSTANT 100fps, no dips or crests.
 
funnyperson1 said:
This is false; many people can see the difference between 80 and 100 fps. There was a great article posted here somewhere about this.

25 fps is only the point at which pictures blur into motion. There is definitely a sensing of faster/slower motion beyond that point.


Proof? Everything I've ever read or seen states 30, and that's coming from everywhere from places like Beyond3D to college professors explaining motion capture...

So, yeah, I'd really like to see proof.
 
Dude, we're talking about how we can see the difference between a "game's" 80fps vs. a "game's" 100fps.
 
I'm telling you that you can't; it doesn't work the way you're saying it does.
That's basically a rumor propagated by people who want to support something stupid by saying that it's worth the extra $100 to get 100 FPS vs 80 FPS.
It doesn't matter.

And why is "game" in quotes?

The human eye cannot meter FPS as accurately as people like to think it can; it's just not that great. So stop thinking you're so amazingly special that you can see 100 FPS as different from 99 FPS and that you can meter things out.
 
obobskivich said:
I'm telling you that you can't; it doesn't work the way you're saying it does.
That's basically a rumor propagated by people who want to support something stupid by saying that it's worth the extra $100 to get 100 FPS vs 80 FPS.
It doesn't matter.

And why is "game" in quotes?

The human eye cannot meter FPS as accurately as people like to think it can; it's just not that great. So stop thinking you're so amazingly special that you can see 100 FPS as different from 99 FPS and that you can meter things out.

I really think you should come over to my house and test me. Have you seen 85fps with vsync on versus 100fps with vsync on, on my computer? No, you haven't. I'm not saying that my eye can ACTUALLY SEE 100fps, but the fact is that if you set my game to cap at 85 and then set it to cap at 100, I will definitely see the difference.

Please, you really don't know what you're talking about. If you're thinking physics, then on that basis you may be correct... but if you're thinking GAMING and GAMING FPS and what it looks like to my eyes on my monitor, then you're incorrect. I don't understand why you're telling me that what I'm seeing is false, because I already tested this with my friends. Want me to make a video of this? I mean, why are you being so defensive about it?

The eye sees things at its own rate, but that doesn't mean the game's fps and the eye's capture are in sync... if they were, you probably would be correct.

Why else do they come out with refresh override programs? They make these programs so that you can break the 60Hz/60fps barrier while vsync is on.
 
The fact that you "think" you have seen a difference between 85 and 100fps is virtually irrelevant. The general consensus is that scientific studies have shown the human eye can see only ~24-26fps. Everything else is purely psychological.
 
Midnight Dream said:
The fact that you "think" you have seen a difference between 85 and 100fps is virtually irrelevant. The general consensus is that scientific studies have shown the human eye can see only ~24-26fps. Everything else is purely psychological.
You people who claim I cannot see the difference are close-minded. If I can guess 85fps vs. 100fps correctly 10 times in a row with my friends testing me, then either I am psychic or I have experienced a very rare coincidence.

Your basis is all hearsay, yet mine is first-hand. I have 5 monitors at home that I've used, all with different refresh rates that perform at different fps with vsync enabled. I have used the 21" CRT only because I can see the difference in fps.

I will prove you all wrong if you show up to my house. I will ace any experiment you attempt on me, as long as it's not a difference of less than 10fps since I have not tried that.

If you say, here's experiment A and here's experiment B, and you give me two choices of fps and I need to match which went with which, I will ace it every... single... time! (10-15fps differences at anything no more than 100fps -- this is ONLY for vsync enabled. I have not tried it disabled.)
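
And just to put a number on the odds (my own back-of-the-envelope math; the trial count is only an example):

Code:
# Chance of acing a blind two-choice A/B test purely by guessing.
n_trials = 10          # ten runs in a row, as an example
p_guess = 0.5          # 50/50 if the two caps really were indistinguishable

p_all_correct = p_guess ** n_trials
print(f"P(all correct by luck) = {p_all_correct:.4f}")   # 0.0010, roughly 1 in 1024

If I couldn't actually see the difference, going 10 for 10 by luck alone would happen about one time in a thousand.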
 