
How many FPS is smooth to you?


How many FPS for a game to be considered Smooth?

  • 25-35

    Votes: 129 10.2%
  • 35-45

    Votes: 378 29.9%
  • 55-65

    Votes: 403 31.9%
  • 65-85

    Votes: 222 17.6%
  • 90+

    Votes: 132 10.4%

  • Total voters
    1,264
IMOG said:
Yeah, I agree on the FPS importance difference between campaign and multi. Except FC does have a decent multiplayer, but that wasn't your point, and it isn't really popular yet AFAIK.

Do most good monitors support a refresh rate of 125Hz at a good resolution? I have never had a good CRT... went from an old Gateway 15" monitor to a 17" Compaq MV700 (1024x768@85Hz), both freebies.


Those IBM/Sony monitors I linked to some time ago in the Cyber Deals area all did 1024x768@120Hz (a supported resolution, not some funky non-supported res).
 
Not quite sure, probably 40. But that's odd, because television and video run at about 29fps and look perfect. Most DivX rips are 25fps. I wonder what makes 25 on a video card look choppy.
 
veryhumid, the reason movies and TV look smooth at around 30fps is that each frame blurs into the next, whereas a game puts up a picture, then takes it down and pops up a new frame with no blurring. Search the forums and read the entire thread and you'll find a much better explanation....
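To make that concrete, here's a rough sketch of the difference (my own toy example with made-up numbers, not something from this thread): a film camera effectively smears the scene across the time its shutter is open, while a game frame is one crisp snapshot of a single instant.

```python
# Toy illustration (assumed numbers): film frames integrate motion over the
# exposure window, game frames sample it at a single instant.
def object_position(t):
    """Hypothetical object moving at 500 pixels per second."""
    return 500.0 * t

def game_frame(frame_index, fps):
    # One crisp sample at the instant the frame is drawn -> no built-in blur.
    return object_position(frame_index / fps)

def film_frame(frame_index, fps):
    # The object appears smeared between where it was when the shutter opened
    # and where it was when it closed -> built-in motion blur.
    shutter_open = object_position(frame_index / fps)
    shutter_close = object_position((frame_index + 1) / fps)  # assume a full-frame exposure
    return shutter_open, shutter_close

for n in range(3):
    lo, hi = film_frame(n, 24)
    print(f"frame {n}: game = sharp dot at {game_frame(n, 24):5.1f}px, "
          f"film = smear from {lo:5.1f}px to {hi:5.1f}px")
```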

Another thing to note is that the reason people like really high frame rates is that even 125fps is just an average; it doesn't mean a new frame gets put up every 1/125th of a second... there could be a longer delay to render a particular frame for whatever reason (complex scene, mis-rendered, etc.). That delay would be very bad for someone running a 40fps average... the glitch would cause the appearance of 20fps for that moment... very bad!

For me, after around 75 I can't really notice the difference (partly because my monitor's only 85Hz), but for people who run 40 and lower, your cards must be putting those images up really smoothly for you not to notice... or it could be that you notice, but it just doesn't bug ya. That's where I think the biggest difference is... it's just what exactly "smooth" means to you, not what you can physiologically detect.
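Here's a quick sketch of that point with made-up frame times (nothing measured): the average FPS can look healthy while a single slow frame drags the instantaneous rate way down, and that hitch is what you feel.

```python
# Hypothetical frame times in milliseconds: nine fast frames plus one 50 ms hitch.
frame_times_ms = [8, 8, 8, 8, 50, 8, 8, 8, 8, 8]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_instant_fps = 1000.0 / max(frame_times_ms)

print(f"average FPS over the window: {avg_fps:.0f}")               # ~82 fps, looks fine
print(f"instantaneous FPS at the hitch: {worst_instant_fps:.0f}")  # 20 fps, feels like a stutter
```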
 
Well said, ajrettke. Take a look at UT2k3/4's stat fps tool. It displays a current FPS and an average FPS. The average looks like what I've seen in other games' FPS tools; the current FPS is obviously an instantaneous measure of the frame rate. Anyway, the point is that the instantaneous FPS can dip down significantly from the average -- this is my understanding and observation, anyway.

Oh, I like 55-65fps ideally... but I'll play at 35-45 without complaints.
 
I'm sorry, but I didn't feel like reading all the posts.

I'm EXTREMELY used to 100fps, since I play Condition Zero (new CounterStrike) with VSync at 1152x864.

I have a "built by" ATI 9700Pro, but my old ti4600 (b4 I shorted it with my gold bracelet) used to do 120fps.

My Ti4600 ran Half-Life and CS much better, it seems.

100fps is solid to me. 85fps ain't so bad. 60fps is the BARE MINIMUM!

I used to play Half-Life at 1600x1200 @ 85fps on my Ti4600 (either the vid card or the 21" KDS didn't support more). I'm scared of over-refreshing my monitor. I have a KDS VS-21 monitor. Could I get away with refreshing 15Hz over spec, say 100Hz at 1600x1200 instead of the default max of 85Hz?
 
I don't like going below 60. That said, anyone who said 90+ is diluted, as the eye cannot detect anything faster than 60fps.
 
TenementFunster said:
That said, anyone who said 90+ is diluted, as the eye cannot detect anything faster than 60fps.

But you must realize that the refresh rates of our eyes and the refresh rates of a computer monitor do not match each other. For that reason, my eyes DO see a SIGNIFICANT difference between 60fps and 100fps. As a matter of fact, I even see a difference between 100fps and 120fps.

I am not diluted (I don't understand how diluted makes sense in this situation anyways).

I am positive someone else would agree with this. If our eyes matched the monitor's refresh rates at the same exact moments that they refreshed (which is 100% impossible) then I may agree with you. Still though, I don't understand how "diluted" makes sense in the situation. :)
 
Thanks for explaining, ajrettke, I didn't know that. I run with vsync on my 60Hz monitor and it is very smooth. About 40-45 it starts to feel choppy.

This was a very good post, earlier: http://amo.net/NT/02-21-01FPS.html

Very interesting, but I don't understand how motion blur exists for our eyes with film/TV, yet not with a graphics card driving an LCD/CRT monitor.
 
I really HATE anything under 85fps. 50-60fps is the lowest I'd ever play at.

I've always played with high FPS, and couldn't go any lower. As I've said before, I played HL at 1152x864 at 120fps. My monitor is 21" and can withstand the high refresh rates, but 100fps is just perfect to me. I now play Counter-Strike: Condition Zero all the time at 100fps.
 
define smooth

I like 85fps for multiplayer (my crappy monitor only supports 85Hz@1024x768; max res is 1600x1200@60Hz),

but anything above 30 is great for single player.

Also, FPS is NOT everything; internet connection can also have a significant effect on gameplay. If you're like me, playing games with an average ping of 125-200, you don't see that much of a difference online,

but the difference is definitely there at LANs.
 
20-35 is usually what I play my games at, online, offline, or whatever... I can kick ass in Unreal at whatever frame rate, lol... headshots are all the same whether you see their heads go by at 20 frames or 60... probably because I'm used to shIEtty frames, since I've been playing with outdated computers my whole life, hahah.
 
I have to say 50fps is smooth in an FPS for me, but what really ^@%@ me off is how crappily coded games give an advantage to those with super frames, etc. I personally believe, however, that this "advantage" is overblown in most scenarios and is just an excuse for pathetic game skillz. I still believe your personal skills matter most; your computer gives a small advantage, but over 125fps in a Quake engine still does not make you a good gamer. The main thing I notice is that you have to get used to what you are playing the game on. If you switch rigs, you would initially do worse because it runs slightly differently. If you are not used to low fps, you die. However, people used to those frames can easily compete with most others, as long as they are not on a noobed engine. From what I hear, you move faster in Quake with super fps?? Anyway, as long as it's smooth I'm happy, but even Halo is still fun at a jaggy 30fps.
 
FPS means very little in my book :D I can usually play at 30FPS just fine on my monitor... Some will wonder how I could ever stand it, but until you realize that this is a 15" monitor (14" viewable :eek: ), you won't understand.

You see, it's not the eye/brain seeing 100, 50, 30, or even 5 discrete pictures each second that makes an image jerky. It's how different each of those discrete pictures is that does that. Something will look just as smooth at 5FPS as at 100FPS so long as the distance an object moves between frames is the same in each case. It's how sensitive our eyes/brains are to movement, and how large a display we're looking at, that determines what framerate we find to be smooth.
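A back-of-the-envelope sketch of that idea, with made-up speeds and screen widths (my example, not figures from this post): what the eye keys on is how many pixels something jumps between consecutive frames, and that jump grows with display size and object speed.

```python
# How far does an object jump between consecutive frames? (assumed numbers)
def pixels_per_frame(speed_screens_per_sec, screen_width_px, fps):
    """Distance travelled between two consecutive frames, in pixels."""
    return speed_screens_per_sec * screen_width_px / fps

for width_px in (800, 1600):              # smaller vs larger display
    for fps in (30, 60, 100):
        jump = pixels_per_frame(0.5, width_px, fps)  # object crossing half the screen per second
        print(f"{width_px:4d}px wide @ {fps:3d}fps -> {jump:5.1f}px jump per frame")
```

The same motion that looks fine on a small screen turns into much bigger visible jumps on a larger one, which lines up with the 15" monitor point above.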


veryhumid said:
very interesting. but I don't understand how motion blur exists with our eye on film/tv, but not with a graphics card using lcd/crt monitor.
Well, in the case of TV it's really easy to see where the motion blur comes from once you understand that TV is an interlaced medium. When your TV draws an image, it does it in two passes. The first scans from top to bottom, drawing every other line. The second scans down again, drawing all the lines it skipped the first time. Because the image sent to the TV will have changed in the 1/60th of a second the first pass took, the second pass will be slightly different, providing a crude form of motion blur.
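Here's a tiny sketch of that weave with made-up numbers (not anything from the post): build one frame out of two fields sampled 1/60 s apart, and a moving object lands in slightly different spots on alternating lines, which the eye reads as a crude blur.

```python
# Toy interlacing weave (assumed numbers): one frame built from two fields
# sampled 1/60 s apart, so a moving object is offset on alternating lines.
def object_x(time_s, speed_px_per_s=240):
    """Horizontal position of a hypothetical object at one instant."""
    return int(speed_px_per_s * time_s)

def interlaced_frame(start_time_s, lines=8):
    first_field_x = object_x(start_time_s)            # lines drawn on the first pass
    second_field_x = object_x(start_time_s + 1 / 60)  # lines drawn 1/60 s later
    rows = []
    for line in range(lines):
        x = first_field_x if line % 2 == 0 else second_field_x
        rows.append("." * x + "#")                    # crude one-object scanline
    return rows

for row in interlaced_frame(0.0):
    print(row)   # the '#' alternates position from line to line -> comb/blur effect
```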

I don't know how film does it though.... :-/
JigPu
 
It depends on the game. I like a minimum fps of 30 for most games, with an average of 45-55. In some games, like CoD, where physics is tied to the framerate, I like to have 85+. To me, the speed of the physics is more important than frame rate, as the eye cannot see more than about 25-30fps, but the eye can detect lagging or slow physics. Film keeps things smooth at 25fps by keeping exactly the same delay between each frame, while on computers it is an average.
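For what it's worth, here's a toy sketch of the general idea behind "physics tied to the framerate" (my own example with made-up constants; it isn't how CoD or Quake actually implements it): if the simulation is stepped once per rendered frame with simple Euler integration, something like a jump's peak height comes out differently at different frame rates.

```python
# Per-frame Euler integration of a jump (assumed constants): the peak height
# depends on the step size, i.e. on the frame rate the physics is tied to.
GRAVITY = -20.0     # units per second squared
JUMP_SPEED = 10.0   # initial upward speed, units per second

def peak_height(fps):
    dt = 1.0 / fps                      # physics step = one rendered frame
    height, velocity, peak = 0.0, JUMP_SPEED, 0.0
    while velocity > 0 or height > 0:
        velocity += GRAVITY * dt        # integrate gravity
        height = max(0.0, height + velocity * dt)
        peak = max(peak, height)
    return peak

for fps in (30, 60, 125):
    print(f"{fps:3d}fps -> jump peaks at {peak_height(fps):.3f} units")
```

Engines that decouple physics from rendering step the simulation at a fixed timestep instead, so the result stays the same no matter what rate the frames come out at.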
 
Anything below 60fps sucks to me... 85-100 is what I want, PERIOD! My eye doesn't like anything less than 60fps, heck, even 85fps. I don't care what reasoning there is... I can SEE the difference. Whatever explanation there may be won't change the fact that I can really feel a difference between 60fps and 100fps. There IS a significant difference. A lot of it is relative: if you're used to 30fps, you'll LOVE 100fps. If you're used to 100fps, you will HATE even 50fps.
 
Anything under 85fps makes me literally sick (game-wise; Windows is fine since it's a static image for the most part). I prefer 100 minimum for all games, but that's just impossible. Right now, my lowest is 30fps in CoH and I can only stand it for so long (maximized window mode, makes multitasking less laggy).

So:
85+ = minimum
100+ = best

Oh, and yes, I can tell the difference between 85 and 100, especially when playing a shooter and you need reflex and aim. More FPS means better accuracy and reflex timing.
 