
what's the minimum playable frame rate?

I myself consider 25 the minimum; anything less becomes choppy. I prefer to have it in the 40-50 range if possible.
 
john240sx said:


I'm sure there are going to be a few people who will argue with that, but I have to agree.

The frame rate for TVs is just under 30fps, I think. (Not sure, I may be wrong.)

Dunno what TVs are at, but all movies are done at 24fps as far as I know.
 
Maxvla said:
I myself consider 25 the minimum; anything less becomes choppy. I prefer to have it in the 40-50 range if possible.

Yeah, I was running GLExcess today just checking out this card I got. On the liquid scene, at about 24fps it looked choppy, but at 25+ it was kinda hard to tell if it was.
 
I think this is a really interesting subject, mainly because I seem to disagree with most people.
OK, before I start writing here, I should mention that I'm from a tiny country in Europe named Norway. In Norway we speak Norwegian, so I guess my English is rather crappy.

A common TV runs at 25fps. Every television frame has a blur effect. This is the biggest difference between computer animation (games) and television animation. In the cinema you see 24fps. Sometimes I don't even have to concentrate to see that television animation is choppy: I can see where the first image ends and the second begins. (If you don't believe it, I won't force you to.) This is mainly during action scenes or scenes with quick movement.
On top of that, the refresh rate of a TV is about 50Hz (I mentioned Europe, didn't I?). New HDTVs have refresh rates close to 100Hz. Ah, lovely to look at!
When it comes to the refresh rates of computer monitors, I can easily notice the difference between 60Hz, 85Hz and 100Hz. I have proven this to many non-believer friends. I have never seen 120Hz, so I can't comment on that. 60Hz makes me sick; I can't look at it for more than a few seconds.

A friend of mine can easily play a game at a constant 25fps. I can't. No way. It's far too choppy.
Take NOLF2 as an example. I play at 1024x768 with medium details. He does too.
I have a GeForce 4 Ti 4600 (325/725), 512MB DDR RAM (150MHz x2) and an Athlon XP (1650MHz).
Here's the fun thing: he has a GeForce 256, a 650MHz Athlon and 512MB SDRAM. We play at the same resolution and IQ, with not exactly the same performance, to put it like that. (I get 12,500 in 3DMark 2001, so there's nothing wrong with my comp.)

I usually play at 100fps and yes, I can see a huge difference between that and 60fps. So what am I trying to say? I think it's pure BS to say that the hardware of today is far superior to the software. I think the software is way too demanding for the hardware. R9800 Pro, here I come!!
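
Just to put some rough numbers on the refresh rate thing, here is a little Python toy showing what plain double-buffered vsync does: the frame rate you actually see snaps to refresh/n for a whole number n. The figures below are made up for illustration, not measured on my rig.

# Toy calculation, assuming classic double-buffered vsync:
# the displayed frame rate snaps to refresh / n for some whole number n.
def vsync_fps(refresh_hz, raw_fps):
    """Frame rate you actually see with vsync on."""
    n = 1
    while refresh_hz / n > raw_fps:  # first divisor the card can keep up with
        n += 1
    return refresh_hz / n

for refresh in (60, 85, 100):
    for raw in (40, 70, 95, 130):
        print(f"{refresh}Hz screen, card renders {raw}fps -> you see {vsync_fps(refresh, raw):.1f}fps")

So a card that renders 95fps on a 100Hz screen shows only 50fps with vsync on, which is one reason the monitor's refresh rate matters as much as the raw fps.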
 
30fps.
Anything slower than this and you're seriously going to start seeing your frag counts drop :) You obviously have to see your enemy to kill it, and with a frame rate lower than 30fps you're slimming your odds.

It's hilarious when people say they play games at less than 30fps, which in reality lowers their overall gaming value. In fact, there are reviewers who become biased when they throw a cool PC game into their PC just to find out it runs horribly... so then they give it a bad score because of this. They'll say something like the game was poorly optimized, or so forth.

Hope this helps a little.


OC-Master
 
Oh yeah,

Don't compare 2D to 3D.

2D needs the frame rate to be between 24fps and 30fps, while 3D is artificial and needs double the frame count to look realistic... it's just how it goes.

DVDs are all 30fps, 480x720 NTSC standard. Your antenna television is all 384x512 and 24fps, as are tapes.


OC-Master
 
For fast-paced multiplayer games: 50+

For single-player: 25+


And saying that 25 is enough and the eye can't see more is pure BS. Tests have shown that almost all people can see a difference between 30 and 60fps, and some fighter pilots can see up to 220fps. When you're watching TV or a film there's a thing called motion blur going on, which makes them look smooth enough at 24fps. When you're playing on your computer the frames are different (sharper, no motion blur), so the transitions between frames are more noticeable.
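
To put that in plain numbers, the time each frame sits on screen is just 1000/fps:

# Nothing clever here, just 1000/fps in milliseconds.
for fps in (24, 25, 30, 60, 100, 125):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")

Film hides those 40-odd milliseconds with motion blur; a rendered game frame is razor sharp, so the same gap reads as a visible step.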
 
oddarne84 said:
I think this is a really interesting subject, mainly because I seem to disagree with most people.
...
I usually play at 100fps and yes, I can see a huge difference between that and 60fps. So what am I trying to say? I think it's pure BS to say that the hardware of today is far superior to the software. I think the software is way too demanding for the hardware.

Nice post, man!!!

I personally can't stand anything less than a 32fps minimum. 60fps looks cool, but 100fps is ecstatic!!!! :D I can feel the difference, and you're right on the gaming value thing. I'd rather wait, with some burned CDs lying on my rack, until I get good hardware than play them at crap IQ and fps...

As proof: when I go to the cinema, the first 10 minutes are horrible. I can see the flicker all over the picture whether it's blurred or not. Damn l337 hardware ;)
 
Overclocker550 said:
15fps minimum from short distances
20fps minimum for sniping
30fps threshold for fooling the fastest eye into fluid motion
60fps overkill LOL

Since this topic has already been covered, I'm going to save myself some time and quote my earlier response:

larva said:

Heh, this is what people with slow systems like to believe, and repeat over and over to themselves to reinforce their decision not to buy current hardware.

Come play Q3A online with less than 125fps and you will find yourself at a pronounced disadvantage against those who have it. The game engine's physics work out best at 125fps, letting you jump higher and run faster, and the feel is vastly better. Playing any game in single player is a cinch; try it with some internet-induced lag and the vastly superior reflexes and unpredictability of good human players in a truly fast-paced game like Q3, and the feel and immediate response afforded by 75+ fps makes a big difference, even if the game engine's physics aren't affected the way they are in Q3.

And before I get a litany of responses saying how you own online with your TNT2, note that I was ranked #9 worldwide in Q3A DM last week. There is a difference between thinking you are owning and actually doing it. If you are serious about winning and using a GF4, you are running 1024x768 (at most) and no AA or AF of any kind. And this is assuming you are carrying a big stick, system-wise (a high-FSB/memory P4 at 2.4+ GHz, or an unlocked high-FSB/memory Athlon XP at an actual 1.8GHz or more). If you are making a painting, turn on every option you want. If you are being competitive, turn all that crap off.

Note that ATI 9700s droop considerably less with these options enabled than a Ti4600 does. If you feel you can't live without silly resolutions and AA and/or AF, spend the money on a 9700. The Ti4600 is just as capable with no AA, but markedly less effective with it. Everyone's priorities are different, and player skill is a huge factor. But sooner or later you will play against someone with skill equal to yours (or vastly greater...), and if they have a machine like mine set up realistically and you don't, you have no chance. Ping is also a giant factor, but whatever your connection quality is like, pairing it with truly high effective system performance will always maximize your results.

As far as what the minimum playable rate is, that is a subjective call. I know some very skilled players who compete well at 25-30fps. But they know how big a disadvantage it is, and they compete even more favorably once they get real fps. For most games, 75fps is enough. For Q3, best bring 125. What is the minimum playable? That depends on your personal characteristics and on how long you can ignore the advantage big fps gives you in order to put off spending money.
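
If anyone wants a feel for why a frame cap can change the physics at all, here's a toy jump simulation. This is NOT Q3's actual movement code, just simple Euler integration with the frame time rounded to whole milliseconds, using Q3-ish guesses for jump speed and gravity; it only illustrates that quantized frame times nudge the result.

# Toy physics sketch, not Quake 3's pmove: integrate a jump with the frame
# time rounded to whole milliseconds and watch the apex shift with the cap.
def jump_apex(fps_cap, jump_speed=270.0, gravity=800.0):  # assumed Q3-like values
    dt = round(1000.0 / fps_cap) / 1000.0  # engine sees whole milliseconds per frame
    vel, height, apex = jump_speed, 0.0, 0.0
    while vel > 0 or height > 0:
        height += vel * dt
        vel -= gravity * dt
        apex = max(apex, height)
    return apex

for cap in (43, 76, 125, 333):
    print(f"fps cap {cap:>3}: jump apex ~{jump_apex(cap):.2f} units")

Different caps give slightly different apex heights purely from the rounding, which is the same flavor of effect the 125fps setting exploits.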
 
I always think it's funny when people play at 640x480 with all options off and maximum contrast on their high-end hardware so they can win Q3 games. I'd get sick if I had to look at such a crappy screen all the time. I'd rather have it look more real, even if it is more challenging. Play for fun instead of sacrificing everything for fps to win. If that's all it was about, they wouldn't even bother texturing the walls.

And I can definitely tell the difference between 15, 30, 60, 85 and 100fps. It depends on the game, though. 15-30 is fine for sim games, 30-60 for racing or mech games, 60-85+ for shooters. Refresh rate makes a huge difference on the eyes too. One reason TV doesn't bother you as much is the motion blur between frames on that kind of display. A monitor is so sharp that it's easier to see a low refresh rate when things are moving. That's why HDTV is 100Hz (I think someone mentioned it was 100?).

-Rav
 
If you are running a T-bird or a P3, 640x480 it is if you want to win. Modern hardware does allow a greater performance envelope. A well-executed P4 running 2.2+ GHz, or an Athlon XP running 1800MHz and up, in conjunction with a GF4 Ti allows 1024x768 with full detail, should you prefer it. AA is still a big no-no. ATI 9700s allow AA if you wish. But the popularly conceived notion (most popular amongst those with slow machines) that you can run 1280x1024 at full detail with AA on and not suffer hugely for it performance-wise ignores the reality of the situation.

Most Quake 3 players play for the competition, not the beauty of the game. Q3 wasn't even awe-inspiringly beautiful at the time of its introduction, and it was unplayable at high resolution and full detail on the hardware available back then. Nowadays the hardware has caught up with Q3, but new games still extract severe fps penalties if you wish to take advantage of all the eye candy they bring to the table.
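
Some rough pixel arithmetic shows why those settings hurt. This ignores overdraw and shader cost completely, so treat the numbers as lower bounds, not benchmarks, and the "4xAA = 4x the samples" assumption is a simplification.

# Back-of-the-envelope pixel throughput needed to hold a target frame rate.
settings = [
    ("1024x768, no AA",  1024,  768, 1),
    ("1024x768, 4xAA",   1024,  768, 4),
    ("1280x1024, no AA", 1280, 1024, 1),
    ("1280x1024, 4xAA",  1280, 1024, 4),
]
target_fps = 75
for name, w, h, samples in settings:
    mpix = w * h * samples * target_fps / 1e6
    print(f"{name:<18} -> ~{mpix:7.1f} Mpixels/s at {target_fps}fps")

1280x1024 with 4xAA asks for about 6.7 times the raw pixel work of 1024x768 with none, before the game itself does anything.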
 
Excelsior said:
Well, DVDs are 24fps.

As noted above, televisions use tricks to make the effective frame rate much higher. But even if it were only 24fps, that has nothing to do with playing a game. As pointed out, analog signals are less sensitive to low fps than digital ones. And it would be 24fps, constant. If you have a game that averages 24fps, then during moments of high screen activity and rendering load you are going to dip to 1-5fps, something even the users most determined to deny the need for real hardware can't ignore. And finally, the DVD is not trying to shoot you. What fps is tolerable for simply viewing action has no relevance to what fps is necessary once you want to interact with it.
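
A quick made-up example of how an "average 24fps" can hide the dips (the frame times below are invented for illustration, not measured):

# 100 invented frame times in milliseconds: mostly ~30ms with a few big spikes.
frame_times = [30] * 95 + [300, 250, 400, 200, 167]
avg_fps = 1000 * len(frame_times) / sum(frame_times)
worst_fps = 1000 / max(frame_times)
print(f"average: {avg_fps:.1f} fps, worst single frame: {worst_fps:.1f} fps")

The average works out to about 24fps, but the worst frame sits on screen for 0.4 seconds, which is exactly the kind of moment that gets you killed.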
 
http://amo.net/NT/02-21-01FPS.html

Guys, you should really read this.

Short and sweet: the minimum acceptable frame rate depends on what you are looking at. Monitors NEED a higher minimum than TVs, and LCDs NEED a higher minimum than monitors. As the technology improves, so do its demands.

BTW, oc-master: NTSC may well be 30fps, but PAL is 25.
 
james.miller said:
http://amo.net/NT/02-21-01FPS.html

Guys, you should really read this.

Short and sweet: the minimum acceptable frame rate depends on what you are looking at. Monitors NEED a higher minimum than TVs, and LCDs NEED a higher minimum than monitors. As the technology improves, so do its demands.

BTW, oc-master: NTSC may well be 30fps, but PAL is 25.

Yup, correct.

But not to worry, people overseas can use HDTV, which is 1080p at 30fps :) So it's a win-win situation now!


OC-Master
 
To be honest, you don't really notice it. In fact, the picture quality of PAL is superior to NTSC due to the greater resolution. NTSC progressive scan is the best, of course, but true PAL progressive is starting to appear...
 
I've seen this topic posted before and most of the responses make perfect sense, but:

TV looks good at low frame rates because of motion blur, and computer games need a higher frame rate because there is no motion blur. Right?

Why, then, does Metroid on the GC look so smooth on the TV at 30fps? Does the GameCube incorporate motion blur? Or does the interlacing make up for the low frame rate?
 
It's the TV doing the blurring. That's why all console games look so smooth. It's not until you see them on a monitor that you realise just how choppy they really are.
 