
What is the max FPS the human eye can see?


BPM
Member, joined Sep 19, 2003, Fort Worth, TX
Given that most people choose to rank a video card in several areas, and one of those being FPS, I began to wonder...

What is the maximum FPS the human eye can recognize? I want to say I heard around 30fps, but that was more than a few years ago. (Meaning that above 30fps, you can't tell a difference.)

I myself have probably never played a game at more than 30 fps so I wouldn't know if there's a difference or not.

If someone can provide some real optical/medical data on this, that would be great :D
 
There have been several threads on this exact topic, if you search for them you should be able to find a fair amount of information. :)


First off, framerate is only half the problem. Generally, 30FPS is considered the threshold for movement. It is at this framerate that "similar" still images will be stitched together within the brain to give the illusion of fluid movement. However, if the frames are dissimilar, the brain cannot stitch them together as easily, meaning that a higher framerate is required before the illusion of fluid motion kicks in. Conversely, if the frames are extremely similar, the brain can stitch them together very easily, and a lower framerate is enough for fluid motion to kick in.

So when would these conditions occur? Well, at 30FPS, similar images aren't too hard to come by. You'll see them running down halls, running outside in an open plain, etc. What about times when the frames are dissimilar? At 30FPS, significantly different frames will appear when you do quick turns (for example, if you turn the camera 90 degrees in a quarter second [not too hard], each image will differ by 12 whole degrees!).
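To put rough numbers on that, here's a quick Python toy I threw together. The 90-degree quarter-second flick is the example above; the other framerates are just there for comparison.

```python
# Toy calculation: how far the camera view moves between consecutive frames
# during a fast turn. The 90-degrees-in-0.25-seconds turn is the example
# from the post above; the list of framerates is just for comparison.

def degrees_per_frame(turn_degrees, turn_seconds, fps):
    """Angular change of the view from one frame to the next."""
    turn_rate = turn_degrees / turn_seconds      # degrees per second
    return turn_rate / fps                       # degrees per frame

for fps in (24, 30, 60, 120):
    step = degrees_per_frame(90, 0.25, fps)      # a fast 90-degree flick
    print(f"{fps:>3} FPS: {step:5.1f} degrees between frames")
```

At 30FPS that prints 12 degrees per frame, matching the number above; at 120FPS the same turn only moves 3 degrees per frame, so consecutive frames stay much more "similar".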

Even monitor size can have an effect on just how different the frames are. Even though the image changes by proportionally the same amount, a 21" monitor will move each pixel further than a 15" one (at the same distance from the eye), leading to the brain "needing" more FPS on the bigger monitor. If you move the bigger monitor further from the eye, though, the screen (and the frame differences) takes up less of our field of vision, which is part of the reason why even 24FPS high-movement sequences look good on a movie theater's giant screen.
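Here's a rough sketch of that screen-size/distance effect, again just a toy: it assumes 4:3 monitors where 21" and 15" are the diagonals, picks arbitrary viewing distances, and supposes one frame shifts the image by 5% of the screen width.

```python
import math

# Rough sketch: how much of your field of view a given on-screen movement
# covers, for different screen sizes and viewing distances. Assumes 4:3
# monitors where the quoted sizes are diagonals; the viewing distances and
# the "5% of screen width per frame" shift are invented for illustration.

def visual_angle_deg(movement_inches, distance_inches):
    """Angle subtended at the eye by a movement of the given size."""
    return math.degrees(2 * math.atan((movement_inches / 2) / distance_inches))

def screen_width(diagonal_inches, aspect=(4, 3)):
    w, h = aspect
    return diagonal_inches * w / math.hypot(w, h)

for diag, dist in ((15, 24), (21, 24), (21, 48)):
    shift = 0.05 * screen_width(diag)       # one frame moves 5% of the width
    angle = visual_angle_deg(shift, dist)
    print(f'{diag}" screen at {dist}" away: 5% shift spans {angle:.1f} deg of view')
```

The same proportional shift covers a noticeably larger chunk of your vision on the 21" screen at the same distance, and shrinks again once you push the big screen back, which is the movie theater point above.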


Also, the brain's "fluidity limit" doesn't stop it from recognizing individual frames. If you insert a totally different frame (say, for subliminal messaging to buy Doom 8 Zillion ;)) every 30 frames, your brain will pick it up and "see" it. You won't have any idea what you saw (since the brain tried desperately to stitch it in with the rest of the frames), but your brain did see the image. This occurs even at 60FPS from what I've heard, and probably higher (though I have no reasonable information backing me up).


So basically, framerate means very little, especially when people try to argue that "XXX FPS is enough for anybody!". The eye can "see" a lot faster than 30FPS, and the brain doesn't care how many frames/second you throw at it, so long as each frame is similar enough to the last to stitch together.

JigPu
 
Interesting article; however, I looked around the site some more and began to doubt their impartiality. Some of the information may be reliable, but the manner of presentation fits the style of the rest of the site: PR and a sales pitch for new technologies.

The great reliance on that air force experiment "proving" that the eye can see 220 fps is somewhat misleading when you realise that what they are saying is that the eye could detect a 1/220th-second flash, and nothing else. If they put 3 pics of the plane in sequence, one with its tailfin missing, and you saw THAT, then that might prove something of relevance to the FPS debate. After all, it has long been proven that the fully dark-adapted eye can detect a single photon; it's impossible to get a shorter flash than that. Flood it with photons, however, and the time it takes the neurons to reset becomes the limiting factor. These are basically functions of the "rods" in the human eye though, and at normal illumination levels they are going to be overwhelmed. You're gonna have to be in a very black room with a very low intensity display to have your rods take much part in the perception of how many FPS you can see.

There is a big difference between how short a pulse of light the eye can respond to and how long the output pulse to the visual cortex lasts for that pulse of light. Searching on the internet will find you figures of around 0.05 seconds to "hundreds of milliseconds" for the minimum cycle time needed to distinguish independent pulses. The author appears willing to go into all sorts of other detail to impress (and maybe confuse) you, but not into the actual response time of cones, which would tend to disprove his assertions. There's an experiment you have probably done in school: take a disk painted with sectors of rainbow colors, spin it at better than 25 revolutions per second, and it looks white. It's a long-known experiment; sit and stare at it all you like, and if you can convince yourself that you can see the colors flashing past instead of white, then maybe you alone have 200fps eyes.

Good points are made, however, about how the eye's perception of the smoothness of moving images is improved by motion blur artifacts due to photographic technique. Another good point is that one's display hardware has to be up to the task: wiggle your pointer around on the screen, does it leave a blur?

I think what most of us are REALLY saying when we say 60fps is smooth and 30fps is jerky is that a game with an average 60fps that sometimes zooms to 80fps and drops to 40fps is smoother than a game that runs an average 30fps, zooms to 40 and drops to 20. It's not the 30 that bothers us, it's the 20fps it drops to occasionally, even if it's only for tenths of a second.
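It's easier to see why the dips matter if you look at frame times rather than framerates. Here's a little Python sketch; the two frame-rate sequences are invented to match the hypothetical games above, not measurements.

```python
# Frame *times* (milliseconds per frame) for the two hypothetical games
# described above. The exact FPS sequences are invented for illustration;
# the point is that a dip to 20 FPS means one frame hangs on screen for
# 50 ms, and that hitch is what you actually notice.

def frame_time_ms(fps):
    return 1000.0 / fps

game_a = [80, 60, 60, 40, 60]   # "60 FPS average" game, dips to 40
game_b = [40, 30, 30, 20, 30]   # "30 FPS average" game, dips to 20

for name, fps_samples in (("Game A", game_a), ("Game B", game_b)):
    times = [frame_time_ms(f) for f in fps_samples]
    avg_fps = sum(fps_samples) / len(fps_samples)
    print(f"{name}: avg {avg_fps:.0f} FPS, "
          f"longest single frame {max(times):.0f} ms, "
          f"shortest {min(times):.0f} ms")
```

Both games hit their averages, but the 30fps game's worst frame takes 50 ms against the 60fps game's 25 ms, double the hitch for your eye to trip over.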

regards,

Road Warrior
 
It's the drops that kill you. Your brain can handle a steady "unsmooth" FPS, as long as it is at a decent rate, and make it smooth by its own nature; but when it gets a drop from 40 to 20, it can't compensate because it hasn't been "running" that way, so you notice it (same as you'll notice 60 to 30). Drops are worse because the visual cortex can't predict anymore, and a lot of smooth vision relies on prediction. It's all in your head. :)

Rods are a moot point; they are there for environmental awareness, nothing more. That is why they are located in the periphery: you want something simple that gives simple information to the visual cortex, since complex info takes time to compute, and if something is coming at you, you may not have that time. Darwinism.

I agree with RoadWarrior about the article, and so does at least one of the two anatomy books I have.
 
This is interesting. I often wondered this myself.
Wondering whether 300fps on 3DMark was any better than 80fps on Far Cry.
They both look smooth. Well, of course, it was smooth even on my GF2... 3DMark, I mean.
 
I've usually heard that movies and TV are at 30FPS cuz that's about the maximum that the human eye can see. But when I play C-Strike and change the FPS_MAX from 30 to 100, I can totally see a difference. But as the people above me said, it's probably just the split-second framerate drops that make things look jerky.
 
kiyoshilionz said:
I've usually heard that movies and TV are at 30FPS cuz that's about the maximum that the human eye can see. But when I play C-Strike and change the FPS_MAX from 30 to 100, I can totally see a difference. But as the people above me said, it's probably just the split-second framerate drops that make things look jerky.
The resolution and complexity of the image are totally different between a movie screen and a monitor, hence the need for more FPS. Also, movies are actually shot at 24fps and stretched to 30fps for TV by repeating frames (3:2 pulldown), so some of those 30 frames are duplicates. Check out those links above ;)
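If you want to see where the duplicates come from, here's a simplified Python sketch. Real 3:2 pulldown works on interlaced fields, so treat this whole-frame version as a rough illustration of the idea, not how a telecine actually does it.

```python
# Simplified illustration of stretching 24 fps film to 30 fps video by
# repeating frames. Real 3:2 pulldown operates on interlaced fields; this
# whole-frame version just shows that every 4 film frames become 5 video
# frames, so some video frames are duplicates.

def pull_down(film_frames):
    video = []
    for i, frame in enumerate(film_frames):
        video.append(frame)
        if i % 4 == 3:           # after every 4th film frame...
            video.append(frame)  # ...repeat it to make a 5th video frame
    return video

film = [f"F{i}" for i in range(24)]    # one second of 24 fps film
video = pull_down(film)                # one second of ~30 fps video
print(len(film), "film frames ->", len(video), "video frames")
print("repeated frames:", len(video) - len(set(video)))
```

Run it and you get 24 film frames turning into 30 video frames, 6 of which are repeats of the frame before them.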
 
Neither your eyes nor your brain see things in frames per second. Your eyes, the connecting nervous system and your brain all function on chemical reactions and incredibly small electrical currents.

All of that basically means that person A could potentially see events taking only 1/200th of a second, whereas person B could potentially see only events taking 1/100th of a second and nothing faster. Furthermore, simply drinking more water or having more electrolytes in your body could actually change that value, maybe even drastically. Hell, a ton of things are capable of changing it: what you had to eat in the last 24 hours, body temperature, allergies ;), general state of health, age, even your general stress level can affect these chemical processes.

Also something to remember: frames rendered in movies and TV are radically different from frames rendered on a computer screen. Both movies and television use motion blur, so that very high motion doesn't appear choppy even at a paltry 30FPS. Your computer doesn't perform motion blur (in most circumstances), so the frames can be very jerky during a quick movement.
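To show what motion blur actually does to a frame, here's a toy 1-D Python demo. It's not how any game or camera really renders, just a sketch: a bright pixel sweeping across a row, drawn either at a single position per frame or as the average of several made-up sub-frame positions.

```python
# Toy 1-D "motion blur" demo: a bright pixel sweeping across a 20-pixel row.
# Without blur, each displayed frame shows the object at one position; with
# blur, several sub-frame positions are averaged into one frame, smearing
# the motion the way a film camera's open shutter does. Numbers are
# arbitrary, purely for illustration.

WIDTH = 20
SUBFRAMES = 4                      # sub-steps accumulated into one shown frame

def render(position):
    """One row of pixels with the object at a single position."""
    return [1.0 if x == int(position) else 0.0 for x in range(WIDTH)]

def shown_frame(start, speed, blur):
    if not blur:
        return render(start)
    # Average several positions sampled during the frame interval.
    samples = [render(start + speed * s / SUBFRAMES) for s in range(SUBFRAMES)]
    return [sum(col) / SUBFRAMES for col in zip(*samples)]

def show(row):
    return "".join("#" if v >= 1.0 else ("+" if v > 0 else ".") for v in row)

speed = 6                          # pixels moved per displayed frame (fast!)
for frame in range(3):
    pos = frame * speed
    print("sharp:  ", show(shown_frame(pos, speed, blur=False)),
          "  blurred:", show(shown_frame(pos, speed, blur=True)))
```

The sharp version jumps 6 pixels between frames with nothing in between; the blurred version leaves a faint trail across the gap, which is why 24 to 30fps film looks fluid while the same rate of crisp, unblurred game frames can look like a slideshow.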
 