
what's the minimum playable frame rate?


axhed

Member
Joined
Feb 7, 2003
Location
cleveland
i've been looking at dozens of benchmark tests lately and i've seen no mention as to what would be considered acceptable. what do you guys use for gaming?
 
15fps minimum from short distances
20fps minimum for sniping
30fps threshold for fooling the fastest eye into fluid motion
60fps overkill LOL
 
Yodums said:
Your eye won't notice a difference in anything above 35fps. I believe TVs run at that refresh rate.

i'm sure there's going to be a few people who will argue with that but i have to agree.

the refresh rate for TVs is just under 30FPS i think. (not sure, i may be wrong)
 
john240sx said:


i'm sure there's going to be a few people who will argue with that but i have to agree.

the refresh rate for TVs is just under 30FPS i think. (not sure, i may be wrong)

Well, if it were 30 or so, it would still give you the general idea that it's somewhere in the 30s.

My bad if it is wrong
 
I think films, etc. are at 25FPS. There is a major difference between the frames of actual film/TV and the frames that you see on your screen.
 
The "30FPS de-facto standard" has to do with TV, not computers. Go to Google and research.

Fact beats theory in this case...

Run 3DMark2001 with everything grossly cranked up and see what you think of a scene running at ~30fps.

Then turn everything down and see what you think of it at 60-100fps.

Bet you'll notice a difference.
 
35fps is good... no hiccups or choppy sequences, and it doesn't take too much hardware to hit that FPS. 24FPS is what movies run at, but the difference between movies and video games is that every movement caught on a camera is blurred a bit, so when the frames are put together they blend smoothly into each other. In a video game you get still images that don't blend together and have no blur, so you need more FPS to play video games. does that make sense? good!
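Just to illustrate that blur point with a toy sketch (Python and numpy used purely as an example here, and the numbers are made up): a camera "frame" averages everything the object did during the exposure, while a game frame samples the position at one instant.

```python
import numpy as np

# Toy model: an object moving 1 unit per frame, captured at 24 fps.
# A film camera integrates (averages) the motion over the shutter time,
# so each frame contains a smear between the old and new positions.
# A game engine samples the position once, so each frame is a sharp still.

def film_frame(start_pos, end_pos, samples=16):
    """Average many sub-positions across the exposure -> motion blur."""
    return np.linspace(start_pos, end_pos, samples).mean()

def game_frame(pos):
    """One instantaneous sample -> no blur at all."""
    return pos

for f in range(3):
    print(f"frame {f}: film sees ~{film_frame(f, f + 1):.2f}, game sees {game_frame(f)}")
```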
 
Literatii said:
The "30FPS de-facto standard" has to do with TV, not computers. Go to Google and research.

Fact beats theory in this case...

Run 3DMark2001 with everything grossly cranked up and see what you think of a scene running at ~30fps.

Then turn everything down and see what you think of it at 60-100fps.

Bet you'll notice a difference.

What kind of monitor do you run? With a good monitor you shouldn't be able to see that difference anyways.

Would it even matter if it is comp or TV? It is what your eyes are capable of seeing.
 
It does matter. I remember doing this big thing on it in one of my Science classes. TV and computer displays have different ways of projecting each frame... so it's not the same. I can't always tell the difference between 30 and 60 frames on my computer.
 
Not sure you've ever looked at video tape, but analog frames capture movement or "blur". It's what 3dfx hyped (and died because of). Frames blend because of it.

Frame buffer renders through any API (DX or OpenGL) do not capture that. They are digital.

Again, go try out 3DMark as I suggested. If you don't want a benchmark, download FRAPS and fire up your favorite game.
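If you'd rather roll your own counter instead of FRAPS, a bare-bones sketch looks something like this (Python just for illustration; draw_frame is a stand-in for whatever your render loop actually calls):

```python
import time

def measure_fps(draw_frame, seconds=5.0):
    """Call draw_frame() in a loop and report average and worst-case fps."""
    frame_times = []
    end = time.perf_counter() + seconds
    prev = time.perf_counter()
    while prev < end:
        draw_frame()                      # your actual rendering goes here
        now = time.perf_counter()
        frame_times.append(now - prev)    # seconds spent on this frame
        prev = now
    avg_fps = len(frame_times) / sum(frame_times)
    min_fps = 1.0 / max(frame_times)      # the single slowest frame
    print(f"avg {avg_fps:.1f} fps, worst frame {min_fps:.1f} fps")

# usage: measure_fps(my_game.render_one_frame)
```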
 
i would say the minimum acceptable framerate is whatever YOU feel comfortable playing with. i can tell a pretty big difference between 30 and 60, 60 just seems smoother.
 
snyper1982 said:
i would say the minimum acceptable framerate is whatever YOU feel comfortable playing with. i can tell a pretty big difference between 30 and 60, 60 just seems smoother.

I really agree with this sentiment.

There's a point where you "don't know what you're missing unless you've had it".

People with a 2.5ghz CPU and Radeon 9700 pro will seldom go back to anything less ;)
 
I usually shoot for around 60fps if you're talking averages. At that level, your minimum frame rate will be around 30 or so. If you just have an average of 30 that's way too low because your minimum will probably be around 15.
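A quick made-up example of why the average hides the dips (Python, invented frame times):

```python
# A log of frame times in milliseconds -- mostly ~16 ms (about 60 fps),
# with a couple of ~33 ms stutters mixed in (made-up numbers).
frame_ms = [16, 17, 16, 15, 33, 16, 17, 16, 34, 16]

avg_fps = 1000 * len(frame_ms) / sum(frame_ms)
min_fps = 1000 / max(frame_ms)

print(f"average: {avg_fps:.0f} fps")   # looks healthy, around 50
print(f"minimum: {min_fps:.0f} fps")   # roughly half that -- what you actually feel
```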
 
I just played CS - 1152x864 @ 100Hz. I set my FPS max to 30, 60 and 100...

now at 100FPS everything is fine, it all runs smooth and I can't see any motion flaws...

at 60FPS everything again is fine, it all runs smooth, but everything just looks different whilst moving, almost as if it's moving slower.

at 30FPS everything again is fine, it runs smooth, but now it feels like a brick moving, as if the guy's having a real hard time moving his legs.

now perhaps this is due to my monitor's refresh rate being significantly higher than the frame rate. I don't know if this is right, but I'm assuming 30FPS at 100Hz would leave an image on the screen around 3 times longer than an image from 100FPS at 100Hz? so perhaps it's just this I'm seeing that makes it feel slower.
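That hunch is basically right; plugging the poster's numbers in (Python, just arithmetic):

```python
refresh_hz = 100                              # monitor redraws 100 times per second
for fps_cap in (30, 60, 100):
    frame_ms = 1000 / fps_cap                 # how long each rendered frame lasts
    redraws_per_frame = refresh_hz / fps_cap  # refreshes that show the same image
    print(f"{fps_cap:>3} fps cap: each frame stays ~{frame_ms:.1f} ms "
          f"(~{redraws_per_frame:.1f} monitor refreshes)")
# 30 fps -> ~33 ms per frame vs ~10 ms at 100 fps: roughly 3x longer on screen.
```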
 
Yodums said:
Your eye won't notice a difference in anything above 35fps. I believe TVs run at that refresh rate.

exactly (dad used to produce commercials...)

I generally can't stand anything under 18fps but it really all depends on the type of game

**edit**

i do want to add however that i've yet to have a nice computer that can hit 45+ (outside of looking at an empty map)
 
http://www.100fps.com/how_many_frames_can_humans_see.htm

ugh, do research before you talk. READ IT, before you add any more comments


"Take again "Test 1: Smoothness of motion". You have a fluid film with 24 fps. The film roll has to roll thru the projector. To not see it rolling you have to make the picture black while the film rolls on. You would have to blacken the screen 24 times per second. But 24 black moments are too visible. Thus you have smooth motions but flicker.
The solution is: Show each frame 3 times and make the screen black 3 times per frame. This makes the black moments shorter and more frequent: "Triple the refresh rate". So you see about 72fps in the cinema, where 3 consecutive frames are the same. Strange solution? Solution of an analog world. And an example how "Brightness eats darkness".
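Spelling out the arithmetic in that quote (Python, trivial numbers):

```python
film_fps = 24           # distinct pictures per second on the reel
flashes_per_frame = 3   # each picture is flashed three times by the projector
flicker_hz = film_fps * flashes_per_frame
print(f"{flicker_hz} light flashes per second, "
      f"but still only {film_fps} unique images")   # 72 Hz flicker, 24 fps of motion
```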
 
The truth on this topic!

The eye threshold is 60 fps. The way they get away with using "30" on North American TVs is by updating every other horizontal line (odd, then even) every 1/60th of a second. A similar trick is used at the movies. You are shown a frame, and then the shutter is closed. The shutter is then opened with the same frame still there. Then the shutter is closed and reopens with a new frame, etc. They open and close the shutter 48 times a second (24 frames shown twice each), while showing you the same frame twice successively.
In Europe they use a higher resolution and only go 50 fps. When I go to Europe I can see a slight flickering in action sequences since my American eyes are used to 60 fps.

All this info I learned as a research assistant for the foremost electrical engineer in the world working with High Definition TVs. His name is Prof. Edward Delp. Do a search on Google for him. He has written more technical papers on HDTV than any other man in the world. He is also the one who consulted the president of the USA when they were concerned about the Taliban hiding encrypted messages in pictures sent through e-mail. I also did some work here, but I had to sign a non-disclosure agreement ;-)
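To spell out the interlacing trick described above (Python, just the arithmetic):

```python
fields_per_second = 60                           # NTSC draws 60 half-pictures (fields) per second
full_frames_per_second = fields_per_second / 2   # odd lines + even lines = one complete frame
print(f"{fields_per_second} fields/s -> {full_frames_per_second:.0f} complete frames/s,")
print("but your eye still gets a 60 Hz update")
```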
 
It's all personal preference. I only upgrade vid cards when my sustained framerate drops below 10-12 FPS. My brother can't play if he doesn't get a minimum of 30-40 FPS.
 