
PC Build Help - Budget around $2000

No, the human eye and brain can't tell the difference between 25-30 FPS and 60 FPS and beyond. QUOTE: The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually.

Here is the link. https://en.wikipedia.org/wiki/Frame_rate

Movies are still at 24-25 FPS and people can't tell the difference. Also, all HD TV at 1080i (the "i" means interlaced) draws the odd scan lines of each frame first, then the even ones, splitting the 60 Hz signal into two 30 Hz fields. Can you see that happening? No.
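For anyone curious what interlacing actually does, here is a minimal Python sketch (a frame is modeled as a plain list of scan lines, purely illustrative): each full frame is split into an odd field and an even field, drawn alternately.

[code]
# Minimal sketch of 1080i interlacing; a frame is modeled as a list of scan lines.
def split_into_fields(frame_lines):
    """Split a full frame into its two interlaced fields."""
    odd_field = frame_lines[0::2]   # scan lines 1, 3, 5, ...
    even_field = frame_lines[1::2]  # scan lines 2, 4, 6, ...
    return odd_field, even_field

# A 1080-line frame yields two 540-line fields, drawn alternately
# at 60 fields per second, i.e. 30 full frames per second.
frame = ["scanline %d" % n for n in range(1080)]
odd, even = split_into_fields(frame)
assert len(odd) == len(even) == 540
[/code]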

The only reason people think FPS is so important today, compared to when my Voodoo2 Glide accelerator ran Need for Speed SE so smoothly at 20-30 FPS, is that when there is blurring, micro stutter, or frame skip, people buy new graphics cards, and FPS is the only thing we test; with that understanding, a new graphics card always helps anyway. Graphics are so complicated now that FPS alone isn't telling the story anymore.
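This is also why frame times matter more than an FPS average: the average can look fine while a handful of slow frames cause the micro stutter you actually see. A minimal sketch with made-up frame times (the numbers are illustrative only):

[code]
# Why average FPS hides micro stutter: mostly 60 FPS frames with a few 20 FPS hitches.
frame_times_ms = [16.7] * 95 + [50.0] * 5  # per-frame durations in milliseconds

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# "1% low": the frame rate implied by the slowest 1% of frames.
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
one_percent_low_fps = 1000.0 / (sum(worst) / len(worst))

print("average: %.1f FPS, 1%% low: %.1f FPS" % (avg_fps, one_percent_low_fps))
# The average still reads ~54 FPS, but the hitches play back at 20 FPS.
[/code]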

If you want to take a blind test to see if you can tell the difference between 60 FPS and 30 FPS, I will do a run-through of BF4 with my GTX 970 at 1080p locked to set frame rates, and I won't show the OSD.
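The frame cap itself is simple in principle. This is only a sketch of the idea (render_frame is a hypothetical stand-in; this is not how BF4 or the NVIDIA driver actually limits frames): render, then sleep off whatever is left of that frame's time budget.

[code]
import time

def run_capped(render_frame, fps_cap=30, seconds=5):
    """Toy frame limiter: sleep off the unused part of each frame's budget."""
    frame_budget = 1.0 / fps_cap  # e.g. ~33.3 ms per frame at a 30 FPS cap
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        start = time.monotonic()
        render_frame()  # hypothetical draw call
        leftover = frame_budget - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)

run_capped(lambda: None, fps_cap=30, seconds=1)  # dummy renderer for the demo
[/code]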

Sorry, but no. Movies and TV use blurring to make 24 FPS seem smooth. Modern games with the motion blur feature attempt to do the same for systems with lower FPS.

If frames from a movie were not blurred, it would look like garbage (just like a game running at 24 FPS without motion blur would)...
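The simplest way to picture that blur is averaging each frame with its neighbors, so fast motion smears across frames. A minimal NumPy sketch of the idea only; real film blur comes from shutter exposure time, and in-game motion blur is usually a velocity-based shader:

[code]
import numpy as np

def blur_frames(frames, window=3):
    """Toy temporal motion blur: average each frame with its neighbors."""
    blurred = []
    for i in range(len(frames)):
        lo, hi = max(0, i - window // 2), min(len(frames), i + window // 2 + 1)
        stack = np.stack([f.astype(np.float32) for f in frames[lo:hi]])
        blurred.append(stack.mean(axis=0).astype(np.uint8))
    return blurred

# A white dot moving across otherwise black frames gets smeared into a streak.
frames = [np.zeros((4, 16), dtype=np.uint8) for _ in range(16)]
for x, f in enumerate(frames):
    f[2, x] = 255
smeared = blur_frames(frames)
[/code]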

EDIT:

Also, this is like 2 paragraphs down from what you quoted from that wiki:



Which reads to me as: stop relying on Wikipedia for information, haha, it's basically "someone said it on a forum" LOL :rofl:

The brain can only process 10 to 12 separate images per second. I read the whole article before I posted; they change it year to year, and they don't really know for sure how many FPS the eyes can see. Actually, do the eyes see frames, or is it one continuous picture?

A romantic comedy movie and TV use blurring; BF4 with blurring disabled, OK :thup:, NOT. Is that all you have? I'm out of here.


Google translator is giving me nothing... I don't understand what you're trying to say?
 

We are talking about differentiating frame rates from 25 to 60+ FPS, and this would be your answer. QUOTE: The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually. I read the whole article before I posted; they change the article year to year, and they don't really know for sure how many FPS the eyes can see. Actually, do the eyes see frames per second, or is it one picture?

Blurring in movies, TV, and gaming is not for smoothness; when you blur an image it makes streaks and looks blurry. The reason they use blurring in movies, TV, and gaming is to show fast movement and blur effects, because 25-30 FPS is too smooth and slow for real-life human eyes looking at a screen. When you are going down the highway in a car and look out the side window, the image is blurred as you scan it; that is how human eyes work.
 

What?! What? Just stop, man. Or, continue. I don't really care. :bs:
 