"Eyes Can Only See at 40 fps" -- The myth

General Discussion
Oh my... This myth kills me more than any other out there, since I have such a vested interest in PC gaming. It's such a common myth that it honestly makes my head hurt.

I'd like to correct this, and if even 4 people walk away from this thread knowing better, I'll feel great. I'm going to keep this simple.

Myelinated nerves can fire between 300 to 1000 times per second in the human body and transmit information at 200 miles per hour. What matters here is how frequently these nerves can fire (or "send messages").

The nerves in your eye are not exempt from this limit. Your eyes can physiologically transmit data that quickly and your eyes/brain working together can interpret up to 1000 frames per second.

However, we know from experiments (as well as simple anecdotal experience) that there are diminishing returns in what frame rates people are able to identify. Although the human eye and brain can interpret up to 1000 frames per second, someone sitting in a chair and actively guessing at how high a framerate is can, on average, identify up to about 150 frames per second.

The point: 60 fps is not a 'waste'. 120 fps is not a 'waste' (provided you have a 120 Hz monitor capable of such display). There IS a very noticeable difference between 15 fps and 60 fps. Many will say there IS a noticeable difference between 40 and 60 fps. Lastly, the limit of the human eye is NOT as low as 30-60 fps. It's just not.

The origin of the myth: The origin of the myth probably has to do with limitations of television and movies. Movies, when they were recorded on film reel, limited themselves to 24 frames per second for practical purposes. If there is a diminishing return in how many frames people can claim to actually notice, then the visual difference between 24 fps and 60 fps could not justify DOUBLING the amount of film reel required to film a movie.

With the advent of easy digital storage, these limitations are now mostly arbitrary.

The numbers often cited as the mythological "maximum" the eye can see are 30 fps, 40 fps, and 60 fps.

I would guess the 60 fps "eye-seeing" limit comes from the fact that most PC monitors (and indeed many televisions now) have a maximum refresh rate of 60 Hz (or 60 frames per second). If a monitor has that 60 fps limit, the monitor is physically incapable of displaying more than 60 fps. This is one of the purposes of frame limiting, Vsync, and adjusting refresh rate in video games.
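For the curious, a frame limiter is conceptually simple. This is just a hypothetical sketch (not any real engine's code): measure how long the frame took to draw, then sleep off whatever is left of that frame's time budget.

```python
import time

def run_capped(render, target_fps, num_frames):
    """Render num_frames frames, capped at target_fps, by sleeping off
    whatever is left of each frame's time budget. Returns achieved fps."""
    budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        render()  # placeholder for the actual draw call
        spent = time.perf_counter() - frame_start
        if spent < budget:
            time.sleep(budget - spent)  # wait out the rest of the budget
    return num_frames / (time.perf_counter() - start)
```

No matter how fast render() is, the achieved rate never climbs meaningfully above the cap. Vsync does much the same thing, except the "budget" is tied to the monitor's refresh signal instead of a timer.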

tl;dr: The human eye can physiologically detect up to 1000 frames per second. The average human, tasked with detecting what framerate he/she is looking at, can accurately guess up to around 150 fps. That is, they can see the difference in framerates all the way to 150 fps.

Phew.

A quote to leave off on:
The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but the ability to interpret higher FPS.


Resources to save myself pain in the future (in addition to citing myself for taking so much biology in college):
1)http://amo.net/NT/02-21-01FPS.html
2)http://amo.net/nt/05-24-01FPS.html
3)http://www.ualberta.ca/~chrisw/howfast.html
4)http://en.wikipedia.org/wiki/Frame_rate

If there are any biology majors/professors out there reading this and notice any egregious errors, please point them out. I'm not in the game of deceiving people.
TIL eyes can only see at 60 FPS
eyes don't see in frame rates. Applying a number to it is meaningless because that's not how eyes work.
11/22/2012 07:15 PM Posted by Sabinah
TIL eyes can only see at 60 FPS


No. The limit of most monitors is 60 fps. The brain can perceive many, many more than that.

11/22/2012 07:15 PM Posted by Fudgesicle
eyes don't see in frame rates. Applying a number to it is meaningless because that's not how eyes work.


It's actually not that crazy of a comparison because nerves do not continuously send information. They can only fire about once every 15 milliseconds. Because of this, they send pulses, pictures, many times a second, quite like frames per second.

However, this liberal use of the words "frames per second" is covered by my "I'm going to keep this simple" caveat in the first post.
I was only vaguely aware of this myth previously, but I have read your post and am now educated on the subject!
Film is not 30fps.

Film is usually 24.

VIDEO is usually 30.
11/22/2012 07:13 PM Posted by Renfeild
Movies, when they were recorded on film reel, limited themselves to 30 frames per second for practical purposes.


I believe film reels are actually 24 FPS.

But yes, the whole thing is bollocks. I can tell when my game drops from 60 to 50 FPS. Always. I instantly know it's happening. To say we can only interpret 40 or 30 or whatever is moronic.
The origin of the myth: The origin of the myth probably has to do with limitations of television and movies. Movies, when they were recorded on film reel, limited themselves to 30 frames per second for practical purposes


This, plus the fact that past a certain point (24fps?), the brain interprets a series of pictures as motion (rather than just a set of images). People love misinterpreting things.

But anything above your monitor's refresh rate is a total waste.
Film is not 30fps.

Film is usually 24.

VIDEO is usually 30.


I stand corrected. I'll adjust the post, but the point at least remains untouched.

Thanks :)
11/22/2012 07:18 PM Posted by Terciel
The origin of the myth: The origin of the myth probably has to do with limitations of television and movies. Movies, when they were recorded on film reel, limited themselves to 30 frames per second for practical purposes


This, plus the fact that past a certain point (24fps?), the brain interprets a series of pictures as motion (rather than just a set of images). People love misinterpreting things.

But anything above your monitor's refresh rate is a total waste.


Technically a waste, but it's a good measure of how much "buffer" room you have before your framerate really starts to take a dive.

What I mean is this: If my game says it's running at 90 fps, despite my monitor only being able to show 60 fps, it's a good measure of how much more action I would need on screen to damage my visible framerate. Having "90 fps" showing in the corner of my screen tells me I can have a lot more action on screen than I currently have before it dives into a slideshow.
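The "buffer" idea above can be put in rough numbers. A hypothetical back-of-the-envelope helper (the 90 fps / 60 Hz figures are just the ones from the post, not measurements):

```python
def headroom_ms(uncapped_fps, refresh_hz=60):
    """Extra milliseconds of work each frame could absorb before the
    game can no longer keep up with the monitor's refresh deadline."""
    frame_time = 1000.0 / uncapped_fps   # time actually spent per frame
    deadline = 1000.0 / refresh_hz       # time available per refresh
    return deadline - frame_time

# Uncapped 90 fps on a 60 Hz monitor: ~11.1 ms spent per frame against
# a ~16.7 ms deadline, so roughly 5.6 ms of slack for extra on-screen action.
```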
11/22/2012 07:22 PM Posted by Renfeild


This, plus the fact that past a certain point (24fps?), the brain interprets a series of pictures as motion (rather than just a set of images). People love misinterpreting things.

But anything above your monitor's refresh rate is a total waste.


Technically a waste, but it's a good measure of how much "buffer" room you have before your framerate really starts to take a dive.

What I mean is this: If my game says it's running at 90 fps, despite my monitor only being able to show 60 fps, it's a good measure of how much more action I would need on screen to damage my visible framerate. Having "90 fps" showing in the corner of my screen tells me I can have a lot more action on screen than I currently have before it dives into a slideshow.


It's wasted for me. I need to run with V-sync on or I get major and unbearable screen tearing. It's horrible. :(
well that may be, but you can only smell 60 SPS (smells per sec)...
This, plus the fact that past a certain point (24fps?), the brain interprets a series of pictures as motion (rather than just a set of images). People love misinterpreting things.


That's what the whole thing about tv and movies are actually about.

You guys are confusing a lot of things here. The "myth" you are referring to, I've never heard about. That's some reinvention based on known facts about how people perceive fluid motion.


Oh, gods, I can't tell you how many console VS PC debates I've seen in which a scary amount of console gamers try to claim "Oh pishaw, your high FPS is a total joke. The human eye can only see 30 fps! Power to our televisions!"

What made me write this up is a thread I came across while googling the "Great item squish" debates for item levels growing exponentially in the game. The thread is here:

http://us.battle.net/wow/en/forum/topic/4904620718?page=1

This is but one example in which people run around spouting the mythological limitations of the eye.

It pushed me over the line. I've seen such arguments so many times and each time I've always said "Man, I need to say something." Finally, I did.

11/22/2012 07:26 PM Posted by Bigscreen
well that may be, but you can only smell 60 SPS (smells per sec)...


Haha.
11/22/2012 07:17 PM Posted by Vanarela
Movies, when they were recorded on film reel, limited themselves to 30 frames per second for practical purposes.
I believe film reels are actually 24 FPS.
all film reels are set for 24 fps; any drawn animation is also drawn for 24.
Disney animators explained to me how your eye interprets 24 fps as seamless and constant, as opposed to pictures flashing by.

I believe film reels are actually 24 FPS.
all film reels are set for 24 fps; any drawn animation is also drawn for 24.
Disney animators explained to me how your eye interprets 24 fps as seamless and constant, as opposed to pictures flashing by.


While this may be the minimum threshold for your brain to start piecing together images using a blurring technique, people can most certainly tell the difference in smoothness between 24, 30, 40, 50, 60 fps and higher.

Put any person down in a chair who is used to playing games at 60 fps and slowly decrease the FPS by 10s. The player will accurately tell you the FPS every time. It's no placebo effect.
11/22/2012 07:17 PM Posted by Vanarela
Movies, when they were recorded on film reel, limited themselves to 30 frames per second for practical purposes.


I believe film reels are actually 24 FPS.

But yes, the whole thing is bollocks. I can tell when my game drops from 60 to 50 FPS. Always. I instantly know it's happening. To say we can only interpret 40 or 30 or whatever is moronic.


The reason you can "Notice" that, is because most games are programmed to aim for 60 FPS. Whenever a game cannot refresh the display at 60 FPS, it starts skipping frames so that the user can still see what is happening.

When frames are skipped, animations are not fluid and you'll see some "jerking".

I used to play Final Fantasy XI -- that game was capped at, IIRC, 29.9 FPS because of the fact that it was originally a PS2 game and the way the game was coded forced the developers to cap the PC's version at 29.9 FPS too.

However, its animations always looked just as smooth, even though you were only getting 30 FPS instead of 60. Why? Because there's no frame skipping. Whenever a computer can't refresh the display as fast as it's meant to, it's because there's not enough processing power, and it has to skip some frames to make the 'deadline' while it processes other, non-visual stuff in the background.
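The frame-skipping behavior described above can be sketched as a toy fixed-timestep loop (purely hypothetical, with integer milliseconds for clarity, not any particular game's code): the simulation advances at a fixed tick rate, and a slow pass runs several ticks but only one render, so the in-between frames are the ones that get "skipped".

```python
def run_loop(frame_costs_ms, tick_ms=16):
    """Toy fixed-timestep game loop. Each entry in frame_costs_ms is
    how long one pass through the loop took; the simulation advances
    one fixed tick per tick_ms of elapsed time, and the screen is
    drawn once per pass. Returns (ticks_simulated, frames_rendered)."""
    accumulator = 0
    ticks = renders = 0
    for cost in frame_costs_ms:
        accumulator += cost
        while accumulator >= tick_ms:
            ticks += 1              # advance game state one fixed step
            accumulator -= tick_ms
        renders += 1                # one draw per pass, regardless
    return ticks, renders
```

A smooth run (all 16 ms passes) renders once per simulation tick; a 48 ms stall catches up with 3 ticks but only 1 render, and those 2 missing renders are the visible "jerk".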
