When I used to play CS 1.6 at a professional level, we still used CRTs at 100 FPS and 100Hz, and it was a hell of a lot smoother than those first LCDs running at 60Hz with a huge 5ms delay... Yes, all those small details matter, and you become aware of them.
 
Have you ever gamed? There is a notable difference between 60 and 30 fps, easily distinguishable by the human eye. For example, Call of Duty runs at around 60 fps on consoles since that's their target, though in practice it's most likely somewhere between 40 and 60. On the other hand, Battlefield 3 was developed with a target of 30 fps, and it's v-synced so it won't go above it. Now, prior to playing BF3 I had played Modern Warfare 2 for a fair amount of time, and when I played BF3 something seemed off, but I couldn't quite tell what. It didn't have the same smoothness MW2 had, due to the difference in framerate.
I hope this helps clarify some things. In console games, developers have to decide between more visual fidelity at a lower framerate, as with BF3, or less visual detail at a higher framerate, as with COD.

Erm, vision is my living. The threshold for persistence of vision is around 20fps. You might be able to perceive a difference when it is higher, but 20fps is all that is needed to provide the illusion that what you are seeing is fluid rather than a series of pictures.
 
Erm, vision is my living. The threshold for persistence of vision is around 20fps. You might be able to perceive a difference when it is higher, but 20fps is all that is needed to provide the illusion that what you are seeing is fluid rather than a series of pictures.

And if you're able to perceive the difference, then there is a difference, is there not? Even though we get the illusion of fluidity at around 20 fps, every additional frame adds visual information, which translates into a more fluid and less blurry picture. What I'd really like to know is: at what point does the human eye stop perceiving the difference between a higher and a lower framerate, given that people claim they see a difference between a 120Hz and a 60Hz display?
 
Subjectively, for me the biggest difference around 20fps is against 15fps rather than against 30fps.

This isn't subjective. The differences you'll see between 15 FPS and 30 FPS are much more apparent than the differences between 30 and 60. Anything beyond 30 is varying degrees of smoothness, from good to great.

From my experience, higher framerates have a more noticeable impact when it comes to moving about in an environment than they do for character animations. For instance, if you were to compare two similar-looking characters side by side, one done with 24 frames of animation and the other 48, you would notice smaller details on the character with the higher framerate, like he might do something with his hands between frames 5 and 10 that the lower framerate character doesn't do, but both would still move realistically. That's because the eye (or the brain) applies a good bit of interpolation when it comes to animation. It has a tendency to fill in the blanks. You don't need a perfect gradient of frames between positions to achieve realistic-looking animation.

But the eye (or the brain) doesn't do as good of a job at filling in the blanks when it comes to large scale movement. Like if you were to compare someone pivoting around at varying speeds in an FPS game at 24 and 48 frames per second, you'd see a marked difference between the two.

And like I said earlier, when you're in direct control of what an image does, you tend to notice missing frames a little more. Like if you're tracking a character in your little onscreen targeting reticle, you're more aware of the gradient of frames between, say, moving your point of view between a tree and a bush 10 feet away.

This is why high framerates in games are more important than they are in movies. It isn't the animation exactly, but the fluidity of movement while looking around and moving through an environment. We expect it to look more like what we see with our own eyes, which of course doesn't interpret the world in frames, but in arcminutes.

edit: oh, and the difference between 60Hz and 120Hz LCD TVs isn't about fluidity of motion... well, sorta, not exactly. It's about being able to display fast-moving objects better. This is kinda hard to explain, but... well... you'll only ever mentally account for 60 frames per second regardless. But if something like a bird or a ball were to pass really quickly from left to right across the screen, the 120Hz TV would be able to maintain the image of it without making it look as blurry or juddery.

If you watch a ton of football and baseball, you'll want a 120Hz screen. But if you spend most of your time watching movies, you'll be fine either way (unless you're watching a Bruckheimer flick).
 
And if you're able to perceive the difference, then there is a difference, is there not? Even though we get the illusion of fluidity at around 20 fps, every additional frame adds visual information, which translates into a more fluid and less blurry picture. What I'd really like to know is: at what point does the human eye stop perceiving the difference between a higher and a lower framerate, given that people claim they see a difference between a 120Hz and a 60Hz display?

Ok, so basically the eye is a receptor of data. This data is then passed through to various parts of the brain for further processing.

The reality we perceive is not reality as it actually is; rather, it's constructed by our brain after it has processed the data it's presented with (through information collected by our senses). It's like a detective forming a theory after collecting evidence: at a certain point, sufficient evidence has been collected and any more evidence is excessive/redundant.

So, to partly answer your question, there are two points at which data starts going to waste. You could argue that the first is anything in excess of about 20fps, because that is when the brain is able to perceive the series of images as fluid (although, as noted, things do keep improving as FPS increases; it's just that once we hit that critical limit of roughly 20fps, new information becomes increasingly less important for constructing our theory).

The next point is the maximum amount of information that can be processed. I'm not sure I can directly answer this (though I'm sure it's been researched), but it will relate to the amount of information/data that can physically be communicated at any one time. The bottleneck will either be the brain's/eye's ability to receive or process information, or the limit on the information actually travelling from the source to the eye/brain.

So basically, there are two areas where the limit can be reached, I guess: 1) when information becomes excess to requirements for forming the theory about the environment, and 2) when information exceeds what can physically be communicated/processed. How do these fit in terms of FPS? I'm not sure, but I imagine it would also depend to some extent on the amount of information present in each frame.

This is a fairly crude response, but it's late and I wanted to answer before I have to go :)
 
Any coder entangling input and the render loop this way should be shot and then hanged outside for all the other coders to see. It is the worst possible way to do an engine run loop...
It's not the placement of input in the game loop which causes the apparent latency, but the render-wait-pageflip cycle. You can poll input at 1000Hz and still have 16.67ms latency if the last rendered frame was rendered from data that is 16.67ms old.

The correct fix is to shrink the time spent waiting between rendering and page flipping. High FPS is a brute-force way to solve this problem. You can also try to schedule the frame render as late as possible while still finishing in time for vblank. This is a heuristic method and always runs the risk of introducing dropped frames. Also, variation in input latency can be more disorienting than a consistently large input latency.
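Here's a minimal sketch of that tradeoff, just to make it concrete. Everything in it is made up for illustration: pollInput(), renderFrame(), the 4ms render cost and the 60Hz refresh are stand-ins, not any real engine's API. It contrasts polling at the top of the frame (where latency ends up near a full refresh interval no matter how fast you poll) with the render-as-late-as-possible heuristic (where latency shrinks to roughly the render cost plus a safety margin, at the risk of missing vblank when a frame runs long).

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

// Made-up numbers: a 60 Hz display and a frame that takes 4 ms to render.
constexpr auto kRefresh    = std::chrono::microseconds(16667);
constexpr auto kRenderCost = std::chrono::milliseconds(4);

Clock::time_point pollInput() { return Clock::now(); }           // stand-in: record when input was sampled
void renderFrame() { std::this_thread::sleep_for(kRenderCost); } // stand-in: pretend to render

int main() {
    auto nextVblank = Clock::now() + kRefresh;
    for (int frame = 0; frame < 5; ++frame) {
        // Variant A: poll at the top of the frame, render, then wait for vblank.
        // The input is nearly a full refresh interval old by the time the flip happens.
        auto inputTimeA = pollInput();
        renderFrame();
        std::this_thread::sleep_until(nextVblank);                // page flip happens here
        auto latencyA = Clock::now() - inputTimeA;
        nextVblank += kRefresh;

        // Variant B: wait first, then poll and render as late as possible
        // while still (hopefully) finishing before the next vblank.
        std::this_thread::sleep_until(nextVblank - kRenderCost - std::chrono::milliseconds(1));
        auto inputTimeB = pollInput();
        renderFrame();
        std::this_thread::sleep_until(nextVblank);                // page flip
        auto latencyB = Clock::now() - inputTimeB;
        nextVblank += kRefresh;

        std::printf("frame %d: top-of-frame poll %5.1f ms, late poll %5.1f ms\n", frame,
                    std::chrono::duration<double, std::milli>(latencyA).count(),
                    std::chrono::duration<double, std::milli>(latencyB).count());
    }
}
```

The first column should come out near a full refresh interval (~17ms) and the second near the render cost plus the margin (~5ms), give or take OS timer precision.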
 
The human eye is pretty rubbish at noticing once you go over 24fps; that's why all cinemas have 24fps as standard.
Actually, it's because shooting film costs more the higher the framerate is (because you're using more film) and 24fps was the minimum they could get away with where a majority of people would perceive fluid motion rather than a series of static images.
24fps is not enough to accomplish this for me, and as bad as interpolation can be, I need it to watch 24p content without getting a headache.

For that matter, I get a headache from playing a lot of games at 30fps as well, and you can't run at anything between 30 and 60 without other problems (screen tearing or stutter, depending on whether you disable v-sync or force triple-buffering).
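To put a number on the stutter side of that, here's a deliberately simplified model (it assumes triple buffering so rendering never blocks, and ignores what the driver actually does): a steady 45fps where every finished frame has to wait for the next 60Hz vblank before it can be shown.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double refresh = 1000.0 / 60.0;  // 16.67 ms per refresh at 60 Hz
    const double render  = 1000.0 / 45.0;  // 22.22 ms per frame at a steady 45 fps
    double prevFlip = 0.0;
    for (int frame = 1; frame <= 10; ++frame) {
        const double ready = frame * render;                        // when this frame finishes rendering
        const double flip  = std::ceil(ready / refresh) * refresh;  // earliest vblank it can be shown on
        if (frame > 1)
            std::printf("frame %d held on screen for %4.1f ms\n", frame - 1, flip - prevFlip);
        prevFlip = flip;
    }
    // The hold times come out as a mix of ~16.7 ms and ~33.3 ms rather than a steady
    // cadence, which is the stutter you get between 30 and 60 fps with v-sync.
}
```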


60fps is the minimum framerate I can tolerate when playing games for more than an hour or so.
To stay at 60fps at all times, you need enough performance headroom that you are not just averaging 60fps; the minimum also stays above 60fps.

If you can run a game at an average of, say, 80fps when the framerate is unlocked, it is unlikely to drop below 60 when you enable v-sync and lock it to 60.
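Rough numbers behind that rule of thumb, with the obvious caveat that an average says nothing about the worst frames, which is exactly why you want the margin:

```cpp
#include <cstdio>

int main() {
    const double budgetMs  = 1000.0 / 60.0;  // 16.67 ms: every frame must finish within this to hold 60fps
    const double averageMs = 1000.0 / 80.0;  // 12.50 ms: average frame time at an unlocked 80fps
    std::printf("per-frame headroom: %.2f ms (%.0f%% of the 60fps budget)\n",
                budgetMs - averageMs, 100.0 * (budgetMs - averageMs) / budgetMs);
}
```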

If 120Hz were an option for televisions (I use a 46" display hooked up to my PC) and hardware were fast enough to stay above 120fps at all times (a single Titan will struggle to do that, and SLI introduces stutter), I would absolutely upgrade to that as well.
 
Because if you're in a graphically demanding area and the frame rate drops, you won't notice it. Mostly, though, it's for bragging rights and/or future-proofing.
 
The human eye is pretty rubbish at noticing once you go over 24fps; that's why all cinemas have 24fps as standard.

I'm afraid this is the opposite of reality. 24fps isn't really sufficient to provide convincing motion...it's just barely "good enough" to not look like a slideshow, which is why film makes heavy use of motion blur. The human eye is very good at seeing high frame rates; in fact it's generally better to have high frame rates than it is to have high resolution because of the way we see motion.

A refresh rate of 25 frames per second is completely fine - that's an image update once every 40ms. You wouldn't perceive jerkiness.

Yes you will, very much so. You need motion blur at such low frame rates to compensate, which is why traditional stop motion animation looks jerky even when animated at the full 24fps. Here is a page with some discussion about frame rates.

TVs are 60Hz (or 50Hz in the UK)

Not necessarily; there are 120Hz and 240Hz displays (and higher).

--Eric
 
We want a minimum of 60fps, or whatever maximum your display supports.

120Hz is good for 3D, but that's divided between the two eyes. 60fps 3D is great though!

And even then, things above 60fps do look nicer. My local gaming bar has them; they just look more natural and lifelike. It's not perceived by everyone (my gf couldn't see any difference); I imagine it's something you'd only notice if you play a lot of 60fps games.
 
Yes you will, very much so. You need motion blur at such low frame rates to compensate, which is why traditional stop motion animation looks jerky even when animated at the full 24fps. Here is a page with some discussion about frame rates.

--Eric

Hi Eric, I read the link, and one of the paragraphs says this:
"Even though about 15 fps is needed to initiate the illusion of continuous motion, the effect by no means stops there. Visual studies have shown that even if one cannot distinguish discrete images, a frame rate all the way up to 60-80 fps makes footage appear more lifelike by enhancing clarity and smoothness."

Doesn't that kinda contradict your point? And mine actually, as it's saying 15fps is enough to provide the illusion of continuous motion...
 
I think the difference comes from most people considering "normal" viewing, while many shooter gamers rotate around so wildly that it becomes a problem at low fps. But one has to be aware that this kind of movement is totally unnatural.

A nice example was a friend trying out my copy of Tomb Raider. I got nearly seasick from all the rotating he was doing; I'm sure I did less rotating in the whole game than he did in one hour...
 
Not necessarily; there are 120Hz and 240Hz displays (and higher).
This is achieved through interpolation; televisions will rarely accept an input above 60Hz, even if they are a "240Hz" model, or 480Hz like my TV, or 960Hz like some others.

There are a few displays that can now be "overclocked" and will accept above 60Hz though, such as those cheap Chinese 4K displays. I think they will go up to 144Hz at 720p.
 
The human eye is pretty rubbish at noticing once you go over 24fps; that's why all cinemas have 24fps as standard.
I'm surprised to read that from a gaming professional. 24fps is nowhere near smooth, even with the motion blur of the cinema camera. I believe the reason they chose 24fps was just to double the roughly 12fps of the first cinema projectors, much as The Hobbit later doubled 24fps to 48fps.
I think that even above 60 fps the human eye can perceive differences.
 
There's a significant difference between watching a movie (where there is no feedback loop) and playing a video game (where there is feedback from your actions, i.e. moving your mouse changes your viewpoint).

In the former, 24 fps is enough for me to experience smoothness most of the time (there are certain high-action scenes that could use a higher frame rate).

In the latter, I stop "feeling" differences somewhere in the neighborhood of 50+ FPS. I'm sure, though, that if you were to quantify my performance in some way (like reaction time in an FPS), I could still benefit from even higher frame rates.

Using a standard monitor with a 60 Hz refresh rate, it technically doesn't make sense to render higher than 60 FPS, since you're doing work that won't be displayed anyway.
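For what it's worth, here's the simplest form of that idea as code: a frame limiter that paces the loop to the refresh interval so no frame is produced that the display can never show. renderAndPresent() is a hypothetical stand-in for the real draw/present call, and in practice simply enabling v-sync gives you the same pacing; the earlier point about input latency is the main argument for leaving the cap off.

```cpp
#include <chrono>
#include <thread>

void renderAndPresent() { /* stand-in: draw and present one frame here */ }

int main() {
    using Clock = std::chrono::steady_clock;
    constexpr auto kFrameBudget = std::chrono::microseconds(16667); // one 60 Hz refresh interval
    auto nextDeadline = Clock::now() + kFrameBudget;
    for (int frame = 0; frame < 600; ++frame) {      // roughly 10 seconds at 60 fps
        renderAndPresent();
        std::this_thread::sleep_until(nextDeadline); // don't start work the display can't show
        nextDeadline += kFrameBudget;
    }
}
```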
 
Doesn't that kinda contradict your point? And mine actually, as it's saying 15fps is enough to provide the illusion of continuous motion...

An illusion of motion, yes. Smooth motion, no.

The eye - sure, the brain - no.

Yes it does. My first real experience with high fps gaming was Quake 2...for various reasons, I didn't get around to playing it until years after it was first released. By that time I had a machine that could potentially run it at hundreds of frames per second.

When I first launched it, I tried to run it at my desktop resolution (1280x960 at the time), but quickly discovered that the UI elements like health and so on did not scale; they were tiny and hard to read at a glance. So I dropped it to 800x600, in order to make the UI usable.

It happens that my monitor could run 800x600 at 120Hz, so that's what I did. I made sure vsync was on. Then I started playing, and it was something of a revelation. Until then, I'd always thought 60fps was plenty, but the difference between that and 120fps was very clear. I finally saw what "fluid movement" really meant.

The main drawback, of course, is that more modern and demanding games couldn't hope to match that framerate, so having to settle for a mere 60fps (or possibly less) for most games after that was disappointing....

--Eric
 
e-peen



edit:
Yes, I have seen and played 60fps+ content.

Even with video you notice the difference, without the "feedback loop".


Is it "needed"?

Nope.


Then again, I'm old school, and started gaming when (well, BEFORE... back in, say, 1984) filled-polygon 3D was new (previously, wireframe was state of the art), and we were happy if we got the FPS up into the teens or better doing that. Your brain can adapt to the frame rate; I remember playing simulations and stuff at perhaps 5-10 FPS. If you're competing as a professional gamer, well... sure, higher FPS will maybe be an advantage. But that's a very small niche.
 
There is definitely a difference when playing first-person shooter games like Counter-Strike above 60fps, regardless of what the eye can see or the Hz of the screen. All things being equal (I know that's impossible), the person with the higher FPS has an advantage.
 
All the iOS games I have created run at 60 FPS. If I set them to 30 FPS, there is a noticeable difference with fast motion.

24 FPS in movies is OK when things don't move too fast, but that is being increasingly ignored. Films such as Transformers 2 are unwatchable because the movement is far too fast for the frame rate (not that I thought it was a good film from the part I watched before I had to turn it off).
 
Why do people insist on more than 60fps in games?
A question I fail to understand, this 60 being for 60Hz screens. All the extra frames are being wasted. Maybe there's something I don't know, but from my understanding, if your game's fps constantly equals the refresh rate of your screen you have the best result, and anything more is superfluous.

In some really intense, fast-action scenarios where there is a lot of movement, having higher than 60 fps can make the scene clearer and look less jerky.
 
The brain can and does perceive differences in fluidity above 60 fps.

Pure placebo. Just think for a moment: normal reaction delay for humans is 200-300ms, and one frame at 60fps lasts roughly 17ms. It is impossible for the brain to react to anything that fast; it would need 12+ frames for even a fast reaction, plus the time for the movement of the arm/hand/fingers.
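The arithmetic being leaned on there, spelled out (200-300ms reaction time against a ~16.7ms frame at 60fps):

```cpp
#include <cstdio>

int main() {
    const double frameMs = 1000.0 / 60.0;         // ~16.7 ms per frame at 60 fps
    const double reactionsMs[] = {200.0, 300.0};  // the human reaction times quoted above
    for (double reactionMs : reactionsMs)
        std::printf("%.0f ms reaction ~= %.0f frames at 60 fps\n",
                    reactionMs, reactionMs / frameMs);  // prints roughly 12 and 18 frames
}
```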

What really matters is v-sync to avoid tearing and a stable framerate. Then even 30fps is more than enough.
 
Like somebody already said: play CS 1.6 at 60fps and 60Hz and then with both values at 100, and you will notice the difference.
 