There is a reason why all movies are shot in 23.976 or simply 24fps - All human beings, no matter what they tell you, can't notice the lag starting from 23.976fps. Fact: All movies and YouTube videos are displayed at 24fps and 30fps respectively.
Therefore, all games, in order to look smooth, must have at least 24fps. This is the VERY MINIMUM for all human beings. If someone can "tell" the difference between 24fps and 120fps, then that person has been brainwashed with the idea of "higher the numbers, better the computer", which is totally false.

Most monitors refresh at 60Hz, or 60fps. That DOES NOT MEAN everything has to be 60fps; it looks the same as 120fps or 24fps. NO ONE, NOT EVEN HARDCORE GAMERS, CAN TELL THE DIFFERENCE!

That being said, it is best for games to display at this rate:
NEVER dips below 24
Average anywhere between 24 and 60
DOES NOT GO ABOVE 60, because at that point your GPU could display more stuff at a reasonable FPS. Ignore this if all settings are on extreme.

Ehh, sorta. If you plug your computer into an HDTV, set the refresh rate to 24 or 30Hz, and then move your mouse, it will definitely look slow and unnatural. Put it at 60Hz and it will look normal. You really can tell a difference in refresh rates. So the ideal FPS would be around 60, which is what most monitors refresh at.
 
There is a reason why all movies are shot in 23.976 or simply 24fps - All human beings, no matter what they tell you, can't notice the lag starting from 23.976fps. Fact: All movies and YouTube videos are displayed at 24fps and 30fps respectively.
Therefore, all games, in order to look smooth, must have at least 24fps. This is the VERY MINIMUM for all human beings. If someone can "tell" the difference between 24fps and 120fps, then that person has been brainwashed with the idea of "higher the numbers, better the computer", which is totally false.

Most monitors refresh at 60Hz, or 60fps. That DOES NOT MEAN everything has to be 60fps; it looks the same as 120fps or 24fps. NO ONE, NOT EVEN HARDCORE GAMERS, CAN TELL THE DIFFERENCE!

That being said, it is best for games to display at this rate:
NEVER dips below 24
Average anywhere between 24 and 60
DOES NOT GO ABOVE 60, because at that point your GPU could display more stuff at a reasonable FPS. Ignore this if all settings are on extreme.

Anyone can notice a difference in the smoothness of animations between 3D graphics at 30 fps and 60 fps. Good examples of this are early PS games like Battle Arena Toshinden (30 fps) and Tekken (60 fps).

You are assuming that because films shot at 24 fps look smooth, 3D graphics at 24 fps are just as smooth. This is plain wrong. The subtle motion blur on each film frame, combined with the inherent detail of the medium, compensates for the relatively low frame rate to create animation that looks smooth. At the same frame rate, a 3D engine with less detail and none of the motion blur and general blurring looks like a fast slide-show.

Also note that James Cameron recently expressed interest in shooting future film projects at 60 fps. That is not someone who has been brain-washed and he knows a lot more about making CGI seem realistic than all of us on this forum combined.
 
How do you check your FPS? On the PC side, I use FRAPS.

In Source games (Portal, TF2, L4D and the like) bring up the console and type in "net_graph 1". or 0 to turn it off again.

Displays FPS, data transfer rates and the like.
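
If you're curious what those counters are actually measuring, here's a rough Python sketch of the basic idea: count the frames finished in the last second and report the rate. The 16ms fake frame time and the once-per-second reporting are just illustrative choices, not how FRAPS or net_graph actually work internally.

```python
import time

def run(seconds=5, simulated_frame_time=0.016):
    """Count how many frames finish each second, the basic idea behind an FPS overlay."""
    frames = 0
    window_start = time.perf_counter()
    end = window_start + seconds
    while time.perf_counter() < end:
        time.sleep(simulated_frame_time)   # stand-in for rendering one frame
        frames += 1
        now = time.perf_counter()
        if now - window_start >= 1.0:      # report once per second
            print(f"{frames / (now - window_start):.1f} fps")
            frames = 0
            window_start = now

if __name__ == "__main__":
    run()
```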
 
Therefore, all games, in order to look smooth, must have at least 24fps. This is the VERY MINIMUM for all human beings. If someone can "tell" the difference between 24fps and 120fps, then that person has been brainwashed with the idea of "higher the numbers, better the computer", which is totally false.

Most monitors refresh at 60Hz, or 60fps. That DOES NOT MEAN everything has to be 60fps; it looks the same as 120fps or 24fps. NO ONE, NOT EVEN HARDCORE GAMERS, CAN TELL THE DIFFERENCE!

That being said, it is best for games to display at this rate:
NEVER dips below 24
Average anywhere between 24 and 60
DOES NOT GO ABOVE 60, because at that point your GPU could display more stuff at a reasonable FPS. Ignore this if all settings are on extreme.

Dude, you couldn't be any more wrong.

Anyway, I prefer at least 60fps and optimally 100-120fps. 120Hz monitors FTW. Anything under 50fps is unplayable for me. I don't see how people play with ~30fps.
 
In Source games (Portal, TF2, L4D and the like) bring up the console and type in "net_graph 1". or 0 to turn it off again.

Displays FPS, data transfer rates and the like.

If you want just an FPS meter in Source games use "cl_showfps 1" in the console or 2 to display average, min, and max FPS and 0 to turn it off.
 
If you can see a difference once you're over 30fps, I've got some gold-plated monitor cables to sell you; I'll even throw in the London Bridge they are stored on.
 
Generally, for me 30fps is the mark of satisfactory performance, although 60fps is really ideal for maximum fluidity. I wonder if anything above 60fps really makes a difference.

I play World of Warcraft, and I have found it difficult to balance good detail with 30+ fps ever since Cataclysm. I use the 9400M, and three years on it's beginning to show its age. Although I'm sure things would be different if Apple got themselves in gear and released an OpenGL 3.x implementation.
 
If you can see a difference once you're over 30fps, I've got some gold-plated monitor cables to sell you; I'll even throw in the London Bridge they are stored on.


Generally, for me 30fps is the mark of satisfactory performance, although 60fps is really ideal for maximum fluidity. I wonder if anything above 60fps really makes a difference.

[...]

Guess you'll have to give him and me that London Bridge. :D

Anything below 60 is unplayable for me, especially online. If you have a 60 Hz screen (like me) having more than 60 fps merely makes it less likely that your "actual" fps will drop below 60 fps. Other than that, you'll need a screen with a higher refresh rate to see any other differences.
 
60fps.

Now if I'm playing a game where the fps can vary drastically (think playing WoW when doing dailies, then doing a 25-man raid with a graphics-heavy encounter like Ignis, with all the fire effects going on), then a minimum of 30fps will do, just to avoid the major frame rate swings.
 
If you can see a difference once you're over 30fps, I've got some gold-plated monitor cables to sell you; I'll even throw in the London Bridge they are stored on.

Oh, I can tell the difference; what's more, I can do a demo so that you will notice the difference as well. :) The best test is something with high-speed movement, where you then start to add in more movement in a different plane or axis. An example from a Feral game (Borderlands) would be playing as the gunner on one of the faster vehicles you can use to drive around the map: once your teammate is boosting at full speed, start to look about quickly. Depending on your machine, it will not feel as smooth as it did when you were standing still or moving more slowly. Basically, the faster or greater the movement, the bigger the jump between frames and the higher the frame rate required to make it feel smooth.

This effect with motion blur in films is why cinema gets away with 24 fps, and also why a high-speed pan across a landscape in a film can feel juddery, as your brain does not have enough frames to make the animation feel smooth. 24 fps is the minimum required for the human brain to consistently perceive the images as animation rather than a series of stills; this does not in any way mean that a human cannot tell the difference between 30, 60 and 120 fps, depending on the animation being shown. This is why fast games like shooters, beat-em-ups etc. have traditionally had much higher fps than slower-paced games like platformers.
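
To put rough numbers on that, here's a quick Python sketch of how big the jump between frames gets during a fast turn. The 120 degrees-per-second turn speed is a made-up illustrative figure; the point is just that the per-frame jump shrinks as the frame rate rises.

```python
# How far the camera moves between frames during a fast turn.
TURN_SPEED_DEG_PER_SEC = 120.0   # illustrative turn speed, not from any real game

for fps in (24, 30, 60, 120):
    jump = TURN_SPEED_DEG_PER_SEC / fps   # degrees covered in one frame
    print(f"{fps:>3} fps -> {jump:5.2f} degrees between frames")
```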

Talking about gold-plated cables: they are useful for analogue signals, but also for digital, especially when you have very long cables. In these circumstances, reducing any noise anywhere on the line lowers the need for error correction, leaving you with a cleaner image. I have experienced this with 30-metre HDMI cables I installed into a few walls so I can pipe A/V around the house cheaply. Gold-plated HDMI connectors only make a very small difference, but on the 30m cable it was enough to be the difference between the signal dropping out and the signal holding steady.

Edwin

p.s. On the whole I completely agree: unless you have a specific reason, paying extra for gold-plated cables is unnecessary. I just thought I would let you know I did find one example of higher-quality gold-plated cables making a difference in the digital world. :)
 
There is a reason why all movies are shot in 23.976 or simply 24fps - All human beings, no matter what they tell you, can't notice the lag starting from 23.976fps. Fact: All movies and YouTube videos are displayed at 24fps and 30fps respectively.
Therefore, all games, in order to look smooth, must have at least 24fps. This is the VERY MINIMUM for all human beings. If someone can "tell" the difference between 24fps and 120fps, then that person has been brainwashed with the idea of "higher the numbers, better the computer", which is totally false.

Most monitors refresh at 60Hz, or 60fps. That DOES NOT MEAN everything has to be 60fps; it looks the same as 120fps or 24fps. NO ONE, NOT EVEN HARDCORE GAMERS, CAN TELL THE DIFFERENCE!

That being said, it is best for games to display at this rate:
NEVER dips below 24
Average anywhere between 24 and 60
DOES NOT GO ABOVE 60, because at that point your GPU could display more stuff at a reasonable FPS. Ignore this if all settings are on extreme.

You're very wrong. People will actually shoot digital at 24fps (rather than higher) just to get that 'film' look, because the viewer can see the difference. Watch any movie that has a fast panning shot, or better yet, say, a tracking shot along a picket fence. You can see the image 'strobing'-- the effect of the lower framerate. In fact, I believe that some of the GTA games have used a 24fps effect to develop a cinematic appearance. If you're used to watching movies at 24fps, a film shot at 60 will look almost unnaturally smooth. Generally, 30-60fps will be reasonably smooth in appearance, but for high speed action, 60+ is preferred. It varies for each individual.
 
You're very wrong. People will actually shoot digital at 24fps (rather than higher) just to get that 'film' look, because the viewer can see the difference. Watch any movie that has a fast panning shot, or better yet, say, a tracking shot along a picket fence. You can see the image 'strobing'-- the effect of the lower framerate. In fact, I believe that some of the GTA games have used a 24fps effect to develop a cinematic appearance. If you're used to watching movies at 24fps, a film shot at 60 will look almost unnaturally smooth. Generally, 30-60fps will be reasonably smooth in appearance, but for high speed action, 60+ is preferred. It varies for each individual.

+1 The human eye can recognize a dark image against light in as little as 1/100th of a second, and can recognize light against dark in as little as nearly 1/500th of a second, with the eye being able to recognize images in as little as 1/220th of a second. So theoretically your FPS would need to be 220fps to get the maximum effectiveness out of your game. Now you just need a monitor that can keep up.

http://www.100fps.com/how_many_frames_can_humans_see.htm
 
I built a PC to play WoW and some other games, and overbuilt to make it as future-proof as possible. Now all I play is Minecraft, and I get 500+ fps... man, what a waste.:eek:

It'll be worth it when GW2 comes out, I hope...
 
You're very wrong. People will actually shoot digital at 24fps (rather than higher) just to get that 'film' look, because the viewer can see the difference. Watch any movie that has a fast panning shot, or better yet, say, a tracking shot along a picket fence. You can see the image 'strobing'-- the effect of the lower framerate. In fact, I believe that some of the GTA games have used a 24fps effect to develop a cinematic appearance. If you're used to watching movies at 24fps, a film shot at 60 will look almost unnaturally smooth. Generally, 30-60fps will be reasonably smooth in appearance, but for high speed action, 60+ is preferred. It varies for each individual.

OK, maybe I'm wrong about the use of 24 fps nowadays. However, I never met ANYONE who complained of strobing before.
Also, that means everyone in Europe should notice a slight strobe, because they use the PAL standard, which is 25 fps.
Point is, anywhere from 24 to 60fps should be smooth enough for the game to play without lag. You WILL NOT notice the difference unless you lived your entire life in a 60fps environment. You don't get much benefit from increasing fps; as long as it's PLAYABLE for you, then that's the frame rate you should stick with.

For me, I stick to 24-40 fps. Anything below that is serious strobing, and anything above that is just excess GPU power going to waste for me. If you don't think so, then go with a higher fps. It's all a matter of opinion, and if you think 60fps is better than 30fps, then it is for YOU. Only YOU can decide which is better.

Just keep in mind that 25-30fps is where the strobing is less-to-not noticeable, and it's been proven by science. However, it might be less smooth on monitors because of the 60Hz refresh rate trying to display an ill-fitting video stream. Therefore, 30fps will look SLIGHTLY better than 24fps; that is also true. However, between 60fps and 30fps, not much is different. IMHO I'd rather have minimal strobing and high detail than turn down settings just to achieve a "magical" 60fps.

Tips for using low-FPS settings:
Turn on motion blur. This is what makes 30fps and 60fps indistinguishable.
Do not go below 24fps. Like I said, artifacts are visible.
Know your limits. Make sure you're comfortable with your settings.
 
OK, maybe I'm wrong about the use of 24 fps nowadays. However, I never met ANYONE who complained of strobing before.
Also, that means everyone in Europe should notice a slight strobe, because they use the PAL standard, which is 25 fps.
Point is, anywhere from 24 to 60fps should be smooth enough for the game to play without lag. You WILL NOT notice the difference unless you lived your entire life in a 60fps environment. You don't get much benefit from increasing fps; as long as it's PLAYABLE for you, then that's the frame rate you should stick with.

For me, I stick to 24-40 fps. Anything below that is serious strobing, and anything above that is just excess GPU power going to waste for me. If you don't think so, then go with a higher fps. It's all a matter of opinion, and if you think 60fps is better than 30fps, then it is for YOU. Only YOU can decide which is better.

Just keep in mind that 25-30fps is where the strobing is less-to-not noticeable, and it's been proven by science. However, it might be less smooth on monitors because of the 60Hz refresh rate trying to display an ill-fitting video stream. Therefore, 30fps will look SLIGHTLY better than 24fps; that is also true. However, between 60fps and 30fps, not much is different. IMHO I'd rather have minimal strobing and high detail than turn down settings just to achieve a "magical" 60fps.

Tips for using low-FPS settings:
Turn on motion blur. This is what makes 30fps and 60fps indistinguishable.
Do not go below 24fps. Like I said, artifacts are visible.
Know your limits. Make sure you're comfortable with your settings.

You're awfully authoritative on something you don't know much about.

Another poster explained why 60fps is better for computers. A 60Hz display will display 60 still images a second. If you're running a game and hitting 60fps, then every single frame is displayed. If you're at 40, then a percentage of those frames will be displayed twice, or longer (depending on the strain on the GPU).
And motion blur isn't a magical cure for this problem. Neither is Object motion blur (but it is a step in the right direction).
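
To put a number on the "displayed twice" point, here's a quick Python sketch. The steady 40fps figure and the perfectly even frame time are assumptions for illustration; real frame pacing is messier.

```python
# Which rendered frame each 60Hz refresh shows, for a steady 40fps versus a locked 60fps.
REFRESH_HZ = 60

def frames_shown(game_fps, refreshes=REFRESH_HZ):
    # On each refresh the display shows the most recently completed frame.
    return [int(i / REFRESH_HZ * game_fps) for i in range(refreshes)]

for fps in (60, 40):
    shown = frames_shown(fps)
    repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
    print(f"{fps} fps on a {REFRESH_HZ}Hz screen: "
          f"{repeats} of {len(shown)} refreshes repeat the previous frame")
```

At a steady 40fps on a 60Hz screen, every third refresh shows the same frame again, which is exactly the uneven cadence you can feel.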

Today I've been optimising GTAIV on my new iMac. I can run it at max settings and get 40-50fps, or lower textures to medium for a constant 60fps. There is a clear difference when playing. It's not imaginary.
 
You're awfully authoritative on something you don't know much about.

Another poster explained why 60fps is better for computers. A 60Hz display will display 60 still images a second. If you're running a game and hitting 60fps, then every single frame is displayed. If you're at 40, then a percentage of those frames will be displayed twice, or longer (depending on the strain on the GPU).
And motion blur isn't a magical cure for this problem. Neither is Object motion blur (but it is a step in the right direction).

Today I've been optimising GTAIV on my new iMac. I can run it at max settings and get 40-50fps, or lower textures to medium for a constant 60fps. There is a clear difference when playing. It's not imaginary.

That's why I said 30fps is better than 24fps. Not only is the framerate higher, the display rate is also in sync with the monitor.
And that's also a factor in you sensing these strobing effects. From 30fps onward, the human eye has an extremely hard time distinguishing each frame. The end result is that the eye processes it as a BLUR. So, if you have a 60Hz monitor showing a 60fps screen, then your eye senses it as a single fluid motion, because it processes each frame as a blurred transition into the next. Think of the eye performing the "Tween" effect in Photoshop between each frame it sees.
Since this blurring effect created by the eye is weaker at lower framerates, the shortfall is compensated for by recreating the same effect on screen, i.e. creating motion blur on the image itself instead of the eye making it. This way, the difference between 30fps and 60fps is minimal to the brain, and indistinguishable for most people.
30fps might not be the same as 60fps, but with motion blur, it's pretty damn close. Try it first, then tell me.
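
If you want to see the idea in code rather than Photoshop, here's a crude Python sketch (using NumPy) of blending 60fps frames down to a blurred 30fps. The synthetic moving-bar frames are just stand-ins for real rendered images; a real motion blur pass is far more sophisticated.

```python
import numpy as np

def blend_to_half_rate(frames_60fps):
    """Average consecutive pairs of frames: 60fps in, blurred 30fps out."""
    pairs = zip(frames_60fps[0::2], frames_60fps[1::2])
    return [((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(np.uint8)
            for a, b in pairs]

# Fake 60fps footage: a bright vertical bar moving 8 pixels per frame.
frames = []
for i in range(60):
    img = np.zeros((64, 64), dtype=np.uint8)
    img[:, (i * 8) % 64] = 255
    frames.append(img)

blurred_30fps = blend_to_half_rate(frames)
print(len(frames), "source frames ->", len(blurred_30fps), "blurred frames")
```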
 
+1 The human eye can recognize a dark image against light in as little as 1/100th of a second, and can recognize light against dark in as little as nearly 1/500th of a second, with the eye being able to recognize images in as little as 1/220th of a second. So theoretically your FPS would need to be 220fps to get the maximum effectiveness out of your game. Now you just need a monitor that can keep up.

http://www.100fps.com/how_many_frames_can_humans_see.htm

What that's talking about is that the eye can detect a FLASH of light in 1/220th of a second. It DOES NOT MEAN that the eye sees each frame at 1/220th of a second; rather, it sees at a much lower pace, because the 220fps peak the eye can detect cannot be sustained.
Also, if that was true, then right now all of our monitors would be unviewable due to jitter and stutter, wouldn't they?
The same website says: "Do you like to play Quake? Do you think "More is better"? Maybe that's why you think 200fps is better than 180fps."
and this in bold letters: "Blurring simulates fluidity, sharpness simulates stuttering."
To make 30fps video look much more fluid, motion blur is implemented.
 
First, there is a BIG difference in the refresh rate on an LCD vs a CRT; the technology in an LCD means that it doesn't have to redraw the entire screen every time, but that doesn't mean that it doesn't, or that there is no refresh rate. Generally speaking, the LCD 60Hz refresh is better than the CRT 60Hz refresh, by a long shot. With the old CRTs, if the refresh was below 80Hz it would drive me crazy, especially if the refresh rate was set to a harmonic of the 60Hz fluorescent lights (typically at work, not at home).

Want to see the difference easily? Go to Best Buy or any TV store. Look for a cheap 60Hz LCD TV and then the 120Hz ones. Even with slow motion, the 120 just FEELS better (because it is smoother). There are many 240Hz and even a few 480Hz LCD TVs out there now, because they improve the visual quality. Have the store put on a high-speed movie, something like 2Fast2Furious, and you can REALLY see the difference.

Even if your display cannot push more than 60Hz, having the engine run faster than that is still beneficial, because the updates are more accurate and will feel correct. Even though the image you see in the game is only redisplayed once every 1/60th of a second, the engine is still processing away and recalculating things, and thus can make things more fluid. If you have a screaming processor that can do 200 fps but your display only does 60, it is still noticeable in the game. (Oddly enough, you would think that it wouldn't be, but yes, you can feel the difference.)
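
For what it's worth, here's a toy Python sketch of the decoupling being described, where the simulation steps more often than the screen refreshes. The 240Hz simulation rate and the trivial "physics" are made up for illustration; real engines vary in how they schedule this.

```python
# Simulation steps at 240Hz while the screen is only redrawn at 60Hz,
# so each displayed frame reflects several fresh input/physics updates.
SIM_HZ = 240
DISPLAY_HZ = 60

position = 0.0
velocity = 10.0          # units per second
sim_dt = 1.0 / SIM_HZ

for refresh in range(DISPLAY_HZ):               # one second of display refreshes
    for _ in range(SIM_HZ // DISPLAY_HZ):       # 4 simulation steps per refresh
        position += velocity * sim_dt           # input/physics sampled at 240Hz
    if refresh < 3:
        print(f"refresh {refresh}: position shown = {position:.3f}")
```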
 
In Quake engine games, your frame rate is intimately tied to your network performance online. In Call of Duty 4, for instance, if you can get to 125 consistently you will notice a huge improvement. More is better. If you can get 250 or more you will notice a massive improvement in hit registry and your ability to acquire targets quickly (at least I do).


The yitch3 config for CoD4 is great for not-so-strong computers trying to get high FPS.

http://bashandslash.com/index.php?Itemid=74&id=320&option=com_content&task=view
 
To get back to the original question, I think most gamers will agree that having a higher FPS is better than having the image quality set to high. I think the important thing to keep in mind is how low the slowest FPS you're going to get is. If you're running inside a house and get 120fps, big deal. Once you get to fast-paced close-quarters combat with explosions/fire going off everywhere, you don't want to go under 30 fps, where your computer probably starts choking up. That's when your fps really affects your score.
 
In this thread, individuals who clearly have never gamed. YES you can tell the difference between 30 and 60 FPS ON SCREEN.
Also, a computer that averages 24 fps WILL struggle to maintain that, and you will see artifacts.
Your retina may refresh at an average of 24 fps, but your display pulses at 60, and your eyes fill in the rest of the frames.

Games must run CAPPED at 30 fps, or AVERAGE 60 fps to be considered playable by me.

Case in point: Halo 1 was great (always capped at 30, rarely dropped below)
Any Source game: Portal, TF2, and CS require low reaction times, and stuttering below your display's refresh rate greatly disadvantages you. Cap it at 30 with motion blur, and it is still playable.

Call of Duty 4: My mini can hit 60, but more realistically averages about 25. The game is unplayable, even on minimum settings, simply because I routinely see drops into the teens or lower.

Play Crysis (very heavy on motion blur utilization) on a Mac, then play it on a computer designed for gaming. The experience is a little smoother, but you WILL perform better and notice the difference compared to 30 fps.
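
On the capping point above, here's a rough Python sketch of what a 30fps cap actually does: after each frame's work, sleep off the rest of that frame's time budget so frame times stay even instead of swinging. The 30fps target and the fake 10ms of "work" per frame are illustrative numbers only.

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS

start_all = time.perf_counter()
for frame in range(90):                      # ~3 seconds of capped frames
    start = time.perf_counter()
    time.sleep(0.010)                        # stand-in for simulating + rendering
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)   # burn the rest of the frame budget

total = time.perf_counter() - start_all
print(f"effective frame rate: {90 / total:.1f} fps")
```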
 