
Libertine Lush
macrumors 6502a, Original poster
Nov 23, 2009
I ordered an MBP recently and I intend to get a few games for it, even though I've mostly been a very casual console gamer. So I have a few basic, wonderfully inept questions:

1) What's generally considered the standard resolution to run games at? And what's the ideal?
2) Similarly, what's generally considered the standard frame rate to run games at (is it 30fps)? And the ideal?
3) I know that most console games don't even run at 720p, including many "triple A" titles. In the week or so I've been checking out gaming threads here, I've noticed 1024x768 seems to be a fairly common resolution, which would effectively make it higher res than most console games. So if you're playing the PC version of a PS3/360 game at 1024x768 or higher, with all settings maxed and at least 30 fps, would you effectively be getting a better graphical experience than the console version?

Thanks a lot.
 
There's no standard or expected resolution/frame rate. The higher each is, the better. Some games (GTA) have an option to lock their framerate to 30fps to give a more cinematic feel though.
Personally I set all my graphics options so my games run at 60fps. Anything less gives me a headache after a while.
 
1. The native resolution of your display. Failing that, a resolution that matches the aspect ratio of your display so things don't get stretched or squished.

2. Most gamers aim for 60 FPS or faster, because gameplay is smoother, and when the action onscreen ramps up, a higher framerate means your FPS won't drop as much.

3. Actually, the PS3 and Xbox 360 upscale a lot of games, and there are plenty of native 1080p titles on both. Regardless, computers still support higher resolutions, better textures and so on, so you will have a better graphical experience on a PC.
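To put the resolutions discussed in this thread side by side, here's a quick pixel-count comparison (just arithmetic; the labels are only for illustration):

```python
# Total pixels rendered per frame at the resolutions mentioned above.
resolutions = {
    "1024x768": (1024, 768),
    "720p (1280x720)": (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
```

Interestingly, 1024x768 works out to 786,432 pixels, which is actually fewer total pixels than 720p's 921,600 despite the taller vertical resolution; 1080p is 2,073,600.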
 
All decent PS3 games are at least 720p (1280x720), and the better ones are 1080p (1920x1080). The thing is, you have no idea what settings on the computer equal the settings on the PS3 at the same resolution. Generally, you have to fiddle with the settings yourself; I aim for at least a consistent 30fps at the highest setting possible.
 
I run most games at 1280x800 or higher, and I want between 30-60 FPS, hopefully never dropping below 30, but of course I don't mind in WoW when there are lots of people all attacking a boss or something.
 
Thank you for all the great info guys.

Some games (GTA) have an option to lock their framerate to 30fps to give a more cinematic feel though.

How does a lower frame rate lend a more cinematic feel?

2. Most gamers aim for 60 FPS or faster, because gameplay is smoother, and when the action onscreen ramps up, a higher framerate means your FPS won't drop as much.

Is there ever a point where too high a frame rate is not desirable?

Generally, you have to fiddle with the settings yourself; I aim for at least a consistent 30fps at the highest setting possible.

Yeah, I don't look forward to all the initial tweaking I'll have to do to figure out what settings/resolution will work best, especially when I won't even understand most of the settings terminology.

Is there usually an option or text command in games to display the frame rate on screen, as I've often seen in screenshots in threads here? Or does that require installing an additional program (for OS X)?
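For what it's worth, an on-screen FPS counter (whether built into a game or from a tool like FRAPS) is conceptually just frames rendered divided by elapsed wall-clock time; a minimal sketch, where the sleep stands in for real rendering work:

```python
import time

def measure_fps(num_frames, render_frame):
    # Count frames over elapsed wall-clock time, like an FPS overlay does.
    start = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
    elapsed = time.perf_counter() - start
    return num_frames / elapsed

# Stand-in workload: ~16.7 ms per frame, i.e. a 60 FPS target.
fps = measure_fps(30, lambda: time.sleep(1 / 60))
print(f"{fps:.0f} FPS")  # a little under 60, since sleep tends to overshoot
```

Real overlays do the same bookkeeping per rendered frame instead of around a fixed loop, but the frames-over-time idea is identical.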
 
How does a lower frame rate lend a more cinematic feel?

Films at the cinema run at 24fps (I believe), so they aim to be around there.

Also, from a dev POV, 30fps means you can fit in more eye candy, since the graphics card has twice as long to process each image. That's why some of the higher-end console games run at 30.
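That "twice as long" falls straight out of the per-frame time budget; a quick check:

```python
# Milliseconds of rendering time available per frame at each target rate.
for target_fps in (30, 60):
    budget_ms = 1000 / target_fps
    print(f"{target_fps} fps -> {budget_ms:.1f} ms per frame")
# ~33.3 ms at 30 fps vs ~16.7 ms at 60 fps: exactly a 2x budget for eye candy.
```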
 
Films at the cinema run at 24fps (I believe), so they aim to be around there.

Yes, it's 24 for films. I've just never looked into why 24 is ideal for films.

Also, from a dev POV, 30fps means you can fit in more eye candy, since the graphics card has twice as long to process each image. That's why some of the higher-end console games run at 30.

Great point. Thanks.
 
1) The native resolution of your display.

2) I'd say the minimum is 30 fps for a 'smooth' enough feel so that your head/eyes don't hurt trying to play the game. Ideal is subjective but personally, 60 fps at native resolution with maxed out graphics/audio is the one for me.

3) I don't play much on consoles, but they do upscale games, and there are native 1080p games for PS3/360 as others have said. The quality of the graphics will still suffer compared to the PC (assuming the title is actually made with the PC in mind from scratch and, if ported, they don't keep it 'dumbed down') :eek:

But some people don't get satisfied even when the game is giving 100+ fps. I have seen some EVE players get annoyed at a drop from 170 to 140 fps. I mean, jeez, would they even worry or know about it if they were not using FRAPS? *hides*

Anyways, enjoy the games on your MBP knowing what it can handle. Take care of it well and don't run it too hot. Use smcFanControl ( http://www.macupdate.com/info.php/id/23049/smcfancontrol ) to control your Mac's fans. It's a must-have for any MacBook owner who intends to play games.
 
Yes, it's 24 for films. I've just never looked into why 24 is ideal for films.

Films only play at 24-30 fps because there is no point in anything more, as your eye only responds at those speeds. The only reason high frame rates (60Hz+) are good in games is that it helps negate "ghosting", which is the effect of seeing a fast moving object on the screen appear as a kind of tiled effect, instead of the blurred image it should appear as (hard to explain!). Bad example, but it's kind of like this: ]|||

Real cameras have an exposure time, and therefore naturally blur motion, negating the ghosting effect. Computer-generated images usually do not. Some new games (Crysis is one) have motion blur, which simulates the effect of a non-zero exposure time. For this reason, such games will look nearly identical at ~30 Hz and at 60+ Hz.

So in that case, the best graphical experience is to be had at the highest resolution (or closest to native) and graphics settings to put the frame rate at about 30fps for motion blur games, or probably about 60 for others.
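The exposure idea above can be illustrated with a toy 1-D example: instead of sampling a moving object's position once per frame (the crisp, strobed look), average several samples across the frame's exposure window, the way a camera shutter integrates light. The speed and numbers here are made up purely for illustration:

```python
def position(t):
    # Hypothetical object moving at 600 px/s.
    return 600.0 * t

def instant_sample(t):
    # Zero exposure: one crisp sample per frame, which strobes in motion.
    return position(t)

def blurred_sample(t, exposure=1 / 60, samples=8):
    # Non-zero exposure: average positions across the exposure window,
    # smearing the object the way a real camera would.
    return sum(position(t + exposure * i / samples)
               for i in range(samples)) / samples

print(instant_sample(0.5))   # 300.0
print(blurred_sample(0.5))   # 304.375: the smeared average over the window
```

Games that implement motion blur do this (far more cheaply, in screen space), which is why the stepped "tiled" look disappears even at lower frame rates.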
 
In answering your question, I don't know if either of these links are entirely accurate, but I found them interesting.

Thank you. They were great reads.

Take care of it well and don't run it too hot. Use smcFanControl ( http://www.macupdate.com/info.php/id/23049/smcfancontrol ) to control your Mac's fans. It's a must-have for any MacBook owner who intends to play games.

So is it not recommended, or just not ideal, to let OS X regulate the fans on its own while gaming? I'd have to do a lot of reading to make sure I understand what I'm doing before messing around with the fans. Thanks for the link.

Some new games (Crysis is one) have motion blur, which simulates the effect of a non-zero exposure time. For this reason, such games will look nearly identical at ~30 Hz and at 60+ Hz.

So in that case, the best graphical experience is to be had at the highest resolution (or closest to native) and graphics settings to put the frame rate at about 30fps for motion blur games, or probably about 60 for others.

Wow, that's very interesting. Is that to say that when I see graphics cards benchmarks testing Crysis and some achieving nearly 100 fps, those 70 extra frames would be immaterial and unnoticeable to the player, so long as an average frame rate of roughly 30 is maintained?
 
OS X will spin the fans up, but manually controlling it will let you make them go as fast as they can. If you manually control the fans there isn't any real problem with it, just make sure to turn them back down when you're done. ;)

I think a base frame rate of 30 would be better, not an average of 30.
 
800x600 is the minimum resolution. 1024x768 would be a good resolution to shoot for.

Frame rates: if playing an FPS, a minimum of 40 fps is good, but 30 fps should work as long as it doesn't drag way below that. For a role-playing game, 20 fps is enough. Although not ideal, in a game like WoW I was able to play in the teens, because lightning-fast reflexes are not needed. Once you get down to 10 fps or lower, you're getting into slide-show range.
 
1920x1080, medium settings and 30+ FPS.

That's for a (gaming) desktop, of course. For a laptop, hmm, native res and 30+ FPS. Graphics aren't that important to me, but I want it to run smoothly at native res.
 
In answering your question, I don't know if either of these links are entirely accurate, but I found them interesting.

I read the first link... Really interesting. I studied Physics in college so I definitely appreciated the perspective the author took. Very cool.

That said... your monitor's refresh rate is optimal (usually around 72Hz). The demand for higher frame rates (say 100fps) is only so that if you hit heavy demand and drop by 30fps, you're still hopefully no lower than your monitor's native rate.
 