I've always wondered something about FPS...

Above 60, is there any noticeable difference to the human eye?

I mean, getting 90 or higher is nice and all in a future proofing kind of way, but can you actually *see* the difference?

For that matter, would 30 fps be acceptable? Most movies run at 24 fps and they seem more than good enough to the naked eye...

Or am I missing something here?
Movies have motion blur added so they don't look like ****. You will notice a big difference in smoothness between 30 and 60 fps, not to mention that in FPS games you will see things a few milliseconds earlier (though TFTs have both response time and, more importantly, input lag, which makes them slower than a CRT there anyway).

Your eyes can register things flashing past at 200 fps as well; some pilot/military test showed a plane for a single frame at over 200 fps, if I remember correctly, and the subjects could even tell what kind of plane it was.

So no, your eyes won't limit you for a long time. However, most TFTs refresh at 60 Hz, so anything above 60 fps and you will still only SEE 60 Hz anyway. With vsync on, the game renders at 60 fps max; with it off, it may render at a higher frame rate, and the screen outputs whatever the frame buffer holds at the moment it refreshes. That gives you more recent data but also "weird"-looking tearing, since parts of the screen may be a frame ahead. I'd prefer to play with vsync on at 60 Hz; CS players probably wouldn't, they'd just ignore that it looks weird.
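To put rough numbers on the 30-vs-60 difference, here's a quick back-of-the-envelope sketch (plain Python, just arithmetic, not from any post above):

```python
# Frame time: how long each rendered frame stays on screen
# before newer information can appear.
for fps in (24, 30, 60, 90, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:3d} fps -> {frame_time_ms:5.1f} ms per frame")

# Going from 30 to 60 fps halves the worst-case wait for fresh
# information from ~33 ms to ~17 ms -- milliseconds, not huge,
# but enough to matter in fast shooters.
```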
 
17" MBP Santa Rosa 2.4GHz - 8600M GT 256MB DDR3 - (normal Boot Camp driver) - specs in sig - Boot Camp partitioned.

CPU hit 87°C peak in GRAW2. Alt-tabbed out of FEAR at the menu screen to post this; I'm peaking at about 65°C.

Games I have tried so far (screenshots are very low quality for fast upload):
These were all taken with my normal stuff running: AV/anti-spyware/firewall + Firefox. Also, I don't use AA very often.

F.E.A.R - Highest 1024x768 - http://img243.imageshack.us/img243/108/fear2007072722105557ef0.jpg
GRAW2 - High 1024x768 - Average 35 FPS (If you want screenshot, ask)
WoW - 1600x1024 - http://img262.imageshack.us/img262/9100/wow2007072722501503cd8.jpg - FPS Average in Corner
Counter-Strike: Source - High 1600x1200 - http://img405.imageshack.us/img405/396/hl22007072723181867ct1.jpg - Stress Test said average 115fps
Call of Duty 2 - 1024x768 Max Texture - http://img165.imageshack.us/img165/1706/cod2sps2007072723095107ec6.jpg - Average FPS was 40
Halo PC - High 1600x1200 - http://img266.imageshack.us/img266/4337/halo2007072722453798sg3.jpg - FPS average in corner.
GTA:SA - High 1600x1200 - http://img442.imageshack.us/img442/6973/gtasa2007072722560073jr1.jpg - Average FPS was 50
Battlefield 2142 - High 1024x768 - http://img525.imageshack.us/img525/1233/bf21422007072803490248woq8.jpg - Average FPS was 85


Videos: (First uploaded videos ever, very basic.)

http://youtube.com/watch?v=Hc_uY9F_Sj4 Call of Duty 2
http://youtube.com/watch?v=7Y37RCJVPJA BF2142

Just a couple of points here:

First, sitting in the menu is not an accurate indication of average FPS in game. I can show you Enemy Territory running at 1680x1050 at 4000 FPS (yes, I have screenshots) while staring at a wall, but that won't reflect actual gameplay.

Second, I'm curious why you ran the game in Windows (I see FRAPS in the corner). I'd like to see some OS X games running in OpenGL.

COD2 running in DirectX 9.0 is not the ideal test. It's well known that COD2's DX9 implementation is flawed (another reason I'd like to see it in OpenGL). Most people bench COD2 in DX7 - the numbers are much more stable.

Also, some games require the second CPU core to be disabled to run correctly, or you will see crippled results.

Many more points, but unless you're posting these on a site I won't go into it, though I'm happy to offer any tips.
 
Can somebody try the BioShock demo, please?

I did (MBP SR 2.4). Played at the native widescreen resolution and it is indeed very smooth. No slowdowns whatsoever. Can't tell you an FPS because I didn't check.

HOWEVER, I am noticing some graphical artifacts in really bright places in the game - tiny black squares popping up and disappearing. I wanted to update the drivers, but of course Forceware refused to install :(

Running Boot Camp 1.4. Anyone have a fix?
 
Really?

Hrm, I'll have to upgrade and try again since I'm on Boot Camp 1.3, but the only way I got an acceptable framerate in the BioShock demo was turning everything as low as it would go and bringing the resolution down to 1024x768...

EDIT: This is on a 2.2 SR MBP, but still, I wouldn't think that'd make THAT drastic a difference.
 
Oh dear god I just updated the drivers and it made a world of difference.

Runs well at the highest settings, but runs smooth as silk on medium.
 
Are you getting the video artifacts I was talking about? I saw traces of them in the part of the demo where the Little Sister and the Big Daddy were being explained, when she was being lit by a stage light - I could see artifacts cropping up.
 
The BioShock demo runs great on my MacBook Pro - just the 15" 2.4GHz with 2GB RAM, the 8600M GT 256MB, and 15GB for Windows (almost full already).
I'll try taking the resolution down a notch; maybe it'll run even better...

No artifacts, by the way. I did manage to install the newest drivers (by changing the .inf file, as explained in a topic in the 'Windows on Mac' forum).
 
I was sure that past 30-35 fps the human eye can't notice it. Ideally 40-50 is the best place to be; of course you should be able to get much higher at times.
 
BioShock seems to run OK on high settings. However, if you have a 360, you'll get a much more consistently fluid experience on that instead.

I used high settings and set the screen to 1440x900 (even though there is a widescreen issue where the WS image is basically a 4:3 image cropped to 16:10 or whatever, but there is an unofficial fix for it now).

It seemed to run well. I never tried FRAPS, but my guess is it was sub-30 FPS at the start, then picked up once indoors, which is to be expected. It's not a terribly fast game, so a guaranteed 60+ FPS isn't a requirement. I didn't play with the settings too much, though (I'm not used to fiddling with settings after playing exclusively on a 360 for about a year), but for me the medium setting looks horrible - it's like the graphics have barely moved on from UT2004.

If you have to play on medium but have a 360, get this game for the 360.

MSAA doesn't work because the UE3 engine uses deferred rendering (or something like that), so multisampled anti-aliasing is out of the question. Apparently DX10 will allow AA, but I have no idea whether that's true.

People have tried to force AA in the control panel, but I don't think it's worked. Supersampled AA does work, but if you know anything about how SSAA and MSAA work, you'll know using SSAA is pointless unless you own an 8800 or something, because it needs a lot of horsepower.
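To see why SSAA needs so much more horsepower than MSAA, compare the shading work. A rough sketch (plain Python; the 1440x900 resolution matches the settings mentioned above, and the 4x factor is just an illustrative example):

```python
# Rough per-frame shading cost at 1440x900.
base_pixels = 1440 * 900          # what no-AA (and roughly MSAA) shades
ssaa_pixels = base_pixels * 4     # 4x SSAA shades 4x as many pixels

print(f"no AA / MSAA: {base_pixels:,} shaded pixels per frame")
print(f"4x SSAA:      {ssaa_pixels:,} shaded pixels per frame")
# MSAA shades each pixel roughly once and only multiplies
# coverage/depth samples at triangle edges, which is far cheaper --
# but it needs pipeline support that UE3's deferred renderer lacks here.
```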

So I played with 0xAA and 8xAF at 1440x900 on high settings. I think for a better experience you should:

Move down to 4xAF (it's a dark game, so mip-map boundaries/blurry textures won't be so obvious - not that some of the textures in the game are hi-res to start with).

Move down to a lower resolution like 1280x768 or so; it still looks good stretched to fill the MBP's screen, and it taxes the GPU less.

I found doing that gave me a better experience. With the bloom-style HDR and really soft lights in the game, you can't really tell that it isn't running at the panel's native resolution, where it would look sharpest.
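The GPU savings from that resolution drop are easy to estimate, since fill-rate-bound work scales roughly with pixel count (a back-of-the-envelope sketch, not a benchmark):

```python
# Compare the two resolutions discussed above.
native = 1440 * 900   # the 16:10 resolution used at first
lower = 1280 * 768    # the suggested lower resolution
saving = 1 - lower / native

print(f"{native:,} -> {lower:,} pixels: about {saving:.0%} fewer to shade")
```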
 
ArmA on a MacBook Pro

Has anybody ever tried to run Armed Assault on a MacBook Pro?

Could any of you post a video of that on YouTube?

Thanks.

:)
 
I would like to ask the guys who have the low-end MacBook Pro (2.2GHz 15" with the NVIDIA 8600M GT 128MB): how does Battlefield 2 or 2142 run?
Can you turn the settings up to highest?

I ran BF2 a while back (in June) on the 2.2GHz version. It ran just fine, but I can't remember if I went all out on the graphics settings. That was also back in the Boot Camp 1.2 beta days, just before 1.3 came out with the proper drivers.

So in summary, it worked fine. I'm trying to get Boot Camp going again, but Leopard is being a pain in the @#(
 