
Libertine Lush

macrumors 6502a
Original poster
Nov 23, 2009
I have a few questions regarding the Mac version of Call of Duty 4: Modern Warfare, asked from the perspective of someone who's never played computer games much and is new to Macs.

1) How is it different from the PC/console version of the game? I've found and read at least 5 Mac reviews of the game, but none have compared it to the PC (or console) version. Are the performance (graphics, frame rate) and gameplay even slightly inferior (I've read that Mac ports sometimes suffer)?
2) I know the "Game of the Year" version for PC and consoles has 4 new maps. Is it correct that Aspyr has since patched the Mac version to also include those maps?
3) Are there still many people on the servers? It's a short game, so if there isn't much activity for multiplayer, it won't be worth buying.
4) What's the difference between buying the DVD or downloading the game via GameAgent? I read in one thread that if you buy the DVD, you have to have the DVD in the Mac while you play the single player campaign. Slight inconvenience. But will the DVD copy limit you to only installing the game on 2 systems, like the downloaded version, or only installing a limited number of times on the same system?
5) I'll be buying the new 15" i7, hi-resolution MacBook Pro with the GeForce GT 330M (512MB). Since this game is almost 3 years old, would I be right to assume I'll be able to play it with all settings/resolution (even at 1680x1050) at the highest possible? I did check the system requirements, but I don't know much about graphics cards, so I don't know how the 330M compares with those listed as recommended.

I feel a bit greedy laying out so many questions all at once. Thanks to anyone who can help me out.

EDIT: 6) I just noticed that the PC version of this game sells for only $15 at Steam. With Steam coming to Mac this month, is there any reason to believe this game may get ported to Mac? I recall the only promise was that Valve games would get ported.
 
The PC version will always be the better version. You can use either a mouse/keyboard or a controller, and the graphics are better.

The PC (not Mac) version is on sale on Steam at the moment, 50% off (£9.99 here). So if you have Boot Camp, I really recommend going down that route.
 
The Mac version is a port of the PC version. It doesn't run as well under OS X as the PC version runs under Windows. I get significantly better frame rates using Boot Camp and Windows XP or Windows 7. So, in order to get the SAME frame rates, I have to turn the graphics settings down on the Mac version (this is on a Mac Pro with the Mac version of the 4870 GPU).

Those technical issues notwithstanding, the Mac version is virtually identical to the PC version in gameplay. Most of the PC game makers that do port to the Mac don't necessarily keep up with the version numbers. Usually, the Mac release lags the PC version, so there are often times when you can't play on your favorite server because it's moved on to a new point release while your Mac version hasn't been updated yet.

There are still a fair number of COD4 servers out there and a fair number of players, but it's true many have moved on. I don't play COD4 much at all these days. Our clan still maintains a COD4 server, but it doesn't get much play. Modern Warfare 2 isn't a bad game design, but the lack of dedicated servers has made it a generally lousy experience, and I don't play MW2 at all. Most people I know have gone to Battlefield: Bad Company 2. We're all waiting for Call of Duty 7, which is set in the Vietnam era and reportedly will go back to allowing dedicated servers.
 
It doesn't run as well under OS X as the PC version runs under Windows. I get significantly better frame rates using Boot Camp and Windows XP or Windows 7. So, in order to get the SAME frame rates, I have to turn the graphics settings down on the Mac version (this is on a Mac Pro with the Mac version of the 4870 GPU).

Could you give me a rough recollection of what the frame rate was in OS X vs. Boot Camp? Was this significant difference evident only in whatever utility you use to display the frame rate at the top of the screen, or easily evident as you actually play?

Is the 4870 GPU you mention better than the 512MB 330M on the MBP? And do you think I could run it all maxed out at its 1680x1050 resolution?

There are still a fair number of COD4 servers out there and a fair number of players, but it's true many have moved on.

That's good to know. As long as there's a few dozen people on at most times, there should be enough people for me to shoot, though more likely to be killed by.

Thanks.
 
Could you give me a rough recollection of what the frame rate was in OS X vs. Boot Camp? Was this significant difference evident only in whatever utility you use to display the frame rate at the top of the screen, or easily evident as you actually play?

Is the 4870 GPU you mention better than the 512MB 330M on the MBP? And do you think I could run it all maxed out at its 1680x1050 resolution?


The Mac Pro with the 4870 is in an entirely different league than a MacBook Pro. Performance on one machine has absolutely nothing to do with the other. In that regard, the frame rates I relate would be meaningless. The 4870 is currently the top-of-the-line GPU that one can have in any Apple computer. It's miles beyond what any mobile-class GPU can do.

That said, I adjust my graphics and resolution to get at least a 100 FPS average. Under Boot Camp, I can get that easily with the resolution set to 2560x1600 (the native resolution of the 30" ACD I use) and all graphics at max (except AA, which I cut back to 2X).
 
The Mac Pro with the 4870 is in an entirely different league than a MacBook Pro. Performance on one machine has absolutely nothing to do with the other. In that regard, the frame rates I relate would be meaningless. The 4870 is currently the top-of-the-line GPU that one can have in any Apple computer. It's miles beyond what any mobile-class GPU can do.

That said, I adjust my graphics and resolution to get at least a 100 FPS average. Under Boot Camp, I can get that easily with the resolution set to 2560x1600 (the native resolution of the 30" ACD I use) and all graphics at max (except AA, which I cut back to 2X).

Using the 4850 512 MB in my i7 iMac, I can run multiplayer (I don't really do single player) at 2560x1600 with everything maxed out and still get 40-60 frames per second. It stands to reason your superior GPU should be able to do the same thing. You only need 30 frames per second; the 70 frames of headroom you have aren't doing you any good =P

To Libertine Lush: yes, the Mac version includes the 4 additional maps. It also has cross-platform servers for multiplayer, so regardless of whether you're running the Mac or PC version, you have access to the same servers and games. If you just want to play whenever you feel like it without rebooting, get the Mac version. If you want to squeeze the most graphics and performance out of the game, the PC version will require a copy of Windows and a reboot, but it will look nicer.

EDIT: I don't get your question #6. Modern Warfare has already been ported to Mac by Aspyr; it's not going to get "re-ported". If you're asking whether Aspyr's port will be available via Steam for 15 bucks, probably not, but you never know.

-Nick
 
You only need 30 frames per second; the 70 frames of headroom you have aren't doing you any good =P

Sorry, but I completely disagree. 30 FPS is unplayable on a 3D first-person-shooter. You might be thinking of video or movie frame rates. It certainly isn't true of video games.

For one thing, a frame rate of less than 125 FPS prevents a LOT of access to certain areas of certain maps. I'm not talking about glitches, but actual mapped areas.

COD4 was designed by IW to run acceptably at 60 FPS. I agree that that is tolerable, but still a noticeable difference compared to 100 FPS. I don't find much advantage to higher than 125 other than map access to certain areas.


edit:

BTW I'm talking about online multiplayer. I have no interest in single player modes.
 
Win 7 Boot Camp

I play it, and it runs fine. The 27" iMac gets super hot, though. I wonder if I am killing my computer with such heat generation... it saves on heating bills, at least.
 
Sorry, but I completely disagree. 30 FPS is unplayable on a 3D first-person-shooter. You might be thinking of video or movie frame rates. It certainly isn't true of video games.

For one thing, a frame rate of less than 125 FPS prevents a LOT of access to certain areas of certain maps. I'm not talking about glitches, but actual mapped areas.

COD4 was designed by IW to run acceptably at 60 FPS. I agree that that is tolerable, but still a noticeable difference compared to 100 FPS. I don't find much advantage to higher than 125 other than map access to certain areas.


edit:

BTW I'm talking about online multiplayer. I have no interest in single player modes.
You must not be very good, then. I've played quite comfortably with ~30 FPS (I have a better GPU now, but at the time I didn't) and I generally came in first or second place. You don't need 100+ FPS to play comfortably or easily, it just gives you more headroom if the action onscreen becomes more intense.

Don't most gamers turn down their settings to get the absolute fastest framerate possible anyhow? (in online multiplayer, that is).
 
Sorry, but I completely disagree. 30 FPS is unplayable on a 3D first-person-shooter. You might be thinking of video or movie frame rates. It certainly isn't true of video games.

For one thing, a frame rate of less than 125 FPS prevents a LOT of access to certain areas of certain maps. I'm not talking about glitches, but actual mapped areas.

COD4 was designed by IW to run acceptably at 60 FPS. I agree that that is tolerable, but still a noticeable difference compared to 100 FPS. I don't find much advantage to higher than 125 other than map access to certain areas.

It's silly to say that 30 FPS is unplayable on a modern first person shooter, especially since a lot of console games, like Halo 3, are capped at 30 FPS max.
The only way 30 FPS is unplayable is if there's enough action on screen to make the FPS flux below 30.
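These fps arguments are easier to compare as frame times. Here is a rough sketch (plain arithmetic, not tied to any particular game or engine) of how long each rendered frame persists on screen:

```python
# Frame time, not fps, is the quantity your eyes and mouse feel:
# at N fps, each rendered frame persists for 1000/N milliseconds,
# which bounds how stale the image you're aiming at can be.
for fps in (30, 60, 100, 125):
    frame_ms = 1000.0 / fps
    print(f"{fps:3d} fps -> {frame_ms:4.1f} ms per frame")
```

At 30 fps a frame lingers for about 33 ms versus 8 ms at 125 fps, a roughly 25 ms gap per frame that competitive players plausibly notice even if casual players don't.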

-Nick
 
You must not be very good, then. I've played quite comfortably with ~30 FPS (I have a better GPU now, but at the time I didn't) and I generally came in first or second place. You don't need 100+ FPS to play comfortably or easily, it just gives you more headroom if the action onscreen becomes more intense.

No, 30 FPS is basically unplayable on the PC version if you're serious about the game. I'm inclined to doubt your claim of skill. The value of an adequate frame rate in an FPS, especially in a Quake-engine game, has been obvious since the days of Quake III.
 
I had the PC version of COD4 and it played extremely well under Bootcamp running Vista Home Premium. I sold the PC version and got the Mac version when it came out. Personally for me it ran HORRIBLY on my system compared to the PC version.
 
To Libertine Lush: yes, the Mac version includes the 4 additional maps. If you want to squeeze the most graphics and performance out of the game, the PC version will require a copy of Windows and a reboot, but it will look nicer.

Thanks for confirming the presence of the 4 additional maps. You mention squeezing the "most graphics" and that the PC "will look nicer." Are you saying besides a superior frame rate, the graphics/textures/etc are also better? And how apparent is this?

For one thing, a frame rate of less than 125 FPS prevents a LOT of access to certain areas of certain maps. I'm not talking about glitches, but actual mapped areas.

How does having an FPS less than 125 prevent access to certain areas? Again, I'm not a computer gamer, so this is totally foreign to me.

I'm very curious to learn now: what are considered by most to be the lowest acceptable and most ideal frame rates?

It's silly to say that 30 FPS is unplayable on a modern first person shooter, especially since a lot of console games, like Halo 3, are capped at 30 FPS max.

Interesting comparison. Why then is there this discrepancy between the max/ideal frame rate on consoles (30) and the ideal frame rate on computers (whatever that # is...100?)?

Thank you all.
 
Sorry, but I completely disagree. 30 FPS is unplayable on a 3D first-person-shooter. You might be thinking of video or movie frame rates. It certainly isn't true of video games.

For one thing, a frame rate of less than 125 FPS prevents a LOT of access to certain areas of certain maps. I'm not talking about glitches, but actual mapped areas.

COD4 was designed by IW to run acceptably at 60 FPS. I agree that that is tolerable, but still a noticeable difference compared to 100 FPS. I don't find much advantage to higher than 125 other than map access to certain areas.


edit:

BTW I'm talking about online multiplayer. I have no interest in single player modes.

I'm a competitive gamer and I have not heard anything like this. I use the 360, but I play games on the PC too. What type of games do not give you access to areas based on your FPS? Sounds stupid to me.
 
How does having an FPS less than 125 prevent access to certain areas? Again, I'm not a computer gamer, so this is totally foreign to me.

125, 250 and 500 are the "magic" FPS numbers. Maintaining those frame rates will give you 6.66% higher jumping ability. That means you can jump into areas that otherwise aren't accessible. One trick to help that out is to get up against the area you want to jump to (ledge, balcony, roof, second story) and look up at the sky. When you watch the frame rate counter, you'll see your FPS spike. I don't know why a higher frame rate allows higher jumping -- it's part of the Quake engine physics. A buddy explained the math to me once, but I don't remember the rationale.

That's just jumping into otherwise inaccessible areas. The smoother gameplay of a higher FPS also makes for better mouse tracking. It's pointless to have a high-resolution mouse if you're only mustering 30 FPS. The frame rate will impair accuracy far more than a crappy mouse. Far better to have an adequate frame rate and a high-resolution mouse.

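The framerate/jump link can be illustrated with a toy model. This sketch is NOT the actual Quake or COD4 engine code; it assumes the commonly reported id Tech 3 behavior of rounding the frame time to whole milliseconds and snapping velocity to an integer each frame, which makes the effective gravity depend on fps:

```python
# Toy model of framerate-dependent jump height.
# Assumption (not verified against engine source): frame time is rounded
# to whole milliseconds and vertical velocity is snapped to an integer
# each frame, so the per-frame gravity loss rounds differently at
# different frame rates.

JUMP_VELOCITY = 270.0  # units/sec -- id Tech 3's commonly cited default
GRAVITY = 800.0        # units/sec^2 -- id Tech 3's commonly cited default

def jump_apex(fps):
    """Highest point reached when jumping at `fps` frames per second."""
    dt = round(1000.0 / fps) / 1000.0  # frame time, rounded to whole ms
    velocity, height, apex = JUMP_VELOCITY, 0.0, 0.0
    while True:
        velocity -= GRAVITY * dt
        velocity = round(velocity)      # the "snap" that causes the quirk
        height += velocity * dt
        apex = max(apex, height)
        if height <= 0:                 # back on the ground
            return apex

for fps in (100, 125, 333):
    print(f"{fps:3d} fps -> apex {jump_apex(fps):.1f} units")
```

Under this toy model, at 125 fps the per-frame gravity loss of 6.4 units rounds down to 6, effectively weakening gravity and raising the jump apex; at exactly 100 fps the loss is a clean 8 per frame, so there is no benefit. The real engine's numbers surely differ, but the mechanism is the same flavor.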
 
No, 30 FPS is basically unplayable on the PC version if you're serious about the game. I'm inclined to doubt your claim of skill. The value of an adequate frame rate in an FPS, especially in a Quake-engine game, has been obvious since the days of Quake III.

You're coming dangerously close to sounding like a snob. ;)

Honestly, how many games are based on the Quake engine nowadays? Companies are either using their own engines, or Unreal Engine 3.

Your numbers might be true for Quake, but the game and its engine are old news now.

And you can doubt me all you like, that doesn't change anything. :) I even pointed out that a high framerate helps when there's a lot of action on screen – otherwise, what's the bother?
 
Thanks for confirming the presence of the 4 additional maps. You mention squeezing the "most graphics" and that the PC "will look nicer." Are you saying besides a superior frame rate, the graphics/textures/etc are also better? And how apparent is this?

I meant to say, with XYZ graphics card (insert your card here), you'll be able to achieve higher settings with the game in Windows than in OSX, but the actual graphic capability of the game is the same.
For example (pulling random numbers here). With all settings on Medium, you might get 50 frames per second in OSX. But if you're playing the Windows version on that same machine, you might get 50 frames per second on Medium / High settings. Windows will see better frame rates than OS X when they're on the same graphics quality settings.

-Nick
 
I cap Q3 at 30 fps and kick butt. So yeah, your claim is crap, sorry. Nobody needs more than 30 or 40 fps to play perfectly smooth. Especially Q3.
 
You're coming dangerously close to sounding like a snob. ;)

Honestly, how many games are based on the Quake engine nowadays? Companies are either using their own engines, or Unreal Engine 3.

Your numbers might be true for Quake, but the game and its engine are old news now.

And you can doubt me all you like, that doesn't change anything. :) I even pointed out that a high framerate helps when there's a lot of action on screen – otherwise, what's the bother?
First of all, let me clarify that I'm talking about PC play only, not consoles. I know nothing about console games and have no interest in them. If you guys are talking about 30 fps on consoles, fine, although they're designed to run COD4 at 60 fps. On PCs, nobody "kicks butt" at 30 fps. This is well known, and to try to assert that 30 fps on a PC video game is playable is just silly. IW itself says that COD4 was designed for 60 fps on the PC, and anybody who knows these games knows this.

I don't buy games for their particular game engine, and to try to be snobbish about which engine you use is stupid. The best-selling FPSs of the last decade, the Quake and COD series, have used the Quake engine, and they will continue to do so. It remains as good as anything out there. I have no problem with the Unreal engine, only that I didn't care a bit for the games themselves. The CryEngine is fine. I thought the game Crysis was dumb, but I am playing BFBC2 a lot these days. I don't play MW2 because it uses that silly peer-to-peer concept. If I wanted those kinds of limitations, I'd have bought a PS3 or 360.

Anyway, it doesn't matter. Play the game at whatever settings your computer can muster and have fun. I'll do the same. If you're able to do so at 30 fps, good for you. I've seen people say stranger things.
 
I meant to say, with XYZ graphics card (insert your card here), you'll be able to achieve higher settings with the game in Windows than in OSX, but the actual graphic capability of the game is the same.

Thank you. That's very helpful to know, and reassuring should I choose to pick up the Mac version for the MBP I just ordered.
 
I meant to say, with XYZ graphics card (insert your card here), you'll be able to achieve higher settings with the game in Windows than in OSX, but the actual graphic capability of the game is the same.
For example (pulling random numbers here). With all settings on Medium, you might get 50 frames per second in OSX. But if you're playing the Windows version on that same machine, you might get 50 frames per second on Medium / High settings. Windows will see better frame rates than OS X when they're on the same graphics quality settings.

-Nick

Yes, that's true. When I put the 4870 in my Mac Pro, I could turn all the graphics settings to max and run at the native resolution of my 30" ACD, 2560x1600. I am able to get frame rates well over 100 under Windows, and only about 60 under OS X.
 
Yes, that's true. When I put the 4870 in my Mac Pro, I could turn all the graphics settings to max and run at the native resolution of my 30" ACD, 2560x1600. I am able to get frame rates well over 100 under Windows, and only about 60 under OS X.

So out of curiosity, does anyone know why this occurs? Is it just a natural consequence of porting a game from one platform to another--whether it's from PC to Mac, Mac to PC, or PS3 to 360?
 
First of all, let me clarify that I'm talking about PC play only, not consoles. I know nothing about console games and have no interest in them. If you guys are talking about 30 fps on consoles, fine, although they're designed to run COD4 at 60 fps. On PCs, nobody "kicks butt" at 30 fps. This is well known, and to try to assert that 30 fps on a PC video game is playable is just silly. IW itself says that COD4 was designed for 60 fps on the PC, and anybody who knows these games knows this.

I don't buy games for their particular game engine, and to try to be snobbish about which engine you use is stupid. The best-selling FPSs of the last decade, the Quake and COD series, have used the Quake engine, and they will continue to do so. It remains as good as anything out there. I have no problem with the Unreal engine, only that I didn't care a bit for the games themselves. The CryEngine is fine. I thought the game Crysis was dumb, but I am playing BFBC2 a lot these days. I don't play MW2 because it uses that silly peer-to-peer concept. If I wanted those kinds of limitations, I'd have bought a PS3 or 360.

Anyway, it doesn't matter. Play the game at whatever settings your computer can muster and have fun. I'll do the same. If you're able to do so at 30 fps, good for you. I've seen people say stranger things.

No need to clarify, I'm not talking about console gaming either. I haven't played a console game in months. And if you think nobody 'kicks butt' just because they don't have a frame rate over 100, well, you'd be wrong. I can come up with just as many anecdotes and claims as you can, because you have absolutely no proof. Did Infinity Ward design Call of Duty 4 to run at 60 FPS? Perhaps they did. Yet the game is still quite playable at 30 FPS, no matter what you fantasize about it.

Then why do you refer back to Quake so much? Also, you're wrong: as of Call of Duty 2, IW has used their own engine, not id Tech 3. Don't believe me? Check IGN's own exposé on the matter.

As for what frame rates I can muster, I have a better GPU than you do. ;) I've seen people say strange things too. But to claim you can't do well just because you only get 30 FPS? That's one of the strangest I've heard today.

So out of curiosity, does anyone know why this occurs? Is it just a natural consequence of porting a game from one platform to another--whether it's from PC to Mac, Mac to PC, or PS3 to 360?
It's because porting isn't an exact process – and games are written to take advantage of a certain platform's features – Cell with the PS3, DirectX with Windows and so on.
 
Then why do you refer back to Quake so much? Also, you're wrong: as of Call of Duty 2, IW has used their own engine, not id Tech 3.

The problem with the internet is that it encourages you to do half-assed research. Unfortunately, 30 seconds with a search engine isn't a substitute for knowing what you're talking about. The MW2 engine is nothing more than a modification of the id Tech engine, which is nothing more than a modification of the Quake engine. You should know this already. I suspect you do.

I refer back to Quake because it is the prototypical FPS, and that game engine has set the standards for the entire genre. It should be obvious to an expert gamer like you. :rolleyes:



As for what frame rates I can muster, I have a better GPU than you do. ;)

You have a better GPU than a 4870 in a Mac Pro, and you play the game at 30 fps? Sheesh. That's about enough time wasted in this thread.
 