Objective-C is slower than C and C++. Period. OS X uses Objective-C (especially its inherently slow message passing versus procedural calls in C/C++). Windows uses C and C++ with many portions of the kernel in assembly (machine code on the metal is the fastest you can get). Yes, the most basic task of calling a function is slow on OS X because of Objective-C. How can they fix this? Start porting to assembly and C. Get rid of Objective-C. It is not the future (especially when it drags along archaic interpreted memory-management concepts). The future is low power and high performance in mobile devices. Interpreted stuff requires lots of RAM and wastes lots of power. This is the reason XNA was dropped on Windows Phone and also Windows 8. C# (an interpreted language) is a dying breed. Even Android game developers bypass Java and code everything in C/C++, and not many people code games on it because it is horribly complicated thanks to the Java layer. Google may drop the Java requirement in Android in the future to save themselves.

OS X uses the LLVM compiler, which will hog up all your RAM (2x at least) in order to get similar (or worse) optimizations than the basic GNU compiler. It needs all that RAM to look for optimizations. Unfortunately it hogs so many resources that half of your stuff gets dumped to the hard drive, and the swapping back and forth negates any benefits you "may" get. I mention "compiler" because OS X has these "convenience" vs. "performance" problems where it will leave graphics drivers in an uncompiled state, ready for running (interpreted) on a virtual-machine backend of LLVM. Anything interpreted needs more than 2x the memory and 2x the CPU cycles to do the same thing as non-interpreted languages. Anything you see requiring virtual machines means slow (like Java, C#, Python, etc.). Even Objective-C carries with it concepts of garbage collection, which requires a virtual machine, even if you manually release the memory. When did LLVM first show up as a requirement on OS X? Lion and Mountain Lion. This is why there are so many threads on why Snow Leopard is like 50% to 2x faster than Lion (and Mountain Lion).

The Feral comment on DOF, I think, means "depth of field". It is the post-processing that blurs things that are too far away or too close to be in focus. Even with that off, Windows will run faster than OS X. It comes back to square one: performance of the layer between the game and the metal (OS X).

So what can Apple do? Move towards C/C++ and assembly, and start getting rid of interpreted stuff and virtual machines in all their products. Getting off Objective-C might be hard, but since it is a superset of C, it is not difficult to move in that direction.

Wanted to chime in because people think getting better graphics drivers, or OpenGL versus DirectX, will fix their problems. It won't when the language and the operating system are the slowest layer. Even on OpenGL vs. DirectX 9, Windows games run faster than the same OS X games (so it's not a DirectX 11 thing). Even comparing Snow Leopard to Windows, OS X was slower (Objective-C), but back then it was only 20% slower. Now, with interpreted and virtual-machine stuff everywhere (from LLVM) in Lion and Mountain Lion, you are getting 200% slower in games.

Wow. I wanted to argue because I've used Objective-C a lot in the past, but my own tests are proving you correct. I need to find a way to use some other compiler (Xcode doesn't seem to have other options) and get back into C programming. Thanks for the information.
 
DirectX is the crown jewel of Windows. It smokes OpenGL for gameplay all around... and Apple's implementation of OpenGL isn't superb.

This is a commonly stated meme, but so far those who state it have not come back and provided evidence to support it.

I hope you will be different.
 
Objective-C is slower than C and C++. Period. OS X uses Objective-C (especially its inherently slow message passing versus procedural calls in C/C++). Windows uses C and C++ with many portions of the kernel in assembly (machine code on the metal is the fastest you can get). Yes, the most basic task of calling a function is slow on OS X because of Objective-C. How can they fix this? Start porting to assembly and C. Get rid of Objective-C. It is not the future (especially when it drags along archaic interpreted memory-management concepts). The future is low power and high performance in mobile devices. Interpreted stuff requires lots of RAM and wastes lots of power. This is the reason XNA was dropped on Windows Phone and also Windows 8. C# (an interpreted language) is a dying breed. Even Android game developers bypass Java and code everything in C/C++, and not many people code games on it because it is horribly complicated thanks to the Java layer. Google may drop the Java requirement in Android in the future to save themselves.

OS X uses the LLVM compiler, which will hog up all your RAM (2x at least) in order to get similar (or worse) optimizations than the basic GNU compiler. It needs all that RAM to look for optimizations. Unfortunately it hogs so many resources that half of your stuff gets dumped to the hard drive, and the swapping back and forth negates any benefits you "may" get. I mention "compiler" because OS X has these "convenience" vs. "performance" problems where it will leave graphics drivers in an uncompiled state, ready for running (interpreted) on a virtual-machine backend of LLVM. Anything interpreted needs more than 2x the memory and 2x the CPU cycles to do the same thing as non-interpreted languages. Anything you see requiring virtual machines means slow (like Java, C#, Python, etc.). Even Objective-C carries with it concepts of garbage collection, which requires a virtual machine, even if you manually release the memory. When did LLVM first show up as a requirement on OS X? Lion and Mountain Lion. This is why there are so many threads on why Snow Leopard is like 50% to 2x faster than Lion (and Mountain Lion).

The Feral comment on DOF, I think, means "depth of field". It is the post-processing that blurs things that are too far away or too close to be in focus. Even with that off, Windows will run faster than OS X. It comes back to square one: performance of the layer between the game and the metal (OS X).

So what can Apple do? Move towards C/C++ and assembly, and start getting rid of interpreted stuff and virtual machines in all their products. Getting off Objective-C might be hard, but since it is a superset of C, it is not difficult to move in that direction.

Wanted to chime in because people think getting better graphics drivers, or OpenGL versus DirectX, will fix their problems. It won't when the language and the operating system are the slowest layer. Even on OpenGL vs. DirectX 9, Windows games run faster than the same OS X games (so it's not a DirectX 11 thing). Even comparing Snow Leopard to Windows, OS X was slower (Objective-C), but back then it was only 20% slower. Now, with interpreted and virtual-machine stuff everywhere (from LLVM) in Lion and Mountain Lion, you are getting 200% slower in games.

While it's hard to write very fast Objective-C code (it can be done), I guess most games are not translated from their original language to Objective-C. I have not ported any game to Mac OS X, but judging from open-source projects and commercial games whose code was made public many years later (like many Quake 3-based titles), there is very little to no Objective-C in these ports. Mainly it's all C/C++, especially the parts of the core game engine.

Nobody—not even Apple—stops you from using C/C++ in your application if you think that a part of your application can not or should not be implemented in Objective-C.
 
Pretty much this. Obj-C is a superset of C, so you can just use C in your program. Plus, it's not specific to games: if that were the issue, all OS X software would be slower, and I don't think that's true.
 
I guess most games are not translated from their original language to Objective-C. .... Mainly it's all C/C++, especially the parts of the core game engine.

Nobody—not even Apple—stops you from using C/C++ in your application if you think that a part of your application can not or should not be implemented in Objective-C.

That's correct: games are written almost exclusively in C++, and that code is not altered when brought to the Mac. Obj-C is great for many things, but extreme high performance is not one of its strengths. C/C++ is a lot more optimised. VinegarTasters pretty much covered all the main points in his post.

This is a commonly stated meme, but so far those who state it have not come back and provided evidence to support it.

I hope you will be different.

It depends, really. Some areas are very close between Mac and PC; some are... less so :)

If you have the time run SPECviewperf® 11

http://www.spec.org/gwpg/gpc.static/vp11info.html

In Boot Camp, then again on OS X (you will need to compile your own Mac version), and compare the results. The hardware will be identical (your Mac) but the results will vary quite a lot between OSes.

Edwin
 
I'd again like to shed some positive light on the situation.

1) There are engines out there that really support the Mac: Unity3D, Unigine and Source. The fate of Mac Source is somewhat up in the air right now until word is given on Half-Life 3, but it's safe to assume that HL3 will be on the Mac day one. That leaves Unity and Unigine, so the stronger they get, the better for the Mac. Unreal is not really a contender since Unreal is a middleware solution. I am also pretty sure that Doom 4 will be on the Mac. The problem is that id won't provide a day-one Mac version themselves, which is a shame.

2) The stronger and better Cider and similar solutions get, the better. Yes, Cider is ugly. But if you want to see future blockbuster titles (think PS4 and Xbox One titles) on the Mac, that is the only way to go. Cider needs DX11 translation, which will be insane to do.

3) OpenGL ES is the future of OpenGL. The more mobile titles on iOS using OpenGL ES, the better. High-end mobile titles are as rich as low-end Mac titles are right now. Give it three more years and this will become more obvious. That is the major reason why the low-end Mac market (OpenGL 3.x, OpenGL ES, DirectX 9 ports) is relatively safe.

4) Strong integrated graphics from Intel. Apple's bet that integrated graphics from Intel will suffice for most users has paid off. In three years, Haswell adoption will have improved the situation, and devs will release games targeted at the Core iX generation and above.

5) Make the Apple TV a gaming console. Make the internal hardware decent and aimed at games. Position the box below the PS4/Xbox One. Maybe release it as a smart TV. Then titles could be easily ported to OS X and maybe released with scaled-up content. Gabe Newell recently claimed that if Apple went that way they could own the living room, and he's right. The problem is that Apple doesn't care. They only care about traditional media, and they just can't bring themselves to hook a controller to that thing (maybe in the future; recent rumors suggested that). Phil Schiller is not a gaming guy. It's just pathetic every time he plays a game on stage.

Remember that Apple was taken by surprise by how well the iPod Touch sold as a gaming device. They just don't get it. Maybe that has to do with Apple's history. In the nineties they thought Windows succeeded because it ripped off Apple's GUI so well. And yes, that was part of the reason. But the other reason was that Windows was the better gaming box. In the nineties games were not mainstream yet, so gaming was one of those things that nobody would publicly admit to as a hobby, yet everybody did it. And Apple just didn't get that.

***

I am not sure what that is supposed to mean. Apple should have its own version of OpenGL? What's the point of adopting OpenGL then? OpenGL is standardized through the Khronos Group, of which Apple is a member.

Or do you mean that Apple should open-source their OpenGL libraries? Yeah, that's certainly feasible. The libraries for Linux are already open source and they are doing well performance-wise. However, I can't see Apple doing that. The internet will laugh its ass off once they see the code. And there probably aren't as many people willing to improve Apple's drivers as there are for Linux.

Yes, Apple could create their own graphics library, like they did with QuickDraw 3D in the nineties. But that would put them in an even weaker position, since nobody would adopt it. And to a degree they already have that, with the Core technologies (and OS X Mavericks will bring a new one, specifically targeted at 2D games).

Apple could and should do something. They could open the way for Mac gaming by taking OpenGL and tailoring it to OS X and their hardware, making their own "DirectX" in a manner of speaking, like other successful platforms, as you said. Like they did with BSD to create OS X, and the WebKit engine to create Safari. But they didn't.

Huh? Developers should write their own libraries? So if Blizzard doesn't like OpenGL on the Mac, they should just go and write their own? I mean, sure, Activision Blizzard has more money than God right now, but that's a bit of a stretch.

There is, however, some historical precedent. In the late nineties, Loki Software (co-)developed libraries and software to support their Linux dev efforts which are industry standards today: SDL and OpenAL.

I'm saying a developer can do tons of things... tons. If they don't like one of OS X's libraries or APIs that has performance problems, they can write their own. I never said it's all in their hands; Apple could of course do a lot too, but blaming Apple for everything is just a developer who won't do what is required.

Yes, that's OpenGL 4.1. And many devs are now saying that's not even a half-assed effort, since OpenGL 4.2 and 4.3 are what devs are working with right now.

If you look at Apple's WWDC app, they have a scheduled list of labs. In that list is one for today (June 13), from 2-3 pm Pacific, called "What's New in OpenGL for OS X". The notes explicitly mention OpenGL 4.1.

This is AWESOME news!
 
I'd again like to shed some positive light on the situation.

1) There are engines out there that really support the Mac: Unity3D, Unigine and Source. The fate of Mac Source is somewhat up in the air right now until word is given on Half-Life 3, but it's safe to assume that HL3 will be on the Mac day one. That leaves Unity and Unigine, so the stronger they get, the better for the Mac. Unreal is not really a contender since Unreal is a middleware solution. I am also pretty sure that Doom 4 will be on the Mac. The problem is that id won't provide a day-one Mac version themselves, which is a shame.

To be honest, when porting multi-platform titles the engine does not really matter; they can all be ported, and even engines that "support Mac" will still need a lot of work to get them running, as in any AAA game the engine is massively modified and these modifications are in part platform-specific.

That said, it's easier to do a sequel, as you can look at the engine code from the last game in the series for help. :)

2) The stronger and better Cider and similar solutions get, the better. Yes, Cider is ugly. But if you want to see future blockbuster titles (think PS4 and Xbox One titles) on the Mac, that is the only way to go. Cider needs DX11 translation, which will be insane to do.

It's not the only way to go. We have been porting games, including DX11 titles, and we never touch a drop of Cider technology to do it! In fact, I would argue that a custom solution like ours will always give better performance than a Cider port :) I agree DX11 is a lot of work though, a LOT OF WORK! ;)

3) OpenGL ES is the future of OpenGL. The more mobile titles on iOS using OpenGL ES, the better. High-end mobile titles are as rich as low-end Mac titles are right now. Give it three more years and this will become more obvious. That is the major reason why the low-end Mac market (OpenGL 3.x, OpenGL ES, DirectX 9 ports) is relatively safe.

OpenGL ES is not good enough to run AAA games; an "HD" version of a mobile phone game, sure, but not something like Total War or Deus Ex!

4) Strong integrated graphics from Intel. Apple's bet that integrated graphics from Intel will suffice for most users has paid off. In three years, Haswell adoption will have improved the situation, and devs will release games targeted at the Core iX generation and above.

Haswell can support games and is no longer as completely useless as the Intel GMA was, but it's not exactly great for gaming. I suspect that with the next-gen console games, the Intel HD series will likely be the "min spec" and higher settings will require a dedicated graphics solution.

5) Make the Apple TV a gaming console. Make the internal hardware decent and aimed at games. Position the box below the PS4/Xbox One. Maybe release it as a smart TV. Then titles could be easily ported to OS X and maybe released with scaled-up content. Gabe Newell recently claimed that if Apple went that way they could own the living room, and he's right. The problem is that Apple doesn't care. They only care about traditional media, and they just can't bring themselves to hook a controller to that thing (maybe in the future; recent rumors suggested that). Phil Schiller is not a gaming guy. It's just pathetic every time he plays a game on stage.

To make the internal hardware decent for games you would end up with an Apple TV that is the same size, spec and price as a next-gen console. Apple would then need to implement a full-featured, high-performance graphics API, etc. I just don't think this is something they want to get involved in.

They already have an API for controllers for all iOS and Mac OS machines, starting with iOS 7 and 10.9.

Also, Phil Schiller being rubbish at games means nothing. So is Bill Gates; does that mean the Xbox 360 sucks?

Or do you mean that Apple should open-source their OpenGL libraries? Yeah, that's certainly feasible. The libraries for Linux are already open source and they are doing well performance-wise. However, I can't see Apple doing that. The internet will laugh its ass off once they see the code. And there probably aren't as many people willing to improve Apple's drivers as there are for Linux.

It's not really feasible. Parts of it are already open source, and I doubt anyone would laugh their ass off at the code; it's not THAT bad.

Yes, that's OpenGL 4.1. And many devs are now saying that's not even a half-assed effort, since OpenGL 4.2 and 4.3 are what devs are working with right now.

You can't run before you walk. Let's see if they get 4.1 stable and fully working, then we can ask them for 4.2 and 4.3 (both needed for full DX11 support).

Edwin
 
Huh? Developers should write their own libraries? So if Blizzard doesn't like OpenGL on the Mac, they should just go and write their own? I mean, sure, Activision Blizzard has more money than God right now, but that's a bit of a stretch.

There is, however, some historical precedent. In the late nineties, Loki Software (co-)developed libraries and software to support their Linux dev efforts which are industry standards today: SDL and OpenAL.

I'm saying quite the opposite: that Apple should do what Microsoft did with DirectX (it might be very late now, though, not sure). Everyone says that the public OpenGL is a mess, so Apple could use it as a base in order to make a clean, Mac-optimized port that won't suffer from the same performance issues.
Devs will still use OpenGL as they know it, of course.

As for the Cider ports, I tend to disagree. At the user level, it's great, as it gives you one more chance to get those Windows-only games running without Boot Camp. But to encourage game-making companies to use it to create official ports of their games? Hell, no.

Officially porting games to Cider and calling them "Mac games" is cheating (and too cheap and easy a way to become a "Mac-supporting game company", for that matter). If native Mac games suffer in performance compared with Windows versions on the very same hardware, Cider makes things even worse. Plus it gives the companies a false alibi to keep releasing these.
 
Everyone says that the public OpenGL is a mess, so Apple could use it as a base in order to make a clean, Mac-optimized port that won't suffer from the same performance issues.

I would settle for OpenGL 4.1, 4.2 and 4.3 support with performance levels the same as on Linux and Windows. That would be fine.

They don't need to make their own new OpenGL and I would not want them to.

Edwin
 
Ed, do you know if the drivers that ship with 10.9 fix the bug you alluded to with regard to DiRT 2 performance?
 
Ed, do you know if the drivers that ship with 10.9 fix the bug you alluded to with regard to DiRT 2 performance?

Mavericks is under NDA now, so even if I could answer, I can't. What I can promise is that we check all our games in new OS betas, and if any bugs are found we log them so they can (hopefully) be fixed before release.

Edwin
 
When did LLVM first show up as a requirement on OS X? Lion and Mountain Lion. This is why there are so many threads on why Snow Leopard is like 50% to 2x faster than Lion (and Mountain Lion).

Thanks for an interesting post.
About the quote above: it really sounds like the LLVM compiler is quite worthless compared to others. One can wonder why Apple has required it since Lion. There must be some benefits, no?

What about Linux? Is it any better than OS X when it comes to this? If Linux was as big as OS X and the games we see for OS X were available for Linux, would they perform better on Linux?
 
I would settle for OpenGL 4.1, 4.2 and 4.3 support with performance levels the same as on Linux and Windows. That would be fine.

They don't need to make their own new OpenGL and I would not want them to.

Edwin

Agreed.
 
Thanks for an interesting post.
About the quote above: it really sounds like the LLVM compiler is quite worthless compared to others. One can wonder why Apple has required it since Lion. There must be some benefits, no?

It's not as worthless as some say. As of mid-2012, runtime performance has improved to the point where GCC wins some benchmarks and Clang others.

Apple software makes heavy use of Objective-C, but the Objective-C front-end in GCC is a low priority for the current GCC developers. GCC is also GPL version 3 licensed, which requires developers who distribute extensions for (or modified versions of) GCC to make their source code available, whereas LLVM has a BSD-like license which permits including the source into proprietary software.

Twice-as-fast compile times with a sixth of the memory usage were also, I suspect, a big reason Apple went with Clang. It has had a learning curve getting up to speed, but long-term it has a lot more growth potential than the huge and unwieldy GCC compiler.

Just my 2 cents :)

Edwin
 
I can't believe I just read all this. My head is swimming lol :eek:

Then here are some thoughts from a game engine programmer to really get your head spinning ;).

- Optimising game performance, particularly maximising GPU utilisation is a really, really hard development problem. Even on consoles where you can manually manage the GPU heap and write your command buffers directly.

- GPU vendors care about OpenGL performance. Their Windows implementations are often very close to D3D - and may even better it as Valve found out recently. The very lowest levels of code may well be shared in some cases so they can reuse all their GPU specific code rather than writing it twice.

- Apple's GL often trails the vendor's own implementations on Linux and Windows. Closing the gap on AMD & NV would make the single biggest difference to Mac games. The driver folks at Apple, AMD & NV are doing their utmost, certainly they've been great to me when I've needed their help - but right or wrong, games aren't core to OS X for Apple. They focus on control & relative consistency across GL running on their hardware - sometimes it would be nice if those at the very top gave the GL teams elbow room to flex their GPU muscles though.

- I've worked @ Feral with Ed, fixing graphics and performance bugs. I've also met John Mikros & co. @ Blizzard. Both sets of programmers work relentlessly to improve the performance of the libraries they've nurtured for years and the games they ultimately ship. Few potential avenues for optimisation are left on the table.

- Writing a good GL is very, very hard. The hardest part is engineering performant GPU drivers. Worse still if you have to reverse engineer them with no documentation. That's why Mesa/Gallium only reached GL 3 over the last year and doesn't support all the GPUs you might care to use.

- Not having GL 4 hasn't been a problem thus far as most games were still using D3D9 renderers, for which GL 2.1 + extensions was enough. This last year or so has seen the rise of DX11 so the move to GL 4.1 is a relief. Even without compute shaders the specifications are closer than they've been for a while so it should make things much easier.

- It is odd, however, that 10.8 can run some limited OpenGL compute shaders with the right #defines - though it lacks control functions - but 10.9 is only targeting GL 4.1. Emulating them with OpenCL might be possible - but at a performance cost as you have to deal with data and object synchronisation.

- Intel GPUs and integrated GPUs in general aren't going to be target hardware for AAA-games development anytime soon. They will get there if the weakest parts keep getting faster and the discrete parts don't accelerate away - but the consoles will dictate the target desktop hardware. The more direct access to hardware traditionally provided by console APIs and their fixed, dedicated nature as games consoles make it easier to extract the maximum performance so more powerful hardware is required on the desktop to compensate.

- As Ed points out Clang/LLVM is a very good optimising compiler. On the respective trunk versions of Clang & GCC there's not much to pick between them. They may both still produce executables that lag behind those compiled with ICC and MSVC though.

- Mac game ports don't use much Objective-C - therefore the overhead of a selector invocation versus a virtual function call is by-the-by. The original PC source is all C/C++, so the only Objective-C is likely to be for a few MAS-features, plus window & OS event handling. None of that will have any appreciable performance impact compared to the actual game logic & rendering when everything gets compiled.

- Unreal absolutely is a fully-fledged game engine. Studios license it for the cross-platform support, industry leading tools, scripting and renderer. First class Mac support is a positive thing for Mac gamers - but when it might add a mere ~50-100k units sold in a market that considers ~3 million to be abject failure, uptake will probably continue to be patchy.

- PS3 did ship with PSGL - but no-one used it as it didn't have the performance or feature set required. Sony introduced libgcm pretty early on to address that. They've already said that the PS4 will have a similarly low-level API available, which is what most games developers will be using.

- OpenGL vs GL ES. The distinction isn't as great as it was - GL ES keeps getting more features to catch-up with full-fat GL, which is much leaner from 3.2 onwards. The real difference is still the hardware. With WinPhone's relative failure and IE11 supporting WebGL it looks like GL has a bright future regardless.

- Cider & Wine are actually pretty good despite losing some potential optimisations and adding overhead by running the unmodified Windows binaries. They might not be quite as good as recompiling for OS X, but if they can work out the ideal way to drive GL for a specific title the difference shouldn't be huge. That'll be the hard part; it can be difficult to fathom, without the context, why a title makes the calls it does.

- OS X's multithreading model is no better than any other pthreads-based system and frankly, as far as games care, no different to Windows. Games spawn and manage their own threads; most now have their own cross-platform implementations of thread/task pooling similar to GCD. Apple's libraries & tools don't magically do anything more to improve your code for you than those Microsoft supply on Windows. In fact Apple's GL tools are sadly no match for the tools Microsoft provide for D3D.
 
That was a very interesting read. Thank you for taking the time to share all of that with everyone here.
 
It's not really that worthless as some say. As of the middle of 2012, runtime performance has improved to the point where GCC wins some benchmarks and Clang others.

Apple software makes heavy use of Objective-C, but the Objective-C front-end in GCC is a low priority for the current GCC developers. GCC is also GPL version 3 licensed, which requires developers who distribute extensions for (or modified versions of) GCC to make their source code available, whereas LLVM has a BSD-like license which permits including the source into proprietary software.

Twice as fast compilation times with a sixth of the memory usage was also I suspect a bug reason Apple went with Clang, it has had a learning curve getting up to speed but long term it has a lot more growth potential than the huge and unwieldy GCC compiler.

Just my 2 cents :)

Edwin

Thanks for your two cents! :)

And yours, marksatt!

By the way, this:
Apple's libraries & tools don't magically do anything more to improve your code for you than those Microsoft supply on Windows. In fact Apple's GL tools are sadly no match for the tools Microsoft provide for D3D.
sucks. :( Who can go and let Apple know that we don't like this? :)
 
I'd like to put something else out there.

OS X gets most of its performance through highly optimized multithreading, but most Windows games use only two or three threads at most. If you look at any basic tutorial, you will see that all drawing and computing code is on the main thread and network stuff is on other threads. Windows has good, but not excellent, multithreading performance, so there are delays and significant performance drops when heavy tasks are moved to other threads. In short, you get that annoying "hiccup" when you try to create threads like that. Windows makes up for this with a kernel that is highly optimized for single-threaded tasks. Windows also splits tasks across multiple CPU cores when possible, so you get lightning-fast gaming performance.

OS X is fundamentally different. In addition to the previously mentioned complaints, OS X is built so much for fast multitasking that it wants everything broken into dozens of threads. There is little to no performance loss when it creates or switches threads, which is good, but every single process is significantly slower because OS X also manages all of those threads. When a game is made with only two or three threads, OS X chokes because its optimized code is bypassed and it cannot cope without slowing down significantly. In short, OS X games have to be made in a totally different way in order to be as fast, but then they are likely to be faster.

That is just the CPU, though. The problems of graphics cards and drivers still remain. The GPU is stuck on early versions of OpenGL because Apple wants backwards compatibility so badly that they cannot push ahead. My old G5 still has full support for everything Apple is releasing, despite the fact that the hard drive is totally fried. Apple is taking their time so that less up-to-date computers can still run everything, but now they are so far behind that they probably won't be able to catch up. Granted, OS X Mavericks will have full OpenGL 4.1 support, better 3.2 support, and (supposedly) drop the legacy and fixed-functionality versions, but it still isn't going to be enough. What Apple needs to do is optimize in the areas they do not typically bother to look at. Heck, their developer tools already have absolutely crazy optimization tools that will restructure your code and make it faster, and even generate the assembly and machine code for it. They might as well run the OS X code through that and see what happens.

Man, this is so full of baloney. Windows has great multithreading performance, and OS X is not considered so great. Go read developer comments on this topic; I could point you to a few. Here are some regarding music-software performance.

http://cubase.net/phpbb2/viewtopic.php?t=142211

http://www.gearspace.com/board/music-computers/525353-dawbecnh-osx-vs-windows.html

Geekbench tests higher in Windows on Mac hardware.

People should not just spout things off because that's what they happen to think. The game developer above also states the Mac is no better than Windows in this sense.
 
Man, this is so full of baloney. Windows has great multi-threading performance. ... Here are some regarding music software performance.

http://cubase.net/phpbb2/viewtopic.php?t=142211

Really.. Cubase?..... From 2010.... :cool:
 
Man, this is so full of baloney. Windows has great multi-threading performance. And OSX is not considered so great

I would not go that far :) I would still prefer OS X / Unix threading for performance over Windows if I had to choose one. However, most "poor threading performance" issues come down to the programmer's implementation of the application's threading code, not the OS's implementation of the threading API. Threading, contention, and knowing when to spawn, not spawn, and release threads is a very complex art form.

For example, we had an AAA title a few years back that was one of the early games with heavy threading. We used an implementation in C++ in the original code that was not platform-dependent, but it spawned over 100 threads at once and took up hundreds of lines of source code. We used this in the early beta versions of the game. Then, during development, we took the time to do what the console version of the game did and implemented the threading again, this time a lot simpler and in assembly (machine code); we had to do this for both PPC and Intel. :)

The new threading always had exactly 13 threads running, and we got around a 25% performance increase, plus massively reduced memory requirements. And remember, not one line of game code was altered; the only difference was how the game threads were controlled.

This is a perfect example of how "threading performance" can usually be massively improved by writing cleaner, lower-level, lower-impact code. Yes, OSs have limitations on threading, but usually the difference between a good and a bad application in terms of threading comes down to the implementation in the application, not in the OS.

Oh, and some things cannot be threaded, so it is not a silver bullet that solves all issues; it only helps tasks that have multiple things that need doing at once.

Edwin
 
What the initial reply said: poor GPU support, no DirectX. A damn shame, really, since Apple has a killer combination in AirPlay and the Apple TV. If Apple took gaming even remotely seriously, they would have done years ago what the nVidia Shield promises to do this summer (that is, remote gaming by having the PC render into an H.264 stream and pipe it to the handheld wirelessly). Considering how badly Microsoft and Sony are shafting users of their digital marketplaces with the new systems, it seems like a no-brainer way for Apple to get into the console arena without actually making a new console or device.

A decent GPU and proper drivers would go a long way to garnering Mac gaming support.
 
Try a Valve game, the only major game developers worth supporting anymore.

Not for OS X. The Steam client is still lousy and their ports are subpar. I can run Portal 2 on the Boot Camp side of my MBP fully maxed at native resolution. In OS X, the same game requires that I lower the resolution and detail settings to even get close to the same performance.
 