But there's more. OpenGL is practically dead. v4.x is a mess, say the devs. It's too bloated, and the long-expected overhaul never happened.

But it has to happen, or we will see a notable setback in Mac OS X gaming.

Currently, the soon-to-be-obsolete gaming consoles are technically equivalent to DirectX 9 on Windows, and DirectX 9 is roughly equivalent to OpenGL 2.x. That means it is technically feasible to port titles to Mac OS X that originated on any of the consoles, or that are based on Windows games with a DirectX 9 rendering path.

But this will change soon. The new consoles are based on DirectX 11-capable GPUs, and there are plenty of DirectX 11-capable GPUs out there in Windows machines, so there is no reason to add a DirectX 9 render path to future games.
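
To make the render-path point concrete, here is a minimal C sketch of how a port might pick a feature level at startup from the string that `glGetString(GL_VERSION)` returns. The version strings and the enum names are illustrative only, not from any real engine or driver:

```c
#include <stdio.h>

/* Feature levels roughly matching the thread's comparison:
   GL 2.x ~ DirectX 9 era, GL 4.x ~ DirectX 11 era. */
typedef enum { PATH_GL2 = 0, PATH_GL3 = 1, PATH_GL4 = 2 } render_path;

/* Parse a version string such as "2.1 NVIDIA-8.12.47" (the format
   glGetString(GL_VERSION) uses: "major.minor vendor-specific"). */
render_path choose_render_path(const char *gl_version)
{
    int major = 0, minor = 0;
    if (sscanf(gl_version, "%d.%d", &major, &minor) != 2)
        return PATH_GL2;              /* be conservative on parse failure */
    if (major >= 4) return PATH_GL4;  /* tessellation etc. available */
    if (major >= 3) return PATH_GL3;
    return PATH_GL2;                  /* DX9-era feature level */
}
```

A real port would of course also check individual extensions, but the basic idea is that if the platform only ever reports 2.x, the modern render path never gets exercised.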

Unless Apple finally decides that "the world's most advanced desktop operating system" should advance at least to OpenGL 4.0 or higher, the current state of Mac gaming — which offers quite a lot of decent titles — will suffer a long drought and a descent into shallow, ugly social-gaming boredom.

[Update]
Just a few days later, Apple went public with the OS X 10.9 announcement: it will support OpenGL 4.x. Nice!
 
You can't do much as a game developer porting a game from Windows to OS X when the graphics drivers, or even the available API (DirectX vs. OpenGL), are worse. Their hands are tied.
That is really just not true... there are tons of things you can do. Are all non-DirectX platforms horrible? Only Windows and the Xbox use DirectX, nothing else... right? Microsoft refuses to make, or even license out, DirectX for any other platform.
 

I guess you mean the non-Microsoft consoles. But I was talking from the developer's point of view, not Apple's. What can a game developer do if the platform won't perform well? Heck, even Blizzard's games have a huge gap between the Windows and OS X versions (check my first post in this thread; fps literally doubled on Windows for Diablo 3, same hardware). Same with WoW. Check also the numerous performance-related complaints about The Witcher 2, and so on.

Apple could and should do something. They could open the way for Mac gaming by taking OpenGL and tailoring it to OS X and their hardware, making their own "DirectX" in a manner of speaking, like the other successful platforms you mentioned. Like they did with BSD to create OS X, or the WebKit engine to create Safari. But they didn't. I cannot blame game developers for "bad" or unoptimized code, as they are the last link in the chain.

I'm not saying you HAVE to use DirectX to get a successful gaming platform. I'm just saying that, as things are now, game developers cannot do anything more to close the gap; the ball is in Apple's court.
 

I'm saying a developer can do tons of things... tons. If they don't like one of OS X's libraries or APIs that has performance problems, they can write their own. I never said it's all in their hands; Apple could of course do a lot too, but blaming Apple for everything is just a developer who won't do what is required.

Blizzard performance differences...? People act like Blizzard are perfect since they've supported Apple so long. I'm sure there is tons of stuff they could optimize and improve in their code. As for other titles that are ported over after the fact: developers spend years making a Windows version based on DirectX, then make a native Mac port, and they neither do it from scratch nor spend the time to make the engine changes properly.

Another thing to take into account for graphics performance is Nvidia and AMD... a lot of their cards run much worse under OpenGL than under DirectX, and there is plenty they could do about it, but they don't.
 
Driver support, among other things, is definitely a big issue. I've run the exact same games at twice the frame rate in my Windows partition compared to OS X. Sometimes the better driver support on the Windows side even allows higher graphics settings while maintaining a higher frame rate, such as 8x anti-aliasing instead of 4x. On performance alone, there is really no point in playing the OS X version of a game if a Windows version is available. :\
 
Gabe Newell, during an interview from 2007:


Kikizo: People keep asking you about a potential Macintosh version, and your stance is that this is a strictly Windows project...?

Gabe: Well, we tried to have a conversation with Apple for several years, and they never seemed to... well, we have this pattern with Apple, where we meet with them, people there go "wow, gaming is incredibly important, we should do something with gaming". And then we'll say, "OK, here are three things you could do to make that better", and then they say OK, and then we never see them again. And then a year later, a new group of people show up, who apparently have no idea that the last group of people were there, and never follow through on anything. So, they seem to think that they want to do gaming, but there's never any follow through on any of the things they say they're going to do. That makes it hard to be excited about doing games for their platforms.


Since then, Steam has appeared on the Mac, but the comments are still interesting, and they somewhat support the speculation that there is a certain level of incompetence within the company when it comes to forward-thinking software.

From the one step forward, two steps back of things like iMovie, to their stagnating iOS, it really can't come as a surprise that they have no clue.


Nobody can match the build quality of their computers, or the slickness of the OS. But what a cost it comes at...
 
DirectX is what makes Windows a better gaming platform.

DirectX is owned by Microsoft and will never be ported to OS X (Microsoft has zero interest in making OS X run better). Apple could write their own API, but nobody would design games using it.

OpenGL is all we will ever have and I doubt you will see improvements there.

Use Boot Camp / Windows for gaming, or live with what you have under OS X. Nothing is going to change.

I recently posted in the Mac Pro forum about my disappointment with my purchase of a Radeon 7950 Mac Edition video card for my Mac Pro 3,1.

The Radeon 7950 Mac Edition officially supports the Mac Pro (2010 or later). Numerous people have stated that it will run in an older Mac Pro, but some have reported performance issues and bugs.

Can you return it and get the GeForce GTX 680 instead? It supports the Mac Pro 3,1.
 

From a strictly technical point of view, yes, you're right that a dev can do tons of things, even build on their own anything the API doesn't offer or doesn't do well (although they can't do much about a poorly made graphics driver).

But that point of view is not very realistic for the market. A game developer is not going to invest all that time and money to reinvent the wheel on the Mac side, working around all of OpenGL's bad sides, especially if they don't have to do the same on the Windows side. We cannot expect gaming companies to love the Mac that much. It's Apple that has to make the platform look more appealing to developers.
 
I don't know if it is a hardware problem on the PC or a Windows software problem, or if this is even a good comparison, but my MacBook Pro with an INTEL FREAKING HD 4000 can play Minecraft at double the frame rate in OS X that it can in Windows: the same as a gaming rig with a quad-core i7 and an Nvidia GT 670, and twice the frame rate of another with an ATI 5770 and an Intel Core 2 Quad. This shows that something in OS X is clearly better.
 

The 680 performs worse in the 3,1 Mac Pro than the 7950; no such problems under Windows...

Running Dirt 2 in Windows, my older 3,1 Mac Pro outperforms much newer Mac Pros with 680s or 7950s running OS X by a good 20 frames per second, according to the Barefeats benchmarks.

I've seen posts about "good" Mac ports such as Borderlands 2, and while Borderlands 2 runs acceptably on my Mac Pro under OS X, I found that on the same hardware running Windows I can turn the settings up much higher and still get a better framerate.
 
Under OS X I had to use a mix of Medium/Low/Off at 1680x1050 to get a playable framerate, and even then on some tracks the framerate would plummet to a barely acceptable level.

So where do these problems lie in OS X? Apple's OpenGL framework? The graphics vendors' drivers? Or are the game ports to OS X simply unoptimized?
Thoughts?

You are hitting a performance bug introduced by AMD in their graphics drivers; once it is fixed by AMD/Apple, the FPS will go back up close to PC speeds. This bug was not present when we launched the game but was introduced later in an OS update.

If you would like to email our support, we can note you down to be contacted when the bug is fixed. If the bug is not fixed (or is marked as won't-fix), we will investigate the issue again and see if we can work around the problem. If memory serves, it is one of the settings that has the issue, not the entire engine, so if you disable the right graphical option (I think something like DOF) the FPS drops go away, as they are in a specific part of the drivers.

Cheers,

Edwin
 

Thanks for the insight, Ed. It's a pity Apple doesn't allow graphics driver updates separate from OS X updates, since what you are saying means the bug has remained through at least two point releases of OS X?
 
I recently posted in the Mac Pro forum about my disappointment with my purchase of a Radeon 7950 Mac Edition video card for my Mac Pro 3,1.

There are definite improvements compared to the 8800GT that was previously in the Pro, but certainly not the night-and-day boost I was expecting.

I will focus on Dirt 2 as the point of comparison. Under OS X I had to use a mix of Medium/Low/Off at 1680x1050 to get a playable framerate, and even then on some tracks the framerate would plummet to a barely acceptable level. Other games' performance, such as The Witcher 2 and WoW, was also underwhelming.

So I was kind of fed up and decided to install Windows 7 on my Pro and see if that fared any better.

I was expecting maybe slightly better performance under Windows 7 but the performance delta is huge.

Dirt 2 under Windows 7, with everything turned up as high as it can go, post-processing effects higher than OS X allows, and 8x AA on top, was absolutely flawless. It never dropped below 60 FPS, and without vsync my framerates averaged around 100+. Other games showed similar improvements.

So where do these problems lie in OS X? Apple's OpenGL framework? The graphics vendors' drivers? Or are the game ports to OS X simply unoptimized?

It's really disappointing to me, as I would prefer not to run Windows at all, but sadly it seems to be a necessity for decent gaming performance.

Thoughts?

Poorly coded ports and a really old OpenGL stack in OS X.
 
I know this is old but nonetheless relevant, and a good chuckle if nothing else. There was an article a few years ago that somewhat summed it up; Google "Mac lags Windows in gaming performance, excels at stability". I particularly like the part saying Portal was 5x more stable on Macs than on Windows, using the metric of minutes played per crash.

I wonder what the current data on stability of Macs and Windows shows. Any game required to be played online is gathering data and phoning home.
:D
 
Not sure if it has been mentioned elsewhere on these forums, but I watched a video of the WWDC keynote today on the Apple site, and there was a mention of OpenGL 4 coming in OS X Mavericks. There was no discussion about it, just a passing mention while talking about performance improvements. Sounds like good news to me. :D
 

Sure, it's good... for those who take advantage of it in the future.
 

We usually fix at least a few PC stability issues while developing the Mac version so if I had to guess I suspect that fact is still true based off the games I have worked on anyway. :)

Edwin
 
Games will be able to use tessellation on OS X. I'm not sure there are many OS X games whose Windows versions use tessellation. The Frostbite engine has it and is supposedly being ported to the Mac. Same for Tomb Raider. Not sure about BioShock Infinite.
 
If you look at Apple's WWDC app, they have a scheduled list of labs. In that list is one for today (June 13), from 2-3 pm Pacific, called "What's New in OpenGL for OS X". The notes explicitly mention OpenGL 4.1.

This is AWESOME news!
 
I'd like to put something else out there.

OS X gets most of its performance through highly optimized multithreading, but most Windows games use only two or three threads at most. If you look at any basic tutorial, you will see that all the drawing and computing code is on the main thread and the networking is on other threads. Windows has good, but not excellent, multithreading performance, so there are delays and significant performance drops when heavy tasks are moved to other threads; in short, you get that annoying "hiccup" when you try to create threads like that. Windows makes up for this with a kernel that is highly optimized for single-threaded tasks. Windows also splits tasks across CPU cores when possible, so you get lightning-fast gaming performance.

OS X is fundamentally different. In addition to the previously mentioned complaints, OS X is built so much for fast multitasking that it wants everything broken into dozens of threads. There is little to no performance loss when it creates or switches threads, which is good, but every single process is somewhat slower because OS X also manages all of those threads. When a game is made with only two or three threads, OS X chokes, because its optimized code is bypassed and it cannot cope without slowing down significantly. In short, OS X games have to be made in a totally different way to be as fast, but then they are likely to be faster.
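
The "break the work into more threads" idea above can be sketched in plain C with POSIX threads. The thread count and the toy workload (summing an array in slices) are placeholders for illustration, not anything from a real game engine:

```c
#include <pthread.h>
#include <stddef.h>

#define NTHREADS 4  /* placeholder worker count */

typedef struct { const int *data; size_t begin, end; long sum; } slice;

/* Each worker sums its own slice; no shared mutable state, no locks. */
static void *worker(void *arg)
{
    slice *s = arg;
    s->sum = 0;
    for (size_t i = s->begin; i < s->end; i++)
        s->sum += s->data[i];
    return NULL;
}

/* Split one frame's worth of "work" across NTHREADS instead of doing
   it all on the main thread, then join and combine the results. */
long parallel_sum(const int *data, size_t n)
{
    pthread_t tid[NTHREADS];
    slice parts[NTHREADS];
    size_t chunk = n / NTHREADS;

    for (int t = 0; t < NTHREADS; t++) {
        parts[t].data  = data;
        parts[t].begin = (size_t)t * chunk;
        parts[t].end   = (t == NTHREADS - 1) ? n : (size_t)(t + 1) * chunk;
        pthread_create(&tid[t], NULL, worker, &parts[t]);
    }
    long total = 0;
    for (int t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += parts[t].sum;
    }
    return total;
}
```

Whether this wins or loses on a given OS depends on exactly the trade-off the post describes: thread creation and scheduling overhead versus the parallelism gained.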

That is just the CPU, though. The problems of graphics cards and drivers remain. The GPU is stuck on early versions of OpenGL because Apple wants backwards compatibility so badly that they cannot push ahead. My old G5 still has full support for everything Apple is releasing, despite the fact that its hard drive is totally fried. Apple is taking its time so that less up-to-date computers can still run everything, but now they are so far behind that they probably won't be able to catch up. Granted, OS X Mavericks will have full OpenGL 4.1 support, better 3.2 support, and will (supposedly) drop the legacy fixed-function versions, but it still isn't going to be enough. What Apple needs to do is optimize in the areas they do not typically bother to look at. Heck, their developer tools already have absolutely crazy optimization features that will restructure your code and make it faster, and even generate the assembly and machine code for it. They might as well run the OS X code through that and see what happens.
 
DirectX is the crown jewel of Windows. It smokes OpenGL for gameplay all around... and Apple's implementation of OpenGL isn't superb.
 
Objective-C is slower than C and C++. Period. OS X uses Objective-C (especially its inherently slow message passing, versus procedural calls in C/C++). Windows uses C and C++, with many portions of the kernel in assembly (machine code on the metal is the fastest you can get). Yes, the simple, MOST basic task of calling a function is slower on OS X because of Objective-C. How can they fix this? Start porting to assembly and C. Get rid of Objective-C. It is not the future (especially when it drags along archaic managed memory-management concepts). The future is low power and high performance on mobile devices; managed runtimes need lots of RAM and waste power. This is the reason XNA was dropped on Windows Phone and Windows 8. C# (a managed language) is a dying breed. Even Android game developers bypass Java and code everything in C/C++, and not many make games on it because the Java layer makes it horribly complicated. Google may drop the Java requirement in Android in the future to save themselves.

OS X uses the LLVM compiler, which will hog your RAM (2x at least) to get optimizations similar to (or worse than) the basic GNU compiler. It needs all that RAM to look for optimizations. Unfortunately, it hogs so many resources that half of your stuff gets dumped to the hard drive, and the swapping back and forth negates any benefit you "may" get. I mention the compiler because OS X has these "convenience" vs. "performance" trade-offs where it will leave graphics driver code in an uncompiled state, ready to be run (interpreted) on a virtual-machine backend of LLVM. Anything interpreted needs more than 2x the memory and 2x the CPU cycles to do the same thing as non-interpreted languages. Anything requiring a virtual machine means slow (like Java, C#, Python, etc.). Even Objective-C carries some of this (garbage-collection concepts that require runtime support, even if you manually release the memory). When did LLVM first show up as a requirement in OS X? Lion and Mountain Lion. This is why there are so many threads about Snow Leopard being 50% to 2x faster than Lion (and Mountain Lion).

The Feral comment on DOF, I think, refers to "depth of field": the post-processing that blurs things that are too far away or too close to be in focus. Even with that off, Windows will run faster than OS X. It comes back to square one: performance in the layer between the game and the metal (OS X).

So what can Apple do? Move towards C/C++ and assembly, and start getting rid of interpreted stuff and virtual machines in all their products. Getting off Objective-C might be hard, but it is a superset of C, so it is not difficult to move in that direction.

I wanted to chime in because people think better graphics drivers, or OpenGL versus DirectX, will fix their problems. It won't when the language and the operating system are the slowest layer. Even with OpenGL vs. DirectX 9, Windows games run faster than the same OS X games (so it's not a DirectX 11 thing). Even comparing Snow Leopard to Windows, OS X was slower (Objective-C), but back then it was only 20% slower. Now, with interpreted and virtual-machine stuff everywhere (from LLVM) in Lion and Mountain Lion, you are getting 200% slower in games.
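
The direct-call vs. message-send contrast in the post above can be illustrated in plain C. This is only a rough analogy: an Objective-C message send first looks the method up by selector at runtime, while a C call jumps to a known address. The tiny "method table" and selector names below are invented for illustration, not Objective-C's actual runtime:

```c
#include <string.h>
#include <stddef.h>

static int add_one(int x)   { return x + 1; }
static int times_two(int x) { return x * 2; }

typedef int (*method_imp)(int);
typedef struct { const char *name; method_imp imp; } method_entry;

/* Hypothetical "class": maps selector names to implementations. */
static const method_entry table[] = {
    { "addOne",   add_one },
    { "timesTwo", times_two },
};

/* "Message send": look the selector up by name, then call indirectly.
   The lookup plus the indirect call is the extra cost a direct C call
   (e.g. add_one(41)) does not pay. */
int send_message(const char *selector, int arg)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].name, selector) == 0)
            return table[i].imp(arg);
    return 0; /* unknown selector */
}
```

In fairness, the real Objective-C runtime caches these lookups aggressively, so the per-call overhead is much smaller than this linear search suggests; the sketch only shows where the overhead comes from.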
 

Very interesting and informative....
 