Anyone have optimal CS:GO settings for this card and iMac? I just have the standard Boot Camp drivers. Trying to reduce screen tearing.
 
Depends on the game.
At 5K, older games play fine (a steady 60 fps) at max settings: TR Anniversary, Homeworld Remastered, Portal 2 (if you disable motion blur, which causes a graphical glitch), and even CS:GO (same glitch as Portal, but I don't do matches, I only test with bots). The key is to disable anti-aliasing, since it's unnecessary given the pixel size.
At 1440p, all the games I tried play flawlessly at high settings: Doom (Windows), Dota 2 (macOS), Hitman (both), Alien: Isolation (macOS), and many other Mac games (BioShock Infinite, Borderlands 2, etc.).

I haven't tried 4K much since it would introduce some blur due to pixel interpolation. It doesn't make sense to play at 4K on a 5K display.
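If you want something concrete for the CS:GO tearing question, a starting point in autoexec.cfg would look like this (these are standard Source-engine console variables; treat the values as suggestions to test, not settings verified on this exact card):

```
// autoexec.cfg -- suggested starting points, not verified on the 580
mat_vsync "1"       // sync frames to the display refresh, which stops tearing
mat_antialias "0"   // MSAA off; unnecessary at 5K pixel density
fps_max "0"         // uncapped; let v-sync set the effective frame limit
```

With v-sync on you trade a bit of input latency for tear-free frames, which is usually the right trade outside of competitive matches.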
 
The answer to the original question is No.

Apple doesn't have good OpenGL drivers, so anything that depends on them will simply not run well on the Mac. Additionally, the RX 580 is a midrange GPU designed for coin mining and 1080p gameplay. Anything more than that is beyond the capabilities of the chip.

If you want 4K 60 fps gameplay, you should consider an Nvidia GTX 1080 Ti or at least an RX Vega 64. The latter has been shown to be almost 20 percent faster in DX12 for a few games.
 
Upcoming games won't rely on OpenGL but on Metal.

Upcoming games will mostly depend on DX12 and Vulkan, with the exception of a handful of game engines that are developed to support a large variety of backends, but those come with their own set of problems. No company will choose to spend money developing a game that uses Metal, that's just a masturbatory fantasy, especially when nowadays most games are developed for Windows, PS4, and Xbox. The best-case scenario would be a game that uses some translation mechanism from DX12/Vulkan to Metal, which would add a 15-30 percent overall performance penalty and additional bottlenecks.
 
So what was your post about Apple's OpenGL drivers about, then? o_O WTH do they have to do with games on Windows or consoles?
My point is that future games on macOS will (obviously) rely on Metal, not OpenGL. Most AAA games released for the Mac in recent months use Metal. The major porting houses have completely moved to Metal and won't be using OpenGL.
As a side note, there are more Metal games on macOS than DX12 games on Windows. And if you count iOS, Metal is by far the most used API among the new ones (DX12, Vulkan, Mantle, Metal).
 
I was simply implying that 4K 60 fps on the Mac is currently not really possible in the vast majority of cases. If the game uses OpenGL natively, it's penalized by Apple's poor drivers; if it's using OpenGL through a translation layer, it's doubly penalized by the DX-to-OpenGL conversion and the drivers, although some games can run very well through that mechanism. It just takes effort by the company to make that successful. No matter how you see it, OpenGL on the Mac is dead, and with it all hopes of 4K 60 fps for most games that are available *today*.

Regarding Metal: it's been available for several years, and currently only a handful of games use it natively. Maybe in the future we will see many more games written with Metal in mind, but I'm not an optimist by default, I'm a realist, so if Metal's track record is taken seriously, I don't see any reason why the situation should change. Basically, Metal games will be so few that it won't matter in most cases. Nevertheless, I sincerely hope you are right and I am dead wrong.
 
Metal, especially Metal 2, can power all current games. It's a high-performance, low-overhead API that does most of what the other new APIs do, and it's much easier to use. It's just a matter of developers using it and Macs getting decent GPUs. With eGPU support, the latter is becoming a reality. As for the former, it's already the case. All the developers that matter, Feral, Aspyr, Epic Games, Unity and others, have transitioned to Metal.
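To give a sense of the "much easier to use" claim, here is a minimal Swift sketch of Metal's setup boilerplate (just device, queue, and command buffer; a real renderer obviously needs pipelines and shaders on top of this):

```swift
import Metal

// Ask the system for the default GPU (the Radeon Pro 580 on these iMacs).
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal-capable GPU found")
}

// One command queue is created up front and reused for the app's lifetime.
let queue = device.makeCommandQueue()!

// Each frame, GPU work is encoded into a command buffer and submitted.
let commandBuffer = queue.makeCommandBuffer()!
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

print("Metal is running on \(device.name)")
```

The equivalent Vulkan setup famously takes hundreds of lines of instance, physical-device and queue-family plumbing, which is the ease-of-use gap being described.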
 
Right. Most game developers are just dying to use an obscure and proprietary API to target less than 10 percent of the games market. On top of that, Macs don't have fast GPUs, so now you as a "gamer" have to additionally acquire an eGPU, which is a terrible and hacky solution in itself. That's a lot of hoops for everyone to jump through.
 
Don't take my word for it. Go ask the developers from Aspyr, Feral and others who post on this forum what they think about Metal.

Edit to add: the issue with Mac gaming is, and always will be, an issue of market share, unless Apple opens macOS to PCs. It's not an issue of graphics APIs.
Apple can't use DirectX, since it's proprietary and integrated into Windows. It could have used Vulkan. But -- and this comes straight from a developer's mouth -- Vulkan will be as irrelevant as OpenGL for gaming. The world will continue to use DirectX on the PC. So it's better for Apple to use an API that it controls and that is simpler to use than Vulkan (which, by the way, was released two years after Metal).
 
I have seen some misinformation across this page, so I would like to clarify.

I have used the Late 2015 iMac 27'' 5K with a Radeon 395X 4GB for 1440p gaming at 60 FPS with nearly maxed-out graphics across many different games, including Resident Evil 7, The Witcher 3, Skyrim, Fallout 4, and many Steam titles, on both my Windows Boot Camp partition and my macOS Sierra partition. I can't play at 4K without running into severe performance issues, however. The Radeon Pro 580 8GB is considerably better, so you should still be able to run 1440p perfectly and even approach 4K gaming as well.

Here is a great comparison between the two cards. (sorry for the ad in the middle)

[Image: performance comparison chart for the two cards]


As far as graphics optimization goes: it's important to have V-sync turned on and shadow quality turned down in any game you play, if you run into issues and really want to see better performance.
 
I know this is an old thread, but here is a video of World of Warcraft at 1440p with everything maxed out except AA (MSAA 2x).

It gets around 60 FPS, sometimes slightly above, sometimes slightly below. The video was made during the most intense phase of the endgame (for those who play the game, it was during a Heroism).

Just dropping the shadows option one level or disabling outline mode increases the FPS a lot and keeps it consistently above 60.


 
I have one stupid question: the 27" models have a 5K display. Now, if I play games and turn the in-game resolution down to, let's say, 1080p or 1440p, will games look horribly jagged, like when I drop a game from a monitor's native 1080p down to 720p because my computer can't handle more? Or is this less noticeable at larger resolutions?

Hi,

We've just bought a new iMac, and we're thinking of switching from FCP 7 to Premiere (Creative Cloud). {Haven't used it since we had to switch to Macs and FCP when my partner started her course at the LCC about a decade ago.}

We film in 1080p at 25 fps.

If I use a 4K external monitor with a 60Hz refresh rate for playback and for scrubbing the timeline or the clips in the library/browser, will this be OK?

I'm obviously gonna want my video to cover the whole external monitor, so it will have to scale up from 1080p to 2160p, while playing at 25 fps with the monitor refreshing at 60Hz.

Apparently, the monitor I'm looking at has something called FreeSync, which Wikipedia describes as:

FreeSync is the brand name for an adaptive synchronization technology for LCD displays that support a dynamic refresh rate aimed at reducing screen tearing.

It works with AMD, who I think make these Radeon Pro 580 GPUs. And I'll use a DisplayPort-to-USB-C lead instead of HDMI, which some say helps.

I'm not really sure what this screen tearing is, but does FreeSync automatically set the right refresh rate?

I've just had a look at my MacBook Air plugged into an old 1360x768 TV, and the refresh rate (set to 60Hz) is greyed out. Do you know why this would be?

I'd guess that when editing, we'd want a refresh rate of either 25 or 50Hz, given we shoot in 25p.

I don't want to buy a 4K 60Hz monitor if it will make my 25 fps 1080p video look shîte.

Though some people on the web say that the Mac will only output 30Hz to a second display.

If you can help, I'd really appreciate it.
 
Hi,

Does anyone know, or can estimate, the performance of this card?
Will it run popular games at 4K?

Not with max settings: 25-30 FPS at 2560x1440 max settings in the Heaven benchmark. A barely supported GTX 1080 Ti gets 60-ish FPS under macOS, and even better under Windows.

What you want is a Vega card. The Radeon Pro Vega 64 in the iMac Pro runs about 50-70% faster than the GTX 1080 under macOS.

I have one stupid question: the 27" models have a 5K display. Now, if I play games and turn the in-game resolution down to, let's say, 1080p or 1440p, will games look horribly jagged, like when I drop a game from a monitor's native 1080p down to 720p because my computer can't handle more? Or is this less noticeable at larger resolutions?

It will look bad unless you drop it by exact halves. So 5120x2880 (5K) should be reduced to 2560x1440 (half resolution in each dimension, 1/4 the pixels) or 1280x720 (quarter resolution, 1/16 the pixels).
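To make the halving rule concrete, here is a quick Swift sketch (my own illustration, not anything from Apple) that checks which in-game resolutions map onto the 5K panel without interpolation:

```swift
// Native 5K panel of the 27" iMac.
let native = (w: 5120.0, h: 2880.0)

// Candidate in-game resolutions discussed in this thread.
let candidates: [(Double, Double)] = [(2560, 1440), (1920, 1080), (1280, 720)]

for (w, h) in candidates {
    let scale = native.w / w  // horizontal and vertical scale match for 16:9
    // An integer scale maps each game pixel to an exact square block of
    // panel pixels, so the scaler adds no interpolation blur.
    let isInteger = scale.truncatingRemainder(dividingBy: 1) == 0
    let pixelFraction = (w * h) / (native.w * native.h)
    print("\(Int(w))x\(Int(h)): \(scale)x scale, \(pixelFraction) of native pixels, clean: \(isInteger)")
}
```

Running it shows 2560x1440 (2.0x) and 1280x720 (4.0x) scale cleanly, while 1920x1080 lands on an awkward 2.67x and has to be interpolated, which is why it looks soft on a 5K panel.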
 
FreeSync only works in Windows, NOT macOS.

Premiere works better with an Nvidia GPU.

But FCP 7 barely uses any GPU acceleration, so it will still be an upgrade.

It's correct that DisplayPort works better than HDMI most of the time.

Unless your display is very wrong, there's no need to worry about the refresh rate. The OS should handle it automatically.

A 4K 60Hz monitor should NOT make a 1080p 25 FPS video look worse.
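On the 25p-on-60Hz worry specifically, the arithmetic behind judder is simple. Here's a small Swift sketch (my own illustration, not an Apple behavior guarantee) of how long each frame is held at different refresh rates:

```swift
// Refreshes per video frame when playing 25 fps footage on displays
// running at various refresh rates.
let frameRate = 25.0

for refreshRate in [25.0, 50.0, 60.0] {
    let refreshesPerFrame = refreshRate / frameRate
    // An integer ratio holds every frame for the same number of refreshes
    // (smooth); a fractional one forces an uneven 2-3-2-3 hold pattern,
    // which shows up as mild judder on pans.
    let evenCadence = refreshesPerFrame.truncatingRemainder(dividingBy: 1) == 0
    print("\(Int(refreshRate))Hz: \(refreshesPerFrame) refreshes/frame, even cadence: \(evenCadence)")
}
```

At 50Hz each frame sits for exactly 2 refreshes; at 60Hz the 2.4 ratio means frames alternate between 2 and 3 refreshes, which most people never notice on an editing monitor.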
 
FCP X works a lot better on the iMac than Premiere, but before buying anything I would give DaVinci Resolve a try, since it's free.
 
But FCP 7 barely uses any GPU acceleration, so it will still be an upgrade.

Not true at all, like, at all!!! It can even take advantage of two GPUs.
Try running optical flow without a GPU and with a GPU and you will see how HUGE the difference is; the same goes for a lot of background rendering tasks.

It is true, however, that Premiere takes almost no advantage. Well, it does not have background rendering at all, but even then...

 
I said FCP 7, not FCP X
 
I know this is an old topic, but I'm just wondering:

I have heard that the i7 (and I guess the high-end i5 is the same, as they are both 91W chips) gets really warm and revs the fans up with certain tasks.

Have any of you guys playing recent games (Far Cry 5 and such) experienced this type of issue?
 