
iMac Pro as a gaming rig


  • Total voters: 128
  • Poll closed
I think you’re missing the point by several parsecs.

No Mac “fanbois” who could actually afford an iMac Pro would buy one for gaming.

It’s more a theoretical question: what would an iMac Pro be like for gaming if you had one in front of you?

I’m interested to find that out as well, though I’m not insane enough to expect it to become my gaming machine.
 
Apple has been embracing VR since their keynote last summer, and it's now in full swing with support in the latest Final Cut Pro X update. Why is everyone still saying Macs are not synonymous with gaming? There is even developer testing with the eGPU enclosure, HTC Vive with Steam, Nvidia Mac driver support plus GeForce NOW, and the desktops Apple has released this year all have top-of-the-line graphics from AMD.

I just really wish Apple would revert back to Nvidia, but it seems they are still under contract or getting great deals on custom-built GPUs from AMD.
 
I just really wish Apple would revert back to Nvidia, but it seems they are still under contract or getting great deals on custom-built GPUs from AMD.

Other than CUDA, what benefit would Apple gain by going back to Nvidia? Just curious.
 
Other than CUDA, what benefit would Apple gain by going back to Nvidia? Just curious.
Besides the CUDA benefits in Adobe applications, I just personally prefer the performance, drivers, and TDP values Nvidia offers on both Windows and macOS.
 
Besides the CUDA benefits in Adobe applications, I just personally prefer the performance, drivers, and TDP values Nvidia offers on both Windows and macOS.

Oh. Those kind of benefits.

It's hard to say why Apple hasn't used Nvidia chips for a while now (the last model I remember was the GT 750M, found in the Retina MacBook Pro before the M370X). Maybe Apple likes the performance model of AMD silicon, especially in OpenCL applications? I think it's just that AMD offers a better price-to-performance ratio.

It's weird if you think about it. Nvidia dominates the consumer desktop and notebook space, while AMD's graphics technologies are found in every major console since 2013, every Mac since 2015, and now in chips being developed in collaboration with Intel.
 
I know that in the past, Nvidia had screen-tearing issues, which are horrible for movie development. I remember playing games through the early 2000s with constant screen tearing. When I went ATI, I didn't see that, though at the cost of performance.

Nvidia also probably left a bad taste in Apple's mouth with the over-temping of their cards. They used to target 80°C as the max temperature, then around the 7000-8000 series they went to 100°C, ignoring the rule of thumb that 80°C = a "lifetime" of roughly 10 years, cut in half for every 5°C added. That's not as much of a problem nowadays, but it did cause massive failures in the 8000 and, I think, 9000 series.
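That rule of thumb (10-year lifetime at 80°C, halved for every 5°C over) works out to a quick calculation; the numbers below are just the rule-of-thumb values from the post, not measured data:

```python
def expected_lifetime_years(temp_c: float, base_temp_c: float = 80.0,
                            base_life_years: float = 10.0) -> float:
    """Rule of thumb: base_life_years at base_temp_c, halved for every 5 °C above it."""
    return base_life_years * 2 ** ((base_temp_c - temp_c) / 5.0)

print(expected_lifetime_years(80))   # 10.0 years at the old 80 °C target
print(expected_lifetime_years(100))  # 10 * 2**-4 = 0.625 years, i.e. ~7.5 months
```

By that crude model, moving the target from 80°C to 100°C cuts the expected lifetime by a factor of 16.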
 
I think the bigger question is whether Apple will deliver with the next regular iMac release, as far as gaming capabilities go. They could really do great things if they wanted to, e.g. upgrading the fan system (from the iMac Pro) and removing the hard drive, which would facilitate using better GPUs and the latest 6-core line of CPUs from Intel. Knowing Apple I wouldn't get my hopes up too much, but you never know...
 
I think the bigger question is whether Apple will deliver with the next regular iMac release, as far as gaming capabilities go. They could really do great things if they wanted to, e.g. upgrading the fan system (from the iMac Pro) and removing the hard drive, which would facilitate using better GPUs and the latest 6-core line of CPUs from Intel. Knowing Apple I wouldn't get my hopes up too much, but you never know...

Mechanical hard drives should have been discontinued in the iMac years ago, just as they were starting with the 2012 Retina MacBook Pro models. External storage these days is even easier with Thunderbolt technology and such.
 
It’s more a theoretical question: what would an iMac Pro be like for gaming if you had one in front of you.

Looking at the Geekbench OpenGL Compute scores, the Vega 64 in the iMac Pro is slightly slower than a PCI form factor Vega 64 connected to a 5K iMac via an eGPU box over TB3. An unsubstantiated claim states that the Vega 64 GPU in the iMac Pro is downclocked by around 15% compared to the PCI-card version, so this could be why the score is lower.

As such, you're arguably better off with an iMac 5K and the Vega 64 in an eGPU case and the combination should be a fair bit cheaper than an iMac Pro with the Vega 64.
 
Looking at the Geekbench OpenGL Compute scores, the Vega 64 in the iMac Pro is slightly slower than a PCI form factor Vega 64 connected to a 5K iMac via an eGPU box over TB3. An unsubstantiated claim states that the Vega 64 GPU in the iMac Pro is downclocked by around 15% compared to the PCI-card version, so this could be why the score is lower.

As such, you're arguably better off with an iMac 5K and the Vega 64 in an eGPU case and the combination should be a fair bit cheaper than an iMac Pro with the Vega 64.
Interesting. This is a bit OT, but do you know how smooth the gaming experience is with these TB3 eGPUs and iMacs/PCs? Is the overall gaming experience similar to internal GPUs (apart from a reduction in overall performance)? If the TB3 system introduces stutter and uneven frame rates, that would make the perceived experience inferior to internal GPUs. Not saying this is the case, but I've been wondering about this.
 
Looking at the Geekbench OpenGL Compute scores, the Vega 64 in the iMac Pro is slightly slower than a PCI form factor Vega 64 connected to a 5K iMac via an eGPU box over TB3. An unsubstantiated claim states that the Vega 64 GPU in the iMac Pro is downclocked by around 15% compared to the PCI-card version, so this could be why the score is lower.

As such, you're arguably better off with an iMac 5K and the Vega 64 in an eGPU case and the combination should be a fair bit cheaper than an iMac Pro with the Vega 64.

Every iMac GPU is downclocked for thermal reasons. The same goes for the MacBook Pros with discrete graphics. The Radeon RX 560 is clocked at 1263 MHz whereas the Radeon Pro 560 is at 907 MHz, both the same die and architecture. When you hook up an external GPU you lose roughly ~15% of total performance right off the bat.
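For a rough sense of scale, the clocks quoted above can be turned into a percentage; this is just arithmetic on the numbers in the post, not a benchmark:

```python
rx_560_mhz = 1263.0   # desktop Radeon RX 560 clock, as quoted above
pro_560_mhz = 907.0   # Radeon Pro 560 clock in the MacBook Pro, as quoted above

downclock_pct = (rx_560_mhz - pro_560_mhz) / rx_560_mhz * 100
print(f"Radeon Pro 560 is clocked ~{downclock_pct:.0f}% lower than the RX 560")
```

Note that this ~28% clock gap between the mobile and desktop parts is a separate number from the ~15% performance loss attributed to the eGPU connection.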
 
Theoretically speaking, how much performance would one lose in games because of the slower-clocked Xeon CPUs? The best case scenario is a whole 1 GHz lower base clock compared to the high-end iMac (from a 4.2 GHz i7 to a 3.2 GHz 8-core Xeon). (There's only a 300 MHz difference in turbo boost, but the Xeon only has Turbo Boost 2.0.) Not saying I want to buy an iMac Pro as a gaming PC, just curious...
 
Theoretically speaking, how much performance would one lose in games because of the slower-clocked Xeon CPUs? The best case scenario is a whole 1 GHz lower base clock compared to the high-end iMac (from a 4.2 GHz i7 to a 3.2 GHz 8-core Xeon). Not saying I want to buy an iMac Pro as a gaming PC, just curious...
I think it's more about the AMD 580 vs. that Vega GPU.
 
The cores, in the main, won't be used.
Under DirectX 11 you top out around the quad-core range; extra cores bring little to no performance gains, at least within games. This was set to change under DirectX 12, but I've only seen a few examples of games using more cores.

Intel has flipped its mainstream units to hex-core (from quad), so I would hope to see a move upwards in the future, but it will depend on the game engine.

This is a good read if you are interested:
https://www.pcworld.com/article/303...es-you-really-need-for-directx-12-gaming.html
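The diminishing returns from extra cores described above are often illustrated with Amdahl's law: if only a fraction p of the per-frame work parallelizes, the speedup on n cores is capped at 1/(1-p). A minimal sketch (the p value here is an illustrative assumption, not a measured figure for any real engine):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: speedup on n cores when a fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# Suppose half of a game's frame time parallelizes (illustrative assumption):
for cores in (2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.5, cores), 2))
# Going from 4 to 16 cores only moves the speedup from 1.6x to ~1.88x.
```

With p fixed, quadrupling the core count past four buys almost nothing, which is why clock speed tends to matter more than core count for game workloads.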
 
The cores, in the main, won't be used.
Under DirectX 11 you top out around the quad-core range; extra cores bring little to no performance gains, at least within games. This was set to change under DirectX 12, but I've only seen a few examples of games using more cores.

Intel has flipped its mainstream units to hex-core (from quad), so I would hope to see a move upwards in the future, but it will depend on the game engine.

This is a good read if you are interested:
https://www.pcworld.com/article/303...es-you-really-need-for-directx-12-gaming.html
Thanks for the link. Yes, I know about the lack of performance beyond quad cores. That's why I was curious about this.
I assume there would be a fairly noticeable penalty in gaming performance compared to using an iMac with a 4.2 GHz i7 CPU and an eGPU with a Vega 64, right?
 
I assume there would be a fairly noticeable penalty in gaming performance compared to using an iMac with a 4.2 GHz i7 CPU and an eGPU with a Vega 64, right?

I don’t think we know for sure at this time, but I assume it’s not that noticeable.
An eGPU will not perform at its max either, and it might be similar to the internal GPU in the iMac Pro. With a more modern architecture, e.g. better thermals and a larger CPU cache, the iMac Pro might perform the same or better under constant load, and not just in a short benchmark at peak.
 
The CPU is used for the physics in the game, so how the game engine uses the cores/threads will affect what you see. Think of a core as a mouth and a thread as a hand: you can only eat as much data as you have mouths, and the hands (threads) reduce the wait time to get food into your mouth (sorry if you understand all that already). The CPU drives the frame rates; the GPU handles polygons and textures. The Xeon W has a much higher memory bandwidth, which will help if the game is memory-constrained. Other than that, the i7 has a higher operating frequency and is (a lot) more efficient, so I guess what I'm saying is you're on a path of diminishing returns: the W's positives give you no real benefits for this workflow. I'm not going to say it will be worse, as I've not tested one at this stage.

Higher clock speed will always win in games, and the cores will be limited by the engine.
 
You can actually get lower performance from more cores because of the way Intel's ring bus is used during parallel operations. On the new chips I believe they now have a mesh bus, but regardless, if the game engine is not written to take advantage of the cores, you won't see anything and it will come down to the frequency of the CPU.
 
You can actually get lower performance from more cores because of the way Intel's ring bus is used during parallel operations. On the new chips I believe they now have a mesh bus, but regardless, if the game engine is not written to take advantage of the cores, you won't see anything and it will come down to the frequency of the CPU.
Exactly; that's why a lot of us reference the single thread as being important. The GPU usually makes up for it, though.
 
I'd certainly be interested to see what it can do. Obviously I'd never buy it as a gaming rig (which to me implies gaming as a first priority), but if I were to get one for normal work I'd love for it to be able to competently play some games when I want a break. It takes far too long to get to Bootcamp and I really can't be bothered enough with games to have a separate computer or (worse still) a console for them. The only way I'm going to play anything is if it's a double-click away on my main computer.

My retina iMac (max spec of the first configuration) is tolerable for quite a lot of things, but a bit more oomph would be good, particularly if it carried across through Parallels and let me play some PC games too.
 
- The 5K screen has big input lag; poor for gaming
- No FreeSync support
- Potential problems with GPU drivers (a crippled version of the AMD desktop card)
- Potential overheating
- No GPU or CPU upgradability
- AMD has a poor history with soldered GPUs
 
- The 5K screen has big input lag; poor for gaming
- No FreeSync support
- Potential problems with GPU drivers (a crippled version of the AMD desktop card)
- Potential overheating
- No GPU or CPU upgradability
- AMD has a poor history with soldered GPUs
What’s the input lag compared to the older 2560x1440 iMac screens?
 