That was insinuated due to the 95W cap on the i9 and it only running all 8 cores at 3.7 GHz, while the i5 could run all 6 at 4.2 GHz.

It's far from worse than the i5. It can run circles around the i5 with its 8 cores / 16 threads even if the frequency is lower, so if you want power, the i9 is the race car.
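
For a very rough back-of-the-envelope comparison, here's a sketch in Swift using only the clock figures quoted above; real sustained clocks depend on load, cooling and how the 95W cap is enforced, so treat it as an illustration, not a measurement.

Code:
import Foundation

// Back-of-envelope aggregate throughput, using the all-core clocks quoted above.
// Assumption (not a measurement): the i5-9600K holds 6 cores at 4.2 GHz and the
// i9-9900K holds 8 cores at 3.7 GHz under the 95 W cap; hyperthreading ignored.
let i5 = (cores: 6.0, ghz: 4.2)
let i9 = (cores: 8.0, ghz: 3.7)

let i5Throughput = i5.cores * i5.ghz   // 25.2 "core-GHz"
let i9Throughput = i9.cores * i9.ghz   // 29.6 "core-GHz"
print(String(format: "i9 all-core advantage: about %.0f%%",
             (i9Throughput / i5Throughput - 1) * 100))   // roughly 17%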

The fly in the soup, however, is that not all apps can really utilize more than 4 or 6 cores (or threads), let alone 16. So depending on what software you're running, all that 'race car' power may be left on the table anyway.

https://macperformanceguide.com/blog/2017/20171223_2311-why-more-cores-often-dont-help.html

"going beyond 6 CPU cores, many factors conspire to make further improvements incremental."
 

And the i9 should still be faster even if only processing 4-6 threads, if my Cinebench runs correlate to real-world tasks. The difference there is ~7%, so if that doesn't justify the i9's cost, then the i5 is for you. If you're one to upgrade every 2-3 years, the i9 may also be harder to justify: even if/when software is able to utilize the i9's horsepower, you'll be buying a new machine anyway.

But if your upgrades are further apart, the i9 may be worth it, as the power that goes untapped now may be tapped down the road. But then the 580X/Vega will probably be the bottleneck.
 

I figured someone would bring up benchmarks. The thing is, benchmark programs like Cinebench and Geekbench are some of the few that actually utilize all cores/threads to their fullest.

Oftentimes real-world software does not.

So to borrow from the old phrase: “there's lies, damn lies ... and benchmarks”.
 

True, hence why I said “if they correlate.” Even if real-world software doesn't take advantage of the cores/threads to the fullest, the benchmark runs do at least show that up to 6 threads the i9 is still clocking at a higher speed than the 9600K. So in theory the i9 should still hold a small lead over the i5. That's why I brought up the benchmarks: to show clock speed more than the scores themselves.
 

This makes sense. If I thought I would keep a 2019 iMac as long as I have kept my 2011 iMac, the i9/Vega would be a no-brainer for as much future-proofing as Apple currently offers. But with a likely switch by Apple to its own processors in a year or two, it seems to make more sense for me to go with the $2299 iMac with the standard 2TB Fusion drive or a 512GB SSD to keep costs down while still getting acceptable but solid performance from the i5 9600K and 580X. I can add more RAM on my own. Even a no-option $2299 iMac is going to be a lot faster than what I have now.
 

Assuming the ARM prediction holds (I think it will), these may well be the last Intel iMacs.
Given that, for me the calculation would be to get the very top end and walk away into the sunset with the last great Intel iMac.
I would upgrade the RAM and drive myself but max out on processor and GPU.

I'll probably give it a year more with my 2017 iMac, then pull the trigger on the 2019 27-incher.
 
Now the question I'm still pondering (although it's not the topic of this thread) is: 580X or Vega? I'm not gaming on the Mac (I use consoles for that). Do you guys think there would be any real advantage in going with the Vega if my main uses are photo editing, office work and occasionally running Logic?
Since I think no one who uses photo editing software or Logic has answered yet in this thread (I don’t use Logic, and I barely edit photos), I can tell you that the Mac4Ever review about which I posted above says that Logic does not use the GPU, and I seem to recall reading comments to the effect that most Photoshop plugins do not yet take advantage of the GPU.

That being said, the trend seems to be to offload more and more computational work to the GPU, so as photo editing software gets updated, a more powerful GPU may yield greater performance gains (as I think Bboble has already rightly pointed out).

Office apps will certainly not take advantage of the GPU.
 

Totally agree, but how many cores do office apps really use?
 
Well, we don't use one piece of software at a time anymore.
It's all the multitasking we do that made multi-core CPUs a need.

A popular thing to do these days is exporting video, playing a game, and streaming at the same time.
 

My sense is this is kind of in flux and depends very much on the software you're using. Puget Systems has some interesting video card comparisons for Photoshop benchmarks. Many tasks in PS run fine on integrated GPUs, though that's not universally true. Most recommendations regarding LR and PS have been to get a discrete GPU, but that there was little performance gained from GPUs above mid-level with 4GB of VRAM. Adobe just released some AI resolution features in LR, and those seem to benefit from higher-performance GPUs. Though pragmatically, if you're saving a couple of seconds a few times a day, it doesn't add up to much.

On1 Raw seems to place heavier demands on GPUs, and I know there are some systems with integrated GPUs on which it doesn't work well, if it works at all. No idea about Capture One.

I think that currently there'd be very little real-world difference in photo editing between any of the mid-level or higher-performance GPUs. Not so sure that'll be true a few years down the road.
 
Haha, that’s some upgrade!

By contrast, check out these new Geekbench tests with the 2019 iMac:

Not much difference! I’m guessing (hoping?) that’s simply because the brand-new Radeon VII driver in macOS 10.14.5 Beta (on which that system is running) has not yet undergone tuning.

If we look at all of the Macs running the Radeon VII in eGPUs and sort by score, we see that the Geekbench 4 Compute scores are higher in Windows than in macOS (which would seem to support my guess that there’s room for improvement in the current beta macOS drivers), but we also see that the Vega 48 in an iMac performs at a respectable 75% of the compute performance of the Radeon VII in an eGPU in Windows!

(Framerates in games, of course, are not limited only by raw compute power. I do not expect an iMac with a Vega 48 to give 75% of the gaming performance of a Radeon VII driving an external monitor.)
Maybe you can help me? I've been trying to figure out eGPU stuff and I'm a bit confused. I read on Apple's site that you can use an eGPU to drive the internal display now on iMacs. Doesn't it use the iGPU + eGPU to compute? Not sure why the score wouldn't be double? Also trying to figure out if driving the internal display is slower because it has to send the same signal back over the same cable?

But I have an external 4K display. Say I want to do 4K 60Hz gaming on higher settings in Windows. Is it possible to use the iGPU (Vega 48) + the eGPU (Vega 64 or Radeon VII) to get insane performance? Like maybe better than a single RTX 2080 Ti?

Lastly, I think there is something screwy with those Radeon VII benchmarks, because I ran a query in Geekbench for the 2019 iMac with the Vega 64 and saw results around 200,000, but I can't tell if some are faked since some are from last year? But they don't seem to be iMac Pro results, and some are more recent.

Thanks. Still trying to understand eGPU.
 
I read on Apple’s site that you can use eGPU to drive the internal display now on iMacs.
Yes. In Boot Camp the eGPU will drive the internal display in all games and apps. (eGPUs in Boot Camp are not supported by Apple, but the folks at eGPU.io have got them working, and the unofficial, customized drivers at BootCampDrivers.com—which won't support the internal Vega 48 before June—are reported to help in avoiding driver conflicts between internal and external Radeon GPUs.)

In macOS, the eGPU will drive the internal display only if the developer adds this support to the app or game. So far only one of Feral’s macOS games, Rise of the Tomb Raider, has eGPU support for the internal display. I don’t know what the situation is with other game developers, but I expect more will add support eventually.

See Use an external graphics processor with your Mac.

Doesn’t it use the iGPU + eGPU to compute? Not sure why the score wouldn’t be double?
If I recall correctly, yes, an application can use the internal GPU plus multiple eGPUs for compute. Perhaps it’s up to the application? If so, Geekbench may be restricting computation to a single GPU to make comparison of results less complicated. (Even if Geekbench were using both GPUs, however, I wouldn’t expect the score to double, since the external eGPU would normally be more powerful.)
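
If anyone wants to see which GPUs their Mac actually exposes, here's a tiny Metal sketch in Swift; whether an application then spreads compute work across them is up to the application.

Code:
import Metal

// List every GPU macOS exposes through Metal.
// isRemovable is true for eGPUs; isLowPower is true for integrated GPUs.
for device in MTLCopyAllDevices() {
    let kind = device.isRemovable ? "eGPU"
             : device.isLowPower  ? "integrated"
             : "built-in discrete"
    print("\(device.name) [\(kind)]")
}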

Also trying to figure out if driving the internal display is slower because it has to send the same signal back over the same cable?

That is indeed what I've heard. An external display connected directly to the eGPU will give the best performance, and will work even if the app or game developer has done nothing to enable eGPU support.

But I have an external 4K. Say I want to do 4K 60Hz gaming on higher settings in Windows. Is it possible to use the iGPU (Vega 48) + the eGPU (Vega 64 or Radeon VII) to get insane performance?

No, I don’t think so. For driving a display, I believe only one GPU is used, unlike operations that use GPU compute for, say, accelerating a Final Cut Pro render, where the GPUs are being used as very fast coprocessors.

Lastly I think there is something screwy with those Radeon VII benchmarks because I ran a query in Geekbench for the 2019 iMac but with the Vega 64 and saw results around 200,000 but can’t tell if some are faked since some are from last year? But they don’t seem to be iMac Pro and some are more recent.

Perhaps those 2019 iMac Vega 64 results from 2018 were from units inside Apple. The “AMD Radeon RX Vega 64 Compute Engine” could have been an eGPU, or perhaps they were testing the iMac with internal Vega 56 and 64. I’ve heard that the Vega 64 runs a lot hotter than the 56, and it could be that they both run hotter than the 48. Perhaps the thermal constraints of the iMac’s cooling system are why Apple chose to offer the Vega 48 instead of the 56, and not because of some artificial attempt to differentiate the 2019 iMac from the iMac Pro (but that’s pure speculation on my part).
 
Thanks for this, I understand a lot more now and can wrap my mind around it better. This will make further research easier because I wasn't even sure where to begin with some of this. I appreciate it!
 
I was wondering if you ever decided between the i5 or the i9 iMac? I am a photographer (Nikon Z6 raw files) using Photoshop and Lightroom, as well as doing some Final Cut Pro work. I currently have a late 2012 i7 3.4 GHz 27" iMac with 32 GB of RAM and a 512 GB internal SSD. Wondering if the i5 is a relative step back even though it should be quicker? Also wondering if I would gain all that much in Photoshop and Lightroom from the i9 over the i5? Regardless of i5 or i9, I would plan on a 512 GB SSD and 40 GB of RAM. Please let me know your thoughts. Thanks.
 

I'm not sure who you're asking, specifically. I really debated i5 vs i9 and 580X vs Vega. I opted for the i5 with the 580X and SSD and then added 32GB of RAM. This generation of i5 is a true 6-core, and if you compare benchmarks of the current i5 to the 2017 27" iMac with the i7, they are very close in single-core comparisons and the i5 benchmarks slightly faster in multi-core comparisons. So, at least for the benchmarks I've looked at, 6 true cores seem to be a little faster than 4 cores with hyperthreading. I'm pretty sure any pragmatic differences between the two would be pretty inconsequential.

I don't think the i9 would buy too much performance increase for LR and PS. It might speed up exports if you're exporting a lot of files. I know that DxO PhotoLab really utilizes all available cores and threads when exporting using PRIME noise reduction. What I've read would also suggest that once you have a solid CPU and a card with 4+ GB of video RAM, the graphics card won't have a big impact on LR or PS either. Future versions of LR and/or PS might better exploit the additional cores and hyperthreading of the i9. I think where the i9 would really offer a meaningful difference would be with stuff like video editing and rendering, or for someone running many processes at the same time.

I'm very happy with my iMac. I'd have preferred the i9 with Vega, but in the end I just felt the additional performance would be pretty insignificant for the vast majority of stuff I do. Since I've never used an i9, I don't have any direct experience with which to compare.
 

A current 6-core (i5) will be much quicker than an old 4-core. Hyperthreading may make the Mac more responsive when lots of different things are going on, but for raw power it is the core count which is important. And since 2012 there have been steady improvements in the speed of the cores.

Lightroom Classic does make use of all the cores in my i9 (some of the time), so the i9 will be quicker than the i5 (some of the time). But I doubt that as a casual photographer I could really notice any difference; a professional photographer maybe could justify the extra cost. I have no experience of FCP, but in general video work loves more cores.
 