
Scott Sherman (original poster), macrumors member, Nov 5, 2018, Washington State
Hi, I am one of those early adopters of the 2018 MBP with the i9 CPU and 560X GPU. I have a 5K2K monitor which I use frequently at its full resolution. I have noticed that most things I do with my 560X MBP work adequately at 2560 x 1080, but when I go up to the native 5120 x 2160 everything slows down noticeably. I was considering the purchase of the new Blackmagic eGPU to connect to my LG Thunderbolt 5K2K monitor.

I can't help but be a bit curious though: how does the Radeon Pro 580 graphics card in the BM eGPU compare to the internal Vega 20 GPU offered now with the latest MBP? Needless to say, it is over double the price of the Vega 20, but is it over double the value when using an external monitor? I suspect that because it has 8 GB of memory it will excel with my monitor over the Vega 20, but the 580 is getting a bit long in the tooth now. Even the newer BM Pro, offering the Vega 56, is one step behind the Vega 64 that most are going with in the Razer eGPU enclosure.

I presume that folks deciding between the internal Vega 20 graphics card and the more expensive BM eGPU (or the Pro version) might be curious as well whether there is real value in purchasing the more powerful and expensive eGPU. My question is whether the less impressive Vega 20, because it is internal and sits directly on the PCIe bus, ends up performing better than the eGPU, which relies on the Thunderbolt bus to move data. I have not really been able to find any good comparison sites for these two graphics cards. I would love some opinions, especially if you are able to do some sort of comparison or point to one on the web.

Thanks.
 
I’m very curious as well. That said, no matter what the benchmark results are, I’d personally go for a Vega 56/64 with a third-party eGPU box that offers an upgrade path when the time comes to buy a newer card.

The Blackmagic is essentially a beautifully designed external enclosure that ships with an overpriced and non-replaceable outdated card. Very Apple-like, if you ask me, but I honestly think that it defeats the whole purpose and advantage of going external.
 
I have an RX 580 Sapphire Pulse in a Razer Core X. For some reason this card is the most compatible with Mac systems, and that includes the Mac Pro. It benches about the same as the Vega 20 in 3D, but it is some 50% faster in compute. Kind of weird; one would expect the opposite. This will be skewed even more in favor of the RX 580 when an external screen is connected. I have a Vega FE as well, and honestly I prefer the RX 580 due to some small issues:
- the RX is dead silent, while the Vega FE can be loud under load,
- I have FileVault enabled and I don't get the initial password screen on an external monitor attached to the Vega, but it works on the RX,
- sleep disables the audio devices attached to the Vega (HDMI/DP audio; I have to reboot to get them back), while it works fine on the RX.

The most important thing about an eGPU for me is that it saves thermal headroom for the CPU and keeps the laptop fans at a minimum, without unnecessarily ramping them up just because you have an external screen attached.
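If you want to confirm that the external screen really is being rendered by the eGPU rather than routed back through the internal GPU, here is a minimal sketch (assuming macOS with the Metal and CoreGraphics frameworks; an eGPU reports itself as a removable device) that lists which GPU drives each active display:

```swift
import CoreGraphics
import Metal

// List every active display and the Metal device (GPU) currently rendering it.
// An eGPU shows up with isRemovable == true; the internal dGPU does not.
var displayCount: UInt32 = 0
var displayIDs = [CGDirectDisplayID](repeating: 0, count: 16)

guard CGGetActiveDisplayList(UInt32(displayIDs.count), &displayIDs, &displayCount) == .success else {
    fatalError("Could not query active displays")
}

for id in displayIDs.prefix(Int(displayCount)) {
    let gpu = CGDirectDisplayCopyCurrentMetalDevice(id)   // GPU driving this display, if any
    let width = CGDisplayPixelsWide(id)
    let height = CGDisplayPixelsHigh(id)
    print("Display \(id): \(width)x\(height) -> \(gpu?.name ?? "unknown GPU") (eGPU: \(gpu?.isRemovable ?? false))")
}
```

Run it as a quick script (e.g. `swift gpu_per_display.swift` in Terminal); on a working setup the external monitor should report the removable device.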
 
In a way it’s like buying an MBP with the highest configuration you can afford. You know it’s going to be obsolete at some point; that is just the nature of technology today. I think the new MBP and the BM Pro should become obsolete at about the same rate, so in 5 years you can sell them together and buy whatever is the best of that era.
 
It benches about the same as the Vega 20 in 3D, but it is some 50% faster in compute.
Interesting. What kind of computing tasks did you test it on, and how did you measure the performance?

The fact that the eGPU can be substantially faster on compute is encouraging, as personally I’m more interested in that than in 3D or gaming. I know what the theoretical GFLOPs count of each card is, but these numbers can be deceiving and I’m wondering how much of a performance hit one can expect to take because of the TB3 connection (though I know well that it depends on the particular task and on how much data is exchanged between CPU and GPU, etc., so drawing a general conclusion might not be possible).
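One way to get a feel for the raw transfer penalty, separate from compute, is to time a large host-to-GPU copy on each Metal device. The following is a minimal sketch, not a rigorous benchmark (assumptions: a 512 MB payload is representative, and a single blit is a fair proxy for sustained bandwidth):

```swift
import Foundation
import Metal

// Time a large host-to-GPU buffer copy on every Metal device, as a rough proxy
// for PCIe (internal dGPU) vs Thunderbolt 3 (eGPU) transfer bandwidth.
// Includes command-buffer overhead and runs only once, so treat it as a ballpark.
let byteCount = 512 * 1024 * 1024               // 512 MB payload
var hostData = [UInt8](repeating: 1, count: byteCount)

for device in MTLCopyAllDevices() {
    guard let queue = device.makeCommandQueue(),
          let shared = device.makeBuffer(bytes: &hostData, length: byteCount, options: .storageModeShared),
          let onGPU = device.makeBuffer(length: byteCount, options: .storageModePrivate),
          let cmd = queue.makeCommandBuffer(),
          let blit = cmd.makeBlitCommandEncoder() else { continue }

    blit.copy(from: shared, sourceOffset: 0, to: onGPU, destinationOffset: 0, size: byteCount)
    blit.endEncoding()

    let start = Date()
    cmd.commit()
    cmd.waitUntilCompleted()
    let seconds = Date().timeIntervalSince(start)

    let gbPerSec = Double(byteCount) / seconds / 1e9
    print("\(device.name) (eGPU: \(device.isRemovable)): \(String(format: "%.1f", gbPerSec)) GB/s host -> GPU")
}
```

Roughly speaking, a TB3 link tops out in the low single-digit GB/s range while the internal PCIe-attached Vega 20 has more headroom; whether that gap matters depends entirely on how much data your workload actually moves per unit of compute.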
 
Interesting. What kind of computing tasks did you test it on, and how did you measure the performance?

Nothing special, just Geekbench compute. I'm not a fan of it because the tests are so quick, but since with an eGPU you're not worried about cooling, it paints a fairly accurate picture. I'm getting 130k in it using the RX 580 vs 70-85k on the Vega 20.
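For anyone who wants a quick cross-check outside Geekbench, here is a minimal sketch (assuming Metal Performance Shaders supports the device; matrix contents are left uninitialized since only throughput matters) that times an FP32 matrix multiply on every Metal device:

```swift
import Foundation
import Metal
import MetalPerformanceShaders

// Rough FP32 matrix-multiply throughput per Metal device. Numbers will not match
// Geekbench's compute score, but the relative ordering of the internal GPU vs an
// eGPU should be broadly similar.
let n = 4096
let bufferLength = n * n * MemoryLayout<Float>.stride
let rowBytes = n * MemoryLayout<Float>.stride

for device in MTLCopyAllDevices() where MPSSupportsMTLDevice(device) {
    guard let queue = device.makeCommandQueue(),
          let a = device.makeBuffer(length: bufferLength, options: .storageModePrivate),
          let b = device.makeBuffer(length: bufferLength, options: .storageModePrivate),
          let c = device.makeBuffer(length: bufferLength, options: .storageModePrivate),
          let cmd = queue.makeCommandBuffer() else { continue }

    let desc = MPSMatrixDescriptor(rows: n, columns: n, rowBytes: rowBytes, dataType: .float32)
    let matmul = MPSMatrixMultiplication(device: device,
                                         transposeLeft: false, transposeRight: false,
                                         resultRows: n, resultColumns: n, interiorColumns: n,
                                         alpha: 1.0, beta: 0.0)
    matmul.encode(commandBuffer: cmd,
                  leftMatrix: MPSMatrix(buffer: a, descriptor: desc),
                  rightMatrix: MPSMatrix(buffer: b, descriptor: desc),
                  resultMatrix: MPSMatrix(buffer: c, descriptor: desc))

    let start = Date()
    cmd.commit()
    cmd.waitUntilCompleted()
    let seconds = Date().timeIntervalSince(start)

    // An n x n GEMM does roughly 2 * n^3 floating-point operations.
    let gflops = 2.0 * pow(Double(n), 3) / seconds / 1e9
    print("\(device.name): ~\(Int(gflops)) GFLOPS (FP32 GEMM)")
}
```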
 
Interesting. What kind of computing tasks did you test it on, and how did you measure the performance?

The fact that the eGPU can be substantially faster on compute is encouraging, as personally I’m more interested in that than in 3D or gaming. I know what the theoretical GFLOPs count of each card is, but these numbers can be deceiving and I’m wondering how much of a performance hit one can expect to take because of the TB3 connection (though I know well that it depends on the particular task and on how much data is exchanged between CPU and GPU, etc., so drawing a general conclusion might not be possible).

The only gotcha is that benchmark compute potential doesn't necessarily translate to specific software, so you're always best off researching the real-world performance of your specific workflows on the OS/hardware combo.

I got burned hoping the improved OpenCL Geekbench scores for my Vega 64, combined with online reports that the card worked well in Blender Cycles, would pan out, but it turns out Blender's particular usage of OpenCL runs waaaay worse on macOS, and the card's benefits would only be apparent in Windows.

In this case it's a specific software bug, but the market of Blender + macOS + AMD + eGPU is too niche for them to invest resources to address it, especially since Apple is deprecating OpenCL support going forward.
Nothing special, just Geekbench compute. I'm not a fan of it because the tests are so quick, but since with an eGPU you're not worried about cooling, it paints a fairly accurate picture. I'm getting 130k in it using the RX 580 vs 70-85k on the Vega 20.

Ha... another strike against my Vega 64. It also gets 130k in Geekbench compute, despite being theoretically twice as fast hardware. In Windows via Boot Camp it gets 177k at least.
 
Ha... another strike against my Vega 64. It also gets 130k in Geekbench compute, despite being theoretically twice as fast hardware. In Windows via Boot Camp it gets 177k at least.

Good to know it is not just me. I think we have the same Vega 64, and I have also been disappointed with the 130K compute score.
 
Hi, I am one of those early adopters of the 2018 MBP with the i9 CPU and 560X GPU. I have a 5K2K monitor which I use frequently at its full resolution. I have noticed that most things I do with my 560X MBP work adequately at 2560 x 1080, but when I go up to the native 5120 x 2160 everything slows down noticeably. I was considering the purchase of the new Blackmagic eGPU to connect to my LG Thunderbolt 5K2K monitor.

I can't help but be a bit curious though: how does the Radeon Pro 580 graphics card in the BM eGPU compare to the internal Vega 20 GPU offered now with the latest MBP? Needless to say, it is over double the price of the Vega 20, but is it over double the value when using an external monitor? I suspect that because it has 8 GB of memory it will excel with my monitor over the Vega 20, but the 580 is getting a bit long in the tooth now. Even the newer BM Pro, offering the Vega 56, is one step behind the Vega 64 that most are going with in the Razer eGPU enclosure.

I presume that folks deciding between the internal Vega 20 graphics card and the more expensive BM eGPU (or the Pro version) might be curious as well whether there is real value in purchasing the more powerful and expensive eGPU. My question is whether the less impressive Vega 20, because it is internal and sits directly on the PCIe bus, ends up performing better than the eGPU, which relies on the Thunderbolt bus to move data. I have not really been able to find any good comparison sites for these two graphics cards. I would love some opinions, especially if you are able to do some sort of comparison or point to one on the web.

Thanks.

I just upgraded from a maxed out 15” 2016 MacBook Pro to the 2018 i9 with Vega 20.

I’m using the LG UltraFine 5K display and I upgraded mainly because I was disappointed by the performance when running the monitor in clamshell mode.

I can tell you that now with the Vega 20 it runs perfectly smoothly, and it’s also quieter than before.
 
From the bits that I’ve read, if you’re using the LG 5K display then my understanding is the options are pretty limited for external GPUs - most don’t support it. And I don’t think there are any options yet that will support two of them. :-(
 
I’m very curious as well. That said, no matter what the benchmark results are, I’d personally go for a Vega 56/64 with a third-party eGPU box that offers an upgrade path when the time comes to buy a newer card.

The Blackmagic is essentially a beautifully designed external enclosure that ships with an overpriced and non-replaceable outdated card. Very Apple-like, if you ask me, but I honestly think that it defeats the whole purpose and advantage of going external.
The Blackmagic eGPUs are the only ones that work with Thunderbolt 3 monitors like the 5K UltraFine. So if you’re like me and have that monitor, that’s your only option.
 
The Blackmagic eGPUs are the only ones that work with Thunderbolt 3 monitors like the 5K UltraFine. So if you’re like me and have that monitor, that’s your only option.
Good to know, but no, I have a different 4K monitor and would be connecting via HDMI.
 
Hi,

I use the same combo (MacBook Pro i9 with the 560X and 32 GB RAM + LG 5K)
and I was wondering whether to upgrade the graphics, either with an external GPU (Vega 64) or by upgrading my Mac to the newer Vega 20.
I, and probably many people here, would really love it if someone could post the same scenario in After Effects with Element 3D.

Thanks
 