@Spectrum just did exactly this. Anyway, in the other eGPU threads, lots of people have pitchforks out over the obscene "$600 markup".
Sure - an overstatement on my part. It still remains (to me) that $400-500 for the ability to output a TB3 signal is rather excessive. Of course they can price it however they like since there is no competition!

I also rather doubt that the lifespan of an eGPU enclosure is very limited. There seems to be relatively little to fail (the external PSU, maybe?), and it seems there is a long way to go before the bandwidth of TB3 is saturated...but maybe I am wrong on that. How far off is TB4?
 
It doesn't drive a 5K. That's the biggest selling point of the BlackMagic. It's a niche product for a niche market. There's no point in comparing them to an ASUS enclosure until there are eGFX certified ASUS enclosures with dual TB3.

There's a point in comparing it to the Asus for every single monitor on the market except one; and there are pretty strong rumours that Apple will release its own monitor, effectively replacing that monitor, in 2019.

There isn't even any magic in Thunderbolt/5K. Dell's 8K monitor runs with DisplayPort.
 
Question: Is it possible to drive the following 5K display with either the 2018 Mac mini and/or an eGPU?
https://iiyama.com/gb_en/products/prolite-xb2779qqs-s1/

 
Sure - an overstatement on my part. It still remains (to me) that $400-500 for the ability to output a TB3 signal is rather excessive. Of course they can price it however they like since there is no competition!

Well, it's also for a very limited market so they might not be able to make it worthwhile at a lower price point. Those of us who need something like this are willing to pay for it to whoever makes it available for our needs.

I also rather doubt that the lifespan of an eGPU enclosure is very limited. There seems to be relatively little to fail (the external PSU, maybe?), and it seems there is a long way to go before the bandwidth of TB3 is saturated...but maybe I am wrong on that. How far off is TB4?

I don't know enough about this to have a worthwhile opinion, but it seems to me that the whole eGPU game is still developing and it'll be hard to predict what things look like in 4 years. I expected that whatever solution I got, I'd hang onto for 3 to 4 years, and in 3 to 4 years I might own a laptop that makes the eGPU moot, or I might have gone back to owning a desktop. There's only a modest chance the tech would be obsolete, but it doesn't have to be actually obsolete for it to be no longer desirable for my needs.

I made the determination that I had no idea what things would look like in 4 years so I didn't want to weight my decision on whether my purchase could follow me to my next computer.
 
The idea of having an eGPU rather than an internal GPU is great. But Apple has completely failed on this: they neither produce their own eGPU solution, nor do the available solutions work properly.

But Apple is extremely clever regarding their pricing (Swiss franc example):

Mac mini, LG 5K, Blackmagic Vega 64, Keyboard & Mouse: CHF 5955
iMac Pro with the same RAM, SSD, 5K monitor, Vega 54, Keyboard & Mouse: CHF 5785

I'd say the Vega 54 in the iMac Pro is about as fast as the Vega 64 in the eGPU. So the iMac Pro solution might be the better choice.
 
Question: Is it possible to drive the following 5K display with either the 2018 Mac mini and/or an eGPU?
https://iiyama.com/gb_en/products/prolite-xb2779qqs-s1/


Yes, the Mini could drive it at 4K via HDMI. An eGPU with DisplayPort 1.4 outputs can drive it natively at 5K. Even a Radeon RX 560 has DP 1.4.
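
As a rough sanity check on those limits (using nominal link data rates I'm assuming here, not measured figures), the raw pixel bandwidth shows why HDMI 2.0 tops out at 4K60 while DP 1.4 has headroom for 5K60:

```python
# Back-of-the-envelope link budget. Rates are nominal data rates after
# 8b/10b encoding overhead; real timings add blanking intervals on top.

def pixel_rate_gbps(w, h, bpp, hz):
    """Raw uncompressed pixel data rate in Gbit/s."""
    return w * h * bpp * hz / 1e9

HDMI_2_0 = 14.4   # 18 Gb/s TMDS x 0.8
DP_1_4   = 25.92  # HBR3: 4 lanes x 8.1 Gb/s x 0.8

uhd_4k60  = pixel_rate_gbps(3840, 2160, 24, 60)  # ~11.9 Gb/s
full_5k60 = pixel_rate_gbps(5120, 2880, 24, 60)  # ~21.2 Gb/s

print(f"4K60: {uhd_4k60:.1f} Gb/s, 5K60: {full_5k60:.1f} Gb/s")
# 4K60 fits inside HDMI 2.0; 5K60 needs a DP 1.4 (HBR3) link
assert uhd_4k60 < HDMI_2_0 < full_5k60 < DP_1_4
```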

Mac mini, LG 5K, Blackmagic Vega 64, Keyboard & Mouse: CHF 5955
iMac Pro with the same RAM, SSD, 5K monitor, Vega 54, Keyboard & Mouse: CHF 5785

Hmmm, did you mean to write "Vega 56" next to Blackmagic? It only has an RX Vega 56.

I'd say the Vega 54 in the iMac Pro is about as fast as the Vega 64 in the eGPU. So the iMac Pro solution might be the better choice.

The desktop RX Vega 64 is natively about 40% faster than the internal Radeon Pro Vega 56 in the iMac Pro. This is moot, though, because the eGPU Pro only has an RX Vega 56. I do agree with your point: if one is trying to spec out a Mini with that 5K LG ball-and-chain monitor and a Blackmagic eGPU Pro, the better value is the base iMac Pro.
 
So something like this TB3>dual DP adapter should work too?

No, unfortunately not. What that adapter does is give you two independent DP 1.2 links from a Thunderbolt 3 port on your computer. When they are talking about driving a 5K screen, what they mean is using both DP links to independently drive the top and bottom of the screen like a bezel-less two monitor setup, which is how some early 5K screens worked.

This is technically also how the LG 5K works, however it just carries both DP 1.2 links over the single TB3 cable.

If the LG 5K wasn't so stupidly designed it would have had two DP 1.2 inputs (or a DP 1.4 input) in addition to the TB3 input so that it could be driven by anything. In that setup the TB3 port would function as a repeater like every other TB device that can be daisy chained. For a $1300 monitor it's not too much to ask for IMO.
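
The tiling trick above can be checked with simple arithmetic (nominal DP 1.2 figures assumed): one HBR2 link can't carry a single 5K60 stream, but each half-screen tile fits comfortably:

```python
# Why early 5K panels used two DP 1.2 (HBR2) links driven as tiles.
DP12_HBR2 = 17.28  # Gb/s per link: 4 lanes x 5.4 Gb/s, minus 8b/10b overhead

def rate(w, h, bpp=24, hz=60):
    """Raw pixel data rate in Gbit/s, ignoring blanking intervals."""
    return w * h * bpp * hz / 1e9

single_stream = rate(5120, 2880)  # ~21.2 Gb/s: over one link's budget
per_tile = rate(5120, 1440)       # ~10.6 Gb/s: top/bottom half each fits

assert per_tile < DP12_HBR2 < single_stream
```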
 
Until recently, I didn't pay a lot of attention to ports for peripherals. I am probably not alone in that. This discussion, and some other recent ones, underscores how important it is to understand how peripherals will work when purchasing big-ticket items. Knowing what I know now, there isn't a chance in hell that I would purchase LG's 5K monitor, although at one point I had it on a list of options.

It's bad enough to pay $1300 for a 27" monitor that doesn't even cover Adobe RGB. Getting hosed by Blackmagic for an additional $1200 for a Vega 56 eGPU puts it in a league of its own.
 
No, unfortunately not. What that adapter does is give you two independent DP 1.2 links from a Thunderbolt 3 port on your computer. When they are talking about driving a 5K screen, what they mean is using both DP links to independently drive the top and bottom of the screen like a bezel-less two monitor setup, which is how some early 5K screens worked.

This is technically also how the LG 5K works, however it just carries both DP 1.2 links over the single TB3 cable.

If the LG 5K wasn't so stupidly designed it would have had two DP 1.2 inputs (or a DP 1.4 input) in addition to the TB3 input so that it could be driven by anything. In that setup the TB3 port would function as a repeater like every other TB device that can be daisy chained. For a $1300 monitor it's not too much to ask for IMO.
Thanks for the thorough explanation. I see that the LG is TB3 only. That is fine.

But would the adapter work for this monitor? (and future similar designs)

https://iiyama.com/gb_en/products/prolite-xb2779qqs-s1/

If not, is there an adapter that will work? (other than an eGPU)

Finally, if one did wish to use the LG (TB3), how much performance is lost by connecting it directly to a TB3 port on a Mac mini while also connecting a generic TB3 eGPU (out and back) separately to the Mini?
 
But would the adapter work for this monitor? (and future similar designs)

https://iiyama.com/gb_en/products/prolite-xb2779qqs-s1/

If not, is there an adapter that will work? (other than an eGPU)

No, the DP inputs on that monitor are independent DP 1.4 inputs. DP 1.4 handles 5K natively. The only time an adapter like that would be useful for 5K is if you had a monitor that was specifically designed to be driven by two DP 1.2 ports at the same time in the top-bottom method I described AND you wanted to save a USB-C port on the back of the Mini.

I have not been able to find a reliable source for an off-the-shelf converter that will give you 5K DP 1.4 output from two DP 1.2 inputs. Or an adapter that will let you drive DP 1.4 from a TB3 port by combining two DP 1.2 links of the iGPU.

Finally, if one did wish to use the LG (TB3), how much performance is lost by connecting it directly to a TB3 port on a Mac mini while also connecting a generic TB3 eGPU (out and back) separately to the Mini?

If I'm correctly understanding what you're trying to ask here, this probably won't work. Even if the UHD 630 could do this, your performance hit would be massive, way over the 30% 13" MBPs see when you drive their internal displays with eGPUs. Why would you want to do this?
 
Yes, the Mini could drive it at 4K via HDMI. An eGPU with DisplayPort 1.4 outputs can drive it natively at 5K. Even a Radeon RX 560 has DP 1.4.

Hmmm, did you mean to write "Vega 56" next to Blackmagic? It only has an RX Vega 56.

The desktop RX Vega 64 is natively about 40% faster than the internal Radeon Pro Vega 56 in the iMac Pro. This is moot, though, because the eGPU Pro only has an RX Vega 56. I do agree with your point: if one is trying to spec out a Mini with that 5K LG ball-and-chain monitor and a Blackmagic eGPU Pro, the better value is the base iMac Pro.

Yes, I meant Vega 56, sorry. eGPUs are about 20% slower than their internal counterparts. But I'd like to see a comparison between the Mac mini with the Blackmagic Pro and the base iMac Pro. The only reason I haven't bought an iMac Pro yet is that I can't upgrade the RAM myself. And that on a $5,500 computer. It's somehow completely ridiculous.
 
No, the DP inputs on that monitor are independent DP 1.4 inputs. DP 1.4 handles 5K natively. The only time an adapter like that would be useful for 5K is if you had a monitor that was specifically designed to be driven by two DP 1.2 ports at the same time in the top-bottom method I described AND you wanted to save a USB-C port on the back of the Mini.

I have not been able to find a reliable source for an off-the-shelf converter that will give you 5K DP 1.4 output from two DP 1.2 inputs. Or an adapter that will let you drive DP 1.4 from a TB3 port.

If I'm correctly understanding what you're trying to ask here, this probably won't work. Even if the UHD 630 could do this, your performance hit would be massive, way over the 30% 13" MBPs see when you drive their internal displays with eGPUs. Why would you want to do this?
OK...so the only ways to drive a 5K display from the Mini are:
1. Use the direct TB3 output and a TB3-to-TB3 cable for the LG display
2. The same, but via the Blackmagic eGPU
3. Buy the iiyama 5K display and drive it from a TB3 eGPU with DP 1.4 outputs
4. Possibly use a generic eGPU to accelerate an LG 5K that is directly connected to the Mini by TB3.

For option 4, I was just interested in the possible 5K solutions that are available. Even a 30% hit on an eGPU is a massive boost compared to the UHD 630, and it is a (potential) way to drive an LG 5K display without buying a Blackmagic.
 
OK...so the only ways to drive a 5K display from the Mini are:
1. Use the direct TB3 output and a TB3-to-TB3 cable for the LG display
2. The same, but via the Blackmagic eGPU
3. Buy the iiyama 5K display and drive it from a TB3 eGPU with DP 1.4 outputs
4. Possibly use a generic eGPU to accelerate an LG 5K that is directly connected to the Mini by TB3.

For option 4, I was just interested in the possible 5K solutions that are available. Even a 30% hit on an eGPU is a massive boost compared to the UHD 630, and it is a (potential) way to drive an LG 5K display without buying a Blackmagic.

Keep in mind the iiyama is a 6-bit+FRC panel.

I must reiterate that the performance hit, if #4 were possible, would be WAY over 30%. The LG UltraFine 5K is a 10-bit (8-bit + FRC) panel, so at its native resolution you would need 26.5 Gb/s to send frame data back over the PCIe bus, which only has about 22 Gb/s, meaning you would have to reduce the refresh rate. And if you are using the eGPU to accelerate any tasks, that traffic also needs PCIe bandwidth back to the CPU.
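
Plugging in the numbers from that paragraph (the ~22 Gb/s PCIe budget is the commonly cited Thunderbolt 3 figure, assumed here):

```python
# Return-path math for option 4: rendered 5K frames coming back over TB3's
# PCIe tunnel so the Mini's iGPU can scan them out to the display.
W, H, BPP, HZ = 5120, 2880, 30, 60         # 10-bit panel -> 30 bits per pixel
PCIE_BUDGET = 22.0                          # Gb/s usable for PCIe data over TB3

needed = W * H * BPP * HZ / 1e9             # ~26.5 Gb/s at 60 Hz
max_hz = PCIE_BUDGET * 1e9 / (W * H * BPP)  # ~49.7 Hz ceiling, before any
                                            # compute traffic is even counted
print(f"needed: {needed:.1f} Gb/s, refresh ceiling: {max_hz:.1f} Hz")
```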

For now, I'd say you only have two real eGPU options with the LG UltraFine 5K (and 5K in general):

1) Pay the Blackmagic tax

2) If using software that can utilize a headless eGPU, use a generic eGPU and run the 5K screen from the Mini's iGPU
 

You could always have a go at Dell's 8K monitor and drive it with a couple of DisplayPort cables :)

And Sapphire's RX 580, RX 590, Vega 56 and Vega 64 GPUs have two DisplayPort outputs.
 
8K is overkill! I'm just looking at the options that would be available if I did go down the 5K route in the future.

I think most likely I would get an eGPU with DP 1.4 output and hope that the range of 5K displays expands beyond the iiyama and LG.

Or probably I would just get a 32- or 27-inch 4K screen. I just need to determine for myself how best to handle the scaling and what gives the sharpest fonts. I find the scaling options on a 4K Retina iMac perfectly sharp in High Sierra to my eyes (so long as font smoothing is turned off). But a 27-inch 4K, and especially a 32-inch 4K, has a much lower DPI, so I wonder whether fonts will look worse when scaled. It is only PDF fonts in Preview that look terrible and soft, but I mostly use Adobe DC now to avoid that.

I have to say that, in general, since I turned off font smoothing on a 27-inch 2560×1440 Dell running High Sierra, font sharpness is much better, so I'm not even sure I need to move up to a 4K or 5K display anyway. Much easier on the GPU, too. I have to see how Mojave affects things, though...
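
For what it's worth, the pixel-density gap driving that scaling worry is easy to quantify (panel sizes here are the ones discussed above, computed from resolution and diagonal):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch from panel resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

# 27" 1440p ~109 PPI, 32" 4K ~138, 27" 4K ~163, 27" 5K ~218
for label, args in {
    '27in 2560x1440': (2560, 1440, 27),
    '32in 4K':        (3840, 2160, 32),
    '27in 4K':        (3840, 2160, 27),
    '27in 5K':        (5120, 2880, 27),
}.items():
    print(f"{label}: {ppi(*args):.0f} PPI")
```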
 
The plain fact of the matter, at least as I see it, is that while Apple has provided a path for eGPUs where one did not formally exist before, they have not fully committed themselves to that path, as has been their tendency with quite a few ideas and technologies over the years.

Unfortunately, I believe this is a cultural issue within Apple and I have no confidence that it will ever resolve itself to anyone's satisfaction. Exacerbating the issue are competing priorities with iOS versus macOS, for lack of a better description.

While I can (and have) composed spreadsheets weighing the merits of the Blackmagic eGPU, and now the Pro model, against a DIY solution, I think the sticking point for myself and others in this thread is that Blackmagic and Apple chose to release these models late in AMD's GPU release cycle. Polaris 20 was not a significant jump from Polaris 10, and the just-released AMD RX 590 gave us yet another refinement (Polaris 30). While the Vega 56 and 64 are better than they are given credit for here and in other forums, they already feel old, even if they are only coming up on 16 months of age. I think this has more to do with the fact that they were not as competitive with NVIDIA's 10x0-series GPUs as we would have liked, especially since they are really the only game in town for Mac users given Apple's embargo/blockade/Cold War with NVIDIA. The recent release of the 20x0-series GPUs is not helping with perceptions either, even if they are not as good as the tech pundits would have us believe. I am looking at you, Tom's Hardware.

Thus, the Blackmagic eGPU at $699 feels more like a rip-off than it actually is, once you evaluate the components needed to replicate it. We can all quibble about the price of the RX580, but having spent the past six months researching the best path for me to take, I have noticed that since the cryptocurrency insanity abated and AMD GPUs began declining in cost, the price for an AMD RX580 with 8GB of GDDR5 has settled at $250 on average. Sure, Newegg and other shops have discounted them more and prices have fluctuated, but $240 to $270 has been the price band on any given, non-holiday-season day. So for purposes of my opinion, I am stating $250.

After looking at the breakdown of the Blackmagic eGPU on eGPU.io - https://egpu.io/blackmagic-egpu-review-apples-ultrafine-curse/ - the PSU spec stated is 400W, which matches exactly one eGPU box, the Akitio Node. However, here I will take some liberty, given that the Node only provides 15W of USB-C Power Delivery and not the 85W of the Blackmagic eGPU. Factoring that into the equation, the next cheapest eGPU box that provides that amount of USB PD would be the Sonnet Breakaway Box 550. The Node is currently $230.00 and the eGFX 550 is $300.00, which averages out to $265.00.

The USB 3.0 hub in the Blackmagic is a bit better than average, and more complex due to the way Blackmagic engineered their eGPU, so I will break it out at $35 if you were to buy it separately.

Total cost is $250.00 + $265.00 + $35.00, which gets us to $550.00. Given the engineering time spent on the Titan Ridge controller, the enclosure, thermals, the separate USB-C controllers, 5K Thunderbolt 3 display support, and sales/marketing/support/packaging, and taking into account profit for Blackmagic (while acknowledging that buying the components separately also includes profit for the GPU, enclosure, and USB hub vendors), the roughly 27% markup over that $550.00 really makes the Blackmagic eGPU a decent value for those who have, want, or need a 5K Thunderbolt display specifically, already use Blackmagic hardware, edit with DaVinci Resolve, or find that the convenience of unpacking the eGPU, plugging it in, and going outweighs its shortcomings and its total lack of upgradeability.

The Vega 56 version, unfortunately, charges too high a premium. Swap the RX580 for a Vega 56 and, on average, the Vega 56 will set you back $510.00 (I took the average of 12 cards from both Newegg and Amazon). Add the $265.00 for the enclosure, which seems identical PSU-wise, and the $35.00 for the USB 3.0 hub, and you are left with a grand total of $810.00. Multiply that by 1.27 to account for the items above (which I think is fair, considering the older RX580-based eGPU has most likely not recouped its R&D expenses yet) and you get a retail price of $1,028.70. At the current price of $1,199, the markup is 48%, which is excessive, given that the Vega 56 is not cutting edge, the engineering (while not amortized yet) is paying for itself, and we are still faced with a non-upgradeable unit once its service life is over. A price of $1,049 or even $1,069 would feel more appropriate, while still keeping the profit margin decent.
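
Re-running the arithmetic from the two paragraphs above (using the component prices assumed there):

```python
# Markup estimates: retail price vs. summed DIY component cost.
def markup(retail, parts):
    return retail / parts - 1.0

base_parts = 250 + 265 + 35   # RX580 + averaged enclosure + USB hub = $550
pro_parts = 510 + 265 + 35    # Vega 56 swapped in               = $810

print(f"eGPU     $699 vs ${base_parts}: {markup(699, base_parts):.0%} markup")
print(f"eGPU Pro $1199 vs ${pro_parts}: {markup(1199, pro_parts):.0%} markup")
print(f"Pro at the base model's 27%: ${pro_parts * 1.27:.2f}")  # $1028.70
```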

As it stands now, I can DIY my own eGFX 550 and an MSI Vega 56 for about $710.00, minus the USB 3.0 hub.

Also, Sonnet offers the same bundle of the eGFX 550 and an MSI Vega 56 on their site for $699.00.

A decent RX580 offering can be had with a Sapphire Nitro+ for $220 and the eGFX 550 at $300, making a $520 eGPU a worthwhile proposition, although I would say a 2-meter active TB3 cable (an extra $50) should be included with every eGPU, instead of the tiny rat tail Blackmagic gives you.

The issue here, though, is still cost, especially for the Mac mini buyer. I would opine that a $250 to $350 fixed eGPU box similar to the Sonnet eGFX Breakaway Puck might gain traction in the market. For that to work, you are looking at some aggressive negotiation to get Apple to relent and allow the RX560 back into its eGPU graces, from which it was expunged with the release of 10.13.4. Also, to be a really worthy fixed eGPU, a truly creative company would give us a Vega 16 at $249 or a Vega 20 at $349, along with 2-3 USB 3.0 ports, a single GbE port, two DP 1.4 ports, and a single HDMI 2.0 port. Whether that is possible while turning a profit is another matter. I am talking economics for anyone who wants or needs a decent, if not fire-breathing, GPU to play games or give video editors a bit of extra oomph, plus those odds-and-ends apps that benefit from an eGPU. This also entails Apple doing a better job of integrating eGPUs into Mojave, and better support from app vendors. I suspect 10.15 may give us a better path forward.

Bottom line: there is a market for eGPUs, but without a competitive and cost-effective GPU to light the path, the market is going to be moribund at best. The Blackmagic eGPU is not it. Perhaps Intel's Arctic Sound will be that GPU.

Again, Apple provides solutions to a degree and then either loses interest or decides not to follow through on those less glamorous items that make computing viable and more productive for the rest of us. I am hoping that I am wrong.
 

I think that your GPU price assumptions are too high. For example, given that the new RX 590 sells for $280, it is highly unlikely that your assumed price of $250 for the RX 580 is sustainable.

I do not think that what we are seeing in the market are true "sales". It is vendors trying to find a post-cryptocurrency floor, and the frequency of the "sales" suggests that they have not found it yet.

I believe that we will see AMD GPU pricing return to launch prices, or close to them, which in the case of the Vega GPUs especially is significantly less than your assumptions. An additional factor to consider is pressure from the used market. Perhaps it is miners offloading, perhaps it is people trying to cut losses on GPUs that they purchased at stratospheric prices, but whatever the reason, the used market is flooded. There is even the interesting spectacle of people on eBay trying to resell ASRock Vega 56 GPUs that they purchased a couple of weeks ago for $340, for whatever profit they can get away with.

In my view, the "average" prices that you use in this analysis don't represent anything meaningful. The price range for a given GPU is so great that there is no true average, at least not one that can be determined without knowing actual sales at the various price points. This is not a normal market, and as a result ordinary averaging doesn't work.
 

"Sustainable" implies a reduction in the future, and I am talking about the here and now. I have been watching these prices for the past six months, as I have no desire to pay over $200 for an RX580 8GB myself, but here we are. The best price I have seen was $209 for a Sapphire Pulse RX580 8GB on Newegg two weeks ago, and currently there is a Sapphire Nitro+ LE RX580 8GB on sale for $220.00 after a promo code. Other than that, prices have stayed consistently in the $239-$269 range, even with the introduction of the RX590, which sits at $279 on Newegg and Amazon, assuming you can find one. I used the retail prices of items for sale from Newegg only, limited to Asus, Gigabyte, MSI, and Sapphire, as those brands are the most likely to be met with success. For the Vega 56, I added XFX and used the above brands on both Newegg and Amazon to build an average across as many cards as I could. Admittedly, I am not a statistician, but I stand by the prices I gave; you and I could spend all day going round and round about pricing and not agree.

Postulating about "true sales" all day long simply moves the goalposts by adding in the used market, and is meaningless to this discussion. What sales look like on eBay, and how they may or may not affect retail pricing, is of zero use in building a meaningful comparison between what the Blackmagic costs and what someone here can build for themselves using retail parts. Truly, I can find used Sapphire Nitro+ RX480 GPUs all day on eBay for $120-$160 and put together a pretty nice eGPU using an eGFX 350 for $320-$360, but that is a skewed comparison for these purposes.

Regarding Vega 56 prices, I am speaking in the here and now, not some unspecified point in the future. I stand by the average price at $510 for the purpose of what I am trying to illustrate in my OP.

Your view is just that, your view. From reading your previous posts, you seem to be incensed, or at least very perturbed, at what Blackmagic is charging for their eGPUs. For my post, I simply wanted to illustrate the differences in pricing between the Blackmagic eGPU and the eGPU Pro and give a bit of insight into how the margins run much higher on the Pro version. I think that for any user who is not comfortable assembling their own unit, prefers leasing, or simply does not want to do anything more than plug and play, the RX580 version is not a bad value once the numbers are broken down. The Pro version represents a particularly bad value in my opinion, because the retail cost, using the average pricing I presented, shows that the markup is close to 50%. Many, if not all, of your posts have shown similar numbers. I suppose I am put off by your intense effort to trump any numbers other than your own in your effort to show that Blackmagic is simply soaking the consumer and that anyone who purchases one of these eGPUs is making a huge mistake.

In my opinion, $699 for the RX580 eGPU is a decent value, given the rather large downside of non-upgradeability, while the Vega 56 eGPU Pro is a really bad value, given the excess markup, the lack of meaningful additional features over the original version, and the fact that AMD is on schedule to announce 7nm Navi and 7nm Vega Instinct (consumer) at some unspecified point in 2019. I tried the Blackmagic eGPU, and despite the horrendously short TB3 cable, I liked the unit quite a bit. The problem was that I deal with Photoshop all day, and so do the designers I work with, and for them it would be of zero benefit. So it went back, and we are deciding whether to wait for the 2019 iMac, move up to the iMac Pro, or settle for the 2017 iMac, which is cheaper and fairly reliable, but not particularly compelling.
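As a quick sketch of the markup arithmetic behind a "close to 50%" figure (the dollar amounts in the example call are hypothetical placeholders, not Blackmagic's actual bill of materials):

```python
# Markup expressed as a percentage over the retail cost of the parts.
# The figures passed in below are hypothetical placeholders,
# not Blackmagic's actual component costs.
def markup_pct(unit_price, parts_cost):
    return (unit_price - parts_cost) / parts_cost * 100

# e.g. a $699 unit whose parts would retail for roughly $500
print(f"{markup_pct(699, 500):.1f}%")  # prints "39.8%"
```

By this measure, a unit whose parts retail for two-thirds of its sticker price carries a 50% markup.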

For Mac owners, the eGPU can be a positive asset. However, Apple still has work to do, and prices still need to come down, to make any significant inroads. I believe the magic number for that is probably around $199-$249; such a unit will not be as powerful as some would like, but that is not the point. Unfortunately, having spent time on numerous forums where flame wars about GPUs and frame rates break down into juvenile tribal warfare within two or three replies, most will simply move past any useful posts as the level of discourse goes downhill fast.

The point to remember is that what works for some does not work for others, but each user should make their own decision. I hope that my OP was helpful to others who are struggling with a decision or trying to weigh whether or not to purchase a Mac mini.
 
the best price I have seen was $209 for a Sapphire Pulse RX580 8GB on New Egg two weeks ago

Then you haven't been looking very hard. The RX 580 was available recently for US$180. With the new RX 590 selling for $280, it makes no sense to insist, as you have, on an average RX 580 price of $250.

With respect to both the RX 500 series GPUs and the Vega GPUs, there is no point in averaging prices that can be hundreds of dollars apart for the same item without knowing whether the "prices" are connected to actual sales. Surely this is obvious.
 
In my opinion, $699 for the RX580 eGPU is a decent value, given the rather large downside of non-upgradeability and the Vega 56 eGPU Pro is a really bad value

Thanks for that detailed breakdown. I hadn't realized just how significant of a premium they were charging for the Vega 56 version. I briefly owned the current version and was not happy about the price, but determined that it wasn't a ripoff and would be worth it if it worked for my needs. I assumed the story would be the same for the Vega 56 version and it looks like I might be wrong there.

Then you haven't been looking very hard.

I think @Zdigital2015's rather meticulously explained methodology says otherwise. Your reply seems to be proving his observations about you to be true. He's not even really disagreeing with you. He just doesn't agree with your conclusions 100%.
 
Then you haven't been looking very hard.

There is no point in averaging prices that can be hundreds of dollars apart for the same item without knowing whether the "prices" are connected to actual sales. Surely this is obvious.

What seems to be obvious is that you are being condescending and belittling, as though you are right, I am wrong, and that is that.

Neither you nor I can know whether the current market prices for any given item are tied to actual sales, because neither of us has any visibility into how many units any particular retailer sells in a given category. We could guess at those sales if AMD broke out the volume of each GPU sold in their quarterly filings, but I highly doubt they do.

However unscientific it may be, and with a rather small sample, I came up with numbers that I believe are fairly accurate; however, I readily admit they are not gospel. I apologize if I presented them as anything more than the figures I settled on for the purposes of my post.

I will say that if I take the current PC Part Picker listings for just the RX580, count both the 4GB and 8GB versions of the card, and throw out the lowest price ($159.99) and the highest price ($475.85), I get an average of $245.79; if I leave those numbers in the mix, I get $251.31.

If I do the same with the RX Vega 56, I get $528.40 and $540.33, respectively.
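The throw-out-the-extremes average described above can be reproduced in a few lines of Python; the price list here is an illustrative placeholder, not the actual PC Part Picker data:

```python
# Plain vs. trimmed average of GPU listing prices.
# These prices are illustrative placeholders,
# not the actual PC Part Picker listings.
prices = [159.99, 199.99, 229.99, 249.99, 269.99, 475.85]

plain_avg = sum(prices) / len(prices)

# Throw out the single lowest and highest listings before averaging.
trimmed = sorted(prices)[1:-1]
trimmed_avg = sum(trimmed) / len(trimmed)

print(f"plain:   ${plain_avg:.2f}")    # prints "plain:   $264.30"
print(f"trimmed: ${trimmed_avg:.2f}")  # prints "trimmed: $237.49"
```

Dropping the two extremes pulls the average down noticeably here, which is the point of trimming: a single $475 outlier listing skews a plain mean.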

There is a point to averaging the prices, in that it at least gives us a baseline at a single point in time, which can then be used to plot upward or downward trends over time. When I began following pricing on the RX580, the average was $80-$100 higher for the same card on any given day. Launch pricing, while an indicator of what AMD intended, is not today's reality. In order to give enough information to anyone who takes the time to read my post while deciding or debating what to buy, you have to use numbers from the here and now. Prefacing that with what prices should be is informative, but not particularly useful when basing a decision on market reality.

I have no dog in this hunt, and I gave my opinion that the original Blackmagic eGPU is not a bad value, while conversely arguing that the Pro version is a bad value because its markup is substantially higher than the original RX580 version's, and not just by a few percentage points. If you have predetermined that neither of them is a good value and your only argument is based on price, then at least providing some data to back up your assertion is helpful in discriminating between noise and information. Otherwise, you just sound like you are bent out of shape and have an axe to grind with Apple and/or Blackmagic. You certainly are not the only one; we all have our gripes.

Certainly, if these numbers are not useful to you or you believe that I am misleading others in this thread, feel free to refute them, but try something better than negatives and condescension.
 
Otherwise, you just sound like you are bent out of shape and have an axe to grind with Apple and/or Blackmagic. You certainly are not the only one, we all have our gripes.


Right, I have it in for Apple, whose products I use, and whose shares, which I've held for many years through the many claims that the stock is going to crash any minute now, constitute the majority of my net worth.

I also have it in for Blackmagic, whose DaVinci Resolve, paid version, I use for colour grading, and whose new 4K camera I just purchased.

I've said what I've got to say. You can agree or not, but do yourself a favour and resist the urge to engage in childish ad hominem attacks.

I'll leave it at that, although it's a pretty good bet that you won't.
 
Thanks for that detailed breakdown. I hadn't realized just how significant of a premium they were charging for the Vega 56 version. I briefly owned the current version and was not happy about the price, but determined that it wasn't a ripoff and would be worth it if it worked for my needs. I assumed the story would be the same for the Vega 56 version and it looks like I might be wrong there.



I think @Zdigital2015's rather meticulously explained methodology says otherwise. Your reply seems to be proving his observations about you to be true. He's not even really disagreeing with you. He just doesn't agree with your conclusions 100%.

You're welcome. I hope it is helpful, if not particularly scientific. I followed up with a large sampling based on PC Part Picker, but those numbers are somewhat at the mercy of how they compile price listings.

I evaluated the Blackmagic eGPU as well, but because Adobe does not explicitly support eGPUs inside Photoshop, it was a non-starter for our purposes. That being said, we also lease iMacs and MacBook Pros with discrete GPUs, so while the Bm eGPU could be useful in balancing the load of multiple monitors, I would be hesitant to buy it, since it is Polaris 20, which is not that much faster than Polaris 10. With Polaris 30 (RX590) recently released and January (CES) just around the corner, announcements from AMD about Navi, 7nm, Vega Instinct, et al. make it hard to invest in a non-upgradeable unit. Leasing would be the best way for businesses to acquire and use one, because either the eGPU or the eGPU Pro is certainly very usable and viable for the next 24-36 months, and it is an Apple-sanctioned unit.

The Pro is just not cost effective, considering I can spend $1,200.00 and get the Sonnet eGFX 650 with the Vega 64 Frontier Edition - https://www.sonnetstore.com/collections/egpu-expansion-systems

The FE has its own set of issues, so it is by no means a panacea, however, the extra grunt and the ability to replace the GPU in the future make it worth looking at if you need that kind of horsepower.

I agree in part with the other poster, but I found his stridency off-putting. Value to one person is a ripoff to another, but it is their money to spend, and I have come to the realization that I have to respect that what works for me does not work for others, and vice versa. It is the whole reason some people buy a Honda or a Toyota or a Chevrolet or a Ford.

I believe that not everyone needs a dGPU for what they do on a daily basis. I have a 2015 15" MacBook Pro with just the Iris Pro 5200, and that computer gets the job done for me. Apple had at least four distinct CPU options (28w U-Series, 45w H-Series, Kaby Lake-G and 65w S-Series) that could have gone into the Mac mini, and despite the fact that the UHD 630 is hard to appreciate, the 65w CPU portion is a pretty darn good choice. There will be endless debate about whether Apple made the right choice, but that is moot, since the Mac mini got the 65w S-Series.

The flip side is that Apple has been promoting eGPUs since WWDC in June 2017, and while it is not the wild west it once was, there is plenty of room for refinement. There are no real value boxes out there that can serve the Mac mini or the 13" and 15" MacBook Pro. Sure, I can get a used RX480 and an eGFX 350 and spend $340-$370, but the RX480 is an older GPU, and a used one at that. Apple knee-capped the RX560 at the last second with 10.13.6, which still baffles me; the 560 eGFX Puck at $399 was at least fairly cheap and useful. How many 13" and 15" MacBook Pro owners would be served by a Vega 16 or Vega 20 eGPU that costs roughly the same as the upgrade Apple just released? Could a box the same general size and shape as the mini be produced with the RX560, Vega 16 or Vega 20 for those users, and be profitable and successful? My frustration with Apple is that they are opaque in how they handle these things after the big splash of an introduction; letting the market decide is very capitalist of them, but not particularly reassuring to someone who purchased a MacBook Pro on July 20th for $3,899 and could benefit from the Vega 20 in a portable unit.

Apple has thus far been fairly allergic to producing peripherals for what they sell, and it has only gotten worse. I understand dropping the printers, although some of their printers were the best in the business for their time. I can almost understand dropping the wireless APs, although why you would scrap an easy sale to a captive audience in your own Apple Store, when you have a metric crap-ton of wireless engineers employed, still boggles my mind. But I digress.

However, the one thing I would look Tim Cook, Dan Riccio, Phil or anyone else in the eye and say "Are you crazy?!?" about was the complete abandonment of the standalone Display line. For years, Apple's displays had been simply wonderful, at least for me and my designers. Expensive, yes, but usually a cut above: reliable, sturdy and beautiful to look at on a daily basis. Then along came the Thunderbolt Display. Flaky USB ports, horrible reflections, you name it, it dropped the ball. Then nothing: no Thunderbolt 2 version, no 4K and 5K follow-ups to the gorgeous displays in the iMacs, no breakthroughs, no GPU in the back of the display, nothing. Just a sorry-looking LG 5K display that said, "we gave up." After all those years and all the effort they put into the P3 displays for the Mac lineup, the iPhone and iPad displays, Liquid Retina, Super Retina, et al., no one could be spared to make sure that anyone wanting to stay in the macOS ecosystem and add a display to their mini, Mac Pro, iMac or MacBook (Pro) could depend on Apple to have them covered. Instead, it's a wild west of displays: some good, some bad, some compatible, some a little flaky.

My point is that if Apple wants eGPUs to take off, if they have faith in them and are committed to graphics as many have argued, they have to ensure that the ecosystem exists, even if that means helping it along. Instead, we have a clear iPhone XR case, finally...

Could an Apple eGPU kill sales of third parties? Perhaps, but if not that, then just a roadmap and some guidance directly or indirectly to the value, mainstream and high-end segments might be more helpful.

More attention to integration and encouraging developers to take advantage of eGPUs would be helpful as well, but that may have to wait until 10.15.
 