Timi? No illusions. This guy just doesn't give a F**** about Nvidia drivers.
It's funny, because a dock with a large monitor and an eGPU is a great option for an Apple laptop, given its very limited GPU options. Imagine taking your laptop out of your bag, putting it on a desk with a dock and eGPU, and getting real work done. Then disconnecting the laptop and taking it with you. Right now a single 1080 in an eGPU enclosure connected to a laptop over Thunderbolt 3 sounds awesome. It's the future. Surprised Apple doesn't wanna be part of it. They keep forcing everything smaller and smaller and aren't making any computers for heavy lifting. My hope is that when eGPUs become more standard, Apple will be forced to play well with others.

Wait. I just described the Razer Blade.
 
As a former laptop owner and current Mac Mini owner, I totally get the appeal of the eGPU. But even so I feel that it's one of the smallest niches in the Mac universe.

If Apple can't even be bothered to support Airport routers and Time Capsule backup devices any more, which are far more mainstream, I don't see them caring even a little bit about eGPU support.

I hope I'm wrong, but I don't think so.
 

I think the issue is that offering an official eGPU solution would be a PR and technical nightmare for Apple.

It makes a lot of sense for us; you're able to extend the life of your machine and give it new purpose. But to Apple, not only would that be admitting that some machines just aren't up to the job (not their style), it would also vastly extend how long people keep their machines rather than replace them.

Why sell someone a ~£500 solution to boost GPU performance when you can force them to spend far more on a whole new machine for the same result? That makes Apple more money, is simpler, makes their sales figures look better, and means they can continue to justify supporting only a small subset of machines.

People have well established that eGPU works on the 2013 Mac Pro, for example. If Apple sold an eGPU with Thunderbolt 3, you could convert it to Thunderbolt 2 using their own adapter and plug it in. Now, I'm sure Apple would block eGPU functionality on machines they never intended it for (as they do with their own DVD drive, for example), but that would be trivial to bypass. And they're still selling the Mac Pro, so they'd effectively be saying it's not good enough, which isn't very Apple.

TL;DR, it makes sense, so they won't do it.
 
I think the issue is that offering an official eGPU solution would be a PR and technical nightmare for Apple.

Well, on a technical level it's a pain. They'd actually have to devote some resources to coding for it. On top of that, even Windows machines have bugs with it, so with Apple's limited resources...

As far as PR goes, I'd say that's a stretch. Though Apple fanboys appear literally immune to the fact that there are better GPUs out there, I don't think they'd be upset by an eGPU. Suddenly, instead of 2008 speeds being okay, they'd be ranting and raving about how amazing it is to have an eGPU.

In fact, I think there's a good chance Apple could go for one, or at least help out a third party. If you look at the 2016 rMBP and all its TB3 ports, it seems a shame there's nothing to plug in to use that bandwidth (apart from a bunch of 5K monitors, as if that were a thing on a laptop).
 
I don't think so. They have very clearly defined product categories, and they try their best not to overlap and thus cannibalise each other's sales figures. That's why they do an AIO (iMac), a workstation (Mac Pro), and a low-powered desktop (Mac mini), but lack a decent i5/i7 desktop with a proper desktop-class GPU that doesn't come with a monitor built in (the xMac). This is deliberate; it shoehorns you into making a choice.

Offering an eGPU upsets this balance, and means people could get what they want for a lot less:
  • If you want a high-powered desktop, and perhaps already have a decent monitor: Mac Pro. £££
  • If you want just something to get online, and have your own peripherals: Mac mini. £
  • If you want a simple, clean midrange PC: iMac. ££
eGPU changes all that by offering a potentially high-powered desktop experience for less than a Mac Pro sale would net Apple. Look at things like the Intel NUC and you can see where at least some of the industry is going. A lot of people buy the Mac Pro for Xeons, ECC RAM and expandability(!), but a lot buy it because it's the only option that provides a decent dedicated-GPU experience.

So, in a business sense at least, offering an eGPU doesn't make sense. Laptops are much the same: low-, mid- and high-range offerings, and you pick what you need. An eGPU benefits laptops more, since they start off with weaker GPU performance, but again, Apple seem to view Macs these days as disposable, replaceable gear. They've taken everything they know about making iOS devices and applied it to their Mac lineup, and the 2016 MacBook Pro exemplifies this design language to the nth degree: few or none of the components (depending on which model you get) are user-replaceable, and I think that's the biggest clue there is. If Apple were interested in you upgrading your laptop with an eGPU, they'd probably also offer low-profile swappable RAM DIMMs, use the standardised MXM and M.2 form factors, perhaps even not solder in CPUs. But they don't.

Based on that, I can't ever see Apple doing eGPU. Wish it weren't like this, but there you go.
 

Yeah, I agree 100% that Apple wants to be a more disposable company using the iPad model, and I agree they want nothing to do with eGPUs. But they should play friendly with third-party vendors and allow people outside the Apple dystopia to use their equipment how they see fit.

I do know from some people that Apple did experiment with an external monitor, connected over TB3/TB2 only, that had GPUs built into the monitor, for anything that would require decent GPUs: VR, gaming, whatever else. But I think it would expose how limited AMD GPUs are, and they would probably solder the GPUs in, leaving outdated hardware within one generation, like what happened to the Mac Pro. So eGPU is supposed to be part of the TB3 universe. I mean, TB3 was touted as a PCIe replacement.

As for where they are as a company, offering high-end, mid-range and low-end portable computers: they are trying to phase it all out. They are over it. iPads, super-thin low-spec laptops and super-thin low-spec iMacs are the future.

Like right now: where is the Mac Pro? It's in design purgatory. They will either totally abandon anything like a tower, or they will give us something between the Mac mini and Mac Pro and call it good. Then they would only have three "computers" of moderate specs, slim and aluminium, shiny or dull metal looking. The hardware will look beautiful, but the GPU and CPU will be middle of the road. No more Xeons.

The iPhone and iPad made them billions, but they also made them realise they don't need to make computers anymore. They are more like fancy gadget makers whose gadgets replace traditional computers.
 
Apple has a conflict of interest over whether to adopt eGPU. Thinner and lighter design was its motivation for adopting Thunderbolt 3/USB-C, and the other players are following suit. But while Microsoft, PC manufacturers, and Intel are slowly rolling out eGPU support, Apple makes it a challenge for Thunderbolt 3 enclosures even to be recognized by macOS.

There's currently a workaround that lets all Thunderbolt 3 enclosures communicate with macOS. So far PCIe SSDs and other expansion I/O work; GPUs are the exception.

This picture shows a PCIe SSD in a Thunderbolt 2 enclosure, daisy-chained to an eGPU in a Thunderbolt 3 enclosure, connected to a 2016 MBP. All connections linked successfully. The external volume showed up and worked fine, and the Thunderbolt 3 external GPU could even send video output to an external display, but it's crippled by the lack of acceleration and Metal support.

[Image: late-2016 MBP, Thunderbolt 3 to Thunderbolt 2 daisy chain]
 

Love the setup. External peripherals have infinite potential; it's just so confusing to be in the Apple universe. I personally have 3 towers, an iMac, a Mac mini, 3 iPhones, 2 Apple TVs, and a fully Ethernet-wired house. I'm stuck, Apple. It's just weird. Why do they screw over their most loyal customers? Just say: WE DON'T MAKE COMPUTERS ANYMORE, WE DON'T CARE ABOUT POWER USERS, WE JUST LIKE HOW STUFF LOOKS, WE LIKE SLOW GPUS. Stop stringing ex-power users along. It's so frustrating. It's an abusive relationship.

My hope is that Apple is just trying to fix their laptop graphics problems first, and will then tackle their shaky TB3 support. If Adobe, Blackmagic and others made affordable Linux solutions, that would be cool, because Apple couldn't care less. In my industry most high-end applications have a Linux version; they just mark the Linux licenses up so high, because they assume only major studios use Linux for video and film production.
 
Some people have been able to get Thunderbolt to work with an eGPU in OS X. It was extremely limited, obviously, but they did get hardware acceleration and it would drive an external monitor. It was very buggy.


I was surprised too. There is also a specific eGPU enclosure marketed towards Mac owners. The software is CRAP right now, granted, but to say this is absolutely not going to happen due to PR is a stretch.

Not that it matters, but seriously: What the **** are these owners of a 4 grand rMBP going to do with all that port bandwidth?? Granted, the answer could be "Nothing"

Those machines look damn nice in the hands of ironic hipsters writing their novels at Starbucks.
Whoops, missed this. This is what I was talking about:
This picture shows a PCIe SSD in a Thunderbolt 2 enclosure, daisy-chained to an eGPU in a Thunderbolt 3 enclosure, connected to a 2016 MBP. All connections linked successfully. The external volume showed up and worked fine, and the Thunderbolt 3 external GPU could even send video output to an external display, but it's crippled by the lack of acceleration and Metal support.
 
I should draw a clear distinction: Thunderbolt 3 eGPU is the only connection not fully working in macOS. Older eGPU builds based on Thunderbolt 2 enclosures work well on Macs with Thunderbolt support (all three generations).
 
Interesting article on PCIe scaling with the GTX 1080.
I run a 1080 in one of my Mac Pros, exclusively in Boot Camp Windows 10, and consequently at PCIe 1.1 x16, and Doom runs like a monster on this thing.
There's still life in these 5,1 Pros :)
What is somewhat more limiting, however, is the CPU, but again, only marginally so.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/25.html

I've found the performance hit is around 10-15% for the 1080 when coupled with a W3670, 1066MHz RAM and a PCIe 1.1 x16 slot, but performance is still great. Doom gets 80-120fps using Vulkan on a 27" ACD at 2560x1440 on Ultra/Nightmare settings. Not bad for a six-year-old machine :)

The Division is usually around 60 and occasionally dips lower, but I'm blaming Ubisoft for that ;)
 
I'm using an Oculus Rift with a 4,1 (flashed to 5,1) and a GTX Titan X; of course, this machine is still competitive.
 
Not that it matters, but seriously: What the **** are these owners of a 4 grand rMBP going to do with all that port bandwidth?? Granted, the answer could be "Nothing"

Reading about it all, hopefully it's just a catch-up problem, with vendors' hardware having been designed for last gen's TB2. According to Netkas, it's "Intel's Thunderbolt 3 chipset (Alpine Ridge) in combination with the first generation of TI USB-C chipset (TPS65982)".
http://forum.netkas.org/index.php/topic,11654.0.html

This would mean it's not Apple intentionally trying to screw us (fingers crossed), but rather Alpine Ridge paired with the new USB-C chipset (TPS65982).

Hopefully fully compatible eGPUs will be out soon... I know that only USB-C devices are really working on the new MacBook Pros, because it seems the new USB-C chipset is backwards compatible? That's also why TB2 works. But full Thunderbolt 3 support with the new USB-C chipset will take more time to hammer out. At least that's my hope.
 
I've found the performance hit is around 10-15% for the 1080 when coupled with a W3670, 1066MHz RAM and a PCIe 1.1 x16 slot, but performance is still great. Doom gets 80-120fps using Vulkan on a 27" ACD at 2560x1440 on Ultra/Nightmare settings. Not bad for a six-year-old machine :)

A 15% drop is typical for a GTX 1080 at 4GB/s (PCIe 1.1 x16).
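For context on that 4GB/s figure, here's a quick back-of-the-envelope sketch of the link bandwidths being discussed. The per-lane rates are the standard PCIe spec values after encoding overhead; treating Thunderbolt 3's PCIe tunnel as a plain x4 PCIe 3.0 link is a simplification, not a measured number.

```python
# Rough link-bandwidth arithmetic behind the figures in this thread.
# Per-lane rates are usable GB/s per lane after encoding overhead.

PCIE_PER_LANE_GBPS = {
    "1.1": 0.25,   # 2.5 GT/s with 8b/10b encoding -> 250 MB/s per lane
    "2.0": 0.5,    # 5.0 GT/s with 8b/10b encoding
    "3.0": 0.985,  # 8.0 GT/s with 128b/130b encoding
}

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Approximate one-way PCIe bandwidth in GB/s for a given link."""
    return PCIE_PER_LANE_GBPS[gen] * lanes

# The classic Mac Pro 5,1 slot discussed above:
print(f"PCIe 1.1 x16: {link_bandwidth_gbps('1.1', 16):.2f} GB/s")  # 4.00
# The PCIe tunnel inside a Thunderbolt 3 link (x4 PCIe 3.0):
print(f"PCIe 3.0 x4:  {link_bandwidth_gbps('3.0', 4):.2f} GB/s")   # 3.94
```

So a TB3 eGPU gets roughly the same PCIe bandwidth as the old Mac Pro's PCIe 1.1 x16 slot, which lines up with the ~10-15% hit reported above for that slot.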

Hopefully fully compatible eGPUs will be out soon... I know that only USB-C devices are really working on the new MacBook Pros, because it seems the new USB-C chipset is backwards compatible? That's also why TB2 works. But full Thunderbolt 3 support with the new USB-C chipset will take more time to hammer out. At least that's my hope.

That's why I think it's possible Apple may be expecting, helping create, or even making a GPU breakout box themselves. It's probably best if they help someone else, though, as the bugs are probably going to be pretty nasty to start with, and they might have to *gasp* help GPU manufacturers with drivers for OS X.

But once again: WITHOUT the eGPU option, what is anyone going to do with that many TB3 lanes?
 
But once again: WITHOUT the eGPU option, what is anyone going to do with that many TB3 lanes?

I have no idea. Apple's development of new products in a hidden, secret bubble, without any input from customers or the outside world, used to be exciting under Steve Jobs. Now it's worrisome. Their next big announcement might be the ability to use TB3 to connect to an iWatch or an iClock, or to use Apple Pay faster, while pretending the high-end world doesn't exist.
 
Truth be told I think it's unlikely we'll ever see 10XX series drivers for OS X. It certainly could happen, especially if Apple do start pushing eGPUs but it just seems unlikely right now.

Keep in mind that whilst the 9XX series GPUs do indeed have OS X support, the performance is well below what it should be, even when you factor in the quality of OS X ports and the fact that it uses an ancient version of OpenGL.
 
Keep in mind that whilst the 9XX series GPUs do indeed have OS X support, the performance is well below what it should be, even when you factor in the quality of OS X ports and the fact that it uses an ancient version of OpenGL.

In what applications? The Metal version of WoW runs very well on my TITAN X (Maxwell) and is within 10% of Windows with 10.12.2 and the latest web drivers.

Many OpenGL games are heavily CPU limited, and thus run poorly on macOS on a high-end GPU. I get the feeling that many game developers/porters get pretty lazy when dealing with slow low-end GPUs in official Apple products, and thus don't really care that a high-end GPU like a 980 Ti is massively CPU limited. As more games switch over to Metal, this should improve as we no longer have to contend with Apple's implementation of OpenGL.
 
Truth be told I think it's unlikely we'll ever see 10XX series drivers for OS X. It certainly could happen, especially if Apple do start pushing eGPUs but it just seems unlikely right now.

Keep in mind that whilst the 9XX series GPUs do indeed have OS X support, the performance is well below what it should be, even when you factor in the quality of OS X ports and the fact that it uses an ancient version of OpenGL.

There might be some gaming APIs where this is true, like DirectX or Vulkan or something else, pick a flavor. Or maybe you have a flashed Nvidia card that isn't registering as the full card, or is showing limited RAM, like an 8GB card being recognized as 6GB. But for what I do, CUDA, I have found them pretty much equal, and in some ways CUDA on OS X is faster, if the underlying application is faster. I found the same-spec OS X system running Blackmagic DaVinci Resolve is a bit faster with CUDA than the equivalent PC; that's because Resolve came from Linux/Unix and was faster on OS X. But for gaming, I have no idea. Maybe it is, who knows, but using the drivers from Nvidia they should be similar.

Oh, if it's an OpenGL thing? Well, all I can say is OpenGL is crap; that's why there are so many replacements out in the world.
 