I meant that they built a FAULTY keyboard and then, instead of reverting to the “good old one”, made at least two unsuccessful attempts to iterate on it.
The same could happen to densely-packed Macs - one miss, two misses... why not try a third?
There was never an option to reverse course back to the old keyboard; the new design was too thin to accommodate it. Apple would have had to redesign the body, which is something they were unwilling to do. It takes Apple 3-4 years to do a major redesign, so Apple has been stuck between a rock and a hard place vis-à-vis their laptop keyboard for the last few years. They've tried to apply band-aids to cover the flaws, with limited success.

The plan for the Mac Pro was first to ignore it; then eventually they came up with the idea of the iMac Pro to replace it; then at some point Apple decided that was not going to be good enough, so they decided to build a new Mac Pro. If Apple was satisfied with the "densely-packed Mac" then they would have simply discontinued the Mac Pro and pointed customers to the iMac Pro. For this reason I believe that the Mac Pro will not be yet another trashcan. I could speculate about what I think the new Mac Pro will be like, but it doesn't matter; we will find out on June 3. I still have hope that Apple will make a worthy successor to the cheesegrater.
 
Apple has a very stable, Apple-written driver for AMD graphics. They have a partnership with AMD that provides them GPUs they're happy with. They don't want to write another driver for one machine that'll be the slowest seller in their whole line. They don't want to deal with hot-running, insufficiently cooled PC gaming cards, either...

I'm wondering how much of this Apple / Nvidia spat is down to Nvidia not wanting universal 10-bit output from GeForce cards (to keep that as a Quadro feature). It would be a very Apple thing to say "All Macs have 10 bit output, all the time, end of story. So make a 10 bit driver for all your cards or GTFO".
 
If they provided a $3000 Mac Pro with an open GPU slot and a NVidia driver (the web driver was never all that stable), they'd be dealing with a lot of gaming cards (and not just expensive, relatively conservative Titans, either). They'd also start getting badgered for a lot of less than stable, gaming-specific APIs...

Apple has a very stable, Apple-written driver for AMD graphics. They have a partnership with AMD that provides them GPUs they're happy with. They don't want to write another driver for one machine that'll be the slowest seller in their whole line. They don't want to deal with hot-running, insufficiently cooled PC gaming cards, either...

Riddle me this - close to 95% of computers sold worldwide manage to work with both Nvidia and AMD GPUs.
Some better than others, granted, but you don't see Microsoft refusing to work with a GPU maker just because some people overclock their graphics cards.

So why is Apple doing so poorly on the GPU driver side of things, and has been for a decade or more?

My theory is that they don't like to deal with anything they can't fully control, only work with the lowest bidder, and that Apple is lazy.
Lazy as in they can't be arsed to try harder and prefer to just use what is most convenient and affordable for them.

I really think that's the gist of it - Apple being lazy and cheap.
 
I'm wondering how much of this Apple / Nvidia spat is down to Nvidia not wanting universal 10-bit output from GeForce cards (to keep that as a Quadro feature). It would be a very Apple thing to say "All Macs have 10 bit output, all the time, end of story. So make a 10 bit driver for all your cards or GTFO".

None - this is a long-term petty issue going all the way back to when P.T. Barnum was running the company.
 
None - this is a long-term petty issue going all the way back to when P.T. Barnum was running the company.

Right, but modern Apple can make friendly with Samsung, Qualcomm, etc., so it seems an odd one to bear a grudge over, unless there’s some significant technical reason - sure, Metal might be it, but again, if Apple is looking at introducing a 10-bit (bets on it being 8+2) display, I could imagine they don’t want GPUs in their ecosystem that can’t output 10-bit.
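(Aside: "8+2" usually refers to temporal dithering / frame rate control, where an 8-bit panel alternates adjacent 8-bit levels across frames so the time-average approximates a 10-bit level. A rough illustrative sketch of the idea - my own simplification, not anything Apple has described:)

```python
def frc_frames(v10, n_frames=4):
    """Approximate a 10-bit level (0-1023) with 8-bit frames via
    temporal dithering - the '8+2' / FRC trick."""
    base = v10 >> 2       # nearest-below 8-bit level
    frac = v10 & 0b11     # the 2 leftover bits (0-3)
    # Show `base + 1` in `frac` of every 4 frames and `base` otherwise;
    # the eye averages the frames into an intermediate shade.
    return [min(base + 1, 255) if i < frac else base for i in range(n_frames)]

frames = frc_frames(513)  # 513 -> base 128, frac 1 -> [129, 128, 128, 128]
assert sum(frames) == 513  # the 4-frame average recovers the 10-bit level
```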
 
Right, but modern Apple can make friendly with Samsung, Qualcomm, etc., so it seems an odd one to bear a grudge over, unless there’s some significant technical reason - sure, Metal might be it, but again, if Apple is looking at introducing a 10-bit (bets on it being 8+2) display, I could imagine they don’t want GPUs in their ecosystem that can’t output 10-bit.

Wouldn't it be odd for them to limit their entire range of Macs to AMD GPUs, just for the benefit of the one Apple product no one cares about?
 
Perhaps Apple could "certify" particular cards and support them without having to play nice with every GPU out there. Just sayin'

That's what they've been doing for ages.
I mean, back when you could replace GPUs in a Mac.

The result was exorbitant prices for very few Mac-specific GPUs, which were 1-2 years behind current tech, and a vibrant hacking community.
There was never an option to reverse course back to the old keyboard; the new design was too thin to accommodate it. Apple would have had to redesign the body, which is something they were unwilling to do. It takes Apple 3-4 years to do a major redesign, so Apple has been stuck between a rock and a hard place vis-à-vis their laptop keyboard for the last few years. They've tried to apply band-aids to cover the flaws, with limited success.


Apple had a perfectly fine design readily available - the previous unibody case.
They still have it.

But they wouldn't give up on the Touch Bar or the slot limitations, or reintroduce MagSafe and the old keyboard.
You know, get rid of things that were universally rejected, and bring back what was well liked and even loved.

That's not how Apple rolls, though.
 
That's what they've been doing for ages.
I mean, back when you could replace GPUs in a Mac.

The result was exorbitant prices for very few Mac-specific GPUs, which were 1-2 years behind current tech, and a vibrant hacking community.

With Mojave, Apple did have their official list of GPU make/models that would work with the 5,1. I imagine what @Blair Paulsen was thinking was having a similar list with the modular MP, and the modular MP would have a normal boot screen with any off the shelf PC GPU.
 
I'm wondering how much of this Apple / Nvidia spat is down to Nvidia not wanting universal 10-bit output from GeForce cards (to keep that as a Quadro feature). It would be a very Apple thing to say "All Macs have 10 bit output, all the time, end of story. So make a 10 bit driver for all your cards or GTFO".

No, at least _I_ have 10-bit output with my GTX Titan. This has also worked for other GeForce cards for quite some time, and yes, this is NVIDIA's web driver (obviously).
 
I think we may have had a clue revealed as to the form factor of the new modular Mac Pro...

[Image: dPTmQkP.png]


Obviously, a larger-than-life scale model, but there it is, the new modular Mac Pro...!

Sorry to keep trotting this out, but...

[Image: dPTmQkP.png]


Imagine that cube as a larger than life sneak peek mockup of the new modular Mac Pro...

So, if we go by the SG mini, and assume the pictured cube (remember, larger than life mockup) is as tall as it is wide & deep...

We get a 7.7" x 7.7" x 7.7" modular Mac Pro...

High core count socketed CPU (Xeon or Threadripper?)
Minimum of four RAM slots
Sorry, soldered boot SSD with T2 sidekick
Custom Vega GPU card (MXM format?)
Second slot for optional second GPU
Minimum of four TB3 / USB-C ports
Minimum of four USB-A ports
Two 10Gb Ethernet ports

Blower fans on CPU & GPU(s)...

Keep that coffee hot all day...! ;^p

Bringing this back, with a Copy / Paste from my post over at the AppleInsider forums...

Even though this is not the mini-tower / tower that everyone wants, if Apple allows end users to easily add / change the CPU / RAM / GPU & add additional 2.5" SSDs for expanded storage, AND releases updates (looking at you, GPU module) on a regular schedule, it might work...

If we are looking at a 7.7" x 7.7" enclosure with rounded corners, then the GPU may arrive in the MXM format...?

Look at the work done on the PSU in the new Space Grey Mac mini, now imagine that same horizontal footprint, but extended vertically, a larger PSU with more power...

I could see Apple using a vertical backplane (mounted at the front of the chassis) with five slots for vertical daughtercards; one for CPU / RAM / base GPU (Navi 12), one for storage (T2 & four M.2 NVMe slots, two prefilled & remaining modules initially available from Apple, until OWC steps in), one for base I/O (eight TB3 / USB-C ports, two 10Gb Ethernet ports, one 3.5mm headphone jack), two remaining slots are for GPUs and/or AI / ML specific GPUs...

So basically you are going to have three slots filled from Apple, with you deciding the level of CPU / RAM / SSDs come pre-installed...

You can BTO up the CPU / RAM / storage & you can BTO add up to two Apple-sanctioned GPUs / GPGPUs...

I would imagine bottom intake / rear exhaust Apple-designed blower fans for the CPU & various GPUs...

It is Apple after all, why would they ever give us EXACTLY what we want...! ;^p

More thoughts on this, or are they fever dreams...?!?

So, why has Apple taken so long to replace the Mac Pro...?

Some may say the Mac Pro was slated to fade away into the distance, with the iMac Pro as its successor...

Many ask as to WHY it is taking so long, a mini-tower workstation with a few PCIe slots should be easy...

My idea of the backplane with daughtercards modular Mac Pro cube still stands, but backplanes & daughtercards have been done by Apple before...

But then when one thinks of the smaller chassis size and the TDPs of current CPUs/GPUs, it seems like it would be another thermal corner for Apple to paint itself into...

UNLESS...!!!

Unless it IS a small cube with a backplane & daughtercard system...

But imagine if there were daughtercards that all had the same thing on them, and the more you added (say, up to four daughtercards...?) the more powerful the system became...!?!

Let me introduce you to the new ARM-powered modular Mac Pro...!!!

Still the same SG Mac mini style PSU (same horizontal footprint, but the full height of the chassis), with all I/O on the lower portion of the rear panel (eight TB3 / USB-C ports, dual 10Gb Ethernet, & 3.5mm headphone jack), as well as the power input...

Each daughtercard has the following:

Four A13X Bionic APUs
64GB RAM
2TB SSD

As one adds a new daughtercard, the system integrates the card's resources into one large homogenized pool...

A fully loaded system would consist of sixteen ARM APUs, 256GB RAM, & 8TB SSD...
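(Those totals are just the per-card figures times four cards; a trivial sanity check, using the card spec from the list above:)

```python
cards = 4
apus_per_card, ram_gb_per_card, ssd_tb_per_card = 4, 64, 2  # per-daughtercard spec

total_apus = cards * apus_per_card      # 16 ARM APUs
total_ram = cards * ram_gb_per_card     # 256 GB RAM
total_ssd = cards * ssd_tb_per_card     # 8 TB SSD
assert (total_apus, total_ram, total_ssd) == (16, 256, 8)
```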

The hardware engineering & (especially) the software (both for the homogenized pool of resources AND for the transition from x64 to ARM) would take a few years, I would figure...

So yeah, we just may be looking at the first ARM Mac with the forthcoming modular Mac Pro...!?!

Discuss...! ;^p
 
That's a big deviation from "between Mac Pro 2013 and 2010-2012" models. Neither one of those was about walling off a subset into a different SKU.

Upgradable like a 2012, proprietary like a 2013.

The only subset that would make remote sense would be Apple augmenting what they thought was complete for most, but some did not. For example, if Apple was 'done' with SATA drives, it could put those in a "snap on" case. (It isn't snap-on, but there is a subsection of the Dell 7920 on the right-hand side that is just for SATA drives, ODD, and inserts. The power supply is over there too, but if you just dropped the subassembly (and attached an SSD to the motherboard) you'd still have a fully working workstation. Just smaller.)

Stacking boxes for relatively low-bandwidth peripherals may fit with what they are doing. (A bit of a variation on the Sonnet rack rigs that the Mini/Mac Pro gets tossed into now.) But major, essential components? That is very dubious.

What I've heard is that it will be controversial just like the 2013. So if I was a betting man, and I see a few people talking about a stackable Mac Pro...

I hope stackable isn't right. It sounds like a mess. But at this point I wouldn't make a bet against it.

You can even see the new logic with ARM coming up. They ship an Intel box now, and when you want to go to ARM, you just buy a new brain and leave the rest of the components as is. They could have an i9 brain and a Xeon brain and eventually an ARM brain. In their heads it probably all fits together. They're probably going "ok, the problem people had with Thunderbolt was all the boxes, how can we do the same Thunderbolt thing but without all the separate boxes?"

It would be nice if these were all modular components inside a tower, but I don't think that would rise to a controversial product.
 
I miss the time when you could "down vote" a post here.

Yeah, if it is an ARM Mac Pro related post, it must be bad...

This is a singular thread related to musings on the as-yet-unreleased (or even previewed) 2019 modular Mac Pro in a public forum...

But your wanting to stifle ARM musings, that seems pretty much a fascist move, which is ironic considering your sig...

There, I fired my shot back, now let us not go into a back & forth bickering about an as yet unreleased product...!

ARM Mac Pro 4 Lyfe...!!! ;^p
 
Yeah, if it is an ARM Mac Pro related post, it must be bad...
In no way am I attempting to stifle any discussion. (If you can't state your disagreement with a post, you have no discussion.)

My real objection is to the "Lego Block" architecture. It's been tried without much success in the past, and today the higher speed busses make it even harder. If "Lego Block" had issues with 2.5 MT/sec busses, what does that foreshadow for the current 8 GT/sec (PCIe 3.0), and upcoming 16 GT/sec (PCIe 4.0) and 32 GT/sec (PCIe 5.0).
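(For context on those per-lane rates - my back-of-the-envelope numbers, not the original poster's: the GT/s figure is the raw symbol rate, so usable bandwidth also depends on the line encoding, 8b/10b for PCIe 1.x/2.0 and 128b/130b from 3.0 onward:)

```python
def pcie_lane_bytes_per_s(gt_per_s, payload_bits, line_bits):
    """Usable bytes/s per PCIe lane: raw transfer rate times encoding efficiency."""
    return gt_per_s * 1e9 * (payload_bits / line_bits) / 8

gen3 = pcie_lane_bytes_per_s(8, 128, 130)   # PCIe 3.0: ~0.985 GB/s per lane
gen4 = pcie_lane_bytes_per_s(16, 128, 130)  # PCIe 4.0: ~1.97 GB/s per lane
gen5 = pcie_lane_bytes_per_s(32, 128, 130)  # PCIe 5.0: ~3.94 GB/s per lane
```

Each doubling of the symbol rate roughly halves the timing margin, which is the signal-integrity squeeze being described - hard enough on a motherboard, worse over a connector and cable.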

Have you noticed that the newer Intel CPUs have more memory channels? A simple reason - the higher speed DRAM isn't stable with three DIMMs on a channel. Therefore, add more channels with two DIMMs per channel.

There are serious problems trying to extend the lengths of these higher speed busses even on the motherboard - let alone using an external connector and cable to an external box.

ARM has nothing to do with this.
 
In no way am I attempting to stifle any discussion. (If you can't state your disagreement with a post, you have no discussion.)

My real objection is to the "Lego Block" architecture. It's been tried without much success in the past, and today the higher speed busses make it even harder. If "Lego Block" had issues with 2.5 MT/sec busses, what does that foreshadow for the current 8 GT/sec (PCIe 3.0), and upcoming 16 GT/sec (PCIe 4.0) and 32 GT/sec (PCIe 5.0).

Have you noticed that the newer Intel CPUs have more memory channels? A simple reason - the higher speed DRAM isn't stable with three DIMMs on a channel. Therefore, add more channels with two DIMMs per channel.

There are serious problems trying to extend the lengths of these higher speed busses even on the motherboard - let alone using an external connector and cable to an external box.

ARM has nothing to do with this.

But I am not talking about individual 'boxes / modules', I am talking about a singular cube chassis with an ultra-high-speed backplane & the actual "modules" are the daughtercards...

Oh, but you may be talking about the latency issues with data dispersion / retrieval / processing across a homogenized pool of resources...?
 
Upgradable like a 2012, proprietary like a 2013.

That could primarily hinge upon the default display GPU. Thunderbolt-enabled like 2013 (so more than just PCIe coming off the internal edge(s)), but more discrete like the 2012 (user plug/unplug).


What I've heard is that it will be controversial just like the 2013.

For the folks who compose their workstation starting inside with the GPU card and work their way out .... that is all the controversy Apple would need to generate. The rest could be almost classic standard and there would be dozens of pages of folks moaning and groaning on these forums. ( just like the last 6 years. )

About the same level of controversy if they dumped SATA drives from the standard box.

So if I was a betting man, and I see a few people talking about a stackable Mac Pro...

It is possible, but doesn't seem probable. I wouldn't bet much.


You can even see the new logic with ARM coming up. They ship an Intel box now, and when you want to go to ARM, you just buy a new brain and leave the rest of the components as is.

I don't see much logic there at all. More likely, if Apple goes 100% ARM for the Mac lineup, the Mac Pro is dead. Apple won't do a relatively super-low-volume workstation ARM CPU, and the whole lineup falls back more toward mobile and stops somewhere around the mainstream iMac.

Separating the CPU and GPU in a workstation isn't particularly logical at all. (A 3rd GPU perhaps, a 4th even more so given common demographic usage; the Mac Pro didn't particularly cater to that in the 2012 model anyway. A second GPU is wishy-washy.)

If the "attach box" is primarily a SATA drive cage what is the substantive savings of not just buying new sleds?



They could have a i9 brain and a Xeon brain and eventually an ARM brain.

If looking to roll out a range of Mac Minis, perhaps. But in the workstation space that is mostly Cupertino kool-aid more so than logic (at least on historic Apple trends). That looks like a company highly itching to get into the xMac space and crank up the fratricide on the iMacs. I wouldn't bet on that.


In their heads it probably all fits together. They're probably going "ok the problem people had with Thunderbolt was all the boxes, how can we do the same Thunderbolt thing but without all the separate boxes."

The 'additional' cables weren't the sole primary crux. The relative cost was too. If these "snap on" augments are at least as high, if not higher, than the Thunderbolt options then there is something in their heads ... drugs.

Folks are probably also going to want to consolidate not just the data cable but the power cable. Thunderbolt has a 100W limit, which is good for a couple of drives (several if careful about tiptoeing around initial spin-up spikes and using very modern low-power drives). GPUs would be a different story in size and power. (If you flip between radically different archs in the 'brain box', why would the drivers for these sunk-cost GPUs implicitly follow along with the switch?)

For extremely PCIe-centric solutions, a "Thunderbolt without the box" is just a slot (classic x16 PCIe slot, M.2 slot, U.2, etc.). So yeah, I could see Apple doing an empty x16 slot and perhaps some highly focused SSD slots. Unless manically attached to building a "small as possible footprint" literal desktop, that shouldn't be a major problem. (And more like the Mac Pro 2012.)

On top of that is the glaring problem that Apple R&D can't even get monolithic Mac "boxes" out on a regular and reasonable schedule. (Mini comatose for years, iMac fell into a rut, MacBook largely comatose in terms of design for 4 years, Mac Pro in a looooong rut.) If Apple can't ship a small number of boxes on a regular basis, where is the logic in adding a higher workload of different 'augment' boxes? Or are they supposed to be farming all of that work out? If they are keeping it to themselves, that is highly likely a worse problem.

Apple sells iMacs with a built-in VESA mount, so it would be a bit of a jump for them to sell a Mac that you attach to a 3rd-party thing. The Mac Pro market, though, has systemic problems with critical mass. Some kind of Mac Pro-only connector that 3rd parties will build boxes to snap onto... possible, but not very probable. (Even after some gallons of Cupertino kool-aid... it's going to have major problems if you just clearly think ahead past the "won't this be slick/kool" phase.)

After all of the work Apple and Intel did to get Thunderbolt mainstreamed into the USB standard... now, when the ecosystem is perhaps going to grow at the levels expected initially... Apple is going to introduce a new proprietary 'fork'? Put in lots of work and then walk away. I suppose there is some Cupertino kool-aid logic in that.



It would be nice if these were all modular components inside a tower, but I don't think that would rise to a controversial product.

Missing components (e.g., 3.5" SATA drive sleds superseded by M.2 slots) or a non-Rube-Goldberg TB-enabling default GPU 'card' would be labeled "doom" by more than a few, and that still could be inside the tower.

There is even the "Xeon SP (and dual) or die" crowd that has occupied a sizeable chunk of this thread. Apple "stuck" at 18 cores (and a single CPU package) would be controversial to those folks. 4 DIMM slots rather than 8... some chatter will explode on that too. If Apple doesn't slavishly clone the other, bigger players in the workstation market, there will be "controversy" from a "pitchforks and torches" crowd. Priced higher than the $2999 price point... ditto.
More thoughts on this, or are they fever dreams...?!?

So, why has Apple taken so long to replace the Mac Pro...?

Because it isn't strategic and doesn't make any significant, immediate impact to the overall financial picture of the Mac Product line up or to Apple overall. So it isn't allocated high priority resources and gets done in Apple's copious spare time.

That's the simple explanation that fits. Area 51 aliens or some fancy Buck Rogers explanation is not "most likely" at all.



Some may say the Mac Pro was slated to fade away into the distance, with the iMac Pro as its successor...

literal desktop Mac Pro 2013 --- successor ---> iMac Pro 2017. Mostly yes.

The confusion is that Mac Pro is overloaded as a term. The path that the 2009-2012 model was on and the 2013 model are different. The iMac Pro is far more related to the latter than the former.


Many ask as to WHY it is taking so long, a mini-tower workstation with a few PCIe slots should be easy...

If you don't assign anyone to work on it... it can take forever.


My idea of the backplane with daughtercards modular Mac Pro cube still stands, but backplanes & daughtercards have been done by Apple before...

Actually, the NeXT cube was done by NeXT, not Apple. Also, a 12"x12"x12" cube probably wouldn't work well with Apple's typical literal-desktop constraint objectives.




But then when one thinks of the smaller chassis size and the TDPs of current CPUs/GPUs, it seems like it would be another thermal corner for Apple to paint itself into...

There is no corner if you add volume and fan(s). If primarily independently cooled, 2-3 daughtercards wouldn't run into a "thermal corner" at all.



Let me introduce you to the new ARM-powered modular Mac Pro...!!!

ARM is simply misdirection here. The root cause of the "thermal corner" problem was highly coupled cooling, not architecture.



Still the same SG Mac mini style PSU (same horizontal footprint, but the full height of the chassis), with all I/O to the lower portion of the rear panel (eight TB3 / USB-C ports, dual 10Gb Ethernet. & 3.5mm headphone jack), as well as the power input...

Each daughtercard has the following:

Four A13X Bionic APUs

Four independent computers isn't going to buy much except much higher NUMA impacts and the need for something like MPI (which isn't an Apple core competency).
 
Actually, the NeXT cube was done by NeXT, not Apple. Also, a 12"x12"x12" cube probably wouldn't work well with Apple's typical literal-desktop constraint objectives.

Yeah, I know that NeXT did the NeXT cube, that is why it says NeXT right there on the front of it...

And I never mentioned a 12" cube, if you review the earlier posting, I clearly outline a footprint matching the Space Grey Mac mini (7.7" x 7.7"), but taller, like the stack of five minis shown at the 2018 October Mac event...

There is no corner if you add volume and fan(s). If primarily independently cooled, 2-3 daughtercards wouldn't run into a "thermal corner" at all.

A high core count Xeon & two Radeon VII GPUs would definitely put out more heat than could really be handled by blower fans in an 8" cube, without being thermally limited by design...?

Four independent computers isn't going to buy much except much higher NUMA impacts and the need for something like MPI (which isn't an Apple core competency).

Another possible reason for the long delay, which I did mention in my posting...

Either way, I have outlined a roughly 8" cube with a backplane / daughtercard assembly, which can be either Xeon / Threadripper / AMD GPU / Nvidia GPU based, or it can be a whole bunch of ARM APUs...

Just the right amount of over-engineering for an Apple product...!
 
Separating the CPU and GPU in a workstation isn't particularly logical at all. (A 3rd GPU perhaps, a 4th even more so given common demographic usage; the Mac Pro didn't particularly cater to that in the 2012 model anyway. A second GPU is wishy-washy.)

We're talking about the company that believes everyone should be happy with Thunderbolt GPUs right?

Sure, yes, that would be super complicated in a workstation and unnecessary but uhhhh... again... We're talking about Apple.

It does not seem out of character for Apple at all to say "all GPUs are external." That sounds exactly like Apple.

If the "attach box" is primarily a SATA drive cage what is the substantive savings of not just buying new sleds?

Again, we're talking about SATA sleds and cost savings for customers... did I wander into the wrong forum? This is Apple we're talking about.

There's nothing stopping them from building some higher bandwidth proprietary version of Thunderbolt and basically treating every component in the stack as a short hop Thunderbolt device.

Is it probably a bad idea? Yes. Does that sound exactly like something Apple would do? Does to me.

I would personally hope for the backplane idea over stackable. But again, stackable sounds like exactly the sort of thing Ive would come up with. Backplane isn't "Apple" enough.
 
Yes, Microsoft has the other 92% of the PC market, but:

1.) Apple has over half of the market for PCs (desktop and laptop) over $1000. Most of Windows' overwhelming market share is $300-$500 computers Apple has no interest in.

2.) If you believe Windows is as stable and usable as MacOS, why are you waiting for a new high-end Mac? As many here point out, there is wonderful workstation hardware (in any configuration you want from entry workstations around $3000 to ultimate Z8 configurations north of $100,000) already out there from HP, Puget, and a number of others - but it all runs Windows. If that's not a disadvantage to you, why not choose any one of those systems - Apple won't release something with revolutionarily better hardware.

3.) Like it or not, gaming is a big part of the reason Windows isn't as stable and usable as MacOS. Microsoft supports AMD, NVidia and Intel graphics, but Microsoft also has problems related to that support, and to support for other things games do, some of which affect even systems that are never used for gaming. Apple's aggressive stance on not letting games wag the dog is one of the reasons MacOS works better for many people.

4.) The 10-bit thing rings true... I strongly suspect the new display they're working on is 10-bit, and the next iMac Pro will be as well. What if they're trying to roll out 10-bit graphics starting from the top of the line? If AMD is willing to work with them and NVidia is saying "only our most expensive cards", that could make a real difference.

5.) The Mac Pro won't sell even a million machines a year (if it does, it'll outsell all desktops over $2000 except the iMac combined). Apple sells close to 20 million computers a year (over 30 million if you count iPads as "computers"). Why would Apple (who already has a highly stable graphics driver) develop a new one to handle an option on one computer in 20 that they sell?
 
We're talking about the company that believes everyone should be happy with Thunderbolt GPUs right?

Sure, yes, that would be super complicated in a workstation and unnecessary but uhhhh... again... We're talking about Apple.

It does not seem out of character for Apple at all to say "all GPUs are external." That sounds exactly like Apple.
Matthew Panzarino (TechCrunch): What’s your philosophy on external GPUs?

Craig Federighi: I think they have a place.

Matthew Panzarino (TechCrunch): Seems like it would have offered the maximum flexibility in the space where you would never have to worry about thermal problems theoretically as long as the external GPU was built right.

John Ternus: I think there’s some aspects of them where they’re going to be beneficial and there’s some workloads where they’re going to be less effective.

https://techcrunch.com/2017/04/06/t...-john-ternus-on-the-state-of-apples-pro-macs/
 