If it's doing equivalent work, who doesn't want to save money in the process?

All else being equal you are correct. But power efficiency isn't even close to being worth the tradeoffs in platform cross-compatibility (loss of bootcamp, loss of virtualization). No Mac Pro buyer is going to say "Sure, I can't run VMware any more and Boot Camp is gone, but hey -- my power bill is $40 less this month so this is fine."
 
People don't install Boot Camp because they want to boot to Windows, they install Boot Camp because they want to run Windows applications. Windows 10 for ARM really only accomplishes the first one. That it "can still boot to Windows" is not really a feature at all.

Windows on ARM is a way for Microsoft to compete with the thin client/chromebook competition, not a path towards an Intel-less desktop workstation experience. It will get you a familiar Windows environment that can run a browser and a very minimal suite of applications.

Heck, until just three months ago it couldn't even run 64 bit ARM applications. Support for x86 apps (32 bit only) is nascent and comes with a performance penalty.

An ARM-based Mac Pro would be the effective end of Boot Camp. Linux dual boot would still be "ok," but likely compromised if you care about accelerated GPU.

Moreover, it's optimistic to imagine that an ARM platform version of VMware Fusion or Parallels would be coming any time soon (or at all).

macOS on ARM makes total sense to me for MacBooks and maybe even the MacBook Air. I think (hope) it's a non-starter for MacBook Pros and (i)Mac Pros though.

LOL. You misunderstand...

The forthcoming modular Mac Pro (if actually modular as outlined in the last few pages or so of this thread) would debut with a Xeon Brain module, & when Apple introduces the HPC ARM Brain module (2022...?) the original Xeon Brain module can be repurposed as a straight Windows machine...
 
when Apple introduces the HPC ARM Brain module (2022...?) the original Xeon Brain module can be repurposed as a straight Windows machine...

Any hypothetical "brain module" I bought in 2019 would be fully depreciated by 2022 and I wouldn't really care about it any more. If I need a Windows machine in 2022 I'll want one with 2022-era performance, not one that was three years old.

And will I really be happy with my hypothetical 2022 ARM "brain module" that is crippled by three year old interconnect speed when I try to plug it into my other modules? Surely I'll be enthused about Thunderbolt 6's new capabilities in 2022 and uninspired by how limiting my 2019-era modules are.
 
Wrong. DP 1.4 support in Titan Ridge is through USB-C Alt Mode: when you plug a DP 1.4 (USB-C) cable into a Titan Ridge TB3 port, what it actually does is fall back to USB-C Alt Mode for DisplayPort, allowing full DP 1.4 output (without TB3 overhead). Of course, you lose the ability to daisy-chain TB3 peripherals (it's no longer TB3 once you plug in a DP 1.4 USB-C display).

That's called "moving the goal posts", not being incorrect. Yes, there is a DP 1.4 "fall back" mode. However,


"... The big difference is that Titan Ridge adds support for allowing two DisplayPort 1.4 streams to be encapsulated into the TB3 connection, versus two DisplayPort 1.2 streams in case of the previous-gen TB3 controllers. ...
...So as these types of monitors become more mainstream and pure DisplayPort monitors shift over to DP 1.4, Thunderbolt 3 has needed to catch up. ...
... Meanwhile, because DP 1.4 has greater bandwidth requirements, it's worth noting that TB3 displays incorporating Titan Ridge and DP1.4 still cannot exceed 40 Gbps offered by TB3. Formally, one DP 1.4 stream can carry 25.92 gigabits of data per second (32.4 Gbps with overhead) and can support a 5Kp60/8Kp30 display without compression, or a 5Kp120/8Kp60 monitor when the Display Stream Compression 1.2 (DSC) technology is used. However, since in case of the TB3 there is a bandwidth limitation, it will not be possible to plug, say, two 4Kp120 monitors to a single TB3 port on a laptop, despite the fact that Titan Ridge can carry two DisplayPort 1.4 streams. At the same time, something like a single 4Kp144 is now a theoretical possibility (at least for systems with a dGPU). ..."
https://www.anandtech.com/show/12228/intel-titan-ridge-thunderbolt-3

Intel's press release a couple of days earlier.

"...The JHL7540 and 7340 controllers provide computer makers the same Thunderbolt 3 40Gb/s performance and feature set as Alpine Ridge, and also adds DisplayPort 1.4 capability for increased video performance. ...
... Gamers can connect one or daisy chain two Thunderbolt 3 monitors with 2560 x 1440 resolution at a 144Hz refresh rate, while simultaneously transferring massive game libraries between a notebook and game drive at up to 40Gb/s ..."
https://thunderbolttechnology.net/b...yport-14-and-basic-peripheral-compatibility-u

As the number of monitors that ditch dual DP 1.2 TCONs (in favor of a single DP 1.4 stream) rises, Thunderbolt has to stay in the game by transporting DP 1.4 itself, not just being bypassed for it.
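
To make the bandwidth limits in the AnandTech quote concrete, here is a quick back-of-the-envelope script (my own illustration, not from the article). It ignores blanking intervals and DSC, so the numbers are rough, but it shows why Titan Ridge's two DP 1.4 streams still can't add up to two 4Kp120 panels on one 40 Gbps TB3 link:

```python
# Rough DisplayPort-over-TB3 bandwidth check. Constants come from the
# quoted AnandTech article; blanking and DSC are ignored, so treat the
# results as approximations only.

DP14_PAYLOAD_GBPS = 25.92  # one DP 1.4 (HBR3) stream after encoding overhead
TB3_LINK_GBPS = 40.0       # total Thunderbolt 3 link budget

def stream_gbps(width, height, hz, bpp=24):
    """Raw pixel data rate in Gbps (24 bpp = 8-bit RGB, no blanking)."""
    return width * height * hz * bpp / 1e9

for name, w, h, hz in [("5Kp60", 5120, 2880, 60),
                       ("4Kp120", 3840, 2160, 120),
                       ("4Kp144", 3840, 2160, 144)]:
    one = stream_gbps(w, h, hz)
    print(f"{name}: {one:5.2f} Gbps/stream | "
          f"fits one DP 1.4 stream: {one <= DP14_PAYLOAD_GBPS} | "
          f"two streams fit in TB3: {2 * one <= TB3_LINK_GBPS}")
```

One 4Kp120 stream (~23.9 Gbps) fits in a single DP 1.4 stream, but two of them (~47.8 Gbps) blow past TB3's 40 Gbps, which is exactly the limitation the article describes; a 4Kp144 stream (~28.7 Gbps) only works with reduced blanking or DSC, hence "theoretical possibility".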


Titan Ridge TB3 still runs internally in DP 1.2 mode if you plug an actual Thunderbolt monitor into the TB3 daisy chain.

Plugging in an older Thunderbolt docking station monitor will fall back to 1.2 if that is the only DisplayPort consumer around. The "proof" that Titan Ridge is stuck at 1.2 would be to plug a 1.4 monitor in downstream and have it not work.
 
How crazy would be the notion of Apple coming up with their own GPUs based on clustered ARM chips?

SoftBank’s investment fund (owner of ARM Holdings) dumps entire $3.6 billion stake in Nvidia
https://www.cnbc.com/2019/02/06/softbank-vision-fund-sells-nvidia-stake.html
:eek:

Errr. ARM makes GPUs too. So how much sense does it make for Softbank to own ARM and then bet against it by pushing money into Nvidia? Not much.

ARM is also starting to push into the ML area.

"... Arm again iterates the very large compute performance improvement compared to existing solutions, achieving beyond 2x performance boosts in vector workloads. Naturally, the N1’s ARMv8.2 ISA implementation also means that it supports 8-bit dot product as well as FP16 half-precision instructions which are particularly well fitted for machine-learning workloads, achieving performance boosts near 5x the predecessor platform. ..."
https://www.anandtech.com/show/13959/arm-announces-neoverse-n1-platform/4
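
For anyone wondering what an "8-bit dot product" buys you in ML terms: inference weights and activations get quantized to int8 and multiply-accumulated into wider integer sums, which is the inner loop that instructions like ARMv8.2's SDOT execute several lanes at a time. A toy sketch of the pattern (my illustration, not ARM code):

```python
import numpy as np

# Quantize two float vectors to int8, dot them with a wide integer
# accumulator, then dequantize -- the multiply-accumulate pattern that
# int8 dot-product hardware accelerates. Scales chosen so values clip
# at roughly +/-4 standard deviations.
def int8_dot(a_fp, b_fp, scale_a, scale_b):
    a = np.clip(np.round(a_fp / scale_a), -128, 127).astype(np.int8)
    b = np.clip(np.round(b_fp / scale_b), -128, 127).astype(np.int8)
    acc = int(np.dot(a.astype(np.int32), b.astype(np.int32)))  # no overflow at this size
    return acc * scale_a * scale_b

rng = np.random.default_rng(0)
x, w = rng.normal(size=256), rng.normal(size=256)
print(f"fp64 dot: {np.dot(x, w):+.3f}")
print(f"int8 dot: {int8_dot(x, w, 4/127, 4/127):+.3f}")  # close, far cheaper per op
```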

and into ML in the auto context
"... Actual SoCs based on the Cortex-A76AE can scale to up to 64 cores. Besides general-purpose compute cores, Arm’s autonomous-class compute complexes also integrate Mali-G76 graphics cores, ARM’s ML cores, and other necessary IP. Besides, the complexes are set to support Arm’s memory virtualization and protection technologies required for flawless operation of ML and NN accelerators. ..."
https://www.anandtech.com/show/13398/arm-unveils-arm-safety-ready-initiative-cortexa76ae-processor

ARM's ML cores
https://www.anandtech.com/show/12791/arm-details-project-trillium-mlp-architecture


"... SoftBank expects an AI revolution, and that it will occur much faster than any prior revolution ..."
https://www.anandtech.com/show/13488/arm-techcon-2018-keynote-live-blog


ARM doesn't overlap 100% with what Nvidia is doing, but the notion that there is almost zero overlap is exceedingly shallow and myopic. It is as good a time as any to pull the profits from a bet Softbank made several years ago off the table and put them into other stuff. The cryptocurrency mania bubble isn't going to be a "money falls out of the sky" windfall for Nvidia any more. It is going to get tougher for them going forward.


P.S. Even if you look at it as Softbank losing out on a "dividend" check from Nvidia, that too is pragmatically backwards. Nvidia doesn't have a viable general-purpose CPU option other than ARM at the moment. So even if Softbank drops all its Nvidia shares, Nvidia will still be writing checks to Softbank via ARM.
 
Okay - not to be “that guy” but...

With today's report that Intel believes Apple will be transitioning to their own custom ARM chips, isn't this a huge red flag for the Mac Pro? I mean, why put 3 years of R&D (even if it's one guy in the basement) into a product that won't exist for longer than it spent in design?

There is almost nothing in that Axios report that is inconsistent with Apple flipping the MacBook into an iBook and shipping an iOS laptop. It is a path to where iPad apps are "universal", meaning they'll run on iOS too. Intel loses a Mac product, but so would the Mac product lineup.

There is also almost nothing in the Axios story that really confirms the Bloomberg stuff. Lots of this looks like an "echo chamber" more than independent confirmation. The same set of folks listening to the same set of rumors would likely produce exactly the same echoes.

There is almost nothing here that points to Apple cleaning house for ARM at all. Intel isn't the only source for x86. Intel could be dumped with most Mac products still running on x86; there is nothing in the story that conflicts with that either.

If anything, these reports seem to have a warped notion that iOS apps moving to macOS are the only thing Apple is interested in. Taking the giant dumpster of the iOS App Store and dumping it onto macOS... is that really going to "improve" macOS substantially? More "race to the bottom" priced apps. Apps chock full of disco ads. macOS strategically needs substantially more of that? Maybe Apple is that dim.

If the primary objective is iOS apps, then another iOS system seems like just as good a fit as trying to drag Mac products into the loop.


Wouldn't it make more sense to go ahead and "preview" at WWDC with the developer announcement and say "oh, and we will be shipping our first ARM Mac this December" - hitting their self-imposed 2019 product window?

Err, no. What would make more sense is to release the framework first and just see if it works well. Then "step 2" would be to add complexity. The first thing Apple has to prove is that decent apps can be written with this framework for iPad and Mac as they exist now. Once they master that, then they can move on. Moving on before they get that done CORRECTLY is exactly why there are just as many swirling stories about Apple's software quality tanking as there are about "moving to ARM".

For example, iPads are supposed to pick up apps with multiple tabs (window views). Checking that this works in apps on iPad and Mac would be far better than doing something kooky with the hardware.

Moving before 2020 makes about zero sense because they haven't even laid the foundation yet. It would be like doing the Mac OS 9 to Mac OS X transition at the same time as flipping from PPC to x86. Pick one. Do it right, and then move on to the next one.


I mean at the most extreme I think this is the most likely scenario, unless they are involved in some sort of hybrid that can handle the transition. Hell why not have a machine that can switch between the two, pitch that to the developer audience. (I admit I think that is rather absurd but Apple 2019 who knows).

That doesn't look particularly likely at all. If Apple needs some "stub" ARM box to serve as an early-access developer machine for a macOS port, they could simply add more storage/RAM (8GB, 512GB) to an AppleTV. They don't need some "hybrid".

Similarly, if they want folks to buy something: ship an ARM iBook in early 2020, then make it "dual boot" macOS in Fall 2020.



Almost nothing in these "move to ARM" mania stories appears to have anything to do with the Mac Pro in any way. Apple has nothing to move to a Mac Pro with. They haven't even been able to get an x86 one out the door, and that's lower complexity. How are they going to do something even more complex than what they have bumbled for the last 5-6 years???
 
I don't expect Apple to move to ARM until they are ready to replace their entire Mac lineup with it. But when they announce the switch to ARM, I think the transition will be very fast (it only took 14 months from announcement to complete transition when they went from PowerPC to Intel).

If Apple is actually going to move to ARM starting in 2020, I would say the chances of the Mac Pro being ARM-based are more than 50%.
 
How are they going to do something even more complex than what they have bumbled for the last 5-6 years???
Thumbs up. Phil "my ass" and the other amigos are flailing (failing?).

An ARM-based Mac Pro will be Apple's admission that they have failed as a computer company and are only interested in OSX as an Ios development platform. When they move Ios development to Ios - then OSX is dead.

Is an Ios MacBook anything more than a 15" Iphone?
 
Thumbs up. Phil "my ass" and the other amigos are flailing (failing?).

An ARM-based Mac Pro will be Apple's admission that they have failed as a computer company and are only interested in OSX as an Ios development platform. When they move Ios development to Ios - then OSX is dead.

Is an Ios MacBook anything more than a 15" Iphone?

Let's be fair, Aiden. Those phones are as powerful as the average laptop.
 
WWDC Keynote Person (will have) said:
...what makes a Mac a Mac, is macOS, and the tight integration between hardware, software and services. The Processor is not what makes a Mac a Mac, any more than the brand of fan we use to cool it. macOS is an inherently processor agnostic operating system, as we've seen from multiple processor transitions throughout its 20 year history. With our new Marzipan technology, the same app will work unmodified on whatever Apple device you use. We're now in the 'post-processor' era, in which it no longer matters what brand or kind of processor you have in a Mac. Each machine will use whatever processor is best for that specific model, and that might change from year to year, depending on where advances are made. What matters is the overall package, not the individual components, and we think our customers want to pay us to choose the best processor for each device, so that they don't have to be concerned about it.
 
I don't expect Apple to move to ARM until they are ready to replace their entire Mac lineup with it. But when they announce the switch to ARM, I think the transition will be very fast (it only took 14 months from announcement to complete transition when they went from PowerPC to Intel).

Last time, Apple moved from PPC -> x86 for a couple of reasons:

1. move to a broader ecosystem (more implementers at each product level and broader R&D cost sharing)
2. generally more performance, but with the advantages initially somewhat skewed to laptops.
3. Apple was committed to moving all the Mac products forward on close to 12-month iterations.

If Apple is actually going to move to ARM starting in 2020, I would say the chances of the Mac Pro being ARM-based are more than 50%.

That assumes they are looking for the same objectives. Several things suggest that they are not. A move to ARM could mean chasing thinner laptops.

1. Apple largely doesn't update Mac products for years at a time. (The Mac Mini sitting in a comatose state for 4 years was not primarily an Intel-driven problem. Mac Pro... ditto. The entry MBA, still pretty much comatose... ditto.)

2. Most of the hype is about how Apple's ARM is going to catch up to the mid-to-higher-end products, not about better performance with enough headroom to cover an emulator gap.

3. The only way Apple gets a "broader ecosystem" is by staying highly coupled to the iPhones. Otherwise this is backsliding on that aspect.

At this point Apple has three laptop Macs: MBA, MB, MBP
And 3-4 desktop products: Mini, iMac, iMac Pro, Mac Pro (could collapse iMac Pro into iMac)

Apple hasn't updated more than two desktops in the same year since when?? The common pattern for the last 5 years has been to hit the snooze button on at least one desktop.

iMac 2015 -> 2017 -> 2019?
Mini 2014 -----> 2018
Mac Pro 2010 ---> 2013 ----------------------------- > 2019 (supposedly)
iMac Pro 2017 ----> (maybe) 2019 ( or maybe 2020 )

Does that look like a nimble company that can move multiple products forward relatively rapidly? No.
That pattern is absolutely nothing like Apple's pattern in the run up to the 2006 "12 month" conversion.

Apple has shown absolutely no competence at all at updating 6 Mac products of nominal complexity at the same time in a year in over a decade.

It is also probably little more than wishful thinking that putting ARM processors in Macs is going to make them "important", highest-priority, strategic products again. As if the "ARM-ness" in and of itself were primarily the force driving iPhone updates.
 
The capability of Intel Macs to run multiple VMs and Windows apps with very acceptable performance is invaluable and seriously useful, a vital part of many users' workflows, I think.
The transition to ARM will have a negative impact: I'm afraid that performance in VMs will be reduced, and the inability to run any important Windows application natively will force users to switch or to run additional Windows systems.
My opinion is that destroying the current workflows would be a huge step back and a serious problem.
Of course not for all Mac users; Facebook users or people who are just browsing the net will not have any problem at all.
 
It would be crazy to switch the Mac Pro to ARM first - it's a low-volume product that benefits enormously from Intel (or AMD - Threadripper would work very well) architecture and gains nothing from ARM (it doesn't care about power efficiency). It would be extremely difficult to switch, because it would require not only a new chip, but also a new core to work at all as an ARM machine.

If Apple is at all rational (and, like one former CIA director said of the Soviet Union, "they may not always be entirely rational by our standards, but they are rational by their own"), they'll transition the smallest laptops first. The MacBook is a high volume product that doesn't even need a new chip (the most powerful iPad Pro chip will serve), and that can be restricted to the Mac App Store with very few complaints. If that works, they'll transition the 21" iMac and/or the Mac Mini. Those would need a more powerful chip than any iPad, but it could be more of the same cores, rather than a new core (much less expensive development).

I would expect Apple to keep the 27" iMac (probably relabeled into the iMac Pro line), the existing iMac Pro, the MacBook Pro and especially the Mac Pro on Intel for the foreseeable future (AMD is a far more reasonable possibility than ARM/A-series, especially for the desktops). Maybe they have a five-year plan to move everything over, but it would be a huge investment in at least one (and possibly two if you include the Xeon iMac Pro and Mac Pro) new core designs, which have to be updated annually. That's something like half a billion dollars in annual R&D to support part of a 20 billion dollar Mac business (many Macs can use iPad cores, so it's only the upper half of the Mac line that needs a new core).

What the A-series is competitive with is not fire-breathing desktop parts - it is extremely power-optimized mobile parts. The A12x is (more or less) a 7 watt part that is more powerful than comparably powered desktop parts from Intel, and competitive with 15 watt parts (there are task-by-task differences, but that's the range). It already competes favorably with the MacBook processor and the low-power dual-core CPUs in the 2014 Minis and the dual-core 21" iMac.

It's reasonable to assume that a design that doubled the power to the chip and focused on "big" cores only (8 big/0 little instead of 4 and 4) could compete with 28 watt quad-core Intel desktop chips (MBP 13", quad-core 2018 Mini, possibly even the quad-core 21" iMac). If Apple's really lucky, an easy eight-core design like that one might even compete with low-power 6-core Intel designs (higher-end Mac Minis, the lower end of the 15" MBP range).

These are all designs operating at a watt or two per core - even the high-power cores in the A12x are not using more than a couple of watts at most.

Now give it a power budget of 45 watts - you get diminishing returns pushing power through the same cores, so the easiest redesign is to add cores. The 15" MacBook Pro ends up as a 24-core machine with excellent performance as long as the application is multithreaded enough. There are vanishingly few 24-core desktops, let alone laptops on the market (a tiny number of Xeon-SP workstations, soon to be joined by the W3175 and the top two Threadripper 2 chips), and almost no software takes advantage of that many cores. Single-threaded performance is way down from what the top Intel mobile chip could do - the A-series cores are comparable to a MacBook Air core in performance, not the fastest MacBook Pro.

At 90 watts (27" iMac), you end up with 48 cores, each with the performance of a MacBook Air core. At the 300 watt CPU power budget of a Mac Pro, you get 160 cores - there is nothing except a supercomputer that uses 160 cores or more, and supercomputers are notoriously incredibly tough to program. You can't just throw Photoshop on a Cray, even if the OS and CPUs were compatible - and some supercomputers are actually Linux on Intel, so Photoshop won't run, but desktop Linux software actually might. The desktop software would simply take up residence on a few cores and ignore the rest of the machine! On many applications a 160-core Mac Pro would simply emulate a cross between a MacBook Air and a space heater!
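
The arithmetic behind those core counts is just linear scaling from the roughly 1.9 W per big core implied by the 45 W / 24-core example above. A minimal sketch (my restatement of the poster's numbers, assuming power scales linearly with core count, which real chips only approximate):

```python
# Cores-per-power-budget sketch, inferred from the 45 W -> 24 core example.
WATTS_PER_CORE = 45 / 24  # ~1.875 W per "big" A-series core (assumption)

for machine, budget_w in [('15" MacBook Pro', 45),
                          ('27" iMac', 90),
                          ('Mac Pro', 300)]:
    print(f"{machine}: {budget_w} W -> ~{round(budget_w / WATTS_PER_CORE)} cores")
```

That prints 24, 48, and 160 cores, matching the figures above; the point is that holding per-core power fixed turns a bigger thermal budget into more cores, not faster ones.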

Of course, there's nothing keeping Apple from developing a high-power A-series core in the 5 watt range for MacBook Pros and above, and maybe even an extremely high-power core (over 10 watts) for the Mac Pro (and possibly iMac Pro) only. Nothing except that developing a new core costs a huge amount of money (a significant fraction of a billion dollars), updating one annually isn't that much cheaper, and that nobody knows whether Apple would hit the same roadblocks Intel has been dealing with if they tried to scale up their successful A-series design. Just because you can make a great 1 watt core doesn't mean you can also make a 10 watt core.

Even if Apple successfully made high-power cores, and the investment was worth it for the (limited, compared to the iPhone) sales of high-power Macs, they still lose virtualization. A previous poster correctly commented that Windows on ARM is about Windows tablets and very small laptops, not workstations running ArcGIS, AutoCAD and the like. Even if you get Windows running on Boot Camp on ARM, you gain one application that doesn't already run on Macs - Edge. The big Windows applications that might be worth the trouble are Intel/AMD only...
 
Guys, there are NOT going to be a series of stackable Mac Mini shaped boxes. Just look at the rubbish being proposed in some of these videos, it is utterly preposterous. These guys are inventing 'sources' and mocking up images because they make money through their youtube clicks. They have literally no idea.

Apple would never create something as staggeringly inelegant as 'the stack'. There's a reason why Jony is a wealthy man with a job any industrial designer would drop their pants for. Everything Apple offer is as simple and as minimalist as possible. These Mac Mini stacks are exactly the sort of thing that people with no design talent first visualise when hearing the word 'modular'. Real designers don't just stop imagining when the first thing pops into their head.

Sure Apple may have removed connectivity options and forced dongles on some users through their minimalism, but it is laughable to think that they would leave the end result up to somebody having to arrange various physical boxes neatly.

Not to mention the masses of extra aluminium and cables just to house things separately (wastage is bad PR), extra setup processes (however simple) and multiple unnecessary failure points.

The new Mac Pro will be a simple, 2001: A Space Odyssey-style block with very little external detailing, and will be rack mountable while also being desktop presentable at the same time. Think of the minimalism of the current Mac Pro, and then imagine some of the weak points eliminated.

I will repeat. It is NOT going to be a stack of separate boxes for you to pile on top of each other.
 
The rumour of stackable modules is so ridiculous that I'm baffled "technical" websites propagate it without questioning its plausibility.
There are just so many issues.
- A stack is a chain of peripherals, which leads to bottlenecks and latency. It's utterly inefficient. You connect each module directly to the CPU module, not to the module below.
- The modules would have to be somehow screwed to each other to avoid accidentally breaking connectors. Very elegant indeed.
- A GPU module would have to be huge in order to dissipate the heat of any decent GPU.
- GPU modules would require their own power supply, which would lead to several power cords connecting to the stack. Again, how "elegant" is that?
- GPU modules would require a badass, impractical connector if they're supposed to bring any benefit over Thunderbolt eGPU boxes. If they just use Thunderbolt, a GPU module would bring exactly zero benefit. Why would Apple release a module that is specific to just one type of Mac when people could use any cheaper eGPU PCIe box? Not to mention that those boxes can take off-the-shelf GPUs, which the rumoured Mac Pro module probably would not. How to shoot yourself in the foot...
- If modules are stacked, it would mean you're limited to a single GPU module (while you can connect up to 4 eGPUs with Thunderbolt), or that Apple would have to multiply connectors on each side of each module to avoid bottlenecks, which would not solve the issue of latency anyway.

Stacked modules can only work for storage, at best.
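
To put a toy number on the first bullet's bottleneck claim: in a chain, every module's traffic funnels through the single upstream hop to the CPU module, while a star topology gives each module its own dedicated link. A minimal sketch (my illustration, not a real interconnect model), assuming one 40 Gbps TB3-class link per hop and all modules transferring at once:

```python
# Effective per-module bandwidth when n modules transfer simultaneously.
LINK_GBPS = 40.0  # assumed TB3-class link speed per hop

def per_module_gbps(n_modules: int, topology: str) -> float:
    if topology == "chain":  # all traffic shares the one link to the CPU
        return LINK_GBPS / n_modules
    if topology == "star":   # each module has its own link to the CPU
        return LINK_GBPS
    raise ValueError(topology)

for n in (1, 2, 4):
    print(f"{n} module(s): chain {per_module_gbps(n, 'chain'):4.1f} Gbps each, "
          f"star {per_module_gbps(n, 'star'):4.1f} Gbps each")
```

Under those assumptions, four chained modules get ~10 Gbps each while four star-connected modules keep 40 Gbps each, which is the gist of the "connect each module directly to the CPU module" objection.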
 
Guys, there are NOT going to be a series of stackable Mac Mini shaped boxes. [...] I will repeat. It is NOT going to be a stack of separate boxes for you to pile on top of each other.
The Jony Ive part of your comment is pure tripe.
Apple have produced, and still are producing, some ugly kit; they may have a valid reason for it, but nevertheless it is still a thing.
 
The Jony Ive part of your comment is pure tripe.
Apple have produced, and still are producing, some ugly kit; they may have a valid reason for it, but nevertheless it is still a thing.

You don't have to like the designs. But unfortunately you also don't understand them.

You call my comment 'pure tripe' because you think I'm evangelising the man.

I'm not hugely fussed whether he designs a slightly more attractive conventional PC, or something completely avant garde.

But I'm absolutely right about what he's doing.

Pure tripe. lol
 
You don't have to like the designs. But unfortunately you also don't understand them.
That has nothing to do with it, in case you have forgotten we are all individuals with different likes and dislikes. I understand the designs and the technical reasons/limitations for some of them. However none of that makes them beautiful or ugly.
That my friend is a fact.
 
I have no idea if it's a real source or clickbait, but...

- A stack is a chain of peripherals, which leads to bottlenecks and latency. It's utterly inefficient. You connect each module directly to the CPU module, not to the module below.

Apple has already done an ultra high bandwidth connector, that's easy to align, connect, lock in place, and looks elegant - the processor to backplane connector in the 4,1/5,1.
- The modules would have to be somehow screwed to each other to avoid accidentally breaking connectors. Very elegant indeed.

There are easier ways to lock things, for example, a half-circle cam on each top corner, that sits within the volume of a "top" module, but which rotates up into the body of a module placed on top of it - have it mounted physically on the outside, so it's visible at a glance if it's locked.

- A GPU module would have to be huge in order to dissipate the heat of any decent GPU.
The processor module could be as big as an eGPU enclosure that accepts a full-size GPU; "like a Mac Mini" may not mean a box literally the same size. Look how big the processor section of a cheesegrater is, and it arguably has a better design for airflow and cooling than the trashcan.

- GPU modules would require their own power supply, which would lead to several power cords connecting to the stack. Again, how "elegant" is that?

Macs used to come with passthrough power supplies, so you'd plug your monitor's power cord into the power-out port on the Mac. Cables which have a logical flow, for example loopback cables to mux video into a Thunderbolt bus, are very elegant. Complicated internal circuitry requiring software control to route things is the Rube Goldberg option.

- GPU modules would require a badass, impractical connector if they're supposed to bring any benefit over Thunderbolt eGPU boxes. If they just use Thunderbolt, a GPU module would bring exactly zero benefit. Why would Apple release a module that is specific to just one type of Mac when people could use any cheaper eGPU PCIe box? Not to mention that those boxes can take off-the-shelf GPUs, which the rumoured Mac Pro module probably would not. How to shoot yourself in the foot...

"ProLink" it'll just be a PCI extender ribbon, in a different form-factor. On the Mac Pro it can be a set of pins and a socket, on a heavy-duty laptop it can be a cable (with an adapter that plugs into the module pin-ins). "It's External Graphics (because Apple avoids the term eGPU) on Steroids"

Though it might never come to a portable due to being a cold-plug only option.

Personally, I think some sort of full-fat pci socket on a cable, perhaps a locking cable the way a BNC connector locks, is more likely than a stack with passthrough pins, but all the same...

- If modules are stacked, it would mean you're limited to a single GPU module (while you can connect up to 4 eGPUs with Thunderbolt), or that Apple would have to multiply connectors on each side of each module to avoid bottlenecks, which would not solve the issue of latency anyway.

You can have more than one PCI-slotted GPU in a row - that's all this stack concept will be, a stretched set of PCI slots.
You don't have to like the designs. But unfortunately you also don't understand them.

"good design has no need of clean lines, they are a form of decorative ornament"

Jony Ive is not a good designer; or rather, the design culture he has fostered does not produce good design. It produces very attractive, highly ornamented design, which almost universally performs much more poorly than an alternative that concentrated less on the decorative ornament of thinness and minimalism would.

  • The overheating trashcan mac pro, that is Jony Ive design.
  • The bent iPad, and bendable iPhone, that is Jony Ive design.
  • The mouse that can't be used while recharging, that is Jony Ive design.
  • The multiple generations of failing and unreliable macbook keyboards, that is Jony Ive design.
  • The macbook that is unrepairable, except in giant expensive clumps, which has no way to physically remove and recover its storage if the path between it and the data socket is damaged, that is Jony Ive design.

Jony Ive design represents the sacrifice of utility, in the name of ornamental decoration.
 
I'm not telling you to like it. I'm not telling you it's perfect.

I'm telling you why he's not going to make the stack.


(EDIT: fully agree on the mouse, that's enragingly bad)
 