Don't get crazy; Oct 30 is a PRO keynote. The diversity of logos only means Apple sees (or wants us to believe it sees) a diversity of PRO users. From that, you can deduce that all the launches will be PRO related, concluding with the mMP introduction (but available 2019).

Analyze that.

What I expect:

  1. Opening: a 30-minute video showing every kind of professional using Apple stuff, from vacuum salesmen (like T. Cook) to Neil deGrasse Tyson searching for black holes (or for a good joke, which he seems to be chasing all the time, without luck)
  2. Then updated iPad Pros, and an updated iPad mini (because only vertical pro markets still use iPad minis).
  3. MacBooks
  4. Then updated iMacs, and maybe an updated iMac Pro.
  5. New peripherals, such as a Touch Bar enabled Magic Keyboard (and maybe a premium Magic Keyboard with two-color e-Ink keycaps for those in video editing, etc., who require multiple keyboards)
  6. Closing with an updated "PRO" Mac mini (don't expect more than 6 cores unless Apple switched to AMD).
  7. And One Last Thing: a Sneak Peek or introduction to the all new "modular" Mac Pro...
And closing with the blatant smiles of Cook, Federighi, and "Can't Innovate My Ass" Schiller...

Why do you think this is a "pro" focused event? I think it will be partly "pro" focused, but more consumer focused. These are mostly consumer products they are expected to release, including their new gaming eGPU (RX Vega 56), which suggests they will give some attention to video games. So expect at least 1 game developer to be on stage. I'd be very surprised if they give us any peek at or additional details on the new Mac Pro.
 
Some people are going to leave because they have work to do that needs newer equipment, and Apple has none for a large chunk of those squatting on an older Mac Pro (or looking to move off approximately equally old Windows systems; some folks who 'left' in 2013-2014 are up on a renewal cycle).
If anyone is still hoping that Apple will suddenly turn around and pay attention to the higher end workstation market (or hell, even the low end) look at the timeline.

2012-10 - HP announces Z420, Z620, Z820 systems with E5-26xx (v1)
2012-10 - Apple selling the older technology cMP systems
2013-10 - HP announces Ivy Bridge CPU upgrades (E5-26xx v2) for Z420, Z620, Z820 systems
2013-10 - Apple's been hyping the trash can for months, although it is still months from shipping
2014-07 - HP announces Z440, Z640, Z840 systems (upgrade to E5-26xx v3)
2014-07 - Apple (crickets)
2016-04 - HP announces Broadwell CPU upgrades (E5-26xx v4) for Z440, Z640, Z840 systems
2016-04 - Apple (crickets)
2017-04 - Apple's tres amigos do the "we screwed up the trash can" mea culpa
2017-07 - HP Updates Z Workstations to Skylake SP: Z8 up to 56 Cores, 3 TB RAM, 9 PCIe Slots, 1700W
2017-07 - Apple (crickets)
2018-04 - Apple - the new, mysterious "modular" Mac Pro won't ship in 2018​

While Apple apologists try to put the blame on Intel, HP manages to make regular significant upgrades to the Z-series.
 
While Apple apologists try to put the blame on Intel, HP manages to make regular significant upgrades to the Z-series.

The way Apple tends to work, they wait on specific components. I bet they've pinned themselves to a specific Intel processor (or AMD if you want) and probably Navi.

If it sounds like I'm trying to blame Intel, I'm not. I'm saying Apple should just work with what they have now as a starting point. People would complain if it was two months dated or whatever, but it's better than nothing. Intel will probably delay them, but Apple shouldn't be working that way in the first place. I'd bet Apple would want to design a new Mac Pro around 10 nm and... well...
 
The way Apple tends to work...
...is counter to the way that many pros want to work.

The concept of agile engineering is: "Agile development is an approach to engineering under which requirements and solutions evolve through the collaborative effort of self-organizing and cross-functional teams and their customer(s)/end user(s). It advocates adaptive planning, evolutionary development, early delivery, and continual improvement, and it encourages rapid and flexible response to change" (from https://en.wikipedia.org/wiki/Agile_software_development - with edits to cover all design rather than just software).

The Apple Mac Pro team is not "agile", but instead "lethargic". Waiting for a CPU from vendor X, a GPU from vendor Y, and an SSD from vendor Z is a recipe for failure.

I've been working on a proposal to replace a couple of dozen Maxwell and Pascal GPUs with Titan V GPUs. Now, I'm reworking it for RTX2080 Ti GPUs. Or possibly a mix of RTX2080 Ti and Titan V GPUs (the RTX2080 Ti sucks at FP64).

(edit: substituted several "RTX2018" references to "RTX2080")
 
I've been working on a proposal to replace a couple of dozen Maxwell and Pascal GPUs with Titan V GPUs. Now, I'm reworking it for RTX2080 Ti GPUs. Or possibly a mix of RTX2080 Ti and Titan V GPUs (the RTX2080 Ti sucks at FP64).


If you ever need anyone to destroy your *cough* EOL *cough* equipment for accounting purposes, I can help you out.

No trouble for me, I insist ;)
 
The way Apple tends to work, they wait on specific components. I bet they've pinned themselves to a specific Intel processor (or AMD if you want) and probably Navi.

If it sounds like I'm trying to blame Intel, I'm not. I'm saying Apple should just work with what they have now as a starting point. People would complain if it was two months dated or whatever, but it's better than nothing. Intel will probably delay them, but Apple shouldn't be working that way in the first place. I'd bet Apple would want to design a new Mac Pro around 10 nm and... well...

Most likely it will have a Radeon Pro 7nm Vega 20. Navi is a mid-range Polaris replacement.
 
Most likely it will have a Radeon Pro 7nm Vega 20. Navi is a mid-range Polaris replacement.

The next Mac Pro will need entry and mid-level cards (the mid-to-high range in consumer space). A 'floor' of Vega 20 is suicidal. From most reports Vega 20 is not going to be anywhere near affordable (AMD is looking to make high-margin money with those; throw Apple's 30% markup on top and they aren't going to be in an affordable price range at all). An audio processing workstation generally isn't going to need anything like that.

Apple is going to need a focused, small set of GPUs. If the module slot is reasonably flexible they could add another card to the BTO list 6-10 months later. Holding up the whole system just for Vega 20 drivers, if they slide 2-3 quarters, is beyond goofy at this point of lateness. If the drivers are ready, they should arrive around the same time as the next-iteration Intel W (both were supposed to be Q4 on roadmaps last year, and that seems to be sliding a bit now for volume availability). A new Mac Pro could ship without a bleeding-edge top-end card. It can't ship without the entry and mid-level cards (since that is what most folks are going to buy).

Waiting for Navi would be a sign of silliness on Apple's part, not because of the range, but because I don't think it was ever roadmapped as a 2018 part back in 2017 (or 2016). It would be silly to couple the Mac Pro to 2019 parts that were still mostly incomplete. So far it seems doubtful that Navi will come early enough in 2019 to make a difference for the Mac Pro (versus the damage done by pushing things out even further).

If Apple and Nvidia ended their feud there are other 2018 options. If they didn't, then a 12nm Polaris 'refresh' would work fine for a launch card. Late in 2019 (or early 2020) they could do a Navi update to the configs. An empty x16 slot for the Nvidia fans would be more important than the flavor of AMD in the default display slot (if not two vendors for the primary default video slot, then an empty one is more important; an inexpensive Polaris option that some folks will just ignore is better than a more expensive Navi option).

GlobalFoundries dropping out of 7nm process services probably means that AMD is going to save 7nm mostly for the high-margin options: Vega 20, and beating down Intel CPUs while they are "stuck" on 14nm (7nm Zen). Navi would be in line behind them.
 
Errrrr, when has gaming been the primary macOS focus (or even top 10)? Even less so for the Mac Pro ... primarily built for gaming? Go back and watch that Mac Pro 2006 introduction video clip someone linked a page or two back. Gaming mentioned even once? No. That AMD is 'down' on multiple GPU dies for gaming doesn't have any particular Mac or Mac Pro relevance.

Actually, the statement says quite the opposite: AMD is not focussing much on delivering powerful cards for gamers (they are far behind on that, even if they will eventually equalise) but, as far as investing in ML and AI implementation with GPGPUs is concerned, they believe that the future is based on multiple card configurations.

What I was wondering is whether Apple will still focus macOS development on exploiting multiple cards, when only the nMP (hopefully) would allow it, or just leave multiple cards used only by vertical applications, therefore ending like the 6,1, which never used both cards at the same time for the same purpose.

They said that one of the mistakes of the 6,1 was believing that the market would have moved to dual configurations instead of one powerful card.

It's kind of confusing to me. :D

That doesn't mean that the standard configuration for the Mac Pro would be dual/multiple GPUs. However, it would be an extremely dubious, bozo move to exclude more than one GPU.

I agree. I think it will start with one card and offer multiple GPU configurations. Whether they will also be from nVidia, PCI-e, or soldered is another discussion.

If Apple is serious, there should be a slot for 'compute' GPUs (if someone wants to add an Instinct, 'Tensor', Nvidia, or other learning/inferencing card, that should be an option). Apple doesn't have to sell them, just enable the easy usage.

That was the 6,1: one card as a GPGPU and the other used for standard graphics. I wish I could use both of them for graphics, or to speed up my simulations when I run them, or even just to speed up my Finder. That needs apps AND an OS that are tuned together, and Apple didn't deliver appropriate libraries (as far as I know) and didn't push developers enough, given that the 6,1 has been on the market since 2014 and Final Cut still runs slower than on a MBP with Quicksync.

But yeah, it would be fantastic if they realised that they would sell ********s of nMPs if they let us choose which and how many cards to install in our workstations and, mainly, let us upgrade them with time and money: not everybody can start from the top configuration.
 
if [it] let us choose which and how many cards to install in our workstations and, mainly, let us upgrade them with time and money: not everybody can start from the top configuration.
(Emphasis added.)

Apple is not about user choice any longer, and hasn't been since circa the 2012 introduction of the rMBP.
 
Yes, I originally included the delivery time dimension in my post but then removed it for brevity. But yes, the latency between any sort of "sneak peek" and the product shipping is what drives the risk. My point is not to evaluate that risk but simply that it is a non-zero risk, and therefore will factor into Apple's decision-making over whether or not to talk about the Mac Pro at the event.

There is no zero-risk option. Pointing at Apple's options and saying there is risk there is a "sky is blue" point.


And as I said in my post, for a "sneak peek" to have enough value to actually bother with, it needs to include sufficient detail to serve its purpose - so I assume if Apple doesn't have a "substantive, coherent system" at this point then they wouldn't do a "sneak peek" anyway.

If Apple is ready and the parts aren't, then a "sneak peek" would be all they could do at this point (e.g., they were shooting for a December 2018 release but major component volume availability slid).

The value in the "sneak peek" is doing damage control. At this point, the "it is just vaporware" memes are starting to grow larger. One of the motivators for the April 2017 meeting was that the Mac Pro "old age" odometer had rolled over the 1,000-day milestone and was about to be in the 1,200-day range; Apple had to rejigger the configuration to re-rationalize the price. At this point (1,773 days) they are only about 7.5 months away from hitting the 2,000-day mark! If Apple thinks HP/Dell/Lenovo sales folks are clowning them now ... wait until they get to 2K days. (If you ignore the MP 2013 completely, they are already past that milestone.)
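
For anyone checking the math, a quick sketch; the 2013-12-19 ship date for the MP 2013 is my assumption, since the post above only gives the day counts themselves:

```swift
import Foundation

// Back-of-envelope check of the "old age odometer" figures above.
let formatter = ISO8601DateFormatter()
let shipped = formatter.date(from: "2013-12-19T00:00:00Z")!  // assumed MP 2013 ship date
let today = formatter.date(from: "2018-10-27T00:00:00Z")!    // roughly when this was posted

let calendar = Calendar(identifier: .gregorian)
let age = calendar.dateComponents([.day], from: shipped, to: today).day!
print("Days since the MP 2013 shipped: \(age)")              // 1773

let remaining = 2000 - age                                    // 227 days
print("Days to the 2,000-day mark: \(remaining)")             // ~7.5 months
```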

On a related note, I also wonder whether an iPad-focused event, presumably intended for a mass-market audience, is the sort of event where you talk about the Mac Pro - perhaps the most niche product in their portfolio - at all.

iPad Pro isn't a "mass market" device, especially if Apple tweaks the price slightly higher again. (Face ID led to iPhone price creep; no good reason to think it won't do the same for the iPad Pro.) The iPad (the regular version, which isn't being released here) is. If the iPad mini shows up as an "even lower" option then perhaps, but that may not happen here.

So if Apple is mostly finished with the Mac Pro, they should sit on it until much later? The product is grossly late. WHATEVER event falls close to when they are getting close to being finished, they should talk about it. Period. Delaying at that point buys absolutely nothing but more hurt (and risk!).

If they finish and there is no "normally" scheduled annual event, then just do it then. (Waiting 2-3 months for WWDC if they were finished is equally ludicrous.) Apple has a multimillion-dollar theater on their campus. If they wanted to do a demo on leap year day they could do it. Whenever they want, they can hold a media event.


The "why" is irrelevant, in my view. And actually I disagree with your suggestion (if I'm reading it correctly) that the loss of upgradeability is a consequence of other things. I believe that the loss of upgradeability is a deliberate product strategy and design principle, and one that permeates across Apple's portfolio. And I make this point because I think it can help inform our thoughts on the new Mac Pro. I believe that the sort of upgradeability that people here want and expect of the new Mac Pro is an anathema to Apple's product principles. And I think that's clear from looking at Apple's behaviour.

The "why" is irrelevant, but then you launch into a "why" this happens. Misdirection doesn't make your point more salient. If "sealing up the case for non-upgradability" is such established, rigid Apple dogma, why didn't they do it to the Mac Pro 2013? Why did they leave the Mac Pro with that design for the last 5-6 years? It would have been relatively trivial to modify the MP 2013 case so it was glued or exotically hex/Torx screwed shut at any point in the last 4 years. So if that's Apple's "job number one" in product design, why didn't they do it in all this available time?

That is why the notion you are hand-waving about probably doesn't inform much about the next Mac Pro. Because it isn't absolutely rigid dogma. Even more so in the Mac Pro subset of the market.

It's not about whether it's POSSIBLE, it's about whether it's INTENDED. These products are INTENDED, deliberately, to not be upgradeable by end users. And that's explicitly stated in Apple's support documentation too. The trend, again, is very clear that Apple has been progressively removing all upgradeability from its products.

All designed products have an intent. And most reasonably complex products have interacting constraints. For example, the most lightweight battery subsystem won't have its own metal container enclosing it. Stripping the Li-ion battery of its rigid protective covering and using the computer system enclosure to protect it is a weight-saving trade-off.

Lawn mowers come with documentation that says don't put your hand into the blade deck while the mower is running, too. There is documentation about what not to do because, frankly, if you don't tell folks what not to do, someone will sue for doing something beyond common sense.

There was an MSI system linked about two pages back (around #9600 or so) that had two MXM GPU cards. If integrating a GPU with Thunderbolt, that is a preferable design trade-off to some Rube Goldberg loop-back system. However, sticking with MXM also caps how 'big' the GPU can be. You still can't wander down the GPU aisle at Fry's/Best Buy/Micro Center and choose most of the GPUs there. Most of those forms don't match the function.

I'd say things like:
- To improve system reliability by only selling Apple-certified "modules" that are engineered and deeply tested with the Mac Pro and its ecosystem for performance and reliability.

Why would Apple design custom DIMM blades when the standard ones work OK? They don't. Again, there is exceedingly little evidence in the iMac Pro or most of the 21.5" iMac line-up that Apple is following this silly dogma.

Where modules meet their design constraints Apple uses them. Where they don't they make their own.

The Mac Pro GPU cards are custom because there is no standard for making PCI-e + DisplayPort edge-pin cards that large.

- To avoid the traditional driver issues that people have on Windows from the ability to plug in any third-party hardware.

This is primarily just horse hokey. Apple has deployed Thunderbolt to almost every Mac in the line-up at this point. That does NOT necessarily drive down driver exposure. Is Apple trying to accommodate devices with tight BIOS-only, 32-bit, or other vintage/obsolete requirements? No. But that has jack spit to do with "upgradability"; that's just Apple's distaste for old 'boat anchor' stuff.

The crappy Windows driver issue is largely avoided by not enabling products that don't meet minimal quality standards, and by not overly enabling drivers that have largely been abandoned by active development.

- To be able to sell Apple-certified modules at a margin.

But Apple would have to make the modules themselves to get a sizable margin. Apple isn't going to do a DAW DSP card. They aren't going to do an 8K video capture card. They largely aren't going to do the vast majority of possible display/monitor options (and will have a connector that hooks to 3rd-party monitors with a standard connector).

Socketed CPUs ... nope.

Is Apple probably going to do their own primary boot SSD via the T-series? Yes. Is Apple going to get into the business of selling 2-4 'loose' internal SSD modules? Probably not. Apple tightly coupling the boot SSD to the security system is at least as much about deeply integrating the security as anything else. Where the pushback from several people comes in is the intertwining of the constraints: not a suggestion of a more functional solution (more secure, in this case), but rather form as a higher design priority than function.


- To be able to sell complete system "solutions" with a set of modules/components/whatever designed for specific use cases (photography, video editing, whatever).

And the highly specialized, photography-only (keyboard/card) Apple-exclusive hardware previously has been what??? Nothing. Software digital audio workstation (DAW)? Yes. Hardware DAW? No.

These roll-up hardware+software solutions have traditionally been collaborations with Apple (3rd-party plug-in/software + Mac base system + 3rd-party hardware). That is unlikely to change. For the vast majority of the Mac line-up, that tie-in can be done through Thunderbolt. But Apple doesn't need it to be completely exclusive. As long as Thunderbolt is viable and growing, there is no need to unnecessarily kneecap the next Mac Pro (the subset of the market that doesn't need any standard slots has the rest of the Mac line-up).


In my view, such a practice would be more in-keeping with Apple's established behaviour.

Several of the above are blown (complete lack of current examples). It isn't established at all.

I'm not suggesting the loss of upgradeability is a symptom of a physical constraint. I'm suggesting the loss of upgradeability is a deliberate design choice.

Design is about making trade-offs. With function and integration being more important than form, you'll get some 'form' losses. Bemoaning the lack of physical form standards for commodity parts is a physical-constraint design choice.
 
Why do you think this is a "pro" focused event? I think it will be partly "pro" focused, but more consumer focused. These are mostly consumer products they are expected to release, including their new gaming eGPU (RX Vega 56) which suggests they will give some attention to video games

An Apple-branded, gaming-focused eGPU???? Since when? Rumor or information from where? (I haven't read everything possible on the Internet, but that is flying extremely under the radar of a lot of places if true.)

Apple worked with Blackmagic to do their eGPU. They worked with Sonnet before that. It would be extremely out of character for Apple to jump into what they had previously deemed a "3rd party opportunity" hardware zone and start pushing players out. Another 3rd party with yet another hardwired eGPU, sure, but something Apple is spending a ton of time on themselves? Why? They don't have a shortage of adaptable eGPUs, and there are at least two hardwired ones already in the mix.


So expect at least 1 game developer to be on stage. I'd be very surprised if they give us any peek at or additional details on the new Mac Pro.

The 1 game developer could just as easily be in the iPad Pro section of the session as the Mac section. Apple makes more money on iOS games than macOS ones. Or a session showing the same game on both systems.
 
Is Apple probably going to do their own primary boot SSD via the T-series? Yes. Is Apple going to get into the business of selling 2-4 'loose' internal SSD modules? Probably not. Apple tightly coupling the boot SSD to the security system is at least as much about deeply integrating the security as anything else. Where the pushback from several people comes in is the intertwining of the constraints: not a suggestion of a more functional solution (more secure, in this case), but rather form as a higher design priority than function.

Whatever extra encryption security the T2 chip may provide, the macOS system should still be alternatively bootable via an external USB drive, completely bypassing the T2 controller.
Otherwise, there would be no easy method of "rescuing" and/or saving data from a borked operating system.
Apple could just as easily provide a single soldered-in M.2 SSD (with T2 controller) for system security, yet still allow alternate internal or external SATA, TB or USB bootable drives that do not utilize the T2 controller.
 
Whatever extra encryption security the T2 chip may provide, the macOS system should still be alternatively bootable via an external USB drive, completely bypassing the T2 controller.
Otherwise, there would be no easy method of "rescuing" and/or saving data from a borked operating system...

This is an interesting sidelight AFAIC, because with the "Apple Tax" on everything, the consumer optimization strategy is going to be to minimize the cost of the T2-controlled internal SSD and then buy a big external for data storage.

At that point, to what degree is the external data going to be protected ... and how?

Because tying it to the T2's encryption probably means that if the desktop goes down, you've just borked all of your data into non-recoverability. As such, the lower downside risk from the user's perspective is to either run without any encryption at all, or to employ a method of encryption which is specifically NOT tied to the desktop and its T2 hardware.
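
The second option is easy enough to do in software. A minimal sketch using Apple's CryptoKit framework (which shipped after this discussion; the file paths and key handling are purely illustrative) of encryption that is deliberately not tied to any one machine's hardware:

```swift
import Foundation
import CryptoKit

// Encrypt a file with a key the *user* holds (e.g. kept in a password
// manager), so recovery never depends on one desktop's T2 chip.
let key = SymmetricKey(size: .bits256)   // generated once, stored by the user

let plaintext = try Data(contentsOf: URL(fileURLWithPath: "/tmp/project.dat"))

// ChaChaPoly is authenticated encryption; .combined bundles the
// nonce + ciphertext + tag into one blob that can live on any disk.
let sealed = try ChaChaPoly.seal(plaintext, using: key)
try sealed.combined.write(to: URL(fileURLWithPath: "/tmp/project.dat.enc"))

// Decryption works on any machine that implements ChaCha20-Poly1305,
// as long as the key is supplied -- no hardware tie-in at all.
let blob = try Data(contentsOf: URL(fileURLWithPath: "/tmp/project.dat.enc"))
let recovered = try ChaChaPoly.open(ChaChaPoly.SealedBox(combined: blob), using: key)
assert(recovered == plaintext)
```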
 
This is an interesting sidelight AFAIC, because with the "Apple Tax" on everything, the consumer optimization strategy is going to be to minimize the cost of the T2-controlled internal SSD and then buy a big external for data storage.

At that point, to what degree is the external data going to be protected ... and how?

Because tying it to the T2's encryption probably means that if the desktop goes down, you've just borked all of your data into non-recoverability. As such, the lower downside risk from the user's perspective is to either run without any encryption at all, or to employ a method of encryption which is specifically NOT tied to the desktop and its T2 hardware.

I'm not sure Apple is concerned about the customers' safety.

As long as they provide a 'recommended' internal solution with some sort of safety measures designed into it, they can point fingers at users and component makers in case of a widespread hardware security issue, as happened some time ago.
And it's easier for Apple to react and avoid further blame - and litigation.

I don't think even Apple would dare to reduce external/alternative internal drives to only working with a specific Mac or confirmed OS attached to them, or the other way around.
Who would buy such a thing?
 
An Apple-branded, gaming-focused eGPU???? Since when? Rumor or information from where? (I haven't read everything possible on the Internet, but that is flying extremely under the radar of a lot of places if true.)

Apple worked with Blackmagic to do their eGPU. They worked with Sonnet before that. It would be extremely out of character for Apple to jump into what they had previously deemed a "3rd party opportunity" hardware zone and start pushing players out. Another 3rd party with yet another hardwired eGPU, sure, but something Apple is spending a ton of time on themselves? Why? They don't have a shortage of adaptable eGPUs, and there are at least two hardwired ones already in the mix.

The 1 game developer could just as easily be in the iPad Pro section of the session as the Mac section. Apple makes more money on iOS games than macOS ones. Or a session showing the same game on both systems.

I’m not claiming it’s Apple-branded, but it will be available from Apple like the Blackmagic eGPU with the Pro 580. I discovered it in the Vega driver in the latest beta. Details are in the Vega & Polaris support thread.
 
Apple won't release a Mac Pro or iMac Pro yet, since AMD's Navi compute-focused GPUs are barely sampling, unless they plan to switch to nVidia Volta (unlikely but still possible). A Mac/iMac Pro with native Volta GPGPU and CUDA 10 support in Xcode would shut some mouths, but it would also be a shot in the foot for Metal unless Apple restricts nVidia to only the top pro machines.

Gaming Macs? Nope, gaming iPad more likely.

I believe the iMac and Mac mini will include the AMD 12nm RX 680 (Polaris 30), faster for most tasks than Vega 64; possibly both will include a Zen CPU for the first time, up to 8 cores if so. The next-gen iMac Pro and Mac Pro are likely to run on Threadripper or Epyc, regardless of whether they have AMD or nVidia GPUs.
 
Apple won't release a Mac Pro or iMac Pro yet, since AMD's Navi compute-focused GPUs are barely sampling, unless they plan to switch to nVidia Volta (unlikely but still possible). A Mac/iMac Pro with native Volta GPGPU and CUDA 10 support in Xcode would shut some mouths, but it would also be a shot in the foot for Metal unless Apple restricts nVidia to only the top pro machines.

Gaming Macs? Nope, gaming iPad more likely.

I believe the iMac and Mac mini will include the AMD 12nm RX 680 (Polaris 30), faster for most tasks than Vega 64; possibly both will include a Zen CPU for the first time, up to 8 cores if so. The next-gen iMac Pro and Mac Pro are likely to run on Threadripper or Epyc, regardless of whether they have AMD or nVidia GPUs.

An all-AMD solution for Macs would be interesting; Ryzen 3/5/7 CPUs & RX5XX series GPUs for the 'regular' laptops & desktops, Threadripper CPUs & Vega GPUs (this includes the Vega-based WX9100 workstation GPU) for the iMac Pro & modular Mac Pro...

But then there is the 2020 transition to ARM...?
 
Actually, the statement says quite the opposite: AMD is not focussing much on delivering powerful cards for gamers (they are far behind on that, even if they will eventually equalise) but, as far as investing in ML and AI implementation with GPGPUs is concerned, they believe that the future is based on multiple card configurations.

Errr, no. Unless AMD has created some obtuse marketing slang for it, MCM stands for "Multi-Chip Module". So if AMD says they are

"... We are looking at the MCM approach ...", they are talking about putting multiple GPU dies on one module. That is how you get to multiple GPUs. It isn't necessarily multiple cards. In fact, quite the opposite: AMD has a track record of putting multiple GPU packages on one card. Using MCM they could do it even more efficiently.

A GPU isn't the whole card. So if they say "multi-GPU is considerably different", you can't leap to the conclusion that it means multiple cards. It can simply mean multiple dies.

For example, Intel put together a package/module with an AMD Vega GPU + HBM RAM and an Intel CPU die in a single package. Intel used EMIB to do it, but that is just a variation on the same general concept.

[Image: Intel EMIB package examples from the Hot Chips 2018 graphics presentation]

https://www.anandtech.com/show/13242/hot-chips-2018-intel-on-graphics-live-blog

In the Intel case, one of the EMIB examples above actually uses an interposer (the middle example, with the HBM and Vega) coupled to an x86 die, with just PCI-e run over the silicon bridge between the two.

The dies don't necessarily have to be heterogeneous. AMD essentially does an MCM when they package multiple 8-core Zen dies into a Threadripper package (to get 16-32 cores with 4 or 8 active on each die).
So AMD could easily be following a similar strategy with the next-gen GPUs. Take two interposers, each with a Vega-like GPU and attached 'local' HBM memory stacks, run an AMD Infinity Fabric connection between the GPUs, and they could create a relatively very large MCM with the two glued together on the same card. (If it is a 'compute'-targeted card they could completely toss the silicon and logic associated with driving external physical display connectors.)

One of the problems is that vendors can only make interposers so big (it is around the same limit as how big you can make a die, since interposers are 'printed'/'fabbed' with similar techniques). Humongous, super-max-sized dies have problems.


What I was wondering is whether Apple will still focus macOS development on exploiting multiple cards,

They already have substantive support. The new eGPU management stuff added with macOS 10.14 allows you to assign a workload to one of multiple GPUs present in the system.

If you mean SLI/Crossfire-style attempts to make a flat-memory-space virtual GPU out of 2 (or more) GPUs for generic graphics ... then no. They didn't before and probably won't now. Computationally, I think Metal 2 is behind the curve of OpenCL 2+ (and CUDA) on several of the "shared, flat memory address space" characteristics. They have work to do there.
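
To make the per-device (rather than unified) model concrete, here's a minimal Swift/Metal sketch of what that 10.14-era "pick a GPU" support looks like; the selection heuristic is just an example, not Apple's policy:

```swift
import Metal

// Enumerate every GPU macOS exposes (discrete, integrated, any eGPU).
// MTLCopyAllDevices() is macOS-only.
let devices = MTLCopyAllDevices()

for device in devices {
    // isRemovable flags Thunderbolt eGPUs; isLowPower flags integrated GPUs.
    print("\(device.name): removable=\(device.isRemovable), lowPower=\(device.isLowPower)")
}

// Example heuristic only: prefer an eGPU, then any discrete GPU,
// then whatever the system default is.
let computeDevice = devices.first(where: { $0.isRemovable })
    ?? devices.first(where: { !$0.isLowPower })
    ?? MTLCreateSystemDefaultDevice()

if let gpu = computeDevice, let queue = gpu.makeCommandQueue() {
    // Command buffers for the actual workload would be built on this queue;
    // nothing here stitches multiple GPUs into one flat address space.
    print("Compute work would go to \(gpu.name)")
    _ = queue
}
```

Note that nothing in that API merges the devices; each MTLDevice is its own island, which is exactly the gap relative to the "shared, flat memory address space" features mentioned above.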


when only the nMP (hopefully) would allow it, or just leave multiple cards used only by vertical applications, therefore ending like the 6,1, which never used both cards at the same time for the same purpose.

The 6,1 did engage multiple cards if you put in the effort. There were multiple issues there. One, that didn't come to OpenCL until well after the 6,1 shipped. Two, some of the 6,1 GPUs didn't fully support OpenCL 2+ concepts. Three, for compute that will result just in ...

They said that one of the mistakes of the 6,1 was believing that the market would have moved to dual configurations instead of one powerful card.

That isn't quite it. The issue was that there wasn't "one" market. Some folks did have workloads that were well aligned with two GPUs. Others didn't. Apple's problem with the 6,1 was twofold. One, they tried to span both markets with one standard configuration that had two GPUs. Two, they switched bets on the path to enabling multiple GPUs (walked away from OpenCL and went Metal). The first was a problem because they didn't have a system they could configure well with just one GPU (e.g., a 300W GPU and a 140W CPU sharing the same 'core' would bleed too much heat to the lower of the two). It is also a problem because the value proposition is damaged when selling something costly to those who don't need it (those who want just one, not necessarily powerful, GPU). The second is a problem because even the "two (or more) GPUs" folks will have adoption problems if the path forward is murky (and there are less turbulent paths to follow).

They also got the projected track of GPU TDP growth wrong. The size/capacity and scope of memory being thrown at the cards had some disconnects with some of the particulars of the 6,1 design. The D700 probably came in over some projected power budget. Apple was also stuck with AMD's problems iterating forward the one 'mid' class GPU.


It's kind of confusing to me. :D

Parts of Apple seemed to be confused themselves.

I agree. I think it will start with one card and offer multiple GPU configurations. Whether they will also be from nVidia, PCI-e, or soldered is another discussion.

It would be equally dubious to solder the 2nd card as to not offer it at all. Part of the problem was not just that there were two "twin" GPUs; part of the issue was that not all of the "2 or more GPUs" users needed the second GPU Apple had to offer. The primary issue is that Apple won't ("can't" is too strong a phrase, but pretty close) make all of the cards for everybody. What they need in the 2nd, nominally empty, slot is something that other vendors can fill. That slot doesn't have to support boot screens or other Mac-specific stuff. As a computational card it only really needs to crunch numbers. But it may not be a GPGPU computational card at all.

That is the other fundamental flaw: the notion that the "dual" cards have to be exact twins. That's nice in some contexts, but it seriously should not be necessary.


There is really no need to solder the primary GPU card either; if they want to sell 'twins', even less so. Solder isn't the primary functional issue. Relatively seamless integration with Thunderbolt probably is (whether or not to export the DisplayPort out of the box on an external-facing edge).


That was the 6,1: one card as a GPGPU and the other used for standard graphics. I wish I could use both of them for graphics, or to speed up my simulations when I run them, or even just to speed up my Finder.

Neither card was directly coupled to an external display connector. In 2016 GPU cards could still only run a couple of displays. The number of displays a modern mid-to-high-end card can run is past good enough for most. Even if you crank up the resolution of a smaller number of displays, things are pretty good.

Getting back to the topic: these "multiple GPU" solutions that AMD is looking into won't necessarily mean more graphics output. The output of machine learning / AI / blockchain isn't graphics data per se.


That needs apps AND an OS that are tuned together, and Apple didn't deliver appropriate libraries (as far as I know) and didn't push developers enough, given that the 6,1 has been on the market since 2014 and Final Cut still runs slower than on a MBP with Quicksync.

One, I think Quicksync is a corner where Apple didn't avoid a single-vendor solution and did put effort into exploiting it. Since all Macs had Intel iGPUs in them, Quicksync worked across the whole product line-up. Having to deal with AMD, Nvidia, Apple, and Intel fixed-function encoders of varying abilities across a fragmented Mac space probably wouldn't have turned out so well.

Second, for video outside the scope of the Quicksync-focused codecs, the speed differences aren't so great. Quicksync is only good for a certain subset of video (it happens to be a popular consumer subset, but a subset nonetheless).
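
For reference, the way apps reach Quicksync on the Mac is through VideoToolbox. A minimal Swift sketch (the resolution and codec are arbitrary, and whether FCPX takes exactly this path is my assumption) that asks for a hardware-backed H.264 encoder:

```swift
import VideoToolbox
import CoreMedia

// Require a hardware-accelerated encoder -- on Intel Macs the H.264
// fixed-function path is typically Quick Sync. If session creation fails,
// no hardware encoder exists for this codec/configuration.
let spec: [String: Any] = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder as String: true
]

var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: nil,
    width: 1920, height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: spec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session
)

if status == noErr, session != nil {
    print("Hardware (Quick Sync class) H.264 encode is available")
} else {
    print("No hardware encoder for this configuration (status \(status))")
}
```

Codecs outside that fixed-function set fall back to software, which is where the MBP's Quicksync advantage evaporates.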


But yeah, it would be fantastic if they realised that they would sell ********s of nMPs if they let us choose which and how many cards to install in our workstations and, mainly, let us upgrade them with time and money: not everybody can start from the top configuration.

It is highly doubtful they would sell "*****s of nMPs". It won't pass the iMac in sales, and is probably at least one (if not two) orders of magnitude off of Mac laptop sales. For the next Mac Pro, it is probably a question of whether the Mac Pro is even a visible slice of any Mac product pie chart Apple execs look at in their weekly sales-tracking meetings (1-2% or less is a sliver that won't even show in a reasonably sized pie chart). If the Mac Pro got to 4% it would be doing well. That isn't huge, though, relative to the rest of the line-up.

AMD's future MCM GPU solution probably would not be the starting/entry configuration at all. In fact, I'd be surprised if it were put in the nominal boot GPU 'slot' at all, even if Apple deployed one in certain BTO configs.

The dual GPUs drive the base price of the Mac Pro from the $2.5K up to the $3K range. That part of the "top configuration" (actually "more expensive" configuration) problem is true. Apple needs to either come back to something closer to $2.5K or shift the component cost budget to something that "everybody" they are targeting will like more (e.g., 16GB base RAM, a bigger SSD, or both). The macOS software stack can easily consume and make good use of both of those with no changes to most actively developed programs.

I’m not claiming it’s Apple-branded, but it will be available from Apple like the Blackmagic eGPU with the Pro 580. I discovered it in the Vega driver in the latest beta. Details are in the Vega & Polaris support thread.

There is nothing there that suggests this is targeted at gaming. If it is a Blackmagic eGPU upgraded from the 580 to a Vega 56, then that system is focused on creation (e.g., as an aid to Blackmagic Resolve), not gaming.
Gaming would be a side effect of what it could also be useful for, but if the Apple dog-and-pony show's major theme is creation ("There is more in the making"), then that's probably what they would demo (something being created), either with Resolve (if Blackmagic is doing the demo) or FCPX (if Apple is doing the demo).
 
I think you're confusing the Mac Pro with the new Mac mini.

I don’t understand what you read in order to form that conclusion. I was offering my opinion that tomorrow’s event will be less pro focused than consumer focused which may include some emphasis on gaming. The new Mac mini with an RX Vega eGPU should be a formidable gaming machine. I haven’t suggested that the new Mac Pro will be geared towards gaming. It’s a professional product. My best guess is that the new Mac mini will have Vega 12 graphics and the new Mac Pro will have Vega 20 graphics. That’s an educated guess based on device IDs in the drivers (Apple is testing unreleased Macs with those GPUs). I don’t believe either will have Navi, except possibly next year’s Mac mini if it is refreshed again.
 
I don’t understand what you read in order to form that conclusion. I was offering my opinion that tomorrow’s event will be less pro focused than consumer focused which may include some emphasis on gaming. The new Mac mini with an RX Vega eGPU should be a formidable gaming machine. I haven’t suggested that the new Mac Pro will be geared towards gaming. It’s a professional product. My best guess is that the new Mac mini will have Vega 12 graphics and the new Mac Pro will have Vega 20 graphics. That’s an educated guess based on device IDs in the drivers (Apple is testing unreleased Macs with those GPUs). I don’t believe either will have Navi, except possibly next year’s Mac mini if it is refreshed again.
Or it can be another gaming card with a new "Pro" name, as usual.
 
If you mean SLI/Crossfire-style attempts to make a flat-memory-space virtual GPU out of 2 (or more) GPUs for generic graphics ... then no. They didn't before and probably won't now. Computationally, I think Metal 2 is behind the curve of OpenCL 2+ (and CUDA) on several of the "shared, flat memory address space" characteristics. They have work to do there.

You might want to get up to speed on the AMD ProRender Engine.

It will view any GPUs (team red or team green) and CPUs as 1 render engine.
 
You might want to get up to speed on the AMD ProRender Engine.

It will view any GPUs (team red or team green) and CPUs as 1 render engine.

And how does the AMD ProRender Engine insert itself into any and every app that needs rendering...?

It is a third-party rendering solution, not the underpinnings of Metal...
 