Waiting for new CPUs / chipsets? AMD Naples (1 or 2 CPUs) or a single-CPU Ryzen workstation, versus Intel Skylake-X or Kaby Lake-X (1 or 2 CPUs)? Intel single-CPU systems have limited PCIe lanes.

I get Apple waiting for next gen chips. Makes sense.

Full CrossFire / SLI so that games can use it, and/or the ability to do dual-GPU rendering with one card dedicated to GPU compute, so that apps can be coded in whatever way works best for each app.

I'm not sure I follow your second point.
 
I get Apple waiting for next gen chips. Makes sense.
If the design is focused on upgradability, then there's no need to wait for components.

The purchaser may decide to wait for the next CPU, but Apple doesn't need to wait.

On the other hand, the mMP might not be ready for 2 or 3 CPU generations.
 
"Modular"

Perhaps a technical definition of "modular" includes any PC where any part can be swapped for another, but I don't think that's how the term is typically used in context. If it were, almost every PC would qualify, in which case specifically calling out a PC as modular loses its meaning.

I know from past discussions that generally people think of "modular" as having separate, optional modules that can be easily attached and detached to a main central box, like the following images show. In fact, I got these images from searching Google for "modular computer". What doesn't come up in that search is any random typical Dell, HP, etc.

[Image: HP Elite Slice modular PC (IFA 2016)]

[Image: Acer modular "blocks" PC]

[Image: "Christine" modular PC concept]
 
I still believe Apple is priming the Apple crowd for a single GPU machine. It may be upgradeable, but I think it might be a single card machine only. A mini tower. The reason? The people invited to the Apple event keep talking about the "new reality" where pros all moved to 1 GPU, all over Twitter, podcasts, blogs, etc.

These people had very little to say about GPUs last month. But now they are all saying the same thing: most software doesn't use multiple GPUs and pros need one big one. This is beyond the "thermal envelope" problem. Gruber et al. are all saying the same thing: one GPU is the reality.

The evangelists Apple invited didn't think to ask about nVidia, CUDA, or OpenCL. But they all of a sudden know what pros need? Nah. They are mouthpieces for that Apple event. They were invited to convince people that whatever Apple makes is all they need.

I also don't know why Apple needs so much time to "design" what should be a simple, cheap-to-produce solution: a tower case.

This.

Single GPUs are rarely the best solution. A lot of GPU apps scale linearly, so two cards will be almost twice as fast as one. Also, it is possible in some apps to isolate the rendering to the non-display card, allowing the desktop and app to run at full speed while rendering.
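A minimal Swift/Metal sketch of that "compute on the non-display card" idea, assuming the standard macOS Metal device APIs (MTLCopyAllDevices, isHeadless); just an illustration, not how any particular app does it:

```swift
import Metal

// Minimal sketch (macOS): enumerate all GPUs and prefer a headless
// (non-display) device for compute/render work, falling back to the
// default device if every installed GPU is driving a display.
let allGPUs = MTLCopyAllDevices()

let computeDevice = allGPUs.first(where: { $0.isHeadless })
    ?? MTLCreateSystemDefaultDevice()

if let gpu = computeDevice {
    print("Using \(gpu.name) for compute; the display GPU stays responsive.")
    if let queue = gpu.makeCommandQueue() {
        // ... encode compute/render passes on `queue` here ...
        _ = queue
    }
}
```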

If they want a "modular" system, all they have to do is slap a modern dual Xeon mobo back in the 5,1 chassis, change the color of the case, add in some usb 3.1 and it would sell like hotcakes. There really isn't a need to re-invent the wheel.

Apple wants to over-design it, which is why we are in the current "we're sorry about the 6,1" fiasco.

I didn't think it would be possible for them to get it wrong again. But if this is the direction they are taking, then it may be a cluster from the start.
 
"Modular"

Perhaps a technical definition of "modular" includes any PC where any part can be swapped for another, but I don't think that's how the term is typically used in context. If it were, almost every PC would qualify, in which case specifically calling out a PC as modular loses its meaning.

I know from past discussions that generally people think of "modular" as having separate, optional modules that can be easily attached and detached to a main central box, like the following images show. In fact, I got these images from searching Google for "modular computer". What doesn't come up in that search is any random typical Dell, HP, etc.

[Image: HP Elite Slice modular PC (IFA 2016)]

[Image: Acer modular "blocks" PC]

[Image: "Christine" modular PC concept]

You forgot Apple's own "Jonathan" computer concept:


[Image: Apple "Jonathan" modular computer concept]
 
(jeff7117) I think you're trying to read the tea leaves too hard. You're interpreting what the editors left of what the journalists said about what Apple said, not what Apple said. Pessimism and FUD seem to be running rampant. Show me where Apple said that multiple GPUs were a priori a bad idea. It's only a bad idea when you do it the way they did it on the nMP, where only Apple-specific cards would work, and therefore they had to hope that the dual GPUs they happened to have available at the time would satisfy going forward.

Plus, while I realize there are a lot of GPU centric people here, that's not the whole Pro world. I couldn't care less what they put in there, you could ship me a single GTX750ti and I'd be OK with it. I run a cMP for other reasons, and I can't imagine that I'm the only one.

I suppose they could redo the cMP tower, but I suspect they want to go more modular than that. After all, one of the big cMP limitations today is the backplane stuff: PCIe rev 2 and SATA II. If they can work out a way to plug and play more of the backplane bits at a reasonable price, they might have something.
 
(jeff7117) I think you're trying to read the tea leaves too hard. You're interpreting what the editors left of what the journalists said about what Apple said, not what Apple said. Pessimism and FUD seem to be running rampant. Show me where Apple said that multiple GPUs were a priori a bad idea. It's only a bad idea when you do it the way they did it on the nMP, where only Apple-specific cards would work, and therefore they had to hope that the dual GPUs they happened to have available at the time would satisfy going forward.

Plus, while I realize there are a lot of GPU centric people here, that's not the whole Pro world. I couldn't care less what they put in there, you could ship me a single GTX750ti and I'd be OK with it. I run a cMP for other reasons, and I can't imagine that I'm the only one.

I suppose they could redo the cMP tower, but I suspect they want to go more modular than that. After all, one of the big cMP limitations today is the backplane stuff: PCIe rev 2 and SATA II. If they can work out a way to plug and play more of the backplane bits at a reasonable price, they might have something.

Pulled from https://techcrunch.com/2017/04/04/apple-pushes-the-reset-button-on-the-mac-pro/

"Federighi elaborates:


I think we designed ourselves into a bit of a thermal corner, if you will. We designed a system that we thought with the kind of GPUs that at the time we thought we needed, and that we thought we could well serve with a two GPU architecture… that that was the thermal limit we needed, or the thermal capacity we needed. But workloads didn’t materialize to fit that as broadly as we hoped.

Being able to put larger single GPUs required a different system architecture and more thermal capacity than that system was designed to accommodate. And so it became fairly difficult to adjust. At the same time, so many of our customers were moving to iMac that we saw a path to address many, many more of those that were finding themselves limited by Mac Pro through a next generation iMac… And really put a lot of our energy behind that. [But,] while that [upgraded iMac] system is going to be fantastic for a huge number of customers — we want to do more.
"

Also “There’s certain scientific loads that are very GPU intensive and they want to throw the largest GPU at it that they can,” says Federighi. “There are heavy 3D graphics [applications] or graphics and compute mixed loads. Those can be in VR, those can be in certain kinds of high-end cinema production tasks where most of the software out there that’s been written to target those doesn’t know how to balance itself well across multiple GPUs but can scale across a single large GPU.”

That sounds like a single GPU solution to me.

I'm all for a decent single GPU solution as long as there's room for expansion to multiple cards via internal or supported eGPU solutions using non Apple GPU's.
 
(jeff7117) I think you're trying to read the tea leaves too hard. You're interpreting what the editors left of what the journalists said about what Apple said, not what Apple said. Pessimism and FUD seem to be running rampant. Show me where Apple said that multiple GPUs were a priori a bad idea.

Apple Senior Vice President of Software Engineering Craig Federighi:

I think we designed ourselves into a bit of a thermal corner, if you will. We designed a system that we thought with the kind of GPUs that at the time we thought we needed, and that we thought we could well serve with a two GPU architecture… that that was the thermal limit we needed, or the thermal capacity we needed. But workloads didn’t materialize to fit that as broadly as we hoped.

Yes, it did. Multi GPU compute is real, popular and will continue to dominate. Production software is continuing to move to GPU or it will be left behind. Just ask Maxon, The Foundry, Next Limit, etc.

Apple keeps saying "it didn't materialize", but that is false. What happened was the GPU revolution moved quickly to power hungry towers running multiple big GPUs and the Mac Pro couldn't even handle one. Not only that, the CG/mograph/animation industry is overwhelmingly nVidia/CUDA based.

What Federighi said that is true is that Apple painted themselves into a thermal corner. They say over and over in the transcript that the Mac Pro couldn't handle one big GPU on one side. But this is a bit of misdirection. It couldn't handle any number of big GPUs and had no nVidia option, and that's where non-Apple software went.

“There’s certain scientific loads that are very GPU intensive and they want to throw the largest GPU at it that they can,” says Federighi. “There are heavy 3D graphics [applications] or graphics and compute mixed loads. Those can be in VR, those can be in certain kinds of high-end cinema production tasks where most of the software out there that’s been written to target those doesn’t know how to balance itself well across multiple GPUs but can scale across a single large GPU.”

"3d graphics applications" are some of the ones most suited to multi GPU, so I don't know what he's talking about outside of VR.

Hell, even Nuke has multi-GPU support on the nMP!

I'm sure someone could point out some high-end cinema program that only uses one GPU for certain nodes, but I don't have any more time for that. 3D rendering and sims will eat multi-GPU setups all day. And the speed they bring has transformed the industry.
 
I'll reply to your post like a few others have already. What on earth exactly do you think 'modular' means?!?
PC manufacturers have been making the 'modular', 'upgradeable' boxes for all of existence, and so too did Apple, until the trash can.
Complicated?!!? It's the easiest design to pull off, since you're not trying to build a piece of art disguised as a computer.

And if Apple were a normal PC builder, that is exactly what they should build. But Apple is all about doing things differently. They hung a giant banner off the side of 1 Infinite Loop with the word "Different" in 10-million-point font for months.

What you are describing is just the cMP redux. I happen to think that is what Apple should do, but Schiller said the MP team was tasked to do something "really great". That sounds more like they are trying to make something revolutionary, hence the year-plus development time, instead of something evolutionary like a nice practical box. I hope it's the latter, but I'm not sure Apple can help trying to WOW everyone with their design prowess.
 
Pulled from https://techcrunch.com/2017/04/04/apple-pushes-the-reset-button-on-the-mac-pro/

"Federighi elaborates:

[snip]

That sounds like a single GPU solution to me.

I'm all for a decent single GPU solution as long as there's room for expansion to multiple cards via internal or supported eGPU solutions using non Apple GPU's.

Well, it doesn't sound at all like a single GPU solution to me, so we obviously read things differently. I definitely think there's too much "apple can't possibly get this right" bias here that is skewing how people interpret things.
 
Apple Senior Vice President of Software Engineering Craig Federighi:



Yes, it did. Multi GPU compute is real, popular and will continue to dominate. Production software is continuing to move to GPU or it will be left behind. Just ask Maxon, The Foundry, Next Limit, etc.

Apple keeps saying "it didn't materialize", but that is false. What happened was the GPU revolution moved quickly to power hungry towers running multiple big GPUs and the Mac Pro couldn't even handle one. Not only that, the CG/mograph/animation industry is overwhelmingly nVidia/CUDA based.

What Federighi said that is true is that Apple painted themselves into a thermal corner. They say over and over in the transcript that the Mac Pro couldn't handle one big GPU on one side. But this is a bit of misdirection. It couldn't handle any number of big GPUs and had no nVidia option, and that's where non-Apple software went.



"3d graphics applications" are some of the ones most suited to multi GPU, so I don't know what he's talking about outside of VR.

Hell, even Nuke has multi-GPU support on the nMP!

I'm sure someone could point out some high-end cinema program that only uses one GPU for certain nodes, but I don't have any more time for that. 3D rendering and sims will eat multi-GPU setups all day. And the speed they bring has transformed the industry.

I agree. Either they really don't know anything about the CG/mograph/animation/Post Production industry or they don't care outside of FCPX, or both. It's almost like Apple is trying NOT to discuss or get involved with anything from Nvidia/CUDA.

Maybe their reference to things not materializing meant that OpenCL didn't take off like they hoped.

Nvidia put in a LOT of work for years to get CUDA off the ground. It's pretty much the standard for GPU based processing right now.

If their hardware is going to continue to be developed for their software, then I would expect the push for AMD cards over Nvidia will continue from Apple.
 
Well, it doesn't sound at all like a single GPU solution to me, so we obviously read things differently.

Well, we all hope that Apple has the option for multi-GPU. Having the Apple software guy talk about how high end cinema software doesn't scale across GPUs - when we know that is wrong - either hints that they are trying to pump up their single GPU solution or that they simply don't know what they are talking about.

Why would he say that? Final Cut Pro X is dual GPU aware. Did he think Apple is unique and found some secret sauce nobody else discovered?

It strikes me as good old fashioned reality distortion field.
 
Pulled from https://techcrunch.com/2017/04/04/apple-pushes-the-reset-button-on-the-mac-pro/

"Federighi elaborates:
.....

Also “There’s certain scientific loads that are very GPU intensive and they want to throw the largest GPU at it that they can,” says Federighi. “There are heavy 3D graphics [applications] or graphics and compute mixed loads. Those can be in VR, those can be in certain kinds of high-end cinema production tasks where most of the software out there that’s been written to target those doesn’t know how to balance itself well across multiple GPUs but can scale across a single large GPU.”

That sounds like a single GPU solution to me.

Cherry-picking out of context isn't going to help reading comprehension. The paragraph before this:

"...
And though Apple feels that the current Mac Pro does work well for a certain set of customers, for other applications it was essentially at the end of its ability to “get better.”

I ask who, exactly, the pro customers are that most needed the more powerful GPU in a Mac Pro.
.... "

The author explicitly asked for a subset of examples. Not domains that are exclusively single-GPU in their entirety, but areas where there are some single-GPU use cases, for illustration.

The notion that Apple thinks dual GPUs aren't useful and effective is BS. For a certain set of users dual works well, and for another certain set a single GPU works better. Neither of those two subsets completely defines the pro market. Neither. They'll probably try to keep what they have with the current Mac Pro and expand a bit into this other set being talked about, which they don't serve particularly well now.



I'm all for a decent single GPU solution as long as there's room for expansion to multiple cards via internal or supported eGPU solutions using non Apple GPU's.

Apple isn't trying to go to either absolute extreme. Dual GPUs already give good coverage, so they don't have to go for quad GPUs any more than they have to completely abandon dual for one mega single GPU.

If Apple gives themselves a budget of, for example, 380W, and allows up to two connections in that 380W thermal zone in the container, then they could do:
1. one 300W card
2. two 170W cards
3. one 200W and one 150W card
etc.

No, in that example they wouldn't be able to do two 310W cards, but Apple is getting coverage with a lot less than that now. Currently Apple has about a 150W budget for the two GPU cards. It is harder (though technically not impossible) to fold that into a single 300W solution. Nothing Apple said is mismatched with simply wanting a redesigned system that lets them allocate a GPU thermal budget more flexibly. The flexibility will allow them to keep/grab market share as other subsets exit for the MBP/iMac or elsewhere.
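To make that budget arithmetic concrete, here is a toy sketch; the wattages are just the hypothetical numbers from the example above, not anything Apple has stated:

```swift
// Toy example: which one- and two-card combinations fit in a
// hypothetical 380W GPU thermal zone (illustrative TDPs only).
let budgetWatts = 380
let candidateCards = [150, 170, 200, 300, 310]

let singleCardOptions = candidateCards.filter { $0 <= budgetWatts }

var twoCardOptions: [(Int, Int)] = []
for (i, a) in candidateCards.enumerated() {
    for b in candidateCards[i...] where a + b <= budgetWatts {
        twoCardOptions.append((a, b))
    }
}

print("Single-card options: \(singleCardOptions)")
// [150, 170, 200, 300, 310]
print("Two-card options: \(twoCardOptions)")
// [(150, 150), (150, 170), (150, 200), (170, 170), (170, 200)]
```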
Apple Senior Vice President of Software Engineering Craig Federighi:

But workloads didn’t materialize to fit that as broadly as we hoped.
Yes, it did. Multi GPU compute is real, popular and will continue to dominate. Production software is continuing to move to GPU or it will be left behind. Just ask Maxon, The Foundry, Next Limit, etc.

The "broadly" he is speaking of there is across multiple areas of the collective, comprehensive pro market. Multiple GPUs being effective for programmer's compile problems, most legacy audio apps, doctors analyzing images, data analysis , etc. ?? That kind of breadth. Not 4 apps in the exact same product category. Broad isn't enumerating a list of apps that do approximately the same thing. It is a broad set of different algorithms in different application areas; not variations of the same theme.

Rendering image one, rendering image two... those can be highly decoupled. Where the working data set is incrementally interacted with across iterations, having a single larger cache pool pays off on some workloads.


Apple keeps saying "it didn't materialize", but that is false. What happened was the GPU revolution moved quickly to power hungry towers running multiple big GPUs and the Mac Pro couldn't even handle one. Not only that, the CG/mograph/animation industry is overwhelmingly nVidia/CUDA based.

Computer graphics, animation... broadness?

OpenCL being kneecapped on the Mac and CUDA's growth in apps aren't decoupled. That is a contributing component of the "it didn't materialize" that is being swept under the rug here. There were multiple players behind the "it didn't materialize" impediments, including Apple and AMD, between the probable plan in the 2012 time frame and the actual 2014-2016 execution on the Mac Pro.



"3d graphics applications" are some of the ones most suited to multi GPU, so I don't know what he's talking about outside of VR.
...
I'm sure someone could point out some high end cinema program that only uses one GPU for certain nodes,

Again, as posted above, he was asked for illustrative areas where there are some single-GPU use cases, not a broad classification of the entire breadth of use cases for the entire area. Also have to remember the context: there is no SLI/Crossfire-like infrastructure on macOS. Just moving visuals around with lots of data behind them... that is one GPU.
 
The "broadly" he is speaking of there is across multiple areas of the collective, comprehensive pro market. Multiple GPUs being effective for programmer's compile problems, most legacy audio apps, doctors analyzing images, data analysis , etc. ?? That kind of breadth. Not 4 apps in the exact same product category. Broad isn't enumerating a list of apps that do approximately the same thing. It is a broad set of different algorithms in different application areas; not variations of the same theme.

Rendering image one, rendering image two... those can be highly decoupled. Where the working data set is incrementally interacted with across iterations, having a single larger cache pool pays off on some workloads.


Computer graphics, animation... broadness?

So, if I'm reading this correctly, your problem is that my examples are all in one industry? Sorry, that's where I work, and that's where a lot of Mac users work (or used to work). I'm not cherry picking. Those are just the ones I know, and hence where I express my personal frustration and feel like the Apple guy had it wrong.

Multi-GPU acceleration is also used in simulation for science for everything from simulating weather to DNA sequencing, analyzing data for energy exploration, etc. It's everywhere. nVidia even has a section on their website dedicated to multi GPU support across multiple disciplines, if you want to know more.

I feel like Apple always puts people in this position: arguing semantics about the tiniest word choice.

Multi-GPU is out there across many disciplines. I think Federighi was wrong. But I'm tired of debating word choices and semantics at this point. I don't think it's worth any more effort to talk about vaporware.

Peace.
 
So, if I'm reading this correctly, your problem is that my examples are all in one industry? Sorry, that's where I work, and that's where a lot of Mac users work (or used to work). I'm not cherry picking. Those are just the ones I know, and hence where I express my personal frustration and feel like the Apple guy had it wrong.

Multi-GPU acceleration is also used in simulation for science for everything from simulating weather to DNA sequencing, analyzing data for energy exploration, etc. It's everywhere. nVidia even has a section on their website dedicated to multi GPU support across multiple disciplines, if you want to know more.

I feel like Apple always puts people in this position: arguing semantics about the tiniest word choice.

Multi-GPU is out there across many disciplines. I think Federighi was wrong. But I'm tired of debating word choices and semantics at this point. I don't think it's worth any more effort to talk about vaporware.

Peace.
+machine learning and artificial intelligence

That's why I have a number of systems with quad Titan X GPUs. (And there's little communication between GPUs, so SLI or NVLink have no value.)
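As a rough illustration of that pattern (independent jobs per GPU, no peer-to-peer traffic), here is a hypothetical Swift/Metal sketch; the same split-the-work idea applies to CUDA on the Titan boxes:

```swift
import Metal
import Dispatch

// Sketch: fan independent work items out across every installed GPU.
// Each device gets its own command queue and its own slice of the jobs,
// with no communication between cards.
let gpus = MTLCopyAllDevices()
let jobs = Array(0..<64)   // placeholder work items

if !gpus.isEmpty {
    DispatchQueue.concurrentPerform(iterations: gpus.count) { index in
        let gpu = gpus[index]
        guard let queue = gpu.makeCommandQueue() else { return }
        // Round-robin split of the jobs for this GPU.
        let myJobs = jobs.filter { $0 % gpus.count == index }
        for _ in myJobs {
            guard let cmd = queue.makeCommandBuffer() else { continue }
            // ... encode the compute pass for this job here ...
            cmd.commit()
        }
        print("\(gpu.name) handled \(myJobs.count) independent jobs")
    }
}
```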
 
So, if I'm reading this correctly, your problem is that my examples are all in one industry? Sorry, that's where I work, and that's where a lot of Mac users work (or used to work). I'm not cherry picking. Those are just the ones I know, and hence where I express my personal frustration and feel like the Apple guy had it wrong.

You're not the only one. I read Federighi's statements the same way. It appears, from Apple's own comments, that they don't have a grasp on how much GPU usage there is in the pro market outside of their own apps. Nor does it appear that they understand how GPU usage can scale depending on the application.
 
You're not the only one. I read Federighi's statements the same way. It appears, from Apple's own comments, that they don't have a grasp on how much GPU usage there is in the pro market outside of their own apps. Nor does it appear that they understand how GPU usage can scale depending on the application.
Maybe they don't understand GPUs because they don't look beyond OpenCL?

CUDA! CUDA! CUDA! CUDA! CUDA! CUDA!
 
Good point - but the Teslas and Quadros are the same size and same power draw as the Titans and GTX cards.

You can get a GP102 Pascal chip in a $700 GTX 1080 Ti, or the same chip in a $5K to $7K Quadro/Tesla - if ECC on the VRAM is vital.

Note that Apple didn't use a FirePro with ECC in the MP6,1.
While you have a point, it's an oversimplification of the differences between the cards, from certified drivers to the software running on the chips themselves. You can stick a few Titans into a workstation, but you really won't get the same results as a similar Quadro card or duo setup.

Think of it this way. A Quadro card or Fire Pro is purpose made to do a single set of tasks and do them extremely well. A GeForce or regular AMD card is akin to Tim Allen's character, "Tim Taylor," from the show Home Improvement. It's a big barge that does a bit of everything, but falls flat on its face in particular tasks not made for its use.

Quadro cards come with their own unique pros that a regular GeForce card doesn't offer. If I were a pro working in video editing, photo editing, CAD, simulation, etc., a Quadro card would be a must-have, not an "Oh, it's expensive. Let's get a GeForce instead."
 
While you have a point, it's an oversimplification of the differences between the cards, from certified drivers to the software running on the chips themselves. You can stick a few Titans into a workstation, but you really won't get the same results as a similar Quadro card or duo setup.

Think of it this way. A Quadro card or Fire Pro is purpose made to do a single set of tasks and do them extremely well. A GeForce or regular AMD card is akin to Tim Allen's character, "Tim Taylor," from the show Home Improvement. It's a big barge that does a bit of everything, but falls flat on its face in particular tasks not made for its use.

Quadro cards come with their own unique pros that a regular GeForce card doesn't offer. If I were a pro working in video editing, photo editing, CAD, simulation, etc., a Quadro card would be a must-have, not an "Oh, it's expensive. Let's get a GeForce instead."
Do you have links that show that a GP102 in a GTX 1080 Ti gets different results from a GP102 in a Tesla P40?

Didn't think so....

If I were a pro working in video editing, photo editing, CAD, simulation, etc., a Quadro card would be a must-have, not an "Oh, it's expensive. Let's get a GeForce instead."
But apparently you're not. What are your GPUs?

(For the record, the Quadro that came in the Dell Precision workstation that I bought for my home system is sitting in a drawer, and a GTX 960 is in the tower. The Quadro couldn't drive two 4K monitors, the 960 can.)
_____

And to be on-topic, the question is whether the future mMP can support more than one powerful GPU - not whether those GPUs are artificially overpriced "Quadro" or "Tesla" or "FireBird" GPUs vs. the same chips with a consumer brand.

Don't throw a dead cat onto the table.
 
The beauty of the cMP is that it could run from 0 to 4 video cards. Nobody had to argue about what the best number of cards was for all situations because you could configure different hardware for different needs and budgets.

Back then Apple recognized flexibility to some degree and sold the cMP in both 1- and 2-card configurations.

I hope the mMP will be similarly flexible.
 
The whole point of moving compute to the GPU is its massive parallelization capability, which gets you fast results. In apps that are designed with the GPU in mind, scalability is virtually linear.
I am not sure about the single massive GPU thing Apple mentions as an industry trend. What's more powerful than a single GPU? Two of them...
 
I really don't think that's the case. The computer seems designed around heavy processing on two GPUs and a CPU; that's the function the form was built for, not the other way around.
Back then, multi-GPU and GPGPU enhancements seemed right around the corner, but the direction things appear to have gone is very powerful single GPUs.

At least according to Federighi in your quotes, the new Mac Pro simply can't handle one side of the core getting very hot. They were probably thinking more along the lines of 2 GPUs and a CPU collectively generating the same heat/processing power but spread out across all three sides. Instead, it's currently one GPU generating that much heat/processing, and they can't fit one (much less two) of them in the nMP.

I think when they designed it, they made sound decisions around functionality, and the form followed. But there's risk involved when innovating, since you're sort of predicting the future, or attempting to shape it.
In this case, Apple took the risk and it hasn't panned out as some (the designers, and at least a few customers such as myself) had imagined it would earlier in the decade.

SLI is still very capable in Windows for gaming. Mass Effect: Andromeda on my gaming machine at 4K max settings needs SLI to get past the 60 FPS mark. It isn't a 100% increase in performance, but an additional 15 to 40 frames per second is nothing to sneeze at. :)

If Apple would support SLI or Crossfire it would enable better game performance.

Alternatively, could a developer, with proper coding in a tool like AutoCAD or Maya, use additional cards to accelerate some of the viewports? For example, when looking at the top, side, and perspective views, is it possible for the graphics cards to each be assigned a group of viewports to render?
The simple question the bloggers/reporters could have asked (that Apple probably wouldn't answer) is whether it is going to be a literal sits-on-the-desktop system (like the MP 2013) or a desk-side/under-desk system (like the MP 2009-2012). If the primary objective is to push it "out of normal sight," then differences from past desk-side units probably wouldn't be all that great in general form.

If it is supposed to sit out on top of the desk and hit the same noise levels and similar desktop footprint constraints, that might lead to something with new differences.
That is a very good question.

I also wonder if they will give us SATA ports internally for drives. Will they do something like an X99 board but for Xeons (I don't know off the top of my head what chipset that is) and include some M.2 slots or PCIe slots for graphics cards, PCIe SSD drives, or whatever else we can come up with to use in it?

I sure hope so. :) If they can make it between the size of the nMP and the classic but keep it as adaptable as the classic, I would be enthralled.
 
The whole point of moving compute to the GPU is its massive parallelization capability, which gets you fast results. In apps that are designed with the GPU in mind, scalability is virtually linear.
I am not sure about the single massive GPU thing Apple mentions as an industry trend. What's more powerful than a single GPU? Two of them...
It is a trend in the higher consumer space (gamers, "power users"), and as "gamer" hardware became mainstream, the lower end of the workstation market also tends toward the same builds because the components are readily available. A typical graphics / light video editing machine is easily served by one Pascal GPU, probably doesn't need ECC RAM or a Xeon, so you can grab off-the-shelf Skylake and RAM. This is the use case Apple wanted to push the 5K iMac into, where the 6700K is decent enough and it takes a max of 64GB RAM, but the limitations (no SATA bays, no thermal headroom for a more powerful large GPU, still using DDR3 SO-DIMMs, etc.) make it not a very great performer, and the lack of CUDA drivers for even an eGPU solution kills it.

If Apple intends to be as flexible as possible, they almost have to go back to the cheese grater form factor, where the number of configs is not limited by the number of PCI slots (or the lack of them). If they are seriously going for a design that can only use a single (internal) GPU, then it is as limited as the trash can MP; the only real difference is that it would fit a modest demographic who wants something better than an iMac but not necessarily too much better. We will see.
 
I still think we're likely to see something like the Pascal Eggert concept design from last year. All I read into the interview is they want a way to ship a current GPU on a yearly basis.
 