...PCI-E 3.0 reaches about 256 Gbps, while PCI-E 4.0 reaches about 512 Gbps. Any future card you might want to add to an iMP would be more powerful than a Vega, which very likely means it will have a high-bandwidth workflow, fully exploiting PCI-E.....I seriously doubt that your money would be well spent on an ...1180 Nvidia or an AMD Vega 2 card in a TB3 enclosure. At least if you're looking to improve performance on an iMP.
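
For reference, a quick sketch of the arithmetic behind those figures (the per-lane rates and 128b/130b encoding are the published PCI-E spec values; counting both directions of an x16 link gives the ~256 and ~512 Gbps numbers, and the Thunderbolt 3 figure is just the nominal link rate, not usable eGPU payload):

```python
# Back-of-the-envelope PCI-E / Thunderbolt 3 bandwidth figures.
# Per-lane transfer rates and encodings are the published spec values;
# the Thunderbolt 3 number is only the nominal link rate, shared with display/USB traffic.

def pcie_gbps(lanes, gt_per_s, encoding_efficiency):
    """Usable bandwidth per direction, in Gbit/s."""
    return lanes * gt_per_s * encoding_efficiency

pcie3_x16 = pcie_gbps(16, 8.0, 128 / 130)    # ~126 Gbps each way
pcie4_x16 = pcie_gbps(16, 16.0, 128 / 130)   # ~252 Gbps each way
tb3_link  = 40.0                             # nominal TB3 link rate

print(f"PCI-E 3.0 x16: {pcie3_x16:.0f} Gbps per direction ({2 * pcie3_x16:.0f} Gbps bidirectional)")
print(f"PCI-E 4.0 x16: {pcie4_x16:.0f} Gbps per direction ({2 * pcie4_x16:.0f} Gbps bidirectional)")
print(f"Thunderbolt 3: {tb3_link:.0f} Gbps nominal")
```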

There is already good data showing that increasing PCI-E bandwidth has less impact on real-world performance than you'd think.

Adding PCI-E lanes has almost no benefit: http://www.gamersnexus.net/guides/2488-pci-e-3-x8-vs-x16-performance-impact-on-gpus

PCI-E 3.0 has limited benefit over 2.0: https://www.pugetsystems.com/labs/a...E-Speed-on-Gaming-Performance-518/#Conclusion

PCI-E 3.0 has limited benefit over 2.0: http://www.hardwaresecrets.com/pci-express-3-0-vs-2-0-gaming-performance-gain/5/

Thunderbolt 3 shows a varying but often modest performance loss vs. PCI-E. Note that this is still much faster than the original GPU. The point is whether the improvement from baseline is beneficial for the task at hand, not whether it's exactly equal to PCI-E: https://egpu.io/forums/mac-setup/pcie-slot-dgpu-vs-thunderbolt-3-egpu-internal-display-test/

At the WWDC discussion on eGPU support, Apple stated the performance situation is highly variable and there's no single accurate depiction. Some apps are designed to do lots of work on the GPU with fewer API calls, in which case there is a minimal TB3 performance penalty. Other apps make lots of little calls to the GPU, which would have more overhead on TB3. Furthermore, the efficiency and utility of eGPU support will keep changing as developers adopt the new macOS features that support it.
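
To put that "lots of little calls" point in concrete terms, here is a toy model (all latency and bandwidth numbers below are made up for illustration, not measurements of any real GPU or enclosure):

```python
# Toy model of why call count can matter more than raw link bandwidth over TB3.
# The data size, link rates and round-trip costs are invented for illustration only.

def frame_time_ms(total_mb, n_calls, link_gbps, round_trip_us):
    """Time to move `total_mb` of data split across `n_calls` submissions."""
    transfer_ms = (total_mb * 8) / link_gbps        # 1 Gbps moves 1 Mbit per ms
    overhead_ms = n_calls * round_trip_us / 1000.0  # fixed round-trip cost per call
    return transfer_ms + overhead_ms

for n_calls in (10, 100, 1000):
    internal = frame_time_ms(20, n_calls, link_gbps=126, round_trip_us=5)
    external = frame_time_ms(20, n_calls, link_gbps=22, round_trip_us=50)
    print(f"{n_calls:>5} calls: internal slot ~{internal:5.1f} ms, eGPU ~{external:5.1f} ms")
```

With a handful of big submissions both links finish well within a frame; with a thousand little ones the per-call overhead, not the bandwidth, is what blows up the external case.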

Given these varying parameters, nobody can say whether a future eGPU added to an iMP would be held back enough by TB3 bandwidth to stop being cost effective. It depends on the exact workflow and how that future app is written to use the GPU. Today nobody knows that with any confidence.
 
At the WWDC discussion on eGPU support, Apple stated the performance situation is highly variable and there's no single accurate depiction. Some apps are designed to do lots of work on the GPU with fewer API calls, in which case there is a minimal TB3 performance penalty. Other apps make lots of little calls to the GPU, which would have more overhead on TB3. Furthermore, the efficiency and utility of eGPU support will keep changing as developers adopt the new macOS features that support it.

Sure, you are right. I never meant to say that it is a linear progression, but I am thinking that for a Pro user's workflow (not a gamer's), between simulations with many iterations, massive deep learning training sessions, and other GPGPU-intensive tasks, an internal slot would do a better job than an external bus.

I am not against eGPUs. I am just saying they might not be the best deal for Pro users on a high-end machine.

Btw, since this chat is full of GPU experts (I am serious :D, it's not sarcasm!), can anybody explain to me the difference between:

* Vega Frontier Edition

* Vega Radeon Instinct

* Vega Radeon XP (or whatever the gaming card is called)

I am pretty sure I understand the difference between the gaming card and the FE, but not so sure about the difference between the FE and the Instinct.

Which flavour would we expect to see in the iMP and mMP (hypothetically)?
 
I am pretty sure I understand the difference between the gaming card and the FE, but not so sure about the difference between the FE and the Instinct.

Vega Instinct is designed to accelerate deep learning inference and training. Tom's Hardware has a review of the Instinct line at http://www.tomshardware.com/news/amd-radeon-instinct-miopen-deep-learning,33170.html

Which flavour would we expect to see in the iMP and mMP (hypothetically)?

The iMac Pro will come with Vega Frontier Edition GPUs as that is the workflow it is designed around. The new Mac Pro will also use Vega Frontier Edition, but I could see Vega Instinct Accelerators possibly being available, as well.
 
I don't think the FE is set to compete with the Quadro. It has been presented by AMD itself as a "pro + gaming" bridge, more or less like a Titan Xp, with drivers that favour Pro apps but let games run pretty well.
 
Any chance that the 27" iMac Pro is the "RAM not upgradeable" model and an unannounced 32" iMac Pro will have the little RAM upgrade door?
 
Guys.. marketing machine - or not - this iMac Pro will be the best present under the Christmas tree one could wish for.... This is how I see it.... the thought alone tastes sweet like honey.. an iMac Pro unboxing on Christmas... hmmm, this goes down well with a cinnamon "Glühwein"... prost.. till then, I guess my good old best-computer-I-ever-had MP5.1 has to push through.. cheers

-- and I didn't mention the year...

but one day..

maybe....

..
 
I don't see Apple offering a larger iMac Pro.

The rumored Apple-branded 32" 8K monitor will be for the new Mac Pro.
Originally - when there were only rumors of an iMP - I took it as a given that they'd make it 32" in order to accommodate more/bigger components and correspondingly offset the larger thermal demands. Then, when it was announced as 27", it occurred to me that the reason was probably synergy: being able to leverage the construction (especially of screens) and distribution (including boxing and storage) chain of the existing iMacs is a huge money-saver for them. Tim "Logistics Guy" Cook would surely have seen that solution a mile away.

But if they go ahead and release 32" screens then it kind of blows a hole in that theory. It's not like anyone in the market for an iMP is going to turn down the extra size.

Plus, they've marketed the iMP heavily as being multi-display capable (see their website) - how poor would it look if their standalone monitors dwarfed the actual main screen? It'd be a clumsy thing to do.
 
how poor would it look if their standalone monitors dwarfed the actual main screen? It'd be a clumsy thing to do.

I don't see a problem. It happens daily with a MBP connected to screens. I think the 27" choice is driven mainly by operations. The iMP uses lots of parts in common with other iMacs. It would be uneconomical to build a niche product with a dedicated full assembly line.
 
I don't see a problem. It happens daily with a MBP connected to screens.

But that's because a laptop is, by design, small and portable. On top of which, when people use a monitor with their laptops it's usually as the main screen when they're at home, not as a dual-monitor setup. No one's going to dual-screen a 15" laptop and a 32" screen; you'd just swap. When you plug a monitor into an iMac on the other hand, you'd expect it to be a second screen, not a replacement of the iMac's internal one.

The iMP uses lots of parts in common with other iMacs. It would be uneconomical to build a niche product with a dedicated full assembly line.
That's what I was driving at before. If there was no 32" screen for the 2018 MP in development then I'd understand why they stuck to a 27", because you could roll it out on the same assembly line as every other desktop. But once you start creating 32" screens for the 2018 MP as is rumored, then it seems a bit niche to develop that screen purely for the fraction of Mac users who'll get an MP. You'd surely get a bit more bang for your buck sticking that screen (and perhaps that chassis, if it's similar in proportion) into the iMP and combining the entire Pro assembly line, rather than rolling out the iMP with the regular iMacs and depending upon the sales of the 2018 MP alone to justify those larger screens and any other bespoke components that end up in that machine.
 
Typically 32" screens are now "just" 4K, with a PPI much lower than the retina offerings from Apple. If Apple were to maintain retina pitch (~215PPI) across all their products, a 32" iMac / display would need to be roughly 6000x3400, which is an odd resolution that's neither DCI nor Rec. The next step up is 8K resolution that entails a ~43" panel, which treads into too-big territory for desktop usage.

If the iPad Pro's ProMotion feature is any indication of Apple's direction in graphics, my bet is that we will see a 120Hz Apple display at sub-27" sizes before they offer any bigger models.
 
When you plug a monitor into an iMac on the other hand, you'd expect it to be a second screen, not a replacement of the iMac's internal one.
Typically 32" screens are now "just" 4K, with a PPI much lower than the retina offerings from Apple.

So for a person working with 4K video, for example, you use the 32" 4K display for the video output and the iMac Pro's 5K display to hold your application and tools. So the 5K display is the "main" display and the larger display is the output.


If there was no 32" screen for the 2018 MP in development then I'd understand why they stuck to a 27", because you could roll it out on the same assembly line as every other desktop. But once you start creating 32" screens for the 2018 MP as is rumored, then it seems a bit niche to develop that screen purely for the fraction of Mac users who'll get an MP.
The next step up is 8K resolution that entails a ~43" panel, which treads into too-big territory for desktop usage.

The Dell UltraSharp UP3218K 8K display is 32" and offers Retina-level 280 dpi at a native 7680 × 4320 resolution. It's also $5000.

So one could reasonably expect that a base 8K 32" iMac Pro with the same specs as the base 5K iMac Pro would be at least $3000 more expensive. Folks are already foaming at the mouth on the $5000 price of the entry iMac Pro. An $8000 or even $9000 base price would likely send this forum into hysterics. :p
 
So for a person working with 4K video, for example, you use the 32" 4K display for the video output and the iMac Pro's 5K display to hold your application and tools. So the 5K display is the "main" display and the larger display is the output.

The Dell UltraSharp UP3218K 8K display is 32" and offers Retina-level 280 dpi at a native 7680 × 4320 resolution. It's also $5000.

So one could reasonably expect that a base 8K 32" iMac Pro with the same specs as the base 5K iMac Pro would be at least $3000 more expensive. Folks are already foaming at the mouth on the $5000 price of the entry iMac Pro. An $8000 or even $9000 base price would likely send this forum into hysterics. :p
Yes, I do see the value of a proofing monitor off to the side; I did it myself with some of my past setups, especially when doing video editing. That secondary monitor can have a lower-than-Retina pitch since it naturally sits further away; practically speaking, it is not a desktop display in that sense. The question is whether Apple would be interested in offering that, when not even a desktop display option is available for now.

As for the Dell, it seems to be more of a technical demo than anything practical. There is close to no realistic benefit in going higher than the Retina pitch; once the pixels are dense enough, you would need to lean in closer to see any more detail. On mobile devices, or even VR goggles or AR glasses, we may have a need for that, but on a larger-format display the bigger diagonal requires the viewer to sit further away to fit it within their field of view. Even with Apple's own lineup, the screens' PPI gradually scales down to compensate for that.
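
For what it's worth, the usual rule of thumb behind that: if you take "dense enough" to mean roughly one pixel per arcminute of visual angle (the common 20/20-vision criterion, an assumption here rather than Apple's official definition of Retina), the PPI you actually need drops off quickly with viewing distance:

```python
# Required pixel density for ~1 pixel per arcminute at a given viewing distance.
# The 1-arcminute criterion is the common 20/20-vision rule of thumb, used here
# only to illustrate why a bigger display viewed from further away needs fewer PPI.
import math

ARCMINUTE = math.radians(1 / 60)

def retina_ppi(viewing_distance_in):
    """PPI at which one pixel subtends one arcminute at this distance."""
    return 1 / (2 * viewing_distance_in * math.tan(ARCMINUTE / 2))

for distance in (12, 18, 24, 36):
    print(f'{distance}" away -> ~{retina_ppi(distance):.0f} PPI is enough')
# roughly 286 PPI at 12" (phone), 143 PPI at 24" (desktop), 95 PPI at 36"
```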

Anyhow, I just don't see Apple offering an 8K display in any meaningful manner, at least not as a foundation model, for similar reasons to why we don't, and probably won't ever, see a 4K 15" MBP.
 
As for the Dell, it seems to be more of a technical demo than anything practical...Anyhow, I just don't see Apple offering an 8K display in any meaningful manner, at least not as a foundation model.

I am assuming LG is the manufacturer of Dell's 8K display so I am also assuming Apple feels there could be some benefit to having it as an option - it would allow one to edit and view 4K video while still having plenty of area left over for all the tools, negating the need to have a separate 4K display for preview.

Honestly, if Apple does launch their own branded "8K Thunderbolt Display", I fully expect they will also offer their own branded "5K Thunderbolt Display" using the same LG panel as the UltraFine 5K.
 
Vega Frontier Edition is the equivalent of the Quadros (designed for workstations). Radeon RX Vega will be the cards designed for gaming.

I was expecting more from AMD with Vega.

Compared to Quadro cards it's a pretty good deal when comparing performance. However, it lacks the certified drivers. And it gets even weirder: AMD states that smaller businesses forgo GPUs with certifications due to the high cost. But if a business is going to forgo certified GPUs anyway, there are GPUs that are cheaper and better performing than the FE, so where exactly does that leave it?
 
I am assuming LG is the manufacturer of Dell's 8K display so I am also assuming Apple feels there could be some benefit to having it as an option - it would allow one to edit and view 4K video while still having plenty of area left over for all the tools, negating the need to have a separate 4K display for preview.

Honestly, if Apple does launch their own branded "8K Thunderbolt Display", I fully expect they will also offer their own branded "5K Thunderbolt Display" using the same LG panel as the UltraFine 5K.
That's the gut feeling of the tech press as well; the original 5K LG panel was deployed almost exclusively by Apple and Dell. To be honest, I would much rather have the exact panel used in the LG UltraFine 5K, but with more than one input option, so it is not stuck being just my iMac's secondary display. Thunderbolt 3 / USB-C hub or not does not concern me, but of course if it is USB-C native then there is one less adaptor in the chain to worry about. I find it odd that LG's own USB-C offerings are capped at 100% sRGB (the 32UD99 and 27UD88); if they had DCI-P3 or Adobe RGB, I would have pulled the trigger already.

Btw, the TB3 / USB-C DisplayPort alt mode is limited to version 1.3, which is another factor limiting future Macs' ability to output to higher-bandwidth displays, such as the Dell UP2718Q, which requires DP 1.4 for full 4K 60Hz 10-bit HDR. This raises another question: if Apple releases anything requiring more bandwidth than the current 5K iMac display, it will probably need proprietary wiring, unless it is built in internally with a GPU.
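
Rough bandwidth math for anyone curious (a sketch only: the HBR2/HBR3 payload figures are the standard post-8b/10b numbers, the 10% blanking allowance is an assumption, and raw bandwidth is only part of the story, since things like HDR10 metadata support are tied to the DP version rather than the link rate):

```python
# Uncompressed video payload vs. DisplayPort link budgets.
# Link payload rates are the standard 4-lane figures after 8b/10b coding
# (HBR2 = 17.28 Gbps, HBR3 = 25.92 Gbps); the 10% blanking allowance is an assumption.

def video_gbps(width, height, hz, bits_per_channel, blanking=1.10):
    return width * height * hz * bits_per_channel * 3 * blanking / 1e9

modes = {
    "4K 60Hz 10-bit": video_gbps(3840, 2160, 60, 10),
    "5K 60Hz 8-bit":  video_gbps(5120, 2880, 60, 8),
    "8K 60Hz 10-bit": video_gbps(7680, 4320, 60, 10),
}
links = {"DP 1.2 (HBR2)": 17.28, "DP 1.3/1.4 (HBR3)": 25.92}

for mode, need in modes.items():
    ok = ", ".join(name for name, cap in links.items() if need <= cap)
    print(f"{mode}: ~{need:.1f} Gbps -> {ok or 'exceeds a single link (needs DSC or two cables)'}")
```

That last line is why an 8K panel like the Dell needs two DP 1.4 cables (or compression), while 5K just squeezes into a single DP 1.3/HBR3 link.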
 
So for a person working with 4K video, for example, you use the 32" 4K display for the video output and the iMac Pro's 5K display to hold your application and tools. So the 5K display is the "main" display and the larger display is the output.

I was only referring to an Apple-built 32" screen, which would presumably be 5K, 8K (or maybe even 10K) - not a generic 4K screen from a third party.

The point is, it'd be insane for Apple to develop a 32" screen and only use it in standalone monitors that are marketed toward the tiny percentage of Mac Pro users. Doubly so if it's higher than 5K, in which case they have to completely rebuild the OS to accommodate the higher-than-retina resolutions. From a development pipeline perspective, it'd be a missed synergy opportunity not to retrofit those screens, chassis, and distribution methods to higher end iMacs/iMac Pro too.

In short, if they're actually developing an 8K (or whatever it turns out to be) 32" screen, it's guaranteed to make its way back into the iMac range, at least in the Pros.
 
Btw, the TB3 / USB-C DisplayPort alt mode is limited to version 1.3, which is another factor limiting future Macs' ability to output to higher-bandwidth displays, such as the Dell UP2718Q, which requires DP 1.4 for full 4K 60Hz 10-bit HDR. This raises another question: if Apple releases anything requiring more bandwidth than the current 5K iMac display, it will probably need proprietary wiring, unless it is built in internally with a GPU.

Yes, I expect the new Mac Pro will offer dedicated DisplayPort 1.4 ports and/or HDMI, and an Apple 8K display will almost certainly use dual DP 1.4, like the Dell UltraSharp UP3218K, to drive it.


The point is, it'd be insane for Apple to develop a 32" screen and only use it in standalone monitors that are marketed toward the tiny percentage of Mac Pro users.

It's been said that Apple was insane for not developing their own 5K display years ago, then doubly so for sub-contracting it out to LG (and then triply so after the wireless interference issues with the LG display came to light).

Apple is going to invest scores of millions on developing a new Mac Pro even though only 1-2% of their Mac market will ever buy one. That is arguably insanity from a purely fiscal perspective. Compared to that effort, dropping another million or two engineering an 8K and a 5K display looks like common sense. Especially since they probably spent that much supporting LG on the UltraFine display already. :)
 