it is 5K 27" eGPU TB3 based,

There is one thing sure: the next TB3RD, whatever its size, whatever its DPI, whatever its interface, will have no eGPU.

Reason: an eGPU matters only to Macs without a dGPU. For Macs with a dGPU, an eGPU implies disabling the internal (more powerful) dGPU, for the same architectural reasons that prevent rendering with the iGPU and dGPU at the same time: the current Mac architecture (inherited from PCs) doesn't allow mixing renderer GPUs. The only exception is when both GPUs are the same model and are linked through SLI or similar, and Thunderbolt 3 lacks SLI or anything similar. Accordingly, an eGPU matters only to Macs without a dGPU, such as the mini and the baseline MacBook Pros. So if such a display arrives, it will most likely be a solution targeted at minis and baseline iMacs and MacBooks; not even the 5K iMac would benefit from an eGPU that disables the internal one (x16 lanes internally vs. x4 lanes on the external).

PS: the same issue applies to Windows-based PC and eGPU solutions.

So I believe the sources speculating about an eGPU display will be proven wrong (or at least it won't be a product targeted at the iMac 5K or Mac Pro, or even a 16" rMBP with an iGPU on board).
 
There is one thing sure: the next TB3RD, whatever its size, whatever its DPI, whatever its interface, will have no eGPU. […]


The eGPU could be something you could enable or disable in the monitor's settings. I don't know, though, why you would want to pay extra for something you won't use.
 
The eGPU could be something you could enable or disable on the monitor's settings. I don't know though why you would want to pay extra for something you won't use.
Some Apple fans love to pay extra for no added benefit. ;)

The 27" Apple monitor (recently dropped) was a prime example. Five year old screen, low resolution, outrageous price.
 
There is one thing sure: the next TB3RD, whatever its size, whatever its DPI, whatever its interface, will have no eGPU.

Your argument presupposes the dGPU in the laptop would be the one to switch off when connected to the eGPU display. IMHO, that will never happen.

The specific description is that the monitor senses what machine config it's plugged into, and the eGPU switches off when the GPU in the machine is more powerful than the one in the monitor.

*speculation from here*

The eGPU will be less powerful than any dGPU in a mac which is compatible with the display (TB3 Macs only) - the eGPU is a weak GPU, for the purpose of allowing iGPU macs to elegantly drive 5k, not for improving the graphics performance of a dGPU mac. I wouldn't be surprised if the benchmark for performance was for it to be no better than the iGPU on a laptop's internal screen, the difference being just the panel size.

So, for a dGPU mac, all the eGPU will do, in effect, is provide the thunderbolt chain attachment, and the connection to the 2-part display panel, same as the current 5k iMac.

Yes, that will seem wasteful to people who care, and there will be wailing and gnashing of teeth over being forced to pay for a GPU you won't be using, and as usual, Apple won't give a toss about the people making those complaints.

Apple's scale from the 27" R5K iMac means they can bring this panel to market *far* cheaper than you would expect if adding the cost of a Dell 5k screen plus GPU, for example, and effectively eat/hide the cost of the eGPU within a price that looks "Apple normal". I'm expecting it'll cost the same as the previous TB display, and the presence of the eGPU will be downplayed as a technological feature - the perspective being that a monitor which is capable of driving itself, regardless of how weak the computer is, is the new normal and shouldn't be in any way remarkable.
 
The current Mac architecture (inherited from PCs) doesn't allow mixing renderer GPUs; the only exception is when both are the same model and are linked through SLI or similar, and Thunderbolt 3 lacks SLI or similar. […]

Uhhhh, not true at all. Apple has done cross-vendor multi-GPU demos before at WWDC, and DirectX 12 has multi-GPU, multi-vendor support built in, which can be used to render with an internal and external GPU at the same time, even if the GPU types or vendors don't match. DirectX 12 and OS X will both do multi-GPU rendering without an SLI or Crossfire bridge as well, if that wasn't already implied by cross-vendor multi-GPU.

Because bridges are no longer required, DirectX 12 would even have no problem rendering using two external GPUs at once. Thunderbolt doesn't need to know anything about SLI or Crossfire. It's all software now.
 
DirectX 12 and OS X will both do multi GPU render without an SLI or Crossfire bridge as well
What? I have 2 D700 and only 1 renders output.
I don't know about Windows on DX12 (game rendering), but the system GUI can only render on a single GPU at a time, as yet.
So, for a dGPU mac, all the eGPU will do, in effect, is provide the thunderbolt chain attachment, and the connection to the 2-part display panel, same as the current 5k iMac.
The problem with this theoretical 5K display is that it is useful/appealing only for low-end Macs, not the Mac Pro or the 5K iMac, which are precisely its main market. You don't need to be a marketing expert to know what to do...

The info I have on the 4K/21" TB3RD (which I confess I had to triple-check before giving it any credit; I was one of the supporters of the MST TB3 5K RDP solution theory) is that it is targeted both at low-end Macs with Intel Iris Pro and as a second, auxiliary display for 5K iMacs and Mac Pros, a much wider market. Apple will sell an Asus 5K DP1.3 display only for the Mac Pro (driven either through native HDMI or a USB-C-to-DP1.3 adapter, both at 60 Hz).

The 4K21" TB3RDP will allow daysichain displays/tb3 peripherals, no 5K solution (either with or w/o eGPU) allow enough bandwidth to daysichain another display or Tb3 peripheral (maybe a single TB1 on a MST/TB3 5K arrangement).
 
Ah yes. That will work well for realtime music composition for example. Thanks for pointing out the obvious to me.

uh, a personal computer will work fine by itself for realtime music composition.. that's not too intensive of a task.

i'm talking about processes that could make use of a cluster / render farm.. for that type of work, you'll get much higher performance at a much cheaper price point via cloud than trying to keep all the processing on local hardware.
 
uh, a personal computer will work fine by itself for realtime music composition.. that's not too intensive of a task.

i'm talking about processes that could make use of a cluster / render farm.. for that type of work, you'll get much higher performance at a much cheaper price point via cloud than trying to keep all the processing on local hardware.

No - the fact is that a 6-core Xeon is not enough for what I do. Thus I upgraded to a 12-core (dual-CPU) cMP. And Apple removed the 2nd CPU option on the nMP, which means their new machines are actually less powerful than the old ones. I am aware that there is a 12-core single-CPU BTO available, but that is actually slower (CPU-wise at least) than the cMP and way too expensive for what it is, so not many people in their right mind will shell out the cash for that.

So that is an example of work that benefits from multiple CPUs and won't work with a cloud service. I am sure there are many more use cases that various professionals here on the forum could come up with that would fit this scenario as well.
 
No - the fact is that a 6-core Xeon is not enough for what I do. Thus I upgraded to a 12-core cMP. And Apple removed the 2nd CPU option on the nMP which means their new machines are actually less powerful than the old ones.

So that is an example of work that will benefit from multiple CPU's and not work with a cloud service. I am sure there are many more usages that various professionals here on the forum could come up with that would fit this scenario as well.
your argument seems to be within the realm of personal computers.. 4-core, 6-core, 8, 12.. (and within the realm of computers available from Apple alone)..

6-core vs 12-core is minuscule.. a 12-core nMP vs a 12-core cMP is even more minuscule.
i'm talking 6 cores vs 60,000 cores.. and i (and plenty of others) have access to 60,000-core computers for (a lot) cheaper than personally buying 12 cores over 6 cores.
 
The problem with this theoretical 5K display is that it is useful/appealing only for low-end Macs, not the Mac Pro or the 5K iMac, which are precisely its main market. You don't need to be a marketing expert to know what to do...

Low end Macs are the majority of the machines Apple sells, that's the primary market for the displays, it was the primary market for the previous TB display as well - which wasn't by any means a "pro" display.

For a "Pro" mac, it'll be a 27" 5K wide-gamut display with webcam, speakers, ambient light sensor, that'll cost less than a Dell 5k display. Apple supporting DP1.3 at all in the near future / next generation of hardware is not a given.
 
Low end Macs are the majority of the machines Apple sells

how many $800 Mac minis or $1200 MacBooks are plugged into a $1200 Thunderbolt Display?

A Thunderbolt Display is a high-end market device (despite its lack of wide color gamut or other "pro" features). A 5K TB3RD would cost no less than $1800 without an eGPU, while a 4K 21" TB3RD would cost about $600.
 
how many $800 Mac minis or $1200 MacBooks are plugged into a $1200 Thunderbolt Display?

More, I would hazard a guess, than are plugged into Mac Pros. It's a premium docking station for a laptop, which happens to contain a display, not a monitor for budget-conscious consumers, or high-spec "Pros". A TB display, plus a Macbook Air cost less than a dGPU-based Macbook Pro on its own.

That's the hardware ecosystem the 5K eGPU display is updating - that's the product - a compact power-efficient iGPU laptop, which leaves the bulk, and power draw, of the hardware to drive a big display, in the big display. Which, just so happens, can also be used with the eGPU switched off for other machines.

Macbook Pros didn't radically rise in price, or become smaller when they went retina, neither did iPads or iPhones, and neither will the successor to the Thunderbolt display. There's no way in the wide world Apple is going to make a 4k display, something comparable to the myriad of cheaper 4K display options, and they're certainly not going to make a display that's smaller than the one they discontinued. I'm betting the new TB display's marketing tagline is simply that it's the only single cable 5k display for Mac, and DP1.3 displays will be left unsupported until the next Mac Pro ~mid 2017.

Anyway, we'll see what happens.
 
That's the hardware ecosystem the 5K eGPU display is updating - that's the product - a compact power-efficient iGPU laptop, which leaves the bulk, and power draw, of the hardware to drive a big display, in the big display. Which, just so happens, can also be used with the eGPU switched off for other machines.
You think as a passionate user, not as a market analyst. A 5K eGPU display is something of interest only to a few MacBook users; Apple wants to sell you a 5K iMac, not a 5K auxiliary display that turns your underpowered MacBook into an iMac 5K wannabe desktop. It won't happen (and if it does, I'm convinced it won't be the best experience, in terms of system stability etc.). I know Apple owns a patent on this, but that doesn't mean they want to build the actual device. Keep dreaming.
You may think a 4K 21" display isn't an option, but this is actually the ideal dock for MacBooks and Mac minis. Apple doesn't want to let you grow beyond a 21" Retina display with a cheap system; a 21" TB3 Retina display is good enough and technically safe. The few pros or Mac users with pockets deep enough to afford 5K will be able to use a DP1.3 5K display.
I'm betting the new TB display's marketing tagline is simply that it's the only single cable 5k display for Mac, and DP1.3 displays will be left unsupported until the next Mac Pro ~mid 2017.
FYI, DP1.3 comes to the new MacBook Pros as well as the new Mac Pro/iMac, and maybe even the new mini, in either of two ways: the non-TB3 USB-C ports (only 2 USB-C ports will be Thunderbolt 3 ports on the new MacBooks, and up to 4 on the Mac Pro);
And it's even possible Intel will provide a minor revision of its Alpine Ridge TB3 controller to allow DP1.3 in USB-C DP Alt Mode (not the same as TB3 DP mode, which reaches only DP1.2 and requires a Thunderbolt interface at the peripheral end, while allowing 2 DP1.2 channels for MST up to 5K).
 
Mago has a point with the 21".
But I sure hope they'll also release a 27", which to me seems the obvious thing to do: the successor to the TBD. And something below the 5K iMac seems dull.
Still, the comment referring to 3rd parties when announcing the TBD discontinuation makes me feel it's over.
I'm also not buying the eGPU rumor, but at the moment there's no clean solution really.
Would it be a low-end Polaris?
And now with Vega only coming (again) in 2H17, it seems the nMP will be early or mid 2017. AMD again shot itself in the foot, going around with the October launch - I believe those were really their own claims.
 
I think the ONLY display that Apple is 100% guaranteed to show at some point in the upcoming 12-18 months is a 27" 5K display. Every other form factor was/is quite simply our speculation.

Vega for the enthusiast market (Titan X class) will come in 1H 2017. There is a second Vega for the high-end market. The GTX 1070 is a midrange GPU with a high-end price, the GTX 1080 is a high-end GPU with an enthusiast-market price. Titan X is enthusiast level from the ground up. Everything beneath the GTX 1070 is the mainstream market.

I am adamant that we will see one HBM2 chip this year from AMD.
 
What? I have 2 D700 and only 1 renders output.

On Mac, multi-GPU requires more work on the part of the developer; it's less automatic than it is in DX12. Most pro apps do better with OpenCL on one card and OpenGL on the other. Games are what would really want to do multi-GPU rendering, and there aren't enough Macs, and definitely not enough Mac Pros, playing games for anyone to bother doing multi-GPU games for Mac.

In the future, it would be nice if Metal got DX12's multi GPU features that made the process more automatic, but no sign yet. If external GPUs or multiple GPUs become more common they might add that.

I don't know about windows on DX12 (gaming rendering) but system GUI can only render on a single GPU at time, as yet.

http://www.techspot.com/article/1137-directx-12-multi-gpu-geforce-radeon/

(Note: in the above there are no bridges, and it's not using Crossfire or SLI.)

For system GUI it doesn't usually make sense to go multi GPU. But both Mac OS X and Windows support mixed multiple GPU. DX12 just makes it a lot more automatic. If Apple wanted to make it more automatic, there is no reason they couldn't build on top of their more manual technology as well.

You can do the multi tile render split across GPUs, just like DX12 does, on the Mac. But you're stuck doing it by hand instead of having the API do it for you, and no one wants to bother right now. But on OS X you could take a single frame and have different GPUs render different portions of it. GPUs on OS X can render to a display connected to a different GPU. It's called an offline render mode.
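As a toy illustration of that multi-tile split (pure Python workers standing in for GPUs; a real implementation would submit per-tile command buffers to each GPU, which this sketch does not attempt):

```python
# Carve one frame into horizontal tiles, "render" each tile on a separate
# worker, then composite -- the same split-frame idea described above.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 8, 6

def render_tile(y0, y1):
    # Stand-in for one GPU drawing rows y0..y1 of the frame.
    return [[(x + y) % 256 for x in range(WIDTH)] for y in range(y0, y1)]

def render_split(n_gpus):
    rows_per = HEIGHT // n_gpus
    bounds = [(i * rows_per, (i + 1) * rows_per if i < n_gpus - 1 else HEIGHT)
              for i in range(n_gpus)]
    with ThreadPoolExecutor(max_workers=n_gpus) as pool:
        tiles = list(pool.map(lambda b: render_tile(*b), bounds))
    # Composite: stack the tiles back into one frame.
    return [row for tile in tiles for row in tile]

# The composited multi-"GPU" frame matches a single-worker render.
assert render_split(2) == render_tile(0, HEIGHT)
```

The point of the sketch is only the structure: the split and composite are done explicitly by the application, which is what "doing it by hand" means versus DX12 doing it for you.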
 
But AMD's roadmap now claims Vega in 1H17 only, so what do you expect this year still?
Maybe it's 490 but it's still Vega. Unless they're referring to big Vega next year and there will be a smaller Vega now and that doesn't count :)
 
But AMD's roadmap now claims Vega in 1H17 only, so what do you expect this year still?
Maybe it's 490 but it's still Vega. Unless they're referring to big Vega next year and there will be a smaller Vega now and that doesn't count :)
I already have stated this. RX 490 - Q4 2016. RX Fury Vega based GPU - Q1 2017.
 
It's there, but the cost per CPU minute can be shocking - even if you don't get slammed by the cost of data transfers.

Exactly. The render costs add up very quickly, and if you are transferring gigabytes of data per day and terabytes per week it gets out of hand really fast. That's aside from potential confidentiality issues.

Rendering in the cloud is not a cure-all. It cuts the up-front cost of acquiring a farm and maintaining it (AC, IT, etc.), but it's not a silver bullet. And it still doesn't address the issue of needing a very powerful workstation at your desk to set up the scene in Maya, etc.
your argument seems to be within the realm of personal computers.. 4 core, 6 core, 8, 12.. (and within the realm of computers available by apple alone.)..

6 core vs 12 core is minuscule.. 12core nmp vs 12core cmp is even more miniscule.
i'm talking 6 core vs 60,000 core.. and that i (and plenty of others) have access to 60000 core computers for (a lot) cheaper than me personally buying 12cores over 6 cores.


Those 60,000 cores are useless to you when you are trying to actually set up the scene in Maya or put together a composite in Nuke or Flame that EVENTUALLY will be rendered on a farm, aka the cloud. You still need a powerful desk-side workstation to actually set up the work that the farm will eventually process.
 
It's there, but the cost per CPU minute can be shocking - even if you don't get slammed by the cost of data transfers.

Which is the same thing as saying it isn't there yet. You didn't even account for the biggest problem, which is TIME. I have to turn around a lot of last minute, huge batch video encoding twice a year, and then I have to be able to make same-day edits, re-encodes, and re-uploads.

When you're dealing with hundreds of 20-30 minute master video files, the upload times are a deal breaker. It doesn't matter if the videos take 10 seconds to encode, once in the cloud. It is still much faster to encode the drastically smaller files locally and upload them than it is to upload the masters.
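The arithmetic behind that trade-off is easy to sketch; all file sizes, encode times, and link speeds below are my own illustrative assumptions, not figures from the post:

```python
# For big batches, uploading masters to the cloud can dwarf the time
# spent encoding locally and uploading only the small deliverables.

def hours(gb, mbps):
    """Transfer time in hours for `gb` gigabytes at `mbps` megabits/s."""
    return gb * 8000 / mbps / 3600

n_files     = 200
master_gb   = 20    # assumed ~20 GB per 25-minute master
encoded_gb  = 1     # assumed ~1 GB per encoded deliverable
uplink_mbps = 100   # assumed office uplink
encode_min  = 15    # assumed local encode time per file (minutes)

# Cloud path: upload every master (encode time in the cloud ~negligible).
cloud_hours = hours(n_files * master_gb, uplink_mbps)

# Local path: encode everything locally, upload only the small outputs.
local_hours = n_files * encode_min / 60 + hours(n_files * encoded_gb, uplink_mbps)

print(f"cloud path (upload masters):  ~{cloud_hours:.0f} h")
print(f"local path (encode + upload): ~{local_hours:.0f} h")
```

Under these assumptions the upload of the masters alone takes longer than the entire local workflow, which is the "TIME" point above in numbers.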
 
Mago has a point with the 21".
But I sure hope they'll also release a 27", which to me seems obvious they should, the successor to the TBD. And something below the 5K iMac seems dull.
Still, the comment referring to 3rd parties when announcing the TBD discontinuation makes me feel it's over.
I'm also not buying the eGPU rumor but at the moment there's no clean solution really.

Apple has a big problem with the 5K TB3RD; if this display receives a go, these are the options and drawbacks:

A) DP1.3/USB-C 5K Display:

Good: the technology is ready and in production, at least by ASUS; no big deal to adopt; webcam/speakers through USB possible; universally compatible.

Bad: no Thunderbolt functionality; limited USB 3 bandwidth; 3rd-party competition; no daisy-chaining of displays or peripherals.

B) Thunderbolt 3/MST (dual DP1.2):

Good: single Thunderbolt cable; full webcam/audio/LAN/USB, even some bandwidth for TB1-class peripheral daisy-chaining; compatible with every TB3-enabled Mac, even those with only a DP1.2-capable iGPU.

Bad: MST dual DP1.2 is not the most efficient or least troublesome way to drive video; very limited bandwidth to daisy-chain devices; no chance to daisy-chain another 5K display; more expensive than the DP1.3 solution.

C) Thunderbolt 3 / eGPU:

Good: single cable; host GPU offloading; integrated display peripherals (webcam etc.) somewhat feasible.

Bad: renderer restrictions (no co-rendering with iGPU or dGPU help); more expensive than the TB3/MST solution; not viable on Mac Pros/iMacs/16" rMBPs, since their dGPUs outclass the eGPU, mostly due to the PCIe x8/x16/x32 interface; peripherals have to share bandwidth with GPU data (choppy webcam/audio); no spare bandwidth to daisy-chain peripherals.

Note: a 4K TB3RD has none of these restrictions: it can daisy-chain devices, even a 2nd display, without bus performance degradation; it renders in SST mode with no GPU performance/compatibility issues; 4K 21" panels are far cheaper than 5K; and it has a wider market (MacBooks with or without dGPU, minis, MBPs in multiple-display setups, iMacs as an auxiliary display).

If Apple decides to build a 5K TB3RD, it will most likely follow the MST solution and accept no daisy-chain possibilities.

AND YES, the TBD is discontinued forever; the TB3RD (actually an all-new product) will replace it.
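For the "x16 internal vs. x4 external" point in option C, nominal PCIe 3.0 numbers already tell the story (a real Thunderbolt 3 tunnel caps PCIe payload lower still, so the gap in practice is wider):

```python
# Nominal PCIe 3.0 throughput: 8 GT/s per lane with 128b/130b encoding.
GBPS_PER_LANE = 8 * (128 / 130)

def pcie_gbytes_per_s(lanes):
    """Nominal one-direction PCIe 3.0 bandwidth in GB/s for `lanes` lanes."""
    return lanes * GBPS_PER_LANE / 8   # bits -> bytes

internal = pcie_gbytes_per_s(16)   # typical internal dGPU slot
external = pcie_gbytes_per_s(4)    # PCIe tunnelled over Thunderbolt 3

print(f"x16 internal: ~{internal:.1f} GB/s")
print(f"x4  external: ~{external:.1f} GB/s ({internal / external:.0f}x less)")
```

That 4x nominal gap (before any Thunderbolt overhead) is why an eGPU only makes sense when the host has no dGPU to begin with.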
 
Apple has a big problem with the 5K TB3RD; if this display receives a go, these are the options and drawbacks:

According to someone who claims to have used / worked on the eGPU display (the same source I suspect for the MR report on it, since all the details match up), it is a finished product, and was "released to manufacturing" months ago when the Skylake Macbook Pros were originally due to ship, until component delays held them up (and iirc caused them to elect to skip to a whole new dGPU generation, which added to the delay) - right about the time they canned the Thunderbolt display.

Speculating about what different combos of connectivity could do is lovely, but the source of this specifically stated there was no 4k display coming from Apple, it's the 5k eGPU display, and only the 5k eGPU display.
 
Exactly. The renders costs add up very quickly and if you are transferring gigabytes of data per day and terabytes per week it gets out of hand really fast. That's aside from potential confidentiality issues.

Rendering in the cloud is not a cure all. It cuts the up front cost of acquiring a farm and maintaining it (AC, IT etc), but it's not a silver bullet. And it still not address the issue of needing a very powerful workstation at your desk to set up the scene in Maya etc.

sounds like you have lots of negative things to say about cloud rendering yet you've never actually used it.. is that right?

Those 60,000 cores are useless to you when you are trying to actually setup the scene in Maya or put together a composite in Nuke or Flame that EVENTUALLY will be rendered on a farm aka the cloud. You still need a powerful desk side workstation at your desk to actually set up the work that the farm will eventually process.
well, the 60,000 cores are mostly useless to me.. none of the processes i use can scale to 120,000 threads.. i'm using somewhere between 64 and 128 cores of those supercomputers for a minute or two at a time... not all of it.. the cloud computers are built with super high core counts to allow for a lot of users at any given time.

----
re: local workstation.. you need a fast single core for virtually all processes leading up to a rendering.
if you're using a 12 core computer in this type of workflow, it goes like so:
do most of the work (well, ALL of the human work) on a single core at a slower clock speed than would be available on a quad.. render on the 12cores at a much slower rate than available via render farm..

single core clock speed is basically the #1 spec determining how well a 3D modeling application will run.. definitely not amount of cores.
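That "single-core speed first" intuition can be sketched with Amdahl's law; the parallel fractions below are illustrative assumptions, not measurements:

```python
# Amdahl's law: if most of the interactive modeling work is serial,
# adding cores barely helps it, while a highly parallel final render
# scales with the farm.

def speedup(parallel_fraction, cores):
    """Amdahl's law speedup for a given parallel fraction and core count."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

interactive  = 0.10   # assumed: viewport/modeling work ~10% parallelizable
final_render = 0.99   # assumed: offline render ~99% parallelizable

for cores in (4, 12, 64, 60000):
    print(f"{cores:>6} cores: modeling x{speedup(interactive, cores):.2f}, "
          f"render x{speedup(final_render, cores):.1f}")
```

Even at 60,000 cores the mostly-serial modeling work speeds up by barely 10%, while the render approaches its ~100x ceiling, which is why clock speed dominates the interactive side and the farm dominates the render side.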
 