A 6K-7K display at 120Hz would need over 60-70 Gbps of video signal bandwidth, maybe even more for wide-gamut stuff. That's out of reach for Thunderbolt 4.
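As a rough sanity check of that claim (assuming 10 bpc RGB, ignoring blanking overhead, which adds a few percent, and with the 7K resolution here being just a guess):

```python
# Uncompressed video bandwidth, active pixels only (blanking ignored).
def raw_gbps(h, v, hz, bits_per_pixel):
    return h * v * hz * bits_per_pixel / 1e9

print(raw_gbps(6016, 3384, 120, 30))  # 6K (Pro Display XDR res): ~73.3 Gbps
print(raw_gbps(7168, 4032, 120, 30))  # hypothetical ~7K panel:   ~104 Gbps
```

Either figure is well past the ~40 Gbps of a Thunderbolt 4 link before compression enters the picture.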
Does that number account for DSC?
No idea; I just took the typical quoted numbers I could find for various resolutions. DSC has quoted compression ratios of 3.75:1; is that enough to transport 7K over a TB4 channel?
> Apple runs DSC at 2.5x (12 bits/px compressed, from a 10 bits/channel source framebuffer). So DSC wouldn't quite be enough to get 6K @ 120Hz into a single DP 1.4 stream, but Thunderbolt 3/4 does have the bandwidth to carry two tiled streams, which should fit in just under 32 Gbps, leaving a respectable amount of bandwidth free for USB data.

What would prevent Apple from increasing the compression, though? DSC can go higher. If they used a ratio of ~2.8:1, that should be sufficient for 10-bit 7K@120Hz. Is the issue that it becomes visually lossy?
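As a check on the tiling numbers quoted above, taking as given the ~2.62 Gpx/s CVT-R2 pixel rate for 6K @ 120Hz that comes up later in the thread:

```python
# 12 bpp (2.5:1 DSC) at 6K @ 120Hz vs. one DP 1.4 stream and two tiled ones.
PIXEL_RATE = 2_620_304_640          # px/s incl. blanking (CVT-R2 timings)
HBR3_GBPS = 4 * 8.1 * 8 / 10        # one DP 1.4 stream: 25.92 Gbps payload

total_gbps = PIXEL_RATE * 12 / 1e9  # ~31.4 Gbps: the "just under 32"
print(total_gbps > HBR3_GBPS)       # True: too big for a single stream...
print(total_gbps / 2)               # ...but ~15.7 Gbps per tile fits easily
```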
When the XDR was released, it had to support machines that only supported HBR2/DP1.2 speeds, and used tiling to achieve it. Apple moved to Titan Ridge in the 2018 MacBook Pros, which enabled DP 1.4 support, but only the 15” MBP was able to output at HBR3 speeds. For some reason (maybe the iGPU?) the 13” MBP wasn’t. So a monitor released in 2019 that would only work with some machines released in the last year? Wasn’t gonna happen.
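For reference, the payload math behind those link speeds (4 lanes, 8b/10b encoding on DP 1.x; the 6K figure assumes 10 bpc RGB, active pixels only):

```python
# Effective DisplayPort payload vs. what 6K demands of it.
def dp_payload_gbps(lane_rate_gbps, lanes=4):
    return lane_rate_gbps * lanes * 8 / 10   # 8b/10b encoding overhead

print(dp_payload_gbps(5.4))         # HBR2 (DP 1.2): 17.28 Gbps
print(dp_payload_gbps(8.1))         # HBR3 (DP 1.4): 25.92 Gbps
print(6016 * 3384 * 60 * 30 / 1e9)  # 6K @ 60Hz, 10 bpc: ~36.6 Gbps
# More than even a single HBR3 stream carries, hence tiling (two streams
# over Thunderbolt) on older machines, or DSC on DP 1.4 hosts.
```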
You can see how DSC alters things with this calculator. It was the only one I could find that seems to handle the blanking periods properly.
> An M2-Pro Mini does not necessarily mean that the regular M2 will support more displays or RAM than the current M1. It just means that the M2-Pro will.

When I said M2 in that context I thought it would be understood that I was referring to the M2 generation generally, but I can see that didn't come through. I'll edit my post to make that clearer.
> When I said M2 in that context I thought it would be understood that I was referring to the M2 generation generally, but I can see that didn't come through. I'll edit my post to make that clearer.

I see what you meant.
> Apple are presumably confident that enough of those people will go for a 14" MBP instead.

As you saw from nothingtoseehere's post, in many cases it's not an individual decision, it's a decision by the business owner. And that person could be reluctant to pay 50% more to supply laptops to their business just to be able to drive two external monitors.
> ...two external 4ks, plus the internal retina display, potentially in fractional scaling mode in a fanless machine with potentially only 8GB of unified RAM from which to allocate video memory? I suspect that your mileage may vary.

That's not the best way to assess whether the M1 could support two 4k externals. It requires you to determine directly what the M1's potential capabilities are, which requires a level of technical knowledge of the machine that none of us have.
> A 5K framebuffer with wide gamut is still only 60MB. RAM is the last problem for multiple high-res displays.

Sure, but that's just the main 5k framebuffer. Other things need VRAM - otherwise we'd all be happy with 256MB. If you increase the display resolution - whether it's FHD to 4k or 4k to 5k for scaling - VRAM usage will increase across the board. Then there are bandwidth issues (M1 Pro/Max also increase the RAM bandwidth). Then the GPU has to do the downsampling (which may or may not involve VRAM - I have no idea - but it's definitely extra work for the GPU). There's a reason Apple display a performance warning when you choose a fractional scaled mode.
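To put rough numbers on the memory side (assuming 4 bytes per pixel for a 10:10:10:2 wide-gamut format; actual macOS surface formats may differ):

```python
# Back-of-envelope framebuffer sizes. Real usage is higher: buffers are
# double/triple-buffered, and every window surface needs memory too.
def mb(w, h, bytes_per_px=4):
    return w * h * bytes_per_px / 2**20

print(mb(5120, 2880))   # 5K output buffer: ~56 MB (the "only 60MB" figure)
print(mb(3840, 2160))   # 4K output buffer: ~32 MB
# A 4K display in a fractionally scaled mode ("looks like 2560x1440")
# renders to a 5120x2880 backing store first, then downsamples:
print(mb(5120, 2880))   # another ~56 MB per scaled 4K display
```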
> You can fit a bunch of 8K frames into the space taken by a single Safari tab.

...and unfortunately you can have a bunch more active Safari tabs if you have 3 displays...
> I expect Apple understands that, and their decision to limit the M1 to 2 displays was a technological one rather than a business one (perhaps an I/O limitation related to the origins of the chip).

That's a distinction without a difference - I don't doubt Apple could have incorporated a third DisplayPort output on the M1, but I equally don't doubt that would have added complexity, cost and power consumption - along with possibly lackluster performance on multiple 4ks. The point is that the base Mx chip's most significant market is the MacBook Air, and supporting a third display simply wasn't on the "must have" list.
> Instead, we should ask ourselves: Can we simplify/reframe this technically complicated question in a way that enables us to get a clear answer? And the answer is yes!

...and the simplest form of that question is "is there any evidence that not supporting 3 displays has harmed the sales of Macs and Minis" - and it looks like Apple have been selling them as fast as they make them.
> For instance, we know that a 2016 13" MBP with a Retina display, 8 GB RAM, and Intel integrated graphics only, could support 2 x 4k external displays, by Apple's own specifications (see https://support.apple.com/kb/SP747?locale=en_US).

In scaled mode, without lag, on a machine with no fan? A quick google of "macbook pro 4k ui lag" suggests that is not undisputed. It was certainly a regularly cited issue with Intel Mac Minis: https://forums.macrumors.com/thread...n-the-mac-mini-2018-4k-monitor-dream.2155175/
> In scaled mode, without lag, on a machine with no fan? A quick google of "macbook pro 4k ui lag" suggests that is not undisputed. It was certainly a regularly cited issue with Intel Mac Minis: https://forums.macrumors.com/thread...n-the-mac-mini-2018-4k-monitor-dream.2155175/

OK, I didn't know that dual 4k might be borderline on the older Intel integrated graphics, even though it's included as supported in Apple's specs (which you say come from Intel).
Thing is, it was Intel who determined that their iGPUs could physically support 3-4 displays - and they were probably looking at Windows which doesn't have the fractional scaling requirements of MacOS. Apple could have decided to restrict that in MacOS (but imagine the wailing and gnashing of teeth here... in reality they didn't even restrict the machines that could run three external displays, they just only advertised 2)
In the case of Apple Silicon, Apple gets to decide whether it is cost-effective to include the extra complexity needed for a third display on the base Mx chip - even if the resulting performance would be lacklustre.
Also, time has moved on and now we'd be talking about at least one of those displays being 6k, and office users needing to run more demanding things like video conferencing (heck, even WP and spreadsheet software now includes annoying animations...)
> It is pure guesswork, but I suspect that Apple's marketing decision is that the base "M" models only support 2 displays. So if you want to have a MacBook and two external displays, you must buy the MBP. Alas, I think that will stay, but I'd be happy to be wrong.

David Eden-Sangwell (who has the "iCaveDave" YouTube channel) had a clever idea for this: Apple could have given the user the option to disable the internal display in order to allow two large externals to be connected. This might have worked for your office, unless your typical setup was to use all three displays simultaneously (the laptop and the two externals).
In my office, the idea of switching to Apple died with this decision. The M1 MBA would have been perfect, with the crucial exception that it does not support the two external displays we all use. Comparable Windows machines support two external monitors.
"Pluginability" not the strong side of the Mac...
> I doubt that Apple would offer a solution of that kind; it would be very unlike them.

I agree. I just thought it was a clever idea.
> David Eden-Sangwell (who has the "iCaveDave" YouTube channel) had a clever idea for this: Apple could have given the user the option to disable the internal display in order to allow two large externals to be connected. This might have worked for your office, unless your typical setup was to use all three displays simultaneously (the laptop and the two externals).

That would have been fun.
> That would have been fun.

Yeah. So what laptop(s) did your office end up going with? Or are they still deciding? If the latter, maybe the M2 Air will be released at WWDC and support 3 displays...
As my office isn't exactly full of nerds: this nice, albeit theoretical, idea would have to work without issuing a terminal command or something similar before unplugging the MBA from the monitors and taking it to the conference room to use the internal display there. And vice versa, of course; it should be plug & play 😃
> Yeah. So what laptop(s) did your office end up going with? Or are they still deciding? If the latter, maybe the M2 Air will be released at WWDC and support 3 displays...

We ended up with 14" Lenovo ThinkBooks: AMD Ryzen 7 4700U, 16 GB RAM, 512 GB SSD. Ordered a while ago; delivery took some time, and the IT department is still preparing them. Therefore, no first-hand experience yet. But Lenovo devices have proved to be reliable for us, so I think they will be fine.
> Display I/O seems like it will be a big focus for M2, simply because it was relatively undercooked in the base M1 model, so it's ripe for some love from the engineers for v2. I would be pretty disappointed if we have anything less than an HDMI 2.1 port on the Pros and support for at least two external high-refresh 4K displays on the base M2 (although I would like triple external display support, there's not much point going for more if the Air is still limited to two USB-C ports and one display per port).

Agreed.
> On the ultra-high end of monitors, I wonder if Apple will go for something semi-proprietary to get over the limitations of TB4. If they want to push the specs of the XDR then this is going to be a problem for a number of years given how far away TB5 / USB5 are. Could be a good selling point for an AS Pro or Studio 2 - the new XDR works with the old models at 60Hz, but if new models support 120Hz it could drive sales for those that care about such things.

At least in theory, there's no need to go to a proprietary standard, since TB4 with DisplayPort Alt Mode 2.0 will do it.
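The numbers behind that (4 lanes, 128b/132b encoding on DP 2.0, ignoring the small extra protocol overhead; the 6K @ 120Hz requirement uses the CVT-R2 pixel rate discussed below):

```python
# DP 2.0 payload per link rate (what DP Alt Mode 2.0 would carry).
for name, lane_gbps in (("UHBR10", 10), ("UHBR13.5", 13.5), ("UHBR20", 20)):
    print(name, 4 * lane_gbps * 128 / 132)   # ~38.8 / ~52.4 / ~77.6 Gbps

print(2_620_304_640 * 30 / 1e9)              # 10-bit 6K @ 120Hz: ~78.6 Gbps
# Uncompressed is just out of reach even at UHBR20 once blanking is
# included, but the mildest DSC setting closes that gap many times over.
```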
> Nothing really prevents it. I'm more pointing out that the XDR, with the 12bpp data rate they use at 6K with DSC, will fit in Thunderbolt @ 120Hz, but not DP1.4.

Doesn't the M1's TB4 use DP 2.0 rather than 1.4?
I’ll just add that the math you provided seems a bit off… Apple is using 10bpc, or 30bpp uncompressed. So when DSC is brought in, they compress it down to 12bpp (2.5x). The next rung down is 10bpp (3x), but at 10bpp, 6K @ 120Hz is still *just* outside the data rates available to DP1.4, due to the blanking timings: 26.2 Gbps required at 10bpp vs 25.92 Gbps available, assuming CVT-R2 timings, which have the least overhead.
To fit into DP1.4 fully without tiling, they have to drop to 8bpp compressed. While the claim for DSC is that it is “visually lossless”, it is still a lossy algorithm, so the question is more just how far you can push it before artifacting becomes noticeable. And that’s not something there seems to be much data on, in part because DSC isn’t implemented in many displays, so there’s not a ton of “outside the lab” data for the consumer to dig through. But I’d imagine that as Apple is trying to push these as premium displays, they are likely to try to keep compression to a minimum.
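A quick script makes that ladder concrete, using the ~2.62 Gpx/s CVT-R2 pixel rate for 6K @ 120Hz (the same figure that appears in the gist quoted below):

```python
# Bandwidth needed at each bpp rung vs. a single DP 1.4 (HBR3) stream.
PIXEL_RATE = 2_620_304_640      # px/s for 6K @ 120Hz incl. CVT-R2 blanking
HBR3_GBPS = 25.92               # DP 1.4 payload, 4 lanes

for bpp in (30, 12, 10, 8):     # uncompressed, 2.5:1, 3:1, 3.75:1 DSC
    need = PIXEL_RATE * bpp / 1e9
    print(f"{bpp:2d} bpp: {need:6.2f} Gbps -> "
          f"{'fits' if need <= HBR3_GBPS else 'does not fit'}")
# Only 8 bpp squeezes under 25.92 Gbps; 10 bpp misses by ~0.28 Gbps,
# exactly the "*just* outside" case described above.
```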
For the math, I was just using this, which I cited in my OP. But it's not as detailed as your calculation:
https://gist.github.com/siracusa/bb006d14e9906ac215fdee7685dc4b4c
And yeah, while what constitutes digitally lossless is clearcut, "visually lossless" is fuzzy, since it depends on what differences people can and can't see, which varies based on the person, the viewing conditions, and the content.
> So the *minimum* bandwidth when DSC is enabled for an RGB stream with no chroma subsampling is simply [pixels per second] * 8 = 2,620,304,640 * 8 = 20.96 Gbit/s. That would work out to a compression ratio of 3.75:1, which is on the high side for being "visually lossless", but is still within spec.
> Nobody to date has shipped a DP2.0 device. They are very much on the near horizon though. The AMD 6000-series GPUs are supposed to be DP2.0 capable, and I would be surprised if Nvidia doesn't support it in the 4000-series GPUs. Intel's Alchemist GPUs look like they should as well. No monitors yet, and there likely won't be until there are enough GPUs in the wild for it to make sense.

Ah, then DP Alt Mode 2.0, while it's supported under USB4, is not yet available anywhere, since it obviously requires DP 2.0.
In the case of Thunderbolt, there's nothing really coupling a TB revision to a DP revision, and TB4 adds nothing new to the TB3 spec. It is a "minimum required features" spec bump.
Yeah, Siracusa's math is on the mark; there's just a key term doing a lot of heavy lifting (emphasis mine), as they are calculating using an 8bpp rate. That is as compressed as you can go without chroma subsampling, as Siracusa also points out in the passage quoted above.
Then is DP Alt Mode 2.0, which is supported under USB4, currently available even though DP2.0 is not?
Also, I'm afraid I didn't follow what you meant previously when you wrote: "I’m more pointing out that the XDR, with the 12bpp data rate they use at 6K with DSC, will fit in Thunderbolt @ 120Hz, but not DP1.4." [emphasis mine] How is it relevant that DP1.4's bandwidth (32.4 Gb/s) is less than TB4's (33.88 Gb/s), given the Macs can do TB4?