
leman

macrumors Core
Oct 14, 2008
19,521
19,677
Does that number account for DSC?

No idea, I just took the typical quoted numbers I could find for various resolutions. DSC has quoted compression rates of 3.75:1. Is that enough to transport 7k via a TB4 channel?
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
No idea, I just took the typical quoted numbers I could find for various resolutions. DSC has quoted compression rates of 3.75:1. Is that enough to transport 7k via a TB4 channel?

Apple runs DSC at 2.5x (12bits/px compressed using a 10bits/channel source framebuffer). So DSC wouldn’t quite be enough to get 6K @ 120Hz into a single DP 1.4 stream, but Thunderbolt 3/4 does have the bandwidth to carry two tiled streams, which should fit in just under 32Gbps, leaving a respectable amount of bandwidth free for USB data.

When the XDR was released, it had to support machines that only supported HBR2/DP1.2 speeds, and used tiling to achieve it. Apple moved to Titan Ridge in the 2018 MacBook Pros, which enabled DP 1.4 support, but only the 15” MBP was able to output at HBR3 speeds. For some reason (maybe the iGPU?) the 13” MBP wasn’t. So a monitor released in 2019 that would only work with some machines released in the last year? Wasn’t gonna happen.

You can see how DSC alters things with this calculator. It was the only one I could find that seems to handle the blanking periods properly.
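If you want to sanity-check those numbers yourself, here's a quick Python sketch. The CVT-R2 total of 6096×3582 and the ~32Gbps budget for display traffic over Thunderbolt are my working assumptions, not published spec-sheet values:

```python
# DSC compression ratios relative to a 30bpp (10 bits/channel RGB) source
for bpp in (12, 10, 8):
    print(f"{bpp}bpp compressed -> {30 / bpp:.2f}:1")

# 6K @ 120Hz including blanking (assuming CVT-R2 timings: 6096 x 3582 total)
pixels_per_second = 6096 * 3582 * 120   # ~2.62 Gpx/s
dp14_payload = 25.92                    # Gbps: 4 lanes of HBR3 after 8b/10b coding
tb_display_budget = 32.0                # Gbps: rough budget for tunneled DP (assumed)

for bpp in (12, 10, 8):
    gbps = pixels_per_second * bpp / 1e9
    in_dp = "fits" if gbps <= dp14_payload else "doesn't fit"
    in_tb = "fits" if gbps <= tb_display_budget else "doesn't fit"
    print(f"{bpp}bpp: {gbps:.2f}Gbps -> single DP1.4 stream {in_dp}, TB {in_tb}")
```

At 12bpp that works out to ~31.4Gbps, which is the "just under 32Gbps" figure above, carried as two tiled streams.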

 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Apple runs DSC at 2.5x (12bits/px compressed using a 10bits/channel source framebuffer). So DSC wouldn’t quite be enough to get 6K @ 120Hz into a single DP 1.4 stream, but Thunderbolt 3/4 does have the bandwidth to carry two tiled streams, which should fit in just under 32Gbps, leaving a respectable amount of bandwidth free for USB data.

When the XDR was released, it had to support machines that only supported HBR2/DP1.2 speeds, and used tiling to achieve it. Apple moved to Titan Ridge in the 2018 MacBook Pros, which enabled DP 1.4 support, but only the 15” MBP was able to output at HBR3 speeds. For some reason (maybe the iGPU?) the 13” MBP wasn’t. So a monitor released in 2019 that would only work with some machines released in the last year? Wasn’t gonna happen.

You can see how DSC alters things with this calculator. It was the only one I could find that seems to handle the blanking periods properly.

What would prevent Apple from increasing the compression, though? DSC can go higher. If they used a value of ~2.8:1, that should be sufficient for 10-bit 7k@120Hz. Is the issue that it becomes visually lossy?
 
Last edited:

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
An M2-Pro Mini does not necessarily mean that the regular M2 will support more displays or RAM than the current M1. It just means that the M2-Pro will.
When I said M2 in that context I thought it would be understood that I was referring to the M2 generation generally, but I can see that didn't come through. I'll edit my post to make that clearer.
 
Last edited:

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Apple are presumably confident that enough of those people will go for a 14" MBP instead.
As you saw from nothingtoseehere's post, in many cases it's not an individual decision; it's a decision by the business owner. And that person could be reluctant to pay 50% more to supply laptops to their business just to be able to drive two external monitors.

I expect Apple understands that, and their decision to limit the M1 to 2 displays was a technological one rather than a business one (perhaps an I/O limitation related to the origins of the chip). I.e., I think if the M1 could have easily been designed to support 3 displays, they would have released it in that form.

...two external 4ks, plus the internal retina display, potentially in fractional scaling mode in a fanless machine with potentially only 8GB of unified RAM from which to allocate video memory? I suspect that your mileage may vary.
That's not the best way to assess whether the M1 could support two 4k externals. It requires you to determine directly what the M1's potential capabilities are, which requires a level of technical knowledge of the machine that none of us have.

Instead, we should ask ourselves: Can we simplify/reframe this technically complicated question in a way that enables us to get a clear answer? And the answer is yes!

How do we do this? Well, we look at what Apple machines have been able to do in the past, and ask if we can reasonably compare their capabilities to the M1's. For instance, we know that a 2016 13" MBP with a Retina display, 8 GB RAM, and Intel integrated graphics only, could support 2 x 4k external displays, by Apple's own specifications (see https://support.apple.com/kb/SP747?locale=en_US).

Given this, in order to conclude that 2 x 4k can be supported by the M1, we merely have to assume that its GPU is at least as powerful as the integrated graphics in the 2016 13" MBP. And that's a very easy assumption to make! The only thing we've not accounted for here is whether baseline RAM requirements were higher in 2020 (when the M1 was introduced) than they were in 2016, but that's a 2nd-order effect.

You can see this approach is much better, because it avoids the pitfalls that would accompany trying to get into the weeds of the specific technical capabilities of the M1.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,521
19,677
...two external 4ks, plus the internal retina display, potentially in fractional scaling mode in a fanless machine with potentially only 8GB of unified RAM from which to allocate video memory? I suspect that your mileage may vary.

A 5K framebuffer with wide gamut is still only 60MB. RAM is the last problem for multiple high-res displays. You can fit a bunch of 8K frames into the space taken by a single Safari tab :)
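The arithmetic, assuming a 10-bit wide-gamut format packed into 4 bytes per pixel (my assumption about the framebuffer layout):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4):
    """One framebuffer's size in MB, assuming packed 4-byte pixels."""
    return width * height * bytes_per_pixel / 1e6

print(framebuffer_mb(5120, 2880))  # 5K: ~59 MB
print(framebuffer_mb(7680, 4320))  # 8K: ~133 MB, so a "bunch" is still cheap
```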
 
  • Like
Reactions: Tagbert

theluggage

macrumors G3
Jul 29, 2011
8,013
8,445
A 5K framebuffer with wide gamut is still only 60MB. RAM is the last problem for multiple high-res displays.
Sure, but that's just the main 5k framebuffer. Other things need VRAM - otherwise we'd all be happy with 256MB. If you increase the display resolution - whether it's FHD to 4k or 4k to 5k for scaling - VRAM usage will increase across the board. Then there's bandwidth issues (M1 Pro/Max also increase the RAM bandwidth). Then the GPU has to do the downsampling (which may or may not involve VRAM - I have no idea - but is definitely extra work for the GPU). There's a reason Apple display a performance warning when you choose a fractionally scaled mode.
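To put rough numbers on the scaling point (this reflects my understanding that MacOS renders fractional modes at 2x the logical resolution and then downsamples; the 4 bytes/pixel is an assumption):

```python
def scaled_mode_buffers_mb(panel_w, panel_h, logical_w, logical_h, bpp=4):
    """Backing store rendered at 2x the logical size, then downsampled to the panel."""
    render_mb = (logical_w * 2) * (logical_h * 2) * bpp / 1e6
    panel_mb = panel_w * panel_h * bpp / 1e6
    return render_mb, panel_mb

# A 4K panel run at "looks like 2560x1440":
print(scaled_mode_buffers_mb(3840, 2160, 2560, 1440))
# -> (~59 MB rendered at 5120x2880, ~33 MB at the panel), per display, per frame
```

So each 4k in that mode costs a 5k-sized render plus the downsampling pass, before you count anything the apps themselves need.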

You can fit a bunch of 8K frames into the space taken by a single Safari tab
...and unfortunately you can have a bunch more active Safari tabs if you have 3 displays...

And that person could be reluctant to pay 50% more to supply laptops to their business just to be able to drive two external monitors.

...but for every customer that does, Apple makes 50% more money, so they can afford to lose a few. And you're still relying on a couple of anecdotes to establish that this is a significant need. Most people I know run between zero and one external displays off their MacBook Air - probably because AFAIK it was only the 2018-2019 MBA that supported more than one external display anyway.

I expect Apple understands that, and their decision to limit the M1 to 2 displays was a technological one rather than a business one (perhaps an I/O limitation related to the origins of the chip).
That's a distinction without a difference - I don't doubt Apple could have incorporated a third DisplayPort output on the M1, but I equally don't doubt that would have added complexity, cost and power consumption - along with possibly lackluster performance on multiple 4ks. The point is that the base Mx chip's most significant market is the MacBook Air, and supporting a third display simply wasn't on the "must have" list.

Instead, we should ask ourselves: Can we simplify/reframe this technically complicated question in a way that enables us to get a clear answer? And the answer is yes!
...and the simplest form of that question is "is there any evidence that not supporting 3 displays has harmed the sales of Macs and Minis" - and it looks like Apple have been selling them as fast as they make them.

For instance, we know that a 2016 13" MBP with a Retina display, 8 GB RAM, and Intel integrated graphics only, could support 2 x 4k external displays, by Apple's own specifications (see https://support.apple.com/kb/SP747?locale=en_US).
In scaled mode, without lag, on a machine with no fan? A quick google of "macbook pro 4k ui lag" suggests that is not undisputed. It was certainly a regularly cited issue with Intel Mac Minis: https://forums.macrumors.com/thread...n-the-mac-mini-2018-4k-monitor-dream.2155175/

Thing is, it was Intel who determined that their iGPUs could physically support 3-4 displays - and they were probably looking at Windows which doesn't have the fractional scaling requirements of MacOS. Apple could have decided to restrict that in MacOS (but imagine the wailing and gnashing of teeth here... in reality they didn't even restrict the machines that could run three external displays, they just only advertised 2)

In the case of Apple Silicon, Apple gets to decide whether it is cost-effective to include the extra complexity needed for a third display on the base Mx chip - even if performance is less than lacklustre.

Also, time has moved on and now we'd be talking about at least one of those displays being 6k, and office users needing to run more demanding things like video conferencing (heck, even WP and spreadsheet software now includes annoying animations...)
 

MrGunny94

macrumors 65816
Dec 3, 2016
1,148
675
Malaga, Spain
If the M2 Air does support 16GB (non-custom model order) and 2 external displays I would definitely buy one to carry with me on the go and leave the M1 Pro hooked up to my external displays.
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
In scaled mode, without lag, on a machine with no fan? A quick google of "macbook pro 4k ui lag" suggests that is not undisputed. It was certainly a regularly cited issue with Intel Mac Minis: https://forums.macrumors.com/thread...n-the-mac-mini-2018-4k-monitor-dream.2155175/

Thing is, it was Intel who determined that their iGPUs could physically support 3-4 displays - and they were probably looking at Windows which doesn't have the fractional scaling requirements of MacOS. Apple could have decided to restrict that in MacOS (but imagine the wailing and gnashing of teeth here... in reality they didn't even restrict the machines that could run three external displays, they just only advertised 2)

In the case of Apple Silicon, Apple gets to decide whether it is cost-effective to include the extra complexity needed for a third display on the base Mx chip - even if performance is less than lacklustre.
OK, I didn't know that dual 4k might be borderline on the older Intel integrated graphics, even though it's included as supported in Apple's specs (which you say come from Intel).

But think it through: Let's stipulate to the idea that 2 x 4k is borderline on the 2013-2018 Intel integrated graphics. Reasonably, if it's close enough to be borderline (and to be included as supported in Intel's specs), surely the 2020 M1's GPU, which is significantly more powerful, will be more than sufficient to drive them. I mean, come on. It sounds like you're more interested in coming up with whatever arguments you can against that rather than acceding to what's most reasonable and likely.

Finally, if you really want to look at what Apple itself has determined the M1 can support, look at the specs of the M1 Mini. Surely these *do* account for non-integer scaling: Apple says the Mini can drive a 4k and a 6k. That's a total of 3840*2160 + 6016*3384=28,652,544 pixels.

By contrast, if you have an M1 Air or 13" MBP, driving its internal display plus 2 x 4k would require 2560*1600 + 2*(3840*2160) = 20,684,800 pixels.

So Apple itself is saying the M1 (including the 8GB model) can drive nearly 40% more pixels than would be required for the Air or 13" MBP to support 2 x 4k externally.

Thus we can conclude the M1's limitation to a total of two displays wasn't because of the RAM size or GPU capability.
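Spelling the arithmetic out (resolutions are from Apple's spec pages):

```python
# What Apple says an M1 Mini can drive vs. what an M1 Air would need
# for its internal display plus two 4K externals
mini_4k_plus_6k  = 3840 * 2160 + 6016 * 3384      # 28,652,544 pixels
air_plus_dual_4k = 2560 * 1600 + 2 * 3840 * 2160  # 20,684,800 pixels

print(f"{mini_4k_plus_6k / air_plus_dual_4k - 1:.0%} more")  # ~39% more
```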

No idea what the actual reason is, but the idea that it's because they based it on a chip that only had I/O for two displays seems plausible.

Also, time has moved on and now we'd be talking about at least one of those displays being 6k, and office users needing to run more demanding things like video conferencing (heck, even WP and spreadsheet software now includes annoying animations...)

6k is so uncommon at this point, especially for someone buying an M1 Air or 13" MBP, that it's almost academic. Dual 4k would be far more common than dual monitors that include one 6K. The argument that 'It's pointless to talk about dual 4k support because it would actually need to support 6k + one other' doesn't make sense for products at this level. Consider the number of people who wanted to buy an Air or 13" MBP in 2020-2021, but would have liked dual monitor support. What fraction of those do you think were planning to run a 6k as one of those two monitors?
 
Last edited:
  • Like
Reactions: Tagbert

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
It is pure guesswork, but I suspect that Apple's marketing decision is that the base "M" models only support 2 displays. So if you want to have a Macbook and two external displays, you must buy the MBP. Alas, I think that will stay, but I would be happy to be wrong.
In my office, the idea to switch to Apple was dead with this decision. The M1 MBA would have been perfect, with the crucial exception that it does not support the two external displays that we all use. For comparable Windows machines, two external monitors are supported.
"Pluginability" is not the strong side of the Mac...
David Eden-Sangwell (who has the "iCaveDave" youtube channel) had a clever idea for this: Apple could have given the user the option to disable the internal display in order to allow two large externals to be connected. This might have worked for your office, unless your typical setup was to use all three displays simultaneously (the laptop and the two externals).
 
  • Like
Reactions: nothingtoseehere

leman

macrumors Core
Oct 14, 2008
19,521
19,677
David Eden-Sangwell (who has the "iCaveDave" youtube channel) had a clever idea for this: Apple could have given the user the option to disable the internal display in order to allow two large externals to be connected. This might have worked for your office, unless your typical setup was to use all three displays simultaneously (the laptop and the two externals).

I doubt that Apple would offer a solution of that kind; it would be very unlike them.
 

nothingtoseehere

macrumors 6502
Jun 3, 2020
455
522
David Eden-Sangwell (who has the "iCaveDave" youtube channel) had a clever idea for this: Apple could have given the user the option to disable the internal display in order to allow two large externals to be connected. This might have worked for your office, unless your typical setup was to use all three displays simultaneously (the laptop and the two externals).
That would have been fun :cool:

As my office isn't exactly full of nerds: This nice, albeit theoretical, idea should include that you don't have to issue a terminal command or something similar before unplugging the MBA from the monitors and taking it to the conference room with the intention of using the internal display there ;) And vice versa, of course: it should be plug&play😃
 

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
That would have been fun :cool:

As my office isn't exactly full of nerds: This nice, albeit theoretical, idea should include that you don't have to issue a terminal command or something similar before unplugging the MBA from the monitors and taking it to the conference room with the intention of using the internal display there ;) And vice versa, of course: it should be plug&play😃
Yeah :). So what laptop(s) did your office end up going with? Or are they still deciding? If the latter, maybe the M2 Air will be released at WWDC and support 3 displays...
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
What would prevent Apple from increasing the compression, though? DSC can go higher. If they used a value of ~2.8:1, that should be sufficient for 10-bit 7k@120Hz. Is the issue that it becomes visually lossy?

Nothing really prevents it. I’m more pointing out that the XDR, with the 12bpp data rate they use at 6K with DSC, will fit in Thunderbolt @ 120Hz, but not DP1.4.

I’ll just add that the math you provided seems a bit off… Apple is using 10bpc, or 30bpp uncompressed. So when DSC is brought in, they compress it down to 12bpp (2.5x). The next rung down is 10bpp (3x), but at 10bpp, 6K @ 120Hz is still *just* outside the data rates available to DP1.4, due to the blanking timings. 26.2Gbps required at 10bpp vs 25.92Gbps available, assuming CVT-R2 timings which have the least overhead.
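Showing my work (the 80-pixel hblank is from CVT-R2; the 198-line vblank is back-derived from the pixel rate, so treat it as approximate):

```python
# 6K @ 120Hz with CVT-R2 reduced blanking
pixel_rate = (6016 + 80) * (3384 + 198) * 120   # = 2,620,304,640 px/s

required_10bpp = pixel_rate * 10 / 1e9          # 26.20 Gbps needed
available_dp14 = 4 * 8.1 * 8 / 10               # 4 HBR3 lanes after 8b/10b = 25.92

print(f"{required_10bpp:.2f} vs {available_dp14:.2f} Gbps")  # just misses
```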

To fit into DP1.4 fully without tiling, they have to drop to 8bpp compressed. While the claim for DSC is that it is “visually lossless”, it is still a lossy algorithm, so the question is more just how far you can push it before artifacting becomes noticeable. And that’s not something there seems to be much data on, in part because DSC isn’t implemented in many displays, so there’s not a ton of “outside the lab” data for the consumer to dig through. But I’d imagine that as Apple is trying to push these as premium displays, they are likely to try to keep compression to a minimum.
 

playtech1

macrumors 6502a
Oct 10, 2014
695
889
Display I/O seems like it will be a big focus for M2, simply because it was relatively undercooked in the base M1 model, so it's ripe for some love from the engineers in v2. I would be pretty disappointed if we get anything less than an HDMI 2.1 port on the Pros and support for at least two external high-refresh 4K displays on the base M2 (although I would like triple external display support, there's not much point going for more if the Air is still limited to two USB-C ports and one display per port).

On the ultra high-end of monitors I wonder if Apple will go for something semi-proprietary to get over the limitations of TB4. If they want to push the specs of the XDR then this is going to be a problem for a number of years given how far away TB5 / USB5 are. Could be a good selling point for an AS Pro or Studio 2 - the new XDR works with the old models at 60Hz, but if new models support 120Hz it could drive sales for those that care about such things.
 

nothingtoseehere

macrumors 6502
Jun 3, 2020
455
522
Yeah :). So what laptop(s) did your office end up going with? Or are they still deciding? If the latter, maybe the M2 Air will be released at WWDC and support 3 displays...
We ended up with 14" Lenovo ThinkBooks: AMD Ryzen 7 4700U, 16 GB RAM, 512 GB SSD. Ordered a while ago, delivery took some time, and the IT department is still preparing them. Therefore, no first-hand experience yet :) But Lenovo devices have proved to be reliable for us, so I think they will be fine.
Let's see in a couple of years what :apple: will be offering then ;-)
 
  • Like
Reactions: theorist9

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Display I/O seems like it will be a big focus for M2, simply because it was relatively undercooked in the base M1 model, so it's ripe for some love from the engineers in v2. I would be pretty disappointed if we get anything less than an HDMI 2.1 port on the Pros and support for at least two external high-refresh 4K displays on the base M2 (although I would like triple external display support, there's not much point going for more if the Air is still limited to two USB-C ports and one display per port).
Agreed.
On the ultra high-end of monitors I wonder if Apple will go for something semi-proprietary to get over the limitations of TB4. If they want to push the specs of the XDR then this is going to be a problem for a number of years given how far away TB5 / USB5 are. Could be a good selling point for an AS Pro or Studio 2 - the new XDR works with the old models at 60Hz, but if new models support 120Hz it could drive sales for those that care about such things.
At least in theory, there's no need to go to a proprietary standard, since TB4 with DisplayPort Alt Mode 2.0 will do it.

However, in practice, as @Krevnik pointed out below, no devices have yet shipped with DP 2.0. So yes, I suppose they might need to do something on their own, such as implementing their own duplex->simplex conversion. Though I suppose it depends on if and when they plan to release a 120 Hz 6k or 7k monitor; if it's late enough, DP 2.0 should be available, since the first source and sink devices were just certified.
 
Last edited:

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Nothing really prevents it. I’m more pointing out that the XDR, with the 12bpp data rate they use at 6K with DSC, will fit in Thunderbolt @ 120Hz, but not DP1.4.

I’ll just add that the math you provided seems a bit off… Apple is using 10bpc, or 30bpp uncompressed. So when DSC is brought in, they compress it down to 12bpp (2.5x). The next rung down is 10bpp (3x), but at 10bpp, 6K @ 120Hz is still *just* outside the data rates available to DP1.4, due to the blanking timings. 26.2Gbps required at 10bpp vs 25.92Gbps available, assuming CVT-R2 timings which have the least overhead.

To fit into DP1.4 fully without tiling, they have to drop to 8bpp compressed. While the claim for DSC is that it is “visually lossless”, it is still a lossy algorithm, so the question is more just how far you can push it before artifacting becomes noticeable. And that’s not something there seems to be much data on, in part because DSC isn’t implemented in many displays, so there’s not a ton of “outside the lab” data for the consumer to dig through. But I’d imagine that as Apple is trying to push these as premium displays, they are likely to try to keep compression to a minimum.
Doesn't the M1's TB4 use DP 2.0 rather than 1.4?

For the math, I was just using this, which I cited in my OP. But it's not as detailed as your calculation:
https://gist.github.com/siracusa/bb006d14e9906ac215fdee7685dc4b4c

And yeah, while what constitutes digitally lossless is clearcut, "visually lossless" is fuzzy, since it depends on what differences people can and can't see, which varies based on the person, the viewing conditions, and the content.
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
Doesn't the M1's TB4 use DP 2.0 rather than 1.4?

Nobody to date has shipped a DP2.0 device. They are very much on the near horizon though. The AMD 6000-series GPUs are supposed to be DP2.0 capable, and I would be surprised if Nvidia doesn't support it in the 4000-series GPUs. Intel's Alchemist GPUs look like they should as well. No monitors yet, and there likely won't be until there's enough GPUs in the wild for it to make sense.

In the case of Thunderbolt, there's nothing really coupling a TB revision to a DP revision, and TB4 adds nothing new to the TB3 spec. It is a "minimum required features" spec bump.

For the math, I was just using this, which I cited in my OP. But it's not as detailed as your calculation:
https://gist.github.com/siracusa/bb006d14e9906ac215fdee7685dc4b4c

And yeah, while what constitutes digitally lossless is clearcut, "visually lossless" is fuzzy, since it depends on what differences people can and can't see, which varies based on the person, the viewing conditions, and the content.

Yeah, Siracusa's math is on the mark; there's just a key term doing a lot of heavy lifting (emphasis mine), as they are calculating using an 8bpp rate. That is as compressed as you can go without chroma subsampling, as Siracusa also points out:

So the minimum bandwidth when DSC is enabled for an RGB stream with no chroma subsampling is simply [pixels per second] * 8 = 2,620,304,640 * 8 = 20.96 Gbit/s. That would work out to a compression ratio of 3.75:1, which is on the high side for being "visually lossless", but is still within spec.
 
  • Like
Reactions: Tagbert

theorist9

macrumors 68040
Original poster
May 28, 2015
3,880
3,060
Nobody to date has shipped a DP2.0 device. They are very much on the near horizon though. The AMD 6000-series GPUs are supposed to be DP2.0 capable, and I would be surprised if Nvidia doesn't support it in the 4000-series GPUs. Intel's Alchemist GPUs look like they should as well. No monitors yet, and there likely won't be until there's enough GPUs in the wild for it to make sense.

In the case of Thunderbolt, there's nothing really coupling a TB revision to a DP revision, and TB4 adds nothing new to the TB3 spec. It is a "minimum required features" spec bump.



Yeah, Siracusa's math is on the mark; there's just a key term doing a lot of heavy lifting (emphasis mine), as they are calculating using an 8bpp rate. That is as compressed as you can go without chroma subsampling, as Siracusa also points out:
Ah, then DP Alt Mode 2.0, while it's supported under USB4, is not yet available anywhere, since it obviously requires DP 2.0.

I subsequently found this May 9, 2022 press release at DisplayPort.org—the first DP 2.0 source and sink devices were just certified.

Given that the ability to turn duplex into simplex seems to offer such an obvious benefit, why couldn't TB implement this without having to wait for DP to do it?

Also, I'm afraid I didn't follow what you meant previously when you wrote: "I’m more pointing out that the XDR, with the 12bpp data rate they use at 6K with DSC, will fit in Thunderbolt @ 120Hz, but not DP1.4." [emphasis mine] How is it relevant that DP1.4's bandwidth (25.92 Gb/s*) is less than TB4's (38.88 Gb/s*), given the Macs can do TB4? [*max payload bandwidth, which is less than max link bandwidth.]
 
Last edited:

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
Then is DP Alt Mode 2.0, which is supported under USB4, currently available even though DP2.0 is not?

USB4 isn't a monolithic standard, and alt modes are optional. The difference is that an alt mode is a USB-C extension, not specifically part of the protocol. The other difference is that USB4 can tunnel DP data, using the same techniques that Thunderbolt can. However, much like Thunderbolt, there's no provisions for requiring a specific revision of DisplayPort for tunneling to work. And even if USB4 did require that USB controller chips supported DP2.0, you still need a source of data that operates at that speed for it to do any good. This is the problem. You need the whole chain to support DisplayPort 2.0, not just a single part of it. And if I was a standards body, I wouldn't want to wait for the whole chain to move over before "USB4" products existed.

All that said, because DP2.0 uses a similar PHY layer to USB4, I would expect it does make certain implementations easier.

Also, I'm afraid I didn't follow what you meant previously when you wrote: "I’m more pointing out that the XDR, with the 12bpp data rate they use at 6K with DSC, will fit in Thunderbolt @ 120Hz, but not DP1.4." [emphasis mine] How is it relevant that DP1.4's bandwidth (32.4 GB/s) is less than TB4's (33.88 Gb/s), given the Macs can do TB4?

Make sure you are comparing either bandwidth to bandwidth, or data rate to data rate. You seem to have mixed the two up here. DP1.4's data rate is just under 26Gbps. And Thunderbolt's bandwidth, including the dedicated display signal portion, is indeed 40Gbps, while 8Gbps of that is coding overhead. Thunderbolt should get around 32Gbps data rate if all you are doing is carrying a display signal, so a little over 6Gbps more data rate, or just shy of 8Gbps more bandwidth. DP1.4's PHY layer and Thunderbolt 3/4 both use 8b/10b coding AFAICT, so the bandwidth numbers are comparable in this case, since you get the same data rates from the same bandwidth. That said, when talking about tunneled DP data, it's the data rates that are important, since that sets the limit of the data carried per stream across a USB4 or TB cable.
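A quick way to see the distinction, treating both links as 8b/10b (which, again, is my AFAICT):

```python
def payload_gbps(line_rate_gbps):
    """Data rate remaining after 8b/10b coding: 8 payload bits per 10 wire bits."""
    return line_rate_gbps * 8 / 10

print(payload_gbps(32.4))  # DP1.4 (4 lanes x 8.1Gbps): 25.92 Gbps data rate
print(payload_gbps(40.0))  # Thunderbolt 3/4 at 40Gbps: 32.0 Gbps data rate
```

Compare 32.4 against 40 (bandwidth) or 25.92 against 32 (data rate) and you get the same story; mixing one of each is where it goes wrong.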
 