
Riven

macrumors newbie
Original poster
Sep 12, 2016
SF, CA
With Apple making external GPUs (eGPUs) a reality in macOS 10.13, I've decided to wait as long as I need to until Apple releases a computer with Thunderbolt 4, specifically a MacBook Pro. (32 GB of DDR4L RAM might come as well by then.) If you look at my short post history, you might doubt me, because at first I thought I was going to buy a Razer Blade Pro after Apple's dismal showing of rMBPs (imo), and then just today I said I was going to wait for the Mac Pro. In my mind, native support for external GPUs on macOS was an absolute pipe dream. But then I ran across the external GPU Development Kit, and because of it I'm absolutely locked into Apple for my next computer.

Currently, the Alpine Ridge TB3 controller has 40 Gb/s throughput per port, and while that is enough bandwidth to run a GPU decently, think about compute-heavy, high-end GPUs like the GTX 1080 or any workstation GPU: they will certainly run into the TB3 bottleneck. (I haven't done enough in-depth research to know how much a GPU's performance will be affected.)
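To put rough numbers on that, here's a back-of-envelope sketch; the ~22 Gb/s figure for usable PCIe payload over TB3 is a commonly cited estimate rather than something I've measured, so treat it as an assumption:

```python
# Back-of-envelope comparison of the PCIe bandwidth an eGPU actually sees.
# Assumptions: TB3's 40 Gb/s link reserves room for DisplayPort/USB traffic,
# leaving roughly 22 Gb/s of PCIe payload on Alpine Ridge (commonly cited
# estimate); PCIe 3.0 delivers ~7.9 Gb/s of effective data per lane.
PCIE3_LANE_GBPS = 8 * 128 / 130   # 8 GT/s with 128b/130b encoding
TB3_PCIE_GBPS = 22.0              # assumed usable PCIe tunnel over TB3

x16 = 16 * PCIE3_LANE_GBPS        # what a desktop x16 slot provides
x4 = 4 * PCIE3_LANE_GBPS          # a common eGPU point of comparison

print(f"PCIe 3.0 x16 : {x16:.0f} Gb/s")
print(f"PCIe 3.0 x4  : {x4:.0f} Gb/s")
print(f"TB3 eGPU     : {TB3_PCIE_GBPS:.0f} Gb/s "
      f"(~{TB3_PCIE_GBPS / x16:.0%} of a full x16 slot)")
```

Even a hypothetical 80 Gb/s TB4 link would still fall well short of a full x16 slot, so the biggest cards would likely remain somewhat constrained.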

Since bandwidth has doubled with every iteration of TB, it's sensible to think TB4 will have 80Gb/s throughput per port. With that, I'll finally be able to run a gaming or workstation-class GPU from a laptop and have the setup of my dreams: portability when I need it, and graphics, compute power, a docking station, and charging when I need it at home, all through one cable.

If you've been waiting for external GPUs, or something like this, wait for TB4, because high-end GPUs might still be significantly bottlenecked; I remember reading a long while ago that TB2 could run a GTX 750 at only 85% of max performance. I'm willing to wait up to 3 years from the date of posting. Perhaps it will come out earlier, who knows.

Thoughts? Comments? Estimations on % of GPU performance lost due to TB3/hypothetical TB4 bandwidth constraints?
 
Since bandwidth has doubled with every iteration of TB, it's sensible to think TB4 will have 80Gb/s throughput per port.

Not necessarily. TB1 to TB2 didn't double the bandwidth; TB2 simply aggregates TB1's two 10 Gb/s channels into one 20 Gb/s channel, which gives the same total bandwidth.

Also:
2011 - TB1
2013 - TB2
2016 - TB3

Which shows an increasing gap between releases, so you might be waiting 4+ years for a consumer product with TB4.
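For reference, a quick sketch of the per-port numbers behind that timeline (total link bandwidth, not the smaller usable PCIe payload):

```python
# Thunderbolt link bandwidth per port, by generation. Note that TB2 did not
# add bandwidth over TB1; it aggregated TB1's two 10 Gb/s channels into one.
tb_generations = {
    "TB1 (2011)": "2 x 10 Gb/s channels (20 Gb/s total)",
    "TB2 (2013)": "1 x 20 Gb/s channel (same 20 Gb/s total, aggregated)",
    "TB3 (2016)": "40 Gb/s",
}
for generation, bandwidth in tb_generations.items():
    print(f"{generation}: {bandwidth}")
```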
 
Unless you are running two graphics cards in your eGPU in SLI, the bandwidth of TB3 is currently way more than enough for you. Since it will take around 4 years for AMD and NVIDIA to upgrade their architectures again, I doubt there will be anything that won't run on a TB3 eGPU before TB4 comes out anyway; you really are waiting for no good reason.
 
They know what we want and have the capacity to give it, but they will not, because their only objective is drip-feeding as much lower-capacity nonsense as possible for as long as possible. You see, their sole interest is profit, not benefitting others. They could not care less about benefitting others; that only comes as a subsidiary effect of selling something for profit.
 
Honestly, if you want better GPU performance and more RAM, I'd just ditch the Mac. There are laptops out there with a 1080 already, and 32GB on a PC is not an issue, even in thin ones like the XPS 15. Apple just decided battery life was more important than RAM, so it stuck with low-powered RAM. I've used several PC laptops with standard DDR4 RAM and battery life hasn't been an issue. Really, how many people work all day, every day, without plugging in at some point? Most people I know are plugged in most of the day in the office and occasionally unplug for a meeting, so battery life isn't that much of a concern, certainly not to the point where I'd be willing to sacrifice RAM or GPU for battery.

Unless you really, really need macOS, it would be easier to just switch. Yes, macOS is lovely, and yes, having iCloud integration with other devices (phone, watch) is fantastic and nothing else on the market comes close, but if your primary computing platform is too slow it's time to look at priorities. Is the performance stopping you from earning money, for example?
 
Honestly, if you want better GPU performance and more RAM, I'd just ditch the Mac. There are laptops out there with a 1080 already, and 32GB on a PC is not an issue, even in thin ones like the XPS 15. Apple just decided battery life was more important than RAM, so it stuck with low-powered RAM. I've used several PC laptops with standard DDR4 RAM and battery life hasn't been an issue. Really, how many people work all day, every day, without plugging in at some point? Most people I know are plugged in most of the day in the office and occasionally unplug for a meeting, so battery life isn't that much of a concern, certainly not to the point where I'd be willing to sacrifice RAM or GPU for battery.

Unless you really, really need macOS, it would be easier to just switch. Yes, macOS is lovely, and yes, having iCloud integration with other devices (phone, watch) is fantastic and nothing else on the market comes close, but if your primary computing platform is too slow it's time to look at priorities. Is the performance stopping you from earning money, for example?

One could argue that if you want GPU performance and more *and you're a professional* you should ditch laptops.
 
One could argue that if you want GPU performance and more *and you're a professional* you should ditch laptops.
These days, especially with Thunderbolt 3, the desktop form factor is likely to see even less usage, even by pros. Relying on a laptop plus a Thunderbolt 3 dock (which essentially turns your laptop into a desktop) is an option for many professionals who don't need a powerful desktop CPU for their job. That said, there are some laptops (if you can call them that) which do come with desktop-class CPUs, so even that scenario is covered.
 
These days, especially with Thunderbolt 3, the desktop form factor is likely to see even less usage, even by pros. Relying on a laptop plus a Thunderbolt 3 dock (which essentially turns your laptop into a desktop) is an option for many professionals who don't need a powerful desktop CPU for their job. That said, there are some laptops (if you can call them that) which do come with desktop-class CPUs, so even that scenario is covered.

Sure, it's an option, but a very bad one for anybody working in the medium to high-end media industry. Also, laptops with "desktop-class" CPUs are a sad joke, sorry. If you mean the MacBook Pro with the i9, well, we know how bad things are.

If you mean those 4-5 kg PCs which some call "laptops", I am not sure what to say... :)
 
True. Or have a laptop and a desktop? I only run on a laptop when I have to; in my home office I'm always on my desktop.

That is precisely what I am inclined to do. I used to be on a Hackintosh desktop, got myself a 15" MBP in 2011 when I was going out to give presentations, and then the 2016 when I was travelling every other day. For the past year I have been at my home office, and I am leaning towards a desktop once again.
 
No official news, though that doesn't mean they are not developing a future controller. After all, USB is catching up with the standard.
 
For computation, TB3 isn't really an issue, since most of the time you're dumping data over to the GPU's memory to process, in a similar vein to how most of those ridiculous mining motherboards with 18 or so PCIe slots were all x1 slots.
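A minimal sketch of that pattern, assuming a CUDA-capable card and a working PyTorch install (purely illustrative; the same idea applies to whatever compute API is driving the eGPU):

```python
# Illustrative only: why link bandwidth matters less for pure compute.
# The working set is copied to the card's memory once, then many kernels
# run against it without touching the host; only a tiny result comes back.
import torch

x = torch.randn(4096, 4096)   # created in host RAM
x = x.to("cuda")              # one transfer across the TB3/PCIe link

for _ in range(200):          # everything in this loop stays in GPU memory
    x = torch.tanh(x @ x)

print(x.norm().item())        # only a single scalar crosses the link back
```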

Personally, I'd rather see either:

- Thunderbolt and USB-C being merged into one standard; or
- The damn connectors for Thunderbolt 4 and USB-whatever's next being different.

TB3/USB-C port sharing was a good idea in theory. In practice it's made buying cables 10 times more difficult than it needs to be.
 
True, but that doesn't happen for all GPU workloads. That's why ALL the tests carried out so far on eGPUs have shown lower performance compared to the same cards in PCIe slots.

I wouldn't waste my money on an eGPU unless my computer needed an additional discrete GPU other than the internal one and had no additional slots. Using an eGPU as the main GPU, in my opinion, is wasted money (cost of the enclosure + loss of performance).
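If you want to see how much of that loss is just the link, a rough sketch like this (again assuming PyTorch with a CUDA-capable card; results vary wildly by enclosure, cable and host) measures raw host-to-device copy throughput, which is where a TB3 eGPU trails a card in a real x16 slot:

```python
# Rough host-to-device copy throughput check. Illustrative sketch only:
# the absolute number depends heavily on the enclosure and machine.
import time
import torch

buf = torch.randn(256, 1024, 1024)   # ~1 GiB of float32 in host RAM
torch.cuda.synchronize()              # make sure the CUDA context is ready

start = time.perf_counter()
buf_gpu = buf.to("cuda")              # the transfer being measured
torch.cuda.synchronize()
elapsed = time.perf_counter() - start

gib = buf.numel() * buf.element_size() / 2**30
print(f"host -> device: {gib / elapsed:.1f} GiB/s")
```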
 
For computation, TB3 isn't really an issue, since most of the time you're dumping data over to the GPU's memory to process, in a similar vein to how most of those ridiculous mining motherboards with 18 or so PCIe slots were all x1 slots.

Personally, I'd rather see either:

- Thunderbolt and USB-C being merged into one standard; or
- The damn connectors for Thunderbolt 4 and USB-whatever's next being different.

TB3/USB-C port sharing was a good idea in theory. In practice it's made buying cables 10 times more difficult than it needs to be.

USB4 and TB3 are going to be essentially the same thing and use USB-C. Nothing has been announced about TB4, or even whether it will be a thing.
 
True. Or have a laptop and a desktop? I only run on a laptop when I have to; in my home office I'm always on my desktop.
There aren’t a lot of widely used use cases that call for more computer performance than what is available on a high-end laptop. I’m still using a 16 GB 2016 MB Pro that I connect to a 27” 4K monitor for desktop use. What I would look for first is a 34-38” monitor with 4-5K resolution and a faster SSD. Processing power just isn’t much of a priority for my office-type workflow, which includes the portability afforded by a laptop.

The latest 2018 MB Pro processors are barely 50% faster than my 2015, and it’s hard to imagine current iMac desktop performance significantly changing anything. This may be why Apple has decided to move into services. Computer upgrades of the past were performance-driven; computer upgrades now are driven by wear-out and breakage. We’re now on 7-8 year replacement cycles, not the old 2-year cycle.
 
There aren’t a lot of widely used use cases that call for more computer performance than what is available on a high-end laptop. I’m still using a 16 GB 2016 MB Pro that I connect to a 27” 4K monitor for desktop use. What I would look for first is a 34-38” monitor with 4-5K resolution and a faster SSD. Processing power just isn’t much of a priority for my office-type workflow, which includes the portability afforded by a laptop.

The latest 2018 MB Pro processors are barely 50% faster than my 2015, and it’s hard to imagine current iMac desktop performance significantly changing anything. This may be why Apple has decided to move into services. Computer upgrades of the past were performance-driven; computer upgrades now are driven by wear-out and breakage. We’re now on 7-8 year replacement cycles, not the old 2-year cycle.

Nonsense. Utter nonsense. Professional photo editing, video editing, VM labs, development, etc. All of these are quite common and absolutely choke a laptop, but run well on a decent desktop; there are enough YouTubers saying they cannot edit 4K on a laptop and have moved to PCs or an iMac Pro. There is no way I could run my full VM lab on my laptop (a maxed-out P51), as it isn't fast enough, and even 64GB of RAM is a struggle for some of the simulations I need to run, and no, I can't run this workload in the cloud as it would be too expensive. Nor would I use it for professional photo editing (42-megapixel files from my camera), or at least not for long periods; again, it's not fast enough. Building 1:1 image previews in LR is noticeably quicker on my desktop (Core i7-6850K, 6-core, 128GB RAM, 980Ti), and there are times when I've considered upgrading this to something newer with more cores simply to save 30+ minutes on each import. This is dead time, as the computer is maxed out on CPU when I build previews.

Apple is creating products for content consumers (yourself), but not for content creators. I agree any computer from the past 5 or more years is good enough for general 'stuff', but their strategy is in tatters for the professional market. Just speaking to their own engineers and power users would tell them what is required, and it's not some thin-and-light crap they are producing now.
 
There aren’t a lot of widely used use cases that call for more computer performance than what is available on a high-end laptop.
Compiling software, photo/video editing, running VMs or encoding can be done on a laptop. But on a fairly recent laptop like my MacBook Pro from last year, it is not a great experience.

To me, the main issue is not performance, but rather thermal capacity. The laptop heats up and makes an unpleasant amount of noise.
 