
Poll: Bring back eGPUs or wait for Apple silicon? (25 voters)
RDNA 4 reportedly doesn’t have a high-end variant to go against the 5080/5090.

Oh? How is that?

Sure looks like they are making W7900 Radeon Pro 48GB cards. And a normal 7900 XTX? Both are high end in their respective categories.

On the topic of drivers for newer AMD cards, they put the responsibility on Apple. I can dig out the email from AMD.
 
The W series are workstation-class cards rather than consumer/gaming-focused GPUs, and AMD's strategy there is unrelated to the consumer side. On the heels of deciding not to release a 4090 competitor last year, AMD has also gone on record that it will not be going after Nvidia's flagship 5000-series parts either.
 
Apologies, no one explained up front that "workstation-class" cards don't count as "high end." They are certainly the highest end in terms of both price and specs.

The 5090 appears to be high end on power consumption, if the rumours are correct: 600 W!
 
The chip it is based on matters. Aside from RAM, there is no difference between an AMD workstation card and its consumer equivalent. The data center cards use a different chip (CDNA vs. RDNA).

And the 5090 isn’t using more power than the 4090; the power-usage rumors for the two cards are, again, the same.
 
I know everyone is focused on gaming, but what about 3D rendering and AI? Both fields are becoming very GPU-dependent, and having an extra GPU around can help a lot. Adding eGPU support would really help the Mac Studio and Mac Pro make sense for 3D artists. Currently, people use them but then offload their rendering to second computers (PC or Linux boxes, which adds complexity) or to render farms hosted in the cloud (which costs money).
 
Apple should license this product and sell it as the Big Mac (mini PC) and Apple Pie (eGPU) combo.

That’s actually quite amazing. I’ve always dreamed of something like a portable mini-computer that you can dock to change its functionality. If we could use phones as the mobile PC, that would be the ultimate setup for computing.

Also I love the Apple Pie and Big Mac names!
 
The problem is that the cost would outweigh the return. Apple had the technology back in 2018 but abandoned it for the Apple silicon series, basically turning their back on the gaming community who used the eGPU option. So now that market has either continued to run what they have, like me, or switched over to PCs and gaming consoles at a reduced cost.
 
Oversimplified:

- NVIDIA's faulty GPUs burned Apple pretty badly = they switched to AMD
- AMD also had faulty GPUs and couldn't truly compete against NVIDIA = in-house alternative (M-series)
- Intel's faulty CPUs burned Apple pretty badly (lol) = rumors of a possible switch to AMD, but AMD couldn't compete at the time = in-house alternative (M-series)

They must've been working on the M-series for a long time, and each time they were burned it just further justified the need to control the hardware stack. The M-series put a nail in the eGPU coffin and launched it into the sun. Why support a standard that would let the competition outperform you?

Because the eGPU market had always been a small niche community, continued support wasn't really worth the resources, especially since the M-series was ready to launch by the time the last set of AMD GPUs with macOS support was released. Add the limited bandwidth of Thunderbolt 4/USB4 and any high-end card would always be handicapped.

OCuLink and TB5 never had a chance to bring better eGPU support to the Mac: an extra port while Apple was trying to cut ports down, and an in-house alternative already in place...

Should they? Speaking as an eGPU user myself: I had an Intel Mac mini and a MacBook Pro that I used a 6900 XT with. It made playing games so much better, and I wasn't a heavy enough user to justify a dedicated gaming PC. The games I did play all had official macOS versions, which was even less reason to spend on one. I came across a good deal on an eGPU enclosure and the GPU, so to me the price was well worth it.

Now I have a gaming PC, because I started gaming more and my buddies favored games that did not have macOS versions. I still held onto my eGPU and did my own casual Mac gaming from time to time. I've since moved on to a newer MacBook with an M-series chip, so I can't use the eGPU anymore. I would still want eGPU support to come back, if only because it would mean gaming meant more to Apple than it currently does. It might also lead to Apple finally getting with Valve on adding official Proton/GPTK support.

But I'm old, and my enthusiasm for new video games isn't what it used to be. Especially since they just don't make them like they used to.
 
AMD puts the driver issue on Apple. I got that comment from them last week.

I can use the W7900 Radeon Pro in Windows on my 2019 Mac Pro, but not in macOS.

It would be nice to be able to use cards like the RTX 4090, which are really, really powerful for many tasks outside of gaming. For the money, the 4090 is hard to beat.

I don’t need an eGPU enclosure, though I do have one; I can just use the cards directly in my machine without the bother.

Yeah, I was thinking of buying a preowned Intel Mac Pro for that reason. I miss dual-booting and running Windows inside macOS via Parallels.
 
Since the start of this thread, I still think Apple should make an Apple Silicon eGPU / dock. It seems like the Apple silicon performance gains are starting to plateau. It would be the only way to play AAA titles on Mac.

If I already bought a Mac, I’d rather take the money I was going to spend on a gaming PC and invest in an eGPU to simplify my desk setup.
 
Yeah, I was thinking of buying a preowned Intel Mac Pro for that reason. I miss dual-booting and running Windows inside macOS via Parallels.
Well, you can get a 7,1 Mac Pro, and more cheaply now, depending on the spec and condition.

There are still some new-in-box ones hidden around eBay, but they are very high spec (1.5TB of RAM) and expensive. Anything with desirable GPUs like the W6800/W6900 is also hard to find.

I would suggest not buying one, though. Instead, get the cheapest Mac that will do the job and build a proper powerful PC with an RTX 5090.

I have two 7,1 machines, and what happens if they go wrong I don’t know; Apple would probably try to give me M2 Mac Pro replacements. Both are still on AppleCare. I keep the 7,1s because they are very fast for what I use them for.
 
Yeah, I was thinking of buying a preowned Intel Mac Pro for that reason. I miss dual-booting and running Windows inside macOS via Parallels.

I seriously considered that, but what turned me off is that it's a dead end. It can't officially run Windows 11 because it doesn't have a TPM chip, and while you can get around that, who knows for how long. Windows 10 isn't going to be supported much longer either. So you have a beautiful computer with a state-of-the-art graphics card in a PCI Express 3.0 slot and a six-year-old processor that can't keep up. It doesn't make sense.
 
There are a lot of uses for eGPUs beyond gaming. In video production we still need more power, even with Apple's latest gear, especially when it comes to stuff like After Effects, DaVinci and Premiere. Folks using C4D and other 3D design programs can never have enough speed either.

When I worked in broadcast we had 6 maxed-out Mac Pros on their own network, with a proprietary fast storage solution connected by fiber to each workstation. Then we had 6 more render servers in the machine room. So when we had a big 3D package to finish, it could be rendered by both the server farm and the 6 workstations working overnight. And that was back when you could throw the biggest video card available in your Mac, though the video card really only helped with previews when crafting the graphics; rendering was mostly CPU-bound. Heck, the render farm had only chipset video (or maybe some Maxon chip?).
None of the arguments I've heard for why there's no eGPU hold water.

"They can't, because the GPU on Apple Silicon is on-die."
This was essentially the case with computers that had the GPU on the northbridge. It wasn't an impediment, as there are PCIe lanes and code for that.

"There aren't enough PCIe lanes."
So you're telling me Thunderbolt is a lie?

"Apple hasn't written drivers."
This is actually the only real reason.

The number of people who use eGPUs and need them to output video is relatively small compared to all Mac users. We simply aren't a huge profit center. Data scientists can move their workloads to the cloud, and likely don't have to anyway, since they only use the brute power of the eGPU, not its video output. Video guys? Well, we should just buy the next biggest Mac.

Apple could go ahead and enable GPU/eGPU support, but it would have to contend with not just the cost of engineering the code and updating the firmware, but also the added complexity this adds to supporting Macs. I would love to see it become a thing, but with Mac computers being a very small part of the revenue pie, and people who want GPUs a smaller slice of that, I don't expect it soon.
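
To make the "drivers" point concrete: the app-facing side of eGPU support still exists in Metal today. Here's a minimal Swift sketch using the Metal discovery API from the Intel eGPU era (MTLCopyAllDevicesWithObserver, isRemovable are real API; the assumption is that on Apple Silicon it only ever reports the built-in GPU, since there are no eGPU drivers):

    import Metal

    // Enumerate every GPU macOS currently exposes, and observe hot-plug events.
    // On a supported Intel Mac, eGPUs show up here with isRemovable == true.
    let (devices, observer) = MTLCopyAllDevicesWithObserver { device, event in
        // Fires with .wasAdded / .removalRequested / .wasRemoved
        // as an eGPU is plugged in or pulled.
        print("GPU event for \(device.name): \(event.rawValue)")
    }

    for device in devices {
        print("\(device.name) removable=\(device.isRemovable) lowPower=\(device.isLowPower)")
    }

    // Keep `observer` alive for as long as you want notifications;
    // call MTLRemoveDeviceObserver(observer) to stop.

The missing piece isn't this API; it's the kernel-side driver underneath it, which is exactly the part Apple hasn't written for Apple Silicon.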
 
Every recommendation I read online concludes that eGPUs can't be recommended due to bandwidth limits on TB4, which leave your expensive GPU underperforming. I don't know how things will look with TB5.
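
To put rough numbers on the gap (back-of-envelope, with round assumed figures rather than measurements): TB3/TB4 tunnels roughly a PCIe 3.0 x4 link's worth of traffic to the enclosure, while a desktop slot gives the same card a full PCIe 4.0 x16 link.

    // Back-of-envelope only; round, assumed figures, not measurements.
    let tb4TunneledPCIeGBps = 32.0 / 8.0  // ~32 Gb/s of tunneled PCIe ≈ 4 GB/s
    let pcie4x16GBps = 16.0 * 1.97        // ~1.97 GB/s per PCIe 4.0 lane, x16 ≈ 31.5 GB/s
    print(pcie4x16GBps / tb4TunneledPCIeGBps)  // ≈ 8x more link bandwidth in a desktop slot

How much that ceiling hurts depends on the workload: games that stream a lot of data across the link suffer most, while compute jobs whose data stays resident in VRAM suffer least.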

I used a Sonnet eGPU case with an NVIDIA 1080 on an 8th-generation Intel ThinkPad back in the day, and I thought it was marvelous.

Every compatible game (native and via Crossover) that I've tried on my new M4 Max runs well enough to keep me happy. I haven't thought of an eGPU in a while.
 
Then we had 6 more render servers in the machine room. So when we had a big 3D package to finish it could be rendered by both the server farm and the 6 workstations working overnight.
I remember render farms back in the day but never had 3D work heavy enough to need one. I was on PC back in those times, doing mostly artwork for games, which was all polygon-restricted; otherwise you'd kill the frame rates in-game.

The GPUs back then (2001/2002) were quite feeble compared to what we take for granted now.
 
I would love to see it become a thing, but with Mac computers being a very small part of the revenue pie, and people who want GPUs a smaller slice of that, I don't expect it soon.

They are the richest company on Earth, with effectively unlimited resources. They can do anything they want to; they just don't want to, because for 99% of people what they have is enough, and they send the rest off to buy gaming PCs or workstations (since the Mac Pro is a veritable joke now).

I think it's important to remember that even among gamers, PC players are a tiny fraction, and among those, the ones who buy expensive GPUs are rarer still. You're talking about a fraction of a fraction of people.

And yet you can get Linux drivers for these cards.
 
The future of Apple Silicon will be discrete GPUs, but not what you're expecting. Apple will never partner with Nvidia or AMD. Never again.

The discrete GPU will come from Apple's own silicon. For example: a customer buys a base M-series Mac and pairs it with a Max- or Ultra-class GPU as the discrete part (an M4 machine with an M4 Ultra GPU).

We've seen how the original M1 Max had the interconnect that allowed the M1 Ultra to exist. We know Apple ships binned M-series chips with turned-off CPU/GPU cores. We know the engineering exists.

Apple can and will eventually give buyers the option to pair their Mac purchase with a dedicated Max or Ultra GPU as the discrete GPU.

Not a matter of IF, but WHEN.
 