This thread is full of nonsense

The idea that Apple is behind on graphics is completely detached from reality. Metal and Apple Silicon are very well supported for 3D and AI work. The state of graphics on Mac is the best it has ever been.

In Blender benchmarks, the M4 Max is about as fast as a desktop RTX 4070 and ahead of every GPU AMD and Intel currently ship. The M3 Ultra is around RTX 4080 SUPER level, and a theoretical M4 Ultra would be in 4090 territory. No "professional" is crying about this.

There is also a lot of excitement for Apple hardware in the AI community. Metal is very well supported by software like llama.cpp, and Apple has popular frameworks like MLX that make inference faster and more efficient on Apple Silicon by taking advantage of unified memory. MLX also allows for distributed clusters, e.g. a set of Mac Studios communicating over a Thunderbolt ring network. Performance is great.
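To give a flavour of what that looks like in practice, here is a minimal Python sketch using the mlx-lm package to run a quantized model locally on Apple Silicon. Treat it as an illustration only: it assumes you have installed mlx-lm (pip install mlx-lm), and the model identifier is just one example from the mlx-community collection on Hugging Face.

```python
# Minimal local-inference sketch with MLX (assumes `pip install mlx-lm`).
# Weights load straight into unified memory, so the GPU reads them
# without host-to-device copies.
from mlx_lm import load, generate

# Example 4-bit model from the mlx-community Hugging Face collection;
# swap in whichever model you actually want to run.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

messages = [{"role": "user", "content": "Explain what unified memory means for GPU inference."}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# Generation runs on the Apple Silicon GPU via Metal.
text = generate(model, tokenizer, prompt=prompt, max_tokens=200, verbose=True)
print(text)
```

The same package also covers quantization and LoRA fine-tuning, and MLX exposes a distributed layer for multi-Mac clusters like the Thunderbolt ring setup mentioned above, but the single-machine case is the usual starting point.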

Oh it does. They ditched NVIDIA GPUs, so they can't even use those GPUs in their own servers and supercomputers, which has heavily affected their AI development.
What is your point exactly? Apple doesn't need to run everything in its own data centres; they can rent NVIDIA, AMD, Google, Amazon, etc. hardware for jobs like training, just as they rely on Google, AWS, etc. for cloud services. Apple doesn't need to solve every problem in house.
 
And what is your definition of a professional user? It sounds as though you wouldn't consider me a professional user, despite the fact that I make my living using my MacBook Pro, because 2D AutoCAD drawings don't need an RTX 5090.

I guess I am not one either, even though I also make a living with my MacBook Pro doing consulting work, analysis and programming. I am also apparently hallucinating, since I run an LLM on my MBP.

Mac Pro isn't a priority for Apple. Staying on Intel was never in the cards. The switch to Apple Silicon started in 2008 under Steve Jobs, when Apple acquired PA Semi. Five years later they introduced the iPhone 5S with a "desktop class" 64-bit processor, and the writing was on the wall that the Mac would eventually switch over.

The die was cast when they cofounded ARM as well. The Newton was in many ways a test case for Apple moving to its own processors.

From what I can see, Apple is making its own servers for its cloud-based LLM, interestingly based on arrays of a modified iPad design, to run secure Private Cloud Compute. Apple is doing some interesting work and making some cool products (that we don't always get to see), but all anyone can do is moan about AI.

It'll be interesting to see the results, given Apple can tailor the OS and chip to meet its needs, potentially resulting in a powerful yet efficient design. It will also let them keep everything in house and use it as a service revenue stream by charging developers to incorporate it into their apps, giving developers access to AI without having to use a third party's service.

What is your point exactly? Apple doesn't need to run everything in its own data centres; they can rent NVIDIA, AMD, Google, Amazon, etc. hardware for jobs like training, just as they rely on Google, AWS, etc. for cloud services. Apple doesn't need to solve every problem in house.

It also makes more sense since you can scale up or down as needed and offload the actual physical plant and its operations to someone who can achieve better economies of scale. Reinventing the wheel, as you point out, is not necessary.
 