
jayryco

macrumors member
Original poster
Oct 5, 2022
86
281
I find myself intensely curious and speculative about the news that Nvidia has just surpassed Alphabet (Google) and Amazon in overall value. This is down to the huge AI push in recent years and Nvidia being the only one providing hardware competent enough to train these LLMs, so all the biggest companies in the world are buying the H100 and H200 platforms. Their growth has been massive and incredibly quick - and they could even threaten the top spot if the trend continues in this direction.

With that being said, I wonder if Apple will develop their own supercomputer-class chips, since they are essentially already in the chip business at this point and it's proved relatively successful for them. Don't they already have all the fundamentals in place to make this push? I mean, if Apple just remains a customer of Nvidia, then surely Nvidia becomes completely unbeatable in the space, as AMD has been slow to rival them.

I don't know enough, but it would seem like a logical move since their R&D for both hardware and software is already moving in that direction. We know that Apple's integrated GPUs aren't that powerful versus a dedicated chip, but is this kind of speculation crazy? Surely Apple should decide to build their own data center with proprietary hardware?

I dunno. It just seems like the right move for a company that prides itself on its own hardware capabilities - but like I said, I really don't know enough about their chip strategy, which is why I'm here speculating.

Any thoughts? :) Thanks!
 
  • Haha
Reactions: cateye

MRMSFC

macrumors 6502
Jul 6, 2023
368
379
With the neural engine built into Apple Silicon, I think they’re already a player in the industry, kinda.

That said, Apple has always been a “consumer first” company. I doubt they’re looking to directly compete with Nvidia on providing big business with hardware for AI training.

Apple is probably looking in the direction of leveraging their on-chip neural engine to deliver a personalized experience without the need to datamine the **** out of users.

Which is to say, making Siri less dumb.
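
For a rough picture of what leaning on that on-device silicon looks like from a developer's seat today, here's a minimal Core ML sketch in Python via coremltools. It's only an illustration: the model file and the input/output names are hypothetical, and Core ML itself decides whether the Neural Engine actually runs the work.

```python
# Minimal sketch: loading an on-device Core ML model and letting the
# scheduler dispatch work to the Neural Engine where it can.
# Assumes macOS with coremltools installed; "recommender.mlpackage" is a
# hypothetical, already-converted model.
import numpy as np
import coremltools as ct

# ComputeUnit.ALL lets Core ML use the CPU, GPU, and Apple Neural Engine.
model = ct.models.MLModel(
    "recommender.mlpackage",
    compute_units=ct.ComputeUnit.ALL,
)

# Hypothetical input/output names; they depend on how the model was converted.
features = np.random.rand(1, 128).astype(np.float32)
result = model.predict({"user_features": features})
print(result)
```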
 

leman

macrumors Core
Oct 14, 2008
19,494
19,631
Apple so far hasn't shown any interest in that part of the market. I also very much doubt that Apple would be able to compete on price. Also, while they do have some hardware components with competitive performance (AMX units), they lack the high-end interconnect technology that makes Hopper possible.

No, I think Apple will continue focusing on low-power on-device inference while building up the local ML training capabilities of their GPUs. Before long, I wouldn't be surprised if a Mac Pro were competitive against a similarly priced Nvidia workstation in this area. But I don't expect to see any kind of server push from them any time soon.
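
For a sense of what "local ML training on Apple GPUs" already means in practice, here's a minimal PyTorch sketch using the MPS (Metal) backend on Apple silicon. The tiny model and random data are placeholders purely for illustration, not anything Apple-specific.

```python
# Minimal sketch of local training on an Apple silicon GPU via PyTorch's
# MPS (Metal Performance Shaders) backend. The toy model and fake data
# exist only to show the training loop running on-device.
import torch
import torch.nn as nn

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 512, device=device)          # fake batch of features
y = torch.randint(0, 10, (64,), device=device)   # fake labels

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final loss: {loss.item():.4f}")
```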
 

Kierkegaarden

Cancelled
Dec 13, 2018
2,424
4,137
Apple so far hasn't shown any interest in that part of the market. I also very much doubt that Apple would be able to compete on price. Also, while they do have some hardware components with competitive performance (AMX units), they lack the high-end interconnect technology that makes Hopper possible.

No, I think Apple will continue focusing on low-power on-device inference while building up the local ML training capabilities of their GPUs. Before long, I wouldn't be surprised if a Mac Pro were competitive against a similarly priced Nvidia workstation in this area. But I don't expect to see any kind of server push from them any time soon.
They haven’t, that we know of — but nobody can ignore a company with a $2T market cap and high-margin products. So anything is possible, and they have quite a war chest to do some interesting things.
 

leman

macrumors Core
Oct 14, 2008
19,494
19,631
They haven’t, that we know of — but nobody can ignore a company with a $2T market cap and high-margin products. So anything is possible, and they have quite a war chest to do some interesting things.

That’s the thing though - Apple's tech is just too expensive to be sold as bare-bones parts. They can do the things they do because they develop the tech for themselves. It doesn't have to be priced competitively.
 
  • Like
Reactions: Chuckeee and kitKAC

jayryco

macrumors member
Original poster
Oct 5, 2022
86
281
Would the R&D cost be too high to develop the server tech exclusively for Apple and its user base, without licensing it to other companies, as opposed to spending that same money buying the tech from Nvidia or another supplier?

If they are already in the business of developing CPUs and GPUs, why not extend that to larger-scale processing, like more powerful GPUs and bespoke server designs?
 
  • Like
Reactions: Kierkegaarden

dmccloud

macrumors 68040
Sep 7, 2009
3,122
1,883
Anchorage, AK
I find myself intensely curious and speculative about the news that Nvidia has just surpassed Alphabet (Google) and Amazon in overall value. This is down to the huge AI push in recent years and Nvidia being the only one providing hardware competent enough to train these LLMs, so all the biggest companies in the world are buying the H100 and H200 platforms. Their growth has been massive and incredibly quick - and they could even threaten the top spot if the trend continues in this direction.

With that being said, I wonder if Apple will develop their own supercomputer-class chips, since they are essentially already in the chip business at this point and it's proved relatively successful for them. Don't they already have all the fundamentals in place to make this push? I mean, if Apple just remains a customer of Nvidia, then surely Nvidia becomes completely unbeatable in the space, as AMD has been slow to rival them.

I don't know enough, but it would seem like a logical move since their R&D for both hardware and software is already moving in that direction. We know that Apple's integrated GPUs aren't that powerful versus a dedicated chip, but is this kind of speculation crazy? Surely Apple should decide to build their own data center with proprietary hardware?

I dunno. It just seems like the right move for a company that prides itself on its own hardware capabilities - but like I said, I really don't know enough about their chip strategy, which is why I'm here speculating.

Any thoughts? :) Thanks!

When the AI bubble bursts, companies like Nvidia that went heavily in that direction will see their market valuations plummet. The only question is when that will happen.
 

leman

macrumors Core
Oct 14, 2008
19,494
19,631
Would the R&D cost be too high to develop the server tech exclusively for Apple and its user base, without licensing it to other companies, as opposed to spending that same money buying the tech from Nvidia or another supplier?

If they are already in the business of developing CPUs and GPUs, why not extend that to larger-scale processing, like more powerful GPUs and bespoke server designs?

What would be the use case for this? Machine learning? I doubt it’s cheaper for Apple to develop their own tech rather than buying/renting Nvidia hardware. There is also the opportunity cost to consider.

For the Apple-ecosystem-specific services they do offer (like Xcode Cloud), something like a Mac mini is more than sufficient. They probably have some sort of custom server farm with basic M1 chips or similar. But that doesn't need high-end solutions or GPUs.
 

Kierkegaarden

Cancelled
Dec 13, 2018
2,424
4,137
Would the R&D cost be too high to develop the server tech exclusively for Apple and its user base, without licensing it to other companies, as opposed to spending that same money buying the tech from Nvidia or another supplier?

If they are already in the business of developing CPUs and GPUs, why not extend that to larger-scale processing, like more powerful GPUs and bespoke server designs?
I think it depends on what their plans are for AI. Could they sell an LLM product? Would this be functionality in Xcode or developer tools? Not sure where the opportunities will be on the service side — right now the opportunity seems to be just in the infrastructure.
 

Kierkegaarden

Cancelled
Dec 13, 2018
2,424
4,137
When the AI bubble bursts, companies like Nvidia that went heavily in that direction will see their market valuations plummet. The only question is when that will happen.
Yeah, that’s the big question. Nvidia did $60B in revenue last year with income about half that, and they have a $2T market cap. They are clearly profitable and growing, but the valuation is still insane. The industry is cyclical, and competitors will be targeting them. Could be a race to the bottom, eventually. The profit margin they have today will not continue.
 
  • Like
Reactions: gusmula and AlexESP

dumastudetto

macrumors 603
Aug 28, 2013
5,529
8,310
Los Angeles, USA
Nvidia is no threat to Apple. Nvidia supplies the chips that deliver AI experiences from the cloud endpoint. Apple builds the clients and chips that provide end users with powerful AI experiences across all their devices, including cloud-based solutions powered by Nvidia technologies.

Apple leads the world in pretty much every category where they choose to compete.
 

Oculus Mentis

macrumors regular
Sep 26, 2018
144
163
UK
Apple is in desperate need of an AI strategy, but people in Cupertino are in complete denial about this cyclone of innovation currently happening all around them. VR goggles are so 2016…

Here is what I’d like to see from Apple:
- Move away from a fashion & status mindset to one that caters to innovators and developers.
- Establish a relationship with professionals in all fields, not just media.
- Develop an innovative, results-oriented LLM OS and UI for data creation, editing, analysis, storage, and sharing.
- Innovate on the hashing and tagging of data sources in model creation to mitigate copyright disputes.
- Finally ditch their chokehold on RAM. Local inference will need lots of RAM (see the rough numbers sketched after this list).
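
To put rough numbers on that last point: the dominant cost of local inference is simply holding the weights (plus KV cache) in memory, so a quick back-of-envelope estimate looks like this. The model sizes below are illustrative and not tied to anything Apple ships.

```python
# Back-of-envelope RAM estimate for local LLM inference:
# weights take roughly (parameter count * bits per parameter / 8) bytes,
# plus extra for the KV cache and runtime. Figures are illustrative.
def weight_memory_gib(params_billions: float, bits_per_param: float) -> float:
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 2**30

for params, bits in [(7, 16), (7, 4), (70, 4)]:
    gib = weight_memory_gib(params, bits)
    print(f"{params}B params @ {bits}-bit ~= {gib:.1f} GiB for weights alone")

# Roughly: 7B @ 16-bit ~= 13 GiB, 7B @ 4-bit ~= 3.3 GiB, 70B @ 4-bit ~= 32.6 GiB,
# which is why a base configuration with 8 GB of unified memory gets tight fast.
```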
 

NT1440

macrumors Pentium
May 18, 2008
15,088
22,154
Apple is in desperate need of an AI strategy, but people in Cupertino are in complete denial about this cyclone of innovation currently happening all around them. VR goggles are so 2016…

Here is what I’d like to see from Apple:
- Move away from a fashion & status mindset to one that caters to innovators and developers.
- Establish a relationship with professionals in all fields, not just media.
- Develop an innovative, results-oriented LLM OS and UI for data creation, editing, analysis, storage, and sharing.
- Innovate on the hashing and tagging of data sources in model creation to mitigate copyright disputes.
- Finally ditch their chokehold on RAM. Local inference will need lots of RAM.
Are you willing to revisit this post after WWDC?
 

Oculus Mentis

macrumors regular
Sep 26, 2018
144
163
UK
Are you willing to revisit this post after WWDC?
Of course I will, but WWDC is in June, and let’s not forget that last year Apple completely snubbed the word AI at WWDC, as if it were some no-name thingy they use hush-hush under the bonnet just to do the same old things a bit better.

They badly missed the boat; I hope they can catch up.
In the meantime, Nvidia GTC is in 10 days or so, and they’re not going to wait for Timmy to turn AI into thinly sliced, colorful, blingified apples that only grow inside the AAPL walled garden.

Let’s hope for the best. I hope they haven’t lost too much ground and time on their luxury goggles.
 

Avatar74

macrumors 68000
Feb 5, 2007
1,611
404
Nvidia being the only one providing hardware competent enough to train these LLMs

You know that AMD Athena is replacing Nvidia in the new OpenAI/GPT servers, right?

wonder if Apple will develop their own supercomputer-class chips, since they are essentially already in the chip business at this point and it's proved relatively successful for them.

No. Apple has no interest in being in this space, not just because GPUs/APUs aren't their wheelhouse, but because they aren't partnering with server manufacturers. Both Nvidia and AMD partner with companies like HPE... they don't build the entire supercomputer architecture. Apple isn't trying to sell AS architecture to third parties.

Nvidia becomes completely unbeatable in the space, as AMD has been slow to rival them.

Frontier, currently the fastest supercomputer and the world's first exascale system, runs AMD EPYC CPUs and AMD Instinct MI250X accelerators. El Capitan is going to be faster than Frontier and is also based on Zen architecture. The third-fastest supercomputer, Aurora, is Intel-powered.

See TOP500.
 

JPack

macrumors G5
Mar 27, 2017
13,469
26,071

Apple moved out of the HPC space when they decided to focus on consumer electronics. Apple's specialty is selling proprietary hardware and software, bundled as one custom product. That's not what data servers or HPC are about.
 
  • Like
Reactions: Chuckeee

krspkbl

macrumors 68020
Jul 20, 2012
2,440
5,855
It's insane to see how big Nvidia has gotten. I knew they were big, but I only really saw them as a GPU maker that did some AI stuff on their cards (DLSS, for example) and competed against AMD and, more recently, Intel in gaming GPUs and high-end GPUs for 3D/movie work. Now they are bigger than Google and look like they could pass Apple soon. Also, Apple has recently been below Microsoft...who are also going all-in on AI.

Copilot on Windows is the only "chatbot" I use, and I have an Nvidia GPU for other AI stuff. Apple needs to get on the AI train soon, but I really don't think they'll be able to match Microsoft or Nvidia.

I'd be curious to see if Apple can make more powerful "Apple Silicon" processors built purely for AI. I know the processors we have now have Neural Engines for AI, which will be good enough for consumers, but could they make a more powerful processor for servers?

Nvidia has been playing a long game with CUDA, and with that being so important for AI, I can't see anyone taking their place anytime soon.
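
The moat is less any single API and more that so much tooling assumes CUDA by default. As a small, purely illustrative sketch (my own, not anything from this thread), even portable PyTorch code tends to treat CUDA as the first-class path and everything else as a fallback:

```python
# Sketch of why the CUDA moat matters in practice: most tooling assumes
# CUDA first, so other backends (Apple's MPS, plain CPU) are the fallback.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():          # Nvidia path: broadest library support
        return torch.device("cuda")
    if torch.backends.mps.is_available():  # Apple silicon GPU via Metal
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(1024, 1024, device=device)
y = x @ x.T  # this code runs anywhere, but many libraries still ship CUDA-only kernels
print(device, y.shape)
```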
 

cocoua

macrumors 65816
May 19, 2014
1,008
623
madrid, spain
Apple is in desperate need of an AI strategy, but people in Cupertino are in complete denial about this cyclone of innovation currently happening all around them. VR goggles are so 2016…

Here is what I’d like to see from Apple:
- Move away from a fashion & status mindset to one that caters to innovators and developers.
- Establish a relationship with professionals in all fields, not just media.
- Develop an innovative, results-oriented LLM OS and UI for data creation, editing, analysis, storage, and sharing.
- Innovate on the hashing and tagging of data sources in model creation to mitigate copyright disputes.
- Finally ditch their chokehold on RAM. Local inference will need lots of RAM.
Good points.
On the RAM thing: Apple has always sold devices short on RAM, disk, GPU or whatever, because they plan very well for when your computer will need more of one or another.
When the Retina MBP came out, apps went from 300 MB to 1 GB, so the 256 GB rMBP was just a joke for pros.
The same goes for RAM, GPU and even ports, and it's even more palpable in iOS, where each year the big new software feature is only available on the latest model. So it's a deep core strategy: software is their only way to control sales, and that's the main reason they won't ever license OS X to another platform.
 

PeLaNo

macrumors regular
Jun 6, 2017
225
116
Would the R&D cost be too high to develop the server tech exclusively for Apple and its user base, without licensing it to other companies, as opposed to spending that same money buying the tech from Nvidia or another supplier?

If they are already in the business of developing CPUs and GPUs, why not extend that to larger-scale processing, like more powerful GPUs and bespoke server designs?
You were onto something, lol.

I mean, it made sense for Apple to design a chip specifically to run Apple Intelligence, considering how many billions of requests they're going to handle every single day. Instead of relying on Nvidia's grip, why not build one yourself?

This also has trickle-down effects: the know-how can be used to increase the power and efficiency of "edge devices" like iPhones, iPads, and Macs, so Apple silicon devices can be the fastest and most efficient at running LLMs and other AI algorithms.
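
As a concrete, purely illustrative example of what running an LLM on an Apple silicon "edge device" looks like today, here's a sketch using the llama-cpp-python bindings with Metal offload. The model path is a placeholder for whatever quantized GGUF file you happen to have locally.

```python
# Sketch: local LLM inference on an Apple silicon Mac using llama-cpp-python,
# which can offload layers to the GPU via Metal. The GGUF path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="models/some-7b-model.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=-1,   # offload all layers to the Apple GPU (Metal)
    n_ctx=4096,        # context window
)

out = llm(
    "Summarize why on-device inference matters for privacy:",
    max_tokens=128,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```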
 
  • Haha
Reactions: jayryco

PeLaNo

macrumors regular
Jun 6, 2017
225
116
They can always go back to cryptocurrency mining 😋
Not really, haha. Bitcoin is the only major asset still using proof of work, also known as mining, and NVIDIA GPUs can no longer compete with ASICs there.
People used to buy NVIDIA cards to mine ETH, but now that Ethereum has shifted to proof of stake, the demand for powerful hardware has dropped.
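
For anyone wondering why ASICs crush GPUs here: proof-of-work mining is just brute-force hashing until a digest falls below a target, so fixed-function hashing silicon wins on throughput per watt. A toy sketch of that loop (vastly simplified, nowhere near real Bitcoin difficulty):

```python
# Toy proof-of-work loop: find a nonce so the SHA-256 digest starts with
# a few zero hex digits. Real Bitcoin mining double-hashes a block header
# against a far harder target; this only shows the shape of the work.
import hashlib
from itertools import count

def mine(block_data: str, difficulty: int = 5) -> tuple[int, str]:
    target_prefix = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest

nonce, digest = mine("example block header")
print(f"nonce={nonce} hash={digest}")
# Because the work is one fixed hash function repeated trillions of times,
# SHA-256 ASICs outrun general-purpose GPUs by orders of magnitude, which is
# why GPU mining moved to other coins and then faded after Ethereum's switch
# to proof of stake.
```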
 