
Weisswurstsepp

macrumors member
Jul 25, 2020
55
63
They just released new GPUs for the Mac Pro. Not sure how that is letting it become an old joke. The trashcan was the exception, not the rule.

No, that was equally true for the cheesegrater Mac Pro (5,1) with Westmere Xeons, which Apple was still selling in 2012 and early 2013, at a time when PC workstations were already shipping with Sandy Bridge-E.

The only difference is that the Trashcan languished for even longer.
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
No, that was equally true for the cheesegrater Mac Pro (5,1) with Westmere Xeons, which Apple was still selling in 2012 and early 2013, at a time when PC workstations were already shipping with Sandy Bridge-E.

The only difference is that the Trashcan languished for even longer.
Sandy Bridge-E Xeons only came with 8 cores max, correct? It seems like an Intel issue just as much as an Apple one. Now that Apple doesn’t need to rely on Intel or AMD, I doubt it matters anyway.
 

Joe The Dragon

macrumors 65816
Jul 26, 2006
1,031
524
Mark Gurman is saying that Apple is working on a 40-core SoC for the Mac Pro for 2022.

You're Tim Cook, sitting in your nice office, looking at how much money you just spent to make this giant SoC for a relatively small market. In fact, you have to do this every year or two to keep the Mac Pro relevant. How do you recoup some of that money?

You create "Apple Cloud". No, not iCloud. Apple Cloud. Like AWS. Where anyone can come and rent a 40-core M3 SoC running on macCloudOS. You get into the cloud hosting business. You file this under the "Services" strategy that you keep pushing to make Wall Street happy.

Soon, you'll be releasing 64-core SoCs with 128-core GPUs, then 128-core SoCs with 256-core GPUs, and so on. Somehow, you're actually beating anything AWS, Azure, Google Cloud can offer... without really trying.

Apple Silicon Cloud.

It wouldn't surprise me if Apple is already testing their own SoCs to power their iCloud service, which currently depends on AWS. Apple was reportedly spending $30m/month on AWS in 2019. It might be $100m+ per month by now given how fast services have grown.
For DC/cloud use you need storage that is in RAID and not locked to the CPU board, plus good networking and IPMI.
 
  • Like
Reactions: Weisswurstsepp

BootLoxes

macrumors 6502a
Apr 15, 2019
749
897
They only deprecated OpenGL, which is still functional, btw.
So many 3D programs rely on CUDA, which is Nvidia-only.

My Marmoset Toolbag: CUDA.
My Substance Painter: CUDA.
I dropped Marvelous Designer after they went subscription-only, but I think the newer versions use CUDA as well.

Granted, these programs still run on Apple and AMD GPU machines, but they pretty much run in CPU-only mode, which is slower in most situations.
 
  • Like
Reactions: IceStormNG

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Their chips are very fast, but that is not the main concern in cloud computing. People usually care much more about cost per request than about performance per dollar.
That's precisely why they'll probably test switching their AWS loads to their own chips first.

The Nuvia team left Apple because Apple didn't want to make a server chip back then. I think Apple might be more open to it now given how close a Mac Pro chip is to a server chip.
 

UBS28

macrumors 68030
Oct 2, 2012
2,893
2,340
Why would you need “Apple Cloud” when you have the M1, M2X, and later the M2X Pro chips? It makes no sense.

These machines should provide enough computing power for most Apple customers without resorting to the cloud.

If you are going to use the cloud, then these Macs are useless, as you can do the same on a $200 computer.
 


JMacHack

Suspended
Mar 16, 2017
1,965
2,424
So many 3D programs rely on CUDA, which is Nvidia-only.

My Marmoset Toolbag: CUDA.
My Substance Painter: CUDA.
I dropped Marvelous Designer after they went subscription-only, but I think the newer versions use CUDA as well.

Granted, these programs still run on Apple and AMD GPU machines, but they pretty much run in CPU-only mode, which is slower in most situations.
So you’re complaining about Apple for this and not the software vendor for only implementing CUDA?
 
  • Like
Reactions: robco74

Juraj22

macrumors regular
Jun 29, 2020
179
208
Macs aren’t that bad at gaming; it’s just that developers won’t develop for Macs.
This is about to change. With Macs on ARM and iPhones on ARM, the combined market is big, and game developers will no longer ignore the Mac because of its small market size.
 

t0mat0

macrumors 603
Aug 29, 2006
5,473
284
Home
This is in some ways what Nuvia was trying to do, because at the time Apple didn't want to push Apple Silicon down the server-silicon route. Some details in the ensuing lawsuit touched on this through conversations brought up as material in the court case.
They make some of that money back by having a design that scales. That scale means reduced overhead from maintaining multiple, fragmented silicon designs, and a better rate when you ask TSMC to make x million of them. And if you're making that many, you get to bin chips and use them across your products. Plus Apple gets much better profit margins now that they don't have to pay Intel.

 

BootLoxes

macrumors 6502a
Apr 15, 2019
749
897
So you’re complaining about Apple for this and not the software vendor for only implementing CUDA?
Absolutely. It's the industry standard, and Apple thinks it can just step in and change that, and it ended up ruining 3D dev on Macs. This was just an example of one of the tools.

As mentioned earlier, OpenGL was deprecated.

Then Nvidia got axed.

Then 32-bit legacy programs that people used.

And now an entire architecture switch.

If I were a game dev, I would not target Macs, because they keep taking away the tools needed.
 
  • Like
Reactions: bobcomer

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Absolutely. It's the industry standard, and Apple thinks it can just step in and change that, and it ended up ruining 3D dev on Macs. This was just an example of one of the tools.

As mentioned earlier, OpenGL was deprecated.

Then Nvidia got axed.

Then 32-bit legacy programs that people used.

And now an entire architecture switch.

If I were a game dev, I would not target Macs, because they keep taking away the tools needed.
I think it’s absurd to blame Apple for the laziness of devs. They deprecated outdated libraries years after they introduced Metal, and they gave plenty of notice about deprecating 32-bit libraries and the switch to Apple Silicon. Likewise, they did their part in providing the materials needed to port programs over.

The only thing Apple hasn’t done is go in and program these apps themselves.

In my view the blame lies solely with the devs themselves. If they cannot put forth the effort to use modern standards, that’s on them.
 
  • Like
Reactions: Detnator

robco74

macrumors 6502a
Nov 22, 2020
509
944
CUDA is not an industry standard; it is a proprietary standard owned exclusively by Nvidia that runs only on their hardware. As for OpenGL, it's being phased out industry-wide in favor of Vulkan. While Apple doesn't directly support Vulkan, there is MoltenVK, and several games are using it. Several game engines, including Unity and UE, support Metal.

Apple wants its frameworks to run across a variety of hardware. I imagine they also wanted to avoid being locked to a single GPU vendor. Now that they have their own GPU designs, they can do what they want and not be subject to the whims of a third-party who has other customers to consider. Apple can design exactly what they want for their intended use case.
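Since Metal keeps coming up as the CUDA alternative here, a minimal sketch of a Metal compute dispatch in Swift may help show what that path looks like. The kernel name and the doubling operation are made up for illustration; the Metal calls themselves are the standard API:

```swift
import Metal

// Minimal Metal compute sketch: double an array of floats on the GPU.
// The kernel source is compiled at runtime; "double_values" is a made-up name.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void double_values(device float *data [[buffer(0)]],
                          uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "double_values")!)

var input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let queue = device.makeCommandQueue()!
let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
// One thread per element; a single threadgroup is enough for four floats.
encoder.dispatchThreadgroups(MTLSize(width: 1, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: input.count,
                                                            height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

let results = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print((0..<input.count).map { results[$0] })  // [2.0, 4.0, 6.0, 8.0]
```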
 

BootLoxes

macrumors 6502a
Apr 15, 2019
749
897
CUDA is not an industry standard,

Most renderers, to my knowledge, either use CUDA or the CPU, which is much slower than CUDA.
EDIT: Dang, turns out it really is the standard. I didn't realize how many programs relied on CUDA, and now RTX, until I searched.

I think it’s absurd to blame Apple for the laziness of devs
I think it's absurd to expect devs to develop for such a small base when you keep restricting their tools. There comes a point where it isn't financially worth it, and when you make devs jump through that many hoops just to get things working on Macs, many will call it quits.

But don't take my word for it. It's one of the biggest reasons why my favorite games aren't on the Mac:


"We have no plans of giving this game on the Mac. There are several technology decisions that Apple has made that has made it a little difficult for us to release Overwatch in the way we want it to be consumed, and that is why we haven’t pursued it," said Tim Ford, Lead Engineer on Overwatch at Blizzard to Gadgets 360 on the sidelines of BlizzCon 2017."


"When macOS released the Catalina update on October 7, 2019, we discovered that their new OS is no longer capable of supporting Paladins due to their removal of all 32-bit code from the latest update.

Unfortunately, these changes are forcing us to remove Mac support for Paladins following our latest update, A Tigron’s Tale."
 
Last edited:

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
There are two broad use cases for an "Apple Cloud".

1) Those that specifically want/need to program in macOS on Mac hardware, and need more hardware than what they own themselves. They're already served by companies like MacStadium (https://www.macstadium.com/usecases), though with the limitation that each customer has to rent out an entire piece, or pieces, of equipment.

2) The general server market, which mostly wants to program in Linux, and only cares about cost per computation. This is the far bigger market, and thus I assume the one the OP is talking about Apple entering. Here the idea would be that energy consumption is a big part of the cost, and if Apple's architecture could deliver a server chip that's much more efficient than what's currently available, then Apple could offer server services more cheaply than the other big players (see the back-of-the-envelope sketch at the end of this post).

For the latter, there are at least two problems:

1) Server chips are different from desktop chips, and AWS has put a lot of research into obtaining higher efficiencies with their ARM-based Graviton chips. So Apple would need to redesign its desktop architecture for server use, and see whether it would be more efficient than the best AWS has come up with. That's a major research effort with no guarantee of success.

2) Producing chips to run Linux rather than macOS goes against Apple's corporate philosophy.
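Here's the promised back-of-the-envelope sketch of the energy argument. Every number (wattage, PUE, electricity price) is invented for illustration, not a measurement:

```swift
import Foundation

// Back-of-the-envelope: electricity cost per server per month.
// All figures below are assumptions for illustration only.
func monthlyPowerCost(serverWatts: Double,
                      pue: Double,            // datacenter power usage effectiveness
                      dollarsPerKWh: Double) -> Double {
    let hoursPerMonth = 24.0 * 30.0
    let kWhPerMonth = serverWatts * pue * hoursPerMonth / 1000.0
    return kWhPerMonth * dollarsPerKWh
}

let x86Box   = monthlyPowerCost(serverWatts: 400, pue: 1.2, dollarsPerKWh: 0.08)  // ~$27.65/mo
let appleBox = monthlyPowerCost(serverWatts: 150, pue: 1.2, dollarsPerKWh: 0.08)  // ~$10.37/mo
print(String(format: "x86-class: $%.2f/mo, hypothetical Apple Silicon: $%.2f/mo", x86Box, appleBox))
// The saving only matters if per-server throughput holds up, and it still has to
// outweigh the cost of redesigning a desktop architecture for server duty.
```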
 

zakarhino

Contributor
Sep 13, 2014
2,611
6,963
You make a relatively convincing argument, and I believe it even more after remembering an article a while back about how Apple was looking into selling its chips to other companies. Maybe there were conversations about other applications for their chips outside of consumer computers, and one of them was moving their iCloud hardware stack in-house. It could make for a very interesting proposition for devs: one codebase, one programming language, and one set of APIs to create a single app that runs on any Apple device, with parts of the app able to run in the cloud and stream down to devices that lack the local capability to run that portion.
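If that ever shipped, the developer-facing side might look roughly like the sketch below. Every type, endpoint, and threshold here is hypothetical; it only illustrates the "run locally when the device can, offload when it can't" idea:

```swift
import Foundation

// Hypothetical sketch: run a job locally when the device can handle it,
// otherwise offload it to a (made-up) Apple cloud endpoint.
struct RenderJob: Codable {
    let sceneID: String
    let frames: Int
}

enum ExecutionTarget {
    case local
    case cloud(URL)
}

func chooseTarget(requiredMemoryGB: UInt64) -> ExecutionTarget {
    let localGB = ProcessInfo.processInfo.physicalMemory / 1_073_741_824
    return requiredMemoryGB <= localGB
        ? .local
        : .cloud(URL(string: "https://compute.example-apple-cloud.com/v1/jobs")!)  // hypothetical
}

func run(_ job: RenderJob, requiredMemoryGB: UInt64) async throws {
    switch chooseTarget(requiredMemoryGB: requiredMemoryGB) {
    case .local:
        print("Rendering \(job.frames) frames of \(job.sceneID) locally")
    case .cloud(let endpoint):
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.httpBody = try JSONEncoder().encode(job)
        _ = try await URLSession.shared.data(for: request)
        print("Offloaded \(job.sceneID); results would stream back to the device")
    }
}
```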
 

Weisswurstsepp

macrumors member
Jul 25, 2020
55
63
Sandy Bridge-E Xeons only came with 8 cores max, correct? It seems like an Intel issue just as much as an Apple one. Now that Apple doesn’t need to rely on Intel or AMD, I doubt it matters anyway.

Yes, Sandy Bridge-E was available with up to 8 cores. But what relevance does the core count have to the fact that Apple let the Mac Pro line languish?
 

Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
Absolutely. It's the industry standard, and Apple thinks it can just step in and change that, and it ended up ruining 3D dev on Macs. This was just an example of one of the tools.

As mentioned earlier, OpenGL was deprecated.

Then Nvidia got axed.

Then 32-bit legacy programs that people used.

And now an entire architecture switch.

If I were a game dev, I would not target Macs, because they keep taking away the tools needed.

It's a proprietary standard that only works on GPUs from a specific vendor. Making it the de facto standard of the industry is a path that never ends well for end users.

And the deprecation of 32-bit programs is an absurd complaint. The first 64-bit Mac was launched in 2003 (the Power Mac G5). Mac OS X 10.4 Tiger already had (limited) support for 64-bit programs. The last 32-bit Mac was launched in 2006. Mac OS X 10.7 Lion (2011) was the last OS to support the 32-bit kernel. Then macOS 10.14 Mojave (2018) announced it would be the last OS to support 32-bit apps. Then macOS 10.15 Catalina (2019) dropped support for all 32-bit apps, 16 years after the first 64-bit Mac and 13 years after the last 32-bit Mac. That's a decade and a half to make the transition, which seems like more than enough time.
 

dgdosen

macrumors 68030
Dec 13, 2003
2,817
1,463
Seattle
Trying to make macOS into an OS suitable for cloud services is a far bigger challenge than making macOS gaming-friendly :) And being able to run database software and running it in production are two entirely different things. We live in an age of managed services, serverless architecture, on-demand scalability... I just don't see how Apple can disrupt that market. Their hardware is great, but they are utterly lacking in software, and they have been dismantling their server efforts to the point that they are basically non-existent.
All valid points, but I feel Apple would be crazy not to dip their toes in this market. They could start by offering services to support iOS app builders. Using your own argument, they don't need to offer IaaS/PaaS; they could just offer first-class support for cloud functions (à la Lambda/GCF) written in Swift. And if their data centers gently sip power with Apple Silicon, that just makes the bar to profitability lower.
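A rough sketch of what first-class Swift cloud functions could look like is below. The `CloudFunction` protocol and the local harness are invented for illustration (the nearest real-world analogue today is the open-source swift-aws-lambda-runtime), but they show how small the programming model could be:

```swift
import Foundation

// Hypothetical sketch of a first-class Swift cloud function (à la Lambda/GCF).
// CloudFunction is a made-up protocol, not a real Apple API.
protocol CloudFunction {
    associatedtype Input: Decodable
    associatedtype Output: Encodable
    func handle(_ input: Input) async throws -> Output
}

struct ThumbnailRequest: Decodable {
    let imageURL: String
    let maxPixels: Int
}

struct ThumbnailResponse: Encodable {
    let thumbnailURL: String
}

struct MakeThumbnail: CloudFunction {
    func handle(_ input: ThumbnailRequest) async throws -> ThumbnailResponse {
        // Real work (download, resize, upload) would go here.
        ThumbnailResponse(thumbnailURL: input.imageURL + "?maxPixels=\(input.maxPixels)")
    }
}

// Tiny local harness standing in for the platform's invoke path.
func invoke<F: CloudFunction>(_ fn: F, json: Data) async throws -> Data {
    let input = try JSONDecoder().decode(F.Input.self, from: json)
    let output = try await fn.handle(input)
    return try JSONEncoder().encode(output)
}
```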

Also, banned?
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Yes, Sandy Bridge-E was available with up to 8 cores. But what relevance does the core count have to the fact that Apple let the Mac Pro line languish?
Because the Mac Pro came with 12 cores, they wouldn't lower the core count and max performance just to be on Sandy Bridge, when they knew that less than a year later they were going to announce the 2013 Mac Pro on Ivy Bridge.

Apple wouldn't go, “Hey, we know we had a 12-core powerhouse, but this year we're only doing an 8-core because Intel didn't give us any other option, so sorry to those of you who wanted a bit more power. At least we're on a new generation, right?”
 

sunny5

macrumors 68000
Jun 11, 2021
1,835
1,706
CUDA is not an industry standard.
It's a proprietary standard that only works on GPUs from a specific vendor.
For 3D, an Nvidia GPU is the only real option. Most software doesn't support the Mac due to the lack of Nvidia GPUs. Besides, Nvidia dominates the discrete GPU market with roughly 80% share, while AMD is still struggling to compete. This is why AI, machine learning, 3D, games, and other markets are still dominated by Nvidia GPUs. Without one, you can't even develop properly. Too bad if you want to argue about this, because that's how it has worked for a while.

This is why a lot of software, especially 3D and CUDA-based software, does NOT support the Mac.

I think it’s absurd to blame Apple for the laziness of devs.
It's the opposite. Apple is the one that cut Nvidia GPU support several years ago. Do you really think the Metal API is the best on the market? Btw, macOS's market share is extremely low compared to Windows. At this point, it's Apple that should be blamed for laziness, and neither Nvidia nor AMD eGPUs are even supported anymore.

This is about to change. With Macs on ARM and iPhones on ARM, the combined market is big, and game developers will no longer ignore the Mac because of its small market size.
How come mobile developers still aren't supporting macOS, then? iOS is a big market, but macOS is not. Since developers would need to optimize their mobile apps for macOS, nobody is going to do that. It's been almost a year since the first Apple Silicon Macs were released, and yet are there any proper mobile games on the Mac App Store?

Admit it: macOS's market is still so small that even mobile app developers, including iOS developers, aren't willing to bring their apps to macOS.

So you’re complaining about Apple for this and not the software vendor for only implementing CUDA?
CUDA is the standard for 3D software. Without it, things won't work, or you have to fall back to the CPU. It has been the standard for a long time, especially since Nvidia dominates the market with about 80% share while AMD has less than 20%. Also, Apple is the one that dropped Nvidia support while Nvidia was still willing to support the Mac, so who's to blame first? And Apple has been well known for poor GPU performance for several years, precisely because it uses AMD GPUs.
 