
TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
I say this as a CUDA guy: CUDA isn't the moat it used to be. And Apple only just added ray tracing (with no Ultra option) and has no tensor cores. But yes they need to continue to improve their software. No question. MLX is new so I would expect further improvements and for it to be plugged in to more backends.
As someone who uses both CUDA and MLX, I can say Apple really simplified the MLX implementation. CUDA dependencies aren't the easiest to deal with on Linux; MLX is a single package that makes setup very simple. It's just the beginning, and I see Apple Silicon as very viable for on-device inference and training. It won't catch up with H100s or H200s, but it's pretty good for workstations and consumer devices.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
Right now it's the cheap consumer GPUs like the 4090 that are eating their sales. Nvidia would be making more money by focusing on the data center market. It's a crazy market out there, and the demand for GPUs far exceeds the supply. But because it's uncertain if the AI boom is sustainable, Nvidia is hedging their bets by maintaining presence in other markets. Gamers need powerful GPUs, but they can't afford paying anything resembling market prices. Which means that consumer GPUs must be intentionally crippled to avoid cannibalizing the real market.

I'm not sure if anyone really cares about workstation GPUs anymore.
Nvidia is going to prioritize manufacturing of GPUs that cost a few tens of thousands of dollars each, and they will sell millions of those to data centers. Nvidia's CEO has repeatedly said their focus for the near future is data centers. That opens things up for Apple and AMD.
 

Homy

macrumors 68030
Jan 14, 2006
2,502
2,451
Sweden
That website doesn't show the Ultra, so we don't know how it compares to the i3 or the i5, let alone the i9.

Yes, but don't the professionals who do video editing and music production value time over cost? Wouldn't they rather have a https://technical.city/en/cpu/Core-i9-14900KS or https://technical.city/en/cpu/Ryzen-Threadripper-PRO-7995WX running macOS? These people spend $5,000 to $10,000 on high-end professional systems.

Not to mention the many people working in Hollywood.

There are many strange comparisons in your post. You build your entire case on a website with just a few old benchmarks and make bold claims by comparing the weakest Apple Silicon CPU, the 8-core M1 from 2020, with the latest top-of-the-line 24-core i9-14900KS from 2024 and the 96-core Threadripper PRO 7995WX from 2023. Not surprising, though, since it's not the first time such posts have shown up in this forum.

So just because that website doesn’t include results for M2 Ultra we’re supposed to ignore that such results exist on other websites?

Passmark's CPU score for the M1 is 8,207, but for the 24-core M2 Ultra it's 49,768. That's not quite the 63,617 of the 14900KS, but it's much closer. Your site also lacks Cinebench scores for the 14900KS, and the scores it does have are ancient: the Cinebench 15 used for the M1 is from 2013, and the Cinebench 10 used for the 7995WX is from 2007. The latest version, Cinebench 2024, shows that the M2 Ultra is pretty close to the i9-14900.

Skärmavbild 2024-06-10 kl. 02.54.41.png
Skärmavbild 2024-06-10 kl. 03.05.30.png
Skärmavbild 2024-06-10 kl. 03.09.34.png
Skärmavbild 2024-06-10 kl. 03.09.51.png



Also interesting that you mention video production, because if you want the fastest desktop computer for Premiere Pro, which is ranked as the best overall professional video editing software and exists for both Mac and PC, you should get an M2 Ultra. The 24-core Mac Studio M2 Ultra is faster than even a system with a 96-core Threadripper PRO 7995WX and an RTX 4090.

Source
Skärmavbild 2024-06-10 kl. 03.33.52.png

Source
Skärmavbild 2024-06-10 kl. 03.34.09.png

Source
Skärmavbild 2024-06-10 kl. 03.34.23.png


So yes, Apple already has desktop CPUs that beat the fastest workstation CPUs like the 7995WX in some workloads, but as usual it all depends on your needs and use case.
 

unchecked

macrumors 6502
Sep 5, 2008
450
555
I believe it's purely a marketing decision. The desktop market is small and getting smaller. Apple has made the decision to emphasize the mobile market (iPhone, iPad, MacBook, Watch, AVP) in its chip development. For Apple, the desktop market (iMac, mini, Studio, Pro, Apple TV) is primarily a repackaging of mobile devices. Even the Ultra chip is just a combination of two laptop Max chips.

Apple has basically decided that the future market and business case is mobile. That is where their development is concentrated. Desktops just get the hand-me-downs from the mobile line, and that's "good enough" for the vast majority of the remaining, shrinking market of desktop users.

Counterpoint: there are a lot more gamers and streamers today, and that has also meant a boom in PC building. Desktop Mac sales are indeed small compared to MacBook sales, because if I'm looking for a mobile solution, I'm looking at a MacBook all day over any Windows laptop. But if people need a desktop for whatever they're doing, they're not looking at Macs. They're going to get a Windows PC, hence the low desktop Mac sales.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Nvidia is going to prioritize manufacturing of GPUs that cost few tens of thousands of dollars. They will sell millions of those to data centers. Nvidia CEO has repeatedly said their focus is on Datacenters in near future. It opens up for Apple and AMD.
I meant that nobody has a real interest in selling workstation GPUs. The workstation market is a small fraction of the gaming market, which itself is a small fraction of the data center market. Workstation users don't want crippled consumer GPUs, but they also can't afford to pay market prices.
 
  • Like
Reactions: TechnoMonk

jujoje

macrumors regular
May 17, 2009
247
288
Forget Apple: if Nvidia doesn't increase the RAM in its 5090 GPU, Apple will eat into its sales. I am a heavy 4090 user, and its 24 GB of RAM runs out; that's not much for today's workloads. I am forced to use a slower M1 Max MBP for anything that needs more than 24 GB of GPU memory. I haven't bought an Apple workstation in the past 12 years, but I will move to a Studio if Nvidia doesn't get its act together on memory. I'll take 256 GB of unified memory that can run heavy GPU workflows.

Pretty much quote for truth there; Nvidia cards with decent memory (48GB) are stupidly expensive. I've rambled about this before, but even if Apple's GPUs aren't nearly as fast, the amount of memory opens up workflows that simply aren't possible on other hardware.

Random anecdotal example: I was playing around with grain simulations in Houdini, as one does. A reasonable resolution (30m grains, iirc) running on GPU OpenCL caused the Nvidia card to run out of memory (24GB). I could send it to the farm to run on CPU OpenCL, but that takes 5 hours, as OpenCL on the CPU is slow. Running it on the M2 Ultra? 45 min. Oddly enough, from what I recall, even with smaller simulations the Nvidia card was much slower than you'd expect, because their OpenCL driver is pretty poor.
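To give a sense of why 24 GB runs out so quickly on a simulation like that, here's a back-of-envelope sketch. The per-grain attribute count and the solver overhead multiplier are my own illustrative assumptions, not Houdini's actual memory layout:

```python
def sim_memory_gb(num_grains, floats_per_grain=18, solver_overhead=8):
    """Rough working-set estimate for a GPU grain simulation.

    floats_per_grain covers position, velocity, force, and a few solver
    attributes (assumed 32-bit floats); solver_overhead is a guessed
    multiplier for neighbor lists, collision structures, and scratch
    buffers that a solver allocates on top of the raw attributes.
    """
    raw_bytes = num_grains * floats_per_grain * 4
    return raw_bytes * solver_overhead / 1024**3

# 30 million grains, as in the anecdote above:
print(f"{sim_memory_gb(30_000_000):.1f} GB")  # prints "16.1 GB"
```

About 16 GB for the solver alone under these guesses, leaving very little headroom on a 24 GB card once the rest of the scene is resident, while a 64 GB (or 192 GB) unified-memory machine doesn't break a sweat.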

Anyways, for a workstation the M2 Ultra is pretty great, as far as I'm concerned. The CPU cores perform well, and while the GPU is slower than I'd like, in a lot of use cases the amount of memory makes up for it (what's the point of a fast GPU if it can't finish the task?). Also, as the Mac sits on my desk, it was pretty much silent while simulating, which is nice.

If they add ray tracing and manage to scale the M3/M4 GPU performance up for the new Ultra, it'll be in a pretty reasonable spot.

I still feel that an Extreme / Mac Pro variant would be great technology looking for a market, at least as far as VFX goes; the Ultra makes a good personal / freelance / small boutique studio workstation, but the Extreme would be aimed at film studios, and I can't see them switching away from cheap Linux boxes and CPU farms.
 

seggy

macrumors 6502
Feb 13, 2016
465
311
We were … which is also why bringing up the 3995WX, a 64-core, 128-thread workstation chip, not a desktop chip, is equally a non sequitur. If the argument is to make an "Extreme" chip to compete against Threadripper PROs and large-core-count Xeons, that's different from saying Apple doesn't compete against desktop chips, which it very much does. Had the M3 Ultra existed, it would've competed against Threadrippers like the 7960X, maybe the 7970X depending on the application, though not the 7980X.
I thought I heard the rumble of goalposts being shifted.
 

crazy dave

macrumors 65816
Sep 9, 2010
1,450
1,220
I thought I heard the rumble of goalposts being shifted.
From the OP and the post you initially replied to which was also the second post:

Screenshot 2024-06-09 at 11.35.34 PM.png


Core i3-12100F and Core i9-14900KS. That's what the poster you replied to was comparing AS to in the context of a desktop: Core i-series desktop processors. Those are desktop-class processors, and despite the OP's statements, stinksaroundhere is quite correct that Apple's current crop of processors compares very favorably to them (Ryzen as well), in both single-threaded and multi-threaded desktop workloads. His "best GPU per watt" remark, however, is most definitely a shift of the goalposts from performance to performance per watt, as the AS GPU does not compare favorably with similar desktop GPUs in raw performance the way the CPU does. That's why I responded that your inclusion of a 3995WX "is equally a non sequitur".

As an aside, his remark is also not strictly true, as Nvidia Max-Q GPUs in fact have similar performance per watt to Apple GPUs (simply measured as TFLOPS; obviously actual performance varies hugely from that). They share remarkably similar design philosophies, but that's another tangent.

Later we began to discuss actual workstation chips. If you want to talk about workstation chips in a more productive manner, then sure: the Ultra CPU can be compared in design philosophy to a HEDT/light workstation chip, and it is also true that Apple makes nothing close to a high-end workstation Threadripper 7980X equivalent that could go into the Mac Pro but not the Studio.

Since we're going to have to wait for the M4 to get an Ultra, estimating where an M4 Ultra might land requires some very rough calculations. Assuming the same scaling as from M2 Max to M2 Ultra, a hypothetical M3 Ultra with 24 P-cores and 8 E-cores, based on the M3 Max, would have scored about 3100 pts in Cinebench R24: about the same as a lower-end modern 7000-series Threadripper and some of the higher-end older models (roughly equivalent to a 5975WX). A hypothetical M4 Ultra is unfortunately harder to estimate, since we don't have the Max configuration and don't even know if said Ultra will be 2x Maxes. More crucially, I don't believe Maxon actually makes Cinebench for the iPad, so we don't even have a score for the base M4 yet. But assuming the M4 Ultra will be some factor faster than the 3100 above seems reasonable: a 24 P-core, 8 E-core M4 Ultra might be, say, 15-20% faster than the above M3 Ultra, putting it close to a modern 7970X, at least in that workload. Obviously these are hypotheticals upon hypotheticals, but that's just to give a very rough idea of where an M4 Ultra might land in CB R24. I'm sure there are other, far better workstation benchmarks out there, but CB R24 is common, stresses the cores better than R23 did, and at least scales fairly linearly with core/thread count.
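The rough scaling math above can be written out explicitly. Every input here is one of the post's own estimates (the ~3100-pt hypothetical M3 Ultra and the guessed 15-20% generational uplift), not a measured score:

```python
# Rough CB R24 multi-core projection for a hypothetical M4 Ultra.
m3_ultra_est = 3100                     # estimated M3 Ultra score (2x M3 Max scaling, assumed)
uplift_low, uplift_high = 1.15, 1.20    # guessed M3 -> M4 generational uplift

m4_ultra_low = m3_ultra_est * uplift_low     # ~3565 pts
m4_ultra_high = m3_ultra_est * uplift_high   # ~3720 pts
print(f"M4 Ultra estimate: {m4_ultra_low:.0f}-{m4_ultra_high:.0f} pts")
```

That range is what puts the hypothetical chip in the neighborhood of a 7970X in this one benchmark, per the comparison in the paragraph above.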

So no: Apple makes excellent desktop CPUs, and an upcoming M4 Ultra promises to be a good light workstation CPU, depending on its configuration. Whether Apple ever chooses to make an Extreme remains to be seen.
 
  • Like
Reactions: MacPowerLvr

krspkbl

macrumors 68020
Jul 20, 2012
2,449
5,882
They don't need to bring out a "full fat" desktop CPU.

If they did, the only product it would make any difference in is the tower Mac Pro, and that's the most expensive model that "nobody" is buying. Every other Mac has limited airflow/cooling potential, so it makes sense to stick with powerful low-power processors. Also, for absolute top performance they would need to ditch ARM and go with x86, which would ruin all the progress they've made since leaving Intel.

Apple Silicon is great, but there is no need for such powerful processors in the Mac lineup. ARM is great at low power and works well in phones, tablets, laptops, and slim/small-form-factor enclosures. If you scale up far enough (a seriously powerful Mac Pro tower), that would start to hold it back; it's not as simple as just pumping more power through it. Maybe one day we'll get a higher performance tier of SoC, but if you want a screaming-fast desktop CPU, go with Intel/AMD.
 
  • Like
Reactions: Chuckeee

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
If they are not intended for scientific use, how do you explain that, when Jobs introduced the PPC processor, he did so by showing its performance in Mathematica?
Not that it matters to the overall debate but you've used this example a couple of times and it is simply wrong. Steve Jobs left Apple in 1985 and the PowerPC was introduced in 1991. I'm not sure what you are thinking of but this never happened. Maybe it was the G5 introduction?
 

Tyler O'Bannon

macrumors 6502a
Nov 23, 2019
886
1,497
Quick list of reasons:

Apple silicon is still young

Apple sells far more laptops than desktops

Apple silicon has all come from iPhone which has all been mobile

Apple silicon is very powerful even for mobile

Desktops have cooling systems that do allow for more sustained performance. Perhaps we will at least see higher clocking in the future. Again, they may stick to very strict development and focus on rapid release cycles.

Keeping development all on one platform helps them develop and update faster

So far, Apple values their power per watt more than anything else. That could change if they do make desktop chips.

The Ultra is not really a mobile chip. It's built from mobile silicon, but it runs too hot for a laptop, so it both is and isn't one.

With the Mac Pro and Mac Studio, Apple may push into dedicated desktop chips, or they may not. They may just keep fusing mobile chips together, keeping development on one platform and moving faster. The M3 came out faster than the M2, and the M4 faster than the M3, so we may see quick release cycles that keep pushing forward, all still based on mobile.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,603
4,110
I love synthetic benchmarks

Real world benchmark here for one of our projects... C++...

- llvm compile time on a stock base M2 8GB/256GB Mac mini: 1:51
- llvm compile time on a custom i9-14900K, 64GB, with a 1TB Samsung 990 Pro, running Linux: 1:42
That's a lot more than a synthetic benchmark. For some of my workflows, GPU time accounts for about 40%, and the rest is I/O: moving data from memory to the GPU, batches, and workers on the 4090. On my M1 Max with 64 GB, I have more memory to work with, so I avoid most of the I/O, and with better batch sizes and more workers, the 4090 and the M1 Max are a wash. If you use tiny datasets for benchmarks, it looks great.
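The trade-off described above can be sketched with a toy timing model. All the numbers are hypothetical; they only illustrate how a faster GPU that must stream every batch from host memory can end up level with a slower GPU on unified memory:

```python
def epoch_time(compute_s, transfer_s, overlap=0.0):
    """Wall time for one pass over the data: compute time plus whatever
    fraction of the host-to-GPU transfer cannot be hidden behind compute."""
    return compute_s + transfer_s * (1.0 - overlap)

# Hypothetical numbers: the discrete GPU computes 2.5x faster but streams
# batches over PCIe; unified memory pays no per-batch transfer cost.
t_discrete = epoch_time(compute_s=40, transfer_s=60, overlap=0.3)  # ~82 s
t_unified = epoch_time(compute_s=100, transfer_s=0)                # 100 s
print(t_discrete, t_unified)
```

With a large enough dataset (or small enough overlap), the transfer term dominates and the raw-compute advantage washes out, which is the effect being described.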
 
  • Like
Reactions: Chuckeee and cjsuk

HowardEv

macrumors 6502
Jun 1, 2018
470
326
Medford ma
Why does Apple not bring out desktop CPU?
Because M series doesn’t need high power to be high speed.

The question then becomes: why did they keep the old Intel Mac desktop designs instead of making them small and portable with a battery? Do people still associate the power cord with high speed, like they have a power cord fetish? It's embarrassing.
 

HowardEv

macrumors 6502
Jun 1, 2018
470
326
Medford ma
Plenty of professional musicians record, edit, and mix on laptops
Laptops have worse cooling designs and throttle more than an M-series chip in a Mac mini does. They also cost more, are quite delicate, and support only one large external display instead of two like the mini. And once they are plugged into displays, audio interfaces, and instruments, they are useless as laptops.

Ideally we want a new, smaller mini that has an all-day battery, all the ports the chip can handle, passive heat-sink cooling with no thermal throttling, and a low price.
 
  • Haha
Reactions: wyrdness

HowardEv

macrumors 6502
Jun 1, 2018
470
326
Medford ma
Not sure what you mean by archaic benchmarks?
Why can't there be real-world tests, like how many audio tracks and virtual instruments a machine can handle before Logic hiccups? I still don't know whether I really need 16GB or if 8 is enough. All the articles cite the conventional wisdom that you need 16 for DAWs, but no one says why.
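One back-of-envelope way to see where the 16GB advice comes from. The session size (track count, length) is my own illustrative assumption, not a Logic benchmark:

```python
def track_mb_per_min(sample_rate=48_000, bytes_per_sample=3, channels=2):
    """Memory for one minute of one uncompressed audio track
    (24-bit / 48 kHz stereo by default)."""
    return sample_rate * bytes_per_sample * channels * 60 / 1024**2

one_min = track_mb_per_min()           # ~16.5 MB per track-minute
session_gb = 80 * 5 * one_min / 1024   # 80 tracks x 5 minutes: ~6.4 GB
print(f"{one_min:.1f} MB/track-minute, {session_gb:.1f} GB session")
```

That alone eats most of 8 GB once the OS and Logic itself are counted, and sampled virtual instruments add gigabytes of their own. (In practice DAWs stream audio from disk rather than holding it all in RAM, so treat this as a rough upper bound, not a measurement.)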
 

dmccloud

macrumors 68040
Sep 7, 2009
3,138
1,899
Anchorage, AK
Why does Apple not bring out desktop CPU?

You can tell by looking at this: https://technical.city/en/cpu/Apple-M1

Something needs to change at Apple. How can Apple compete in the desktop class when they're bringing out mobile CPUs? As you can tell by looking at the benchmarks, the desktop Core i3-12100F is way faster, and the Core i9-14900KS is light years ahead of the Core i3-12100F.

Apple really needs to bring out a desktop CPU.

1. Your "source" is about as reliable as Drunk Bobby standing on the corner claiming the end of the world is coming tomorrow.

2. You're trying to compare the M1 (released in 2020) to Intel processors released after it, while ignoring subsequent generations of Apple SoCs.

3. Apple Silicon is neither a "desktop" nor a "mobile" SoC; they are simply SoCs that can be used in either a desktop or a notebook machine. You also ignore the M2 Ultra, which meets the very definition of a "desktop" SoC you refer to.

4. M4 is coming to the Mac at some point, and we won't know what the specs will be until those devices are formally announced. This means any speculation on your part is not only out of date, but based on several misconceptions, inaccuracies, and misleading comparisons.
 

boss.king

macrumors 603
Apr 8, 2009
6,394
7,647
Laptops have bad cooling design and throttle more than M in a Mac Mini. Also they cost more and are quite delicate, and only support one large display instead of two like Mini. And once they are plugged into displays and audio interfaces and instruments they are useless as laptops.
If it were 2006 and we didn’t have phenomenal performance in laptops, I’d agree, but that’s not really the case anymore. Also, musicians tend to travel and so laptops are incredibly useful there.
 
  • Like
Reactions: Timpetus

ChrisA

macrumors G5
Jan 5, 2006
12,917
2,169
Redondo Beach, California
OP, what exactly are you complaining about past the specs? Is the M series chip not able to perform your workflow as it should?
Yes. For casual users who use the computer for watching YouTube, shopping on Amazon, or writing a paper for school, even the low-end Macs are overkill; a $300 Chromebook would do all of those things. If you do casual photography and shoot videos to post online, you want a computer that can support a larger screen, and a base M2 or an M2 with 16GB is something you might want.

The Studio is for someone who shoots video with multiple high-resolution cameras and has a few audio tracks to mix. If you are cutting 8K multicamera shots, you are looking at a lot of RAM; the 32GB Mac Studio would be the starting point for that. Same for orchestral composition with Logic Pro: you need to hold all the audio samples in RAM.

And then comes AI. I am working on a robotics application and recently moved that effort away from Linux and Nvidia GPUs to macOS. I am a long-time user of both Linux and the Mac, going back to the 1980s (back then it was BSD Unix and SunOS, not Linux, and the Mac ran the classic System software). My goal is a voice interface and a parameterized gait generator for a walking robot that uses machine learning to select the parameters. The parts for this seem to be working on an M2 Pro Mac mini with 16GB RAM. (The Llama 3 LLM with 7B parameters runs well on the M2 Pro.) But I know I will run into a RAM limit. I say this to show that even current AI and robotics work does not require a very high-end Mac. And if you are an end user of this technology, a Chromebook or even an older iPhone SE is good enough.
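The RAM budget for that 7B model is easy to sketch. The bytes-per-parameter figures assume standard fp16 and 4-bit quantized weights, and the estimate deliberately ignores the KV cache and runtime overhead:

```python
def model_weight_gb(params_billion, bytes_per_param):
    """Approximate memory for LLM weights alone (no KV cache, no runtime)."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

fp16 = model_weight_gb(7, 2.0)   # ~13.0 GB: tight next to the OS on a 16 GB machine
q4 = model_weight_gb(7, 0.5)     # ~3.3 GB: comfortable on 16 GB
print(f"fp16: {fp16:.1f} GB, 4-bit: {q4:.1f} GB")
```

This is why a quantized 7B model runs well on a 16 GB mini while larger models (or longer contexts) hit the RAM wall mentioned above.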

The more expensive Macs are nice to have, and if you have a disposable budget, go for it. But for most of us the budget is fixed, so spending on a computer means not spending on something else, like a new Bambu Lab 3D printer.
 

Superhai

macrumors 6502a
Apr 21, 2010
735
580
I believe that Apple has desktop-class CPUs for sure, but every one is designed with the "mobile" entry level in mind. The reason must be that they want to scale and bin without too much loss. If they designed a chip solely for the desktop from the ground up, they would need a completely separate production line with no usability in other products. If desktop workstation Macs get very popular, which will not happen anytime soon, then we may see Apple taking that risk.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,059
Not that it matters to the overall debate but you've used this example a couple of times and it is simply wrong. Steve Jobs left Apple in 1985 and the PowerPC was introduced in 1991. I'm not sure what you are thinking of but this never happened. Maybe it was the G5 introduction?
Yes, I said "PPC" when I should have said "G5 PPC". In a previous thread where I referenced this (from 2021: https://forums.macrumors.com/thread...c.2298203/page-13?post=30153770#post-30153770 ), I did include the G5, but I accidentally dropped that here, so thanks for catching it.

But my broader point remains: Apple has consistently recognized scientists as among the Mac user base.

Note: Wolfram Research = Mathematica
1718051740518.png



 

cjsuk

macrumors 6502a
Apr 30, 2024
616
2,262
But my broader point remains: Apple has consistently recognized scientists as among the Mac user base.

Note: Wolfram Research = Mathematica

Also plenty of us unrecognised ones too :)

Actually, my org is mostly Windows, but everyone technical has a Mac at home or for academic work.
 
  • Like
Reactions: theorist9

izzy0242mr

macrumors 6502a
Jul 24, 2009
691
491
Apple does not make desktop CPUs; theirs are made for laptops, with a battery in mind. Desktop CPUs are made for machines with heat sinks and multiple fans that are plugged in all the time, with battery life not a factor.

The Intel and AMD laptop CPUs are light years slower than those desktop CPUs.
The M-series Ultra chips do not exist in any laptop; they only exist in desktops. That's the unequivocal desktop chip.
 

HowardEv

macrumors 6502
Jun 1, 2018
470
326
Medford ma
The M-series Ultra chips do not exist in any laptop; they only exist in desktops. That's the unequivocal desktop chip.
But that's overkill. I want the same M3 CPU and battery they use in the fastest MacBook, but in a new mini Mini, not jammed into a thin laptop that throttles and is delicate, awkward, and expensive.

And new, cheaper Studio Displays, and a new MacBook that's light, cheap, and fanless, which can act as the wireless display and keyboard for the headless Mac connected to the displays and audio interfaces, while the laptop stays free to be used as a laptop.
 
  • Like
Reactions: streetfunk