
goMac

macrumors 604
Apr 15, 2004
7,663
1,694
However, it's known that the M machines can play some games very well. I've read the thread on Apple's failure to offer a gaming notebook with the M machines. Having read the whole thread, it appears that the economic case for software vendors to port major PC-based games to the M processor is highly marginal. Basically, Apple can't handle AAA gaming. The thread stopped in early November ...

So games are a real test for the M, IMO. And if one wants to test games performance, then use one of the few that can run on the M platform. Apple makes heaps of money from the simple games sold for iPhones and iPads. But of the big games that people buy costly hardware for, very few will run on an M processor, and that situation is unlikely to change.

I think it's reasonable to say that the new MacBook Pros are not great gaming machines. They're fine, but they're not great. If you're coming from an older MacBook Pro without an eGPU, it's a great upgrade. If you're coming from a higher end Mac or PC desktop, a Mac with a good eGPU, or a PC gaming laptop, you're going to be disappointed. And there is nothing wrong with that story. M1 Max is great at other things, like video editing. It's a CPU with special silicon for certain tasks.

Where it feels like Apple has gotten themselves in trouble is the 3080 comparisons. And to be fair, for some things, they do have that performance. But generally they do not seem to be a point-for-point match for the 3080. And so now everyone is making comparisons to the 3080, and when they don't hold up, it causes a fuss.

I do think the sorts of issues seen with games might not just be contained to games. The Redshift benchmarks that Linus Tech Tips did hint more at something going on. But everything together is telling a story, both good and bad, about how M1 Max performs. And is teaching us more about the chip.

Feral's games are generally well optimized. These games aren't Apple Silicon native, so they might get a boost if they're ever ported to native versions. But I wouldn't be so quick to blame software, or blame poor ports. Nothing that was tested here was some built-on-WINE abomination. The titles tested were ones that have likely spent a lot of time being optimized for Metal, at least on Intel Macs.
 

Melbourne Park

macrumors 65816
I think it's reasonable to say that the new MacBook Pros are not great gaming machines. They're fine, but they're not great. ...
No question about that, although I suspect it's more software-related than hardware.

But getting back to the review comparing a MacBook Pro Max (M processor) against the previous top-end Intel MacBook Pro, and against an Intel MacBook Pro with an eGPU attached containing a 6900 XT 16GB GPU, we can learn a couple of things when looking forward to a Mac Pro with the M architecture.

Looking at that review's gaming results, the Warhammer test showed the following frame rates at 3840 x 2160 resolution:
Highest-end Intel MacBook Pro: 19
New MacBook Pro M1 Max: 45
Intel MacBook Pro with 6900 XT eGPU: 73

Firstly, the MacBook Pro M1 Max being 2.4 times faster than the previous top Intel MacBook Pro is typical of the various game test comparisons I have viewed on YouTube.

Secondly, the 6900 XT eGPU is 1.6 times faster than the MacBook Pro M1 Max. Put another way, the M1 Max has about 60% of the performance of the 6900 XT (rounded down from 61.5%).

So, if there were a higher-clocked M Mac Pro, it would have to run about 67% faster to match the 6900 XT. I don't know whether more cooling and power can provide a 67% gain in performance.

Let's say, though, that 40% were possible with the current chips. That would still leave it short of the 6900 XT by some margin: about 15% slower, in fact (roughly 85%, applying a 40% clock-speed increase to the ~61% figure).

Now, that means a 2-chip machine running 30% faster would be about 50% faster than the 6900 XT. If the chip were 40% faster and the dual setup had no losses in performance, we'd get a machine about 70% faster than the Intel with a 6900 XT. And with 4 chips, the performance would be over three times the 6900 XT's.

With Gen 2 M desktop processors, I'd guess another 25% on top. That would result in a desktop being around 4 times faster than the Intel with one 6900 XT 16GB.
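For anyone who wants to check that arithmetic, here is a rough back-of-the-envelope sketch in Python. The frame rates are the Barefeats Warhammer numbers above; the clock-speed uplifts and the perfect multi-die scaling are my assumptions, not measurements.

Code:
# Back-of-the-envelope scaling from the Barefeats Warhammer 4K frame rates above.
intel_mbp = 19      # previous top-end Intel MacBook Pro
m1_max = 45         # new MacBook Pro M1 Max
egpu_6900xt = 73    # Intel MacBook Pro + 6900 XT eGPU

print(f"M1 Max vs Intel MBP: {m1_max / intel_mbp:.1f}x")            # ~2.4x
print(f"6900 XT vs M1 Max:   {egpu_6900xt / m1_max:.1f}x")          # ~1.6x
print(f"M1 Max as % of 6900: {100 * m1_max / egpu_6900xt:.0f}%")    # ~62%

# Speculative desktop scenarios: assumed clock uplift, perfect multi-die scaling.
for dies in (1, 2, 4):
    for uplift in (1.3, 1.4):
        fps = m1_max * dies * uplift
        print(f"{dies} die(s), +{uplift - 1:.0%} clock: "
              f"{fps / egpu_6900xt:.2f}x a single 6900 XT")

With perfect scaling the 2-chip figures come out a little higher than my 50% estimate, so treat the in-text numbers as allowing for some multi-die losses.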

The potential is certainly there. And for applications designed to run on the M processor, the performance gains would be much greater.

By the way, eGPUs are IMO bulky, and perhaps a more compact new-generation "M" Mac Pro might even be smaller than an eGPU enclosure?

 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
No question about that, although I suspect it's more software-related than hardware.

There's an increasing set of benchmarks outside of games, of well-optimized Metal apps, showing that in most cases the GPU is not in the same league as the 3080. It's clear that in some tasks that really push multiple aspects of the processor you can get ahead. (Stuff like Pixelmator pushes a wider range of accelerators on the chip.) But the performance story is being repeated beyond games, and it's being repeated even with well-optimized, ARM-native Metal software.

There's another review here that seems to repeat the same theme outside of games. (I think this was posted in a different thread here, I don't remember which one.)

He's comparing against desktop GPUs, but it still maps into the same performance generally seen elsewhere.

So, if there were a higher-clocked M Mac Pro, it would have to run about 67% faster to match the 6900 XT. I don't know whether more cooling and power can provide a 67% gain in performance.

I think it's possible that Apple could catch a 3080 with a larger device. Like a Mini Mac Pro.

But like I said earlier in the thread, both the 3080 and 6900 are old GPUs at this point. They're about to be replaced. If they're just catching up with the 3080 or the 6900 next year, it's way too late.
 
Last edited:

Melbourne Park

macrumors 65816
...

But like I said earlier in the thread, both the 3080 and 6900 are old GPUs at this point. They're about to be replaced. If they're just catching up with the 3080 or the 6900 next year, it's way too late.

Maybe you did not understand the figures: a 2-chip desktop Mac Pro would run 50% faster than a 6900-equipped PC. Maybe 70% faster. And that is not a second-generation "M" processor either - just current ones clocked up with desktop cooling. Such a computer would likely not carry the high hardware cost of a GPU-equipped form factor. It would not require the large power supplies, multiple fans, cooling issues and large volume of a computer carrying two 6900-class GPUs. Yet its performance would still get close to two 6900s. A 4-chip "M"-based Mac Pro would blow two 6900s away - and likely blow two new-generation GPUs away too. And the gap would widen with the new-generation M2 processors.

And what makes people think prices from the GPU oligopoly will drop? GPU makers have learnt that the market will pay a heap for GPUs. Nvidia predicted a major fall in profits when the shortage of GPU components was revealed, and Nvidia GPU production has been much lower. Yet they have made record profits. So prices won't return to what they were two years ago - unless Intel does an Apple-style upset, or someone else makes a fast GPU, sells it cheap and breaks the GPU oligopoly, or demand for GPUs drops.

Another way to make GPU prices drop would be for Microsoft to put Windows onto the M processor. That would threaten high GPU prices and pull them down. It would strengthen Microsoft too, and weaken AMD and Nvidia. But it would also threaten Apple's competitive advantage - hence it's one reason why Apple doesn't seem to want Windows running on their "M" architecture. Apple likes high GPU prices, especially if their new desktops will not be able to use those GPUs.

I have wondered too whether Apple might include slots for their own GPUs in the new-generation "M" Mac Pro (but I doubt it). And hey - Apple may cause a small drop in GPU demand themselves if they don't offer eGPU support in the coming Mac Pros.
 
Last edited:

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Firstly, the MacBook Pro M1 Max being 2.4 times faster than the previous top Intel MacBook Pro is typical of the various game test comparisons I have viewed on YouTube.
Yes, but this is a combination of both CPU and GPU.

AS MBP - M1 Max with 10-core CPU / 32-core GPU

Intel MBP - 2.4GHz 8-Core i9 CPU / Radeon 5600M GPU

Secondly, the 6900 XT eGPU is 1.6 times faster than the MacBook Pro M1 Max. Put another way, the M1 Max has about 60% of the performance of the 6900 XT (rounded down from 61.5%).

This is comparing the i9 + 6900XT to the M1 Max. We know from various benchmarks that the M1's CPU is a lot stronger than the i9, so is it a fair reflection on the 6900XT?

Warhammer is a complex strategy game, so it uses the CPU heavily. On the same Barefeats page, Tomb Raider (a more typical action game) shows a different story, with the i9 MBP on 19 fps, M1 Max MBP on 33, and i9 + 6900XT on 68.

In games where a lot of the M1 Max's advantage is due to its CPU, you can't expect performance to scale much when using multiple M1 Maxes - games can only use so many threads.
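To illustrate that point with a toy model (a sketch, not a measurement): if part of each frame is CPU-bound and only the GPU-bound part scales with extra dies, overall scaling flattens out quickly, in the spirit of Amdahl's law. The 40/60 split below is invented purely for illustration.

Code:
# Toy Amdahl-style model: only the GPU share of frame time shrinks with more dies.
cpu_share = 0.4   # hypothetical fraction of frame time that is CPU-bound
gpu_share = 1 - cpu_share

for dies in (1, 2, 4):
    frame_time = cpu_share + gpu_share / dies
    print(f"{dies} GPU die(s): {1 / frame_time:.2f}x overall speed-up")
# 1 -> 1.00x, 2 -> 1.43x, 4 -> 1.82x: nowhere near 2x or 4x.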
 
  • Like
Reactions: ZombiePhysicist

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
And what makes people think prices from the GPU oligopoly will drop? GPU makers have learnt that the market will pay a heap for GPUs. Nvidia predicted a major fall in profits when the shortage of GPU components was revealed, and Nvidia GPU production has been much lower. Yet they have made record profits. So prices won't return to what they were two years ago - unless Intel does an Apple-style upset, or someone else makes a fast GPU, sells it cheap and breaks the GPU oligopoly, or demand for GPUs drops.

The semiconductor supply chain issues are well known, and are affecting areas like car production as well. A large part of the high prices are due to third parties selling for a huge mark up on MSRP when they get hold of stock - that's not money that goes to Nvidia or AMD. As market leader, Nvidia does charge a premium for their cards, but their profits are also being boosted by their expansion from gaming cards to data centre AI and Compute. When capacity returns to manufacturing, competition will bring prices down. If Nvidia's products have an excessive profit margin, it will be easy for AMD to undercut them and gain crucial market share.

Another way to make GPU prices drop would be for Microsoft to put Windows onto the M processor. That would threaten high GPU prices and pull them down. It would strengthen Microsoft too, and weaken AMD and Nvidia. But it would also threaten Apple's competitive advantage - hence it's one reason why Apple doesn't seem to want Windows running on their "M" architecture. Apple likes high GPU prices, especially if their new desktops will not be able to use those GPUs.

Firstly, there's some work to do in "putting Windows" on the M processor. Microsoft do have an ARM version of Windows, but it would need development to take full advantage of the M processor's abilities. Microsoft wouldn't want to tailor ARM Windows to Apple's specific implementation though; no one else has a license to make M-architecture hardware, so they would then be dependent on a single supplier. If 'Windows on M-ARM' proved popular, it would send a huge number of customers to a major competitor for hardware, in perpetuity.

I seriously doubt Apple would be upset about their machines being bought in huge numbers to run Windows. It's all money, and if anything, people are more likely to try macOS if they already own a Mac. It's true Apple has dropped Boot Camp, but that's because it's now non-trivial for them to support it. Before, all they had to do was provide a package of Windows drivers for the x86 hardware (which were written by Intel, Realtek, AMD etc. anyway).

They could support Windows on ARM, but would that have much appeal? ARM Windows doesn't have much software support; it's not like people could use it to play Windows games. At this point, it would be more of a boost to ARM Windows than to Mac sales.
 

Melbourne Park

macrumors 65816
... When capacity returns to manufacturing, competition will bring prices down. If Nvidia's products have an excessive profit margin, it will be easy for AMD to undercut them and gain crucial market share.

Apple makes money from its hardware sales, sure ... but its cross-selling is key to sustained profitability (I'd almost say cross-selling is core to Apple). Hence their other Apple products, and the App Store. Windows bypasses the App Store, and would cost Apple significant ongoing profits, and not just through software sales. It's the whole brand thing that is at stake. Apple doesn't want native Windows on the M, from the reports I've read, and I can quote several opinion pieces for that too.

As for GPU prices - my point was that both GPU oligopolists have realised they can price their products much higher, and by doing so make a lot more money. Nvidia has announced prices won't go down in 2022. I don't have much faith in market forces when there are only two companies competing. You trust oligopolists much more than I do.

But if prices are going to come back to normal - why would anyone buy a GPU in 2022? Nvidia's new generation is coming out next year - so if you're right, and prices drop a lot in 2023, those people who buy costly GPUs before the new generation lands will lose all their money. I doubt the new generation will be much cheaper than today's cards can be had for now, by ordering from legitimate retailers and waiting on availability.

But it's way, way off topic ... you know, IBM wanted Motorola's 68000 CPU for its PC, but it wasn't quite ready, so IBM went with the Intel chip that was available. A costly miss for Motorola. And this year, we've seen Intel lose its position as the most valuable CPU maker. Thankfully it doesn't look like Nvidia will be able to buy ARM, the business Apple once part-owned.
 
Last edited:

Melbourne Park

macrumors 65816
In games where a lot of the M1 Max's advantage is due to its CPU, you can't expect performance to scale much when using multiple M1 Maxes - games can only use so many threads.
I actually picked a game because it represented other results I have seen for games which run OK on the Mac. However, some of the scores in that review show much better results for the M1 Max than the one I chose, which is a game that sells many times more copies on PC than on the Mac platform.

I also said that version 17 of DaVinci Resolve was not native on the M processor. And the 6900 achieved a score of 48 compared to the Max's 10. More telling in their DaVinci test was that the previous Intel MacBook without the 6900 eGPU achieved 9 - in other words, the M1 Max was only about 10% faster than the previous MacBook Pro. So it seemed to me that the review was loaded against the M processor, because DaVinci has faster M-native versions out (from version 17.1 onwards), and those versions run much, much quicker on the M Pro processors.

For instance, https://forum.blackmagicdesign.com/viewtopic.php?f=21&t=149829 showed that a desktop PC with a Ryzen 5950X and an RTX 3090 achieved 1.34 (94 seconds), compared to a 13" MacBook at 195 seconds; while a 14" MacBook Pro with the M1 Max (hence not running at the speed a 16" could, let alone a future desktop) took just 57 seconds. Taken together with the Barefeats scores, that would make the 3090 about one eighth as fast as an AMD 6900 - or, if the 16" at full speed is 10% greater, about one ninth as fast - which I doubt.
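To lay that arithmetic out (a rough Python sketch; the Barefeats scores and the Blackmagic forum times are as quoted above, and treating a benchmark score and a render time as directly comparable is an assumption):

Code:
# Barefeats DaVinci Resolve scores (higher is better), as quoted above.
barefeats_6900xt = 48
barefeats_m1_max = 10
ratio_6900_vs_max = barefeats_6900xt / barefeats_m1_max    # ~4.8x

# Blackmagic forum render times in seconds (lower is better).
rtx3090_desktop = 94
m1_max_14in = 57
ratio_max_vs_3090 = rtx3090_desktop / m1_max_14in          # ~1.65x

# If both results held, the 6900 XT would have to be roughly
# 4.8 * 1.65 = ~8x faster than an RTX 3090, which is hard to believe.
print(f"Implied 6900 XT vs RTX 3090: {ratio_6900_vs_max * ratio_max_vs_3090:.1f}x")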

Meanwhile, the tests in that review - https://barefeats.com/m1-max-16-vs-intel-16.html - showed much inferior performance. I haven't read all the tests, and some would favour Apple's architecture - using ProRes, for instance. But I felt the Barefeats test was loaded against the Mac - why not use version 17.1 or 17.4, which are native? Any test should state precisely which version was used.

One thing is for certain though - the desktop Mac Pro will be faster than the notebooks.
 
Last edited:

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Windows bypasses the App Store, and would cost Apple significant ongoing profits, and not just through software sales.
But these sales would be in addition to sales to Mac users. These people would otherwise buy a Dell or Lenovo, where Apple also gets no cut of software sales. The only difference is that some of those people may wind up giving macOS a try on their brand new Apple laptop.

Nvidia has announced prices won't go down in 2022.
Nvidia only controls MSRP. If people could buy GPUs at retail price, they would be happy. Sure, an RTX 3090 is very expensive, but most sales would be 3060s / 3070s.

why would anyone buy a GPU in 2022? Nvidia's new generation is coming out next year - so if you're right, and prices drop a lot in 2023, those people who buy costly GPUs before the new generation lands will lose all their money.
Because they've been waiting ages to buy a new GPU, and would finally have the opportunity? There's always something better around the corner. A GPU isn't an investment; they always depreciate heavily long term. Also, I didn't say GPU prices would suddenly drop in 2023 - just that as supply returns to normal, so would prices (as scalpers can't rinse you on eBay when you can just buy from Amazon).

When iPhones went over £1000, people worried that this would become the new standard, but that wasn't the case. Sure, Apple created a new tier at that level, just as Nvidia have established a GPU tier at a similar price point, but these aren't the volume sellers. Most people simply won't / can't spend that much on a phone or graphics card.
 
Last edited:

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
However, some of the scores in that review show much better results for the M1 Max than the one I chose, which is a game that sells many times more copies on PC than on the Mac platform.
Which review are you referring to? I assumed you were referencing the Barefeats figures, but I don't think you have actually said.

All games sell many times more copies on PC than on the Mac.
 

Melbourne Park

macrumors 65816
But these sales would be in addition to sales to Mac users. These people would otherwise buy a Dell or Lenovo, where Apple also gets no cut of software sales. The only difference is that some of those people may wind up giving macOS a try on their brand new Apple laptop.


..

Windows mostly sells via OEM versions, and for the major brands the OEM software is installed by the manufacturer. I've tried to buy OEM versions of Windows for my Macs but haven't been able to. In theory it's possible. Boot Camp has also become difficult with later versions of OS X on classic Mac Pros.

Apple has never shipped a Windows computer. Apple ships Macs with OS X installed. For me, I've only been able to buy a retail version of Windows for installation on a Mac, so I've paid a lot more for Windows. Plus, you pay a premium on Mac hardware for its quality, and a portion of that premium is for OS X. That's always been the case; Apple doesn't list OS X's cost. If you really think Apple would OEM Windows and not install OS X, you've lost me. Very few believe Windows will arrive on the M processor. You can read around the internet for why Apple is not helping that happen; the issue has been commented on.

I've put links to the reviews I've referred to in some posts, but not in every one. Post 483 has links.

And then there are the reasons why Apple wants to keep native Windows off their hardware. The reasons make up a long list. That subject is not for this thread.
 
Last edited:

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
Apple has never shipped a Windows computer.

They did ship Macs with DOS cards - a complete PC on a NuBus/PCI card, with Windows pre-installed. Those models had their own dedicated product SKU, model name and everything.

Very few believe Windows will arrive on the M processor.

And then, there are the reasons why Apple wants to keep native Windows off their hardware.

I think it’s more likely that Microsoft will keep Windows off AS machines. They’re happy to sell Mac users their own apps, but for the Apple customer who NEEDS Windows, their preferred solution will probably be that you rent time on a cloud instance, or buy Windows hardware, and hey look Surface is a great premium Windows product...

Without hardware support, Microsoft can just sit back, and watch the PowerPC days repeat.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
I've tried to buy OEM versions of Windows for my Macs but haven't been able to.
Check out Amazon, you can buy Win 10 Pro license codes for less than £10. No need to visit the dark web.

If you really think Apple would OEM Windows and not install OS X, you've lost me.
I never said or implied that, but nice straw man.

Very few believe Windows will arrive on the M processor.
Yes, in my previous comment I explained why Microsoft won't optimise ARM Windows for Apple hardware. Not that anyone cares about that version of Windows (yet), as it can't run the software you'd want Windows for in the first place - you may as well run macOS. Most people would be better off using Parallels 17 to run Windows (and its x86 apps) on their AS Mac.

And then, there are the reasons why Apple wants to keep native Windows off their hardware. the reasons make up a long list. That subject is not for this thread.
Well OK, but if it's such a long list, it wouldn't be hard to give one or two examples.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Maybe you did not understand the figures: a 2-chip desktop Mac Pro would run 50% faster than a 6900-equipped PC. Maybe 70% faster. And that is not a second-generation "M" processor either - just current ones clocked up with desktop cooling. Such a computer would likely not carry the high hardware cost of a GPU-equipped form factor. It would not require the large power supplies, multiple fans, cooling issues and large volume of a computer carrying two 6900-class GPUs. Yet its performance would still get close to two 6900s. A 4-chip "M"-based Mac Pro would blow two 6900s away - and likely blow two new-generation GPUs away too. And the gap would widen with the new-generation M2 processors.

I know I'm late replying.

I think your numbers are a bit high. But at the least I think you're right, a 4 die version of a Mac Pro could be competitive with a 6900.

But again, Apple's lateness in shipping aside... the 6900 is, in tech terms, an ancient GPU. So is the 3080. They're both due to be replaced in the first half of next year with much faster GPUs.

Apple beating the 6900 would be meaningless. If anything, the benchmarks show how far behind they already are. They'd be beating an old, slow GPU with their highest-end GPU tech. And you'd have to ignore the 7900/3080/3090 for it to look good.

I also said that version 17 of DaVinci Resolve was not native on the M processor. And the 6900 achieved a score of 48 compared to the Max's 10. More telling in their DaVinci test was that the previous Intel MacBook without the 6900 eGPU achieved 9 - in other words, the M1 Max was only about 10% faster than the previous MacBook Pro. So it seemed to me that the review was loaded against the M processor, because DaVinci has faster M-native versions out (from version 17.1 onwards), and those versions run much, much quicker on the M Pro processors.

I'll repeat again - you're making an inference here that they tested Resolve 17.0 specifically, even though that was an old version. I don't know that that's been proven.

I think you might be ignoring a lot of how the M1 works and how its performance works. At half precision/FP16, the 5600M on the previous MacBook Pro _is actually faster on paper than the M1 Max._ At FP32, it's around 6 TFLOPS. That's worse than the M1 Max, but not by multiple factors or anything.

A 5600M getting within 10% of an M1 Max on a benchmark is a completely realistic outcome just from the technical specifications.

There doesn't have to be a conspiracy. You're just extremely overestimating the M1 Max's capabilities. Again, at FP16, the old MacBook Pro is actually faster on paper.

And at FP32, the M1 Max showing a decent margin over the outgoing MacBook Pro is also realistic.

But because the M1 Max is better at some things and worse at others, it really all comes down to what you're measuring.

The same is true of desktop benchmarks, BTW. At FP16, Apple will have a much harder time catching the 3080; at FP32, a hypothetical Mac Pro would have an easier time. It all depends on the package and the workflow. Different workflows are going to behave differently. The M1 Max is not an all-around-great-performer-at-everything. But then again, neither is the 3080.
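To put rough numbers on the "faster on paper" point: theoretical peak throughput is just ALU count × 2 ops per clock (one fused multiply-add) × clock speed. A quick Python sketch with the commonly quoted public figures (clocks are approximate; the 5600M's RDNA architecture runs FP16 at double rate, which is where its on-paper FP16 edge comes from):

Code:
def peak_tflops(alus: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    """Theoretical peak: ALUs * ops per clock (FMA = 2) * clock, in TFLOPS."""
    return alus * ops_per_clock * clock_ghz / 1000

# Radeon Pro 5600M: 2560 ALUs at roughly 1.0 GHz boost (approximate public figures).
print(f"5600M FP32: ~{peak_tflops(2560, 1.04):.1f} TFLOPS")        # ~5.3
print(f"5600M FP16: ~{peak_tflops(2560, 1.04) * 2:.1f} TFLOPS")    # ~10.6, double-rate half precision

# M1 Max 32-core GPU: Apple quotes ~10.4 TFLOPS FP32 (4096 ALUs at ~1.3 GHz).
print(f"M1 Max FP32: ~{peak_tflops(4096, 1.27):.1f} TFLOPS")       # ~10.4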
 
Last edited:

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Old, slow GPU? The RX6900XT is 1 year old, and among the fastest desktop GPUs out there. Whilst not as mature as Nvidia's equivalent tech, it does support hardware ray tracing and advanced upscaling techniques. It's also a triple slot GPU with a massive heatsink. If Apple can match the performance of two of them with a 4-die SoC, it would be impressive. Having said that, if the resulting computer costs £10K, it would be less so.

And as you point out, faster cards are on the horizon. For businesses that can justify buying the fastest available GPUs (and perhaps four of them in one machine), being restricted to Apple's current fastest SoC, with no chance of upgrading, will be unacceptable. Especially if Apple only updates said SoC every couple of years, to help recoup the cost of creating a large, expensive chip for a niche machine.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Old, slow GPU? The RX6900XT is 1 year old, and among the fastest desktop GPUs out there.

Yes. The 6900XT is old. It's built on TSMC's 7nm process (with the 3080 built on Samsung's 8nm process!). That's 2018-era manufacturing. RDNA2 is also older than it looks: the PC launch was late because AMD had to reserve RDNA2 capacity for the Xbox and PS5. You're basing the 6900XT's age on its PC launch, but AMD had to finish the design and ramp production well before that. That also seems like a good explanation for why we never saw a 5800 or 5900.

Both AMD and Nvidia look poised to jump to a 5nm process early next year. It won't be quite as dramatic as Apple jumping to 5nm while Intel is sitting on 11/whatever they are calling it now, but it's still going to be a big jump in performance. And unless Apple is planning on going to 3nm with the Mac Pro (which is not expected), they're not going to be able to outmaneuver AMD and Nvidia just on manufacturing supremacy. AMD and Nvidia will be able to ship GPUs that are nearly or just as efficient, without being constrained by the thermal issues Apple will have from trying to shove everything onto one chip.

Basically, if Apple is going to go the SoC route on desktops, they need to make sure they are outpacing Nvidia and AMD in efficiency so they can make up for the thermal constraints they're putting on themselves. Because no one is going to care in a Mac Pro if Apple's GPU is 200 watts if it's clearly slower than a 300 watt AMD or Nvidia GPU.

So yeah. AMD launched the 6900XT last year. But it's pretty clearly the end of a series of GPUs. Not the beginning of a series.
 

Melbourne Park

macrumors 65816
...And unless Apple is planning on going to 3nm with the Mac Pro (which is not expected), they're not going to be able to outmaneuver AMD and Nvidia just on manufacturing supremacy. AMD and Nvidia will be able to ship GPUs that are nearly or just as efficient, without being constrained by the thermal issues Apple will have from trying to shove everything onto one chip.

Basically, if Apple is going to go the SoC route on desktops, they need to make sure they are outpacing Nvidia and AMD in efficiency so they can make up for the thermal constraints they're putting on themselves. Because no one is going to care in a Mac Pro if Apple's GPU is 200 watts if it's clearly slower than a 300 watt AMD or Nvidia GPU.
Apple are way in front of the thermal performance of AMD and NVidia. But you bring up a good point. Why would Apple incorporate PCIe slots for hot, power-hungry GPUs that would significantly increase the cost of Apple's new-architecture Mac Pros? Why would Apple incorporate bulky and expensive extra power and cooling technologies into an otherwise efficient Power Mac, for questionable productivity gains based on hugely expensive add-in GPUs that get superseded annually?
 
Last edited:
  • Like
Reactions: richinaus

Melbourne Park

macrumors 65816
...

Well OK, but if it's such a long list, it wouldn't be hard to give one or two examples.

It deserves another thread. Explaining ego-driven companies like Apple and the Windows players is like trying to give a logical explanation for a divorce, IMO. But there are lots of issues, including exclusivity rights, and companies which seem to me to be at war with each other.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
Future Mac "Pro" desktop lineup...?

Mac mini Pro
  • M1 Pro
  • M1 Max
  • M1 Max Duo

27" iMac Pro
  • M1 Pro
  • M1 Max
  • M1 Max Duo

32" iMac Pro
  • M2 Max Duo
  • M2 Max Quadra

Mac Pro Cube
  • M2 Max Duo
  • M2 Max Quadra

Final Intel Mac Pro tower...

Mac Pro (Intel)
  • New main logic board
  • Ice Lake Xeon CPUs
  • ECC DDR4 3200 RAM
  • PCIe Gen4 expansion slots
  • W7000-series MPX GPUs
 

richinaus

macrumors 68020
Oct 26, 2014
2,432
2,186
Future Mac "Pro" desktop lineup...?

Mac mini Pro
  • M1 Pro
  • M1 Max
  • M1 Max Duo

27" iMac Pro
  • M1 Pro
  • M1 Max
  • M1 Max Duo

32" iMac Pro
  • M2 Max Duo
  • M2 Max Quadra

Mac Pro Cube
  • M2 Max Duo
  • M2 Max Quadra

Final Intel Mac Pro tower...

Mac Pro (Intel)
  • New main logic board
  • Ice Lake Xeon CPUs
  • ECC DDR4 3200 RAM
  • PCIe Gen4 expansion slots
  • W7000-series MPX GPUs
The only one I am not sure of is the 32" iMac Pro. They won't do both this and a 27" iMac Pro.

Personally, all I want is a Mac Pro Cube with an M2 Max Quadra, 64GB RAM and a 27 / 32" 5K mini-LED monitor to match. Heaven.
 
  • Like
Reactions: Boil

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
The only one I am not sure of is the 32" iMac Pro. They won't do both this and a 27" iMac Pro.

Personally, all I want is a Mac Pro Cube with an M2 Max Quadra, 64GB RAM and a 27 / 32" 5K mini-LED monitor to match. Heaven.

I am hoping Apple will really simplify the Mac(intosh) naming schemes...

Mn-series models
  • MacBook
  • Mac mini
  • 24" iMac

Mn Pro / Max (Single / Duo / Quadra) models
  • MacBook Pro
  • Mac mini Pro
  • 27" iMac Pro
  • 32" iMac Pro
  • Mac Pro Cube

Apple could differentiate the M2 Max Duo / Quadra models by using higher density LPDDR5X; so we could see up to:
  • M2 Max Duo - 512GB RAM (1TB/s UMA)
  • M2 Max Quadra - 1TB RAM (2TB/s UMA)
I might be mistaken about the densities of the LPDDR5X chips, maybe half the capacities listed above, but still the higher UMA bandwidth...?
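For what it's worth, the bandwidth side of that guess roughly checks out: UMA bandwidth is just bus width (in bytes) × transfer rate. A rough Python sketch - the 512-bit bus and LPDDR5-6400 figures are the commonly reported M1 Max specs, while LPDDR5X-8533 for a future part is an assumption:

Code:
# Rough UMA bandwidth arithmetic: bus width (bits) / 8 * transfer rate (MT/s).
def bandwidth_gbs(bus_bits: int, mega_transfers: int) -> float:
    return bus_bits / 8 * mega_transfers / 1000  # GB/s

m1_max = bandwidth_gbs(512, 6400)     # ~410 GB/s, matching Apple's "up to 400GB/s"
lpddr5x = bandwidth_gbs(512, 8533)    # ~546 GB/s per die if LPDDR5X-8533 were used

print(f"M1 Max:         ~{m1_max:.0f} GB/s")
print(f"M1 Max Duo:     ~{2 * m1_max:.0f} GB/s")
print(f"LPDDR5X Duo:    ~{2 * lpddr5x / 1000:.1f} TB/s")
print(f"LPDDR5X Quadra: ~{4 * lpddr5x / 1000:.1f} TB/s")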

So a 27" iMac Pro with a M1 Max Duo would be the "low-end" model of iMac Pros, whereas the mid-to-high-end 32" iMac Pro would have the M2 Max Duo / Quadra SoCs in them, with much more RAM & higher UMA bandwidth...
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
AMD launched the 6900XT last year. But it's pretty clearly the end of a series of GPUs. Not the beginning of a series.
Point taken. I guess it seems weird to be talking about a generation being 'over' and 'obsolete', when seemingly few of these GPUs made it into people's hands.

Apple are way in front of the thermal performance of AMD and NVidia.
Yes, but as goMac pointed out, this is in large part due to the advantage of being on 5nm rather than 7 or 8. Also, the power consumption of silicon operates on a curve. Pushing for maximum performance uses disproportionately more power. But then no one needs a slow workstation that sips power - they might as well use a laptop. A Prius uses less petrol than a Chiron at 70mph, but that's not relevant if you want to be able to go 250.
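A rough way to picture that curve (the classic CMOS approximation, not a measurement of any real chip): dynamic power scales with capacitance × voltage² × frequency, and higher clocks generally need higher voltage, so power climbs much faster than performance near the top of the range. The voltage/frequency pairs below are invented purely for illustration.

Code:
# Dynamic power ~ C * V^2 * f (classic CMOS approximation).
# Hypothetical voltage/frequency operating points, for illustration only.
points = [(1.0, 0.80), (1.3, 0.90), (1.6, 1.05)]  # (relative clock, volts)

base_f, base_v = points[0]
for f, v in points:
    rel_power = (v ** 2 * f) / (base_v ** 2 * base_f)
    print(f"{f / base_f:.2f}x clock -> ~{rel_power:.2f}x dynamic power")
# 1.30x clock costs ~1.65x power; 1.60x clock costs ~2.76x power.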

Why would Apple incorporate PCIe slots for hot, power-hungry GPUs that would significantly increase the cost of Apple's new-architecture Mac Pros?
  • PCIe slots don't add much manufacturing cost - see the PC market.
  • Apple will happily pass on that cost to the customer anyway, with a large profit margin.
  • Power comes from the wall; it won't be running on batteries.
  • Large fans get rid of heat quietly.
  • If you want a compact machine, optimised for power efficiency, you can buy an iMac, Mac mini, or laptop.
Why would Apple incorporate bulky and expensive extra power and cooling technologies into an otherwise efficient Power Mac, for questionable productivity gains based on hugely expensive add-in GPUs that get superseded annually?
The fact that GPUs in particular get rapidly superseded is the entire point. This will also apply to the GPU portion of Apple's SoCs. Only unlike other workstations, you won't be able to upgrade this key part of the computer after a couple of years.

Personally, all I want is a Mac Pro Cube with an M2 Max Quadra, 64GB RAM and a 27 / 32" 5K mini-LED monitor to match. Heaven.
Sounds awesome, but how much would you be willing to pay? £8K?

Personally, I just want a £2K expandable Mac tower. As this will seemingly never happen again, the next best thing is a second-hand Mac Pro workstation. If Apple had continued to iterate on the cheese grater, I'd probably be using a ~2017 model right now. Unfortunately, they replaced the 5,1 with something barely faster than its (upgraded) predecessor, sat on it for 6 years, then replaced that with an incredibly expensive dead end.

AS Macs are good value, considering their performance and other qualities. An M1 Pro + a powerful (Nvidia) GPU would be ideal. But it's looking like the only option for decent GPU power will be some uber-expensive quad-M1 option with a massive overkill of CPU.
 
Last edited:
  • Like
Reactions: jinnyman

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Apple are way in front of the thermal performance of AMD and NVidia.

I think they're ahead on thermal performance but not ahead on performance performance. Like I said above, they didn't actually completely beat the previous gen MacBook Pro. Previous gen is ahead on FP16. Which is not what every workflow uses, but could be contributing to some of the variability in benchmarks.

On the Nvidia side, they claimed they had the performance of a 3070 or 3080, in a 3060-sized laptop package. It's seeming more like they have 3060 performance in a 3060 sized laptop.

Do they still have better thermals and power consumption than a 3060? Yeah, but not by that much. It's actually a bit worrisome that at 5 nm they aren't more ahead of Nvidia.

For CPU there is a more clear advantage. I also worry that's because Intel is so behind on their fabs and not because of their design or engineering talent. But Intel looks like they have a harder recovery ahead of them, so that's going to be less of a concern for Apple short term.
 

Joe The Dragon

macrumors 65816
Jul 26, 2006
1,031
524
Apple are way in front of the thermal performance of AMD and NVidia. But you bring up a good point. Why would Apple incorporate PCIe slots for hot, power-hungry GPUs that would significantly increase the cost of Apple's new-architecture Mac Pros? Why would Apple incorporate bulky and expensive extra power and cooling technologies into an otherwise efficient Power Mac, for questionable productivity gains based on hugely expensive add-in GPUs that get superseded annually?
Users also want non-Apple storage inside. Apple could fix that somewhat with real RAID choices, and a way to restore / manage it without needing a second system linked to the USB-C port.
 

mikas

macrumors 6502a
Sep 14, 2017
898
648
Finland
As we all probably know, the storage standard today is M.2 NVMe.

Have we seen standard M.2 NVMe as an off-the-shelf solution in Apple devices of any sort? Or as a standard, pluggable device in an Apple Mac Pro as of today? Nope, so I predict that's not gonna happen this time either. I would wish for it, though. But not gonna happen.

I believe they are gonna go all-in on every front, and on everything there is to it. It's gonna be only Apple, or nothing. They are gonna rule the world. They are gonna make the profit off all of it too.

They are doing it on every other front too, like games, entertainment like TV shows, maybe cars someday, and whatnot.

To me, this is annoying. I got to know them as a computer company when I was young. Now they are a Ted Lasso company. Or a music company. Maybe someday they'll be a car manufacturer too, who knows.

I'm sad to say that their real computer business seems to be in the backyard of their developments, though. The computers feel like an afterthought; they come a long way behind the continuous business models, subscription stuff and all.

Apple's computers have become a mere means to grow their subscription models and earnings to their ultimate limits. They want to tie us to subscriptions and yearly payments. That's a much more ludicrous 'achievement' than just selling you a computer you can use any way you want, and with any partners you want.

That's what all the other big players try to achieve too. All of them.
 