
Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
and will we see a bunch of fancy dedicated bits for faster video editing / rendering & digital audio DSPs and such?
We could. And it’d be implemented in hardware in a way that makes sense for Apple and their vision for the future of the Mac, not by whatever Intel’s able to squeak out incrementally.
The E's are still fairly fast. Ars dug into that somewhere, I'm still looking for the article, but it's like 1.7 GHz vs 2.4 GHz. E cores aren't slow, just clocked lower.
I remember reading that, too. The E’s now are as performant as the old P’s used to be.
 

leman

macrumors Core
Oct 14, 2008
19,530
19,708
The E's are still fairly fast. Ars dug into that somewhere, I'm still looking for the article, but it's like 1.7 GHz vs 2.4 GHz. E cores aren't slow, just clocked lower. They operate just like normal cores and will pick up foreground work no issue.

E cores are not just clocked lower, they also have fewer compute units.
 

teagls

macrumors regular
May 16, 2013
202
101
This isn’t surprising, though. Just like to get performance using NVIDIA solutions, you use proprietary NVIDIA API’s, to get performance using Apple solutions, you use proprietary Metal.

That's not entirely what I meant. Nvidia isn't required to get feature parity with more advanced ML. You can still run more advanced neural activations, convolutional layers, 3D convolutions, etc. that are optimized for x86 CPUs without having to manually implement them. Performance will vary and usually isn't too bad for inference.

But none of this is possible on iOS. You literally have to implement it all from scratch. None of it exists, and who the heck wants to do that? Nobody wants to reinvent the wheel. And if you do implement it just on the CPU, your end users are gonna be pissed. CoreML has the capability of a 1st grader. PyTorch & TensorFlow are like somebody in grad school.
 

Voyageur

macrumors 6502
Mar 22, 2019
262
243
Moscow, Russia
That's not entirely what I meant. Nvidia isn't required to get feature parity with more advanced ML. You can still run more advanced neural activations, convolutional layers, 3D convolutions, etc. that are optimized for x86 CPUs without having to manually implement them. Performance will vary and usually isn't too bad for inference.

But none of this is possible on iOS. You literally have to implement it all from scratch. None of it exists, and who the heck wants to do that? Nobody wants to reinvent the wheel. And if you do implement it just on the CPU, your end users are gonna be pissed. CoreML has the capability of a 1st grader. PyTorch & TensorFlow are like somebody in grad school.
Perhaps you are right, and perhaps you are not. All of this is just distant speculation; the truth we do not know, except that Apple has always positioned its Macs for entertainment too, and for games in particular. This is evidenced by the fact that the advertising pages for their new iMacs and MacBook Pros almost always show gameplay and game benchmarks.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
But none of this is possible on iOS. You literally have to implement it all from scratch. None of it exists and who the heck wants to do that. Nobody wants to reinvent the wheel.
Even with Intel or NVIDIA, there was a time when it DIDN’T exist, though. There were folks that had things done on Intel that “didn’t want to reinvent the wheel” with NVIDIA, but some enterprising folks did. And, fortunately, some made their work public for others to iterate on over the years.

I guess I’m saying that given a certain technology, add people with an interest in promoting and improving the technology, multiply that by time, and you’ll reach a level of maturity. Isn’t the only “magic” with Intel or NVIDIA just that they’ve been around longer? And, as with everything in tech, there will come a time when the low-end, least capable version of any technology will be more capable than today’s best.
 

teagls

macrumors regular
May 16, 2013
202
101
Perhaps you are right, and perhaps you are not. All of this is just distant speculation; the truth we do not know, except that Apple has always positioned its Macs for entertainment too, and for games in particular. This is evidenced by the fact that the advertising pages for their new iMacs and MacBook Pros almost always show gameplay and game benchmarks.

None of that really makes sense. How does benchmarking games relate to a lack of advanced ML capabilities? If anything, look at how, when Overwatch came out, Blizzard said they were unable to make a Mac version due to limitations in graphics technology.

Even with Intel or NVIDIA, there was a time when it DIDN’T exist, though. There were folks that had things done on Intel that “didn’t want to reinvent the wheel” with NVIDIA, but some enterprising folks did. And, fortunately, some made their work public for others to iterate on over the years.

I guess I’m saying that given a certain technology, add people with an interest in promoting and improving the technology, multiply that by time, and you’ll reach a level of maturity. Isn’t the only “magic” with Intel or NVIDIA just that they’ve been around longer? And, as with everything in tech, there will come a time when the low-end, least capable version of any technology will be more capable than today’s best.

Yes, fortunately they did make their work public, but Apple is a walled garden, with absolute control and little to no transparency. When the iPhone first came out, Steve Jobs was totally against the App Store and letting developers build native apps.

When the LiDAR iPad first came out, you couldn't even use the LiDAR camera directly and sample depth data. Probably after enough developers complained, they finally made it available in the newest iOS. For what it's worth, the depth access that eventually shipped looks roughly like the sketch below, written against the ARKit 4 APIs; the class name is just my own placeholder.
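```swift
import ARKit

// Sketch of the LiDAR depth access that eventually shipped in ARKit 4 (iOS 14):
// opt into the .sceneDepth frame semantics, then read the per-frame depth map.
// Earlier SDKs had no equivalent, which is the point being made above.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            return // no LiDAR on this device (or OS too old)
        }
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        // depthMap is a CVPixelBuffer of 32-bit float distances, in metres.
        let depthMap = depth.depthMap
        _ = CVPixelBufferGetWidth(depthMap)
    }
}
```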

Did you know you can't directly program the Neural Engine on the iPhone or iPad? In fact, there is no real direct way to know if your CoreML model will run on it other than trial and error, profiling and digging through Instruments. The trial and error basically boils down to something like the sketch below: time the same model with different MLModelConfiguration compute-unit settings and compare. The model file and the makeInput() helper are hypothetical placeholders.
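```swift
import CoreML
import Foundation

// The "trial and error" approach in a nutshell: run the same compiled model
// once restricted to the CPU and once with all compute units allowed
// (CPU, GPU, Neural Engine), then compare latency. A large gap suggests the
// ANE or GPU is actually being used; there is no public API that tells you
// directly. "MyModel.mlmodelc" and makeInput() are hypothetical placeholders.
func predictionTime(computeUnits: MLComputeUnits, modelURL: URL,
                    input: MLFeatureProvider) throws -> TimeInterval {
    let config = MLModelConfiguration()
    config.computeUnits = computeUnits        // .cpuOnly, .cpuAndGPU, or .all
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    let start = Date()
    _ = try model.prediction(from: input)
    return Date().timeIntervalSince(start)
}

do {
    let url = URL(fileURLWithPath: "MyModel.mlmodelc")  // hypothetical compiled model
    let input = makeInput()                             // hypothetical MLFeatureProvider
    let cpu = try predictionTime(computeUnits: .cpuOnly, modelURL: url, input: input)
    let all = try predictionTime(computeUnits: .all, modelURL: url, input: input)
    print("CPU only: \(cpu)s, all compute units: \(all)s")
} catch {
    print("Prediction failed: \(error)")
}
```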

How can you improve and advance that technology when it's closely guarded and inaccessible to developers? Apple has constantly fought against that; they have a strong history of it. Do Intel or Nvidia do anything like that?! God no...
 

Voyageur

macrumors 6502
Mar 22, 2019
262
243
Moscow, Russia
If anything, look at how, when Overwatch came out, Blizzard said they were unable to make a Mac version due to limitations in graphics technology.
Considering that Overwatch works well through Boot Camp, there is only one conclusion: Blizzard didn’t want to spend the time and resources on developing a Mac version.
 
  • Like
Reactions: Unregistered 4U

pasamio

macrumors 6502
Jan 22, 2020
356
297
None of that really makes sense. How does benchmarking games relate to a lack of advanced ML capabilities? If anything, look at how, when Overwatch came out, Blizzard said they were unable to make a Mac version due to limitations in graphics technology.



Yes, fortunately they did make their work public, but Apple is a walled garden, with absolute control and little to no transparency. When the iPhone first came out, Steve Jobs was totally against the App Store and letting developers build native apps.

When the LiDAR iPad first came out, you couldn't even use the LiDAR camera directly and sample depth data. Probably after enough developers complained, they finally made it available in the newest iOS.

Did you know you can't directly program the Neural Engine on the iPhone or iPad? In fact, there is no real direct way to know if your CoreML model will run on it other than trial and error, profiling and digging through Instruments.

How can you improve and advance that technology when it's closely guarded and inaccessible to developers? Apple has constantly fought against that; they have a strong history of it. Do Intel or Nvidia do anything like that?! God no...


When the iPhone first came out it didn't have copy and paste, it didn't have MMS, and it didn't have a lot of other features that we use every day. The first iPhone had a whole heap of limitations on it that were slowly pared away over time.

If I recall, one of the objections was that they didn't want to figure out how to police third-party app developers, a problem the company is still struggling with today.

I personally view Apple's approach at times as shipping a product and then adding features around it, the LiDAR is an example of that. There is always a tension between getting the software out and getting the hardware out whilst still keeping everything in sync and working. They did an early cut of the LiDAR functionality and then added more later. I don't see that as malicious but as prioritisation.

Intel's and NVIDIA's positioning in the market is as suppliers of components that are integrated into a broader computer offering. Apple's market is in shipping the final computing device. Intel and NVIDIA will always have a different incentive towards openness because they can't survive without it; Apple, as a fully integrated vendor, doesn't have the same pressure.
 

leman

macrumors Core
Oct 14, 2008
19,530
19,708
Considering that Overwatch works well through Boot Camp, there is only one conclusion: Blizzard didn’t want to spend the time and resources on developing a Mac version.

The quality of Mac graphics drivers and the terrible OpenGL developer experience would make it very difficult to deliver the smooth gameplay one needs for a competitive online shooter. Blizzard's decision is understandable.

The situation will improve dramatically with ARM Macs and Metal. They offer almost console-level control over the hardware, a common (thus predictable) GPU platform and very good software tooling. Only catch: porting will require significant effort since you have to reimplement your engine for a completely different GPU architecture to get best results. Then again, developing a game from scratch using modern Metal is often arguably simpler and quicker than using Vulkan or DX12. To give a sense of the tooling point, the basic Metal compute bring-up is only a handful of lines of Swift (a minimal sketch follows; the kernel name "addArrays" is just a placeholder), whereas the equivalent Vulkan setup typically runs to hundreds of lines. If these new Macs sell well and the game developers bite, after a couple of years we might very well get a solid gaming platform.
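```swift
import Metal

// Minimal Metal compute setup: device, command queue, and a pipeline built
// from a kernel in the app's default shader library. The kernel name
// "addArrays" is a hypothetical placeholder.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let kernel = library.makeFunction(name: "addArrays"),
      let pipeline = try? device.makeComputePipelineState(function: kernel),
      let commandBuffer = queue.makeCommandBuffer(),
      let encoder = commandBuffer.makeeComputeCommandEncoder() ?? commandBuffer.makeComputeCommandEncoder() else {
    fatalError("Metal setup failed")
}

encoder.setComputePipelineState(pipeline)
// ... bind buffers and dispatch threadgroups here ...
encoder.endEncoding()
commandBuffer.commit()
```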
 

ssgbryan

macrumors 65816
Jul 18, 2002
1,488
1,420
Do you think the entire line of Apple Silicon Macs will be stuck with 2 or 4 big cores like iPhones or iPads?

No one is claiming that the A12z itself beats the latest 8-core Intel processors, let alone the 16-core 3950X. The fact that Apple already has a big core that is good enough to hang with the very best Intel or AMD has to offer shows that Apple Silicon with 8, 12, or 16 cores will have very good potential to offer great performance while consuming less power than the best x86 processors with the same number of cores.

Of course, Apple can fail, just like any other company. But it is not like the Apple A series is a completely unknown quantity, as we have already seen its potential. I am honestly excited to see how it will perform when it is fed a sufficient amount of power.

There are a NUMBER of people in this very thread that are claiming that the A12z beats "Intel's best Silicon". And they still haven't shown a big core that is "good enough".

The reality is that the A12z (8 cores) will beat an 8th-generation (2 generations back) 4-core/4-thread i3. Not an i5, not an i7, certainly not an i9. And then there is AMD, who are curb-stomping Intel right now with a 12-20% uplift every 15 months. Zen 3 launches this fall; Zen 4 (design already completed) will launch in 2022: 5nm and 4-way SMT. Intel is going big/little with Golden Cove.

We have ONE Geekbench 5 score - and if one actually takes the time to look at all the sub scores, the 1st thing you notice is that the A12z doesn't have a lot of the ones that the x86 chips have.

Why do you think that is? Could it be that those sub-tests aren't being run because the ability isn't there?

This isn't about performance - it is about control.

As one blogger put it.....

This is about Apple putting a roof on the walled garden.

You aren't going to see cheaper computer prices. What you will see is most of your software transitioning to software as a service for Adobe, Microsoft, etc. Most, if not all, of it will be sold through the Apple Store; Timmy will want his cut. And that, of course, assumes that x86 OS X software is ported to ARM.

You will have no ability to upgrade your machine.

The only place your machine will be serviced at is an Apple shop. Timmy wants all of that money.

But don't worry.....

You will have Candy Crush for your desktop.

And the potential for an RDNA2 GPU (Linux drivers are showing a Navi 32 with Apple OS).
 

Kpjoslee

macrumors 6502
Sep 11, 2007
417
269
The reality is that the A12z (8 cores) will beat an 8th-generation (2 generations back) 4-core/4-thread i3. Not an i5, not an i7, certainly not an i9. And then there is AMD, who are curb-stomping Intel right now with a 12-20% uplift every 15 months. Zen 3 launches this fall; Zen 4 (design already completed) will launch in 2022: 5nm and 4-way SMT. Intel is going big/little with Golden Cove.

We have ONE Geekbench 5 score - and if one actually takes the time to look at all the sub scores, the 1st thing you notice is that the A12z doesn't have a lot of the ones that the x86 chips have.

The A12z is not going to be used in actual Apple Silicon Macs, just like the Pentium 4 3.6 GHz DTK wasn't representative of how Intel Macs performed when they switched to Intel in 2006. Stop using it as a reference point for how Apple Silicon Macs will perform in the future. The A12z is obviously missing some features due to being an iPad-only SoC, but that will definitely change with Apple Silicon on Macs, especially if they adopt SVE2, which would give it feature parity on SIMD with the latest x86 processors.

Apple will progress just like AMD and Intel will progress with future architectures. So withhold judgement until Apple unveils the full Apple Silicon lineup for future Macs.
 

ssgbryan

macrumors 65816
Jul 18, 2002
1,488
1,420
The A12z is not going to be used in actual Apple Silicon Macs, just like the Pentium 4 3.6 GHz DTK wasn't representative of how Intel Macs performed when they switched to Intel in 2006. Stop using it as a reference point for how Apple Silicon Macs will perform in the future. The A12z is obviously missing some features due to being an iPad-only SoC, but that will definitely change with Apple Silicon on Macs, especially if they adopt SVE2, which would give it feature parity on SIMD with the latest x86 processors.

Apple will progress just like AMD and Intel will progress with future architectures. So withhold judgement until Apple unveils the full Apple Silicon lineup for future Macs.

What data point should I use? Notice - data - not some rambling from the technically illiterate true believers.

I understand that the believers gotta believe, but I work off of data - and there is no "there" there.
 

Kpjoslee

macrumors 6502
Sep 11, 2007
417
269
What data point should I use?

Is Apple A12z going to be used on Apple Silicon Mac? No.
Do we have any data or benchmark on Apple Silicon that will be used on future Mac? No.

So, the answer is, there is no data you can use to prove your point, period.
Let me repeat: withhold judgement until Apple unveils the full Apple Silicon lineup for future Macs.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,464
958
Quality of Mac graphics drivers and the terrible OpenGL developer experience would make it very difficult to deliver smooth gameplay one needs for a competitive online shooter. Blizzards decision is understandable.

The situation will improve dramatically with ARM Macs and Metal.
The situation has already improved. I suppose Intel Macs and Xcode are now adequate for this game. IMO, the absence of Overwatch on the Mac is just a question of market share.
 
  • Like
Reactions: Voyageur

leman

macrumors Core
Oct 14, 2008
19,530
19,708
What data point should I use? Notice - data - not some rambling from the technically illiterate true believers.

We don't have any hard data, because the hardware in question does not publicly exist yet. But there is enough technical analysis and data out there on existing Apple CPUs. You are technically literate — extrapolate from that.

Here are some links to help you out:

- Excellent tech articles from Anandtech, showing that an iPhone 11 core running at 2.66 GHz (5W power draw) reaches 90% (or better) of the SPEC scores of a desktop 9900K core (4.8 GHz effective max turbo boost).

- Tests on SIMD-accelerated JSON parsing done by Daniel Lemire, where an A12 running at 2.5 GHz is within a 10% performance delta of an Intel Skylake core running at 3.7 GHz under the same conditions (128-bit SIMD).

- This redditor running Rust on an iPhone 11. A sample program was built from source in 1 minute 40 seconds. My i9-9980HK, limited to 4 jobs (to make the comparison fairer), can do the same in 1 minute 10 seconds. Considering that my laptop has much faster SSDs, fluctuations in other factors such as download speeds, and the fact that the iPhone is running at 5 watts while my laptop pulled 60 watts on average for that job, I'd say that's not too bad.

There is more info out there, just look around.

The reality is that the A12z (8 cores) will beat an 8th-generation (2 generations back) 4-core/4-thread i3. Not an i5, not an i7, certainly not an i9. And then there is AMD, who are curb-stomping Intel right now with a 12-20% uplift every 15 months. Zen 3 launches this fall; Zen 4 (design already completed) will launch in 2022: 5nm and 4-way SMT. Intel is going big/little with Golden Cove.

The reality is that a mobile phone core running at 2.6 GHz and consuming 5 watts trades blows with desktop CPUs running at much higher clocks and consuming 5x (a conservative estimate) the power.

That an iPad CPU with 4 performance cores cannot match the performance of an 8-core/16-thread i9 does not matter. They are not operating under equal conditions. We don't know how well Apple CPUs scale, but we have no reason to believe that they don't scale at all. With a mere 25% increase in clock and 8 P-cores, Apple Silicon will be at least as fast as anything Intel or AMD are currently shipping in the consumer segment while consuming less power than either of them. This is not a religious matter. This is engineering. To make that arithmetic explicit, here is a rough back-of-the-envelope sketch; the 90% figure is from the Anandtech data above, and the linear-scaling-with-clock assumption is mine, not a measured result.
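```swift
// Back-of-the-envelope projection. Assumes (optimistically) that single-core
// performance scales linearly with clock speed; the 0.90 ratio comes from the
// Anandtech SPEC comparison cited above. This is an illustration, not data.
let a13PerCoreVsDesktop = 0.90   // ~90% of a 9900K core, at 2.66 GHz / ~5 W
let clockBump = 1.25             // the "mere 25% increase in clock" above
let projected = a13PerCoreVsDesktop * clockBump
print("Projected per-core ratio vs. 9900K: \(projected)")  // 1.125
```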

I understand that the believers gotta believe, but I work off of data - and there is no "there" there.

Doesn't seem so. You are very eager to poke fun at others with your "believer" labels, but you are jumping to conclusions just as eagerly as anyone else in this thread. You are not a sceptic; your mind is already made up. So at least stop the "true believer" smack talk. It is condescending, hypocritical and, frankly, unbecoming of an intelligent individual such as yourself.
 

ssls6

macrumors 6502a
Feb 7, 2013
593
185
Once upon a time Unix ran on RISC processors and Windows ran on Intel. In those dark ages, Unix on RISC was a much faster way to go for computationally heavy loads. Pesky Intel, however, could innovate and move faster than the RISC CPU manufacturers, and RISC was left behind... Unix workstations on custom RISC CPUs died off when Linux on Intel hit the scene.

I have no doubt that A-series Macs will evolve into computational powerhouses and the harmonization will allow for some really unique and interesting products in the future.
 

Voyageur

macrumors 6502
Mar 22, 2019
262
243
Moscow, Russia
Has anyone here already said that such a transition means fewer customization options for the Mac Pro in the future? You can't change the processor with your own hands; these won't be on sale. The GPU also can't be changed, because it's part of the SoC.

So far, we have a return to a policy of closedness and total control, combined with Apple's inherent unpredictability. So far it's hard to call this a stable solution.
 

Boil

macrumors 68040
Oct 23, 2018
3,479
3,174
Stargate Command
Has anyone here already said that such a transition means fewer customization options for the Mac Pro in the future? You can't change the processor with your own hands; these won't be on sale. The GPU also can't be changed, because it's part of the SoC.

So far, we have a return to a policy of closedness and total control, combined with Apple's inherent unpredictability. So far it's hard to call this a stable solution.

Apple has yet to say if dedicated GPUs will still be a thing for Apple Silicon Mac Pros. I do not see why they would not be.
 

ssgbryan

macrumors 65816
Jul 18, 2002
1,488
1,420
Apple has yet to say if dedicated GPUs will still be a thing for Apple Silicon Mac Pros. I do not see why they would not be.

Navi 32 for Apple OS.
Has anyone here already said that such a transition means fewer customization options for the Mac Pro in the future? You can't change the processor with your own hands; these won't be on sale. The GPU also can't be changed, because it's part of the SoC.

So far, we have a return to a policy of closedness and total control, combined with Apple's inherent unpredictability. So far it's hard to call this a stable solution.

I did.

It has always been about control, not performance.

No longer will Timmy have to hear about how obsolete an Apple computer is compared to an AMD or Intel computer.

The Mac mini has an 8th-gen Intel processor; it didn't get a 9th gen, and it won't get a 10th gen.
 

G4DPII

macrumors 6502
Jun 8, 2015
401
544
Apple has yet to say if dedicated GPUs will still be a thing for Apple Silicon Mac Pros. I do not see why they would not be.

They said during the keynote that it would be Apple GPUs alone, along with Apple CPUs, certainly for the lower-end machines. Why give someone else the profit when they can grab it all themselves?

If people are still expecting to be able to use AMD GPUs in the long run, forget about it. They may compromise for the first-gen ARM Mac Pro, but after that, in 6 or 7 years' time, everything will be solely from Apple.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
Yes, fortunately they did make their work public, but Apple is a walled garden, with absolute control and little to no transparency. When the iPhone first came out, Steve Jobs was totally against the App Store and letting developers build native apps.

Did you know you can't directly program the Neural Engine on the iPhone or iPad? In fact, there is no real direct way to know if your CoreML model will run on it other than trial and error, profiling and digging through Instruments.
No, I didn’t. But, thanks to someone who is, fortunately, providing a handy how-to on how to most efficiently perform that trial and error, I wouldn’t have to figure it out on my own.

This was actually a little test for me. I wondered, “Could I quickly find out the details of your question and find someone who has posted a solution for free?” And I did. And there’s likely lots more going on in Apple’s Developer forums, because developers are working together, asking questions of Apple, to find solutions that enable on-device ML. As time goes on, more and more folks will just add to the knowledge. Even someone starting with CoreML today is going to have a better time than two years ago. That’s bound to improve over time.

You will have no ability to upgrade your machine.

The only place your machine will be serviced at is an Apple shop. Timmy wants all of that money.
This is no different from today for the majority of purchasers. :)
Apple will progress just like AMD and Intel will progress with future architectures. So withhold judgement until Apple unveils the full Apple Silicon lineup for future Macs.
I like how the thought goes “Apple can’t be Intel... see? AMD ARE BEATING INTEL!” Apparently only AMD and Intel can improve their designs year over year, and Apple, which improves its designs year over year, CAN’T improve its designs year over year. :)
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
...
If people are still expecting to be able to use AMD GPUs in the long run, forget about it. They may compromise for the first-gen ARM Mac Pro, but after that, in 6 or 7 years' time, everything will be solely from Apple.

Unless Apple drops out of the Mac Pro / iMac Pro space completely, that isn't very likely at all. With SoCs and cellular modems, Apple has bitten off a lot to "chew" here. Nvidia isn't going anywhere, AMD isn't, Intel isn't. However much Apple squeezes those three out of the Mac embedded/integrated GPU business, that will just deepen those three's focus on the discrete GPU market. If Apple pushes its competitors into a smaller space, there is a pretty good chance they'll get better at covering that smaller space.

Couple that with the fact that Apple just doesn't have the volume to support both a higher-than-average CPU die/package and a higher-than-average GPU die/package for an even smaller set of Macs that have discrete GPUs. So Apple can win at pushing more iGPUs in, but that actually means less traction over time for any possible discrete GPU move. (Selling GPUs only to yourself has problems if that vertical market is only a very small fraction of the overall market.)

As long as AMD (or Intel) is willing to execute reasonably well on contract semi-custom discrete GPUs to Apple's specs, they'll probably stay on that track. (Apple is no more going to dump them than Sony and Microsoft are going to dump AMD for their own GPUs in their consoles.)

Apple is still going to have to deal with external GPUs via Thunderbolt.


Will Apple push the discrete GPU out of the "entry" Mac Pro (with some mid-range iGPU)? That seems likely in 6-7 years. The relatively weak, low-display-stream-count W5500X they just did is indicative that not everyone needs heavyweight, barn-burner 3D graphics (similar to the 128GB SSD). A nominal rack system, an audio workstation, or an iOS/iPad/Mac test/integration node would all work with just a reasonably sized iGPU (once we get down to the 3nm range or lower).
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
So if the last Intel Macs are still being sold in 2022, and they're supported for, say, 4 years after that, you would still be on a current OS in 2026.

Sure, that's probably a conservative estimate. Though there may be sexy new features in macOS by then that depend on AS, which could tempt an earlier upgrade.

Single threaded performance matters enough. But what matters even more is efficiency.

In a laptop, sure, but it's not top priority for a huge box that's plugged into the mains... like a Mac Pro.

I stand here holding the bag.

I understand that you bought your Mac Pro as a depreciating business asset that you'd probably replace anyway in 5 years, but I think I'd be peeved too. For one thing, the resale value will have plummeted by then. Also, there was an implicit suggestion when it was launched, that this massively expandable and expensive machine would be a platform to be built on for years. To abandon its core architecture just six months later is pretty cheeky - given that they would have been planning the transition for at least a couple of years, and probably since the A4.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Has anyone already said here that such a transition means for the Mac Pro that in the future there will be fewer options for customization? You can’t change the processor with your own hands. These will not be on sale. The GPU also does not change, because it is a SoC.

There is nothing substantive to indicate that the PCIe slots are going away in a Mac Pro. You may simply not be able to get rid of all of Apple's GPUs (just like you can't eject them all from the laptops, Mini, or iMac), but that doesn't mean you can't add an additional one. You can add an additional one to the laptops, Mini, and iMac now; that will probably continue. The Mac Pro will too in the future, and it will be a bit easier, because it will probably still have internal PCIe slots and at least one MPX bay.

Apple didn't create the MPX bay design just to run away from it now. It may be modified a bit going forward, but some future variation will still be there.

Similarly, folks' digital audio and video capture/generation cards aren't going to disappear. Their 10-40GbE cards aren't going to disappear. Their 4x M.2 SSD cards aren't going to disappear. Etc., etc.


The "control" to kick out everything that Apple put in there when I bought it? That will change. You probably won't be able to completely 'nuke' Apple's design choices. But that doesn't mean they'll block additional, supplemental GPUs.
That nominal iGPU may just end up feeding video to the Thunderbolt ports, which, if folks simply ignore those as video-out ports, really doesn't impede much of anything. (And Apple was always going to bundle a GPU with the Mac Pro, so it's not like they were going to lower Mac Pro prices by not having one there at all. Apple was going to make you buy at least one of theirs.)


So far, we have a return to a policy of closedness and total control, combined with Apple's inherent unpredictability. So far it's hard to call this a stable solution.

If Apple rolls out an iMac Pro solution after 3 years (2017-2020), then a Mac Pro solution after 3 years (2019-2022), then another something in that range on another 3-year cycle, then it's not all that unpredictable.

They won't be matching Lenovo/HP/Dell/Boxx blow for blow, but they would be on a predictable path at a slower pace.
 