
Joelist

macrumors 6502
Jan 28, 2014
463
373
Illinois
Stop conflating ARM with Apple Silicon - all they have in common is the instruction set. Apple Silicon is far more performant than other ARM SoCs and has a design that is 100% Apple's own - it is NOT Cortex.
 

leman

macrumors Core
Oct 14, 2008
19,520
19,670
I get that Apple is having problems with Intel getting new processors out in a timely fashion, and that x86 chips are one big reason that battery life is stuck at around 6-10 hours for anyone’s brand of medium or better performance laptop. But I haven’t heard WHY or WHAT has changed in ARM processors that makes them attractive now to (computational) power users.

To add to what others have already said, I would like to point out a common confusion around energy efficiency. For example, it seems that you associate energy efficiency with longer battery life. But this doesn’t have to be the case. Energy efficiency = performance potential. Think about it this way: the latest and greatest Intel CPU core needs to consume 50 watts of power to barely outperform an iPhone 11. That’s ten times more power - for an almost identical end result. I don’t think it takes a big leap of faith to imagine how Apple CPUs could perform if they were not restricted to a phone enclosure. It’s not wishful thinking, it’s basic extrapolation from existing data.
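To make the extrapolation concrete, here is a back-of-envelope sketch using the round figures above (roughly 50 W vs. 5 W for a near-identical result; these are the post's approximations, not measurements):

```python
# Performance per watt, using the post's rough figures (not measurements):
# a ~50 W Intel core barely outperforming a ~5 W iPhone 11 (A13) core.
intel_power_w = 50.0
a13_power_w = 5.0

# With roughly identical results, the perf-per-watt gap is just the power ratio.
efficiency_advantage = intel_power_w / a13_power_w
print(efficiency_advantage)  # -> 10.0
```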

Apple CPUs are energy efficient because they can do much more work per watt than Intel CPUs. In fact, they are probably the most advanced consumer chips on the market. An Apple mobile chip has more transistors than a high-end Intel chip. They have more execution units, more cache, and they are better at utilizing the hardware efficiently. This means they can be run at much lower speeds to achieve the same result, which in turn means lower voltages and lower power consumption. Stack more of these cores together and put them in a less thermally restricted environment, and you will easily get a CPU able to compete with Intel's desktop offerings while consuming less power than Intel's mobile chips.
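The "wider but slower" point can be sketched with the classic dynamic-power model for CMOS logic, P ≈ C·V²·f: a bigger core at lower clock and voltage can match the throughput of a narrow core pushed to high clocks while burning far less power. The numbers below are purely illustrative, not real chip figures:

```python
# Classic dynamic-power model for CMOS logic: P ≈ C * V^2 * f.
# Purely illustrative numbers; real voltage/frequency curves vary per chip.
def dynamic_power(cap, volts, freq_ghz):
    return cap * volts ** 2 * freq_ghz

# Narrow core pushed to 4 GHz (high clocks need higher voltage)...
narrow = dynamic_power(cap=1.0, volts=1.2, freq_ghz=4.0)
# ...vs. a wider core (more transistors, so more capacitance) doing
# twice the work per cycle at 2 GHz and a lower voltage.
wide = dynamic_power(cap=1.5, volts=0.8, freq_ghz=2.0)

print(round(narrow / wide, 2))  # -> 3.0: same throughput, ~1/3 the power
```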
 

Yebubbleman

macrumors 603
May 20, 2010
6,024
2,616
Los Angeles, CA
Again, you say Apple chips will be 'faster', based on xBench and Geekbench. You also add that the performance gain will be 'INSANE'. Based on what metric do you get INSANE?

Based on the fact that Intel is still milking the Skylake cow from 2016, while the iPad Pro from 2020 is leaps and bounds faster than the iPad Pros available in 2016. That shows you the rate of performance increase happening in Apple Silicon. You are NOT getting anywhere near that kind of growth out of Intel's processors (which is why AMD is currently wiping the floor with them). Do some research.

So I take issue with Geekbench and xBench, but you reference them again - and what 'INSANE' leap are you looking at? From what I can tell, some Apple chips may be a tad faster in single-core, and then somewhat slower in multi-core (using those benchmarks I take issue with anyway).

An iPad Pro chip with passive cooling (if any) is outperforming (not just outbenching) everything that isn't an 8-core 15" or 16" MacBook Pro, an 8-core or 10-core 27" iMac, an iMac Pro, or a Mac Pro - and you don't think that's insane? Furthermore, with the aforementioned rate of growth within their SoC platforms, you don't see them (likely within the two-year transition period) outclassing the machines said iPad Pro currently cannot? Hell, Apple demoed the 2020 iPad Pro CPU in a developer kit editing three simultaneous 4K streams effortlessly in Final Cut Pro X. Even if you're not an FCPX fan, the capability to do that IS THERE. And it has been demoed outside of the benchmarks you take so much issue with.

What else do you need to convince you that this will result in better performing systems?

The move to a 3nm process will help Apple chips a little, but it won't be 'INSANE'. And then I ask you, how long can TSMC keep shrinking something that's already down to 3nm? I guess we'll find out.


Considering Intel won't make it to 7nm anytime soon, I'd say that this is still a vast improvement for however long it lasts.

And no, Turbo Boost does not sustain when all cores are being pushed for more than a few minutes, and it certainly won't sustain in a thin Apple product that's already challenged to dissipate heat. At least, that's my opinion.

You just stated a fact and then claimed it was opinion. The whole point behind Turbo-Boost is to be able to sustain. It's not Turbo-Burst. It's Turbo-Boost. If you have an app only designed to use 2 cores and you're rocking an 8-core processor, it shuts off the other cores and scales up to the thermal limits of the CPU. That's its entire point of existing. If it can only do so for short periods of time, what's the point?

I agree that Apple's thermal envelopes are needlessly thin. But, at least as far as the CURRENT high-end Macs go, this isn't the kind of problem that it was with every touchbar 15" MacBook Pro.

In any case, again, why are you worried about Turbo-Boost when most video apps (certainly all of the big-name ones) are multi-threaded and multi-core aware anyway? Furthermore, as long as developers optimize around Apple's technologies - Grand Central Dispatch especially - (which they are supposed to do ANYWAY), any concerns about the need for Turbo-Boost go out the window entirely.
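For what it's worth, GCD's concurrent dispatch queues boil down to "hand the runtime independent work items and let it spread them across the cores". A rough Python stand-in for the idea (a thread-pool analogy for illustration, not actual Grand Central Dispatch):

```python
# Rough analogy to dispatching work items onto a GCD concurrent queue:
# submit independent chunks and let the pool schedule them. (Python
# thread-pool stand-in for illustration, not real Grand Central Dispatch.)
from concurrent.futures import ThreadPoolExecutor

def process_chunk(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))  # stand-in for heavy work

chunks = [(0, 250_000), (250_000, 500_000),
          (500_000, 750_000), (750_000, 1_000_000)]

with ThreadPoolExecutor(max_workers=4) as pool:
    parallel_total = sum(pool.map(process_chunk, chunks))

serial_total = sum(i * i for i in range(1_000_000))
print(parallel_total == serial_total)  # -> True
```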

Like I've said, I'll be real interested to see what performance ends up being when all the cores are pushed pretty hard ... given what I do, that's the only situation where all that extra supposed 'power' will matter. If it can't deliver that, then all your 2-minute benchmarking tools are completely useless to me.

Given what you do, and the tools you use to do them with, you're going to find that, so long as your developers update their apps, performance will be far better than what you can currently get from Intel. All signs are pointing to this. It's cool to be skeptical of them, but that doesn't change the fact that they're there.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,138
1,899
Anchorage, AK
They’re not the same code base so your argument is silly. The only fact here is that we saw a pre-canned demo of these builds. Nothing more. That is a fact.

Once again, you reject facts in order to further your brand of ignorance and misinformation. From Microsoft's own website:

Our most-popular Office 365 apps—Excel, PowerPoint, and Word—are designed for the modern workplace, with cutting-edge features like real-time co-authoring, AutoSave, and more. With our newest version of Office for Mac, version 16.9.0, we've extended these capabilities to Apple users; in fact, this release marks the first time in 20 years that Office shares the same codebase across Windows, Mac, iOS, and Android for core functionalities.

Note that Microsoft also includes iOS and Android in that statement, so yes, there are commonalities between Office across all four platforms.

 

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
Oh, video rendering will probably be amazing, considering that Apple has specialized chips for video rendering even on their phones (why else do you think the iPhone can get that 4K, 60 FPS recording reliably?).

The problem is that video rendering is only a small fraction of the whole market. Unless performance with Apple Silicon is very extraordinary (and not "just" 20% to 30% faster), I don't see how the average user will switch from x86 to Apple Silicon. And I really don't think it is, or else Apple would be gloating over it from the start.

I don't think that Apple's strategy is necessarily to make existing Intel Mac (or Windows) users immediately switch to Apple Silicon with amazing performance boosts of 200-300%. They just want to attract people who were looking to upgrade their computer anyway within the next year. Considering Intel's year-on-year improvements have been quite unimpressive for the most part (10-15%?), if Apple offered a 30% improvement with much better battery life, wrapped in a nice design, at a reasonable price, then it would attract a lot of people.

Personally I'm not expecting fireworks with Apple Silicon. At a guess, up to 30% faster than the equivalent Intel model Mac it replaces, with 40-50% better battery life. Some applications (e.g. video) may show more significant improvements due to the use of specialized Apple Silicon features, and these are the ones that Apple will highlight.
 
  • Like
Reactions: Roode

bill-p

macrumors 68030
Jul 23, 2011
2,929
1,589
...at a reasonable price...

Sorry for quoting out of context. But I'm not going to count on this. It's Apple. New Apple Silicon Macs will most likely keep the same price point, or raise it even higher.

The iPad Air just got a $100 price boost after all.
 
  • Like
Reactions: filu_

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
Sorry for quoting out of context. But I'm not going to count on this. It's Apple. New Apple Silicon Macs will most likely keep the same price point, or raise it even higher.

The iPad Air just got a $100 price boost after all.

Yes, I wouldn't expect a price reduction either. By "reasonable price", I meant in comparison to other Macs or Intel machines in the same performance ballpark.
 

Maximara

macrumors 68000
Jun 16, 2008
1,707
908
A vision of what the future 'should' be in some people's eyes often doesn't really make sense in the real world.

I'm sure the people who watched Metropolis in the 1930's thought it absolutely made sense that eventually people would commute around town in propeller planes. Just like how you feel a future without ports makes perfect sense.

I'm not saying you're necessarily wrong, I'm just saying you need to question whether you're following what makes true sense in the real world versus how the current trajectory of things seems to be guiding your train of thought.

Make sense?
Not really.

Metropolis (1927) was actually harkening back to steampunk concepts such as those seen in Paris in the 20th Century (1863), which was surprisingly accurate when compared to what actually appeared in the 1960s it predicted. Ironically, the publisher Verne sent it to deemed it "too fantastic", and so it sat unpublished until 1994.

The 1930s were Dieselpunk, and so Metropolis was more a nostalgia trip into a future that anyone with any actual knowledge knew was unfeasible given how chaotic driving a car could be. Compare that to the decidedly Dieselpunk Things to Come (1936), which predicted a World War where cities were effectively bombed back into the stone age as power-crazed leaders continued to fight with what they had left. The ironic thing is that to some degree the film (and the 1933 novel it was based on) was overly pessimistic, as it predicted a conventional WWII lasting until civilization was effectively destroyed. (Wells's atomic bomb appeared in The World Set Free... in 1914, and he predicted the discovery of atomic power in the 1930s.)

As for a future without ports making perfect sense ...not really. External devices need power and so a cable connection providing that power (with a wall outlet supplement) makes far more sense.

You know what doesn't make sense that actually happens? Pre-ordering a computer game based on what was shown in the ads or E3. 'Can you say Fallout 76, neighbor?' :p :)

Sorry for quoting out of context. But I'm not going to count on this. It's Apple. New Apple Silicon Macs will most likely keep the same price point, or raise it even higher.

The iPad Air just got a $100 price boost after all.

I looked up the specs and this comparison is such a non sequitur that I can't understand why it is brought up:

"At $599, the iPad Air 4 is a notable $100 more than its predecessor, but Apple seems to have made it more than worth it. The much-faster A14 Bionic chip and super-bright Liquid Retina display should make for much improved performance in demanding apps and even better Netflix binge-watching.

Even better, though, is the iPad Air 4's support for the optional Apple Pencil 2 and Magic Keyboard are big boons for folks not looking to spend all the money it takes to get the iPad Pro. This mid-range iPad may be the Goldilocks' pick, for offering just enough for a great experience while not costing too much."

A newer CPU in the same line always costs more than the old one. The difference with the Mac is that Apple is not going to have to cough up a minimum of $400 per CPU to Intel for something that heat-throttles. As I said before, there are already rumors of the new ARM Mac being ~$400 cheaper than its Intel equivalent.
 
Last edited:

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
I don't think that Apple's strategy is necessarily to make existing Intel Mac (or Windows) users immediately switch to Apple Silicon with amazing performance boosts of 200-300%. They just want to attract people who were looking to upgrade their computer anyway within the next year. Considering Intel's year-on-year improvements have been quite unimpressive for the most part (10-15%?), if Apple offered a 30% improvement with much better battery life, wrapped in a nice design, at a reasonable price, then it would attract a lot of people.

Personally I'm not expecting fireworks with Apple Silicon. At a guess, up to 30% faster than the equivalent Intel model Mac it replaces, with 40-50% better battery life. Some applications (e.g. video) may show more significant improvements due to the use of specialized Apple Silicon features, and these are the ones that Apple will highlight.


If you're offering "just" a 30% improvement in exchange for dropping the ability to run x86 applications (including Windows / Boot Camp), this could even scare off new users. Sure enough, 30% right now seems a big deal. However, Intel and AMD refresh their lines at a much faster rate, so that "advantage" quickly vanishes.

Especially considering Apple is threatening to surpass them, I don't think they'll just stand still...
 

leman

macrumors Core
Oct 14, 2008
19,520
19,670
Sure enough, 30% right now seems a big deal. However, Intel and AMD refresh their lines at a much faster rate, so that "advantage" quickly vanishes.

Especially considering Apple is threatening to surpass them, I don't think they'll just stand still...

Apple has a new CPU/GPU every year. Intel just took almost six years to release a new CPU, and they are barely able to beat last year’s Apple architecture while consuming four times more power... do you really think they will be able to come out with a completely new, superior design overnight?
 

smoking monkey

macrumors 68020
Mar 5, 2008
2,363
1,508
I HUNGER
Personally I'm not expecting fireworks with Apple Silicon. At a guess, up to 30% faster than the equivalent Intel model Mac it replaces, with 40-50% better battery life.

I'd call that pretty significant! What's not to get excited about? This is huge for laptops. Less power usage, cooler, longer battery life, quieter, and more than likely faster than what we've got now for general use. Honestly, this is a game changer for laptops.
 
  • Like
Reactions: Roode

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
Apple has a new CPU/GPU every year. Intel just took almost six years to release a new CPU, and they are barely able to beat last year’s Apple architecture while consuming four times more power... do you really think they will be able to come out with a completely new, superior design overnight?

Not really, but no one expects a good GPU from Intel anyway. You can just pair it up with Nvidia and/or AMD. As long as the processors are faster, it doesn't matter.
 

leman

macrumors Core
Oct 14, 2008
19,520
19,670
Not really, but no one expects a good GPU from Intel anyway. You can just pair it up with Nvidia and/or AMD. As long as the processors are faster, it doesn't matter.

I am talking about the processor (CPU). Check out the Anandtech review of Intel's newest Tiger Lake

[image: Anandtech benchmark chart]

Here you can see a Tiger Lake in its 28W configuration barely outperforming the A13 in last year's iPhones. Anandtech's Andrei Frumusanu later confirmed that the Intel CPU was drawing around 20 watts to achieve this result. The A13 needs 5 watts. What's more, if you restrict the Tiger Lake to the 15W profile (like most ultrabooks would), the performance is the same as the A13's. Again, we are comparing a laptop CPU to a mobile phone CPU here!

Intel's upcoming H-series Tiger Lake might be able to push the peak performance by an additional 10-15% (I find it unlikely that they will be able to clock it higher than 5.3 GHz). The CPU will end up drawing over 30 watts in that scenario. Apple's A14 is already supposed to be around 15-20% faster than the A13 — and that is still within the limitations of a mobile phone. What happens if Apple decides to allow their CPU cores to go to 15 watts per core instead of 5? They should be able to get at least 10-20% more peak performance out of it. I think you are being overly optimistic in assuming that Intel will be able to easily match that. They would need to start from scratch completely to match Apple's performance per watt.
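Putting the approximate figures from above side by side (again, these are the rough numbers quoted in the post, not lab measurements):

```python
# Rough perf-per-watt comparison from the figures above (approximate).
tiger_lake_watts = 20.0  # reported draw for the benchmark result
a13_watts = 5.0          # approximate A13 draw for a similar result

# Near-identical scores means the efficiency gap is just the power ratio.
gap = tiger_lake_watts / a13_watts
print(gap)  # -> 4.0
```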

By the way, since you mention the GPU... this is the area where Intel has made very significant progress. The Tiger Lake GPU is actually faster than the iPad Pro's.
 
  • Like
Reactions: lysingur and throAU

Maximara

macrumors 68000
Jun 16, 2008
1,707
908
I am talking about the processor (CPU). Check out the Anandtech review of Intel's newest Tiger Lake

[image: Anandtech benchmark chart]

Here you can see a Tiger Lake in its 28W configuration barely outperforming the A13 in last year's iPhones. Anandtech's Andrei Frumusanu later confirmed that the Intel CPU was drawing around 20 watts to achieve this result. The A13 needs 5 watts. What's more, if you restrict the Tiger Lake to the 15W profile (like most ultrabooks would), the performance is the same as the A13's. Again, we are comparing a laptop CPU to a mobile phone CPU here!

Intel's upcoming H-series Tiger Lake might be able to push the peak performance by an additional 10-15% (I find it unlikely that they will be able to clock it higher than 5.3 GHz). The CPU will end up drawing over 30 watts in that scenario. Apple's A14 is already supposed to be around 15-20% faster than the A13 — and that is still within the limitations of a mobile phone. What happens if Apple decides to allow their CPU cores to go to 15 watts per core instead of 5? They should be able to get at least 10-20% more peak performance out of it. I think you are being overly optimistic in assuming that Intel will be able to easily match that. They would need to start from scratch completely to match Apple's performance per watt.

By the way, since you mention the GPU... this is the area where Intel has made very significant progress. The Tiger Lake GPU is actually faster than the iPad Pro's.

We have to remember the A13 has a built-in GPU:

So the A13 has, for 5 watts:
  • CPU: 6 cores (ARM big.LITTLE: 2 "big" Lightning + 4 "little" Thunder)
  • GPU: Apple-designed 4-core
So that 5 watts covers the GPU as well. Sure, the Tiger Lake GPU looks better... until you remember it is using at least 3 times the power to get that performance.

More power = more heat + less battery life.
 

leman

macrumors Core
Oct 14, 2008
19,520
19,670
So that 5 watts covers the GPU as well. Sure, the Tiger Lake GPU looks better... until you remember it is using at least 3 times the power to get that performance.

It's a bit more complicated than that. First of all, we don't have exact performance figures yet since the reviewers only had the Intel-provided platform for a couple of days. I've seen some impressive 3DMark scores, but no analysis of the performance itself. Second, the combined CPU+GPU power draw of the Tiger Lake is limited to 28 watts in sustained scenarios. Third, I was actually talking about the A12Z (8 GPU cores), not the A13. Since Apple does not publish TDP figures for its chips (in fact, it's not even clear if their chips have a TDP at all), it is a bit difficult to make comparisons. Anandtech's review of the 2018 iPad Pro states that the entire tablet uses less than 10 watts while running high-demand graphical workloads, but that the performance is also slightly degraded after the initial strong peak.

All in all, we will have to wait until Apple Silicon Macs come out; it will surely be exciting to take an in-depth look. So far, I have the following impression:

- I am not sure whether an 8-core A14 GPU will manage to outperform the Tiger Lake GPU at 28 watts
- I believe Apple GPUs have at least a 2x performance-per-watt advantage compared to Intel at this point

If Apple ships a 16-core A14 GPU part with its new MBP, it is definitely going to leave Tiger Lake in the dust. It will probably actually get close to the 5500M.

More power = more heat + less battery life.

As a power user, I would prefer Apple Silicon Macs to use a bit less power but deliver substantially higher performance :)
 
  • Like
Reactions: richinaus

d7d19285f0bd48

macrumors member
Sep 13, 2020
36
37
If you're offering "just" a 30% improvement in exchange for dropping the ability to run x86 applications (including Windows / Boot Camp), this could even scare off new users. Sure enough, 30% right now seems a big deal. However, Intel and AMD refresh their lines at a much faster rate, so that "advantage" quickly vanishes.

Intel & AMD have been stagnant for a decade and are trying to bridge the gap by throwing more cores at the problem instead of working on increasing performance/efficiency.
 
  • Like
Reactions: throAU and Maximara

Nicole1980

Suspended
Mar 19, 2010
696
1,551
Intel & AMD have been stagnant for a decade and are trying to bridge the gap by throwing more cores at the problem instead of working on increasing performance/efficiency.

The guy right above you talks about how a 16-core Apple Silicon GPU would be right there with the AMD 5500M. That would be a lot of cores that Apple is 'throwing' at the problem, right?
 

raknor

macrumors regular
Sep 11, 2020
136
150
The guy right above you talks about how a 16-core Apple Silicon GPU would be right there with the AMD 5500M. That would be a lot of cores that Apple is 'throwing' at the problem, right?

You seem to be confusing CPUs and GPUs. Throwing cores at the problem seems to be how vendors make GPUs faster: Intel's Ice Lake G7 has 64 EUs, the Tiger Lake G7 has 96 EUs, and before that the Iris Plus versions topped out at 48 EUs through Gen 9.

Nvidia's newest RTX 30 series doubled the cores compared to the RTX 20 series.

The person you responded to, and the thread, were talking about CPUs, especially since x86 is explicitly mentioned.
 

leman

macrumors Core
Oct 14, 2008
19,520
19,670
The guy right above you talks about how a 16-core Apple Silicon GPU would be right there with the AMD 5500M. That would be a lot of cores that Apple is 'throwing' at the problem, right?

You don’t know much about how computers work, do you? Maybe this would be a good opportunity to learn some things instead of making jokes about a topic you clearly don’t understand.

Radeon 5500M Pro is a 24 core design by the way.
 

Nicole1980

Suspended
Mar 19, 2010
696
1,551
You don’t know much about how computers work, do you? Maybe this would be a good opportunity to learn some things instead of making jokes about a topic you clearly don’t understand.

Radeon 5500M Pro is a 24 core design by the way.

Well, the guy accused Intel of 'throwing more cores at the problem' with their CPUs as if it's a cop-out by Intel. So, directly to my point, why isn't it considered a cop-out when AMD uses 24 cores for a GPU?
 

Maximara

macrumors 68000
Jun 16, 2008
1,707
908
As a power user, I would prefer Apple Silicon Macs to use a bit less power but deliver substantially higher performance :)
With the iMacs and MacBooks, this goes hand in glove. One of the problems right now is heat throttling, as Apple designed for chips that Intel said would be ready... and then weren't. As a result, Apple had to go with chips that had higher heat budgets than they designed for.
 

leman

macrumors Core
Oct 14, 2008
19,520
19,670
Well, the guy accused Intel of 'throwing more cores at the problem' with their CPUs as if it's a cop-out by Intel. So, directly to my point, why isn't it considered a cop-out when AMD uses 24 cores for a GPU?

Because GPUs are - by design - massively parallel processors. They are built to run a lot of smaller tasks (many thousands or more) that can be easily split across multiple cores. CPUs, in contrast, need to run complex tasks that are not necessarily parallel. In modern computing, where multi-core CPUs are the norm, software will take advantage of them, but a lot of things are still limited by single-core performance. These considerations are the foundation of all the differences between CPUs and GPUs.
 

Falhófnir

macrumors 603
Aug 19, 2017
6,146
7,001
well the guy accused intel of 'throwing more cores at the problem' with their cpu's as if its a cop out by intel. So directly to my point, why isnt it considered a cop out when AMD uses 24 cores for a gpu?
Graphics computations can often be parallelised more successfully/usefully than CPU tasks can, which makes multi-core performance more important on a GPU, while what the CPU is doing generally benefits more from each core being more powerful past a certain point. Previously this point was around quad core, more recently 6-8 cores as software has evolved to take advantage of more cores, though again it's diminishing returns, as there's only so much parallelisation that can usually be written into software.
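The diminishing returns from piling on CPU cores are usually framed as Amdahl's law: if only a fraction p of a task parallelises, the speedup is capped at 1/(1-p) no matter how many cores you add. A quick sketch:

```python
# Amdahl's law: speedup on n cores when a fraction p of the work
# parallelises. Shows why extra CPU cores hit diminishing returns.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# With 90% parallel code, going from 4 to 8 cores buys much less than 1 -> 4:
print(round(amdahl_speedup(0.9, 4), 2))    # -> 3.08
print(round(amdahl_speedup(0.9, 8), 2))    # -> 4.71
print(round(amdahl_speedup(0.9, 1000), 2)) # approaches the 10x ceiling
```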
 