
johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
Two quotes that show why I am nervous about Apple Silicone. Like the posters I quoted I don’t know either and Apple isn’t saying what a minimum expectation of performance should be. Better? Worse? The same? All of the above?

When they announced the new iMac Pro it was done at WWDC in June. They showed early versions and gave out performance figures, all 6 months before they were available.

Right now, no one outside of Apple can speak authoritatively on what Apple Silicone can and can’t do compared with its current lineup of Intel based computers, or what features or abilities will be lost because of the change. Silence isn’t golden.

I imagine that we will find out within the next 5-6 weeks.

BTW, it's "silicon" not "silicone". Silicone is a polymer used to make breast implants and sealants. Silicon is a semi-conducting element used in the manufacture of integrated circuits.
 

robvas

macrumors 68040
Mar 29, 2009
3,240
630
USA
You seem to think ARM is inherently inferior - it's really not, and there are plenty of CPU manufacturers churning out 64-128 core CPUs for servers that outperform Intel & AMD CPUs with less power and lower cost.
Apple doesn’t make servers. Massively parallel systems don’t make much sense for 90% of users.
 

Yurk

macrumors member
Apr 30, 2019
75
90
The problem is not that ARM is inferior. It is power efficient, and that's great for laptops. The problem is that Apple's motivation is to turn the Mac into a big iPhone and milk customers with the 30% fee on all App Store applications. They could not care less about scientists or compatibility, so they blocked Nvidia, CUDA, Boot Camp, 32-bit applications, etc. It's very sad, because Steve Jobs wanted the Mac to be the best tool for scientists in academia, and was very successful, but Tim Cook is moving further and further away from that paradigm, converting the Mac into a Fisher-Price laptop with an Apple App Store. It's a sad direction Apple is going, and the Apple logo is no longer shining on the Mac.
 
Last edited:

pianojoe

macrumors 6502
Jul 5, 2001
461
26
N 49.50121 E008.54558
Given the speed of current fanless iPad Pros, just imagine what the current line of Apple silicon could do with proper wattage and cooling. I’m confident we’ll see a tremendous performance gain.
 
  • Like
Reactions: ader42

throAU

macrumors G3
Feb 13, 2012
9,198
7,349
Perth, Western Australia
But iPhones and non-Pro iPads don't run at sustained high speeds. (I don't own an iPad Pro or know anyone who does, so I'm not going to assume what it can do or what its weaknesses are.) Judging by the software, though, it doesn't work as well as the same software made for a MacBook Pro or an iMac. I am thinking specifically of Adobe Lightroom and Photoshop. It takes longer to do some edits, and there are some features that don't work well. Not all of that is the CPU; some was the lack of a file system on iPads, which has been fixed. But even on things both my iMac version and my iPad version can do, the iMac is a lot faster, although my iPad is a lot newer.

Like for like software, the iPad runs better than the MBA - I've run the three things below on both, back to back:

  • Star Wars: Knights of the Old Republic
  • Neverwinter Nights
  • Swift Playgrounds
All of them make the MBA sound like a freaking jet turbine, whilst the 3-year-old iPad Pro runs them without breaking a sweat at the same or faster frame rate, and at the same or higher resolution (in the case of the games, I've dropped the MBA down to 1280x800; the iPad display is 2224x1668).
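For anyone who wants that resolution gap in numbers, here's the trivial arithmetic on the two pixel counts quoted above (nothing measured, just the raw figures):

```python
# Pixel counts for the two resolutions mentioned above.
mba_pixels = 1280 * 800      # 1,024,000 pixels
ipad_pixels = 2224 * 1668    # 3,709,632 pixels
print(f"iPad pushes {ipad_pixels / mba_pixels:.1f}x the pixels per frame")
# -> roughly 3.6x
```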

Sure, that's not every application out there, but it's also an embarrassment for Intel that a 2020 i7 13" laptop CPU gets made to look stupid by an iPad processor from 3 years ago in ANY software, when the iPad was less than half the price, runs at 12-15 watts (total system power) instead of 30, and has no cooling. Especially when you consider that two of the above (the games) were originally written for the PC architecture :D
 
Last edited:
  • Like
Reactions: ader42

ght56

macrumors 6502a
Aug 31, 2020
839
815
I'm looking forward to them in a few years after they hammer out all the bugs and any hardware issues. Hopefully they will be better than a netbook...I mean, my iPhone is already like a gazillion times faster than my 15-inch work laptop, so I figure they should be pretty fast.
 
  • Like
Reactions: throAU

bousozoku

Moderator emeritus
Jun 25, 2002
16,120
2,397
Lard
High Sierra 10.13.4 blocked TB1 and TB2 support for eGPUs, forcing users to upgrade to Macs with TB3.
Mojave blocked Nvidia eGPUs and CUDA.
Catalina blocked 32-bit apps.
Big Sur on ARM dropped the last remaining way of using an Nvidia eGPU and CUDA, i.e. Boot Camp.
The Mac mini cannot be downgraded below Mojave, and the 16" MacBook Pro cannot be downgraded below Catalina (and has serious noise / thermal / power-throttling defects). That's the final nail in the coffin.

I can no longer use my $3300 Nvidia Titan V eGPU. All serious machine learning or HPC developers need CUDA. MXNet needs CUDA. Wolfram Mathematica's machine learning needs CUDA. The performance of matrix multiplication on ARM is laughable compared to Intel processors. Intel's MKL libraries are heavily optimized, much more than the ARM or AMD BLAS libraries. Intel is not 3rd; they are 1st in HPC performance, because they back their hardware with proper software library support. Intel knows that, and that is why they are still #1 in CPU market share. Nvidia gets that. AMD does not. Apple drops valuable functionality with every update and calls it an 'upgrade'. Their new MacBook will be an iPhone with a keyboard.
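To make the MKL point concrete, the usual quick check is timing a dense matrix multiplication through NumPy and noting which BLAS the build is linked against. A rough sketch, not a rigorous benchmark (it assumes a NumPy build linked against MKL on the Intel box and whatever default BLAS the ARM build ships with):

```python
# Rough dense-matmul timing; the achieved GFLOPS depends almost entirely
# on which BLAS (MKL, OpenBLAS, Accelerate, ...) NumPy is linked against.
import time
import numpy as np

# np.show_config() prints the BLAS/LAPACK backend this build uses.
n = 4096
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

gflops = 2 * n**3 / elapsed / 1e9   # ~2*n^3 FLOPs for a dense matmul
print(f"{n}x{n} float32 matmul: {elapsed:.2f}s, ~{gflops:.0f} GFLOPS")
```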

I abandoned my 5 Macs and upgraded to a Fisher-Price laptop. It is just as capable of not supporting 32-bit applications and Nvidia CUDA, it never overheats, and moreover it has a replaceable battery.

Maybe you could have waited until you knew that your applications and hardware were supported by the operating system upgrades?
 

filu_

macrumors regular
May 30, 2020
160
76
It had some shortcomings, and when the keyboard was designed it was probably not known how unreliable it would be in actual use. Other products have been pretty good IMO.

And bent iPads? Or those with white spots?
 

bill-p

macrumors 68030
Jul 23, 2011
2,929
1,589
Did you try running all that on a 15-watt Intel CPU? I mean, you are talking about a thin, thermally constrained tablet here. Of course it gets warm under load... Just the CPU cores alone need over 20 watts to operate at max frequency, and there is also a rather beefy GPU in there. Macs will have more thermal headroom.

Actually, the MacBook Air 2020's CPU is rated for 9W. That means it can hold 1.1GHz at around 9W.

Apple constrained the chassis of the 2020 MacBook Air to about 10W, so... of course it is going to suffer.

You're basically saying the iPad Pro at 2.4GHz is outperforming an Intel chip that's purposefully constrained to 1.1GHz. That's not a surprise to me at all.

And yes, I can agree Apple Silicon is more efficient, but your statement that they can rival Intel chips at 4-4.5GHz? I'm not sure I'd believe that until I see it with my own eyes.

Under sustained load (>30 minutes), my Core i9 MacBook can only sustain a 3GHz clock speed because Apple is very stingy with power draw. The 16" MacBook can only draw a maximum of around 90-95W under sustained load, and the Core i9 needs something like 120W to reach 4GHz. Heck, the dGPU can draw 55W, which causes the CPU to be scaled down to 35W (compared to its rated 45W TDP), so... it drops even below the base clock speed of 2.3GHz.
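Spelling that power budget out as plain arithmetic (these are the rough numbers I observed above, not official specs):

```python
# Back-of-the-envelope power budget for the 16" MBP under combined load.
# All figures are the rough observations quoted above, not Apple's specs.
package_limit_w = 90    # approx. sustained draw for the whole machine
dgpu_draw_w = 55        # dGPU under load
cpu_rated_tdp_w = 45    # Intel's rated TDP for the i9
cpu_4ghz_need_w = 120   # approx. draw needed for ~4GHz all-core

cpu_budget_w = package_limit_w - dgpu_draw_w
print(f"CPU is left with ~{cpu_budget_w}W of a {cpu_rated_tdp_w}W-TDP part")
print(f"Shortfall vs. a 4GHz all-core target: ~{cpu_4ghz_need_w - cpu_budget_w}W")
```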

So if you're comparing the A12Z against the performance of current power- and thermal-constrained Mac computers, it's a very skewed comparison.

In fact, if we are going by core count and speed, the more "apples to apples" comparison would be the Core i9 MacBook with Turbo Boost turned off against the A12Z. Both are 8-core, both are around 2.4GHz. The iPad is far more efficient, sure, but it's far slower. That's what's preventing me from using my iPad Pro as my main computer. I don't think the situation will change for maybe another two years.

But Apple can feel free to prove me wrong. I'll nab that next MacBook that they come out with as long as it can drive my 5K monitor without breaking a sweat.
 

AxiomaticRubric

macrumors 6502a
Sep 24, 2010
945
1,154
On Mars, Praising the Omnissiah
I bought an Apple TV last December, which entitled me to one year of Arcade. If those games are representative of commercial games on an iPad, then so what? I unsubscribed after 3 months because nothing held my interest. I am not a heavy gamer, and I haven't purchased a game from the App Store in years; the last was around 2012, and it was a simple tower game. I've seen people play iPad and iPhone games, but I've seen nothing that makes me want to buy them.

Not to get off-topic, but gaming on iOS and iPadOS is centered on independent developers and smaller studios. They simply can't compete with the big development studios that create the top games for PlayStation, Xbox, Nintendo, and Windows.

If you're not a serious gamer then iPadOS is fine for casual gaming.

Gaming on the Mac will always be an inferior experience compared to the established gaming platforms. ARM SoCs will not change this, and some studios will choose to not port their existing Intel macOS games to Apple Silicon.
 
  • Like
Reactions: cardfan

ctjack

macrumors 68000
Mar 8, 2020
1,553
1,569
the iPad Pro at 2.4GHz is outperforming an Intel chip that's purposefully constrained to 1.1GHz. That's not a surprise to me at all.
The Air's i3 is not constrained to 1.1GHz. That's just a new Intel naming trick to avoid lawsuits. It runs at around 3GHz on average, and only if it overheats does it fall back to the "guaranteed" 1.1GHz. So the 1.1GHz in its name is just the lowest clock speed it promises.
 

bill-p

macrumors 68030
Jul 23, 2011
2,929
1,589
The Air's i3 is not constrained to 1.1GHz. That's just a new Intel naming trick to avoid lawsuits. It runs at around 3GHz on average, and only if it overheats does it fall back to the "guaranteed" 1.1GHz. So the 1.1GHz in its name is just the lowest clock speed it promises.

Do you have a MacBook Air 2020?

I got one a short while ago. Under sustained (>30 minutes) load, it drops to 2.3GHz and then eventually to 1.1GHz.

The 3.2GHz is only seen very briefly, during the first few seconds of a load.

Needless to say, the abysmal performance caused me to return it.

In comparison, my 16" i9 goes to 4.2GHz for the first few seconds then settles around 3GHz indefinitely.

So for "my" workload at least, the average for the Air is base clock, while the 16" is above that.
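If anyone wants to reproduce this kind of test, below is roughly the sort of sustained load I mean - a throwaway script that just pegs every core so you can watch the clocks settle in Intel Power Gadget (or Apple's powermetrics). The script itself measures nothing; it only generates load.

```python
# Throwaway sustained-load generator: pins every core at 100% so the
# clock behaviour can be observed in an external tool. Measures nothing.
import multiprocessing as mp
import time

def burn(seconds: float) -> None:
    """Spin on floating-point math until the deadline passes."""
    deadline = time.time() + seconds
    x = 1.0001
    while time.time() < deadline:
        x = (x * 1.0000001) % 1e6

if __name__ == "__main__":
    duration = 35 * 60  # a bit over 30 minutes, matching the runs above
    workers = [mp.Process(target=burn, args=(duration,))
               for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```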
 
  • Like
Reactions: throAU

throAU

macrumors G3
Feb 13, 2012
9,198
7,349
Perth, Western Australia
Gaming on the Mac will always be an inferior experience compared to the established gaming platforms. ARM SoCs will not change this, and some studios will choose to not port their existing Intel macOS games to Apple Silicon.

I wouldn't be so sure - at least not in the long term.

The Switch is ARM-based, and the Apple TV 4K is significantly more powerful than the Switch. Same with any current iPad, all the new iPhones, etc.

Also, plenty of indie / young game developers are growing up writing their first games for... iOS. With the switch to Apple Silicon, those games will pretty easily run on the Mac as well - without even needing a recompile.

It will take time, but Apple is definitely pushing for game development, and their hardware is already plenty capable of it. Give it a few years, and I'd put it at 50/50 that Apple becomes a significant gaming platform.
 
  • Like
Reactions: russell_314

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
Apple doesn’t make servers. Massively parallel systems don’t make much sense for 90% of users.

I agree. That ship has sailed, and there would be no financial benefit to Apple in going back to making servers. The point I was making was that Apple is set to fill the gap for ARM-on-desktop (which hasn't existed since the 1990s, AFAIK) and that there is nothing that makes ARM inherently incapable of powering a workstation / desktop computer like the Mac Pro. The fact that 128-core ARM CPUs exist (or will shortly) implies that it should be technically possible to build an ARM CPU that is equivalent or superior to a Xeon W.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,521
19,673
You're basically saying the iPad Pro at 2.4GHz is outperforming an Intel chip that's purposefully constrained to 1.1GHz. That's not a surprise to me at all.

I was comparing it to the 28W TDP Ice Lake in the higher-tier 13” MBP, not the MBA - the iPad Pro runs circles around the MBA.

And yes, I can agree Apple Silicon is more efficient, but your statement that they can rival Intel chips at 4-4.5GHz? I'm not sure I'd believe that until I see it with my own eyes.

This is not my statement. Refer to the A12 and A13 deep-dive reviews by AnandTech and others. Funny thing: a year ago I was of the same opinion as you. ARM Macs? Give me a break, why would anyone want a dead slow toy? Then I actually looked at the data and I was shocked. Apple's CPUs are an engineering marvel.

The 16" MacBook can only draw a maximum of around 90-95W under sustained load, and the Core i9 needs something like 120W to reach 4GHz. Heck, the dGPU can draw 55W, which causes the CPU to be scaled down to 35W (compared to its rated 45W TDP), so... it drops even below the base clock speed of 2.3GHz.

The 16” MBP delivers around 60 watts of sustained CPU power, which allows the i9 to run above its spec. This lets the CPU maintain clocks of around 3.2-3.4GHz on all cores, if I remember correctly (it's been a while since I tested it). On single-core max turbo boost it consumes around 25 watts. Yes, the GPU makes it more complicated, as the laptop in total offers around 80 watts of power delivery that needs to be shared between CPU and GPU. Note that all i9-equipped laptops suffer from this.

Which again illustrates the problems Intel is having. You need to run a 45W CPU at 60W to get a performance gain. These chips are clocked way above their comfort point so that Intel can claim performance advances. To get the most out of these chips you need to run them in a desktop.



In fact, if we are going by core count and speed, the more "apples to apples" comparison would be the Core i9 MacBook with Turbo Boost turned off against the A12Z. Both are 8-core, both are around 2.4GHz. The iPad is far more efficient, sure, but it's far slower. That's what's preventing me from using my iPad Pro as my main computer. I don't think the situation will change for maybe another two years.

Why would that be a fair comparison? The i9 has 8 cores (the A12Z has 4; the low-performance cores barely count), and the iPad can do substantially more work per clock - it has 50% more execution units per core than the Intel CPU.
 
  • Like
Reactions: russell_314

leman

macrumors Core
Oct 14, 2008
19,521
19,673
In fact, if we are going by core count and speed, the more "apples to apples" comparison would be the Core i9 MacBook with Turbo Boost turned off against the A12Z. Both are 8-core, both are around 2.4GHz. The iPad is far more efficient, sure, but it's far slower. That's what's preventing me from using my iPad Pro as my main computer. I don't think the situation will change for maybe another two years.

Since you were curious, I did some quick tests while sipping my morning tea. The CPU in question is the i9-9980HK in a 16" MBP. I only used Geekbench because, despite its many flaws, it's readily available and I only had 10 minutes. Power usage and clocks were monitored with Intel Power Gadget.

  1. Disabling Turbo Boost (limits the clocks to 2.4GHz). Single-core power usage is around 5 watts, multi-core usage is about 30 watts. Geekbench score: https://browser.geekbench.com/v5/cpu/3695566
  2. Disabling Turbo Boost and hyper-threading and limiting the core count to 4 in order to better match the iPad Pro's 4 high-performance cores. Multi-core power usage drops to 15 watts. Geekbench score: https://browser.geekbench.com/v5/cpu/3695627
Summary: running the 8-core Skylake++ CPU at 2.4GHz with hyper-threading gets you a Geekbench score about 10% higher than a 4-core A12Z at 2.5GHz (and Intel needs twice as much power to get that result). Matching the core counts (4 vs. 4) makes the A12Z about twice as fast. Regarding single-core performance, the 2.5GHz A12Z is around 100% faster than the 2.4GHz Skylake at comparable power consumption.

P.S. Disabling the turbo boost made my MBP laggy as hell.

Edit: an A12 core actually consumes only 4 watts at 2.5GHz (https://www.anandtech.com/show/14892/the-apple-iphone-11-pro-and-max-review/4) — 20% less than the Skylake locked at 2.4GHz.
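Turning those numbers into rough performance-per-watt figures (scores normalized to the A12Z, wattages as quoted above; treat this as ballpark only):

```python
# Rough perf-per-watt from the relative scores and package power quoted above.
# Scores are normalized (A12Z multi-core = 1.0); watts are approximate.
configs = {
    "A12Z (4 perf cores, 2.5GHz)":     {"score": 1.00, "watts": 15},
    "i9 8c + HT, 2.4GHz (no turbo)":   {"score": 1.10, "watts": 30},
    "i9 4c, no HT, 2.4GHz (no turbo)": {"score": 0.50, "watts": 15},
}

for name, cfg in configs.items():
    print(f"{name:34s} perf/watt = {cfg['score'] / cfg['watts']:.3f}")
# The A12Z comes out roughly 1.8-2x ahead of either Intel configuration.
```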
 
Last edited:

robvas

macrumors 68040
Mar 29, 2009
3,240
630
USA
If Apple were to come out with an ARM chip that beats the fastest Intel chips in real benchmarks, not just Geekbench, and also delivered noticeable power savings, then Amazon, Google, Facebook, and Microsoft would all be beating down their door trying to buy the chips for their servers.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,673
If Apple were to come out with an ARM chip that beats the fastest Intel chips in real benchmarks, not just Geekbench, and also delivered noticeable power savings, then Amazon, Google, Facebook, and Microsoft would all be beating down their door trying to buy the chips for their servers.

ARM server chips that compete with Xeons are already on the market. You might also want to check out a startup called NUVIA. Speaking of "real" benchmarks: https://www.anandtech.com/show/14892/the-apple-iphone-11-pro-and-max-review/4

Anyway, your comment doesn't make much sense, since Apple does not make server-grade chips.
 

EdT

macrumors 68020
Mar 11, 2007
2,429
1,980
Omaha, NE
Not to get off-topic, but gaming on iOS and iPadOS is centered on independent developers and smaller studios. They simply can't compete with the big development studios that create the top games for PlayStation, Xbox, Nintendo, and Windows.

If you're not a serious gamer then iPadOS is fine for casual gaming.

Gaming on the Mac will always be an inferior experience compared to the established gaming platforms. ARM SoCs will not change this, and some studios will choose to not port their existing Intel macOS games to Apple Silicon.

I’ve played WoW for the first time in 15 years, mostly because friends and family were doing it and it was a way to stay connected. To me it isn’t that the games themselves are important; it’s the ability of a computer to run a complex program that takes input from a user, whether keyboard, mouse, or game controller, along with sound and video, and do it smoothly. If a computer can handle that, it will handle anything else I am likely to ever do with it.

And there are games like Kerbal and Minecraft that I keep saying I am going to try. But I’ve been saying that for a long while now.
 

MisterMe

macrumors G4
Jul 17, 2002
10,709
69
USA
Not in single-threaded tasks.

They'd take the opportunity to make more money though!
Apple did not become a $2 trillion company by selling merchant processors. It is hard to see the company believing that its path to $3 trillion goes through that market.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
So how badly does Apple freak out when NVIDIA buys out ARM? Heck, Apple should buy out ARM. And screw Qualcomm, which does nothing but license the ARM architecture.

The only reason companies go with ARM is mobility and power; otherwise:
  1. ARM is a Reduced Instruction Set Computing (RISC) architecture, which means that the processor itself handles far fewer instructions in hardware than, say, x86. This means that there is less actual material used in each chip, so they're dirt cheap to produce.
How Apple is going to sell desktops with ARM CPUs is beyond me. If you don't need low power or fanless operation, then a Ryzen 7 or an Intel i7 will be so much faster it's not even funny.
You're gonna have to come up with new material soon.
Running lots of VMs is a reasonable use case, but I'd have to question whether the most cost-effective way to do it is to have a super-expensive workstation computer. You may find that cloud platforms such as AWS, GCP or Azure are better options - both in terms of cost and the management effort required to run and configure them on your own hardware. It depends on your use case. Are you spinning VMs and containers up and down for development, or running them 24x7?

In my industry (enterprise cloud computing), I have seen a rapid decline in the number of clients using VMs on their own hardware, and most of those are ESXi servers in their data centers, not workstations. I used to use a lot of VMware or VirtualBox VMs for test environments, but now it's easier, less time-consuming, and often cheaper to just spin them up in the cloud.

Video editing / rendering has diminishing returns over about 24 cores (Premiere Pro CPU performance: AMD Threadripper 3990X 64 Core) - with the 64 core Threadripper running slower than the 32-core version.

Photoshop only shows minimal improvements with >8 cores (https://www.pugetsystems.com/recomm...-Adobe-Photoshop-139/Hardware-Recommendations)

You could run thousands of Docker containers on a 64 core machine with sufficient RAM, so this is overkill to run 50-100 containers (which is a relatively complex micro-service architecture).

You might argue that you need to do all of the above *at the same time*... but this is definitely an edge case.

By all means, spend your money on £4000 CPUs... you probably aren't getting your money's worth, though :)
Which can be covered by IPC increases. Something that Apple's likely focused on with their architecture.
The entire MacBook line was a complete disaster for the past 4 years due to the keyboard problem. How is that good?
Furthermore, it took them FOUR YEARS to own up to and finally fix the problem. Pathetic, to say the least, for a company like Apple.
Disaster yes, complete... maybe not. Pathetic for Apple? I agree.
I'm guessing they had a robot press the switch mechanism a million times in a nice clean, dust- and grime-free space and called it good. Then reality struck.

The problem is that Apple's motivation is to turn the Mac into a big iPhone and milk customers with the 30% fee on all App Store applications.
This conspiracy theory still doesn't hold up to any basic scrutiny.
 

filu_

macrumors regular
May 30, 2020
160
76
This conspiracy theory still doesn't hold up to any basic scrutiny.

If these apps work like they do on an iPad, such as Files or Numbers, that would be a grim joke. I just found out that my spreadsheets are not available.
 