
Peperino

macrumors 65816
Nov 2, 2016
1,000
1,684
It will not be an overpriced netbook. That would be a disaster for Apple, a company that tries to produce products that are good at doing things. Expect Apple Silicon to outperform Intel at the same power consumption.

Produce products that are good?

The entire MacBook line was a complete disaster for the past 4 years due to the keyboard problem. How is that good?
Furthermore, it took them FOUR YEARS to own and finally fix the problem. Pathetic, to say the least, for a company like Apple.
 
  • Like
Reactions: filu_

Tekguy0

macrumors 6502
Jan 19, 2020
306
361
Produce products that are good?

The entire MacBook line was a complete disaster for the past 4 years due to the keyboard problem. How is that good?
Furthermore, it took them FOUR YEARS to own and finally fix the problem. Pathetic, to say the least, for a company like Apple.
It had some shortcomings, and when the keyboard was designed it was probably not known how unreliable it would be in actual use. Other products have been pretty good IMO.
 

EdT

macrumors 68020
Mar 11, 2007
2,429
1,980
Omaha, NE
We don’t know. There will certainly be incentives to buy the new hardware, and battery life on a portable will likely have higher priority than performance, but we shall see.

If you don’t like the Macs on offer, don’t buy them and get a PC instead.
We don’t know at all but estimates based on previous Apple performance gains put a potential A14X (the iPad processor, not a Mac) as outperforming all of the Apple laptops available right now.


Two quotes that show why I am nervous about Apple Silicon. Like the posters I quoted, I don’t know either, and Apple isn’t saying what a minimum expectation of performance should be. Better? Worse? The same? All of the above?

When they announced the new iMac Pro, they did it at WWDC in June. They showed early versions and gave out performance figures, all six months before it was available.

Right now, no one outside of Apple can speak authoritatively on what Apple Silicon can and can’t do compared with the current lineup of Intel-based computers, or what features or abilities will be lost because of the change. Silence isn’t golden.
 

AxiomaticRubric

macrumors 6502a
Sep 24, 2010
945
1,154
On Mars, Praising the Omnissiah
The entire Macbook line was a complete disaster for the past 4 years due to the keyboard problem. How is that good?

Complete disaster? That is hyperbole.

I have a 2016 MacBook Pro and there have been zero problems with the keyboard. (I fully acknowledge that mileage varies on this but I haven't experienced any of the reported issues.)

A complete disaster would be lithium ion batteries that spontaneously combust for the majority of users, or motherboards that consistently melt due to insufficient cooling.
 
  • Like
Reactions: russell_314

bill-p

macrumors 68030
Jul 23, 2011
2,929
1,589
As to the rest, you might be overestimating the stress caused by a multi-process environment. It's not like all this software has to literally run at the same time - most of it is just idling in the background, waking up infrequently for an occasional state update here and there. An Apple CPU is almost guaranteed to do better under prolonged stress than an Intel one, simply because Apple CPUs need less power.

In my environment, it's pretty standard to have YouTube (4K) and many other website tabs running in the background all at once, while compiling code, pushing/pulling with Git, running other services, etc...

And occasionally, there'd be Lightroom, Capture One, Final Cut Pro processing things in the background on top of those as well. Very often, these processes take up to 1 hour to complete.

So yes, everything literally has to run at the same time. It's a completely different use case than iOS. If I can't even do the above, then there's no point to having a Mac at all.

Going back, the iPad Pro at full load is not a pretty sight. It's very obvious Apple has to allow the chip to draw much more power, and to dissipate much more heat. So the question is: under sustained heavy load, does the "need less power" statement still apply? And to what degree?
 

EdT

macrumors 68020
Mar 11, 2007
2,429
1,980
Omaha, NE
Windows 10 Boot Camp has been stated to be unsupported. There may be some x86 emulation capability in the future.

Gaming will be supported either via (1) Rosetta 2 emulation (don't expect great results), (2) iOS applications, or (3) native ARM games - we'll see how many current macOS games can be easily recompiled for ARM.

eGPUs should be supported if the ASi Macs support PCIe over thunderbolt, which I expect they will.

I think that Apple is prepared to take the hit of the approx. 2% of Mac users who regularly used Windows via Boot Camp, in the expectation that being able to run iOS apps will compensate for this lost user segment.

I can see that Bootcamp is important for gamers, but maybe the better option is just to buy a separate Windows PC or console for gaming. It's really not the Mac's forte.

But I won’t buy 2 expensive computers. If the end result is that I have to have an Intel or AMD computer, then when I buy another computer that is what I will buy, and I will sell or give away my iMac, or trash it if the value drops to zero. I’m not an Apple fanatic. I purchased my first iMac rather than buy a Windows 8 computer. I will switch back if Apple decides they don’t care about what I want. I have no illusions that this will shake Apple to its core, but I don’t owe them loyalty. If what they make is something I don’t like, then I have options.
 
  • Like
Reactions: Brazzan

EdT

macrumors 68020
Mar 11, 2007
2,429
1,980
Omaha, NE
Well to be fair the Apple Silicon Macs will be able to natively run iPadOS games, for what it's worth.

I bought an Apple TV last December, which entitled me to one year of Arcade. If those games are representative of commercial games on an iPad, then so what? I unsubscribed after 3 months because nothing held my interest. I am not a heavy gamer, and I haven’t purchased a game from the App Store in years - the last was around 2012, and it was a simple tower game. I’ve seen people play iPad and iPhone games, but I’ve seen nothing that makes me want to buy them.
 
  • Like
Reactions: whfsdude

IvanKaramazov

macrumors member
Jul 23, 2020
32
49
In my environment, it's pretty standard to have YouTube (4K) and many other website tabs running in the background all at once, while compiling code, pushing/pulling with Git, running other services, etc...

And occasionally, there'd be Lightroom, Capture One, Final Cut Pro processing things in the background on top of those as well. Very often, these processes take up to 1 hour to complete.

So yes, everything literally has to run at the same time. It's a completely different use case than iOS. If I can't even do the above, then there's no point to having a Mac at all.

Going back, the iPad Pro at full load is not a pretty sight. It's very obvious Apple has to allow the chip to draw much more power, and to dissipate much more heat. So the question is: under sustained heavy load, does the "need less power" statement still apply? And to what degree?

It makes perfect sense to expect some Macs to be able to stream 4K YouTube with dozens of tabs while also compiling code and running media software in the background, but it's worth noting that the MacBook Air, present generation and all past, would be running hot, loud, and frankly throttled trying to do all that. Not all Macs, and certainly not all Intel chips, can gracefully handle the kinds of workloads you're describing, which is why someone with your needs obviously buys a MacBook Pro if not a desktop Mac of some sort. The chip in the iPad is in no way comparable, power-wise, with the kinds of Intel chips that can achieve what you're talking about, but it is more directly comparable to the MacBook Air.

The chip in the 13" Pro is rated at 28W TDP, the 16" at 45W, and the desktops significantly higher from there, and all of these chips run at significantly higher wattage under load than their TDP. Anandtech's testing of the A12X found that it averaged around 8W of actual draw during a sustained load test, which gives some hint of the thermal room of the device. The "not pretty" full load you're describing is probably right around that 8W.

LTT found that the current MacBook Air, which has a terrible cooling system, could probably sustain 10W of power draw with no throttling or significant heat. That's not enough for the supposedly "10W" Intel chip in the device, so it throttles pretty quickly, heats up, and spins its fans up loudly. But that 10W of thermal headroom would, in theory, be enough to cool the A12X under load more or less indefinitely. And while Geekbench is not a great benchmark for most things, it's a pretty good indicator of comparative full-load, non-throttled speeds, and the 2-year-old iPad Pro is faster than Intel's best in the MacBook Air before they both start throttling. It seems reasonable to assume that an A12X shoved into the current MacBook Air chassis would run faster than the current Intel Air without any of its heat and throttling issues. And that's neglecting two years of architectural improvements and a die shrink for a theoretical A14X. The A-X chips may run hot and throttle in the iPad Pro chassis, but even a thin-and-light laptop chassis should give them breathing room.
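
The back-of-the-envelope comparison above can be written out explicitly. All the wattages here are the rough estimates quoted in this thread (Anandtech's A12X measurement, LTT's chassis figure), not official specs:

```python
# Rough thermal-headroom comparison using the estimates quoted above.
# All wattages are approximations from this thread, not measurements of my own.
chassis_headroom_w = 10.0   # what the MacBook Air chassis can reportedly sustain
a12x_sustained_w = 8.0      # Anandtech's averaged A12X draw under sustained load
intel_sustained_w = 20.0    # plausible real draw of a "10W TDP" Intel part under load

def throttles(draw_w, headroom_w):
    """A chip must throttle when sustained draw exceeds what the chassis can cool."""
    return draw_w > headroom_w

print(throttles(a12x_sustained_w, chassis_headroom_w))   # A12X fits the chassis: False
print(throttles(intel_sustained_w, chassis_headroom_w))  # Intel exceeds it: True
```

On those (admittedly rough) numbers, the A12X sits inside the chassis budget and the Intel part does not, which is the whole argument in one inequality.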

Beyond the limitations of a thin-and-light laptop, we don't know what Apple can achieve, that's true. But if we can reasonably conclude that an A14X in a MacBook Air would absolutely smoke Intel's best chip at the same TDP, and I think we can, then there's at least reason to be optimistic that with 28W or 45W of headroom, Apple could build a larger, higher core-count chip that should at minimum be competitive with the respective Intel offerings, if not objectively better.

As regards concerns about running iOS v MacOS, if you hunt you can find general impressions on Twitter and Reddit and such from developers with A12Z DTKs, and they seem to be unanimous in being pleasantly surprised with the fluidity and performance of MacOS on that environment.
 

Yurk

macrumors member
Apr 30, 2019
75
90
VCU 1525
Really!? I’ll try tomorrow linking R to the latest version on an Intel dual 18-core CPU server.
Are the optimizations only available to some CPU generations, or across the range going back a couple of years?
I am not sure. I have only tried it on my 28-core Xeon 8173 and my 8-core i9 MacBook. I think any Intel CPU with AVX-512 support would benefit from the latest libraries, but that is an assumption.
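
If you want to confirm which BLAS/LAPACK your numerical stack is actually linked against, the same check can be done from Python with NumPy as a stand-in (in R, `sessionInfo()` prints the linked BLAS/LAPACK paths):

```python
# Print the BLAS/LAPACK build configuration NumPy was linked against.
# An MKL-linked build mentions "mkl" in the library names; an OpenBLAS
# build shows "openblas" instead. This only reports build-time linkage.
import numpy as np

np.show_config()
```

This is how you verify the MKL libraries are really in play before attributing any speedup to them.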
 

richinaus

macrumors 68020
Oct 26, 2014
2,429
2,186
But I won’t buy 2 expensive computers. If the end result is that I have to have an Intel or AMD computer, then when I buy another computer that is what I will buy, and I will sell or give away my iMac, or trash it if the value drops to zero. I’m not an Apple fanatic. I purchased my first iMac rather than buy a Windows 8 computer. I will switch back if Apple decides they don’t care about what I want. I have no illusions that this will shake Apple to its core, but I don’t owe them loyalty. If what they make is something I don’t like, then I have options.

Of course you do have a choice. And I totally see your point about it getting expensive; we should all buy the tools we need for certain tasks rather than following blindly. It is an absolute fact that PCs are better than Macs for 3D work, for example. But if you use FCP or Logic, develop iOS apps, or use some other Apple app, then of course the Mac is better. Macs are far more enjoyable to use for media in my opinion, and a much better experience overall. But really, Windows 10 is pretty good these days.

For me it is all about the apps I use and if the developers are fully going to support AS. It will take time I think.

I have zero doubts in my mind, however, that the new AS hardware will be great and that the chips will work across laptop and desktop without issue.
 
  • Like
Reactions: russell_314

Yurk

macrumors member
Apr 30, 2019
75
90
Newsflash: cuda on Mac has been walking dead since about 2012. if you invested 3k in a titan v to run with a Mac that's on you...

All serious ML people are running something other than macOS. Likely Linux.
CUDA on Mac was supported up to Volta and up to High Sierra. That was 2018, not 2012. The Titan V was released in late 2017 and purchased then. Go troll somewhere else.

I can run my Titan V everywhere: my High Sierra 2012 MacBook, 2014 Mac mini, and 2017 MacBook, but not on my 2018 Mojave Mac mini or my 2019 Catalina 16" MacBook Pro. I can still use it in Boot Camp, or on my Xeon workstation. If I want a macOS desktop, my only choice is to log in to my Xeon workstation remotely and run CUDA that way. Meanwhile, any Dell, Lenovo, or MS Surface laptop with an Nvidia card lets users code CUDA on their couch.
 
Last edited:

Michael J

macrumors newbie
Sep 19, 2006
25
14
Seattle
Let's hope it is better. I just got a MBP13 2020 to replace my 2017, and the battery life still maxes out at 4 hours and the fans still scream whenever I use Adobe Lightroom. I'd blame that on Adobe, except it runs smoothly and fast, and for 10 hours, on my iPad Pro 11. Oh wait, that's ARM.
 
  • Like
Reactions: lysingur

involuntarheely

macrumors regular
Jul 28, 2019
126
140
VCU 1525

I am not sure. I have only tried it on my 28-core Xeon 8173 and my 8-core i9 MacBook. I think any Intel CPU with AVX-512 support would benefit from the latest libraries, but that is an assumption.
I just checked; the server I've been using had R linked to MKL 2020 update 1. I've seen there's an update 2 and a bugfix update 3, but I'm not sure how much I'd gain from those.
I'm considering getting a 3990X Threadripper workstation to link to BLAS. Do you have any links to benchmarks pitting last-generation Intel with MKL against last-generation AMD on OpenBLAS?
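
Absent published head-to-head numbers, a rough way to compare BLAS backends yourself is to time a large double-precision matrix multiply, which is dominated by the linked BLAS's `dgemm`. This is a sketch; the matrix size and repeat count are arbitrary choices:

```python
# Crude BLAS throughput check: time an n x n matmul and report GFLOP/s.
# Run the same script against an MKL-linked and an OpenBLAS-linked NumPy
# to compare backends on a given CPU.
import time
import numpy as np

def time_matmul(n=500, repeats=3):
    """Return (best seconds, approx GFLOP/s) for an n x n matrix multiply."""
    rng = np.random.default_rng(0)
    a = rng.standard_normal((n, n))
    b = rng.standard_normal((n, n))
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        _ = a @ b                    # dispatches to the linked BLAS dgemm
        best = min(best, time.perf_counter() - start)
    gflops = 2 * n ** 3 / best / 1e9  # ~2*n^3 floating-point ops per multiply
    return best, gflops

seconds, gflops = time_matmul()
print(f"best of 3: {seconds:.4f} s, ~{gflops:.1f} GFLOP/s")
```

Taking the best of several runs reduces noise from turbo ramp-up and thread-pool warm-up, which otherwise skews short benchmarks badly.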
 

leman

macrumors Core
Oct 14, 2008
19,520
19,670
Going back, the iPad Pro at full load is not a pretty sight. It's very obvious Apple has to allow the chip to draw much more power, and to dissipate much more heat. So the question is: under sustained heavy load, does the "need less power" statement still apply? And to what degree?

I don’t see why it wouldn’t. Synthetic benchmarks suggest that an A13 at 2.5 GHz is more or less equivalent to a 4.3-4.5 GHz Intel core. At this frequency the A13 will consume less than 5 watts of power. Which means - assuming adequate power delivery and dissipation - that 4 such cores at 2.5 GHz would use around 20 watts. Here is the fun part: this is still less than an Intel Coffee Lake running at 4.5 GHz. Intel CPUs, depending on turbo boost and scenario, will consume somewhere from 1 to 50 watts per core. Apple CPUs will always consume under 5 watts per core.
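
The per-core arithmetic above, written out (the wattages are this thread's estimates, not published specifications):

```python
# Per-core power estimates quoted above (approximations, not official specs).
apple_core_w = 5.0        # upper-bound draw of one A13 core at 2.5 GHz
apple_cores = 4

apple_cluster_w = apple_cores * apple_core_w   # whole 4-core cluster flat out
print(apple_cluster_w)                         # → 20.0

# Quoted Intel per-core range under turbo varies enormously:
intel_core_w_min, intel_core_w_max = 1.0, 50.0

# Even the full Apple cluster stays under a single Intel core's worst case:
print(apple_cluster_w < intel_core_w_max)      # → True
```

On these estimates, four Apple cores at full tilt draw less than one Intel core in its worst turbo scenario, which is why sustained performance is where the comparison tilts hardest toward Apple.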

We don’t know what the peak performance of the final ARM Macs is going to be, but their sustained performance is definitely going to be excellent. The A12Z is already confidently outperforming the fastest Ice Lake CPUs (28W TDP) in sustained benchmarks - and that’s a two-year-old design...

In my environment, it's pretty standard to have YouTube (4K) and many other website tabs running in the background all at once, while compiling code, pushing/pulling with Git, running other services, etc...

And occasionally, there'd be Lightroom, Capture One, Final Cut Pro processing things in the background on top of those as well. Very often, these processes take up to 1 hour to complete.

So yes, everything literally has to run at the same time. It's a completely different use case than iOS. If I can't even do the above, then there's no point to having a Mac at all.

Did you try running all that on a 15-watt Intel CPU? I mean, you are talking about a thin, thermally constrained tablet here. Of course it gets warm under load... The CPU cores alone need over 20 watts to operate at max frequency, and there is also a rather beefy GPU in there. Macs will have more thermal headroom.
 

throAU

macrumors G3
Feb 13, 2012
9,198
7,348
Perth, Western Australia
I'm going to say yes. I can't imagine that Apple would release them if they didn't. There's just no way they're going to be slower given the insanely high performance per watt of Apple chips. They could probably outperform Intel using less than 40% of the power.

As I already posted above, in terms of processing power per watt and processing power vs. heat output, the A10X in my 2-3 year old iPad Pro destroys the i7-1060NG7 in my MacBook Air 2020 - especially given it is running at half the watts with basically zero cooling (certainly no fan).

Vs. like-for-like machines in the 13" space, performance-wise Apple's processors are going to destroy what Intel currently has available. It remains to be seen how they'll go in the 15" form factor, but I wouldn't put it past them to come out very strong.

Even if Intel had been working hard for the past 10 years to refine their architecture (they haven't, really), Apple will be on TSMC's 5nm process and Intel will be on their broken 10nm process for at least the next 12-18 months, likely longer.

The playing field is very much NOT level, and Intel has a lot of work to do to catch up. Their product lineup is very weak right now, all the way from mobile to server.

This is why Apple has switched - Intel has been stuck trying to make their new 10nm process work, hasn't succeeded despite 5 years of saying "everything is fine" and "volume shipping by end of the year" (for 4+ years) - and Intel won't dig themselves out of the hole for at least 1-2 years, even assuming everybody else stands still with manufacturing tech.

Either Apple plays the same game as everyone else in the market and waits for Intel in the laptop/desktop space - or they take this opportunity to move forward in a BIG way. Given Intel's current stumbles, and the fact that switching to AMD would just leave them open to the same thing happening when AMD stumbles in the future, the time has never been better to do it.
 
Last edited:
  • Like
Reactions: collin_

throAU

macrumors G3
Feb 13, 2012
9,198
7,348
Perth, Western Australia
CUDA on Mac was supported up to Volta and up to High Sierra. That was 2018, not 2012. The Titan V was released in late 2017 and purchased then. Go troll somewhere else.

Apple have been discouraging CUDA use since they ditched Nvidia and started pushing OpenCL and Metal. The writing has been on the wall, if you've been paying attention, since then. "Walking dead" doesn't mean killed off, but deprecated/discouraged by the vendor as they are pushing something else.

You've had about 8 years to come up with and then execute a migration plan, so cry me a river. If you've purchased new expensive gear in that time-frame to use with MacOS despite Apple's strong suggestions that CUDA is not the way forward for their platform since around 2012, I maintain: that's on you.

I'm not saying that Apple refusing to support Cuda is a good idea. I'm saying that it is what it is, the signs have been abundantly clear, and ignoring them is on you.
 

Joelist

macrumors 6502
Jan 28, 2014
463
373
Illinois
Part of Intel's problem is that Apple poached a lot of their best designers - they specifically went after the team that created the Banias, Dothan, and Conroe microarchitectures.
 

throAU

macrumors G3
Feb 13, 2012
9,198
7,348
Perth, Western Australia
Part of Intel's problem is that Apple poached a lot of their best designers - they specifically went after the team that created the Banias, Dothan, and Conroe microarchitectures.

Another part of the problem is their management and company culture.

The loss of focus is especially clear. When you have a CPU manufacturer spending billions of dollars purchasing garbage like McAfee, failing hard at making modems, and purchasing then selling their ARM subsidiary (essentially right about the time ARM-based cores started to take over the world), it shows just how dumb their management team has been.

Never mind the artificial platform-segmentation games they've been playing with things like Optane, essentially killing their only genuinely good new tech before it even had a chance to be born.
 
  • Like
Reactions: BigSplash

EdT

macrumors 68020
Mar 11, 2007
2,429
1,980
Omaha, NE
As I already posted above, in terms of processing power per watt and processing power vs. heat output, the A10X in my 2-3 year old iPad Pro destroys the i7-1060NG7 in my MacBook Air 2020 - especially given it is running at half the watts with basically zero cooling (certainly no fan).

Vs. like-for-like machines in the 13" space, performance-wise Apple's processors are going to destroy what Intel currently has available. It remains to be seen how they'll go in the 15" form factor, but I wouldn't put it past them to come out very strong.

Even if Intel had been working hard for the past 10 years to refine their architecture (they haven't, really), Apple will be on TSMC's 5nm process and Intel will be on their broken 10nm process for at least the next 12-18 months, likely longer.

The playing field is very much NOT level, and Intel has a lot of work to do to catch up. Their product lineup is very weak right now, all the way from mobile to server.

This is why Apple has switched - Intel has been stuck trying to make their new 10nm process work, hasn't succeeded despite 5 years of saying "everything is fine" and "volume shipping by end of the year" (for 4+ years) - and Intel won't dig themselves out of the hole for at least 1-2 years, even assuming everybody else stands still with manufacturing tech.

Either Apple plays the same game as everyone else in the market and waits for Intel in the laptop/desktop space - or they take this opportunity to move forward in a BIG way. Given Intel's current stumbles, and the fact that switching to AMD would just leave them open to the same thing happening when AMD stumbles in the future, the time has never been better to do it.

I understand WHY Apple is leaving Intel, and all of the points you make are definitely valid. And I also understand how impressive Apple's development of ARM chips has been. Apple doesn't have the best phone screens or the most memory or the longest-lasting battery, but for most of the 21st century, and definitely in the last 10 years, Apple has had the most powerful processors running those phones, especially since Apple moved to designing the chips themselves.

But iPhones, and non-Pro iPads, don't run at sustained high speeds. (I don't own, or know anyone with, an iPad Pro, so I'm not going to assume what it can do and what its weaknesses are.) Judging by the software, it doesn't work as well as the same software made for a MacBook Pro or an iMac. I am thinking specifically of Adobe Lightroom and Photoshop. Some edits take longer, and some features don't work well. Not all of that is the CPU; some was the lack of a file system on iPads, which has been fixed. But even on things both my iMac version and my iPad version can do, the iMac is a lot faster, although my iPad is a lot newer.

So what I am afraid of is what Apple isn't telling me. If they are going to sell an ARM computer this year, they have a good idea what that computer will end up being able to do. And what it won't. I don't need to know every small detail, but if something is definitely going away, say so now. And if something probably ISN'T going away, say that as well. My experience with iPads is that they are good for very light computing, simple editing of photos (after you find a way to get the pictures into them, if it's not a wireless camera or an iPhone), and viewing movies and pictures. So right now my image of an ARM-based computer, including the Windows ARM-based ones, is something that is light in every meaning of the word. It's light in weight, light in cost, and on the light side of sustained processing power. Good as a MacBook Air or an iPad Pro Plus, but not as a MacBook, and a long way from a MacBook Pro or an iMac.
 
  • Like
Reactions: cardfan and robvas

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
Running lots of VMs is a reasonable use case, but I'd have to question whether the most cost-effective way to do it is to have a super-expensive workstation computer. You may find that cloud platforms such as AWS, GCP or Azure are better options - both in terms of cost and the management effort required to run and configure them on your own hardware. It depends on your use case. Are you spinning VMs and containers up and down for development, or running them 24x7?

In my industry (enterprise cloud computing), I have seen a rapid decline in the number of clients using VMs on their own hardware, and most of those are ESXi servers in their data center, not workstations. I used to use a lot of VMWare or VirtualBox VMs for test environments, but now it's easier, less time-consuming, and often cheaper to just spin them up in the cloud.

Video editing / rendering has diminishing returns over about 24 cores (Premiere Pro CPU performance: AMD Threadripper 3990X 64 Core) - with the 64 core Threadripper running slower than the 32-core version.

Photoshop only shows minimal improvements with >8 cores (https://www.pugetsystems.com/recomm...-Adobe-Photoshop-139/Hardware-Recommendations)

You could run thousands of Docker containers on a 64 core machine with sufficient RAM, so this is overkill to run 50-100 containers (which is a relatively complex micro-service architecture).
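
The "thousands of containers" claim is just capacity arithmetic. A sketch with illustrative numbers (the RAM size and per-container footprint are assumptions, not figures from any specific machine):

```python
# Back-of-the-envelope container capacity, ignoring CPU scheduling limits.
# Both inputs are illustrative assumptions, not measurements.
ram_gb = 256              # hypothetical workstation RAM
per_container_mb = 128    # typical small micro-service footprint (assumption)

max_containers = ram_gb * 1024 // per_container_mb
print(max_containers)     # → 2048
```

Even with a generous per-container footprint, a 64-core/256GB machine is memory-bound long after the 50-100 container mark, which is why that workload alone doesn't justify the hardware.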

You might argue that you need to do all of the above *at the same time*... but this is definitely an edge use case.

By all means, spend your money on £4000 CPUs....you probably aren't getting your money's worth though :)
 

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
Apple is NOT a CPU manufacturer, nor a CPU architecture designer.
Intel, on the other hand, is the LARGEST and highest-valued semiconductor chip manufacturer.

Apple is moving to "their" chips (in fact ARM chips, but that's a whole other discussion) to CONTROL the whole hardware in their products, not because they are the best or fastest CPUs. They will even put their own modems in the new iPhones and iPads (from 2021), not because theirs (Apple acquired Intel's modem division) are better than Qualcomm's, but to CONTROL the whole process. So stop the debate about which is fastest; of course Intel's chips are fastest.

My i9-10850K, a 430-euro CPU, blows away my iPad Pro 12.9 (6GB edition), and the fact is that Apple has no competitor for Intel's highest-tier CPUs. The A14X will be on par with some mid-range i5 CPUs. Does anybody imagine a Mac Pro (desktop) with an ARM CPU?! NO. And no, because there aren't any ARM CPUs for that tier.

For my business/work (at the office) I'm using an iPad Pro, iPhone, and Mac mini, but for other scenarios my DIY Intel PC is much more suitable.

Since 2017 I've been using an Oculus Rift, and for the past 4 months it has been my primary display at home. I use it to mimic three 32-inch displays: inside my Oculus I load a virtual environment with three 4K 32" displays. Macs will never have a CPU/GPU THAT powerful to address a scenario like this. Also, try playing Elite Dangerous in VR on any upcoming Macs. Even if they had support for the Oculus and/or any VR platform, which I don't think will be the case, the upcoming Macs are like comparing a Pentium III 800MHz with a TNT2 GPU to an AMD Athlon 64 4000+ with an Nvidia 6800 Ultra.

No ARM CPUs for the Mac Pro tier? Well, maybe none specifically designed for desktop workstations, but there are plenty of 64-80 core ARM CPUs, with 128-core parts around the corner.

Apple will produce their own SoC & maybe dGPUs with hardware optimizations for Mac machines, but ARM is already available for powerful computers.
 

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
ARM Macs have been in development for probably years now. This isn't something that Tim Cook just decided to do then sent to production. Big guns might be a new Mac Pro desktop. No way they're going to shove an ARM chip into the 13" MBP. That would be a terrible move. The latest rumors that it's going to be a 14" MBP makes the most sense IMO. This is going to be the longest five day wait LOL

Yes, I agree that putting a brand-new CPU into the existing MBP13 design would be "meh", and would lack the impact of a new design. It will either be a new 12" MacBook, which would be a relatively low-ball entrance, or the rumored 14" MBP (or maybe a new 14" MacBook?). I would prefer a new 14", which I think is a more useful size for most people.

I don't think this will be announced on September 15th, and would expect an October announcement for a late-November launch.
 
  • Like
Reactions: russell_314

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
How Apple is going to sell desktops with ARM CPUs is beyond me. If you don't need low power or fanless operation, then a Ryzen 7 or an Intel i7 will be so much faster it's not even funny.

Let's see what Apple produces. I would be very surprised if overall performance was less than current Intel models, and would expect a significant (20-50%) increase in CPU & GPU performance.

You seem to think ARM is inherently inferior - it really isn't, and there are plenty of CPU manufacturers churning out 64-128 core CPUs for servers that outperform Intel and AMD CPUs at lower power and lower cost.

ARM is currently very much a mobile and server technology. Apple is aiming to fill the consumer laptop & desktop gap, and I think they will be successful.
 

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
You are talking about something you have zero understanding of. Everything in your post is blatantly wrong, on both a conceptual and a factual level. For your own and other users' sake, I strongly suggest that you go out and educate yourself on the basics of how modern CPUs work, and specifically on what's special about the architectures of Intel, AMD, Apple, and Cortex CPUs.

I suspect that @TheRealAlex is going to have to adjust his world-view in a few weeks' time....
 