Correct me if I'm wrong, but my understanding is that only 32-bit Windows x86 apps can run in emulation mode on a "Windows 10 for ARM" machine; no 64-bit Windows x86 apps will work.
Even though the latest ARM CPU may itself be listed as 64-bit.
Device drivers may also need to be 32-bit only, although I'm not certain about that either.

Correct. Right now it will only recompile 32-bit x86 apps for ARM automatically. Dunno what the reasoning is behind that. I'm not sure if they'll extend that to 64-bit or just keep working at getting everyone to add ARM variants of their apps. I'm always confused about where they see the future of Win32 going. The Universal Windows Platform seems like it's their big push onto ARM, so maybe they think things will just solve themselves as devs move over. But they've also made multiple runs at getting people off Win32 before.

Some workloads run better with a capable GPU and CPU working in conjunction, not just a GPU.

I don't totally disagree. But this again goes back to the fact that if Intel ships a CPU that's 30% slower, it doesn't work in their favor, even if they have some fancy optimization that regains 5%. They're still not digging themselves out of that hole.

Intel also doesn't have a unified address space between the CPU and their integrated GPU, which is a very significant problem. If you are dealing with a unified GPU/CPU workflow and are working off integrated graphics, Apple will absolutely slaughter Intel on performance. That's really not a good look for Intel on something like the 13" MacBook Pro. On Macs that don't have a discrete GPU, Intel could be in a lot of trouble.

(I do have benchmarks, none that I can post, but really, it's a problem and the results aren't good for Intel even on a full size notebook versus an iPhone.)
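To make the unified-address-space point concrete, here's a minimal Metal sketch (my own illustration, not from those benchmarks and not anything out of Apple's or Intel's driver stacks): with a shared-storage buffer the CPU and GPU work on the same allocation, so a compute pass needs no staging copies before or after, which is exactly the overhead a split address space forces on you.

```swift
import Metal

// Minimal sketch: a buffer both CPU and GPU can touch directly on
// unified-memory hardware. No blit/copy step before or after GPU work.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal-capable device found")
}

let samples = [Float](repeating: 1.0, count: 1_000_000)

// .storageModeShared places the data in memory visible to both processors.
let buffer = device.makeBuffer(bytes: samples,
                               length: samples.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// ... a compute encoder would run a kernel against `buffer` here ...

// The CPU reads the GPU's results in place, with no readback copy.
let results = buffer.contents().bindMemory(to: Float.self, capacity: samples.count)
print(results[0])
```

On hardware without that shared view you'd typically use a managed or private buffer plus explicit synchronization, and that copy traffic is where the gap shows up.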

Are you referring to Sunny Cove? Unfortunately, I haven't read into this at all apart from headlines and some gossip on tech blogs and forums. Though anything we see in late '19 or mid '20 may just be low wattage stuff.

It looks like Intel might be thinking about it in Sunny Cove.

As far as moving on, it's too early to tell, as you pointed out. I'm on an older Intel HEDT for a home workstation and I've been wanting to upgrade for two years now. The truth is, each time I read rumors (which end up being about 80% true), I keep putting it off. With DDR4 prices set to drop this year and the arrival of DDR5 in 2021-2022, we're going to see some great stuff from both Intel and AMD.

The problem is Intel is only talking about catching up to Apple right now, and not lapping them.

If Intel had launched 10 nm on time, we wouldn't be here. As it is now, they're talking about launching their 2016 CPUs in 2019 and everyone is supposed to be impressed by that. If they launched the CPUs originally scheduled for 2019 in 2019, that might actually put them back on the right course with Apple. But that's not what's happening. And now that Apple is in the CPU business, they're going "we can launch all our CPUs on time, why can't Intel?"

Meanwhile, Apple is looking at the MacBook and MacBook Air and thinking that they already make a CPU way better than the warmed-over five-year-old CPU that Intel is selling them. Why wouldn't they start moving to ARM?

If Intel keeps having issues, it’s only a matter of time before something happens like an ARM Mac Mini being faster than the entry level Mac Pros. And if that happens there are going to be a bunch of questions about Intel on the high end that Apple can’t ignore, even with R&D costs. Apple’s ARM Macs could end up eating their x86 ones if Intel keeps stalling.

This again ignores AMD, which seems like a decent alternative and seems to be staying current with Apple. They also have Infinity Fabric, which looks like a nice counterpart to Apple's unified GPU/CPU address space. So both companies are in a similar place with CPU design right now.
 
Yeah, I'm not disagreeing with you there. I just can't imagine Apple spending that kind of money, even though they could buy a conglomerate like Disney a few times over and still have money to spare. It's a huge feat that may turn out to be fruitless in a few years. I forget the specifics, but it takes several years for a processor design to go from drawing board to pre-production units.

The low end for now isn't great with Ryzen or Intel offerings. Ryzen suffers from some issues at the low-wattage end, and Intel's Core architecture is just too old. It's likely why they hired Jim Keller. Now, the thing with AMD is that Keller helped them out and set them on a path. AMD's CES event left many speculating that AMD has a lot of tricks up its sleeve to screw with Intel for the next 2-4 years.

But if you're as old as I am, then you know to never get too excited over these types of events and rumors. Let the younger kids get bummed out.
Yeah, AMD needs to show Apple they'll have consistent developments year over year for a predetermined time frame, say 7 years, before they begin switching or possibly offering two CPU vendors when people pick out a processor type and speed on the order page. This isn't the mid-2000s, when Intel could bribe OEMs on the hush-hush and expect it not to get out. Verified leaks pay big money to those who speak to the media.

I can see Threadripper and Epyc being options once you get past the really mid-range but pricey Xeons in the future. The reality is, while Intel's 28-core Xeon is impressive, I'm sure AMD can come up with something better. I'd pay close attention to Intel's actions in Q2-Q4 as AMD begins to release new product lines. Intel's moves can be very telling of how afraid they are of AMD gaining market share.

Well, Apple could always allow iOS to run natively on a regular Intel Mac. Apple has their own "skunkworks" division inside HQ. Who knows what kind of Frankenstein experiments their top engineers come up with. Let's not forget one of those engineers was playing around with Apple's OS on Intel-based systems way back in the very early 2000s.

I suspect Apple brought up macOS on the A series as far back as 2010 with the A4. Keep in mind, this was likely buggy and probably without the full user interface initially. But I suspect that by the time they introduced the A7, their first 64-bit SoC, which Phil Schiller described as having desktop-class performance, that was the first hint to Intel: we have macOS running on this, be warned.

Internally, I think Apple was weighing two strategies: release macOS on the A series with the expectation that traditional desktop app developers would need to recompile for it (Microsoft, Adobe, etc.), or wait until iPad apps become mature enough to meet most users' needs. Hence the WWDC 2018 UIKit announcement and Photoshop coming to the iPad. Office for iOS meets most users' needs; sure, if you need advanced features like Styles, Mail Merge, or Tables, this is probably not gonna be the device for you. So I think Apple's first candidate for macOS on the A series will be the MacBook.

With apps like LumaFusion demonstrating that you can do pro video editing on an iPad Pro, Apple's idea for macOS on the A series is that they don't want any recompiles or ports of traditional desktop apps. They actually want iOS apps, but they will give you the traditional macOS experience.

Going back to the MacBook: how is Apple gonna make it enticing for you to pick one up?

- Traditional, familiar MacBook clamshell design.
- It will be the first Mac to have Face ID.
- It will hit the $999 sweet spot.
- 8-core A series SoC.
- 16 GB of LPDDR4 RAM.
- It will have a touch screen, meaning users asking for a touch screen Mac will have one for the first time, with an optimized library of over a million apps.
 
I hope they'll allow Nvidia Quadro/GeForce (both single and SLI), dual Xeon CPUs, RAID 10 support, higher RAM configurations…like up to 2 TB, more inputs (including legacy USB Type-A and Thunderbolt 2…but I won't hold my breath), and better ventilation/cooling.

…oh could they please add a backlit keyboard? The laptops have them, why not add it?
 

Attachment: Skjermbilde 2019-01-29 kl. 12.27.12.png
The Mac Pro 6,1 was screwed (pun intended) from the beginning: https://www.nytimes.com/2019/01/28/technology/iphones-apple-china-made.html

It was an interesting read, if a bit screwy (hah) in some of its assertions.

Mostly interesting in that the issue with the 6,1 ship times early on wasn't really the design but the local assembly. I wonder when we would have gotten the product if it had been a bog-standard Chinese job.

That said, Apple, like everyone else, is looking at manufacturing outside China. Low-volume products like the Mac Pro, and Macs in general, would seem like the obvious choices to produce in another country.
 
With apps like LumaFusion demonstrating that you can do pro video editing on an iPad Pro, Apple's idea for macOS on the A series is that they don't want any recompiles or ports of traditional desktop apps. They actually want iOS apps, but they will give you the traditional macOS experience.

Going back to the MacBook: how is Apple gonna make it enticing for you to pick one up?

- Traditional, familiar MacBook clamshell design.
- It will be the first Mac to have Face ID.
- It will hit the $999 sweet spot.
- 8-core A series SoC.
- 16 GB of LPDDR4 RAM.
- It will have a touch screen, meaning users asking for a touch screen Mac will have one for the first time, with an optimized library of over a million apps.

You lost me at i.

As for a million iApps - does OS X still come with widgets? God, I hated those...
 
It doesn't work quite like how you think it does. Windows 10 ARM automatically ports/recompiles x86 applications to ARM for you on the fly. They become ARM applications when you run them.
No, they become ARM instructions recompiled on the fly from the x64 applications' binaries, in the same manner that Java is JIT compiled into native object code.

So in the end you get full native performance.
After the code has been JIT compiled, yes. The initial execution of the code will be slow while the JIT compiler converts it into ARM native object code.
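For anyone following along, here's a toy sketch of that translate-once model (purely illustrative Swift, hypothetical names, nothing to do with Microsoft's actual translator): the first time a block of guest code is hit it gets translated and cached, and every run after that executes the cached native code, so only the first execution pays the cost.

```swift
// Toy model of block-level binary translation with a cache.
typealias X86Block = [UInt8]        // raw guest instructions (stand-in)
typealias ARMBlock = () -> Void     // stand-in for emitted native code

var blockCache: [Int: ARMBlock] = [:]   // keyed by guest address

func run(block: X86Block, at guestAddress: Int) {
    if let native = blockCache[guestAddress] {
        native()                        // fast path: already translated
        return
    }
    // Slow path: `block` would be decoded and translated here (the expensive
    // part), then cached so the next call takes the fast path.
    let native: ARMBlock = { /* translated ARM code would execute here */ }
    blockCache[guestAddress] = native
    native()
}
```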

That's quite different from the Surface/Surface 2 (which didn't run x86 apps at all), and it preserves performance. Qualcomm has a pretty good demo (embedded video):

Apple could do something similar, but I wouldn't want to speculate too much. In theory, though, if Apple did the same thing, they wouldn't take an emulation hit from running x86 packages.

I didn't bother to watch this video as I don't want to spend 17 minutes trying to determine what it is you're attempting to show with it. If there's a specific part of the video which you feel is important to watch let me know what the time code is and what I should be looking for.

So let's be clear here. x86 is not a problem. Intel is a problem. There's nothing wrong with x86. There is a whole lot wrong with Intel.

Right now you are suggesting that even though Apple has a $50 ARM CPU that absolutely destroys Intel's $50 CPU in performance, they should ignore that and continue using Intel.
You're extrapolating from a low-power, low-cost version of one processor type to a low-power, low-cost version of another processor type. These processors are used in situations where battery life, and not performance, is the key design goal. I do not see this translating into higher-performing components. History has demonstrated it's difficult to outperform x64. A number of companies stopped manufacturing their own processors due to the extremely high cost of developing high-performance processors; it made no business sense to continue. I do not see why that would change with ARM.

Intel could ship an x86 CPU that's competitive. They don't. That's on them. It's not on Apple to keep buying CPUs that are legitimately bad products.

Xeon CPUs are not bad products right now, but that's probably because Intel has a monopoly there. AMD is starting to catch up. If Intel did keep having issues, and Apple had a workstation-level CPU up their sleeve, I'd think the above logic would apply.
I don't, and history isn't on your side.

Let me flip my above logic for software.

If Apple sold a Mac Pro that was twice as fast as Intel workstations, would software vendors ignore that? No, they'd be stupid to. If Apple ever produced the goods, I don't see why vendors wouldn't port to it. We'd be stupid not to buy that workstation. In the end, if Intel keeps screwing up, the momentum behind an ARM transition would power it through. You're saying that if Apple and/or Windows vendors both start shipping faster ARM hardware, everyone would ignore that, and I really don't believe that would be true.

Also ARM is fairly close to x86. I think you're overestimating the amount of porting work.

And again, Windows is making the same transition. Intel is the drunk uncle that set the house on fire, everyone is running for the exits, and Apple doesn't want to be the last one out the door. Even the Windows only devs are going to have to port. Maybe Intel can put the fires out, maybe they can't. But they've already lost the low end. Windows is already ported and shipping on ARM, macOS is moving.

If you really want to get a feeling which way the wind is blowing, Google has already started porting Windows Chrome to ARM. It's not a pro app, but it's usually a signifier of where vendors are going because it's such a common app.
This is a hypothetical and therefore isn't worth considering. We could play the what-if game all day long. It's my opinion this hypothetical will never come to be, and therefore I am not going to discuss it.

This whole "Apple is going to go with ARM" argument just screams of anti-x64 bias. There are people who feel the x64 is obsolete and newer, better designs are the way of the future. History has shown x64 to be very capable and extremely difficult to beat. Perhaps one day it'll happen. It's my opinion the case for ARM isn't there (unless you're looking at the low end, portable devices).
 
There are two different cases of MacOS on ARM...

One is the MacBook/MacBook Air, possibly along with models of the Mini and/or 21" iMac. 8 Vortex cores from the iPhone (or their successors) would be enough to compete with the low-power 2-4 core Intel CPUs. These machines rarely if ever run Boot Camp or Parallels. They also don't often run huge, messy software. This is easy - Apple could drop this machine pretty much any time.

The second case is the iMac Pro and upcoming Mac Pro, reaching out to include the 15" MacBook Pro and high-end models of the 27" iMac. These machines would need 32, 64, or even 128 or more (in the case of a theoretical high-end Mac Pro configuration) Vortex cores to reach the speeds they need. Nobody wants to program for a 128-core machine - many tasks just don't work that way! They all run huge, messy software, and a not insignificant number have Parallels or Boot Camp installed. Of course, Apple could develop a new, higher-performance (and higher-power) ARM core, but that's a lot of work, and it still doesn't solve the Parallels/Boot Camp problem.
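As a back-of-the-envelope illustration of why many tasks don't scale that way (this is just Amdahl's law, not a benchmark of any real chip): even a workload that's 95% parallel tops out around 20x no matter how many cores you add, so piling on Vortex cores hits diminishing returns fast.

```swift
// Amdahl's law: speedup = 1 / ((1 - p) + p / n), with p = parallel fraction.
func speedup(parallelFraction p: Double, cores n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

for cores in [8.0, 32.0, 128.0] {
    print("\(Int(cores)) cores -> \(speedup(parallelFraction: 0.95, cores: cores))x")
}
// Roughly 5.9x at 8 cores, 12.5x at 32, 17.4x at 128 for a 95%-parallel task.
```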

If I were Apple, and I wanted to pursue the ARM idea, here's what I'd do... It would take clear differentiation between the two lines, and I don't know how to do that from a marketing standpoint. At the very least, there should be enough redesigns that there aren't lookalike Macs that are sometimes Intel and sometimes ARM.

Release a really thin-and-light ARM laptop for a start. It can essentially be a repurposed iPad Pro with a built-in keyboard. Market it as having access to both the Mac and iOS app stores (it's easy to get iOS apps running - they may even run without a recompile). Make it clear that it runs Mac apps only from the Mac app store (where getting the correct binary is easy). If it's a success (and it should be - it'll be thin, light, have great battery life, and run what many people need), release some more Macs like it, making sure they're visually distinctive. What you don't want is lookalike Minis, some of which run iOS apps as well as Mac apps, while others are compatible with Parallels.

Eventually, there will be a range of ARM Macs that includes small laptops, Minis and 21" iMacs. There may be a big-screen model named something like Mac TV, and intended for living rooms. They range up to essentially a "double iPad Pro" - twice as many cores as you can get in an iPad - but they all use the "big" core from the iPhone and iPad of the same year.

Meanwhile, continue the Intel lineup, all with the word Pro in their name. Anything with the MacBook Pro brand is Intel (yes, the 13" would be a candidate for ARM, but eliminate confusion by not going there). All 27" iMacs are rebranded iMac Pro (and the lowest-end models are eliminated), and they're all Intel. The Mac Pro is of course Intel, and there may be a Mac Pro Mini.

It's relatively easy to figure out - anything that says Pro is Intel, runs Mac apps from anywhere (including Parallels and Boot Camp), and doesn't run iOS apps. Anything that doesn't say Pro is ARM, uses the Mac App Store, and runs iOS apps too. Of course, Apple would have to call the iPad Pro something else!
 
You lost me at i.

As for a million iApps - does OS X still come with widgets? God, I hated those...
How many apps are on the App Store? My point is, Apple is using the A series to wean itself off not only Intel but also the old app development model. Swift and possibly Xcode are the only two things likely to still be around on macOS on the A series after a few years.
 
Hi. I've been reading the thread and understand some of the technical stuff, but most goes over my head. I have a 12-core 5,1 Mac Pro. I am wondering if somebody could be kind enough to explain in simple terms how a 12-core chip today is faster than what I already have? I'm looking to upgrade at some point. Also, do you think there will be classes of Mac Pro? What I mean is, if it is modular, could there be one range with a Blu-ray drive and video capture cards for those who work with video, another class with specific stuff for audio, and yet a third type for a basic Mac Pro that can do a little of both?
 
No, they become ARM instructions recompiled on the fly from the x64 applications' binaries, in the same manner that Java is JIT compiled into native object code.

...

After the code has been JIT compiled, yes. The initial execution of the code will be slow while the JIT compiler converts it into ARM native object code.

But it's caching the translated code between runs, which means the hit applies to the first execution and only the first execution. That's a one-time cost to get native performance, which isn't bad.

Additionally, it's caching the JIT translations per module. So once a library is translated for one app, the translation is reused by every other app on the system.
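Rough sketch of why that makes the hit per-module rather than per-app (hypothetical paths and names here, not how the real Windows-on-ARM cache is actually laid out): translated images get written to a shared on-disk cache keyed by the module's identity, so the second app that loads the same DLL just reads the existing translation.

```swift
import Foundation

// Illustrative shared translation cache keyed by a module hash.
struct TranslationCache {
    // Placeholder location; a real cache would live in a system directory.
    let directory = URL(fileURLWithPath: "/tmp/translation-cache")

    func cachedTranslation(forModuleHash hash: String) -> Data? {
        try? Data(contentsOf: directory.appendingPathComponent(hash))
    }

    func store(_ translated: Data, forModuleHash hash: String) {
        try? FileManager.default.createDirectory(at: directory,
                                                 withIntermediateDirectories: true)
        try? translated.write(to: directory.appendingPathComponent(hash))
    }
}
```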

You're extrapolating from a low-power, low-cost version of one processor type to a low-power, low-cost version of another processor type. These processors are used in situations where battery life, and not performance, is the key design goal. I do not see this translating into higher-performing components.

I don't disagree, but I also don't think this is helping Intel's case. Apple built a CPU targeted at battery life, and it's still faster than what Intel ships in the same class. How does that make Intel look better?

History has demonstrated it's difficult to outperform x64.

Again, Apple is currently outperforming Intel with CPUs in the same weight class. This is not theoretical; it's happening right now.

That doesn't directly translate to higher-end processors, but in the 13" space, Apple is currently the winner.

And Apple's integrated graphics architecture is clearly hands down superior to whatever Intel is doing at any CPU size.

This whole "Apple is going to go with ARM" argument just screams of anti-x64 bias. There are people who feel the x64 is obsolete and newer, better designs are the way of the future. History has shown x64 to be very capable and extremely difficult to beat. Perhaps one day it'll happen. It's my opinion the case for ARM isn't there (unless you're looking at the low end, portable devices).

I'm not saying x86 is obsolete. What I am saying is that Apple ships a faster CPU in the low-end notebook class than Intel does. You're trying to argue in circles around that with history and whatever else. I'm talking about where we are factually today.

We didn't have to be here. There is nothing in x86 that makes it unable to compete. But we are where we are because Intel dropped the ball repeatedly. I'm not making an instruction set argument or preferring one instruction set over another. What I'm saying is that where Apple competes with Intel, they are currently outperforming Intel.

Long term, does it give Apple an opening into the higher end? I think Apple could run a split platform indefinitely, especially if they teamed up with AMD. But Intel's performance on higher-end chips has stalled too. They just don't really have strong competition there.
How many apps are on the App Store? My point is, Apple is using the A series to wean itself off not only Intel but also the old app development model. Swift and possibly Xcode are the only two things likely to still be around on macOS on the A series after a few years.

There have already been pretty significant changes to Mac app development in the last few years that look like a clearing of the decks ahead of ARM. And Marzipan will give them a source of existing ARM apps. But I'd still be surprised if Apple goes App Store only. Apple has done some work on outside-of-store distribution in the past few releases that makes it seem like it will stick around, just getting a little more stringent.
 
But it's caching the translated code between runs, which means the hit applies to the first execution and only the first execution. That's a one-time cost to get native performance, which isn't bad.

Additionally, it's caching the JIT translations per module. So once a library is translated for one app, the translation is reused by every other app on the system.
Isn't this what I said?

I don't disagree, but I also don't think this is helping Intel's case. Apple built a CPU targeted at battery life, and it's still faster than what Intel ships in the same class. How does that make Intel look better?
Apple built a CPU for portable devices where battery life is the priority (i.e. the iPhone) and then started to incorporate it into other portable devices where battery life is the priority (iPads). The fact that this CPU may be used in other portable devices where battery life is the priority wouldn't surprise me.

Having said that, the Macintosh line doesn't need that extreme battery life if it comes at the cost of performance. While a MacBook may benefit, creating such a system now means two different code bases: one for x64 and one for ARM. While recompiling may be simple, supporting two different code bases requires more than recompiling. One needs to test and support each code base. As one who does this porting, I would hope you'd know this.

Again, Apple is currently outperforming Intel with CPUs in the same weight class. This is not theoretical; it's happening right now.
Apple is doing so on their iPhone and iPad products. I am not aware of any Macintosh product using ARM. At least not one outside of Apple.

That doesn't directly translate to higher-end processors, but in the 13" space, Apple is currently the winner.

And Apple's integrated graphics architecture is clearly hands down superior to whatever Intel is doing at any CPU size.
Is it? Which 13" Macintosh is winning?


I'm not saying x86 is obsolete. What I am saying is that Apple ships a faster CPU in the low-end notebook class than Intel does. You're trying to argue in circles around that with history and whatever else. I'm talking about where we are factually today.
You certainly come across that way.

We didn't have to be here. There is nothing in x86 that makes it unable to compete. But we are where we are because Intel dropped the ball repeatedly. I'm not making an instruction set argument or preferring one instruction set over another. What I'm saying is that where Apple competes with Intel, they are currently outperforming Intel.
Aside from iDevices, where is Apple competing with Intel?
 
Having said that, the Macintosh line doesn't need that extreme battery life if it comes at the cost of performance. While a MacBook may benefit, creating such a system now means two different code bases: one for x64 and one for ARM. While recompiling may be simple, supporting two different code bases requires more than recompiling.

I'm again confused about where "two different code bases" is coming from. ARM and Intel builds are generally derived from a single code base. PowerPC and Intel builds were derived from the same code base back in the day, even at the big boys like Adobe. Xcode doesn't even really support split code bases for different CPUs. So I'm unclear why everyone thinks this is the practice.
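As a minimal sketch of the single-code-base point (illustrative only, not anyone's actual project): the same Swift sources build for both architectures, with the rare architecture-specific bit gated by a compile-time check, and the build tools stitch the resulting slices into one fat binary.

```swift
// One source file, compiled once per architecture slice; only genuinely
// architecture-specific code needs a conditional.
func currentArchitecture() -> String {
    #if arch(arm64)
    return "arm64"
    #elseif arch(x86_64)
    return "x86_64"
    #else
    return "other"
    #endif
}

print("Running the same code base on \(currentArchitecture())")
```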

One needs to test and support each code base. As one who does this porting, I would hope you'd know this.

Testing on ARM is more of a thing. But I don't think that's avoidable. Apple is mostly going to force Xcode projects to compile on ARM. And any emulation would force ARM testing anyway. You're acting like software vendors can avoid something that I don't think is avoidable.

Also, if ARM provides better performance vs Intel at the same size class, I don't know why vendors would avoid ARM. "We're not going to support faster CPUs" is not really an argument customers like hearing from vendors.

The other thing to watch for is Apple is already moving to cut things on Intel so that people can't even stay on Intel cleanly without changes. Stuff like OpenGL is going away. Apple's going to force everyone over the bridge by setting the Intel side on fire.

There is no future where developers can keep shipping the same app they've always shipped and not have to change. Intel is not a safe harbor on the Mac platform.

Is it? Which 13" Macintosh is winning?

The iPad Pro is outperforming Intel CPUs in the 13" class. The iPad Pro is faster than both the MacBook and the MacBook Air. And yes, the iPad Pro has TDP issues, but if you stuck a fan on it like the one the Intel CPU in the MacBook Air has, it would do fine. The TDP issues on the iPad really have to do with Apple's love of no cooling on tablets. The overall TDP of the A series is lower than Intel's.

The OS it runs is secondary, you know that. You're setting up a straw man here. You can't say "Oh, that's an iPad, so benchmarks against Intel don't count because iPad." It would be absurd.

For all the talk about instruction set bias it seems like you have a pretty strong bias towards x86 here.
 
We'll have to disagree on the part which I've removed.
The iPad Pro is outperforming Intel CPUs in the 13" class. The iPad Pro is faster than both the MacBook and the MacBook Air. And yes, the iPad Pro has TDP issues, but if you stuck a fan on it like the one the Intel CPU in the MacBook Air has, it would do fine. The TDP issues on the iPad really have to do with Apple's love of no cooling on tablets. The overall TDP of the A series is lower than Intel's.
Outperforming it doing what? Running an optimized, cut-down version of OS X?
 
Again I ask you: Outperforming it doing what?

You're really trying to argue your way out this one by going "What are industry benchmarks? What do they mean? Can we trust them?" Throwing all benchmarking under the bus is a bit desperate.

Look, if you have some algorithm or capability you think Intel is better at, then just come out with it. Stop stalling without saying anything. I know you know that there are tons of benchmarks out there already comparing Intel and the A series and you're wasting my time by pretending you don't know what I'm talking about.
 
You're really trying to argue your way out this one by going "What are industry benchmarks? What do they mean? Can we trust them?" Throwing all benchmarking under the bus is a bit desperate.
I didn't say anything about benchmarks. You made the statement:

The iPad Pro is outperforming Intel CPUs in the 13" class.​

To which I asked:

Outperforming it doing what?​

Nowhere in that question did I say anything about benchmarks. You've spent more time dodging the question than answering it.
 
I didn't say anything about benchmarks. You made the statement:

The iPad Pro is outperforming Intel CPUs in the 13" class.​

To which I asked:

Outperforming it doing what?​

Nowhere in that question did I say anything about benchmarks. You've spent more time dodging the question than answering it.

Let's fast-forward this a bit. Here are some A12X benchmarks, so we don't have to do the whole "where could I possibly find benchmarks, I don't know how to search for things, how does one even compare CPUs" routine.

https://venturebeat.com/2018/11/01/...s-rival-macbook-pros-with-intel-core-i7-cpus/

We can skip the parts where it's not faster than the 15" MacBook Pro's i7 in multicore (we're comparing against 13" MacBook Air and MacBook), the parts where the MacBook Pro is poorly cooled (the iPad is cooled much worse) and skip straight to you actually saying what you think is special about an Intel CPU that should discount these benchmarks.

I'm not even getting to the Pandora's box of "Intel's graphics suck" because that does touch the driver stack and oh Lordy I haven't had enough coffee to try to pull that stalling apart.
 
Is this the only data you have? That ARM is faster than x64 when it comes to running Geekbench?

Again, I am asking you if you have any data that disagrees. You keep saying "well, not that, I don't like that" and I'm asking "then what?"

I agree Geekbench can be iffy, but regardless of what you feel about Geekbench, the 2018 13" Air getting so very pummeled by the A12X is really hard to explain away with "well Geekbench." Especially when the iPad has no fan and the MacBook Air does.

(Here's the 2018 Air score for comparison, https://browser.geekbench.com/macs/437, and a direct link to the iPad Pro: https://browser.geekbench.com/v4/cpu/10703607. That's a heck of a gap for "well, Geekbench.")

(I'm also not suggesting Apple has headroom to add more speed if they added a fan to the iPad. Rather I think it would get A12X running at load continuously, and given the benchmarks, they don't need the extra speed.)
 
Again, I am asking you if you have any data that disagrees. You keep saying "well, not that, I don't like that" and I'm asking "then what?"
The burden of proof is not on my shoulders. So again I ask: Is this the only data you have? That ARM is faster than x64 when it comes to running Geekbench?

I agree Geekbench can be iffy, but regardless of what you feel about Geekbench, the 2018 13" Air getting so very pummeled by the A12X is really hard to explain away with "well Geekbench." Especially when the iPad has no fan and the MacBook Air does
If you feel Geekbench is iffy then why are you holding it up as evidence to support your position?
 
The burden of proof is not on my shoulders. So again I ask: Is this the only data you have? That ARM is faster than x64 when it comes to running Geekbench?

I will gladly look at your alternative evidence that comparable x64 CPUs are faster than Apple CPUs.

If you don't want to do that, you're welcome to stop replying to me.

Let me put it this way: What benchmark do you use to compare Intel CPUs against each other? What benchmark do you think is fair?

If you feel Geekbench is iffy then why are you holding it up as evidence to support your position?

Why are you stalling?
 
I will gladly look at your alternative evidence that comparable x64 CPUs are faster than Apple CPUs.

If you don't want to do that, you're welcome to stop replying to me.

Let me put it this way: What benchmark do you use to compare Intel CPUs against each other? What benchmark do you think is fair?
Again, the burden of proof is on your shoulders, not mine.

Why are you stalling?
Uh, that would be you. I have repeatedly asked if you have any data, other than the data you yourself admit is questionable, to support your assertion that:

The iPad Pro is outperforming Intel CPUs in the 13" class.​

Given the effort you've put into dodging this question, and until/unless you provide more, I'll have to assume no, you do not.
 