I recently switched from a 10-core 64GB iMac Pro to a 16GB Mac Mini M1. For my mostly-Java software development workflow, the Mini M1 has been a significantly faster and more responsive experience across the board. Builds of our large Java 8-based platform are 30% faster on average, and the user experience is much snappier. I sold the iMac Pro a week or so ago after running both side by side for a couple of weeks.

Amazing...
Yeah I have been thinking of getting rid of my 16" once the new 14" models are available. Way too many problems with Intel and AMD chips inside this one. Love the screen and battery life tho
 
$200 off on Intel Macs at Amazon.

I wonder if this means that M1X is imminent.

 
Everything works with Intel, which is the most important thing. And it runs Windows too, which has many more applications available than OS X.

The M2X 16" MBP that I am considering will most likely be a secondary computer.
 
Yeah I have been thinking of getting rid of my 16" once the new 14" models are available. Way too many problems with Intel and AMD chips inside this one. Love the screen and battery life tho
My guess is the screen on the 14" will be better than that of the 16". And it is very likely the battery life will be much better. The M1 Airs and MBPs run double the time of the Intel models they replace.
 
My CPU normally runs at 2-20% (20% during the day for my trading programs), and it's a 65-watt CPU. It has never seen 100% CPU, though it can hit 70% when starting a program, and only for a few seconds. So I got a low-power desktop chip that isn't used heavily, and typical power consumption is low.

Unfortunately, it's worth taking Intel's TDP numbers with a huge grain of salt. AMD isn't much better, but at least they aren't doing what Intel has done with these 10th gen chips.

Under load, the 10700 and 10700K are very similar, thanks to Turbo Boost. One's rated at a 65W TDP, the other at 125W. Yet they both wound up drawing the same power according to AnandTech, with peaks over 200W on both. So unless your machine is configured to disable Turbo Boost or apply TDP limits (neither is the default), it's drawing a lot more under load than Intel claims in its ratings. To actually see what's going on, you pretty much need something like Intel's Power Gadget tool to show you the real power draw.
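For those on Linux rather than Windows, roughly the same package-energy counters that Power Gadget reads are exposed through the RAPL sysfs interface. Here's a minimal sketch (not Power Gadget itself; it assumes /sys/class/powercap/intel-rapl:0/energy_uj exists and is readable, which may require root, and it ignores counter wraparound):

// Rough sketch: sample Linux's RAPL package-energy counter twice and
// report average CPU package power over the interval.
import java.nio.file.Files;
import java.nio.file.Path;

public class PackagePower {
    private static final Path ENERGY_UJ =
            Path.of("/sys/class/powercap/intel-rapl:0/energy_uj");

    // The counter is cumulative energy in microjoules.
    private static long readMicrojoules() throws Exception {
        return Long.parseLong(Files.readString(ENERGY_UJ).trim());
    }

    public static void main(String[] args) throws Exception {
        final long intervalMs = 1000;
        long before = readMicrojoules();
        Thread.sleep(intervalMs);
        long after = readMicrojoules();
        double watts = (after - before) / 1e6 / (intervalMs / 1000.0);
        System.out.printf("CPU package power: %.1f W%n", watts);
    }
}

Run it while a Cinebench or Prime95 loop is going and the reported watts will usually sit well above the box TDP on these chips.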


But at the end of the day, I was trying to point out that heat as we sense it is more about energy per volume, and fan noise is about the RPM required to provide the CFM needed to dissipate that energy. The point was that the 5600X, despite being a noticeably less power-hungry chip than something like the 10700 when you look at real measurements, is still harder to cool when you shove it in a smaller box with bad airflow.

I'd love to have a 5900X, and I am shopping for one, but they are essentially unicorns right now.

Thankfully the 5900X is overkill for what I do. The 5600X was a lot cheaper and easier to grab when I did. The stuff I do that does need cores, I do on macOS.
 
Under load, the 10700 and 10700K are very similar,
Fascinating, never heard about that. I wonder whether the 10900 and the 10900K are similar in the same way. I always thought Intel had way too many CPU part numbers to be logical. (Just got my first PC with a 10900; it's a nice machine...)
 
Unfortunately, it's worth taking Intel's TDP numbers with a huge grain of salt. AMD isn't much better, but at least they aren't doing what Intel has done with these 10th gen chips.

Under load, the 10700 and 10700K are very similar, thanks to Turbo Boost. One's rated at a 65W TDP, the other at 125W. Yet they both wound up drawing the same power according to AnandTech, with peaks over 200W on both. So unless your machine is configured to disable Turbo Boost or apply TDP limits (neither is the default), it's drawing a lot more under load than Intel claims in its ratings. To actually see what's going on, you pretty much need something like Intel's Power Gadget tool to show you the real power draw.


But at the end of the day, I was trying to point out that heat as we sense it is more about energy per volume, and fan noise is about the RPM required to provide the CFM needed to dissipate that energy. The point was that the 5600X, despite being a noticeably less power-hungry chip than something like the 10700 when you look at real measurements, is still harder to cool when you shove it in a smaller box with bad airflow.

Thankfully the 5900X is overkill for what I do. The 5600X was a lot cheaper and easier to grab when I did. The stuff I do that does need cores, I do on macOS.

I have a tool that measures CPU, GPU, and SSD power draw, along with fan speeds and load percentages. Power draw on the CPU is much less than 65 watts. Part of the reason for getting a powerful CPU with lots of cores is that you can run those cores at low frequencies with overall low power draw. That's one argument for something like a 5900X, where you can take a task and split it across many cores and threads at low frequency.
 
We will see how M1 fares long term. I enjoy mine but there are reports of terabytes being swapped on the SSD, which may lower its lifespan.
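If you want to sanity-check that concern on your own M1, macOS reports the current swap footprint through the vm.swapusage sysctl; a quick sketch is below. Note this only shows swap in use right now, not lifetime SSD writes, which would need a SMART reporting tool.

// Minimal sketch: print macOS's current swap usage by shelling out to
// `sysctl vm.swapusage` (total / used / free swap sizes).
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class SwapUsage {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("sysctl", "vm.swapusage").start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            r.lines().forEach(System.out::println);
        }
        p.waitFor();
    }
}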
 
Fascinating, never heard about that. I wonder whether the 10900 and the 10900K are similar in the same way. I always thought Intel had way too many CPU part numbers to be logical. (Just got my first PC with a 10900; it's a nice machine...)

The 10600 is apparently much better about staying close to TDP under load than the 10700. I haven't seen direct comparisons for the 10900 chips, but if anyone knows, AnandTech probably has it somewhere in their reviews, since they do some of the most comprehensive analysis of power consumption available publicly.

I have a tool that measures CPU, GPU, and SSD power draw, along with fan speeds and load percentages. Power draw on the CPU is much less than 65 watts.

Power draw scales with load, so yes, a lighter load will stay under 65W.

Part of the reason for getting a powerful CPU with lots of cores is that you can run those cores at low frequencies with overall low power draw. That's one argument for something like a 5900X, where you can take a task and split it across many cores and threads at low frequency.

I’m not really sure what scenario you are thinking of where this is true. It also runs counter to the sort of measurements AnandTech produce, and how CPUs have operated in the last 5-ish years.

Under light loads where all cores are active, more cores need more power just to keep the silicon going. Under full load the same is true, only there's more work happening and even more power being drawn, hence why you see peak wattage measurements of 200W for the 10700K vs 140W for the 10600K. At best, you can get similar idle power usage when most of the cores are sleeping and only a couple are needed to keep the lights on, but generally more cores means a little more idle consumption as well.
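Rough per-core math from those peak numbers: 200 W across the 10700K's 8 cores is about 25 W per core, while 140 W across the 10600K's 6 cores is about 23 W per core, so per-core draw is roughly flat and total draw mostly scales with how many cores you light up.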

And at least right now, base clocks are not materially different enough to truly let something like the 10700 claw back the gap compared to a 10600, and Turbo Boost throws a rather ugly wrench into the works as well. The higher core count CPUs are about as efficient per core as the lower core count CPUs these days, so your consumption scales up with the number of cores under load. You'll consume roughly the same total power to complete a specific task with either one.

And I’m not sure why you keep bringing up the 5900X, but under full load, it uses about double the power of the 5600X, with double the cores. So it’s not any more efficient when given longer, heavily multi-threaded tasks to complete. But it will get them done faster if they are multi-threaded enough to use all the cores.
 
Under load, the 10700 and 10700K are very similar, thanks to Turbo Boost. One's rated at a 65W TDP, the other at 125W. Yet they both wound up drawing the same power according to AnandTech, with peaks over 200W on both. So unless your machine is configured to disable Turbo Boost or apply TDP limits (neither is the default), it's drawing a lot more under load than Intel claims in its ratings. To actually see what's going on, you pretty much need something like Intel's Power Gadget tool to show you the real power draw.

I fired up my workload and let it run for a while. Total CPU power ranged from 15-18 watts with occasional spikes to 25 watts. CPU temperatures are normally around 29-31 degrees C under my daily workload and 23-27 degrees C when I'm just reading or browsing the web. I'm sure the massive cooler is part of that, but this 10700 really does run cool and doesn't draw a lot of power; I've never seen it come anywhere near its rated 65 watts. Toss in a YouTube video and it uses a whopping 21 watts.
 
Well, eventually they're going to have to, as Apple will drop support for Intel Macs at some point, perhaps earlier than expected since they will want more adoption of AS Macs.

Most? Most who? A few people on a forum, or YouTube reviewers? Please step into reality: the majority of Apple's real customers are not forum or tech social media junkies. Some, or at a stretch many, but not most. You'd need to show proof of that statement.

The thread title is about aspects of Intel Macs outperforming M1 Macs.

I've heard this from time to time here, but you have to understand that macOS is not the same slimmed-down and dumbed-down system as iOS. It takes a lot more to run macOS.

I hope you're not referring to forum or social media junkies, because they're certainly nowhere near Apple's volume market. Sadly, this forum would love to believe they make up the majority of Apple's customers and that their needs and wants should cover everyone else's.
When Apple does drop support for Intel Macs, there will be much more powerful SoCs than the M1 available. They aren't going to force everyone onto low cost machines.

Perhaps you could cite some reviews recommending the MBP over the MBA. I haven't seen any.

FYI, iOS and macOS are very similar; they are both based on the Darwin core OS. A major difference is memory management: iOS does not swap dirty memory pages out to storage, it just terminates apps when it runs low on RAM, though it does discard clean memory pages. Macs do have greater I/O requirements, and the M1 has been optimized to run translated x86 binaries. That said, I expect the A14X in the next iPad Pro to be very similar in power.

You don't have to follow forums to understand that the MBA is Apple's high-volume product. It's their cheapest laptop, so of course it's their highest-volume product. You can't buy an Intel MBA anymore, so Apple is going to sell a lot of ARM Macs this year. Apple helpfully threw in the M1 Mini to keep developers happy; macOS developers are going to need both an Intel and an ARM Mac for now.
 
When Apple does drop support for Intel Macs, there will be much more powerful SoCs than the M1 available. They aren't going to force everyone onto low cost machines.

Perhaps you could cite some reviews recommending the MBP over the MBA. I haven't seen any.
I never stated anything of the sort in the post you quoted. I said they will eventually have to drop support for Intel, as they will want more adoption of APPLE SILICON Macs. That would mean all Macs, including iMacs and the Mac Pro. I wasn't referring only to the Air. I honestly thought you understood my post.
FYI, iOS and macOS are very similar; they are both based on the Darwin core OS.
Regardless of the similarities, you can't run macOS Final Cut Pro X on iOS.
You don't have to follow forums to understand that the MBA is Apple's high-volume product. It's their cheapest laptop, so of course it's their highest-volume product.
I never said it wasn't.
 
I recently switched from a 10-core 64GB iMac Pro to a 16GB Mac Mini M1. For my mostly-Java software development workflow, the Mini M1 has been a significantly faster and more responsive experience across the board. Builds of our large Java 8-based platform are 30% faster on average, and the user experience is much snappier. I sold the iMac Pro a week or so ago after running both side by side for a couple of weeks.

Amazing...

I had heard that there were OpenJDK builds for the M1. .NET Core is not there yet; they will support ARM with the .NET 6 release. Docker is not production-ready either, and last I checked, Android Studio and its Android emulators are not native yet. I am sure that by the end of this year all of these issues will be resolved.
 
I had heard that there were OpenJDK builds for the M1. .NET Core is not there yet; they will support ARM with the .NET 6 release. Docker is not production-ready either, and last I checked, Android Studio and its Android emulators are not native yet. I am sure that by the end of this year all of these issues will be resolved.

I wish I could say the same about Fidelity Active Trader Pro. My understanding is that it was written in the 1990s using some archaic tool.
 
I wish I could say the same about Fidelity Active Trader Pro. My understanding is that it was written in the 1990s using some archaic tool.
It's a .NET Windows application that requires the latest Windows-specific version of .NET. BTW, .NET wasn't around in the '90s, but it has been quite a few years since writing .NET desktop apps was fashionable.
 
I had heard that there were OpenJDK builds for the M1. .NET Core is not there yet; they will support ARM with the .NET 6 release. Docker is not production-ready either, and last I checked, Android Studio and its Android emulators are not native yet. I am sure that by the end of this year all of these issues will be resolved.
You can get OpenJDK 11 & 15 from Homebrew. You can also download Azul JDK 8, 16 & 17. https://www.azul.com/downloads/zulu-community/
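If you're not sure whether the build you ended up with is a native aarch64 one or an x86_64 one running under Rosetta 2, a quick check (a sketch assuming HotSpot's usual os.arch values) is to ask the JVM itself:

// Minimal sketch: report which architecture the installed JDK was built for.
// On an M1, a native build typically reports "aarch64"; an Intel build run
// under Rosetta 2 reports "x86_64" (or "amd64").
public class JvmArch {
    public static void main(String[] args) {
        String arch = System.getProperty("os.arch");
        String vendor = System.getProperty("java.vendor");
        String version = System.getProperty("java.version");
        System.out.println(vendor + " JDK " + version + ", os.arch=" + arch);
        System.out.println(arch.contains("aarch64")
                ? "Native Apple Silicon build."
                : "x86 build, likely running under Rosetta 2 on an M1.");
    }
}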
 
Well, it's not directly related to Intel vs M1, but we have yet to see many games running natively on M1. I know there are a number of iOS-based games, but only a few of the titles built for macOS/Windows/multi-platform have been ported over so far.

Looks like "just re-compile will do" idea is not really working for gaming.
 
I fired up my workload and let it run for a while. Total CPU power ranged from 15-18 watts with occasional spikes to 25 watts. CPU temperatures are normally around 29-31 degrees C under my daily workload and 23-27 degrees C when I'm just reading or browsing the web. I'm sure the massive cooler is part of that, but this 10700 really does run cool and doesn't draw a lot of power; I've never seen it come anywhere near its rated 65 watts. Toss in a YouTube video and it uses a whopping 21 watts.
Try something intensive like Cinebench or Prime95 and you will see the power draw go WAY past the 65 watts TDP rated by Intel.

3D rendering, video editing, or gaming will see an increase in watts way past the rated TDP.
 
Try something intensive like Cinebench or Prime95 and you will see the power draw go WAY past the 65 watts TDP rated by Intel.

3D rendering, video editing, or gaming will see an increase in watts way past the rated TDP.

Those things are irrelevant for my workload.

I built this system to run specific workloads and two programs in particular: Fidelity Active Trader Pro and Ameritrade Think or Swim. It runs both quite well and uses little power in the process. I said my system runs cool and quiet, and it does. As I wrote, the idea was to run a lot of cores at low frequencies to keep power consumption low and to run cool and quiet.
 
It's a .NET Windows application that requires the latest Windows-specific version of .NET. BTW, .NET wasn't around in the '90s, but it has been quite a few years since writing .NET desktop apps was fashionable.

That doesn't mean that ATP wasn't written in the 1990s though.
 
My guess is the screen on the 14" will be better than that of the 16". And it is very likely the battery life will be much better. The M1 Airs and MBPs run double the time of the Intel models they replace.
Yeah, plus all the apps I'm running are either already Apple Silicon native (Cisco, Homebrew and so on) or work well with Rosetta (HANA Studio, SAP GUI).
 
That's never going to be better on an M1. The best you can hope for is "adequate." It just can't happen without an x86_64 processor (barring some kind of peripheral that contains an x86 processor, like the old Amiga bridgeboard...).

It's not in the adequate category yet, but I have hope...
A fellow Amiga man.

Were you into the demo scene?
 
Looks like "just re-compile will do" idea is not really working for gaming.
It was never going to work for sizeable programs of any kind. There's likely a lot of old 32-bit and obscure libraries and code in large programs that has never been touched over the years, most likely because the arcane wizard of programming who wrote it left the company, or because management kept pushing new features over basic functionality.

The gaming industry in particular has serious issues with crunch culture and the push for yearly releases. Supporting a new arch that, while promising, only represents a minority of the Mac lineup right now is probably not a priority compared to juggernauts like console and PC gaming.

The M1 has us all excited, but let's bring ourselves back down to Earth and realize that only the low-end Macs have the new architecture, and that it will take a while before we get native programs of all kinds.
 
It was never going to work for sizeable programs of any kind. There's likely a lot of old 32-bit and obscure libraries and code in large programs that has never been touched over the years, most likely because the arcane wizard of programming who wrote it left the company, or because management kept pushing new features over basic functionality.

The gaming industry in particular has serious issues with crunch culture and the push for yearly releases. Supporting a new arch that, while promising, only represents a minority of the Mac lineup right now is probably not a priority compared to juggernauts like console and PC gaming.

The M1 has us all excited, but let's bring ourselves back down to Earth and realize that only the low-end Macs have the new architecture, and that it will take a while before we get native programs of all kinds.

What will change my mind on gaming (I'm not a gamer):

- GPU shortages persist until 2023
- Apple is able to build compelling GPUs, either integrated or discrete
- Apple sells systems at a very good price/performance point
- Performance/watt remains excellent
 