
leman

macrumors Core
Original poster
Oct 14, 2008
19,521
19,678
The shift in here to "omg power savings" is hilarious. Last week it was about being king of the hill, but now we're back to stroking our "power savings."

It has always been about realistic performance in a reasonable package. In this regard Apple is definitely the king of the hill, and they will stay that way for the foreseeable future. It's really laughable that a mobile Apple CPU drawing under 40 W basically matches the latest x86 desktop behemoths in workloads such as scientific compute and software development.

But hey, Intel is faster in Cinebench! That's gonna count for something, right?
 

thenewperson

macrumors 6502a
Mar 27, 2011
992
912
It has always been about realistic performance in a reasonable package. In this regard Apple is definitely the king of the hill, and they will stay that way for the foreseeable future. It's really laughable that a mobile Apple CPU drawing under 40 W basically matches the latest x86 desktop behemoths in workloads such as scientific compute and software development.

But hey, Intel is faster in Cinebench! That's gonna count for something, right?
The extent to which (mostly) PC users lack perspective on this is saddening. And the myopia around desktops vs. laptops is pathetic.
 

GuruZac

macrumors 68040
Sep 9, 2015
3,748
11,733
⛰️🏕️🏔️
It has always been about realistic performance in a reasonable package. In this regard Apple is definitely the king of the hill, and they will stay that way for the foreseeable future. It's really laughable that a mobile Apple CPU drawing under 40 W basically matches the latest x86 desktop behemoths in workloads such as scientific compute and software development.

But hey, Intel is faster in Cinebench! That's gonna count for something, right?
Exactly. And let's be honest: the smaller the laptop, the worse it gets for Intel. They have nothing that can compete with the M1 in my MacBook Air.
 

Miha_v

macrumors regular
May 18, 2018
193
385
It has always been about realistic performance in a reasonable package. In this regard Apple is definitely the king of the hill, and they will stay that way for the foreseeable future. It's really laughable that a mobile Apple CPU drawing under 40 W basically matches the latest x86 desktop behemoths in workloads such as scientific compute and software development.

But hey, Intel is faster in Cinebench! That's gonna count for something, right?
Well, if you're a 3D artist (Cinebench is very representative here), this can definitely be a deal-breaker. I know a couple of motion graphics artists who, ten years ago, all worked on Mac Pro machines but have since switched to PCs, because top performance (and, more importantly, price/performance) on a custom-built machine is just way better. Not to mention the inclusion of powerful GeForce cards (an option unavailable on Macs), something crucial for a fast workflow with today's GPU-based renderers.

It all depends on your needs. For most, the Mac is absolutely the best all-around package (myself included).
Apple silicon, with the latest updates, might start luring raw-power-hungry professionals back in...
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,764
Different architectures, and we're comparing the two anyway, huh... sounds legit. (Yeah, I know Geekbench exists, but no normal person runs benchmarks all day on end.) :confused:

Yes, the hardware is there now, and I have no doubt its performance will leap a few more times before hitting a plateau again. The issue now lies on the software side. Everyone knows x86 has millions if not billions of software titles to choose from, thanks to decades of development. It's been a year since Apple silicon debuted; how much Mac software has been converted to support Apple silicon at this point, let alone properly takes advantage of the M1's performance?

On the other hand, Apple is designing their new MacBook Pro with one goal in mind: the creative industry. As of now the MacBook Pro can still be used for general purposes, but after another year or so I am not so sure. Apple may end up building their Macs so specialised that everyone else just gets a MacBook Air and calls it a day, or keeps using their PC, and all of that hardware power goes largely unused thanks to its niche nature. Remember, the M1 Pro/Max MacBooks can sell very well because there are LOTS of people desperately waiting for a worthwhile update.

Thanks to Intel being led by an engineer again, Alder Lake might just be the start, similar to M1. I am not saying Intel will catch up to Apple and take the performance crown in a year or two, but this leadership change gives me hope that Intel might once again prove themselves a worthy competitor, challenging Apple silicon on performance while also optimising for efficiency.
 

Serban55

Suspended
Oct 18, 2020
2,153
4,344
Well, if you're a 3D artist (Cinebench is very representative here), this can definitely be a deal-breaker. I know a couple of motion graphics artists who, ten years ago, all worked on Mac Pro machines but have since switched to PCs, because top performance (and, more importantly, price/performance) on a custom-built machine is just way better. Not to mention the inclusion of powerful GeForce cards (an option unavailable on Macs), something crucial for a fast workflow with today's GPU-based renderers.

It all depends on your needs. For most, the Mac is absolutely the best all-around package (myself included).
Apple silicon, with the latest updates, might start luring raw-power-hungry professionals back in...
That's not what he was saying... Cinebench is a short benchmark.
For me, as a Maya user, I don't care about a benchmark measured in seconds or minutes... Intel will throttle like crazy in laptops... I mean, 13" or 14" machines are out of the question... even 15" and 17", if they are not big laptops with proper cooling... again, out of the question for sustained power across all of your work.
Think about it... 5% more CPU is nothing after 10 seconds, once the Intel CPU heats up inside.
In desktops, Intel can still handle a proper workload... but from my perspective, Intel laptops are, for now, gone for professionals... a lot of users will prefer Macs or AMD.
Thank god for Intel that Windows OEMs still use them in a lot of devices...
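The burst-versus-sustained point can be put into rough numbers. A quick sketch (every throughput and throttling figure below is invented for illustration, not measured from any real chip):

```python
# Toy sustained-throughput comparison: chip A is 5% faster at peak but
# throttles after 10 seconds; chip B holds its clock for the whole job.
# All numbers here are invented purely for illustration.

def work_done(duration_s, peak_rate, sustained_rate, throttle_after_s):
    """Total work completed in `duration_s`, running at `peak_rate`
    until throttling kicks in, then at `sustained_rate`."""
    burst = min(duration_s, throttle_after_s)
    rest = max(0.0, duration_s - throttle_after_s)
    return burst * peak_rate + rest * sustained_rate

render = 600  # a 10-minute render job, in seconds
a = work_done(render, peak_rate=1.05, sustained_rate=0.70, throttle_after_s=10)
b = work_done(render, peak_rate=1.00, sustained_rate=1.00, throttle_after_s=render)
print(a, b)  # the "5% faster" chip finishes far less total work
```

A short benchmark only samples the burst phase, which is exactly why it can flatter a chip that falls apart on a long render.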
 

Serban55

Suspended
Oct 18, 2020
2,153
4,344
Thanks to Intel being led by an engineer again, Alder Lake might just be the start, similar to M1.
Yes, a different leader can be a good thing now... but again, that leader is not Johny Srouji. When Srouji was at Intel, that company was on top; he has been at Apple for over a decade now, and the last two decades show that he knows and understands silicon. He is the brains.
 

GuruZac

macrumors 68040
Sep 9, 2015
3,748
11,733
⛰️🏕️🏔️
Different architectures, and we're comparing the two anyway, huh... sounds legit. (Yeah, I know Geekbench exists, but no normal person runs benchmarks all day on end.) :confused:

Yes, the hardware is there now, and I have no doubt its performance will leap a few more times before hitting a plateau again. The issue now lies on the software side. Everyone knows x86 has millions if not billions of software titles to choose from, thanks to decades of development. It's been a year since Apple silicon debuted; how much Mac software has been converted to support Apple silicon at this point, let alone properly takes advantage of the M1's performance?

On the other hand, Apple is designing their new MacBook Pro with one goal in mind: the creative industry. As of now the MacBook Pro can still be used for general purposes, but after another year or so I am not so sure. Apple may end up building their Macs so specialised that everyone else just gets a MacBook Air and calls it a day, or keeps using their PC, and all of that hardware power goes largely unused thanks to its niche nature. Remember, the M1 Pro/Max MacBooks can sell very well because there are LOTS of people desperately waiting for a worthwhile update.

Thanks to Intel being led by an engineer again, Alder Lake might just be the start, similar to M1. I am not saying Intel will catch up to Apple and take the performance crown in a year or two, but this leadership change gives me hope that Intel might once again prove themselves a worthy competitor, challenging Apple silicon on performance while also optimising for efficiency.
I think that’s fair. And I too am glad to see leadership changes at Intel. Will that be enough? Not sure, but at least they are doing something again. The more competition and more pressure on all the chip makers, the better for all of us.
 

Spindel

macrumors 6502a
Oct 5, 2020
521
655
Yes, the hardware is there now, and I have no doubt its performance will leap a few more times before hitting a plateau again. The issue now lies on the software side. Everyone knows x86 has millions if not billions of software titles to choose from, thanks to decades of development.
While not to be dismissed, the backward-compatibility argument for x86 is partly a fallacy in itself.

First, if you still use software from, let's say, 2000 or earlier (I'm being generous here), you might as well emulate x86: it will run better on modern, incompatible hardware than it ever did natively.

Second, a lot of people already do this with DOSBox, so if you have archaic software you already need to emulate it even on an x86 platform; switching the underlying architecture wouldn't matter much anyway.

Third, if you have, let's say, a production facility with an old CNC mill running off a 386, you can't just hook a modern computer up to it (even if it has the right ports) and run the control program. If you switch computers, you will emulate the old system via, e.g., DOSBox, and we are back to point 1. Old systems running software that relies on the x86 architecture might as well be emulated on any architecture, because they will never "feel" the performance penalty of emulation.
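The "might as well emulate it" point follows from how an emulator works: a fetch-decode-execute loop interpreting the old machine's instructions, which is the core pattern inside DOSBox. A minimal sketch with a made-up three-instruction stack machine (purely illustrative, nothing like a real x86 emulator's complexity):

```python
# Minimal fetch-decode-execute loop, the core pattern of any CPU emulator
# such as DOSBox. The three-opcode ISA here is invented for illustration.

def run(program):
    """Interpret a list of (opcode, operand) pairs on a tiny stack machine."""
    stack, pc = [], 0
    while pc < len(program):
        op, arg = program[pc]            # fetch and decode
        if op == "PUSH":
            stack.append(arg)            # push an immediate value
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)          # pop two operands, push the sum
        elif op == "HALT":
            break
        pc += 1
    return stack[-1] if stack else None

# "Legacy program": compute 2 + 3 on the emulated machine
prog = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("HALT", None)]
print(run(prog))  # -> 5
```

Even with this interpretation overhead paid on every instruction, a modern CPU is so much faster than a 386 that the emulated program still feels instant, which is why the host architecture underneath barely matters.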


The big problem is existing software, still being developed over decades, that would benefit from a rewrite to drop legacy code that in this day and age can be done better (I'm looking at you, AutoCAD). But for software developers it's easier and cheaper to pile new crap on old crap than to fix the old crap.
 

leman

macrumors Core
Original poster
Oct 14, 2008
19,521
19,678
Different architectures, and we're comparing the two anyway, huh... sounds legit. (Yeah, I know Geekbench exists, but no normal person runs benchmarks all day on end.) :confused:

Are you suggesting that comparing performance between different architectures is not possible? That's a very radical viewpoint.


On the other hand, Apple is designing their new MacBook Pro with one goal in mind: the creative industry. As of now the MacBook Pro can still be used for general purposes, but after another year or so I am not so sure.

What is the basis for your statement? Sure, the new Macs excel at creative workflows. They also excel at software development, data manipulation, research, office, everyday computing and even *gasp* gaming.

Thanks to Intel being led by an engineer again, Alder Lake might just be the start, similar to M1.

Ugh, since Intel has been led by an engineer again we have had a number of smear campaigns, aggressive benchmark manipulation, a new generation of cores that runs hotter than the old one for a minor bump in performance, and "next-gen" efficiency cores that are comparable in performance to modern vanilla ARM cores, at much higher power consumption. I don't see how this is a good start. Not to mention that it introduces considerable complexity for software design and OS scheduling.

Maybe I am overreacting, and maybe I have blind spots, but from my perspective as a software engineer the x86 situation is a dead end. Throwing more and more cores at basic computational problems is not an answer. Not to mention that now you have to optimize your software for multiple cores with different microarchitectures...
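To make the scheduling problem concrete: with cores of unequal throughput, even splitting a perfectly divisible workload requires knowing each core type's speed, something neither the OS nor the application had to care about on homogeneous chips. A back-of-the-envelope sketch (core counts and per-core throughput figures are invented):

```python
# Work partitioning on a hybrid CPU: give each core type a share
# proportional to its aggregate throughput so all cores finish together.
# Core counts and throughput figures below are invented for illustration.

def balanced_split(total_work, p_cores, e_cores, p_speed, e_speed):
    """Return (P-core share, E-core share) that minimizes the makespan
    for perfectly divisible work: split in proportion to throughput."""
    total_speed = p_cores * p_speed + e_cores * e_speed
    p_share = total_work * (p_cores * p_speed) / total_speed
    return p_share, total_work - p_share

# 8 performance cores at 2.0 work-units/s, 8 efficiency cores at 0.7
p_work, e_work = balanced_split(1000, 8, 8, p_speed=2.0, e_speed=0.7)
print(round(p_work), round(e_work))  # -> 741 259
```

A naive 500/500 split would leave the fast cores idle for most of the run while the slow cores grind on; getting the ratio right per workload is exactly the burden hybrid designs push onto software and the scheduler.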
 

Zdigital2015

macrumors 601
Jul 14, 2015
4,143
5,622
East Coast, United States
What do you mean? The 12th-gen i5 is like a $200 CPU, and the M1 chip does not pull 20 watts. The i5 pulls 125 W with its GPU also running. Look it up: it breaks 1800 single-core in Geekbench 5.
1. 12th Gen H-Series laptop CPUs have not been released yet.
2. The 11th Gen H-Series Core i5-11500H is $250; i7 and i9 CPUs cost a lot more and are better suited to challenging the M1 Pro and M1 Max.
3. The Core i5-11500H is a 45 W TDP CPU.
4. The iGPUs in the Core series are still not going to be desirable to anyone who wants a gaming rig.
5. The M1 draws at most 20-30 W and the M1 Pro/Max are in the 65-80 W range; 12th Gen H-Series parts are likely to be 65 W TDP and ramp up to 112 W on their own, not 45 W TDP, as all Intel has left is to keep pushing wattage and frequency through its ancient x86 ISA and change up the microarchitecture every so often to try to eke out some performance gains.
6. SC performance keeps being used as some sort of holy grail since the M1 crushed everyone else. My personal 13” MBP M1 gets 1721 in SC and 7541 in MC while giving me 12-14 hours per charge; the 12th Gen Core i5 doesn't impress me at all.

Try harder next time.
 

EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
Ugh, since Intel has been led by an engineer again we have had a number of smear campaigns, aggressive benchmark manipulation, a new generation of cores that runs hotter than the old one for a minor bump in performance, and "next-gen" efficiency cores that are comparable in performance to modern vanilla ARM cores, at much higher power consumption. I don't see how this is a good start. Not to mention that it introduces considerable complexity for software design and OS scheduling.
Pat Gelsinger is strongly engaged in technology propaganda. He is trying to paint Intel as an entity at the technology forefront and to leverage that into subsidies ranging into the tens of billions from both the US and the EU. (Why this would be required, given Intel's incredible profitability this century, is a bit opaque, to say the least.) It creates a government vested interest in Intel's manufacturing infrastructure.
It's not a stupid plan. Whether it deserves success is another question.
 

grilledcheesesandwich

macrumors member
Jun 10, 2021
64
251
You seem to be putting a lot of blind faith in Intel regarding their ability to execute well in mobile. What I see here does not give me any reason to be hopeful. But sure, Alder Lake will be competitive with AMD.
Not really blind faith, considering Tiger Lake remains much more respectable than Rocket Lake, a dismal showing by every performance metric. The existential problems for Intel lie in the desktop & server markets, not mobile computing. Intel will need to successfully iterate with Raptor Lake & Meteor Lake over the next couple years to regain market share from AMD, I imagine.
 

Spindel

macrumors 6502a
Oct 5, 2020
521
655
1. 12th Gen H-Series laptop CPUs have not been released yet.
2. The 11th Gen H-Series Core i5-11500H is $250; i7 and i9 CPUs cost a lot more and are better suited to challenging the M1 Pro and M1 Max.
3. The Core i5-11500H is a 45 W TDP CPU.
4. The iGPUs in the Core series are still not going to be desirable to anyone who wants a gaming rig.
5. The M1 draws at most 20-30 W and the M1 Pro/Max are in the 65-80 W range; 12th Gen H-Series parts are likely to be 65 W TDP and ramp up to 112 W on their own, not 45 W TDP, as all Intel has left is to keep pushing wattage and frequency through its ancient x86 ISA and change up the microarchitecture every so often to try to eke out some performance gains.
6. SC performance keeps being used as some sort of holy grail since the M1 crushed everyone else. My personal 13” MBP M1 gets 1721 in SC and 7541 in MC while giving me 12-14 hours per charge; the 12th Gen Core i5 doesn't impress me at all.

Try harder next time.
For clarification:

The OG M1 Mini draws around 30-40 W from the wall when both the CPU and GPU run flat out simultaneously. That can't even be compared to a CPU alone running at 65 W.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,764
Are you suggesting that comparing performance between different architectures is not possible? That's a very radical viewpoint.




What is the basis for your statement? Sure, the new Macs excel at creative workflows. They also excel at software development, data manipulation, research, office, everyday computing and even *gasp* gaming.



Ugh, since Intel has been led by an engineer again we have had a number of smear campaigns, aggressive benchmark manipulation, a new generation of cores that runs hotter than the old one for a minor bump in performance, and "next-gen" efficiency cores that are comparable in performance to modern vanilla ARM cores, at much higher power consumption. I don't see how this is a good start. Not to mention that it introduces considerable complexity for software design and OS scheduling.

Maybe I am overreacting, and maybe I have blind spots, but from my perspective as a software engineer the x86 situation is a dead end. Throwing more and more cores at basic computational problems is not an answer. Not to mention that now you have to optimize your software for multiple cores with different microarchitectures...
I am not entirely sure whether you just hate Intel or you are serious about how computer architecture should go. Again, no matter how good the performance is, without appropriate software support a hardware platform can never truly take off. Granted, we are long past 1984, but the Mac software transition to Apple silicon isn't as smooth as some might choose to believe.

x86 might hit its design and performance plateau soon, but those huge software libraries mean x86 will not go anywhere for a very, very long time, much like the Commodore ecosystem era of 1974-1992. I doubt such a huge software library could be replaced with newer stuff that runs on Apple silicon, or even better, RISC-V. Looking forward doesn't mean what we have done in the past is automatically invalidated, carries zero value, and turns into total waste. Apple will still need to convince more professional users and software developers to build new stuff on Apple silicon, and a high benchmark score alone will NEVER be enough.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,764
The big problem is existing software, still being developed over decades, that would benefit from a rewrite to drop legacy code that in this day and age can be done better (I'm looking at you, AutoCAD). But for software developers it's easier and cheaper to pile new crap on old crap than to fix the old crap.
This is why a big Geekbench score alone is not enough to entice devs to suddenly develop for the Mac, and iterating on an existing software package is definitely cheaper and easier than a complete rewrite. I am looking forward to a future where ARM or RISC-V is so good that x86 can simply be emulated, even a Core i9-12900K. Most developers can't spend time and energy on a future architecture and the bleeding edge until it is absolutely necessary and no other option remains.
 

leman

macrumors Core
Original poster
Oct 14, 2008
19,521
19,678
I am not entirely sure whether you just hate Intel or you are serious about how computer architecture should go.

Why would I hate Intel? I don't have a personal relationship with the company. I just find their latest work seriously underwhelming, and I disagree with many of the software and hardware choices they have made in recent years.

Although I do have a very strong distaste for their manipulative marketing tactics.

Again, no matter how good the performance is, without appropriate software support a hardware platform can never truly take off. Granted, we are long past 1984, but the Mac software transition to Apple silicon isn't as smooth as some might choose to believe.

Can you elaborate on that last statement? I think the software transition has been a tremendous success. The majority of libraries, developer toolchains, and popular commercial apps were fully native by spring 2021. There is still some missing support for Apple's proprietary technologies, but those things obviously take longer.

x86 might hit its design and performance plateau soon, but those huge software libraries mean x86 will not go anywhere for a very, very long time, much like the Commodore ecosystem era of 1974-1992.

That, without a doubt. The big selling point of x86 is backwards compatibility. Then again, virtually all open-source libraries on Homebrew are ARM-native (and have been for months), and several agencies that do not rely on legacy software have been moving to ARM-based supercomputers.

And anyway, which libraries are you talking about? Even Intel's own high-performance numeric libraries, such as Embree, are natively supported on M1 hardware.
 

MauiPa

macrumors 68040
Apr 18, 2018
3,438
5,084
An alternate title: "Intel Ushers in the Era of Water-Cooled Laptops". I'd love to see that. It is so funny listening to the Intel folks touting a desktop chip, with more cores, sucking huge power and giving off scary thermals, that outperforms a laptop. Yeah, that's right. In what universe (the metaverse?) does this even come close to being meaningful?

It is very funny though; keep it up.
 

leman

macrumors Core
Original poster
Oct 14, 2008
19,521
19,678
This is why a big Geekbench score alone is not enough to entice devs to suddenly develop for the Mac, and iterating on an existing software package is definitely cheaper and easier than a complete rewrite.

Why would you need to do a complete rewrite? As a dev, I find what you say extremely puzzling.
 

Spindel

macrumors 6502a
Oct 5, 2020
521
655
This is why a big Geekbench score alone is not enough to entice devs to suddenly develop for the Mac, and iterating on an existing software package is definitely cheaper and easier than a complete rewrite. I am looking forward to a future where ARM or RISC-V is so good that x86 can simply be emulated, even a Core i9-12900K. Most developers can't spend time and energy on a future architecture and the bleeding edge until it is absolutely necessary and no other option remains.
The thing is that for a lot of software, Rosetta 2 emulation/translation is already good enough. As long as you don't do things that require long CPU times (like rendering), you cannot tell whether the software runs under Rosetta or is native.
 

Tenkaykev

macrumors 6502
Jun 29, 2020
385
427
An alternate title: "Intel Ushers in the Era of Water-Cooled Laptops". I'd love to see that. It is so funny listening to the Intel folks touting a desktop chip, with more cores, sucking huge power and giving off scary thermals, that outperforms a laptop. Yeah, that's right. In what universe (the metaverse?) does this even come close to being meaningful?

It is very funny though; keep it up.
That could be a selling point and an opportunity to sell a few more accessories. Rock up to work, plug your laptop's water-cooling circuit into an Intel-brand coffee-making attachment, don your Intel-brand noise-cancelling headphones, and you're all set for the day.
 

grilledcheesesandwich

macrumors member
Jun 10, 2021
64
251
The thing is that for a lot of software, Rosetta 2 emulation/translation is already good enough. As long as you don't do things that require long CPU times (like rendering), you cannot tell whether the software runs under Rosetta or is native.
Rosetta definitely lags versus native, even for basic consumer apps, e.g., Messenger and Spotify.
 

orthorim

Suspended
Feb 27, 2008
733
350
Kinda reminds me of the Pentium 4 days and Intel's prediction that in a few years its chips would consume 600 W of power...

Lol.

The only reason Alder Lake chips exist is AMD and the M1 eating Intel's lunch. Intel is trying its old tactics, but they won't work this time. Nobody can compete with Apple's mobile GPUs already, and Apple will carry that into more CPU cores in the M2 Pro, at which point they can wipe the floor with any Intel chip using 16 or 24 CPU cores.

The only problem Apple has is that fewer and fewer people need all this CPU horsepower. Most people just want to browse the web and do some Google Sheets.
 
  • Like
Reactions: psychicist