
microcolonel

macrumors newbie
Jul 30, 2020
2
8
Another aspect to consider is that optimising a performance-oriented processor microarchitecture for energy efficiency is much harder than the other way around.

That’s been a key part of Arm’s ability to eat into Intel and AMD’s bastions bit by bit. Five years ago Arm wasn’t an option for supercomputers or data centers. Now Fugaku and AWS Graviton are sought-after Arm-based solutions.

Arm already has the bulk of the low-end compute segment, where the volumes (despite low margins) and the future of humanity (grandiose sounding but true) lie. Intel and AMD do not have the microarchitecture chops as yet to venture there. Intel’s attempts to go there with mobile handsets fell flat, and time will tell if their new crop of cores can match Arm’s perf/watt lineage.

Please don’t underestimate the value associated with perf/Watt superiority in system design. It is wrong to assume that power costs don’t matter for commodity PC systems. They only seem OK at present because of the savings at the data center, which make it possible today for that PC to consume increasingly cloud-based services at a high overall energy cost. This is true pretty much across the board and is likely to increase further.

Windows on Arm is actually already quite pleasant to use (Office included) on bleeding-edge hardware, and although things are at an early stage, this is bound to get better.

Overall, Arm has a lot of catching up to do with AMD and Intel in the consumer PC segment, but there’s far more microarchitectural headroom for Arm, as Apple is discovering to its incremental advantage.
 
  • Like
Reactions: pshufd and grandM

Spindel

macrumors 6502a
Oct 5, 2020
521
655
*snip*...

Not to mention that games for Apple Silicon are just not a thing, and realistically probably never will be, since Apple and gaming just don't work together - and gaming is a huge part of the PC market.

...*snip*
I haven't read the entire thread, so someone might have already commented on this, but this claim is just plain wrong.

While gaming does have a fair share of the PC market, it's not a huge part of it.
 
  • Like
Reactions: JMacHack

UBS28

macrumors 68030
Oct 2, 2012
2,893
2,340
The power is there, but AMD and even possibly Intel will catch up in time. I fear the future market fragmentation, with developers having to develop specifically for Apple Silicon ARM and just not having the time to do so.

Not to mention that games for Apple Silicon are just not a thing, and realistically probably never will be, since Apple and gaming just don't work together - and gaming is a huge part of the PC market.

Even the new 10nm Intel CPUs will be much better than before, and AMD is already doing great in raw power.

The idea of Apple controlling both software and hardware is great, something they've been trying to do for decades, but the big question is what developer support will be like.

I look forward to the power, but I'm just not so sure about the future.

I am a complete noob and have no idea what I'm talking about in this area, but I'm just wondering what other people here think.

I have been saying the same things from day 1. It is nice if you want to run benchmarks, but for real usage, Apple should have switched to AMD if they disliked Intel that much.

Some of my professional hardware equipment no longer works on Mac thanks to Apple. It only works with Windows now.

I will get one of these M2X 14” or 16” MBPs, but it won’t be my main computer.
 

Tech198

Cancelled
Mar 21, 2011
15,915
2,151
Chrome and Microsoft are doing the same, but to me, apart from performance and battery life, it's a step back, as all apps have to be rewritten.

And the old software will never be updated - Windows Server 2012, Exchange, etc. It will cripple businesses because they won't be able to do the same things they can today.

Apple and Google can get away with anything because they have a smaller market than Microsoft Windows...
 

holisticrunner

macrumors member
Jun 12, 2019
41
96
Sure we can; running the software we need is far more important than that. In fact, electricity for computers is a VERY minimal part of business costs, so much so that nobody cares about it. It comes under the lights budget.

Big server farms, yes, that makes a difference; desktop PCs and local servers, not so much.
So I guess you've never played mobile games or owned a laptop. Cool.
 

Bandaman

Cancelled
Aug 28, 2019
2,005
4,091
I didn't deny that performance per watt is probably better on the M1 (although I'm not aware of performance-per-watt figures for Ryzen 3 + current Nvidia). The point is that this doesn't help you if the M1 can be so slow in demanding tasks that it's barely usable, or not usable at all. In the DaVinci Resolve example, the M1 is said to have dropped well below 10 FPS at times and just stuttered, while the Ryzen laptop with Nvidia still ran relatively smoothly... If I could find the link, I would link it (but it was in German).
Yes, the M1X is not there yet. But as I said, you have to compare what is there. If you need a laptop for such tasks today, you have to take what's available. And conversely, it is constantly ignored that Apple currently has better manufacturing (5nm).

And the CSAM issue is important and should not be forgotten. Yes, it is off-topic for the technical discussion. But the best CPU is useless if the system is no longer trustworthy.
The new DaVinci Resolve that's actually optimized for the M1 doesn't come anywhere near dropping to 10 FPS. And if you use Final Cut Pro X it's butter as always. It really comes down to the developers to get the most out of the M1. Yes, it is a first-generation chip and it will get better, but I'm not sure where you're getting your information from. Sounds like the person testing it was still using the Intel version of DaVinci.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
So I guess you've never played mobile games or owned a laptop. Cool.
Yep, I've played games on my phone -- totally different market segment.

And yes, I have several laptops, one even an M1 MBA, but I can't use that one for work, so it doesn't get used very much. My X1 Carbon, running Windows, gets used the most by a wide margin.
 

AgentMcGeek

macrumors 6502
Jan 18, 2016
374
305
London, UK
The point is that this doesn't help you if the M1 can be so slow in demanding tasks that it's barely usable, or not usable at all.
Despite shipping in the entry-level MacBook Pro, the M1 was never meant to be a Pro-class chip. You cannot expect it to compete with a 45W Intel CPU plus a 50W GPU with 4 GB of VRAM. Yes, it sometimes does better at only 20W, but if it's raw power you need, wait for the M1X models. It will double the high-efficiency cores.

The M1X was supposed to arrive 4 months ago to quickly replace the higher-end Intel models, but as we all know it didn't happen, because of OLED. We just have to wait a few more weeks.
 

PsykX

macrumors 68030
Sep 16, 2006
2,747
3,926
It's not like Apple built the M1 and said "we're done" :)
My thoughts exactly.
Apple probably already has a roadmap of what the M2, M3, M4 and maybe even M5 will be.

And they have a history of designing chips, so they already know how well they can stick to their own deadlines. We all know Intel never stuck to theirs, but using TSMC to build their processors will be interesting.

But for now, take this graph and draw a line through all the Intel processors and another one through all the Apple processors. It shows how Apple has, so far, been taking the lead. It only made sense for them to make the switch. And Apple probably has a few more dots than are shown on this graph, because they know what's coming next from their labs:

1630937206493.png

Source: https://www.anandtech.com/show/16226/apple-silicon-m1-a14-deep-dive/4
 
Last edited:

Spindel

macrumors 6502a
Oct 5, 2020
521
655
ARM has structural advantages over x86, and Apple just showed the world what those are. In response, Intel and AMD will eventually have to go to ARM or RISC-V. And they will have to go through the same painful transition that Apple is going through right now.

Intel's i9-12900 is faster than Apple's M1 in single-core and multi-core Geekbench 5. The i9-12900 should start shipping later this year or early next year. The i9-12900 needs 250 Watts to beat the M1 running at 20 Watts, though. Intel wins!
Man I returned to this thread and still haven't gotten further than this post before commenting again :)

Don't forget that you are comparing a CPU that isn't even available yet to a CPU that has been on the market for almost a year :)

And as you pointed out, to achieve this performance the power budget is more than 10x that of the M1 (250 W vs 20 W).
 

grandM

macrumors 68000
Oct 14, 2013
1,520
302
My thoughts exactly.
Apple probably already has a roadmap of what the M2, M3, M4 and maybe even M5 will be.

And they have a history of designing chips, so they already know how well they can stick to their own deadlines. We all know Intel never stuck to theirs, but using TSMC to build their processors will be interesting.

But for now, take this graph and draw a line through all the Intel processors and another one through all the Apple processors. It shows how Apple has, so far, been taking the lead. It only made sense for them to make the switch. And Apple probably has a few more dots than are shown on this graph, because they know what's coming next from their labs:

View attachment 1827634
Source graph?
 

Sander

macrumors 6502a
Apr 24, 2008
521
67
Point was more just that I generally find the C++ community can be a bit exclusionary at times.
That's because we keep having to explain to the hipsters why we are using yesterday's technology instead of tomorrow's. It takes a few iterations of the hipsters telling us we should rewrite our app in Y because Y is the new X, and of us finally deciding on a rainy Friday afternoon to check out Y, only to find it's already gone and the hipsters are rewriting their app in Z because Z is the new Y. Then we decide to just shrug and sigh.
 
  • Like
Reactions: Gerdi and grandM

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
That's because we keep having to explain to the hipsters why we are using yesterday's technology instead of tomorrow's. It takes a few iterations of the hipsters telling us we should rewrite our app in Y because Y is the new X, and of us finally deciding on a rainy Friday afternoon to check out Y, only to find it's already gone and the hipsters are rewriting their app in Z because Z is the new Y. Then we decide to just shrug and sigh.

That seems like the opposite of what's going on when a C++ guru sees my C-style calls to malloc and free using regular pointers rather than smart pointers, new and delete :p
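
For anyone reading along who hasn't met this argument before, here's a minimal sketch of the contrast being joked about - not from either poster, just an illustration, with a made-up Point struct - showing C-style malloc/free on a raw pointer next to the std::unique_ptr a C++ reviewer would typically suggest:

```cpp
#include <cstdio>   // std::printf
#include <cstdlib>  // std::malloc, std::free
#include <memory>   // std::unique_ptr, std::make_unique

// Hypothetical example type, purely for illustration.
struct Point { double x; double y; };

int main() {
    // C style: manual allocation with a raw pointer. No constructor runs,
    // and the caller must remember to free exactly once on every exit path.
    Point* p = static_cast<Point*>(std::malloc(sizeof(Point)));
    if (p == nullptr) return 1;
    p->x = 1.0;
    p->y = 2.0;
    std::printf("C style:   (%.1f, %.1f)\n", p->x, p->y);
    std::free(p);

    // Idiomatic C++: std::unique_ptr owns the allocation and releases it
    // automatically when it goes out of scope, even on early returns.
    auto q = std::make_unique<Point>(Point{3.0, 4.0});
    std::printf("C++ style: (%.1f, %.1f)\n", q->x, q->y);
    return 0;  // q's destructor calls delete for us
}
```

Both halves print the same thing; the difference is simply who is responsible for the cleanup.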
 

grandM

macrumors 68000
Oct 14, 2013
1,520
302
That's because we keep having to explain to the hipsters why we are using yesterday's technology instead of tomorrow's. It takes a few iterations of the hipsters telling us we should rewrite our app in Y because Y is the new X, and of us finally deciding on a rainy Friday afternoon to check out Y, only to find it's already gone and the hipsters are rewriting their app in Z because Z is the new Y. Then we decide to just shrug and sigh.
What's your take on Swift, JS?
 

dapa0s

macrumors 6502a
Original poster
Jan 2, 2019
523
1,032
How much better will Apple CPUs have to be than the competition, considering they will increase the prices even more?


The "m1x" supposedly benchmarks better than the 16 core mac pro, so I guess we will see.

Of course, AMD will probably also raise its prices, but we don't know whether Intel will.
 
  • Haha
Reactions: Maconplasma

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
That's because you shouldn't use those in C++.
I know. But there are different ways of approaching that when someone comes to C++ from a C background, and some ways will make that person go "Well this doesn't seem to be a very nice community" while other ways will make them go "Oh, thanks for that!" - That said, I find that it's only the first layer of the onion, so to speak. Once you get past the "noob" questions you ask along the way and get more into it, people are generally rather nice.

BTW, the links in your bio - those are yours? I found "So how much does an iPhone developer make" really quite interesting, and well written for that matter. Thanks for the post, and good luck with your games - it's been some years, but I hope all's still well with it all.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
I know. But there are different ways of approaching that when someone comes to C++ from a C background, and some ways will make that person go "Well this doesn't seem to be a very nice community" while other ways will make them go "Oh, thanks for that!" - That said, I find that it's only the first layer of the onion, so to speak. Once you get past the "noob" questions you ask along the way and get more into it, people are generally rather nice.

BTW, the links in your bio - those are yours? I found "So how much does an iPhone developer make" really quite interesting, and well written for that matter. Thanks for the post, and good luck with your games - it's been some years, but I hope all's still well with it all.

I coded in C for 10-15 years, and C++ for around 10 years, and never noticed any communities. What are these communities of which you speak? Alan and Morgan? Because if you are talking about my friends Alan and Morgan, they are very nice, and Alan only hit me on the head with K&R second edition once, so that wasn’t so bad.
 
  • Haha
Reactions: jdb8167

jz0309

Contributor
Sep 25, 2018
11,392
30,076
SoCal
How much better will Apple CPUs have to be than the competition, considering they will increase the prices even more?


The "m1x" supposedly benchmarks better than the 16 core mac pro, so I guess we will see.

Of course, AMD will probably also raise its prices, but we don't know whether Intel will.
You do realize that Intel's margins have been between 50 and 60% for decades, right?
 

Sander

macrumors 6502a
Apr 24, 2008
521
67
What's your take on Swift, JS?

What I like about JavaScript is that you can run it without any installer dependencies - assuming "everyone" has a modern browser. So I used it for a few projects. What I don't like about it is its "helpful" type coercion, which has never actually "helped" me. And what I utterly fail to understand is why so many web apps include dozens of frameworks by referring to 3rd-party URIs. Do people not care at all that they're leaving the functionality of their app in all these other people's hands?

Swift I haven't really looked into. I typically focus on cross-platform work nowadays, and I'd probably dust off my Objective-C++ if I wanted to write a Mac app again. In my day-to-day work I notice that I prefer writing functionality in C++ and using Python for the glue/scripting.

I know. But there are different ways of approaching that when someone comes to C++ from a C background, and some ways will make that person go "Well this doesn't seem to be a very nice community" while other ways will make them go "Oh, thanks for that!" - That said, I find that it's only the first layer of the onion, so to speak. Once you get past the "noob" questions you ask along the way and get more into it, people are generally rather nice.

BTW, the links in your bio - those are yours? I found "So how much does an iPhone developer make" really quite interesting, and well written for that matter. Thanks for the post, and good luck with your games - it's been some years, but I hope all's still well with it all.

"What do you think about malloc and free" is actually an interview question I often use. The answer and the ensuing discussion can be most enlightening. But like cmaier said - what is "the" C++ community..?

And yes the links are mine. I must say that I haven't really spent much time keeping my apps up-to-date (doing the minimum when Apple breaks the existing functionality every once in a while) and they buy me approximately one beer per week nowadays.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
I coded in C for 10-15 years, and C++ for around 10 years, and never noticed any communities. What are these communities of which you speak? Alan and Morgan? Because if you are talking about my friends Alan and Morgan, they are very nice, and Alan only hit me on the head with K&R second edition once, so that wasn’t so bad.

Mostly Stack Exchange and similar :p - I wouldn’t mind someone hitting me with K&R, haha
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
What I like about JavaScript is that you can run it without any installer dependencies - assuming "everyone" has a modern browser. So I used it for a few projects. What I don't like about it is its "helpful" type coercion, which has never actually "helped" me. And what I utterly fail to understand is why so many web apps include dozens of frameworks by referring to 3rd-party URIs. Do people not care at all that they're leaving the functionality of their app in all these other people's hands?

Swift I haven't really looked into. I typically focus on cross-platform work nowadays, and I'd probably dust off my Objective-C++ if I wanted to write a Mac app again. In my day-to-day work I notice that I prefer writing functionality in C++ and using Python for the glue/scripting.



"What do you think about malloc and free" is actually an interview question I often use. The answer and the ensuing discussion can be most enlightening. But like cmaier said - what is "the" C++ community..?

And yes the links are mine. I must say that I haven't really spent much time keeping my apps up-to-date (doing the minimum when Apple breaks the existing functionality every once in a while) and they buy me approximately one beer per week nowadays.

You should give Swift a go for fun sometime. It's great! And if you're mindful of how you use it, it can be portable - it has supported Linux and Apple platforms for ages, and somewhat recently gained Windows support.

Out of curiosity, what's your ideal answer, then, to the malloc and free question?
As for "the C++ community": mostly Stack Exchange and the few folks I've met who work with it daily IRL. But again, it's not all of them, and it only seems to be the first layer of the onion; I find the further in you go, the nicer folks get :)
On the other hand, where I find that some C++ programmers can seem a bit elitist, Python people are the complete opposite, to their detriment. They'll talk up the language as if it's so easy and brush over things that hold complexity, to the point where people may get false impressions of it and get really stuck when they encounter a bug.
But all this is just my experience with individuals, and it wasn't really intended to spur a conversation about all this; it's all anecdotal, I like C++, and I think most people are great and all :)

A beer a week seems alright for fairly low-effort work if you're only doing minimal maintenance :)
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
Despite shipping in the entry-level MacBook Pro, the M1 was never meant to be a Pro-class chip. You cannot expect it to compete with a 45W Intel CPU plus a 50W GPU with 4 GB of VRAM. Yes, it sometimes does better at only 20W, but if it's raw power you need, wait for the M1X models. It will double the high-efficiency cores.

The M1X was supposed to arrive 4 months ago to quickly replace the higher-end Intel models, but as we all know it didn't happen, because of OLED. We just have to wait a few more weeks.

It will double the performance cores. It will halve the efficiency cores.
 