
grandM

macrumors 68000
Oct 14, 2013
1,520
302
I wasn't advocating for Electron apps. But even native code written in C, C++, Swift, Rust, Zig, you name it, will just need to be recompiled and it'll be ready to go on Apple Silicon. Not much if any work needed at all in most cases. I just acknowledged a lot of programs are already written using web tech and that's inherently portable
Most devs can't handle C. C++ perhaps, but I wouldn't bet on it. A lot is done in JS.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
Watch the whole video. M1 has twice the decoders and three times the decoder buffer size. AMD was asked why they only had four decoders and the response is that there weren't benefits to doing more. The video also explains how the four decoders work with variable length instructions. It also talks about the custom silicon for specialized functions in the M1. But this isn't an inherent advantage of the ISA. Apple can do anything that they want to and run with low-level interfaces.

I don't think that AMD and Intel can overcome the variable instruction length problem which is why they are eventually going to have to go the same route.

Everyone has seen the M1 block diagram with custom silicon and Intel is copying some of that in newer chips so kudos there. But that isn't unique to M1.
I feel like you think we're disagreeing on more aspects than we actually are. I'm not saying this isn't an advantage. I'm just saying there's more to chip design than this, and the chip designers should be acknowledged for their work rather than reducing it all to "ARM good, x86 bad".

Besides, since each instruction can contain more "work" on x86, decoding one instruction on an ARM chip != decoding one instruction on x86. If your limiting factor isn't the time to decode instructions anyway, but your ALU or something else, then this is an irrelevant difference. It's a meaningful difference when you're bound by the performance of decoding instructions, but if it takes longer to execute one instruction than it does to figure out what to do next, well, it doesn't matter much.
 

robco74

macrumors 6502a
Nov 22, 2020
509
944
Most devs can't handle C. C++ perhaps.
It's not a question of whether or not devs can handle it; increasingly, many don't want to. Unless you really need that level of performance, the overhead in development and testing often isn't worth it. Most companies want to be able to iterate quickly, and computing resources are much less expensive than development time and resources.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
A lot of software will be ported this way but I expect that there's a lot of custom silicon that developers could take advantage of to get much larger performance gains. Intel sells a bunch of software libraries with accelerated performance for common math and other operations. Apple has likely implemented a lot of things in silicon and they've probably provided APIs to use those functions.
APIs that have existed for ages on iOS and macOS and that dispatch work to whatever hardware is available, be it dedicated silicon on iOS and Apple Silicon Macs, the T2 chip, GPUs, or whatever else, yes. These are already widely used.
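For instance, a minimal sketch using Accelerate's vDSP (assuming macOS with the Accelerate framework; which SIMD/vector path runs underneath is the framework's business, the caller never names the hardware):
```cpp
// Minimal sketch: adding two vectors with Accelerate's vDSP.
// Build with: clang++ -std=c++17 add.cpp -framework Accelerate
#include <Accelerate/Accelerate.h>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> a(1024, 1.0), b(1024, 2.0), c(1024);
    // c[i] = a[i] + b[i]; the 1s are element strides
    vDSP_vaddD(a.data(), 1, b.data(), 1, c.data(), 1, a.size());
    std::printf("%f\n", c[0]); // prints 3.000000
    return 0;
}
```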
Most devs can't handle C. C++ perhaps.
No offence intended but have you ever written code? What do you base this statement on?
C is a fairly simple language. C++ is insanely big. You can learn all of C over a weekend if you already know programming; you can spend a lifetime and not know all of C++ (by which I include the standard library for C and the standard template library for C++). C++ gives you a lot of tools to make abstractions and structure your code. It also gives you things like std::vector so you won't have to implement dynamically growing arrays yourself. But all the ways you can shoot yourself in the face with C, you can do in C++ too. And more besides.
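To make that std::vector point concrete, here's a rough side-by-side sketch (illustrative only):
```cpp
// What std::vector saves you from: a hand-rolled C-style growing array.
#include <cstdlib>
#include <vector>

typedef struct { int *data; size_t len, cap; } IntVec;

// You manage capacity, reallocation and failure yourself.
int intvec_push(IntVec *v, int x) {
    if (v->len == v->cap) {
        size_t ncap = v->cap ? v->cap * 2 : 8;
        int *p = (int *)realloc(v->data, ncap * sizeof *p);
        if (!p) return -1;              // easy to forget this check
        v->data = p;
        v->cap = ncap;
    }
    v->data[v->len++] = x;
    return 0;
}

int main() {
    IntVec v = {nullptr, 0, 0};
    for (int i = 0; i < 100; i++) intvec_push(&v, i);
    free(v.data);                       // and to free by hand

    std::vector<int> w;                 // the C++ way: growth handled for you
    for (int i = 0; i < 100; i++) w.push_back(i);
    return 0;                           // w cleans itself up here
}
```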
 

grandM

macrumors 68000
Oct 14, 2013
1,520
302
It's not a question of whether or not devs can handle it; increasingly, many don't want to. Unless you really need that level of performance, the overhead in development and testing often isn't worth it. Most companies want to be able to iterate quickly, and computing resources are much less expensive than development time and resources.
Oh, I recognize the business side. Problem is, it leads to inferior products for the customer.
 

grandM

macrumors 68000
Oct 14, 2013
1,520
302
APIs that have existed for ages on iOS and macOS and that dispatch work to whatever hardware is available, be it dedicated silicon on iOS and Apple Silicon Macs, the T2 chip, GPUs, or whatever else, yes. These are already widely used.

No offence intended but have you ever written code? What do you base this statement on?
C is a fairly simple language. C++ is insanely big. You can learn all of C over a weekend if you already know programming; you can spend a lifetime and not know all of C++ (by which I include the standard library for C and the standard template library for C++). C++ gives you a lot of tools to make abstractions and structure your code. It also gives you things like std::vector so you won't have to implement dynamically growing arrays yourself. But all the ways you can shoot yourself in the face with C, you can do in C++ too. And more besides.
Yes I have. You're a talented person but a lot of devs couldn't handle C pointers. In C++ that became easier. Even in Objective-C people struggled with pointers in array elements. It's no coincidence most of these became structs in Swift.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
Yes I have. You're a talented person but a lot of devs couldn't handle C pointers. In C++ that became easier. Even in Objective-C people struggled with pointers in array elements. It's no coincidence most of these became structs in Swift.
Pointers exist in the exact same way in C++ though. I mean, C++ also has smart pointers with the fancy-schmancy std::shared_ptr and all that jazz, and instead of allocating space for your structs with malloc you can use new and delete, but C++ still has pointer arithmetic and everything C has, so I struggle to imagine someone who can deal with pointers in C++ but panics when they see a C pointer. Maybe that's just me though :)
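Something like this, I mean (a quick toy sketch, nothing more):
```cpp
// Toy sketch: the raw-pointer mechanics are the same in C and C++;
// C++ just adds new/delete and smart pointers on top.
#include <cstdlib>
#include <memory>

struct Point { int x, y; };

int main() {
    // C style (equally valid C++): malloc, pointer arithmetic, free
    Point *p = (Point *)std::malloc(4 * sizeof(Point));
    (p + 2)->x = 42;                      // same arithmetic in both languages
    std::free(p);

    // C++ additions
    Point *q = new Point{1, 2};
    delete q;
    auto r = std::make_shared<Point>(Point{3, 4}); // freed automatically
    return r->x;
}
```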
 
  • Like
Reactions: KeithBN

grandM

macrumors 68000
Oct 14, 2013
1,520
302
Pointers exist in the exact same way in C++ though. I mean, C++ also has smart pointers with the fancy-schmancy std::shared_ptr and all that jazz, and instead of allocating space for your structs with malloc you can use new and delete, but C++ still has pointer arithmetic and everything C has, so I struggle to imagine someone who can deal with pointers in C++ but panics when they see a C pointer. Maybe that's just me though :)
Oh, you're right, the concept is the same. In C, however, I recall seeing things like &&&& and ****. In C++ that would be & and *.
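Something like this contrived toy example is what I mean by the stacking:
```cpp
// Contrived illustration of stacked indirection.
#include <cstdio>

int main() {
    int x = 7;
    int *p = &x;                 // one level of indirection
    int **pp = &p;               // two levels
    int ***ppp = &pp;            // three levels: this is where eyes glaze over
    std::printf("%d\n", ***ppp); // dereference three times to get back to x
    return 0;
}
```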
 

grandM

macrumors 68000
Oct 14, 2013
1,520
302
Pointers exist in the exact same way in C++ though. I mean, C++ also has smart pointers with the fancy-schmancy std::shared_ptr and all that jazz, and instead of allocating space for your structs with malloc you can use new and delete, but C++ still has pointer arithmetic and everything C has, so I struggle to imagine someone who can deal with pointers in C++ but panics when they see a C pointer. Maybe that's just me though :)
As I said before, you're talented.
 

Homy

macrumors 68030
Jan 14, 2006
2,510
2,462
Sweden
"Here we get a big problem with the Intel and AMD business model. Their business models are based on selling general-purpose CPUs, which people just slot onto a large PC motherboard. Thus computer-makers can simply buy motherboards, memory, CPUs, and graphics cards from different vendors and integrate them into one solution.

But we are quickly moving away from that world. In the new SoC world, you don’t assemble physical components from different vendors. Instead, you assemble IP (intellectual property) from different vendors. You buy the design for graphics cards, CPUs, modems, IO controllers, and other things from different vendors and use that to design an SoC in-house. Then you get a foundry to manufacture this.

Now you got a big problem, because neither Intel, AMD, or Nvidia are going to license their intellectual property to Dell or HP for them to make an SoC for their machines.

Sure Intel and AMD may simply begin to sell whole finished SoCs. But what are these to contain? PC-makers may have different ideas of what they should contain. You potentially get a conflict between Intel, AMD, Microsoft, and PC-makers about what sort of specialized chips should be included because these will need software support.

For Apple this is simple. They control the whole widget. They give you, for example, the Core ML library for developers to write machine learning stuff. Whether Core ML runs on Apple’s CPU or the Neural Engine is an implementation detail developers don’t have to care about."
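Not Core ML itself, of course, but as a rough sketch of that "implementation detail" idea: one API, with the backend picked for the caller (hypothetical interface names, purely illustrative):
```cpp
// Hypothetical sketch: the caller asks for work to be done; which
// hardware backend runs it is hidden behind the interface.
#include <cstdio>
#include <memory>

struct InferenceBackend {                 // hypothetical interface
    virtual void run() = 0;
    virtual ~InferenceBackend() = default;
};
struct CpuBackend : InferenceBackend { void run() override { std::puts("ran on CPU"); } };
struct NpuBackend : InferenceBackend { void run() override { std::puts("ran on Neural Engine"); } };

// The library decides based on what the machine has; the app never asks.
std::unique_ptr<InferenceBackend> makeBackend(bool hasNpu) {
    if (hasNpu) return std::make_unique<NpuBackend>();
    return std::make_unique<CpuBackend>();
}

int main() {
    auto backend = makeBackend(/*hasNpu=*/true);
    backend->run();
    return 0;
}
```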

 
  • Like
Reactions: osx86

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
Oh, you're right, the concept is the same. In C, however, I recall seeing things like &&&& and ****. In C++ that would be & and *.
? If the codebase you were looking at had a reference to a reference to a reference to a reference of a pointer of a pointer of a pointer of a pointer... I urge you to find the people responsible for that and slap them a few times, haha.
But no I get what you're saying. Same thing can happen in C++ but it's less likely to happen as people tend to think and code differently with the features available in C++.
As I said before, you're talented.
Flattering. But I'm just as good at coding bugs as the next person :p

But no, genuinely, I'm better at C than I am at C++. Mostly because I've used C a lot and haven't really used C++ all that much, so my C++ code isn't idiomatic C++, it's just C With Classes.
PS. I really hope I didn't come off hostile or anything earlier when I asked if you've written any code. In my world, C++ programmers have always seemed like they needed to know more than C programmers, because the language is so massive and has evolved so much over the past decades, while C has been mostly static since C99. And you wouldn't even need to change all that much to be up to date if you were used to K&R C. So I was genuinely wondering about your statement back then, but I do see your point too: while C++ is a bigger language with more things to learn, those language additions also help you stay away from situations where you have to manage memory by hand.
 

bradl

macrumors 603
Jun 16, 2008
5,952
17,447
The power is there, but AMD and even possibly Intel will catch up in time. I fear the future market fragmentation, with developers having to develop specifically for Apple Silicon ARM and just not having the time to do so.

Not to mention that games for Apple Silicon are just not a thing, and gaming is a huge part of the PC market, and realistically will probably never be a thing, since Apple and gaming just don't work together.

Even the new 10nm Intel CPUs will be much better than before, and AMD is already doing great in raw power.

The idea of Apple controlling both software and hardware is great, something they've been trying to do for decades, but the big question is how the support from the developers will be.

I look forward to the power, but I'm just not so sure about the future.

I am a complete noob and have no idea what I'm talking about in this area, but I'm just wondering what other people here think.

This is all part of the ebb and flow that is IT.

See, most people nowadays are either too young or don't remember the time when there was so much CPU fragmentation that different versions of various applications were written to conform to each architecture rather than to a standard. For example, Firefox, Netscape, Opera, Mosaic, etc. were all built for a long list of architectures and operating systems: Linux x86, Linux x86-64, SPARC (Solaris), SPARC64, Cyrix, Ultrix, OSF/1 (Digital Unix), HP-UX, NetBSD, 386BSD, 486BSD, BSDi, FreeBSD, AIX, NeXT, MacOS, SunOS, Windows, and others.

Each program had to be recompiled on a given architecture for it to be released, let alone supported, on that architecture. In short, this fragmentation has been done before. Time caught up, to where things were getting standardized: Most of the xBSDs are gone (except for FreeBSD). All 32-bit processors are gone. Solaris dropped SPARC for x86-64. DEC got bought by HP. HP barely uses HP-UX anymore. NeXT got bought by Apple. So out of all of those, the only contenders remaining are Linux, AIX, HP-UX, (Open)Solaris x64, FreeBSD, and Windows. And even with those, four of them conform to the x86-64 spec, so it is only a matter of linking against the properly compiled libraries to create the binary.

That is the flow we've had for 30 years.

The ebb is Apple going back the other way and supporting things with their own hardware internally. The software companies are simply doing the opposite again, but without all of the fragmentation we had. No more PA-RISC, SPARC, and Alpha architectures; only Apple Silicon and x86-64, so that really isn't much fragmentation at all. As long as developers have a baseline piece of hardware to do their work on, Apple will be okay.

Where the issue will come in is if the Apple Silicon architecture is publicly published and Intel and AMD decide to make their own chips based on it. But even that should be fine, as those chips would conform to the spec that Apple published. If they didn't, it would mean that Intel and AMD were cannibalizing their own customer base by offering two different products of their own to the same market. That would be bad for them unless they were planning to get out of the x86 market.

BL.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
How about laptops?
Same. We buy whatever runs our software, as the software costs WAY more than the hardware.

Why did Microsoft start putting so much effort into Windows on ARM in the past couple of years?
What effort? They've dabbled, and that's about it.

Why does Google want to make their own chips for their computers?
Who cares? Chromebooks have even less chance of being used in my environment than Macs, and Macs basically have no chance.

Why does nVidia want to buy Arm Holdings?
I suspect licensing fee income. Doesn't matter to me in any case.

Why is Intel doing a deal for RISC-V?
They're a chip foundry and make more than just x86 processors, and RISC-V cores make pretty good device controllers. They'll probably sell a ton.
Why did AMD announce that they're working on ARM chips?
Who knows. Doesn't matter to me; we don't even need middle-of-the-road video cards, so we don't buy their stuff either.

Just look at the market shares in the PC market and tell me who and what type of processors lead by a wide margin.

We need machines to run our software, that's it. FWIW, our main server is a Power9 machine, so we're not all x86, but close.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
No you can't, most businesses have moved to laptops ... and most average office workers need MS Office and a browser, that's it
?? Laptops can come with x86 processors too, and no, we need way more than Office (though we do use it) and a browser!
 

grandM

macrumors 68000
Oct 14, 2013
1,520
302
? If the codebase you were looking at had a reference to a reference to a reference to a reference of a pointer of a pointer of a pointer of a pointer... I urge you to find the people responsible for that and slap them a few times, haha.
But no I get what you're saying. Same thing can happen in C++ but it's less likely to happen as people tend to think and code differently with the features available in C++.

Flattering. But I'm just as good at coding bugs as the next person :p

But no, genuinely, I'm better at C than I am at C++. Mostly because I've used C a lot and haven't really used C++ all that much, so my C++ code isn't idiomatic C++, it's just C With Classes.
PS. I really hope I didn't come off hostile or anything earlier when I asked if you've written any code. In my world, C++ programmers have always seemed like they needed to know more than C programmers, because the language is so massive and has evolved so much over the past decades, while C has been mostly static since C99. And you wouldn't even need to change all that much to be up to date if you were used to K&R C. So I was genuinely wondering about your statement back then, but I do see your point too: while C++ is a bigger language with more things to learn, those language additions also help you stay away from situations where you have to manage memory by hand.
Don't fret over it. It takes more to offend me. Plus I know you're a helpful person eager to improve IT and macrumors.
 
  • Like
Reactions: casperes1996

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
?? Laptops can come with x86 processors too, and no, we need way more than Office (though we do use it) and a browser!
I think the point was that power consumption matters on laptops, not what architecture they run, and that a lot of laptops are effectively used as thin clients. You may run a lot of stuff on your laptop, but a lot of folks in your organisation may only use their laptop to SSH into a server or something like that. For that use case it doesn't matter whether it runs ARM or x86, and if the ARM chip can provide 10x the battery life (just picking a random extreme number for effect), that's a better device for the task.
 
  • Like
Reactions: jz0309

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I think the point was that power consumption matters on laptops, not what architecture they run, and that a lot of laptops are effectively used as thin clients. You may run a lot of stuff on your laptop, but a lot of folks in your organisation may only use their laptop to SSH into a server or something like that. For that use case it doesn't matter whether it runs ARM or x86, and if the ARM chip can provide 10x the battery life (just picking a random extreme number for effect), that's a better device for the task.
I know that was the point, but my point is that what software it runs makes WAY more of a difference than that. That's all I'm saying; not that ARM is bad or not as power efficient as others say here, but it doesn't run what I need it to run to be considered where I work. I'm the IT manager; I know what we need to run and the budgets for everything.
 
  • Like
Reactions: Shirasaki

grandM

macrumors 68000
Oct 14, 2013
1,520
302
I know that was the point, but my point is that what software it runs makes WAY more of a difference than that. That's all I'm saying; not that ARM is bad or not as power efficient as others say here, but it doesn't run what I need it to run to be considered where I work. I'm the IT manager; I know what we need to run and the budgets for everything.
Would you make the same choices if you could start from scratch? I don't know what you're running, but I can imagine previous investments, personnel training, and certain ERP systems pose a huge hurdle. Are those budgets monthly, quarterly, yearly...?
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Would you make the same choices if you could start from scratch? I don't know what you're running, but I can imagine previous investments, personnel training, and certain ERP systems pose a huge hurdle. Are those budgets monthly, quarterly, yearly...?
Right now, with what else is available, it's hard to tell. Maybe not with the Power9 server, as that's where our ERP runs and I wouldn't mind a more mainstream ERP. I love the DB there, but that's just a personal like.

But with desktops and front-end servers (and various PC device controllers), yes, I think I'd make the same choice; there's just more available to run in that space (software-wise, I mean).
 

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
Are you even aware that nobody has been able to make computers without Intel, AMD, and Nvidia for a long period of time? Apple can make their own CPU and GPU, which is a shockingly big advantage that nobody else has managed so far.
 
  • Like
Reactions: grandM

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
umm… IBM, Sun/Oracle, HP, and DEC, among others would beg to differ.

BL.
lol, in the consumer computer market? They don't make any of those, only servers or supercomputers. Even HP uses Intel, AMD, and Nvidia parts.
 
  • Like
Reactions: grandM

throAU

macrumors G3
Feb 13, 2012
9,205
7,360
Perth, Western Australia
The power is there, but AMD and even possibly Intel will catch up in time. I fear the future market fragmentation, with developers having to develop specifically for Apple Silicon ARM and just not having the time to do so.
If you're developing specifically for Apple Silicon, rather than against the APIs provided by Apple, which are CPU/GPU agnostic, you're probably doing development wrong in this current century.
 

smoking monkey

macrumors 68020
Mar 5, 2008
2,363
1,508
I HUNGER
You're worried when the only AS chip released so far is the weakest one there will ever be, and it's already a pretty amazing chip. The only way is up. Don't worry.

As for gaming: if you're serious about gaming on PCs, you don't do it on a Mac. It really can't get much worse than it already is.

The AS future is so bright you're gonna have to wear shades when you use Macs.
 
  • Like
Reactions: huges84