
pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
Versus system-on-chip, which is the way big companies seem to be going? Feels like Betamax versus VHS, although I could be wrong.

Do you understand the VLIW issue? The VLIW effect is additive; it may account for half of the M1's performance advantage over x86.

It's pretty simple if you're an EE or have a CS degree.
 

steve1960

macrumors 6502
Sep 23, 2014
293
300
Singapore
Time will tell, but I have been around the block more than a few times and can usually see what's coming. It's not about any degree; it's about intuition and industry experience. A degree gets you no common sense whatsoever, just education. Common sense got me to where I am today: Senior Vice President of Operations at a world-leading semiconductor company.
 

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
steve1960 said:
Time will tell, but I have been around the block more than a few times and can usually see what's coming. It's not about any degree; it's about intuition and industry experience. A degree gets you no common sense whatsoever, just education. Common sense got me to where I am today: Senior Vice President of Operations at a world-leading semiconductor company.

I worked in industry for ten years before getting my first degree. A couple of employers paid for my Bachelor's. I was in design meetings and didn't understand things that were discussed there, things they assumed everyone else knew, so I got a graduate degree, and then I understood what they were talking about.

You don't have to get a degree - you can just buy textbooks and read them. There are lots of online textbooks that you can read for free.

I'm a bit surprised that you would prefer ignorance to learning something that should be fairly easy to learn.

There are a number of articles on the web about this that try to dumb it down, but you can literally read two paragraphs and understand it with a bit of theory.
 

steve1960

macrumors 6502
Sep 23, 2014
293
300
Singapore
pshufd said:
I worked in industry for ten years before getting my first degree. A couple of employers paid for my Bachelor's. I was in design meetings and didn't understand things that were discussed there, things they assumed everyone else knew, so I got a graduate degree, and then I understood what they were talking about.

You don't have to get a degree - you can just buy textbooks and read them. There are lots of online textbooks that you can read for free.

I'm a bit surprised that you would prefer ignorance to learning something that should be fairly easy to learn.

There are a number of articles on the web about this that try to dumb it down, but you can literally read two paragraphs and understand it with a bit of theory.

30 plus years in the semiconductor industry; I think I have got it, thank you.
 

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
steve1960 said:
30 plus years in the semiconductor industry; I think I have got it, thank you.

Well, then please, by all means, explain the VLIW issue so that everyone here understands one of the two reasons why the M1 is so fast while using very little power.


[Attached screenshot: Screen Shot 2021-01-04 at 1.27.13 PM.png]
 

steve1960

macrumors 6502
Sep 23, 2014
293
300
Singapore
You just know it would take a mammoth response to do justice to this, which is why you set the challenge. Let me condense it. Custom silicon comes with custom software: get it right and you have a stellar-performing computer (the Apple M1); get it wrong in either the silicon OR the software and you are screwed. VLIW works in the same way: get the software to do the hard graft, not the hardware. Let's see how successful it will be compared to Apple hardware and software. Big is not necessarily better; we proved that back in the old days.
 

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
steve1960 said:
You just know it would take a mammoth response to do justice to this, which is why you set the challenge. Let me condense it. Custom silicon comes with custom software: get it right and you have a stellar-performing computer (the Apple M1); get it wrong in either the silicon OR the software and you are screwed. VLIW works in the same way: get the software to do the hard graft, not the hardware. Let's see how successful it will be compared to Apple hardware and software. Big is not necessarily better; we proved that back in the old days.

It's clear that you don't understand VLIW.

You would need to understand what a Decoder does to understand VLIW.
 

steve1960

macrumors 6502
Sep 23, 2014
293
300
Singapore
Let's convene a year from now and see how the computer world looks. One way or another, one of us will be able to say 'I told you so' :)

No hard feelings. Take care and be safe.
 

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
steve1960 said:
Let's convene a year from now and see how the computer world looks. One way or another, one of us will be able to say 'I told you so' :)

No hard feelings. Take care and be safe.

I'll write up something later this afternoon on why VLIW is so important and why it will be impossible for Intel or AMD to do the same thing with x86.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Apple processors work better under certain circumstances than Intel processors. That was always going to be the case, given Apple designed the processor for their own optimal use case. That is what companies do when they design their own chip. The Intel chip is designed to be an all-encompassing chip for all circumstances.

I don’t understand this. How are Intel chips “all-encompassing” and Apple chips narrow-purpose? Intel has a series of different chips for different use cases - Tiger Lake for low-power mobile, (soon) Rocket Lake for consumer/enthusiast desktop, Cascade Lake for workstation/server. So far, Apple has only released a chip that targets the ultra-low-power mobile segment. There is nothing in the design of their processors that would stop them from building a workstation-targeting behemoth - assuming they are interested in doing so.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Relevant. Custom silicon, with an optimized software suite, will always outperform generic silicon from Intel and AMD if selected properly. It comes with a huge R&D overhead, but Apple can afford it.

What is so “custom” about Apple Silicon? Sure, it contains optimizations for some operations that Apple often uses in their code base. But those are still general-purpose operations. And Apple Silicon outperforms Intel and AMD in most general-purpose workloads. For example, AS has very good branch prediction, huge caches and four 128-bit FP units. You can’t brush those components away as “custom accelerators”. It’s just a more capable CPU.

Sure.

But a really big piece is VLIW.

Do you understand why VLIW is such a big issue in performance?

Can you clarify what you mean by “VLIW”? The only term I know with that mnemonic is “very long instruction word”, and I don’t see how that would be even remotely relevant to this discussion.
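
For readers unfamiliar with the term leman defines here, a minimal sketch in plain C of the dependence pattern a classic VLIW compiler would exploit; the function and the bundle/slot wording in the comments are hypothetical illustrations, not any real ISA or compiler output.

```c
#include <stdio.h>

/* Illustration only: in a VLIW design (e.g. Itanium-style EPIC) the compiler
   packs independent operations into one wide instruction word and the
   hardware issues that bundle as-is, with little or no out-of-order logic. */
static int vliw_example(int a, int b, int c, int d) {
    int x = a + b;   /* independent of the next line: a VLIW compiler    */
    int y = c * d;   /* could place these two ops in one wide bundle     */
    return x + y;    /* depends on both x and y, so it goes in a later bundle */
}

int main(void) {
    /* A conventional out-of-order core, like the M1's, finds the same
       parallelism in hardware at run time instead of at compile time. */
    printf("%d\n", vliw_example(1, 2, 3, 4));
    return 0;
}
```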
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
It's all in the software, not the hardware. Custom silicon only works well with great software (M1).

Can you elaborate on this point? I can take any established code, say a compiler, a scientific computation library, a browser, that uses only plain old C or C++, without calling into Apple frameworks, and it will run faster on M1 than on Intel. Is that custom software too?
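
To make that concrete, here is a minimal sketch of the kind of "plain old C" being described: no Apple frameworks, no intrinsics, nothing platform-specific. The workload and sizes are arbitrary choices for illustration.

```c
#include <stdio.h>
#include <time.h>

#define N 512

static double a[N][N], b[N][N], c[N][N];

int main(void) {
    /* fill the inputs with something deterministic */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            a[i][j] = (double)(i + j);
            b[i][j] = (double)(i - j);
        }

    clock_t t0 = clock();
    /* naive matrix multiply: ordinary loads, stores, multiplies and adds */
    for (int i = 0; i < N; i++)
        for (int k = 0; k < N; k++)
            for (int j = 0; j < N; j++)
                c[i][j] += a[i][k] * b[k][j];
    clock_t t1 = clock();

    printf("checksum %.1f, %.3f s\n", c[N / 2][N / 2],
           (double)(t1 - t0) / CLOCKS_PER_SEC);
    return 0;
}
```

The same file compiles unmodified with clang on an Intel Mac and on an M1; any difference in runtime comes from the core's caches, branch predictors and FP units, not from "custom software".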
 

steve1960

macrumors 6502
Sep 23, 2014
293
300
Singapore
leman said:
What is so “custom” about Apple Silicon? Sure, it contains optimizations for some operations that Apple often uses in their code base. But those are still general-purpose operations. And Apple Silicon outperforms Intel and AMD in most general-purpose workloads. For example, AS has very good branch prediction, huge caches and four 128-bit FP units. You can’t brush those components away as “custom accelerators”. It’s just a more capable CPU.



leman said:
Can you clarify what you mean by “VLIW”? The only term I know with that mnemonic is “very long instruction word”, and I don’t see how that would be even remotely relevant to this discussion.

Chip design is not rocket science, or actually it can be, because it is a marriage of software and hardware. I have been doing this in the Bluetooth arena for 20 years and it is not easy. Designing a chip at 5 nm and marrying it to custom software is hugely difficult, not only separately in isolation but together as a true SoC. Do not underestimate what it took to get to where the M1 is today.
 

steve1960

macrumors 6502
Sep 23, 2014
293
300
Singapore
leman said:
Can you elaborate on this point? I can take any established code, say a compiler, a scientific computation library, a browser, that uses only plain old C or C++, without calling into Apple frameworks, and it will run faster on M1 than on Intel. Is that custom software too?

Sure. The company I worked for over the last 20 years had twice as many software engineers as hardware engineers and budgeted three times as much for software development as for hardware development. I have no doubt about your low-level software abilities in a particular focus, but how about mass-market, diverse applications? How many use cases could you support alone, given multiple silicon revisions, ongoing software revisions, customer support and bug fixing?
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Everyone forgets it's not just the silicon; if you don't get the software right, it comes to nothing.

I really don’t understand what you are trying to say. The M1 literally runs the same software as Intel. Maybe some low-level OS components are optimized specifically for M1, but most code out there was written long before Apple Silicon was a speculation on obscure tech forums.

The reason why the M1 is fast is that it is designed for running code that exists in the real world. Large caches, best-in-class branch prediction, fast atomic operations, memory parallelism, a wide execution backend, sophisticated prefetch - all those are things that Apple implemented by carefully examining existing software and its needs. You can call this “custom silicon”, but I’m not sure that label has any practical meaning then.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Relevant. Custom silicon, with an optimized software suite, will always outperform generic silicon from Intel and AMD if selected properly. It comes with a huge R&D overhead, but Apple can afford it.
Agreed. We have 8 CPU cores, 8 GPU cores (in the higher-end version), and 16 Neural Engine cores for machine learning. Intel/AMD currently do not have dedicated machine-learning cores, which can definitely be used by a lot of software. The M1 also contains the Secure Enclave, low-power video playback, an image signal processor and more: things that Apple and software can make use of to improve performance. Some of these are things a generic processor cannot include.
 

steve1960

macrumors 6502
Sep 23, 2014
293
300
Singapore
I think it's a wash, to be honest. I have 20 years' experience in the semiconductor industry, watching how hard it is to marry software with hardware. You need twice as many software engineers as hardware engineers to even get close to making a chip successful. I can't convince you; you would have had to live my life for the past 20 years!
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
leman said:
Can you clarify what you mean by “VLIW”? The only term I know with that mnemonic is “very long instruction word”, and I don’t see how that would be even remotely relevant to this discussion.

I think that is what is being mentioned, and it does impact the complexity of decoders. Specifically, AMD has commented on how they noticed there's an upper limit on the useful number of decoders on x86 due to the need to handle these kinds of instructions, even though they are rarely used. The A14/M1 architecture is already beyond that particular limit, and it does benefit from it.
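
A rough sketch of the boundary-finding problem this refers to, assuming a toy length function rather than a real x86 decoder: with variable-length encodings, the start of instruction N+1 depends on the decoded length of instruction N, whereas a fixed 4-byte encoding lets every decoder start at a known offset in parallel.

```c
#include <stddef.h>
#include <stdio.h>

/* Toy length rule, for illustration only; a real x86 length decoder must
   examine prefixes, opcode, ModRM, SIB, displacement and immediate fields. */
static size_t insn_length_x86(const unsigned char *p) {
    return (p[0] == 0x90) ? 1 : (p[0] == 0x48) ? 3 : 2;
}

int main(void) {
    const unsigned char code[] = {0x90, 0x48, 0x01, 0x02, 0x90, 0x31, 0x05};
    const size_t n = sizeof code;

    /* Variable-length ISA: the offset of instruction i depends on the decoded
       lengths of instructions 0..i-1, so finding boundaries is inherently
       serial - this is what caps how many x86 decoders are worth building. */
    for (size_t off = 0; off < n; ) {
        size_t len = insn_length_x86(&code[off]);
        printf("variable-width insn at offset %zu, %zu bytes\n", off, len);
        off += len;
    }

    /* Fixed-length ISA (AArch64: always 4 bytes): instruction i starts at
       4*i, so eight decoders can all grab their bytes at known offsets. */
    for (size_t i = 0; i < 4; i++)
        printf("fixed-width insn %zu starts at offset %zu\n", i, i * 4);

    return 0;
}
```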

leman said:
fast atomic operations

This is probably the closest thing to "custom" in Apple's CPU design (EDIT: specifically CPU design, not SoC). But that's more to support modern languages like Swift (and Objective-C using ARC). There are a lot of other multi-threaded scenarios that would benefit from this as well, though, so even that's not terribly custom; it's just examining existing usage of the CPU and adapting to it.
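
As an illustration of why uncontended atomic speed matters for ARC-style code, here is a minimal reference-counting sketch in C11. It is not Apple's runtime and the names are made up, but every retain/release maps onto one atomic read-modify-write like the ones below.

```c
#include <stdatomic.h>
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    atomic_uint refcount;
    int payload;
} object_t;

static object_t *object_create(int payload) {
    object_t *o = malloc(sizeof *o);
    if (!o) return NULL;
    atomic_init(&o->refcount, 1);
    o->payload = payload;
    return o;
}

static void object_retain(object_t *o) {
    /* one atomic increment per retain; relaxed ordering suffices for this */
    atomic_fetch_add_explicit(&o->refcount, 1, memory_order_relaxed);
}

static void object_release(object_t *o) {
    /* release on the decrement, acquire before freeing, so the last owner
       sees every write made while other references were still alive */
    if (atomic_fetch_sub_explicit(&o->refcount, 1, memory_order_release) == 1) {
        atomic_thread_fence(memory_order_acquire);
        free(o);
    }
}

int main(void) {
    object_t *o = object_create(42);
    if (!o) return 1;
    object_retain(o);      /* e.g. handing a second reference to another owner */
    printf("payload: %d\n", o->payload);
    object_release(o);
    object_release(o);     /* the last release frees the object */
    return 0;
}
```

Object-heavy Swift or Objective-C code performs these operations constantly, so shaving cycles off each uncontended atomic adds up.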
 

Devyn89

macrumors 6502a
Jul 21, 2012
964
1,801
There are people not motivated solely by money, but instead integrity and ethics. It is also possible to have a focus on making great money and yet still have integrity and moral principles. What a novel idea...

By the way, clicking the angry face on every single post of mine is not a good look. You should probably get that rage checked out. It’s not healthy. Your username reflects as much.
It’s possible for individuals to, sure. But Apple is a public company whose sole job is to maximize profits for shareholders. Does Apple have some good values and policies? Sure, but it’s all in the interest of maximizing profits. Privacy is a good example (and a reason I choose Apple, actually, and a reason lots of Apple users choose Apple too). This allows them to cater to a certain segment of the market. I would bet money that if no one cared about privacy and security, Apple would abandon those principles because they would no longer be profitable. These are companies trying to make money, no more, no less. Sometimes they make cool things and make good choices while trying to make money, but those choices are made because of market pressures, not to do the right thing.
 