
xxray

macrumors 68040
Jul 27, 2013
3,115
9,412

If you want to feel skeptical, this article does a good job of getting you there. It discusses how Apple has had "brain drain" recently on its chip design team and how the A15 appears to show no significant performance improvements over the A14.
 

poorcody

macrumors 65816
Jul 23, 2013
1,339
1,584
If you want to feel skeptical, this article does a good job of getting you there. It discusses how Apple has had "brain drain" recently on its chip design team and how the A15 appears to show no significant performance improvements over the A14.
It does sound disconcerting. I wonder how many of those 100 engineers were from the Apple Silicon design team. It seems to me that keeping the team top-notch is essential to staying ahead.

I wonder if Apple will keep macOS running on Intel in the lab...
 

Hexley

Suspended
Jun 10, 2009
1,641
505
Part of the reason I don’t go to family gatherings anymore is because nearly everyone in my family works in medicine. I’m left out of every conversation for that reason!
I'd use COVID as the reason not to show up. The odds of them being in a working environment with a high viral load are higher than yours will ever be.
 

Hexley

Suspended
Jun 10, 2009
1,641
505

If you want to feel skeptical, this article does a good job of getting you there. It discusses how Apple has had "brain drain" recently on its chip design team and how the A15 appears to show no significant performance improvements over the A14.
I'd wait for benchmarks before believing clickbait articles.

The purported brain drain would only manifest itself years later, and there isn't a shortage of engineering talent. You just need to offer better pay and working conditions.

See these articles from 2011 and 2013
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
I'd wait for benchmarks before believing clickbait articles.

The purported brain drain would only manifest itself years later, and there isn't a shortage of engineering talent. You just need to offer better pay and working conditions.

See these articles from 2011 and 2013

Just go to LinkedIn and search for CPU designers, VLSI designers, SoC engineers, etc., and you will see that there is no brain drain. People leave jobs in Silicon Valley all the time. No one person is going to make a difference. Only in the rare situation where the majority of a team on a given project quits at once (I saw that happen once) is there an impact.
 

Hexley

Suspended
Jun 10, 2009
1,641
505
Just go to LinkedIn and search for CPU designers, VLSI designers, SoC engineers, etc., and you will see that there is no brain drain. People leave jobs in Silicon Valley all the time. No one person is going to make a difference. Only in the rare situation where the majority of a team on a given project quits at once (I saw that happen once) is there an impact.
Thank you for agreeing with me that the article's just meant to worry worrywarts.

Fortunately for people like me, my carrier will receive their shipment of iPhones by December, so whatever flaws there are will be highlighted between now and then.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
Just go to LinkedIn and search for CPU designers, VLSI designers, SoC engineers, etc., and you will see that there is no brain drain. People leave jobs in Silicon Valley all the time. No one person is going to make a difference. Only in the rare situation where the majority of a team on a given project quits at once (I saw that happen once) is there an impact.
You know, now that we've mentioned him, every time there was a big interview with Jim Keller or Raja Koduri or the like, I always thought, "I acknowledge that you're probably very talented to have earned all this acclaim, but how much can one single person really matter in designing something as big as a CPU/SoC/GPU?"
Though at the same time, I suppose there's also the management factor: people who are good at assembling that great team in the first place. Hiring the right people and whatnot.

Any reason the majority of a team left all at once in that story you allude to?
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
You know, now that we've mentioned him, every time there was a big interview with Jim Keller or Raja Koduri or the like, I always thought, "I acknowledge that you're probably very talented to have earned all this acclaim, but how much can one single person really matter in designing something as big as a CPU/SoC/GPU?"
Though at the same time, I suppose there's also the management factor: people who are good at assembling that great team in the first place. Hiring the right people and whatnot.

Any reason the majority of a team left all at once in that story you allude to?

I've seen whole teams leave (I was part of one once), and it's usually out of loyalty to a manager who has demonstrated great loyalty and good management to employees in the past.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
You know, now that we've mentioned him, every time there was a big interview with Jim Keller or Raja Koduri or the like, I always thought, "I acknowledge that you're probably very talented to have earned all this acclaim, but how much can one single person really matter in designing something as big as a CPU/SoC/GPU?"
Though at the same time, I suppose there's also the management factor: people who are good at assembling that great team in the first place. Hiring the right people and whatnot.

Any reason the majority of a team left all at once in that story you allude to?

Even-numbered chips (K6, K8, etc.) were being done in California, and odd in Texas, and the teams didn’t get along. I was working on K6 revisions, though early work on the first version of K8 was happening. Anyway, people were getting cranky about being stuck on K6, or were tired of the industry, and a few key people left. Many of us had also come into AMD via their acquisition of Nexgen, or had started work at “AMD” with the Nexgen team in Milpitas post-acquisition. They had sort of saved the company, because “K5” was terrible, and the original K6, done by Texas (which was replaced by the renamed Nx586 from Nexgen), was also not good. But now they had achieved some success on K6 and wanted to move on to the next thing, but they weren’t able to for various reasons, felt disrespected by AMD, didn’t like the move from Milpitas into AMD’s Sunnyvale headquarters, etc.

Anyway, Dirk Meyer, then in charge of the Texas team, flew to California, supposedly to cheer us up. Instead, at a group meeting, he said something like “If you can’t hack it here, leave.”

A *ton* of people took him up on his offer.

That’s why we ended up with something like 15 designers, plus a few verification and layout engineers, and we had to start K8 over from scratch, and make it something achievable by a small number of us, each wearing lots of hats. And that, boys and girls, is how x86-64 née AMD64 was born.
 

sgtaylor5

macrumors 6502a
Aug 6, 2017
724
444
Cheney, WA, USA
It appears Apple has not changed the CPU much this generation. SemiAnalysis believes that the next generation core was delayed out of 2021 into 2022 due to CPU engineer resource problems. In 2019, Nuvia was founded and later acquired by Qualcomm for $1.4B. Apple’s Chief CPU Architect, Gerard Williams, as well as over 100 other Apple engineers, left to join this firm. More recently, SemiAnalysis broke the news about Rivos Inc, a new high performance RISC V startup which includes many senior Apple engineers. The brain drain continues and impacts will be more apparent as time moves on. As Apple once drained resources out of Intel and others through the industry, the reverse seems to be happening now.

We believe Apple had to delay the next generation CPU core due to all the personnel turnover Apple has been experiencing. Instead of a new CPU core, they are using a modified version of last year’s core. One of these modifications is related to the CPU core’s MMU. This work was being done for the upcoming colloquially named “M1X” generation of Mac chips. Part of the reason for this change is related to larger memory sizes and virtualization features/support. In addition, there may be other small changes as well, but we need hardware in the hand to analyze that. We also aren’t sure if Avalanche and Blizzard are the next generation cores or the current modified Firestorm and Icestorm cores.

excerpted from: https://semianalysis.substack.com/p/apple-cpu-gains-grind-to-a-halt-and
A perfect illustration of posts #461, #463 and others.

Hope Apple can recover quickly enough.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Interesting. Good to know @cmaier. As an aside and being sincere, what would I look for in that article to tell me that it is false? I would never want to post incorrect information.

Like I've said: go to LinkedIn and you can find all the CPU engineers who still work at Apple and have for years. Nuvia was a tiny company, around a year old when it was acquired; no way it even had 100 employees.
 

wyrdness

macrumors 6502
Dec 2, 2008
274
322
Anyway, Dirk Meyer, then in charge of the Texas team, flew to california, supposedly to cheer us up. Instead, at a group meeting, he said something like “If you can’t hack it here, leave.”

A *ton* of people took him up on his offer.

The term for that is 'seagull management'. Fly in, sh** on people and fly out again.
 

ADGrant

macrumors 68000
Mar 26, 2018
1,689
1,059
The smart kids who were programming Apple II games in high school were peeking and poking directly into RAM, or via routines made out of poked 6502 op codes. When they saw C pointers, they said, "oh, that's poke with a cleaner syntax" and kept going.

When first using Swift for numeric crunching, I wrote mini-benchmarks in both C and Swift, and dumped the optimized compiled results in assembly language from Xcode. Found out that with careful use of types and memory allocation, one could get very similar op code execution paths out of Swift.
Peeking and poking were really for people typing in listings from magazines or manipulating memory-mapped hardware. For actual code it was much simpler just to use a macro assembler, and since those 6502 CPUs were so slow, assembly was the only way to get decent performance.

There are hobbyists still writing games in 6502 assembler just for fun.
 

smulji

macrumors 68030
Feb 21, 2011
2,998
2,889
Erm, no. I've been buying Apple products for decades as well lol.

The others (amd, intel, etc.) WILL come on strong, the added competition from Apple will just increase innovation, and I don't think that Apple will go back to intel.

I think that the biggest problem will be getting developers on board for Apple. Apple will have to take a huge chunk of the market for developers to bother developing specifically for Apple Silicon, and that's something we just can't predict at this moment.
The biggest base of Apple developers comes from iOS and iPadOS development, not AppKit. Using Catalyst, it isn't a huge leap to create a macOS app if a developer already has an iPadOS app.
 

smulji

macrumors 68030
Feb 21, 2011
2,998
2,889
Isn't going to happen for decades, if ever. Backwards compatibility is king in Windows land.
Starting with Windows 11, MS has implemented their own version of Rosetta that allows x86 Windows apps to run much more smoothly on Windows on ARM.
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
Starting with Windows 11, MS has implemented their own version of Rosetta that allows x86 Windows apps to run much more smoothly on Windows on ARM.
What makes you think x86 binary translation is new for Windows 11? The 32-bit version has been in Windows 10 on Arm since at least the release of the first Surface Pro X. Microsoft also had 64-bit translation in Windows 10 betas but moved its release to Windows 11.
 