
bradl

macrumors 603
Jun 16, 2008
5,952
17,447
lol, in the consumer computer market? They don't make any of them except servers or supercomputers. Even HP uses Intel, AMD, and Nvidia parts.

The red text is what is known as moving the goalposts. You said nothing about which market, only that they never made their own CPUs. They do make their own CPUs and use them in their own hardware, and have for a good 30+ years. It would help to be specific about what you say, because history and current offerings from those companies prove your statement completely wrong.

BL.
 

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
The red text is what is known as moving the goalposts. You said nothing about which market, only that they never made their own CPUs. They do make their own CPUs and use them in their own hardware, and have for a good 30+ years. It would help to be specific about what you say, because history and current offerings from those companies prove your statement completely wrong.

BL.
Apple makes consumer computers and workstations, and all M1 Macs are definitely consumer computers. What else would they be? You probably need to read before you even comment.
 

iamMacPerson

macrumors 68040
Jun 12, 2011
3,488
1,927
AZ/10.0.1.1
Y’all are acting like Apple has always used x86 chips and a platform change has never occurred. The Mac was originally based on 68k CPUs before switching to PowerPC, which they used for 11 years before moving to Intel. Guess what? Every time Apple moved platforms, developers moved too. macOS is the second biggest desktop OS in the world (not counting that web browser with a file browser extension that Google calls an OS). If developers dropped support for macOS they would lose a big chunk of business, especially in desktop publishing and design/creative programs, where Macs still reign supreme and have for decades.
 
  • Like
Reactions: KeithBN and Homy

bradl

macrumors 603
Jun 16, 2008
5,952
17,447
Apple makes consumer computers and workstations, and all M1 Macs are definitely consumer computers. What else would they be? You probably need to read before you even comment.

I did. Your post stated:

Are you even aware that nobody can make computers without Intel, AMD, and Nvidia for a long period of time? Apple can make their own CPU and GPU, which is a shockingly big advantage that nobody else has so far.

IBM's p-series computers use RISC-based POWER CPUs, which have been in their servers since 2000.


HP-UX runs on PA-RISC and IA-64 based CPUs, which have been in use since 1982. They just had a release of their latest CPUs for their servers.


Sun/Oracle released the SPARC M8 processors in 2017, which are used with Solaris and OpenSolaris. They have used SPARCs since the late 1980s, and before that Motorola 680x0 CPUs with SunOS. SPARCs are still used to this day.


Your comment is blatantly wrong, and perhaps a little practicing of what you preach is in order here, because there are plenty of examples going back 30-40 years that prove your statement completely wrong.

BL.
 

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
I did. Your post stated:



IBM's p-series computers use RISC-based POWER CPUs, which have been in their servers since 2000.


HP-UX runs on PA-RISC and IA-64 based CPUs, which have been in use since 1982. They just had a release of their latest CPUs for their servers.


Sun/Oracle released the SPARC M8 processors in 2017, which are used with Solaris and OpenSolaris. They have used SPARCs since the late 1980s, and before that Motorola 680x0 CPUs with SunOS. SPARCs are still used to this day.


Your comment is blatantly wrong, and perhaps a little practicing of what you preach is in order here, because there are plenty of examples going back 30-40 years that prove your statement completely wrong.

BL.

Lame, you're clearly wrong, since I'm talking about now, in the consumer market, while we are talking about Apple Silicon. I told you it's about the consumer market, and yet you brought up server-grade parts. I'm not wrong. You are.
 

SlCKB0Y

macrumors 68040
Feb 25, 2012
3,431
557
Sydney, Australia
All you need to know is that, until now, the performance increases in Apple's chips over the last 7 years have been consistently linear. So consistent that I believe they were purposely and artificially holding back their chips because of the mobile use case, until they gave us a pretty big jump with the M1.

Sure, both Intel and AMD have things in the pipeline, but Apple has something that they don't: an unprecedented amount of headroom with regards to power consumption and thermals. Add to this the inherent efficiencies gained by Apple's specific ARM implementation/package architecture, their lack of legacy support and their vertical integration through the entire software stack, and you have an extremely solid foundation to build upon.

Think of it like this - Apple is pretty close to matching the single-core performance of the best their competitors currently have, and they are doing so at clocks of 3.x GHz vs 5 GHz, whilst consuming a fraction of the power. Apple has the opportunity to use all of this to now start scaling horizontally (increasing core counts) and vertically (clock speeds), without fighting heat and power consumption the whole way like the others - and that's just on this process node.

I think once we get into the pro machines, Apple is going to step things up significantly and very comfortably destroy the competitors in terms of performance. I then think they will drip feed performance increases to keep in line or just ahead of the competition whilst having a lot in reserve - they won’t want to blow their load too early. This then gives them breathing space in their hardware release roadmap - they can focus on R&D of what they will do in say, 5 years from now, instead of spending all their resources on being reactive to the market and playing catch up, like I’m sure Intel is right now. This is the period where Apple needs to be extremely vigilant around complacency.

With regards to the concerns voiced by the OP about applications and gaming: for those developers that already have Intel macOS apps, it's trivially easy to port to ARM, with the exception of apps that can't rely solely on high-level APIs and libraries and must interact with the bare metal - the most glaring example being virtualisation software.

With regards to gaming, we are already seeing the emergence of projects aimed at acting as compatibility layers that translate OpenGL/Vulkan function calls to Metal with a negligible performance hit.

Looking at developer support on Apple ARM in particular, don't forget that Apple has basically trained a generation of hundreds of thousands (if not millions) of devs to code in this environment for iOS devices; the switch to ARM macOS is then a relatively easy move.

With regards to ARM support more broadly over the next 5+ years: as long as Microsoft can nail down their x86 and, more importantly, x86-64 compatibility on ARM (especially if they develop their own ARM implementation, which will be driven by their data centre requirements), I can see a situation where the majority of Windows laptops will be ARM based. Especially given we're likely to see much more focus from other companies with ARM competence, such as Nvidia and maybe even Samsung, pivoting towards higher-end ARM products for laptops.

You’ll then see this weird fragmentation where consumers are using Windows on ARM, most data centre Cloud workloads will be on ARM and the last holdout will be Windows in on-prem enterprise or data centre bare-metal use-cases (both of which are very quickly becoming legacy), being pressured along from both the consumer low end and the very high end enterprise market segments (a kind of piggy in the middle).

Even if Microsoft is sluggish in getting their own ARM silicon up to powering Azure, they will be getting immense pressure from AWS and Google Cloud with their proprietary ARM chips, and from their users then running Windows workloads in the cloud on ARM. I'm sure Microsoft is rapidly adapting all of their licensing models for both desktop and server to include ARM - it would be crazy of them to retain exclusivity of ARM Windows; that is the Microsoft of 15 years ago under Steve Ballmer.
 
Last edited:
  • Like
Reactions: CE3 and grandM

williedigital

Cancelled
Oct 4, 2005
499
131
I wasn't advocating for Electron apps. But even native code written in C, C++, Swift, Rust, Zig, you name it, will just need to be recompiled and it'll be ready to go on Apple Silicon. Not much, if any, work needed at all in most cases. I just acknowledged that a lot of programs are already written using web tech, and that's inherently portable.
If it's just this "check the box to make an Apple Silicon version", why is there still so much software on the Mac without a version that runs natively, 1.5 years into the transition? It's not like it's abandonware, and the devs always seem to say something like they have a "dependency" that needs updating first.
 

thenewperson

macrumors 6502a
Mar 27, 2011
992
912
All you need to know is that, until now, the performance increases in Apple's chips over the last 7 years have been consistently linear. So consistent that I believe they were purposely and artificially holding back their chips because of the mobile use case, until they gave us a pretty big jump with the M1.
Was the M1 a "pretty big jump" actually? It was pretty big compared to the Intel chips in (most of) the target machines, sure, but what we got is pretty consistent with what simple extrapolations of previous A and AX chips gave us, so I think this conspiracy theory is silly.
 

k27

macrumors 6502
Jan 23, 2018
330
419
Europe
The power is there, but AMD and even possibly Intel will catch up in time.
In terms of performance, Apple has to catch up (M1X, M2), not vice versa. A current Ryzen in a laptop is sometimes significantly faster in corresponding applications because of its 8 cores.
And the integrated hardware encoding in the M1 is super fast but very poor in terms of quality and file size. That's why I don't use it in my M1 (I use x264 and x265 on the CPU).
But hardware encoders are usually not good. An exception could be Nvidia NVENC.

However, due to the snooping that Apple wants to introduce, my interest in new Apple hardware has dropped considerably already.
 
  • Like
Reactions: g75d3

SlCKB0Y

macrumors 68040
Feb 25, 2012
3,431
557
Sydney, Australia
Was the M1 a "pretty big jump" actually? It was pretty big compared to the Intel chips in (most of) the target machines, sure, but what we got is pretty consistent with what simple extrapolations of previous A and AX chips gave us, so I think this conspiracy theory is silly.

I don't believe I spoke of any "conspiracy theory", which seems to be a very overused term. I spoke of an extremely common corporate strategy across many industries. Forgetting about any physical constraints your product might have: when you have a product with an annual release cadence, you increase its performance just enough to keep consumers happy that incremental gains are being made, but not so much that you either increase your costs or create an unreasonable expectation of large year-on-year performance gains by releasing an iteration with a jump significantly larger than your previous releases, even though you have the technology to do so.

There is nothing conspiratorial about this - it’s just a good business practice when you have a very capable technology on your hands. It is also very good from a risk perspective - if you’ve held back on your performance ceiling and run into some implementation issues for a couple of years, you’ve still got performance reserves to continue on your incremental performance increases without any adverse outcomes.

In contrast to this, a company that’s not in a great place relative to competitors, WILL claw and scratch to put every ounce of performance improvement it can into each release. See Intel right now.

So yes, of course Apple carefully drip feeds us improvements year on year and no, it isn’t a “conspiracy theory”.
 
  • Like
Reactions: Codpeace and grandM

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
Lame, you're clearly wrong, since I'm talking about now, in the consumer market, while we are talking about Apple Silicon. I told you it's about the consumer market, and yet you brought up server-grade parts. I'm not wrong. You are.
It doesn't really matter if we constrain ourselves to consumer devices. The Raspberry Pi is a consumer device running a Broadcom ARM SoC, and Qualcomm's Snapdragon line includes several ARM laptop chips. Microsoft now has, in beta, a Rosetta equivalent for running x86 on ARM on Windows, and while the whole consumer market couldn't currently be satisfied without AMD, Nvidia and Intel, there certainly already exist consumer devices that don't rely on them. Take out TSMC and GlobalFoundries, on the other hand, and we get serious issues, haha.
If it's just this "check the box to make an Apple Silicon version", why is there still so much software on the Mac without a version that runs natively, 1.5 years into the transition? It's not like it's abandonware, and the devs always seem to say something like they have a "dependency" that needs updating first.
The answer's right there :) - Dependencies. This is, for example, the reason why it's near impossible for games on Steam to be M1 native. Disco Elysium, for example, has an Apple Silicon native version on the Mac App Store, but the Steam version runs through Rosetta. The reason is that when Steam launches a game, it's launched with a Steam dylib in its memory space, and that dylib is x86, as is the launcher code Steam uses. So if the game ships ARM binaries we have mixed instruction streams and it'll crash on launch. The developer can make their own code Apple Silicon native by just updating their build config, but they have dependencies that haven't been updated - in this example they depend on Valve to update Steam and its libraries. And because of the way Steam's launcher and Steam integration in games work, it's one of the 5% of cases that requires a bit more effort than the checkbox. Some studios circumvent this by having their own launcher be the first thing that shows, offering to launch the ARM binary without full Steam integration, or to launch through Rosetta with Steam integration.

Programs like QEMU and Parallels are also more work than a checkbox, but assuming you don't rely on any packages that haven't been updated and you only need to consider your own codebase, it's typically a trivial process. If your software depends on some other software, though, it doesn't matter that it's trivial for you. All you can do is wait. And it may even be trivial for the people you're waiting on, but they're waiting on someone else too, and so on. At the end of the chain there may be software for which it is not trivial, or software with unreliable maintainers who are just taking their time.

The end result of all this is still that it'll take some time for everything to be Apple Silicon native - even though 95% of software will be low effort to port, we're to some degree held back by the 5%. But it also means that it won't be a hurdle for future developer interest in the Mac platform. The CPU architecture difference between the Mac and the average PC is not a substantial development difference to consider - the rest of the software stack your code sits on top of may be. If your Windows software relies on WinForms and DirectX and all sorts of other Windows-only APIs, you'll still have to go through a lot of effort to port it, but that was also the case for Intel Macs.
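To make the "just update your build config" part concrete, here's a minimal sketch of what that change amounts to in an Xcode .xcconfig file. ARCHS and ONLY_ACTIVE_ARCH are the standard Xcode build settings; everything else about the project is assumed:

```
// Build a universal binary with both Apple Silicon and Intel slices.
ARCHS = arm64 x86_64

// Debug builds normally compile only the active architecture;
// turn that off when producing a universal build.
ONLY_ACTIVE_ARCH = NO
```

If any linked dependency only ships an x86_64 binary, the arm64 slice fails to link, and you're back to waiting on whoever maintains it - which is exactly the dependency chain described above.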
 

k27

macrumors 6502
Jan 23, 2018
330
419
Europe
I wouldn't be surprised if a future Mac Pro no longer has ECC Ram. Apple will have suitable excuses, which are just as strange as for the missing checksums (data) with APFS.
 

Bug-Creator

macrumors 68000
May 30, 2011
1,785
4,717
Germany
The answer’s right there :) - Dependencies.

Sure, that is true for the moment, but ARM is gaining a bigger foothold day by day. Windows on ARM is a thing, Macs will soon all be ARM (and a few years down the road any Intel Mac will be obsolete), and on top of that, everything mobile is ARM too.

So if your framework/library isn't platform agnostic it will be sidelined in the near future.

As to Intel/AMD catching up:
- Apple has dropped 32-bit support in both SW and HW, freeing silicon real estate that can be used elsewhere without making the chip bigger
- ARM is more suitable for modern CPU design
These two alone mean that x86 needs to be much better on the engineering side to get similar results.

- Apple, for the time being, has access to fabs that are one generation ahead of what everyone else can get
- I've heard that the M1 effectively has a 6-channel RAM interface, something that just isn't feasible in those form factors with DIMMs (and even if it were, you'd end up with higher power usage and latency)

I still think Apple did in some form f### up with the M1X and lost at least 6 months on it. Whether that will have any impact on the M2/M3 generation of AS remains to be seen, but right now they are still ahead of the competition, just not as much as last year.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
- I've heard that the M1 effectively has a 6-channel RAM interface, something that just isn't feasible in those form factors with DIMMs (and even if it were, you'd end up with higher power usage and latency)

Pretty sure it's 8-channel. And that's totally possible with DIMMs - the current Intel Mac Pro has 8-channel memory. It's just really only seen on super-high-end workstation/server chips.
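As a rough sanity check on what all those channels buy you, here's a back-of-envelope bandwidth calculation. The figures used are the widely reported ones for the M1 (8 × 16-bit LPDDR4X channels at 4266 MT/s), not official Apple specs:

```python
# Peak theoretical memory bandwidth = bus width (bytes) x transfer rate.
channels = 8                # reported LPDDR4X channel count
bits_per_channel = 16       # LPDDR4X channels are 16 bits wide
transfers_per_sec = 4266e6  # LPDDR4X-4266, in transfers/second

bus_bytes = channels * bits_per_channel / 8   # 128-bit bus -> 16 bytes
bandwidth_gb_s = transfers_per_sec * bus_bytes / 1e9

print(f"{bandwidth_gb_s:.2f} GB/s")  # ~68.26 GB/s
```

That lines up with the ~68 GB/s figure usually quoted for the M1, and it's why a dual-channel DIMM setup at comparable speeds can't get close.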
 
  • Like
Reactions: KeithBN

k27

macrumors 6502
Jan 23, 2018
330
419
Europe
I still think Apple did in some form f### up with the M1X and lost at least 6 months on it, wether that will have any impact on M2/3 generation of AS is to be seen but right now they are still ahead of the competition just not as much as last year.
That is wrong. They are behind in terms of performance. There are 8-core Ryzen laptops out there with current NVIDIA GPUs that destroy the M1 in applications such as DaVinci Resolve. When it comes to power consumption, though, the M1 is really ahead.
 
  • Like
Reactions: g75d3

Bug-Creator

macrumors 68000
May 30, 2011
1,785
4,717
Germany
And that's totally possible with DIMMs - The Mac Pro currently

I'm pretty sure I qualified "form factor" :p

I also think there is a reason why laptops all come with 1 or 2 channels and 2 DIMM slots. Heck even all pedestrian desktops are dual channel (with 2 or 4 DIMMs).

Adding more channels/slots on a PCB means the CPU/SoC needs far more contacts and many more traces that have to go further, resulting in massive cost increases (as you will have to add more layers or get really creative).
You will also hit a wall if you want to drive tight timings and/or low voltages, so it's no surprise that just isn't a thing on consumer HW, and for sure not in laptops.
 
  • Like
Reactions: casperes1996

Kung gu

Suspended
Oct 20, 2018
1,379
2,434
I still think Apple did in some form f### up with the M1X and lost at least 6 months on it. Whether that will have any impact on the M2/M3 generation of AS remains to be seen, but right now they are still ahead of the competition, just not as much as last year.
Ming-Chi Kuo always stated that the M1X MacBooks were to be released in late Q3/Q4.

Mark Gurman also said Sep-Nov for the M1X MacBook.

As long as the iPhone ships in Q3/Q4 every year, Apple Silicon will not be behind, as that's where they release the new microarchitecture in the A series.
 
  • Like
Reactions: KeithBN

Kung gu

Suspended
Oct 20, 2018
1,379
2,434
There are 8-core Ryzen laptops
But there are no 8 Firestorm cores in the M1 yet. The M1 is currently 4 Firestorm (big) + 4 Icestorm (little) cores.
The M1 GPU is around a GTX 1050/Ti, so of course current-gen NVIDIA laptop cards beat the M1.

M1X should have 8 Firestorm cores and more GPU cores.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,450
Interesting. So you think that the whole market will shift towards ARM, because it is inherently superior? Which, in turn, would force developers to focus on developing for ARM, and it won't matter if they develop for Apple ARM or AMD Arm, since it will be the same thing, like in the case of AMD and Intel today?
The market *has* shifted towards ARM! Mobile tech has subsumed a significant chunk of the PC market and is dominated by ARM and Unix/Linux-related OSes (iOS, macOS, Android) despite Wintel's best efforts. That has broken the back of the Windows/x86 monopoly. Wintel is too big to vanish overnight, but it is now in decline. Linux has been eating away at Windows from the server end too, mainly on x86, but x86 to ARM is a much easier switch with Linux than with Windows...

Rather than ARM only, likely the whole market will simply shift away from dependency on binary compatibility with one particular CPU instruction set, as the need to run 20-30-year-old legacy code (some of which is still 8/16-bit) gradually shrinks to a niche market. That was mainly an IBM PC/DOS/Windows thing anyway; Unix was always more oriented towards high-level source compatibility, and most modern OSes that aren't Windows take after Unix.

Rosetta 2 has shown how good code translation has got, modern Windows .NET code is distributed as byte code for a virtual machine rather than x86 binaries, and even the Apple App Store can accept byte code from which it can generate optimised binaries for different processors (AFAIK just ARM variants at the mo, but I'm not suggesting this is all ready to roll today). "Real programmers" look down on JavaScript, WebAssembly, Electron etc., but they're pretty powerful (and now you can write in Rust and compile to WebAssembly). I think the future will be duelling application frameworks/libraries (e.g. DirectX vs Metal vs web tech), with only a hard core of systems programmers needing to worry their heads about CPU architecture.

x86 processors already use an x86 instruction decoder feeding a RISC-like core. I don't know enough to suggest that Intel/AMD could "just" dispense with the decoder and use the RISC core, but it seems like they could make a non-x86 RISC-like processor without starting totally from scratch.
 

Devin Breeding

macrumors 6502
May 2, 2020
296
251
Conway SC
The power is there, but AMD and even possibly Intel will catch up in time. I fear the future market fragmentation, with developers having to develop specifically for Apple Silicon ARM and just not having the time to do so.

Not to mention that games for Apple Silicon are just not a thing, and gaming is a huge part of the PC market, and realistically will probably never be a thing, since Apple and gaming just don't work together.

Even the new 10nm Intel CPUs will be much better than before, and AMD is already doing great in raw power.

The idea of Apple controlling both software and hardware is great, something they've been trying to do for decades, but the big question is how the support from the developers will be.

I look forward to the power, but I'm just not so sure about the future.

I am a complete noob and have no idea what I'm talking about in this area, but I'm just wondering what other people here think.
Having to code for specific hardware and OS is coming to an end anyway, isn't it? It seems the future of high-power applications and console gaming will be letting servers do the hard work.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Having to code for specific hardware and OS is coming to an end anyway, isn't it? It seems the future of high-power applications and console gaming will be letting servers do the hard work.
Internet access latency/speed is the enemy of that. It's just not workable around here.
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
The power is there, but AMD and even possibly Intel will catch up in time.
This is absolutely inane thinking. You're suggesting that Apple will stop innovating right now with AS, that they have no further plans to make more powerful and efficient processors, and that they have zero interest past the current M1. You realize that's the ONLY way Intel or AMD could catch up, right? Or are you in denial?

Also, I'm not sure why you see this as some "competition". Apple is making processors for their computers to run macOS. AMD and Intel processors, moving forward, will be built into Windows machines, so why does it matter what they do vs. what Apple does? It's not like people will dump Macs simply because Windows machines have better processors. Not everybody wants to run Windows.

On one end of the stick, people complain that Apple is no longer innovating in regards to the Mac. Then they make something revolutionary, enough to make Intel cry like a child, and the M1 is not only getting great press but amazing reviews - and people just want to bring Apple down, often because they can't stand to see them on top. SMH.
Not to mention that games for Apple Silicon are just not a thing, and gaming is a huge part of the PC market, and realistically will probably never be a thing, since Apple and gaming just don't work together.
Yeah, and nobody saw the M1 coming either. Apple doesn't have to speak publicly about what their plans are. If people don't think Apple knows what they are doing at this point, then please leave Apple alone and focus on another company's.....cough cough, "Lack of Innovations".
Even the new 10nm Intel CPUs will be much better than before, and AMD is already doing great in raw power.
Good for AMD....
I look forward to the power, but I'm just not so sure about the future.
Then you should support companies you have trust in. Simple as that.
I am a complete noob and have no idea what I'm talking about in this area
Agreed.
 
Last edited:
  • Like
Reactions: grandM and JMacHack