
Irishman

macrumors 68040
Nov 2, 2006
3,449
859
It’s more like a fact of life that as a Mac user, you don’t get to design Apple’s golden cage, you only get to live in it.

Apple decided to not upgrade to Blu-ray and rather abandon built-in optical drives altogether and never added drivers for Blu-ray to macOS. If you don’t like this decision, you can leave and live in another (less golden) cage.

The Magic Mouse supports secondary clicks and even supports left-handed operation. A third-party mouse might work or might not, there’s no guarantee either way. Your third mouse button could be just like Blu-ray.

True or false? Your fact of life is your opinion, and none of the rest of us - nor Apple - are bound by it.

I, personally, will go on using my Mac setup, with all of its shortcomings and failings, and have great fun in the process.

Change my mind that I should see my Mac experience through your personal lens, and that yours should be mine as well. Because your Blu-ray defense just didn't get it done for me.

Not by a long shot.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
Your fact of life is your opinion, and none of the rest of us - nor Apple - are bound by it.
I do not have opinions, only analyses to offer for debate.

Apple once tried to license its OS to OEMs, perhaps hoping to become a pure software company just like Microsoft. It brought them to the brink of collapse; they survived only thanks to the second coming of Steve Jobs, who reversed this doomed course and put the tight integration of Apple hardware and software at the center of Apple's business model. As a result, even upper management at Apple is not free to depart from the Apple way, which brought them so much success.

So while Apple isn't completely alien to open standards and a thriving third-party ecosystem, nothing will stop them from innovating in the direction they see fit. Floppy drives gone, optical drives gone, Nvidia gone, OpenGL gone, Intel gone, Boot Camp into Windows gone, 32-bit Mac apps gone. Absolutely nothing will ever stop them. They are the company not afraid to break eggs (and backward compatibility) to make an omelet. You can't even have a headphone jack on an iPhone anymore, even though the original iPhone shipped with one in 2007.
I, personally, will go on using my Mac setup, with all of its shortcomings and failings, and have great fun in the process.
You can decide to view them as shortcomings or disagree with Apple's reasoning on whether they made the right decision. But you cannot deny that there is a fundamental difference between the hundreds of companies that contribute to the development of the PC ecosystem, many of them steering in opposite directions, and Apple's much more controlled, vertically integrated, under-one-roof approach, making everything from the silicon up to the movie subscription service themselves.
Change my mind that I should see my Mac experience through your personal lens, and that yours should be mine as well. Because your Blu-ray defense just didn't get it done for me.
I don't care for your approval. I laid down my train of thought, and instead of a proper rebuttal, you only offer me an "I don't follow"? I think I won that debate.
Not by a long shot.
Let's make it a short shot then. I don't even need Blu-ray; I can argue with AirPrint. When introducing printing to iOS, Apple literally made all previous printer drivers obsolete. So yes, they were that rigid and dismissive of the need for a printer (the printer you already have) to work with the Apple ecosystem. And yes, I even support that decision.

New rule: Your printer must support AirPrint or it won't work with iPhone.

Golden Cage 101: You don't make the rules, Apple makes the rules.
 

Calaveras

macrumors regular
Dec 22, 2021
116
60
I just don't see gamers gravitating towards Macs. They have a perceived price premium. And despite the performance of the M series processors, Macs do not have a reputation for performance.
More importantly, I just don't see games developed for Macs at the same scale that they are for PCs and consoles.
Sure, you can dual boot into Windows, but Windows will not be running natively. And I'm not sure how great the GPU drivers will be in that scenario. PC gamers will stick to building their own rigs because that is a fun hobby.
Developers will continue to code for PCs and consoles because there is still money there.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,146
1,902
Anchorage, AK
It forced you to make, past tense! This is the difference Apple Silicon makes. It allows Apple to build laptops as powerful and persistent as desktops, and desktops as thin, cool and quiet as laptops. The form factor no longer divides computers into performance classes. Only the fastest M1 Ultra doesn't exist in a laptop variant.

No, because hardware and software need to work well together, and macOS (unlike Windows) requires a 218 ppi display for native Retina resolution. So you don't have free choice; there are only a few rather expensive options. That's why it is best to buy your display from Apple. Same with all the Multi-Touch gestures: you want an Apple trackpad, not one from Logitech. If you decided on the Mac platform, you've decided against mixing and matching peripherals from all kinds of PC suppliers.

The problem with this post is that it is based on the false presumption that people need to run a retina-class display externally. While there are elements of the user base who benefit from that (content creation, video editing/creation, photographers, etc.), there is also a sizeable majority of the user base who just use their machines for basic tasks. In those cases, it really is not even necessary to spend the $$ on a 1440p or higher-resolution display, because it won't have enough of a benefit to make it worth the additional cost.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,146
1,902
Anchorage, AK
I do not have opinions, only analyses to offer for debate.

Apple once tried to license its OS to OEMs, perhaps hoping to become a pure software company just like Microsoft. It brought them to the brink of collapse; they survived only thanks to the second coming of Steve Jobs, who reversed this doomed course and put the tight integration of Apple hardware and software at the center of Apple's business model. As a result, even upper management at Apple is not free to depart from the Apple way, which brought them so much success.

So while Apple isn't completely alien to open standards and a thriving third-party ecosystem, nothing will stop them from innovating in the direction they see fit. Floppy drives gone, optical drives gone, Nvidia gone, OpenGL gone, Intel gone, Boot Camp into Windows gone, 32-bit Mac apps gone. Absolutely nothing will ever stop them. They are the company not afraid to break eggs (and backward compatibility) to make an omelet. You can't even have a headphone jack on an iPhone anymore, even though the original iPhone shipped with one in 2007.

You can decide to view them as shortcomings or disagree with Apple's reasoning on whether they made the right decision. But you cannot deny that there is a fundamental difference between the hundreds of companies that contribute to the development of the PC ecosystem, many of them steering in opposite directions, and Apple's much more controlled, vertically integrated, under-one-roof approach, making everything from the silicon up to the movie subscription service themselves.

I don't care for your approval. I laid down my train of thought, and instead of a proper rebuttal, you only offer me an "I don't follow"? I think I won that debate.

Let's make it a short shot then. I don't even need Blu-ray; I can argue with AirPrint. When introducing printing to iOS, Apple literally made all previous printer drivers obsolete. So yes, they were that rigid and dismissive of the need for a printer (the printer you already have) to work with the Apple ecosystem. And yes, I even support that decision.

New rule: Your printer must support AirPrint or it won't work with iPhone.

Golden Cage 101: You don't make the rules, Apple makes the rules.

You always attempt to claim you win arguments by fiat, even when faced with multiple arguments disproving your contentions. You have also inserted pure conjecture into your post (Apple attempting to become a software company in the Mac Clones era), despite Apple actually increasing the number of Mac models it built in that same timeframe. The licensing move was an ill-fated attempt to expand the market share of the Mac OS, although it did lead to some innovation from third parties such as Power Computing with their multiprocessor systems, which made their way back to Macs built by Apple. Also, if someone doesn't follow your arguments, it doesn't mean you "won" the debate (which only you seem to view it as); it means that you lacked clarity of both thought and narrative in your statements.

The difference between the Mac and PC ecosystems is honestly one of simplicity. With the Mac, it's nowhere near as difficult to change course (e.g., the 68k-PPC, PPC-Intel, or Intel-AS transitions) because you don't have hundreds of system builders, component manufacturers, etc. with skin in the game. I would also point out that the vast majority of those companies do NOT influence the OS or software side of things, because they either make components that usually have no drivers for the OS (PSUs, cases, case fans, etc.), or their devices just need a basic driver to function and nothing more. Likewise, the continued focus on backwards compatibility with Windows has hurt the overall momentum of the PC industry in terms of moving to a fully 64-bit OS. Attempting to change direction on the PC side of the industry (whether coming from Microsoft, Intel, AMD, or anyone else in the industry) would be like trying to turn a fully loaded supertanker on a dime in a typhoon.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
The problem with this post is that it is based on the false presumption that people need to run a retina-class display externally.
No, it's based on the right assumption that Apple isn't obliged to offer what you want or need. They can offer whatever they want and only need to satisfy their recurring customers. I'm buying Macs precisely because they have way better displays than I need. It's a luxury brand focused on the user experience.
While there are elements of the user base who benefit from that (content creation, video editing/creation, photographers, etc.), there is also a sizeable majority of the user base who just use their machines for basic tasks.
The experience is still great, regardless of whether you need it to be great. 😆
In those cases, it really is not even necessary to spend the $$ on a 1440p or higher resolution display, because it won't have enough of a benefit to make it worth the additional cost.
What really isn't necessary is Apple offering products for people with no money. 🤑
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
You always attempt to claim you win arguments by fiat, even when faced with multiple arguments disproving your contentions.
I'm still waiting for your arguments.
You have also inserted pure conjecture into your post (Apple attempting to become a software company in the Mac Clones era), despite Apple actually increasing the number of Mac models it built in that same timeframe.
Offering more variants of increasingly less desirable and less profitable products doesn't make you a hardware manufacturer. The hardware business was in trouble, and so they tried to monetize the software directly. It's what happened, not conjecture.
Also, if someone doesn't follow your arguments, it doesn't mean you "won" the debate (which only you seem to view it as); it means that you lacked clarity of both thought and narrative in your statements.
Right. There are no dumb students, only teachers with a lack of clarity. 👩‍🏫
The difference between the Mac and PC ecosystems is honestly one of simplicity. With the Mac, it's nowhere near as difficult to change course (e.g., the 68k-PPC, PPC-Intel, or Intel-AS transitions) because you don't have hundreds of system builders, component manufacturers, etc. with skin in the game.
It's way harder for the Mac development team, because they have no one to copy from. They need to figure it all out by themselves, and that's how you learn which of several ways is the best to implement a certain functionality. Apple might need years to develop a new feature, which then gets hastily copied by Samsung in six months. As a result, the copycats have no understanding of how one feature is integrated with another. Then, when you update one abstraction layer like the silicon architecture, the entire house of cards built upon it will fall.
I would also point out that the vast majority of those companies do NOT influence the OS or software side of things, because they either make components that usually have no drivers for the OS (PSUs, cases, case fans, etc.), or their devices just need a basic driver to function and nothing more.
Every screw within a computer adds size, weight, heat, noise, complexity, airflow and cooling problems. The most valuable stuff in a Mac is all the things that aren't even there, like the fan in a MacBook Air. Just look how many chips got replaced by a system-on-a-chip, each formerly made by a different company with the need to report a profit. You'll never build a small computer with so many vested interests.
Likewise, the continued focus on backwards compatibility with Windows has hurt the overall momentum of the PC industry in terms of moving to a fully 64-bit OS. Attempting to change direction on the PC side of the industry (whether coming from Microsoft, Intel, AMD, or anyone else in the industry) would be like trying to turn a fully loaded supertanker on a dime in a typhoon.
I don't particularly care about the excuses for why the PC can't keep up with Apple's speed of innovation. I just wish to be on the platform that moves forward. The dark side of the industry is dead to me.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,146
1,902
Anchorage, AK
I'm still waiting for your arguments.

Offering more variants of increasingly less desirable and less profitable products doesn't make you a hardware manufacturer. The hardware business was in trouble, and so they tried to monetize the software directly. It's what happened, not conjecture.

Right. There are no dumb students, only teachers with a lack of clarity. 👩‍🏫

It's way harder for the Mac development team, because they have no one to copy from. They need to figure it all out by themselves, and that's how you learn which of several ways is the best to implement a certain functionality. Apple might need years to develop a new feature, which then gets hastily copied by Samsung in six months. As a result, the copycats have no understanding of how one feature is integrated with another. Then, when you update one abstraction layer like the silicon architecture, the entire house of cards built upon it will fall.

Every screw within a computer adds size, weight, heat, noise, complexity, airflow and cooling problems. The most valuable stuff in a Mac is all the things that aren't even there, like the fan in a MacBook Air. Just look how many chips got replaced by a system-on-a-chip, each formerly made by a different company with the need to report a profit. You'll never build a small computer with so many vested interests.

I don't particularly care about the excuses for why the PC can't keep up with Apple's speed of innovation. I just wish to be on the platform that moves forward. The dark side of the industry is dead to me.

You're "still waiting" for my arguments, yet in the same post you poorly attempt to dispute several of the arguments I allegedly never made. Do you always talk out of both sides of your mouth, or is this just your forum persona?

You attempted to double down on your false assertion that Apple wanted to pivot to being just a software company in the clone era, but offer nothing more than a claim it was an attempt by Apple to monetize their software directly. At the time the clones were on the market, System 7 (and later Mac OS 8 and 9) was already monetized, because you had to purchase the new OS. Apple was still charging for new versions of Mac OS into 2013, so monetization existed before, during, and well after the clone era and Steve Jobs' return to the company. The goal at the time Apple started licensing clones was to expand market share of the Mac platform and to gain ground against the behemoth Windows had become in the personal computing space. This is all easily verified by looking at articles written at the time, in addition to statements from Apple itself in that timeframe.

I also find it odd that you refer to the Apple Silicon architecture as an "abstraction layer." Abstraction layers exist to separate the programming side FROM the underlying hardware, not within the underlying hardware itself. Additionally, since thousands of developers were already coding for an ARM-based architecture for the A-series SOCs found in the iPhone and iPad, the switch to Apple Silicon actually simplified cross-platform development, as one code base now works across all products (Mac Pro the sole outlier at this time). This can result in massive cost savings over coding for both AS and Intel, as you can literally write once to compile for the Mac, iPhone, and iPad at the same time.

You also make the claim that "You'll never build a small computer with so many vested interests.", yet we see systems such as the Raspberry Pi and Arduino boards making significant inroads into the hobbyist space, which has a lot of overlap with both the developer and enthusiast/PC builder communities. These devices do have an architecture more similar to Apple Silicon than to x86 systems, yet still utilize some components from third parties, just as the new Macs use some third-party parts as well. This means small computers are in fact being built all the time, which directly refutes your argument to the contrary. Unfortunately on the Raspberry side, demand has been outpacing supply for the last 2+ years. You also have mini consoles such as the NES and SNES Classic, "The C64", and other devices which are using ARM-based architectures or FPGAs to emulate entire consoles on a handful of chips in an extremely small enclosure. In the case of the C64 and similar devices, one can also choose to write programs for those machines, which is another point contradicting your claims.

The market size of Windows-based PCs is the very reason why innovation is so much slower on that side of the PC market. It's significantly harder to row a boat upstream than it is to go downstream, especially when you have multiple obstacles in your path. With the PC side of the industry, you have hundreds of companies who make components for PCs, whether individual chips on the motherboards, plug-in components such as GPUs, RAID cards, USB, HDDs, SSDs, motherboard chipsets, etc. At the same time, certain aspects of the PC industry have massively shrunk in size over the last decade or so. Whereas you once had dozens of companies producing networking components for motherboards in the 2000s, today that number is down to three for all intents and purposes, two of which are Intel and Realtek. The innovation on the networking side now comes from companies that are platform neutral, such as Netgear and Linksys - not Microsoft, Intel, or AMD.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
Do you always talk out of both sides of your mouth, or is this just your forum persona?
There's but one side! Truth is singular. Its "versions" are mistruths.
You attempted to double down on your false assertion that Apple wanted to pivot to being just a software company in the clone era, but offer nothing more than a claim it was an attempt by Apple to monetize their software directly.
I never said they wanted to become a software company. It's just the consequence of how they reacted to the market circumstances. You don't plan to become unsuccessful with your core business.
Apple was still charging for new versions of Mac OS into 2013, so monetization existed before, during, and well after the clone era and Steve Jobs' return to the company.
But not in competition with their own hardware business. The clones got killed off rather quickly. We could argue a lot better if you for once stated what you actually want to prove. My point was that the "tight integration of Apple hardware and software is at the center of Apple's business model", so much so that not even Tim Cook could reverse course and open macOS up to work with arbitrary third-party hardware, even if he wanted to. You can only stand on a stage and tout that only the joint development of hardware and software enables "innovation only Apple can do" for so long before you are bound by your words, and your employees, customers and shareholders expect you to be exactly this innovative company. Not only would it be incredibly stupid to abandon this vertically integrated business model, which proved to be far more profitable and to create much higher customer satisfaction in the long run; it is also impossible and will never happen. So when, from time to time, a Windows user comes along and demands that the Mac market work by the same rules that govern the PC market, the answer is: no. If you want a Mac that works like a PC, then you don't want a Mac and should go buy a PC.
The goal at the time Apple started licensing clones was to expand market share of the Mac platform and to gain ground against the behemoth Windows had become in the personal computing space.
And how is gaining market share and winning on mass-market appeal not a direct copy of the Microsoft business model? Steve Jobs always wanted to build the best computers in the world, not the most. And he was unswerving in his belief that great profits would come from insanely great products. The ability to pair a cheap, good-enough monitor (designed by some third party, even) with a Mac is the antithesis of Apple's corporate identity.
This is all easily verified by looking at articles written at the time, in addition to statements from Apple itself in that timeframe.
And what do you think you proved by that? That Apple isn't a company built on the idea of delivering software and hardware from one hand, and willing to break backward and third-party compatibility for the sake of innovation? Because you failed.

Steve Jobs quoting Alan Kay

You don't win an argument by nitpicking my wording. You win by drafting a better description of the world and how it works, with your own examples.
I also find it odd that you refer to the Apple Silicon architecture as an "abstraction layer." Abstraction layers exist to separate the programming side FROM the underlying hardware, not within the underlying hardware itself.
Everything is a layer. The ARM instruction set is a layer between one specific hardware implementation of these instructions and the machine code running on all ARM CPUs. A higher-level programming language is an abstraction layer between the machine code written for a specific instruction set and a program, which can be compiled to run on different silicon architectures. The OS is another layer between user-facing programs and the programs which control a specific hardware configuration. And so on...

The proper separation of these layers, and well-defined interfaces between them, allows you to upgrade and transition from one hardware platform to another, from one OS version to another, from one set of software to another, and so on: hot-plugging a USB stick without rebooting the whole computer to recognize the new hardware configuration; dragging an app from the Applications folder into the Trash without starting a special uninstaller to clear out all the DLL links; switching from Touch ID to Face ID without once breaking the security chain or demanding upgrades from app developers.
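The idea above can be sketched in a few lines of Python (hypothetical names, not any real Apple API): code above a well-defined interface never touches the implementation below it, so the whole lower layer can be swapped out, HFS+ for APFS, say, without the caller noticing.

```python
from abc import ABC, abstractmethod

class FileSystem(ABC):
    """The interface layer: everything above depends only on this contract."""
    @abstractmethod
    def read(self, path: str) -> bytes: ...

class HFSPlus(FileSystem):
    """One implementation of the layer below the interface."""
    def read(self, path: str) -> bytes:
        return b"data via HFS+"

class APFS(FileSystem):
    """A drop-in replacement for the entire storage layer."""
    def read(self, path: str) -> bytes:
        return b"data via APFS"

def open_document(fs: FileSystem, path: str) -> bytes:
    """The layer above: written once, unchanged across the transition."""
    return fs.read(path)

# Swapping the whole layer underneath is invisible to the caller:
print(open_document(HFSPlus(), "/notes.txt"))  # b'data via HFS+'
print(open_document(APFS(), "/notes.txt"))     # b'data via APFS'
```

The same separation is what lets an OS replace a file system or a driver model without breaking what sits on top, provided the interface itself stays stable.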
Additionally, since thousands of developers were already coding for an ARM-based architecture for the A-series SOCs found in the iPhone and iPad, the switch to Apple Silicon actually simplified cross-platform development, as one code base now works across all products (Mac Pro the sole outlier at this time). This can result in massive cost savings over coding for both AS and Intel, as you can literally write once to compile for the Mac, iPhone, and iPad at the same time.
I know.
You also make the claim that "You'll never build a small computer with so many vested interests.", yet we see systems such as the Raspberry Pi and Arduino boards making significant inroads into the hobbyist space, which has a lot of overlap with both the developer and enthusiast/PC builder communities.
Two stacked USB-A ports on a Raspberry Pi already make it thicker than my 11.5 mm iMac, and there is still no display, speakers, webcam, microphone, keyboard or mouse. Put everything together that's needed for a complete setup and then compare sizes again. Or, if you want to argue that a computer does not necessarily need a 24-inch display with Dolby Atmos, then compare your Raspberry Pi to an Apple Watch with a 1.61-inch display.
These devices do have an architecture more similar to Apple Silicon than to x86 systems, yet still utilize some components from third parties, just as the new Macs use some third-party parts as well.
And they are trash for hobbyists.
This means small computers are in fact being built all the time, which directly refutes your argument to the contrary.
[attached image]

This is your competition and until you migrate the whole Windows ecosystem to something this thin, I won't even grab my caliper gauge. Apple didn't integrate Thunderbolt 4 into USB-C for fun. If there is room for something as large as a USB-A plug, it's already not thin and light enough.
Unfortunately on the Raspberry side, demand has been outpacing supply for the last 2+ years. You also have mini consoles such as the NES and SNES Classic, "The C64", and other devices which are using ARM-based architectures or FPGAs to emulate entire consoles on a handful of chips in an extremely small enclosure. In the case of the C64 and similar devices, one can also choose to write programs for those machines, which is another point contradicting your claims.
A miniature C64 emulator is not the same as an octa-core M1 iPad. Anything that slow can be made very small. Again, compare it to an Apple Watch!
The market size of Windows-based PCs is the very reason why innovation is so much slower on that side of the PC market.
The iPhone is a much bigger market than the PC, and yet Apple switches effortlessly from 32-bit single-core to 64-bit quad-core big.LITTLE CPUs. It's not about the number of devices, but about abstraction layers and vertical integration. A 30% slump in PC sales did nothing to increase the innovation rate.
It's significantly harder to row a boat upstream than it is to go downstream, especially when you have multiple obstacles in your path. With the PC side of the industry, you have hundreds of companies who make components for PCs, whether individual chips on the motherboards, plug-in components such as GPUs, RAID cards, USB, HDDs, SSDs, motherboard chipsets, etc.
There are also hundreds of Apple suppliers, which need to be integrated into one working product. The work is just easier, because there is one central command which sets the goal.
At the same time, certain aspects of the PC industry have massively shrunk in size over the last decade or so. Whereas you once had dozens of companies producing networking components for motherboards in the 2000s, today that number is down to three for all intents and purposes, two of which are Intel and Realtek. The innovation on the networking side now comes from companies that are platform neutral, such as Netgear and Linksys - not Microsoft, Intel, or AMD.
And now try doing some innovation when you can't even be sure that the company which worked on the last version didn't go bankrupt in the meantime! Although Windows tries to stay backward compatible, compatibility between dozens of component makers breaks all the time. So what you end up with is the least common denominator of loosely implemented standards. PS/2 ports forever! Whereas true innovation requires the ability to replace a whole abstraction layer without interfering with anything above or below. Like switching from HFS+ to APFS: most people didn't even notice that they now run on a whole different file system.

During my twelve years on the Mac, Apple changed absolutely everything about the platform: the chip architecture, the instruction set, the boot loader, the programming language, the graphics API, the screen resolution, the file system, the process management, the scroll direction and every single icon. Meanwhile, Windows laptops innovated on being folded into origami mode with a stand.
 

dmccloud

macrumors 68040
Sep 7, 2009
3,146
1,902
Anchorage, AK
There's but one side! Truth is singular. Its "versions" are mistruths.
Singular "truth" would be something such as "2+2=4". When it comes to what you're claiming, a lot of it is subject to interpretation and not necessarily a universally agreed-upon "truth" in the sense you claim it to be.

I never said they wanted to become a software company. It's just the consequence of how they reacted to the market circumstances. You don't plan to become unsuccessful with your core business.

But not in competition with their own hardware business. The clones got killed off rather quickly. We could argue a lot better if you for once stated what you actually want to prove. My point was that the "tight integration of Apple hardware and software is at the center of Apple's business model", so much so that not even Tim Cook could reverse course and open macOS up to work with arbitrary third-party hardware, even if he wanted to. You can only stand on a stage and tout that only the joint development of hardware and software enables "innovation only Apple can do" for so long before you are bound by your words, and your employees, customers and shareholders expect you to be exactly this innovative company. Not only would it be incredibly stupid to abandon this vertically integrated business model, which proved to be far more profitable and to create much higher customer satisfaction in the long run; it is also impossible and will never happen. So when, from time to time, a Windows user comes along and demands that the Mac market work by the same rules that govern the PC market, the answer is: no. If you want a Mac that works like a PC, then you don't want a Mac and should go buy a PC.

And how is gaining market share and winning on mass-market appeal not a direct copy of the Microsoft business model? Steve Jobs always wanted to build the best computers in the world, not the most. And he was unswerving in his belief that great profits would come from insanely great products. The ability to pair a cheap, good-enough monitor (designed by some third party, even) with a Mac is the antithesis of Apple's corporate identity.

Shifting advocacy just weakens your overall arguments, because it indicates you are more concerned with being perceived as "winning" an argument than with defending your original ones. You keep trying to equate the Mac clone era to Microsoft, but until the Windows Phone and Surface devices, Microsoft was never in the hardware business. Microsoft got its foothold in the industry by making its software available across a wide variety of platforms: BASIC on the Altair, the Apple II, the TRS-80, the Laser 128 and CP/M machines, and DOS on x86. They then took to poorly copying System 1 for the Mac by creating the original Windows OS.

Microsoft tried to gain market share not by creating the best products, but by sheer volume, flooding the market with Microsoft-created software. Hell, Microsoft honestly couldn't have cared less about clock speeds, RAM, or graphics performance at the time, which is what so many so-called "experts" in the tech press fixate on. Even now, Microsoft's Surface offerings often lag behind systems from companies such as HP, Lenovo, and Dell when it comes to hardware specs.

And what do you think you proved by that? That Apple isn't a company built on the idea of delivering hardware and software from a single source, and willing to break backward and third-party compatibility for the sake of innovation? Because you failed.

Steve Jobs quoting Alan Kay

You don't win an argument by nitpicking my wording. You win by drafting a better description of the world and how it works, with your own examples.

Since you like to put words into the mouths of others - no, I won't stoop to that level. It's easy enough to use your own words against your arguments. Your perspective is very narrowly tailored and wholly dismissive of any reasoning, facts, etc. which contradict your claims.

Everything is a layer. The ARM instruction set is a layer between one specific hardware implementation of those instructions and the machine code running on all ARM CPUs. A higher-level programming language is an abstraction layer between the machine code written for a specific instruction set and a program, which can be compiled to run on different silicon architectures. The OS is another layer between user-facing programs and the programs which control a specific hardware configuration. And so on...
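The layering idea can be sketched with a toy example (my illustration, not anything Apple ships): the same high-level source runs unchanged whatever ISA sits underneath, because the language layer absorbs the difference.

```python
import platform

def checksum(data: bytes) -> int:
    # Toy function: produces identical results on any CPU architecture,
    # because the language layer hides the ISA underneath.
    total = 0
    for b in data:
        total = (total + b) % 256
    return total

# The host ISA varies ('arm64', 'x86_64', ...); the program's result doesn't.
print(platform.machine())
print(checksum(b"hello"))  # 20 on every architecture
```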

The ISA is as close as you can get to raw silicon unless you're directly accessing the underlying opcodes. Programming languages such as Swift, Java, any C variant, Python, etc. are abstraction layers on top of that ISA.

The proper separation of these layers, and well-defined interfaces between them, allow you to upgrade and transition from one piece of hardware to another, from one OS version to another, from one set of software to another, and so on: hot-plugging a USB stick without rebooting the whole computer to recognize the new hardware configuration; dragging an app from the Applications folder into the Trash without running a special uninstaller to clear out all the DLL links; switching from Touch ID to Face ID without once breaking the security chain or demanding upgrades from app developers.

I know.

Two stacked USB-A ports on a Raspberry Pi already make it thicker than my 11.5 mm iMac, and there is still no display, speakers, webcam, microphone, keyboard or mouse. Put together everything that's needed for a complete setup and then compare sizes again. Or if you want to argue that a computer does not necessarily need a 24-inch display with Dolby Atmos, then compare your Raspberry Pi to an Apple Watch with a 1.61-inch display.

And they are trash for hobbyists.

The OSI model (which is where the various layer definitions come from) has nothing to do with upgrading machines. Hot-swap and plug-and-play capabilities were added to Windows because people wanted the convenience of not having to constantly restart their machines, manually set IRQs and interrupts, or flip tiny DIP switches to configure hardware.

I can do a lot more with a Raspberry Pi than with an Apple Watch: emulate multiple consoles, control and manage smart home devices, set up a media server for my house, run interactive displays for convention centers, use it as a very small PC that can literally be attached to the back of a monitor, etc. You claim they're trash for hobbyists, yet there is a plethora of user groups centered around the Raspberry Pi around the world. Calling them "trash" is just your opinion, not something backed up by facts or sales figures. This is also why your claim that "I do not have opinions, only analyses to offer for debate" is laughable: you attempt to portray your opinions as irrefutable fact, and skirt debate by shifting your advocacy when cornered. You also have a tendency to twist the comments of others into something you can argue against, when more often than not you're not arguing against the substance of the comments as made, but against your misguided interpretations of them.

[attached image]
This is your competition, and until you migrate the whole Windows ecosystem to something this thin, I won't even grab my caliper gauge. Apple didn't integrate Thunderbolt 4 into USB-C for fun. If there is room for something as large as a USB-A plug, it's already not thin and light enough.

A miniature C64 emulator is not the same as an octa-core M1 iPad. Everything that slow can be very small. Again, compare to an Apple Watch!

Again the Apple Watch is irrelevant. Also, attempting to compare something like "The C64" to an iPad is a red herring and not pertinent, because your argument was that people can't make small computers. That was an example along with the Raspberry Pi which demonstrated your point was clearly wrong. Shifting the goalposts after the fact while ignoring your prior statements is disingenuous at best.

The iPhone is a much bigger market than PCs, and yet Apple switched effortlessly from 32-bit single-core to 64-bit quad-core big.LITTLE CPUs. It's not about the number of devices, but about abstraction layers and vertical integration. A −30% slump in PC sales did nothing to increase the innovation rate.

There are also hundreds of Apple suppliers which need to be integrated into one working product. The work is just easier, because there is one central command which sets the goal.

And now try to innovate when you can't even be sure that the company which worked on the last version didn't go bankrupt in the meantime! Although Windows tries to stay backward compatible, compatibility between dozens of component makers breaks all the time. So what you end up with is the least common denominator of loosely implemented standards. PS/2 ports forever! Whereas true innovation requires the ability to replace a whole abstraction layer without interfering with anything above or below it. Like switching from HFS+ to APFS, where most people didn't even notice that they were running on a whole different file system.
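The HFS+ to APFS point can be illustrated with a minimal sketch: a program written against the OS's file API neither knows nor cares which file system sits beneath it, which is exactly why the swap could go unnoticed.

```python
import tempfile
from pathlib import Path

# The pathlib/os interface is the abstraction layer; HFS+, APFS, ext4 or
# NTFS can be swapped underneath without this code changing at all.
with tempfile.TemporaryDirectory() as d:
    note = Path(d) / "note.txt"
    note.write_text("hello")
    print(note.read_text())  # 'hello', regardless of the file system
```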

During my twelve years on the Mac, Apple changed absolutely everything about the platform: the chip architecture, the instruction set, the boot loader, the programming language, the graphics API, the screen resolution, the file system, the process management, the scroll direction and every single icon. Meanwhile, Windows laptops innovated on being folded into origami mode with a stand.


At this point, it's clear you're more concerned with being perceived as "winning" arguments rather than discussing/debating things in good faith, so I'm done with this thread.
 
  • Like
Reactions: Irishman

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
Singular "truth" would be something such as "2+2=4". When it comes to what you're claiming, a lot of it is subject to interpretation and not necessarily a universally agreed upon "truth" in the sense you claim it to be.
Everything is a science. There is no such realm in which truth does not exist or is open for interpretation. All you can do is get a more nuanced understanding of the singular truth. If something is true, other true things don't take away from it, but add color to the overall picture of everything.
Shifting advocacy just weakens your overall arguments, because it indicates you are more concerned with being perceived as "winning" an argument than you are defending your original ones.
As Germans we have compound words to describe absolutely everything. There is no need to ever indicate or hint at anything. I say exactly what I mean, and you should too, so we don't waste each other's time.
You keep trying to equate the Mac clone era to Microsoft, but until the Windows Phone and Surface devices, Microsoft was never in the hardware business.
I equated nothing. The Mac clone era was merely an example of "Classic" Mac OS running on, and being compatible with, non-Apple hardware. Which seems to be the direction you want today's Apple to develop towards. This wasn't then, and isn't now, a viable business model for a company that doesn't hold a firm OS monopoly. Both companies and their market circumstances are very different; what works for one doesn't necessarily work for the other. There is no equation to make.

Microsoft could afford not to have their own hardware business, because there was and is no serious threat to their software monopoly. Windows still holds a healthy 75% global market share. Apple only survived and became profitable by integrating hardware and software in a way no OEM can integrate with Windows. It's not just that too many companies can't agree on anything. A single hardware company also couldn't attempt a path of innovation that requires a complete rewrite of a major part of Windows. And even if they tried to team up, they'd find different corporate cultures, no experience of software and hardware people working together, and legacy products which were designed independently of each other.

People who buy Macs buy them for what is great about them, not for what is not so great about them. And what's great comes from what Apple does differently from the rest of the PC industry. Think different.™ is not just a marketing slogan. When Apple thought it could license its OS out like Microsoft, it almost killed them. And whenever Microsoft thought it could build a Zune or a Surface or a Windows Phone, they failed miserably.
Microsoft got their foothold in the industry by making DOS available across a wide variety of platforms, including x86, the Apple II, CP/M machines, the Laser 128, TRS-80, and the Altair. They then took to poorly copying System 1 for the Mac by creating the original Windows OS.
That's the facts, now shower me with your insights.
Microsoft tried to gain market share not by creating the best products, but by sheer volume, flooding the market with Microsoft-created software. Hell, Microsoft honestly couldn't have cared less about clock speeds, RAM, or graphics performance at the time, which is what so many so-called "experts" in the tech reporting industry fixate on. Even now, Microsoft's Surface offerings often lag behind systems from companies such as HP, Lenovo, and Dell when it comes to hardware specs.
More facts. But why did they win? Because they lifted from hardware makers the need and burden to write their own software. Even Apple was initially a hardware-only company. When they released the first Macintosh, it came with MacWrite and MacPaint to showcase its fonts and graphics, but there was no productivity software, for which they relied on Microsoft (not yet called Office). Bill Gates's threat to "stop all software development for the Mac" even helped Microsoft get away with theft. And later Microsoft exploited their Windows monopoly to dictate which other software OEMs could sell preinstalled.

The Apple vs. Microsoft GUI Lawsuit

Apps like Pages, Numbers, Keynote, Safari and even Apple Maps exist not only to increase user value and customer loyalty; they are an insurance against any future blackmail attempts. The formative years of Apple's corporate character were marked by extortion, exploitation and copying. They didn't become closed, secretive and mistrusting for nothing. The PC industry is like swimming in a shark tank.
The ISA is as close as you can get to raw silicon unless you're directly accessing the underlying opcodes.
As close as you can get with software, but hardware implementation efficiency gains exist too. CISC was an obstacle to better hardware solutions, hence RISC and ARM. Once you have a much simpler instruction set to operate with, your chip designers can start to build a whole different CPU logic. Remember when tick-tock was Intel's release cycle for CPUs? The tocks were the microarchitecture changes, always a little less exciting than the process node shrinks, the ticks. The M1 was a boisterous tock, reminding everyone that chip design is good for more than measly 5% gains. But to make this possible, you need to recompile the entire OS, add a new compile target to Xcode and build a Rosetta 2 abstraction layer for legacy software. And you need to communicate and "sell" the transition to developers and customers. It's a giant effort inside and outside the realm of chip making itself, even if only one company is doing all the heavy lifting.
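The Rosetta 2 idea, translating binaries for one ISA into another ahead of execution, can be caricatured in a few lines. This is a toy accumulator machine of my own invention, nothing like Apple's actual implementation; the opcode names are made up for illustration.

```python
# Hypothetical 1:1 mapping from a "legacy" ISA to a "native" one.
LEGACY_TO_NATIVE = {
    "LOAD": "ldr",
    "ADD": "add",
    "STORE": "str",
}

def translate(legacy_program):
    """Ahead-of-time translation: legacy opcodes -> native opcodes."""
    return [(LEGACY_TO_NATIVE[op], arg) for op, arg in legacy_program]

def run_native(program):
    """Tiny accumulator machine executing 'native' instructions."""
    acc, mem = 0, {}
    for op, arg in program:
        if op == "ldr":
            acc = arg
        elif op == "add":
            acc += arg
        elif op == "str":
            mem[arg] = acc
    return mem

legacy = [("LOAD", 40), ("ADD", 2), ("STORE", "result")]
native = translate(legacy)   # translation happens once, up front
print(run_native(native))    # {'result': 42}
```

The point of the sketch: after the one-time translation, the program runs entirely as "native" instructions, which is why translated binaries can run at near-native speed.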
The OSI model (which is where the various layer definitions come from) has nothing to do with upgrading machines.
I'm not remotely interested in the network-architecture use of the term. I'm using 'abstraction layer' literally, to describe how an ISA connects software with hardware. It used to be that machine code was written for one specific chip and wouldn't even run on its direct successor built by the same company. ppc64, x86_64 and arm64 describe families of chips, sometimes made by different companies, which all communicate via the same instruction set. So the instruction set functions as an abstraction layer. You don't need to know whether the Mac has an M1, M2 or M3 chip, as long as your software was compiled for Apple Silicon.
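A small sketch of the "compile for the family, not the chip" point. The mapping below is my own illustration, using the strings `platform.machine()` actually reports on various systems as stand-ins for compile targets.

```python
import platform

# Software targets the ISA family; the concrete core behind it is invisible.
ISA_FAMILY = {
    "arm64": "arm64",    # e.g. Apple M1/M2/M3 (macOS reports 'arm64')
    "aarch64": "arm64",  # same family, as reported on Linux
    "x86_64": "x86_64",  # e.g. any 64-bit Intel or AMD core
    "AMD64": "x86_64",   # same family, as reported on Windows
}

def target_family(machine: str) -> str:
    return ISA_FAMILY.get(machine, "unknown")

print(target_family("arm64"))             # arm64
print(target_family(platform.machine()))  # the host's family
```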

The ability not only to advance technology on either side of an abstraction layer, but to swap out the abstraction layer itself for a more modern one (which requires a joint effort of hardware and software development) is what differentiates Apple innovations from PC innovations. You can achieve so much more in performance and energy efficiency when you're able to drop x86 itself. Yes, this breaks native support for all triple-A games written so far, but it also unleashes the frame rate for all triple-A games written in the future for arm64.
Hot-swap and plug-and-play capabilities were added to Windows because people wanted the convenience of not having to constantly restart their machines, manually set IRQs and interrupts, or flip tiny DIP switches to configure hardware.
You can call it convenience, an interface or an abstraction layer. What it comes down to is the ability to change the hardware configuration without crashing the software.
I can do a lot more with a Raspberry Pi than with an Apple Watch: emulate multiple consoles, control and manage smart home devices, set up a media server for my house, run interactive displays for convention centers, use it as a very small PC that can literally be attached to the back of a monitor, etc.
You can do none of that without a display. And once you add a display the size of an iPhone's, your iPhone can do all of that in a much smaller package. I don't need to attach anything to the back of my iMac; it's a monitor and a quad-core computer in one.
You claim they're trash for hobbyists, yet there is a plethora of user groups centered around the Raspberry Pi around the world. Calling them "trash" is just your opinion, not something backed up by facts or sales figures.
Sure it is. Compare the sales figures of iPhones and Raspberry Pis and tell me what the average non-hobbyist user chooses to control their smart home. iOS simplicity wins by many magnitudes, so much so that the Raspberry Pi can be ignored as a rounding error. The competition is between iOS and Android, with no third horse in the race.
This is also why your claim that "I do not have opinions, only analyses to offer for debate" is laughable: you attempt to portray your opinions as irrefutable fact, and skirt debate by shifting your advocacy when cornered.
Please, corner me! To describe a display-less, case-less circuit board as trash was benevolent of me. With a sub-1% market share it's practically nonexistent.
You also have a tendency to twist the comments of others into something you can argue against, when more often than not you're not arguing against the substance of the comments as made, but against your misguided interpretations of them.
Says the guy who brought up the OSI model, which nobody talked about.
Again the Apple Watch is irrelevant. Also, attempting to compare something like "The C64" to an iPad is a red herring and not pertinent, because your argument was that people can't make small computers.
First of all: what's a computer? And who are you to decide on the relevance of Apple Watches and Raspberry Pis! And I didn't say PC guys can't make small computers; they can't make small yet powerful computers. For that you need energy efficiency, and for that you need to be able to leave x86 behind in favor of ARM. Anyone can build a small Intel laptop with cores that throttle down when used. If it doesn't need to run an i9, it can fit into a super small enclosure.
That was an example along with the Raspberry Pi which demonstrated your point was clearly wrong. Shifting the goalposts after the fact while ignoring your prior statements is disingenuous at best.
Maybe you didn't understand my prior statement, because you were distracted by something going on in your head and not mine?
At this point, it's clear you're more concerned with being perceived as "winning" arguments rather than discussing/debating things in good faith, so I'm done with this thread.
Perceived by whom? There's nobody else left in this forum but you and me. So maybe you want to win this argument at all costs!
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
Remember when this thread was about the updates to Metal announced at WWDC last year? Now the entire thread has devolved into arguments about Mac gaming being dead or not, and then into Mac vs PC crap.
So RE4 for WWDC this year? Along with a release date for NMS plus the VR version for RealityOS?
 

Pressure

macrumors 603
May 30, 2006
5,182
1,545
Denmark
May 2023 Steam Survey puts Apple Silicon at 56.04% (+1.11%) market share on macOS.



SoC     | Market share | Change
M1      | 30.77%       | +0.13%
M1 Pro  | 9.49%        | +0.14%
M1 Max  | 3.46%        | +0.03%
M2      | 9.57%        | +0.91%
M2 Pro  | 1.95%        | +0.89%
M2 Max  | 0.81%        | +0.29%

We can see the share of M2 based Apple Silicon machines increasing compared to the April 2023 survey.
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
May 2023 Steam Survey puts Apple Silicon at 56.04% (+1.11%) market share on macOS.


SoC     | Market share | Change
M1      | 30.77%       | -0.23%
M1 Pro  | 9.49%        | -0.21%
M1 Max  | 3.46%        | -0.53%
M2      | 9.57%        | +0.91%
M2 Pro  | 1.95%        | +0.89%
M2 Max  | 0.81%        | +0.29%

We can see the share of M2 based Apple Silicon machines increasing compared to the April 2023 survey.
I wonder how long till Apple Silicon is 100%.
 

Nugat Trailers

macrumors 6502
Dec 23, 2021
297
576
I wonder how long till Apple Silicon is 100%.
Being serious, probably a fair while. It's only been really recently that the 2013 MacBook Airs running nVidia dropped off the GPU charts.

I can see ASI at about 62-65% in November, and hitting 75% in mid next year, but that last 25%'ll be a problem.
 
  • Like
Reactions: Irishman

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
Probably the biggest news for AAA gaming on the Mac. This is direct acknowledgement that Apple wants AAA games.

[attached image]
It will be interesting to see how much "heavy lifting" this does in helping games get "ported" over. Just trying to picture how useful this would be for, say, CD Projekt Red bringing Cyberpunk 2077 to macOS. Or is this more for devs like @Ethosik, whose games mostly already work on macOS, and this is just to get them over the "last hurdle"?
 
  • Like
Reactions: Irishman

Longplays

Suspended
May 30, 2023
1,308
1,158
May 2023 Steam Survey puts Apple Silicon at 56.04% (+1.11%) market share on macOS.


SoC     | Market share | Change
M1      | 30.77%       | -0.23%
M1 Pro  | 9.49%        | -0.21%
M1 Max  | 3.46%        | -0.53%
M2      | 9.57%        | +0.91%
M2 Pro  | 1.95%        | +0.89%
M2 Max  | 0.81%        | +0.29%

We can see the share of M2 based Apple Silicon machines increasing compared to the April 2023 survey.
If the title had been "In 3 years, 50% of all Macs capable of playing AAA games will be Apple Silicon", then the OP would be on point.
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
The fact that Apple is offering a Porting Toolkit is likely also an acknowledgement of developers' general lack of interest so far.
Sure, but developer tools + large install base is how you get them to bring AAA games to Macs.

The install base is growing fast. The tools are developing.

The next few years will be significantly better than the last few years.

When I first wrote the original post, I never thought that Macs would be powerhouses for AAA games right away. All I saw was the potential.
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
Sure, but developer tools + large install base is how you get them to bring AAA games to Macs.

The install base is growing fast. The tools are developing.

The next few years will be significantly better than the last few years.

When I first wrote the original post, I never thought that Macs would be powerhouses for AAA games right away. All I saw was the potential.
I still don't get how Apple could be 43% of the gaming market yet the Steam hardware survey numbers don't reflect that at all.
 