
IG88

macrumors 65816
Nov 4, 2016
1,116
1,645
Still, the earlier suggestion of ad blockers like uBlock Origin at all levels (and, lest I forget, a DNS server you might explore trying out sometime under your Network settings — 76.76.2.2 is an excellent one to begin with) will relieve a lot of that memory pressure, since your system won’t be loading all that superfluous, disruptive guff in the first place.

For fun, you ought to try the lightweight, Mozilla-based browser SeaLion by MR forum’s own wicknix.

I use Wipr, seems like I had issues years back with uBlock support for Safari being slow or going away.

Maybe I'll check out SeaLion.

I think I'm using Google DNS currently (ISP is Google Fiber) but yeah, perhaps I should look at alternatives.
 
newsflash: mechanical drives have finite life and go bad after x-amount of use too!

personally, i'm not even sure if a typical HDD will outlive an SSD when used as your long term daily drivers.
i think i'd actually trust a "good" SSD more than a HDD for this.

Consumables have limited lifespans. All of them. Even, yes, RAM. Solid-state storage has a pre-calculated mean time between failures, based on the type of NAND it uses and, thus, the maximum number of writes for that type of NAND. It’s fairly cut and dried.
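To put that "cut and dried" endurance math in concrete terms, here is a minimal sketch (the figures are hypothetical examples, not taken from any real datasheet): a drive's rated terabytes written (TBW) divided by average daily writes gives a rough expected lifespan, and heavy daily writes shrink it dramatically.

```python
# Rough SSD lifespan estimate from a TBW endurance rating.
# All numbers below are hypothetical, purely for illustration.

def ssd_lifespan_years(tbw_rating_tb: float, daily_writes_gb: float) -> float:
    """Years until the rated write endurance is exhausted."""
    total_writes_gb = tbw_rating_tb * 1000       # TB -> GB
    days = total_writes_gb / daily_writes_gb     # days of writing at that rate
    return days / 365

# A hypothetical 1 TB consumer drive rated for 600 TBW:
light = ssd_lifespan_years(600, 20)    # light desktop use, ~20 GB/day
heavy = ssd_lifespan_years(600, 400)   # RAM-starved machine swapping heavily

print(f"light use: ~{light:.0f} years; heavy swap: ~{heavy:.0f} years")
```

The point isn't the exact numbers (real drives and workloads vary widely); it's that the write budget is fixed and knowable in advance, unlike a spinner's wear.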

With spinning rust, additional factors are involved (including frequency of shock, strong magnetic fields, ambient operating/storage temperature, humidity, and environmental air contaminants). But there are three things working in a spinner’s favour, if a user strives to reduce or eliminate most, if not all, of those factors.

One, the maximum number of reads/writes per magnetic sector isn’t as calculable as the hard-and-fast rule around NAND writes. Additionally, other manufacturer design decisions may alter that fate somewhat (including some more radical measures, such as sealing an inert gas within the casing).

Two, in the event of SSD failure, it is often complete and instant, with zero recovery possible (which I have experienced once, on an OEM Apple SSD within a rMBP). With the failure of rust, unless the controller has failed (which I have experienced once), then data may be recoverable/salvageable. No, salvaged data might not be a joy, but it may be enough to save critical data which were not yet backed up elsewhere.

And three, a well-cared for HDD, housed in an optimal environment, can last much, much longer than either expected or even warrantied by the original manufacturer. Yes, that WD Black HDD may have been sold with a five-year warranty, but at twelve-plus years, it may still be going strong, with S.M.A.R.T. monitor status only showing number of hours in use being on the high side, with all other metrics showing considerable life left. Again, SSDs, especially when used in heavy-swap settings, such as on a system with unusually low RAM for the OS and applications it was designed to run, will wear out much more quickly and with far less advance notice before failure.

Each, at this time, has a valid use-case, but that valid use-case needs to account for the external factors involved. Apple, still selling systems with only 8GB RAM, know better than to do this, but also know that, shy of trade regulations and enforcement of those regulations, doing this pleases shareholders, at the cost of both this finite, straining planet and of consumers who discover, not too far down the road following purchase, that they were sold an under-equipped product with no fallback or recourse for redressing that shortfall, rendering it as good as near-junk, if not junk.
 
Last edited:
I use Wipr, seems like I had issues years back with uBlock support for Safari being slow or going away.

I’m not familiar with Wipr, but it appears to be a standalone app which approaches the tracking/blocking at the end point, and not at the point before all of that arrives at your system in the first place.

For a couple of years, many moons ago, I got used to using Safari, so I remember how weird it was to switch once I did (volleying back and forth between Chrome/Chromium and Firefox, then eventually finding Firefox the less problematic of the two). I encourage you to try the current Firefox in combination with the current uBlock Origin, to see how that feels as a test drive.

Maybe I'll check out SeaLion.

If, for no other reason, it gives you a new browser to try out — one which is really light on system resources.

I think I'm using Google DNS currently (ISP is Google Fiber) but yeah, perhaps I should look at alternatives.

Ooof. Yah. Google are an advertising company. Their DNS servers are certainly not going to pre-emptively block tracking or ads!

Have a look at the link I shared earlier (the ControlD link), which at this time seems to be the best at blocking all that stuff before it even has a chance to be transmitted over to your Mac or home network. (Before ControlD, I had been using the AdGuard-related DNS IPs until I began to notice ads leaking through like a drafty biohazard lab — only to learn AdGuard was now owned by… Amazon 🙃 ).
 
Last edited:
  • Like
Reactions: IG88

DCBassman

macrumors 6502a
Oct 28, 2021
755
577
West Devon, UK
The annual major version cycle of OS X/macOS is the brainchild of Craig Federighi (i.e., the photogenic salt-n-pepper guy with the dark caterpillar eyebrows seen in many of Apple’s keynotes).

Federighi took over Bertrand Serlet’s role as SVP of software engineering in 2011. (Serlet was the captain behind the development of Tiger, Leopard, and Snow Leopard). Federighi’s influence was immediate, starting with his (late entry into leading) development of Lion, onward.

Lion not only was a complete re-engineering of the operating system as a business model (and in how it closed out more of the open source software community, further making other components of the OS more hands-off as firmware updates locked down hardware), but it also marked when major software versions moved to a once-yearly cycle still in place today, as orchestrated by… Craig Federighi.

While I know some folks gripe reflexively about Tim Cook-this or even Jony Ive-that, my grievance with Apple for these last dozen years is, principally, in Federighi’s responsibility for (and piloting of) accelerating a cycle of planned obsolescence by at least double (when factoring the averaged lifespans of Tiger, Leopard, and Snow Leopard — all being much more complex operating environments which had to accommodate more architectures (four, in the case of Leopard) and 32- and 64-bit processors, and all being given time for major issues to be worked out before moving on to the next major version).

Periodically, I estimate where we would be with macOS versions in present day, had Federighi not come in and created this synthetic, once-per-annum major version cycle. Instead of discussing macOS 14 Sonoma right now, we would probably be discussing an upcoming macOS 10.13 High Sierra as the latest version as, averaged, each version between Lion and that hypothetical High Sierra would have had ~24 months to mature and stabilize.

Of course, between that and Moore’s Law’s slowdown, consumers would also be keeping their systems longer, “hurting” the Apple investor quarterly statement. Federighi’s approach has assured this won’t be happening as long as Apple sticks to this once-per-annum pattern.
Thanks, got it! So is he also the one that put hardware acceleration and the Metal API in, as a further tool of obsolescence?
 

ToniCH

macrumors 6502a
Oct 23, 2020
737
934
newsflash: mechanical drives have finite life and go bad after x-amount of use too!

personally, i'm not even sure if a typical HDD will outlive an SSD when used as your long term daily drivers.
i think i'd actually trust a "good" SSD more than a HDD for this.
Yeah, bought my first HDD in '89 and have owned hundreds ever since. And sold thousands. I still have some very old ones that work, and all in all I've lost very few over the years. I've actually lost more SSDs in the last 15 years than HDDs ever, and I've owned far fewer SSDs over the years.

But that is beside the point. I was advocating that more RAM is better than less RAM. Using swap will kill any drive faster than not using swap. And with SSDs, we know that writing to one cell over and over again will kill that cell, and once you have enough bad cells the whole drive is dead well before all of it is actually worn out. And with new computers you can no longer replace the SSDs, as they are soldered in. In some failure modes with present MacBooks you cannot even boot from an external drive if the built-in SSD fries. The MBA/MBP is then literally scrap. And this end result is just a matter of time. Quite a difference from older machines with replaceable drives. Hence, IMHO, it's better to max the RAM when possible. You just might stay out of trouble longer.
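The "dead well before it's actually all dead" point can be sketched with a toy simulation (purely illustrative assumptions; real controllers do far smarter wear levelling and error correction): each cell tolerates a fixed number of program/erase cycles, and the drive is declared dead once a small spare pool is exhausted, while the vast majority of cells are still healthy.

```python
import random

# Toy model of NAND wear-out. Cell count, spare pool size, and cycle limit
# are made-up numbers, not taken from any real drive.

def writes_until_failure(cells: int, spares: int, pe_cycles: int,
                         seed: int = 0) -> tuple[int, float]:
    """Total writes absorbed before failure, and fraction of cells worn out."""
    rng = random.Random(seed)
    wear = [0] * cells
    dead = 0
    writes = 0
    while dead <= spares:           # drive fails when the spare pool runs out
        i = rng.randrange(cells)    # random placement stands in for levelling
        wear[i] += 1
        writes += 1
        if wear[i] == pe_cycles:    # this cell just hit its cycle limit
            dead += 1
    return writes, dead / cells

writes, worn = writes_until_failure(cells=1_000, spares=20, pe_cycles=300)
print(f"failed after {writes:,} writes, with only {worn:.1%} of cells worn out")
```

With these toy numbers the drive is scrap while roughly 98% of its cells still have life left, which is the argument for keeping swap traffic (and hence total writes) down in the first place.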
 

retta283

Suspended
Jun 8, 2018
3,180
3,482
And three, a well-cared for HDD, housed in an optimal environment, can last much, much longer than either expected or even warrantied by the original manufacturer. Yes, that WD Black HDD may have been sold with a five-year warranty, but at twelve-plus years, it may still be going strong, with S.M.A.R.T. monitor status only showing number of hours in use being on the high side, with all other metrics showing considerable life left. Again, SSDs, especially when used in heavy-swap settings, such as on a system with unusually low RAM for the OS and applications it was designed to run, will wear out much more quickly and with far less advance notice before failure.
This is one of the reasons you are advised not to defrag an SSD: it does nothing but slowly destroy the drive with excessive write cycles. The NAND has a much more hard-and-fast limit on cycles than a mechanical drive, which usually either fails early due to defects or later down the road due to general mechanical failure, which is more impacted by factors like temperature and vibration than by the amount of written data.

I've seen WD Blacks get hundreds of thousands more in read/write cycles than they were rated for, same with power-on hours. A well taken care of mechanical HDD is extremely stable and reliable assuming it wasn't defective in the first place. SSDs are better in that they require less control over external factors, but are worse in raw survival on merits of drive wear, and same for cold storage. I see absolutely zero reason to replace an HDD with an SSD in a stable environment unless there are performance benefits involved. Clearly a modern computer is better with an SSD, but it does not eliminate the utility of a mechanical drive in other applications, where access times are irrelevant.
 

MRMSFC

macrumors 6502
Jul 6, 2023
369
379
Thanks, got it! So is he also the one that put hardware acceleration and the Metal API in, as a further tool of obsolescence?
The Metal API was created because OpenGL was stagnant at the time. It came to macOS as a product of Apple merging their OS teams to simplify development across platforms.
 
  • Like
Reactions: ArkSingularity
The Metal API was created because OpenGL was stagnant at the time. It came to macOS as a product of Apple merging their OS teams to simplify development across platforms.

The merging of the OS teams followed once Federighi assumed oversight over iOS/iPadOS when Scott Forstall was removed in 2012 (during the Mountain Lion era). Forstall was blamed, amongst other things, for iOS 6’s Maps feature being underdeveloped, but he was also the chief proponent/advocate for skeuomorphic UI/UX (which really began to fall away after Forstall left Apple in late 2012).

Federighi then unified macOS, iOS/iPadOS, watchOS, and tvOS all under one tent — his — much the way Ive attempted to unify hardware form. The “iOSification” of OS X/macOS is a Federighi-era decision (or, it was one which Federighi got Cook to approve formally). Under Federighi, the under-the-bonnet functions of macOS have gotten ever-more murky and more difficult for outside parties to review.

The Federighi era, in time, will probably be recognized as a mixed, if not troubling legacy — not unlike that of Alfred P. Sloan’s impact on the age of manufacturing.
 
  • Like
Reactions: FeliApple

MRMSFC

macrumors 6502
Jul 6, 2023
369
379
The merging of the OS teams followed once Federighi assumed oversight over iOS/iPadOS when Scott Forstall was removed in 2012 (during the Mountain Lion era). Forstall was blamed, amongst other things, for iOS 6’s Maps feature being underdeveloped, but he was also the chief proponent/advocate for skeuomorphic UI/UX (which really began to fall away after Forstall left Apple in late 2012).

Federighi then unified macOS, iOS/iPadOS, watchOS, and tvOS all under one tent — his — much the way Ive attempted to unify hardware form. The “iOSification” of OS X/macOS is a Federighi-era decision (or, it was one which Federighi got Cook to approve formally). Under Federighi, the under-the-bonnet functions of macOS have gotten ever-more murky and more difficult for outside parties to review.

The Federighi era, in time, will probably be recognized as a mixed, if not troubling legacy — not unlike that of Alfred P. Sloan’s impact on the age of manufacturing.
Controversial take:

The iOS-ification of macOS was inevitable.

The iPhone and iPad are far, far more widespread among people than the Mac ever was. And we’re entering an age where adults grew up in a world dominated by smartphones. To the point where they’re having to teach some high school/college students about file systems.

Making macOS at least visually similar and having the same features across the platforms is just common sense.

As for the move away from skeuomorphism, trends come and go. I could write an essay on the “Frutiger Aero” era of graphics, but I don’t think it’s appropriate here.
 

Silly John Fatty

macrumors 68000
Nov 6, 2012
1,802
512
(attached image: IMG_1581.jpeg)


I know, and I refuse to wear one!
 

FeliApple

macrumors 68040
Apr 8, 2015
3,684
2,088
The annual major version cycle of OS X/macOS is the brainchild of Craig Federighi (i.e., the photogenic salt-n-pepper guy with the dark caterpillar eyebrows seen in many of Apple’s keynotes).

Federighi took over Bertrand Serlet’s role as SVP of software engineering in 2011. (Serlet was the captain behind the development of Tiger, Leopard, and Snow Leopard). Federighi’s influence was immediate, starting with his (late entry into leading) development of Lion, onward.

Lion not only was a complete re-engineering of the operating system as a business model (and in how it closed out more of the open source software community, further making other components of the OS more hands-off as firmware updates locked down hardware), but it also marked when major software versions moved to a once-yearly cycle still in place today, as orchestrated by… Craig Federighi.

While I know some folks gripe reflexively about Tim Cook-this or even Jony Ive-that, my grievance with Apple for these last dozen years is, principally, in Federighi’s responsibility for (and piloting of) accelerating a cycle of planned obsolescence by at least double (when factoring the averaged lifespans of Tiger, Leopard, and Snow Leopard — all being much more complex operating environments which had to accommodate more architectures (four, in the case of Leopard) and 32- and 64-bit processors, and all being given time for major issues to be worked out before moving on to the next major version).

Periodically, I estimate where we would be with macOS versions in present day, had Federighi not come in and created this synthetic, once-per-annum major version cycle. Instead of discussing macOS 14 Sonoma right now, we would probably be discussing an upcoming macOS 10.13 High Sierra as the latest version as, averaged, each version between Lion and that hypothetical High Sierra would have had ~24 months to mature and stabilize.

Of course, between that and Moore’s Law’s slowdown, consumers would also be keeping their systems longer, “hurting” the Apple investor quarterly statement. Federighi’s approach has assured this won’t be happening as long as Apple sticks to this once-per-annum pattern.
Agreed, completely. It’s sad that the only thing standing between users and the devices they like, on the software versions they like, is Apple and their yearly major update policy.

You can’t fight it. There’s nothing you can do. Older OS X and iOS versions struggle more and more as time goes by. You can have a flawless device, but you can’t do anything if your version of the Operating System isn’t supported.

You can eschew security, but software itself just won’t run. With some luck, it stops receiving updates but older versions work. With worse luck, it stops working.

I have my favourite Mac ever, my 2015 MacBook Pro. It runs OS X El Cap, because of course it does. To give one example, I can’t update Google Chrome anymore; support drops off just way too quickly. It’s even quicker on iOS. You can’t update Safari, as it is bundled with iOS updates, which obliterate devices. iOS 12, the 2018 release, already has some issues in Safari. Some websites don’t render; others directly put up a popup saying “your browser is unsupported”.

I’d use my favourite iPhone ever, my iPhone 6s on iOS 10, forever, but iOS 10 lacks too much support at this point.

I’d use devices for a very long time, because I like them and I don’t care about newer devices as long as my devices work properly, but this yearly release schedule directly undermines the longevity I try to give to my devices. Older Apple software versions lose support for everything. It’s sad, because Apple’s hardware and original software designed for that device is complete and utter perfection. Their own policy undermines that.
 
Controversial take:

The iOS-ification of macOS was inevitable.

The iPhone and iPad are far, far more widespread among people than the Mac ever was. And we’re entering an age where adults grew up in a world dominated by smartphones. To the point where they’re having to teach some high school/college students about file systems.

Making macOS at least visually similar and having the same features across the platforms is just common sense.

As for the move away from skeuomorphism, trends come and go. I could write an essay on the “Frutiger Aero” era of graphics, but I don’t think it’s appropriate here.

Well…

You had my attention until you used the phrase “common sense” — which I always parse as shorthand for, “It’s too hard to think about this stuff critically and thoughtfully, so just lean on what ‘feels right’.” People get hurt being on the short end of someone else’s “common sense” fallacy.

Otherwise, you’re correct: yours is a controversial take.

It would be no more controversial for Apple, under Spindler and Amelio in the ’90s, to have tried to “Newtonify” Mac OS because, well, it was newer and offered a different, novel kind of UI/UX from Mac OS. That, for maaaany reasons, did not happen, but attempts in that direction would have been probably no less difficult and no more fraught with problems which, in the end, would have kneecapped the much more robust platform, even as the newer platform continued to mature.

That’s where we are now (and where we have been since late 2011) with iOSifying macOS.

The only reason this synthetic merger feels “normal” or, ahem, “common sense” is because Apple, under Federighi’s oversight, have been pressing for this since 2011. They have been trying all sorts of ways to make this version of fetch happen. And yet, some aspects of UI/UX and function will never work comparably across this chasm. They remain two discrete paradigms: one, with a cursor and physical keyboard, the other with a capacitance-based digitizer and incredibly crafty, but still limited fingertip gestures and (optional, always optional, for additional cost) human interface devices.

The problem with this rationale is the fallacy that tablets/glass phones (they’re no more “smart” than any other CPU-based device) and full, purpose-built computers are one and the same (or should be one and the same) solely because they use a CPU, a GPU, or both; use a display; and sport a graphical user interface. Their use-cases remain discrete.

For instance, try using your MacBook Pro with Apple CarPlay. Try writing raw software code — or even Xcode — on an iPhone or iPad. Try using NFC-based Apple Pay from a MacBook Air. Theoretically, all may be possible, but none is practical.

Certain features and software will have overlap (drawing with a WACOM tablet on macOS and drawing with Apple Pencil on an iPad Pro, or watching Netflix, for instance), but one does not replace the other because each are different tools for different jobs and, often enough, for different ends. One is more limited than the other, but the trade-off is in sheer portability and purpose of function. The Mac(intosh) was and remains Apple’s unbroken line of GUI-based microcomputers (in the very classic sense), whereas the iPhone and, later, iPad trod new ground. Likewise, AppleTV/tvOS is a different paradigm from either iOS/iPadOS or macOS. And watchOS… again, it strives to do more like an iPhone or iPad, but its use-case remains unique to its (predetermined) form factor.

With technological evolution, a general trend is not to attempt to merge different paradigms so much as to further specialize existing tech to fill niche solutions. The evolution of iOS, iPadOS, tvOS, and watchOS, from OS X/macOS is a “natural” evolution of this phenomenon at work. Apple, under Federighi and Cook, are now striving to swim upstream in the opposite direction of this evolution, and it’s not organic in the slightest.

Had iPhone been, from inception, developed as a strict parallel of OS X (even as iOS 1 was developed from a heavily modified Leopard), without “jailing” the OS or developing “walled gardens” and the like, then this might have been a different discussion. But from the outset, iPhone was a clean-sheet invention discrete from a Mac because it could do things a Mac could not, and vice-versa.

The problem is there is nothing “organic” or “common sense” about trying to treat these different tools as the same tool or to merge them as such, because they are neither marketed as variations on the same tool nor are they fundamentally designed to operate in exactly the same situations, software, provisions, or circumstances.
 
It’s sad, because Apple’s hardware and original software designed for that device is complete and utter perfection. Their own policy undermines that.

You hit on an important point here:

Apple, at least until fairly recently, designed and built Mac hardware to be robust and to last much longer than what competitors offered along the same lines. That, partly, is what elevated the Apple brand’s cachet during this century, as ushered in by an older, less mercurial Jobs. And that build quality can, in certain Mac products, still be found — even as Apple now lock down so much of what goes inside.

Meanwhile, Apple develop system software to be as cheap and disposable as it can be, in the name of baking in consumer product turnover. Again, I put this on Craig Federighi’s lap.

These objectives, when divorced from shareholders and profit/loss quarterly statements, clash fundamentally against one another. Which is, frankly, illogical as heck.
 

xantufrog

macrumors regular
Jul 7, 2023
130
135
Consumables have limited lifespans. All of them. Even, yes, RAM. Solid-state storage has a pre-calculated mean time between failures, based on the type of NAND it uses and, thus, the maximum number of writes for that type of NAND. It’s fairly cut and dried.

With spinning rust, additional factors are involved (including frequency of shock, strong magnetic fields, ambient operating/storage temperature, humidity, and environmental air contaminants). But there are three things working in a spinner’s favour, if a user strives to reduce or eliminate most, if not all, of those factors.

One, the maximum number of reads/writes per magnetic sector isn’t as calculable as the hard-and-fast rule around NAND writes. Additionally, other manufacturer design decisions may alter that fate somewhat (including some more radical measures, such as sealing an inert gas within the casing).

Two, in the event of SSD failure, it is often complete and instant, with zero recovery possible (which I have experienced once, on an OEM Apple SSD within a rMBP). With the failure of rust, unless the controller has failed (which I have experienced once), then data may be recoverable/salvageable. No, salvaged data might not be a joy, but it may be enough to save critical data which were not yet backed up elsewhere.

And three, a well-cared for HDD, housed in an optimal environment, can last much, much longer than either expected or even warrantied by the original manufacturer. Yes, that WD Black HDD may have been sold with a five-year warranty, but at twelve-plus years, it may still be going strong, with S.M.A.R.T. monitor status only showing number of hours in use being on the high side, with all other metrics showing considerable life left. Again, SSDs, especially when used in heavy-swap settings, such as on a system with unusually low RAM for the OS and applications it was designed to run, will wear out much more quickly and with far less advance notice before failure.

Each, at this time, has a valid use-case, but that valid use-case needs to account for the external factors involved. Apple, still selling systems with only 8GB RAM, know better than to do this, but also know that, shy of trade regulations and enforcement of those regulations, doing this pleases shareholders, at the cost of both this finite, straining planet and of consumers who discover, not too far down the road following purchase, that they were sold an under-equipped product with no fallback or recourse for redressing that shortfall, rendering it as good as near-junk, if not junk.
Obviously not a portable, but my 1989 Mac SE’s 20MB HDD is still grinding along
 

zapmymac

macrumors 6502a
Aug 24, 2016
925
1,076
SoCal ☀️
I think it stems from decades of how the computer industry has set up sales.

Years ago, a cheap computer was barely good enough to do anything, basically gimped. You had to nickel-and-dime your way through to get a tolerable computer. Almost everything was upgradable, for a price:

CPU
RAM
Video
Sound card
Hard drive size and/or speed
Resolution
Speakers
Wall wart
OS
Bluetooth
Optical drive (DVD/burner/other)
WiFi
USB
Extra ports:
-USB
-PCMCIA
-FireWire
-storage

This was/is how it’s been… but the average consumer has been burnt before, so it’s just easier to buy the most expensive/current model and hope for the best.

Now, generally, even a cheap $299 laptop from Costco can and will get an average consumer by (saw one this weekend).

This mentality carries over a bit for people new to Macintoshes, I believe; they’re still kinda stuck in that mindset. Since Apple specifically takes a bunch of those options away, people kinda assume it is still required to buy new more frequently than not.

Plus: netbooks! They burned many a consumer back in the day.


That’s my take from Commodore / Classic Mac OS / Win 3.1 -> present.

P.S. We do so much more online every year as well.
 

Heindijs

macrumors 6502
May 15, 2021
421
834
I love my 2012 Mini, but I'm also glad I bought a brand new M2 Pro Mini this year. Yes, I might not have totally needed it; the 2012 does basic tasks just fine.
But this thing is so much faster that I do not need my higher end Windows laptop anymore. I might not take advantage of all of its power, but the fact that I get so much more of it, while still being whisper quiet, makes it worth it for me.

It's also a plus that I don't have to worry about updates or patches anymore. Not that it was a huge hassle, but it was clear to me that it will miss out on more and more new features.
 
  • Like
Reactions: Lioness~

MRMSFC

macrumors 6502
Jul 6, 2023
369
379
Well…

You had my attention until you used the phrase “common sense” — which I always parse as shorthand for, “It’s too hard to think about this stuff critically and thoughtfully, so just lean on what ‘feels right’.” People get hurt being on the short end of someone else’s “common sense” fallacy.

Otherwise, you’re correct: yours is a controversial take.

It would be no more controversial for Apple, under Spindler and Amelio in the ’90s, to have tried to “Newtonify” Mac OS because, well, it was newer and offered a different, novel kind of UI/UX from Mac OS. That, for maaaany reasons, did not happen, but attempts in that direction would have been probably no less difficult and no more fraught with problems which, in the end, would have kneecapped the much more robust platform, even as the newer platform continued to mature.

That’s where we are now (and where we have been since late 2011) with iOSifying macOS.

The only reason this synthetic merger feels “normal” or, ahem, “common sense” is because Apple, under Federighi’s oversight, have been pressing for this since 2011. They have been trying all sorts of ways to make this version of fetch happen. And yet, some aspects of UI/UX and function will never work comparably across this chasm. They remain two discrete paradigms: one, with a cursor and physical keyboard, the other with a capacitance-based digitizer and incredibly crafty, but still limited fingertip gestures and (optional, always optional, for additional cost) human interface devices.

The problem with this rationale is the fallacy that tablets/glass phones (they’re no more “smart” than any other CPU-based device) and full, purpose-built computers are one and the same (or should be one and the same) solely because they use a CPU, a GPU, or both; use a display; and sport a graphical user interface. Their use-cases remain discrete.

For instance, try using your MacBook Pro with Apple CarPlay. Try writing raw software code — or even Xcode — on an iPhone or iPad. Try using NFC-based Apple Pay from a MacBook Air. Theoretically, all may be possible, but none is practical.

Certain features and software will have overlap (drawing with a WACOM tablet on macOS and drawing with Apple Pencil on an iPad Pro, or watching Netflix, for instance), but one does not replace the other because each are different tools for different jobs and, often enough, for different ends. One is more limited than the other, but the trade-off is in sheer portability and purpose of function. The Mac(intosh) was and remains Apple’s unbroken line of GUI-based microcomputers (in the very classic sense), whereas the iPhone and, later, iPad trod new ground. Likewise, AppleTV/tvOS is a different paradigm from either iOS/iPadOS or macOS. And watchOS… again, it strives to do more like an iPhone or iPad, but its use-case remains unique to its (predetermined) form factor.

With technological evolution, a general trend is not to attempt to merge different paradigms so much as to further specialize existing tech to fill niche solutions. The evolution of iOS, iPadOS, tvOS, and watchOS, from OS X/macOS is a “natural” evolution of this phenomenon at work. Apple, under Federighi and Cook, are now striving to swim upstream in the opposite direction of this evolution, and it’s not organic in the slightest.

Had iPhone been, from inception, developed as a strict parallel of OS X (even as iOS 1 was developed from a heavily modified Leopard), without “jailing” the OS or developing “walled gardens” and the like, then this might have been a different discussion. But from the outset, iPhone was a clean-sheet invention discrete from a Mac because it could do things a Mac could not, and vice-versa.

The problem is there is nothing “organic” or “common sense” about trying to treat these different tools as the same tool or to merge them as such, because they are neither marketed as variations on the same tool nor are they fundamentally designed to operate in exactly the same situations, software, provisions, or circumstances.
The “newtonification” of Mac OS didn’t happen because the Newton didn’t sell. Kids didn’t grow up with a Newton everywhere, and weren’t accustomed to Newtons before touching a Mac.

I think you jumped the gun on my post to grind the axe against Federighi.

All I’m saying is that building your software controls around what consumers understand (which going forward is going to be more like iOS) is just smart.
 

ahurst

macrumors 6502
Oct 12, 2021
I think it's more or less the same as with phones: people see the "new and shiny" and think they must have it. Apple encourages this behavior, which is really unfortunate, as it drives unnecessary production and, in the end, more e-waste.

Apple isn't alone in this, though: NVIDIA has been encouraging this buy-every-gen behavior for their GPUs, and it feels like, at least for some people, cars are seen a similar way: needing to buy the latest and greatest every few years.
The thing is, companies basically have to do this: if you're a publicly-traded company, the nature of the market is to pressure you to keep growing your profits indefinitely or risk losing investor confidence (which affects your stock price, which affects your ability to take out loans against your stock, which affects your ability to grow or make long-term investments...). Often if you're a public company and fall on hard times, you get bought up by Private Equity, restructured/downsized/milked for all you're worth, and then discarded once they've squeezed the last toothpaste out of your tube (so to speak). I don't personally believe any of that's a good or sustainable thing, but it's the way the economic system's been designed.

Individual companies can be morally better or worse than others at chasing those increasing profits, but they all have to do it, because our economic system demands it. I'm not saying this because I think that absolves individual companies like Apple or NVIDIA, I just think it's missing the forest for the trees to talk about *some* big companies putting profits over societal/consumer interests as if it's a few malevolent actors, and not the fundamental design of the world economy strongly incentivizing them all to be that way.
 
The “newtonification” of Mac OS didn’t happen because the Newton didn’t sell. Kids didn’t grow up with a Newton everywhere, and weren’t accustomed to Newtons before touching a Mac.

This wasn’t the point I was striving to make.

The point I was striving to make — that of a paradigm — is that technological diversification/specialization is expected and typical of ongoing innovation and maturation from an original invention.

Attempts to then erase those lines of specialization/diversification/maturation, as is the case with the “iOSification” of macOS — which has never been iOS/iPadOS, unlike the inverse — run against the grain of technological specialization/diversification/maturation.

If my Newton example — drawn from a paradigmatic reference point within Apple’s history for the sake of simplicity (irrespective of whether, in this universe, it actually happened, or of whether the Newton OS was successful) — was not enough to illustrate how technological diversification/specialization likes to move in one direction, then let’s try another from the transportation sector, one including kids’ consumption: bicycles.

Kids, like adults, use bicycles for transportation and enjoyment. After earlier iterations like the boneshaker and the penny-farthing, the bicycle settled into one form factor which, ever so vaguely, resembles the familiar “ten speed” bicycle (specifically, an adult-sized frame with tires generally over 600mm — or about 24in — in diameter).

Over time, the bicycle became a platform in and of itself and, technologically, diversified — in size, in drive train, in robustness, in method of propulsion, in materials (think wood, steel, chromium-molybdenum, aluminium, carbon fibre), in compactness (think folding bikes), and even in shape and deployment (think recumbents and tandems). Some even came with petrol motors, spinning off to become what we now know as “motorcycles”.

Attempts by any one company to take all the diversified variations — including variations of their own invention — and shoehorn them back into a “unified” or hybridized form and function run against the grain of why and how people use different variations for different functions.

People use a tourer with several gear ratios for on-road, long-distance riding; a small, knobby, single-speed bike for BMX (and, once, for freestyling); a beefy mountain bike (with a different set of gear ratios) for off-roading (or a spin-off of this with low-PSI fat tires for riding through snow or mud); a lithium battery-assisted drive train for a pedelec; a long-wheelbase frame for cargo; or just two miniature wheels for a toddler to push themselves along and learn to keep balance. This also includes tricycles (for kids and adults alike); single-gear, fixed drive train bikes (whether for the closed track or for city riding); and on and on.

Diversification for this technology has benefited many people, and further diversification only more so. Trying to, say, affix parts from a Bianchi touring bicycle onto a modern motorcycle frame would add too much mass for the result to work as an effective bicycle — not unlike trying to kludge iOS elements onto a much more robust personal computer running a progenitor operating system purpose-engineered for a personal computer.

Attempts to merge these into an all-in-one will, generally, fail on all intended merits.


So rather than encouraging each new spin-off device to develop along its own path and — and this is key — establishing a set of interconnectivity protocols (and/or industry standards) so they work seamlessly with one another, without proprietary/opaque shenanigans, Apple have instead been trying to pretend that a Mac and an iDevice, or an iDevice and a Watch, or an AppleTV and an iDevice, can be shoehorned back together whilst, at the same time, wanting to diversify Apple’s offerings.

These are objectives in conflict with one another, and at this time, the product line/paradigm hurt most by it is the Mac.


I think you jumped the gun on my post to grind the axe against Federighi.

This entire thread is jumping the gun because it doesn’t belong in the Early Intel Macs forum. :D

Putting a critical lens on Craig Federighi’s vision is entirely fair game in this conversation, as it is Federighi who’s captained this fallacious notion that grafting together spun-off technological diversification is a “natural”, benign premise. This notion is, I conjecture, hurting the Mac, hurting macOS, and hurting the reasons why one buys and uses a purpose-designed personal computer/personal workstation and not, say, a glass consumption appliance.

Let glass appliances evolve and diversify, organically, as their own thing, while seeing to it that they still play nice with personal computers — but not at the expense of making personal computers act more like a glass appliance. That road leads to a hobbled personal computer and a growing body of dissatisfied personal computer users looking elsewhere.

So yah, this is entirely on Federighi’s hands, and putting him under the microscope is in-bounds. “iOSification” of the Mac has been his singular vision since 2011, green-lighted by the only boss he’s ever had whilst SVP of software engineering, because in the short-term, it’s still making money for the company. There are, unquestionably, other paths not taken which would make the company just as much money, but they remain untaken because they are not the work or vision of Federighi.


All I’m saying is that building your software controls around what consumers understand (which going forward is going to be more like iOS) is just smart.

It isn’t smart when long-established UI/UX features and controls, specific to Macs/OS X/macOS, are being steadily removed in favour of an imported iOS kludge.

System Preferences, anyone? The Dock? The menubar? The complete (and aggressive) de-skeuomorphing of UI elements? The enforcement of SIP as the default? Aggressive cryptographic matching of even the more basic, (still semi-) modular components?

It may seem odd or brusque to pause (or conclude) on this, but topical to the thread title:

In 2023, for a lot of people/consumers of whatever age, an iPad or iPhone — a glass appliance — really is all they’ll ever need in a computing device.

A glass appliance really is all they need instead of a consumer desktop, a netbook/chromebook, or even an iBook/MacBook (which, in the past, was the best alternative absent a glass appliance). Today and tomorrow, perhaps they ought not bother with a new Mac when what they really want (and need) from the everyday is the iOS/iPadOS experience for things like media consumption, making purchases, syncing with their car, or making video/Zoom calls, and not, say, a far more robust piece of purpose-designed equipment still known as the personal computer and/or computer server.

At this point, at least for folks with these needs (and with iPads and iPhones now sporting processors which can handle high-def video, games, and whatever else is thrown at them just fine), possessing a full-fledged personal computer is little more than a vanity possession and far more computer than they’ll ever use.

But as a longtime personal computer user — for the kind of work I do and for the variety of things I create with them — there isn’t a place for an iOS/iOS-lite in that matrix.

For the ways I want to tailor my personal computer to accomplish what I need it to do, the “appliance-ifying” of the Mac has no place, either.

And when I make a purchase of a durable good from a company like Apple, it may be less frequent, but the dollar figure for that purchase — and profit margin — is liable to be more substantial than a consumer just buying iPhones or iPads every 2–5 years or, worse, buying a low-end Mac now being engineered with the same, unsustainable disposability in product life cycle streams as an iDevice (and never once using it to the designed limits of its capabilities). (Whether Federighi was involved in that or not is another discussion.)
Federighi, stop trying to make fetch happen.

For everybody else who wants an iOS in their Mac, buy an iPad Pro and a keyboard for it and enjoy your glass appliance with the familiar screen-based UI/UX: you don’t need a Mac.



[EDIT to add for everyone else: I’m sorry I brought a shovel to this discussion when a hand spade probably would have done the job just fine. I pinged the moderators, unsuccessfully, to have this thread moved (as it’s not topical to Early Intel Macs) to the Mac Basics, Help and Buying Advice forum. Maybe other EIM folks could try for the same and nominate this thread to a more suitable home.]
 
Last edited:

MRMSFC

macrumors 6502
Jul 6, 2023
I’m not sure what point you’re trying to make. And I don’t feel it’s right of me to reduce this down to “old man yells at cloud” despite what I think it reads like.

I’m not sure if you’re interpreting my points correctly either. I hadn’t said anything about components being cryptographically locked or the move away from skeuomorphism. Or even that the Mac should just be iOS.

I’m going to respond to the best of my ability though.

You’re correct in that technology evolves and specializes, and that iOS is fundamentally different from macOS. Hence them being different products for vastly different purposes (and, by extension, the weird limbo the iPad Pro exists in).

We agree that what matters is the people using it and how they interact with it, but I think we disagree on who gets priority.

(I interpret that) you assert that legacy users should be prioritized and that the visual language shouldn’t change.

That’s where I disagree. Because you and I and everyone here grew up in a world fundamentally different from that of our parents and of younger generations. In my personal experience, my world is significantly different even from my siblings’.

Taking that into consideration, newer products should be easily usable to them, and not insist that they adapt to old paradigms.

And yes, that means changes in the interface to look more like iOS in this case. Because as much as we old men will complain about those darned kids and their phones, they, not us, are the primary demographic.

If I may offer an anecdote, I was a smartphone holdout until 2016. I stuck stubbornly with my old Samsung flip phone, insisting that I didn’t need to change and that everyone else was wrong. Turns out that when I got my first iPhone, I immediately regretted holding out. The world and technology change, whether we like it or not.
 
I’m not sure what point you’re trying to make. And I don’t feel it’s right of me to reduce this down to “old man yells at cloud” despite what I think it reads like.

Well, that’s nice.

I’m not sure if you’re interpreting my points correctly either. I hadn’t said anything about components being cryptographically locked or the move away from skeuomorphism. Or even that the Mac should just be iOS.

I conjecture the same with you.

I’m going to respond in the best of my ability though.

You’re correct in that technology evolves and specializes, and that iOS is fundamentally different from macOS. Hence them being different products for vastly different purposes (and, by extension, the weird limbo the iPad Pro exists in).

I don’t think the iPad Pro exists in a weird limbo. Then again, I have friends who do things like draw comic panels and digital illustration.

We agree that what matters is the people using it and how they interact with it, but I think we disagree on who gets priority.

You’re right. We disagree on who gets priority: the people who do work on a Mac — the kind of work which has long relied on using a Mac.

Said people are taking a back seat.

I disagree with this. I also think Federighi’s vision cannot be divorced from this problem. I also think Federighi should stick to being SVP of iOS/iPadOS/watchOS, and that another party, as was the case before Federighi, should assume SVP oversight of macOS, because macOS has languished under him.


(I interpret that) you assert that legacy users should be prioritized and that the visual language shouldn’t change.

That is not the case. The visual language should reflect how personal computing users are using their systems as systems, discrete from when (or if) they pick up a tablet from the living room ottoman to watch an episode of a show.

The visual language of the Mac user, foremost, should not draw principally, even solely, on the iOS bucket (as plenty of Mac users choose to limit their use of Apple’s “walled ecosystem”/“secure enclave”/“tight integration”/whatever or they skip it altogether).

That’s where I disagree. Because you and I and everyone here grew up in a world fundamentally different from that of our parents and of younger generations. In my personal experience, my world is significantly different even from my siblings’.

I often look after my friend’s kids. Their ages have ranged from 8 to 15. What I have noted, time and again, is their mental association with the “personal computer” (by which I include laptops, desktops, towers, etc.) being a tool for which they have very few positive associations. Typically, they ascribe meaning to these systems as “the thing I have to do my homework on”, and they pine to hop back to their screens and phones as soon as the “work” is done. They give me grief because I use a laptop. I find all of it endearing and also illuminating.

They comprise a generation — the youngest being “Gen Alpha” — who will, by many metrics, never want to own a laptop or anything larger, unless they get into video gaming beyond a Switch, an Oculus, or something downloaded from the App Store or Google Play for their glass UI devices. They will be able to do their banking and even their work on a glass screen.

This paradigm should mean a designer of hardware might bring selected hardware innovations over to a laptop/server from, say, a glass UI. A good example: the arrival of multitouch trackpads beginning with the 2008 MacBook Air. Good stuff. By the same measure, MacBook products do not have glass interfaces, despite fanbois pining for such for aeons.

As for software, using industry-based protocols worked (and they still work). But Apple have been shunning that door for some time, setting up a closed ecosystem to benefit their non-personal computer products. (This doesn’t even touch on the question of iOS elements being dragged back to a non-glass UI hardware operating environment.) This comes at the expense of Mac users who simply need to get their work done and who need the Apple-created, proprietary kludges to stay out of the way.

This gets harder every year. Meanwhile, the kids-these-days folks show little interest in an iOS-gussied laptop when they’re doing just fine on their iPads.


Taking that into consideration, newer products should be easily usable to them, and not insist that they adapt to old paradigms.

The Mac was, by design, always easier to use than contemporary competitors. These days, much as it strains me to admit it, Microsoft sort of caught up there. The Mac was (and is) also a professional tool for people to focus on the work they needed to do.

Unless Apple turn their entire Mac line — or at least all save the Mac mini, Mac Studio, and Mac Pro — into glass-UI devices (i.e., following Microsoft’s Surface form factor), then “iOSifying” macOS will continue to alienate users who buy a Mac because they need it for their work.

And yes, that means changes in the interface to look more like iOS in this case. Because as much as we old men will complain about those darned kids and their phones, they, not us, are the primary demographic.

An aside: it would be mightily civil and helpful of you to, idk, refrain from saying stuff like the bolded until after you’ve confirmed that the people you’re responding to actually fit that bill. Cheers.


If I may offer an anecdote, I was a smartphone holdout until 2016. I stuck stubbornly with my old samsung flip phone insisting that I didn’t need to change and that everyone else was wrong. Turns out that when I got my first iPhone, I immediately regretted holding out. The world and technology changes, whether we like it or not.

Hey, I once used a Nokia E62 as my BlackBerry device. It was delightful to use. By 2011, I had moved on to Android. It’s 2023, and I still use an Android device — even as, earlier this year, I inherited an iPhone and an iPad Pro.

I embrace new technology. Re-treading existing technology designed for a particular UI/UX onto a technology never designed for it isn’t a gesture of looking forward. It’s extremely reactionary and counter-innovative. Then again, companies which get too big have a nasty tendency to stop looking forward and start making sure they don’t lose ground.

As that happens, they stop taking fresh risks.

Apple stopped taking fresh risks a long, long time ago.
 
  • Like
Reactions: DCBassman