You see, and now you are getting dangerously close to suggesting that we don't need high performance computing at all,

No.

What I’m saying is that if an edge-use consumer needs high-performance computing to complete the kinds of specialized work you do (or, say, a particle physicist modelling the femtosecond-scale decay of a Higgs boson), there should always be a line of purpose-designed and -tailored hardware for those needs. And as those needs are going to vary by specialization, modularity is a must, especially as specialized components are required for those specialized applications.

To borrow a private transportation metaphor:

In the computing sense, what I’m describing above (and in my related reply to @nathansz ) are the supercars and hypercars of computers. To get to work, to get groceries, to haul materials, etc., such a vehicle is wildly impractical. Sure, it’s doable, but foolhardy and a really silly flex. (“L@@K I PUT GROCERIES IN MY FERRARI’S FRUNK”)

The general trend with technological evolution in any area, from any historical era, is not one toward uniformity or convergence, but rather, toward specialization. This not only includes appliances (including computers), but also the technology of agriculture, the technology of language, and even the technology of harnessing the conversion of matter to energy (i.e., fire, internal combustion, fissile materials, etc.).

What is taking place right now with industry juggernauts like Apple is a retrograde approach — to uniformize different products/equipment intended for different uses. That the same base chip architecture in an iPad (or even iPhone) — and no, I do not mean ARM — is the same as that used in a current Mac Pro or Mac Studio, is retrograde. This is the case whether as viewed from the supercar-hypercar analogy, or as viewed from an analogy of “Silicon/M is a Swiss Army Knife to handle all of it.”

Convergence is not the direction in which arcs of technological evolution cant.

So yes, I am saying, one last time, how different computing needs present for different use-cases.

No, every consumer does not need a supercar-hypercar for a laptop or consumer desktop — especially one locked down hard for no technological advantage — when the most strenuous work they’ll do with it is liable to be some moderate video editing for content creation as they listen to Spotify and, maybe, have a browser window open on a second display. Even “econobox” computers of 2024 can handle these tasks. They also don’t need a Swiss Army knife when they find themselves turning to the all-purpose kitchen knife for nearly every daily task.


and all that matters is a notion of a "computer growing with your needs" (presented by you in a manner which I consider rather limiting). Which kind of loops back to my earlier mention of people who want to make computing objectively worse because of their marginal view/needs. Yes, I do acknowledge that you propose maintaining two lines of products, where modularity is emphasized at the baseline, and I do think this is wishful thinking that won't solve anything. I will get back to this later.

The more you go on about this, the more I’m seeing some projection going on.

You present this highly tailored, edge-use case applicable to a fraction of a per cent of all users. Even a marginal uptick in power users wanting a compact LLM running locally on their setups is specialized; most will, once the hype-dust settles, subscribe to access a much larger one in the cloud.

To shoehorn every consumer into a closed, locked architecture (with a relatively high price point of entry), whose raw computing capabilities will never come close to being tapped in full (even as the pittance of soldered/integrated/irreplaceable consumables, like storage, gets used extremely hard — that persistent Achilles’ heel of “8GB RAM/256GB SSD”, especially on the most common, entry-level models), is a three-act play of un(der)checked, un(der)regulated corporate arrogance. It’s an arrogance fuelled by the publicly-traded nature of the big glass ring in Cupertino.

It is the act of insisting every purchaser of a vehicle needs a hypercar when a well-kitted estate (wagon) or crossover is more in keeping with everything they’ll ever throw at it. Buying for way more than will ever get used is so utterly… wasteful and, well, American.


I think the ecological concerns you voice are fundamentally valid and extremely important. I also think that your presentation conflates multiple separate (and often difficult to reconcile!) concerns, and that your conclusions are flawed because of a partial misrepresentation of the industry, its possibilities, as well as consumer needs.

Interdisciplinary preoccupations — the foundation of my education, research, and application over the years — necessitate not thinking or working within singular research silos. Although this is no longer a new idea within academia, the inertia of conservatism within the academy still throws up its cornucopia of roadblocks and administrative barriers to discourage and to complicate the tearing down of those silo walls. This often remains the case.

I say this because whole-systems analysis across disciplines is a cornerstone of what I have to do with my work. Although I do when it’s needed, I can’t really focus solely on the granular when the demands of the problems require being a generalist across multiple disciplines and looking for ways to weave them together where shared objectives align.

You may see this as “conflation” because of the kind of highly specialized work you do. I do not.

I can’t afford not to step back to view the wider picture across disciplines, as it’s on me (and others in this area of work) to not only understand the core epistemological frameworks shaping research from different silos, but also to find where siloed expertise could be working (without the encumbrance of those silo walls) with experts from other disciplines (also without walls blocking their view), to try to solve a complex, whole-systems-based problem (or group of interrelated problems).

That approach moulds the way I look at many topics, including right here in this discussion.


Currently I do not have the presence of mind or the time to write a detailed, careful essay, so apologies in advance that my thoughts will be presented in a crude and incomplete way. I will try to mention a few general points, which I consider the most important.

That’s fine. I won’t ask you to.


The first regards upgradeability and repairability, and how they relate to ecology, environmentalism, and consumerism. It seems very popular to present and discuss upgradeability and repairability as a single concern. I believe this view misrepresents the reality.

The notion of the modular computer matters most to the DIY crowd, and that demographic has changed a lot over the decades (we saw it with other markets historically as well, for example radio). In the early days of computing, DIY was pretty much the only way to have the computer you wanted. Now it's mostly gamers seeking to upgrade components and improve the experience on a limited budget (costs are important, I will come to this later!). And this market is fundamentally toxic and anti-environmental. It is dominated by components that consume tremendous amounts of power and waste resources on a gargantuan scale, simply because it's a cheap way for the companies to make money and satisfy demand driven not by actual need but by gamer psychology.

At the same time, upgradeability offers only very limited utility to the "casual" user with moderate computing needs, simply because the growth in resource demands has massively slowed down in the last decade. A 1GB RAM computer bought in 2008 would become obsolete within 1-2 years. An 8GB RAM computer bought today will still be usable and useful over its entire projected lifespan.

A popular argument is that of environmental damage caused by disposable appliances. This argument is undoubtedly valid, but what about the environmental cost of modularity, with its resource-usage overhead and its opportunity cost for the economy and innovation? There are better ways to deal with appliances than forcing them to be modular in some narrow sense. Again, I will get back to this later.

That’s nice and all.

User-upgradeable commodities tend to use the same core materials over time. Once removed, they tend to be small enough to set aside in a designated, small e-waste recycling bin whose periodic emptying is not, generally, a chore.

Getting rid of a whole desktop, meanwhile, tends to be a logistical headache which, too often, ends wishfully on curbsides or ditched inside general rubbish dumpsters (where e-waste has no business being). The same applies to old laptops, especially the dumpster part. In my experience, it never ceases to astonish me what people will toss, do toss, and have tossed into the general waste stream.

Haphazard disposal, especially for localities lacking public services for at-home/at-office collection of large-dimension e-waste items, is what we should expect to see more of with completely non-upgradeable desktops and laptops whose native OS is no longer supported by the manufacturer.

Not everyone who owns a wholly un-upgradeable, un-repairable iMac M1, for instance, is going to go back to the Apple Store for their next computer (or remember to bring back the iMac to Apple to have a Genius recycle it via Apple’s local e-waste service provider). It will just as likely end up in some rubbish dumpster. That’s not ideal, because it accelerates the frequency of this broken cycle repeating itself.

Many localities and regions are slow to bring online robust e-waste recycling programmes. But accelerating the rotation of high-e-waste materials — rather than staggering it over longer spans — is not going to usher in a groundswell of public recycling programmes. Instead, the waste adds up faster; the amount of energy and resources consumed in the extraction, refinement, manufacturing, and distribution of replacement whole, un-upgradeable/un-repairable computers goes way up; and recycling and reclamation also suffer. That’s completely backwards. That’s completely bonkers.

But this persists/worsens for a very specific reason, and it is tied to shareholders. Keep reading.


This would inevitably lead to much higher costs across the board (because the companies need to maintain profits and because they cannot efficiently amortize their R&D costs anymore).

OK. What you’re talking about here is a macro-level thing — one relating to R&D of ever-tightening, ever-fewer-nanometre wafer designs, whose development and production limits the number of “fabs” capable of that production (like TSMC and Intel), and whose requirements necessitate bigger plots of manufacturing land and greater volumes of freshwater to produce in quantities great enough to amortize the cost on the time scales demanded.

That’s an industry trying to make shareholders (many of whom are still in denial about Moore’s law no longer applying) happy. This is not the consumer’s fault, nor should it be upon them to make up for that industry’s action by having to buy, ditch, and buy another related whole desktop/laptop/tablet system every handful of years. That’s, ipso facto, an anti-consumer plan.

The consumer obviously wants good value for the money they spend. Shareholders, on the other hand, are greedy and impatient, and their Lacanian thirst for desire (more wealth) is insatiable. The planet’s resources have a hard limit, one which our species is closing in on rapidly. This model is broken.

“Someone help, my family is starving.”

If innovation is “stagnating” in any way (look around: it never was), then it’s not due to a safeguarding of modularity and industry standards in personal computing. Rather, it’s in the inescapable physics of the atom we’re reaching as 5nm is supplanted, then replaced by 3nm; the necessity to invest in high-energy, post-EUV (basically, X-ray) lasers for wafer etching at those atomic scales; and so on.

Maybe shareholders forgot that incremental, intra-generational improvements in product performance can, do, and will always occur without a 2-year Moore’s-law turnaround on the chip generation itself. By the same measure, this doesn’t mean the only path forward is an SoC without any upgrade path (not even the option to upgrade the mainboard with the successor chip in its stead, as with Framework’s approach).


Second, your notion of a computer that works smoothly for many years, and that can adapt to users' needs over its lifespan. I believe this notion to be fundamentally misguided, and that it mischaracterizes both consumer needs and the industry's possibilities.

Have you any supporting, peer-reviewed analyses to share? Because I would really love to read them.


You essentially advocate a stagnation model for consumer hardware.

No, I do not. 🤦‍♀️

Consumer hardware evolution has never been stagnant, and for virtually all of that history, consumer hardware has had replaceable and upgradeable capabilities. In addition, consumer hardware, as with professional and enterprise hardware, has diversified further into specialized uses. Throughout, these still rely on shared industry standards, conserving interchangeability and also some degree of forward-compatibility. This is good for slowing the Apple-accelerated whole-product life cycle I just got done critiquing.


It would also have only a questionable benefit to the customer because things won't change much from the current status quo. Again, if your computing needs are very low, going from 8GB to 32GB on a modern computer won't make a noticeable difference over its entire projected lifespan. Especially under a stagnation model where the software will inevitably adapt to the stagnating hardware. There is at least one benefit for the stagnation model though — software will be able to take advantage of the fact that the hardware doesn't change to become more efficient (just as we see with gaming consoles historically). But again, what will be the opportunity cost?

Typically, the main thing people want to upgrade is storage, not RAM.

(I also conjecture a total, mass-reliance on cloud storage for personal storage is a long-term mistake on so many levels. But that’s for another conversation.)

But in the same spirit, the buyer of a base model laptop or consumer desktop with only 8GB RAM aboard, because that’s what they can afford at time of durable goods purchase, should not be put in a permanent penalty box for budgeting and planning for future expansion as needs and resources allow.

I think we’re forgetting this is still an actual, quotidian, pedestrian thing in millions of consumer lives — university students, parents with growing kids, people on fixed incomes, and so on.

Forcing a whole-system replacement onto consumers as the only way to move on/up in storage capacity or even memory, if they so choose, is bananas. It’s brainworms logic to all except, well, unrepentant, impatient, insatiable shareholders holding the white-hot coals just beneath the soles of companies like Apple, Dell, and others running with this approach. Apple’s approach, especially in the post-T2 era, screams of “uncle!” to shareholder arm-twisting. As I wrote earlier in this main thread’s discussion, growth cannot be infinite. There will be a breaking point.


Which brings me to the final point: what most customers care about is neither upgradeability nor repairability, but the total cost of ownership. They have moderate computing needs and want some assurance that they can keep using the device without the risk of footing a hefty repair bill or having to buy a new device where the old one would still be sufficient for their needs. This is often sold as "repairability". But it has nothing to do with repairability. It is about warranty and hardware support. Customers generally don't care if they get back a new computer or their old computer with parts replaced or repaired from the shop, as long as they don't have to pay extra. This is a very strong customer motivation, and there are multiple ways to satisfy it.

Warranty support not as a supplement, but as a compulsory replacement for upgrades (or for having a local shop make the upgrades for you, coupled with their own warranty), is something companies like Apple have worked assiduously to condition consumers to accept through their tightly co-ordinated marketing of these last several years.

That is: “You needn’t worry about anything inside, because you can just replace the whole thing with our warranty plan and your cloud subscription account.” There’s no flex room for allowances to upgrade. On Apple’s behalf, through no accident, there’s a wilful and significant understatement of how replacing whole systems every few years strains the environmental life cycle at every stage except one: usage (the fifth stage). That’s where we’ve actually gone backwards, courtesy of a recurring sales cycle which benefits not consumers, but shareholders.

[Speaking of: are you (or have you been) a shareholder of Apple, or do you have an investment account whose portfolio includes Apple? You needn’t answer if you don’t wish. Full disclosure: I held eight shares back in the early aughts. I sold them during the Great Recession. I broke just north of even. 🙃 ]

And now Apple’s strategy is paying off (for Apple shareholders).

It may be self-rationalizing for Apple, but it inveigles consumers away from an awareness that this paradigm doesn’t have to be this (one, currently presented) way (or the highway). It’s nasty, but the absence of public regulation around this kind of coercion is also effective for Apple’s revenue and growth stream. It’s fundamentally unsustainable, and they know this. They also have shareholders to whom they must answer every three months.

Welp.


For now, I’m skipping the rest (and will probably not reply as volubly on this thread’s side-branch), as I detailed some of this with my last reply to @nathansz . I appreciate your time. Cheers.

Also, I’ve gone on way long enough. I couldn’t ask for more than one or two people to go through all of it!
 
because they can't run a current, supported OS
There speaketh the man who is brainwashed into thinking that macOS is all that exists. Almost every Intel Mac ever built can run a modern OS even more secure than macOS. Even PPC Macs can do this. Apple would dearly like AS Macs NOT to be able to do this. Fortunately, there are folks out there who make it happen despite Apple's intentions, and good for them.
I like some Macs, those that Apple haven't glued together being my preference. Non-repairability is simply a scam.
 
My opinion:

The Intel era can be divided into two parts: pre-butterfly and butterfly (keyboards; I'm including the post-butterfly keyboards in the latter era as well).

The pre-butterfly keyboard era was wonderful. The machines were very upgradeable and parts could be replaced easily. I did not like the macOS of the time, though (except for Tiger/Leopard/Snow Leopard).

The butterfly keyboard era is when things started falling apart. The machines became closed, and parts became hard to replace, if they could be replaced at all. The operating system towards the end of the era was much better.
 
My opinion:

The Intel era can be divided into two parts: pre-butterfly and butterfly (keyboards; I'm including the post-butterfly keyboards in the latter era as well).

The pre-butterfly keyboard era was wonderful. The machines were very upgradeable and parts could be replaced easily. I did not like the macOS of the time, though (except for Tiger/Leopard/Snow Leopard).

The butterfly keyboard era is when things started falling apart. The machines became closed, and parts became hard to replace, if they could be replaced at all. The operating system towards the end of the era was much better.

As a very long-term Mac user, I sat out a lot of that time using an old MacBook Air, waiting for those goddamned butterfly "keyboards" to get phased out. Those things should go down in infamy as a perfect example of what happens when you let your designers put human factors on the back burner in pursuit of thinness. They felt like absolute **** to type on, and I'm very glad they turned out to be fragile as well, because otherwise they might still be trying to cram them down our throats.

So, I absolutely LEAPED at the first 2020 MacBook Air that was released with a usable keyboard. That one (I had an i5) turned out to be quite a dog, however. I was very lucky to get 5 hours of battery life out of it, fans gasping and wheezing the whole time. The identical-looking M1 was (and really still is) a revelation. It gave me double to triple the battery life and, subjectively, double the responsiveness and apparent speed. I'm still using it. Probably the best MacBook I've ever owned, and that's going back to the late 90s, when I blew a flipping fortune on a PowerBook.
 
The touch bar might have been a good idea if it didn't replace the function row. It should have been above the physical function row, and I think that would have stuck.

I had a 2018 Intel Mac that had none of the issues with the screen or keyboard, and my daughter is still abusing it now. I actually found the Touch Bar very useful, particularly for spelling corrections.
I do think you're right: if it had been an extra, not a replacement for the function keys, we would still have it now.

My new M2, after just 9 months, has huge battery life issues: it's down to 90% already, and it only gets used for 3 or 4 hours a day. Not impressed. I think it must have been one of the first M2s made and sat on a shelf for 6 months before I got it.
I use low power mode on battery and optimised charging.
 
My new M2, after just 9 months, has huge battery life issues: it's down to 90% already, and it only gets used for 3 or 4 hours a day. Not impressed. I think it must have been one of the first M2s made and sat on a shelf for 6 months before I got it.
I use low power mode on battery and optimised charging.

Unless the laptop you bought had been stored, pre-sale, in extreme conditions (which is unlikely), you probably have a sub-par battery, chemically speaking. Non-usage over time tends not to be what robs a battery of its charge capacity.

To wit, I remember my surprise when the original battery in my unibody MBP, one purchased almost a dozen years after it was made, still reported 93 per cent capacity despite, obviously, seeing limited usage by its previous (and probably original) owner (at, like, 290 charge cycles after all that time). Since then, I’ve added about 150 charge cycles of heavy use, and the battery holds in the 87–88 per cent range. Prior MBPs — unibody and retina — I’ve owned never came close to this.

I have come to conclude that the OEM battery with this particular unit happens to come from a chemically high-quality batch. That said, you might want to check in with Apple about their current criteria for in-warranty battery replacement.
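For anyone curious, the per-cycle fade implied by those figures can be sanity-checked with a few lines of arithmetic. This is a hypothetical illustration in Python, using the approximate capacity and cycle numbers from my post above:

```python
# Approximate figures quoted above (illustrative only):
# ~93% capacity at ~290 cycles, then ~87.5% at ~440 cycles
cap_then, cycles_then = 93.0, 290
cap_now, cycles_now = 87.5, 440

# Average capacity lost per charge cycle over that span
fade_per_cycle = (cap_then - cap_now) / (cycles_now - cycles_then)
print(f"~{fade_per_cycle:.3f}% of capacity lost per charge cycle")  # ~0.037%
```

At that rate, the pack would take hundreds more cycles to drop each further percentage point, which is unusually gentle fade for an OEM battery of that age.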
 
Apple Silicon Macs of any form and shape are much better machines in almost every aspect than their Intel equivalents. They are much faster and offer a smoother and more premium experience overall, for lack of other general term.

On the downside, memory and storage prices are utterly ridiculous and stink of corporate greed. Also, the lack of upgradeability, and in some cases of repairability, is a major drawback.

That said, they are still, in general, much, much, super-duper-extra-ultra much better machines than their predecessors. The difference, in many cases, is day and night - it is another league, another experience.
I always see over-the-top superlatives, but has anyone quantified the performance delta between the last Intel Macs (say Mac Mini) and the M1 Macs? I doubt there was that big of a performance difference (and probably not much of a difference now between M3 and the best Intel offers). It seemed the bigger difference was battery life for portables and no fans for Air laptop models.
 
There’s no endemic issues. The 2018 Minis are overpriced for what they are because of demand due to perceived “value” of being able to run Windows on them.

And I don’t mean dual boot - I get that - I mean exclusively boot Windows on them, which is, in my opinion, pointless when perfectly good business grade PCs can be had for a virtual pittance once they’re 3-5 years old.
None of that seems to address his question of why these M-series Minis are available 'for parts' (implying they already don't work and have failed for some reason).
 
The only thing I miss about Intel is being able to run a proper Windows VM in x86. I don't miss the heat or the throttling.

ARM Windows on Parallels is a lot better than I expected. x86 Windows apps run faster inside my ARM virtual machine via Windows' translation layer (effectively, Microsoft's version of Rosetta) than they do on most native Windows machines I'm deploying here at work (mostly Intel 10th-gen i5s). With no fan noise, vs. constant freaking fan ramp on these Dell Latitudes and HP EliteBooks.

Older games run too (e.g., Neverwinter Nights 2 runs perfectly). I haven't tried much that's recent, because I'm not running Windows to try to turn it into a gaming machine, but I was quite surprised at the performance through Parallels.
 
I always see over-the-top superlatives, but has anyone quantified the performance delta between the last Intel Macs (say Mac Mini) and the M1 Macs? I doubt there was that big of a performance difference (and probably not much of a difference now between M3 and the best Intel offers). It seemed the bigger difference was battery life for portables and no fans for Air laptop models.
All benchmarks, all of them, show a big performance gap in favor of M-series over Intel Macs, especially when it comes to Pro and Max models, and, naturally, this gap gets larger with every generation of M-series models. Also, in Cinebench 2024, the M3 Pro 12-core beats Intel Core i9-13900H/13900HK CPUs, and the M3 Max 16C beats most HX processors from both Intel and AMD. These findings have also been confirmed separately via laptop performance comparisons by a few independent reviewers (e.g., check Jarrod's Tech for M2 Max vs Intel and AMD).

In terms of real-life comparisons, I personally have an unfair one: a 2014 MBA with 8GB of RAM vs. a 2023 MBP with an M3 Pro 12C and 36GB of memory. Anyway, the simulation of an optimization algorithm in Matlab using the Parallel Computing and Mosek optimization toolboxes took about 10 hours in the former case and 3.5 minutes in the latter... There is a 9-year gap, and it is an entry-level machine limited by memory and thermals vs. a mid-range machine with none of these issues, so I really don't know how helpful that comparison is (probably not much).

As a side comment: having good performance (maybe not the best out there, but still seriously good by any artificial or real-life metric) while ensuring prolonged battery life and no performance gap between on-battery and plugged-in usage is important (that is why, for instance, Intel recently introduced the Core Ultra CPUs) and difficult (that is why Core Ultra CPUs failed impressively compared to the M3 series).
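For a sense of scale, the speedup implied by those two timings works out as a quick back-of-the-envelope calculation (a sketch in Python, using only the runtimes quoted in my post above):

```python
# Runtimes quoted above: ~10 hours on the 2014 MBA, ~3.5 minutes on the M3 Pro MBP
mba_2014_s = 10 * 3600   # seconds
m3_pro_s = 3.5 * 60      # seconds

speedup = mba_2014_s / m3_pro_s
print(f"~{speedup:.0f}x faster")  # roughly a 170x speedup
```

Of course, as noted above, most of that factor comes from the memory and thermal limits of the entry-level 2014 machine, not from the chip generation alone.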
 
By the way, having more and more native programs for M-series CPUs as time progresses is a really good development in terms of performance.

As mentioned previously, my simulation in Matlab R2023b and Mosek 10.1 native for AS took 3.5 minutes, while the same simulation using the respective x86 versions of these programs took 17 minutes, due to Rosetta 2 mapping CISC/x86 instructions to RISC/ARM ones.
 
I agree about the initial leap in general performance when moving to Apple silicon. My worry about my personal use M1 Air is the soldered components. My final Intel MacBook Air's RAM died. Now that even the SSD is spitefully soldered down, you really better pray you do a hell of a lot of backing up!!! Even one backup isn't enough, as I've had a TimeMachine backup fail. ☹️
 
I agree about the initial leap in general performance when moving to Apple silicon. My worry about my personal use M1 Air is the soldered components. My final Intel MacBook Air's RAM died. Now that even the SSD is spitefully soldered down, you really better pray you do a hell of a lot of backing up!!! Even one backup isn't enough, as I've had a TimeMachine backup fail. ☹️

If you aren’t backing your stuff up, you’re inviting disaster if your machine is stolen, your house burns down, etc. Replaceable components or not.

Relying on hardware to not die or be stolen as a form of not losing your data is going to end in tears eventually.
 
If you aren’t backing your stuff up, you’re inviting disaster if your machine is stolen, your house burns down, etc. Replaceable components or not.

Relying on hardware to not die or be stolen as a form of not losing your data is going to end in tears eventually.
What did I say in my post that gave you the impression I don't regularly back up? I was trying to emphasise that people had better do a lot of backing up, and even more so now that the SSD is soldered in.
 
If you aren’t backing your stuff up, you’re inviting disaster if your machine is stolen, your house burns down, etc. Replaceable components or not.

Relying on hardware to not die or be stolen as a form of not losing your data is going to end in tears eventually.

That’s not the point.

When a manufacturer locks out every path to replacing faulty consumables on their whole product (even consumables which the manufacturer has soldered, “unified”, and/or cryptographically paired), then the manufacturer must be held to higher regulatory conditions. [EDIT to add: Or, absent more stringent regulatory conditions in place, the fallback may be, remedially, juridical action.]

Why? It owes to the greater public, social, and ecological impact arising from generating, by manufacturer design, higher volumes of complex waste far sooner than necessary (and hastening rates of resource extraction for replacement hardware). Additionally, when the manufacturer lacks a clear plan to replace those faulty consumables without sacrificing the entire product, even within their own on-site repair facilities, that goes double for the need to impose regulatory penalties upon that unsustainable, anti-consumer, and anti-competitive practice.

Those regulatory standards — whether nationally or by worldwide accord — are not yet in place, insofar as it relates to companies like Apple, Samsung, Microsoft, Dell, Asus, and others of that echelon.

As for Apple inserting themselves into the “let’s put in aggressive anti-theft measures which, literally, reduce the functionality and extensibility of the hardware”?

This is a superb example of Apple reaching beyond their core wheelhouse: i.e., making good hardware and making a good operating system on which that hardware can run. It’s especially galling how Apple ventured down this path when a superior product, Undercover by Orbicule, had already delivered turnkey anti-theft capability for Macs since 2006. (Disclosure: I was a customer/subscriber of Undercover from 2009 until they had to close their doors in 2019.)

[A similar historical case goes for the competitive elimination of a utility like Growl as Notifications was rolled out. There have been many more examples along these lines, particularly after 2011.]

The above presents an illusion to end users that Apple are selling the notion of perfect safety. Not even Apple can guarantee that, as perfect safety is a fantasy. Moreover, it really is up to the end user to supply their own anti-theft measures, should manufacturer-driven interventions not only impair the utility of their own hardware, but also veer toward a plausible indictment of anti-competitive practices.

The same, quite frankly, also goes for the end user supplying their own backup measures/routine and not being steered, coerced, or swayed toward reliance on a closed, subscription-based cloud backup system (rolled out by the very same company) as part of that company’s complete vertical integration scheme, one inherently anti-competitive and, long-term, anti-consumer by design.
 
As mentioned previously, my simulation in Matlab R23b and Mosek 10.1 native for AS took 3.5 minutes, while the same simulation using the respective x86 versions of these programs took 17 minutes due to Rosetta 2 mapping CISC/x86 instructions to RISC/ARM ones.

When you ran your Matlab and Mosek simulations on the last Intel Mac you owned, how long did those take? On what Intel Mac did you run those?
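As an aside, for anyone comparing native versus translated timings, it helps to confirm which way a given tool is actually running. A minimal Python sketch, assuming macOS: on Apple Silicon, the `sysctl.proc_translated` flag reads 1 for a process being translated by Rosetta 2.

```python
import platform
import subprocess

def running_under_rosetta() -> bool:
    """True if this process is being translated by Rosetta 2 (macOS only)."""
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True,
        )
        return out.stdout.strip() == "1"
    except FileNotFoundError:
        # No sysctl binary: not macOS, so no Rosetta translation either.
        return False

# platform.machine() reports 'arm64' for a native Apple Silicon process,
# but 'x86_64' when the interpreter itself is running under Rosetta 2.
print(platform.machine(), "| translated:", running_under_rosetta())
```

On a non-macOS system the helper simply reports False, so it is safe to call anywhere.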
 
What did I say in my post that gave you the impression I don't regularly back up? I was trying to emphasise that people had better do a lot of backing up, and even more so now that the SSD is soldered in.

Wasn’t referring to you, more your comment that “you really better pray you do a hell of a lot of backing up!!!”

My point is: this is 100% no different to any other machine ever produced, if you give a toss about your data.

Alternatively, living in this century… I just sync my stuff. Mac breaks, I work from the iPad or whatever and pretty much carry on where I left off. Or if it is demanding enough, sign into another Mac and pull the stuff down off the network.

But again, this is exactly no different to an Intel machine… or a PC for that matter.
 
Wasn’t referring to you, more your comment that “you really better pray you do a hell of a lot of backing up!!!”

My point is: this is 100% no different to any other machine ever produced, if you give a toss about your data.

Alternatively, living in this century… I just sync my stuff. Mac breaks, I work from the iPad or whatever and pretty much carry on where I left off. Or if it is demanding enough, sign into another Mac and pull the stuff down off the network.

But again, this is exactly no different to an Intel machine… or a PC for that matter.
Well, no. Not quite. Hardly anyone backs up every hour, and syncing every conceivable type of application hourly isn't a thing either. If someone's RAM died in a system where it was replaceable, they could literally go to a shop, replace the dead stick, and keep going: ZERO data loss. Similarly, if the SSD were removable: take it out, plug it in elsewhere, ZERO data loss.

Add to that, you now have an otherwise perfectly fine laptop that is suddenly e-waste just because one component failed! The level of disaster IS undeniably different when a removable component fails compared to a soldered one.
 
When you ran your Matlab and Mosek simulations on the last Intel Mac you owned, how long did those take? On what Intel Mac did you run those?
As I said in the post above the one you replied to, the same simulation in Matlab R23b and Mosek 10.1 on a 2014 MBA took 10 hours. On an MBP M3 Pro (12C/36GB) it took 3.5 minutes with native AS versions of the above-mentioned programs and 17 minutes with the x86 versions.
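For what it's worth, the ratios implied by those timings work out as follows:

```python
# Speedups implied by the timings quoted above (same Matlab/Mosek simulation)
native_m3 = 3.5          # minutes, MBP M3 Pro, native AS builds
rosetta_m3 = 17.0        # minutes, x86 builds under Rosetta 2
mba_2014 = 10 * 60       # minutes, MacBook Air 2014

print(f"Rosetta 2 penalty on M3 Pro: {rosetta_m3 / native_m3:.1f}x")  # ~4.9x
print(f"M3 Pro native vs 2014 MBA:   {mba_2014 / native_m3:.0f}x")    # ~171x
```

So the translation overhead alone is roughly a factor of five, before even comparing against the older Intel hardware.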
 
I literally work exactly like that. I have my documents synced to iCloud/365 and an external Time Machine disk plugged in at work all day, whenever I plug into my USB hub/power at my desk.

If my machine burns down with my house, I am already 90% synced from iCloud; anything else is on the Time Machine disk stored off-site from home (at work).

It isn’t hard to not lose data. People just don’t care about their data. You can (and many do) work like this using either macOS, Windows, or whatever.

And if you don’t work that way, I don’t care what machine you use: you’re at risk of data loss due to theft, fire, etc. It can all be avoided for the cost of cloud sync and a couple of hundred dollars on an external drive (or two, for two forms of disk backup if you’re not syncing to cloud or other devices/network).

Not losing data has nothing to do with what machine you use. It’s just giving enough of a crap to make sure you have multiple copies in different places.

Basically, if you’re worried about data loss from hardware failure or theft - you’re doing it wrong.
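To make the “two forms of disk backup” point concrete, here’s a minimal Python sketch of a one-way mirror to two destinations. The paths and the `mirror` helper are illustrative only; real tools like rsync or Time Machine do this incrementally and far more safely.

```python
import pathlib
import shutil

def mirror(src: str, dest: str) -> None:
    # Crude one-way mirror: replace dest wholesale with a copy of src.
    # (A real backup tool copies incrementally and keeps history.)
    src_p, dest_p = pathlib.Path(src), pathlib.Path(dest)
    if dest_p.exists():
        shutil.rmtree(dest_p)
    shutil.copytree(src_p, dest_p)

# Two independent copies on two different disks (illustrative paths):
# mirror("/Users/me/Documents", "/Volumes/BackupA/Documents")
# mirror("/Users/me/Documents", "/Volumes/BackupB/Documents")
```

The point isn’t the tool; it’s having multiple independent copies in different places, exactly as described above.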
 