And that’s perfectly fine. Your tastes and preferences are your own. I don’t think your decisions are rational, but matters of taste rarely are.
If 'rational' means having to be like everyone else, then that has never been me. I just get what I like when I can afford it. Over the last decade and a half, being in a financial spot where I could only buy what I could afford has made me focus on making what I bought work for me.

There is nothing I do in my personal life that has ever required a current version of any computer. And in the exceptional cases where we had to produce something modern, the machines we already had were able to do it. I do upgrade; I just do it when we reach a point where modern things can no longer be done with what we have.

If you look in the PowerPC forums you'll find that I was using PowerPC from 2001 to 2020 because it was what I could afford, and I made it work. 2020 is when I left PowerPC behind, because Intel was now in my affordable range and PowerPC was no longer doing what I needed it to do without serious workarounds. In about 10 years, I figure, it will be time to do this again.

In my employment I also chose a career where the company provides the work machine. Right now, that's a 2023 M2 MBP. In the past it's been other 'current' computers. So I have access if I need it.

So, rational is relative.
 
How does upgradeability make computing worse for everyone else?

The history of computing is literally the history of tight integration. The next logical step is the integration of system memory. Upgradeable RAM, for example, is slowly but surely dying out, and that’s a good thing.
 
I — briefly — wondered the same, but my wonderment went back to doing other things the moment loaded language like “entitled enough” and “make computing objectively worse for everyone else” crept into the remark. That wasn’t helpful or meaningful.

I think that “me and a bunch of my buddies like to tinker with obsolete tech in my basement, so you can’t have a fast laptop” is rather entitled, wouldn’t you agree? And I’m not inventing this; this is literally what some right-to-repair activists are demanding. They want all computers to be designed around their marginal needs, and for other customers to subsidize their hobbies.
 
Pretty much this. Upgradeability matters only to a very small vocal minority. It’s really a shame that this minority feels entitled enough to try to make computing objectively worse for everyone else.
Did only a small minority of people upgrade *when it was possible* on all computers? I doubt it. Everyone I knew 20 years ago would do upgrades, or their more technically capable relatives would do it for them. It was standard practice! My mum and sister had laptops that lasted years and years longer because I did things like boost the RAM and add SSDs or HDDs. Heck, I even bought a used CPU for my mum's old laptop to make it snappier!
 
I think that “me and a bunch of my buddies like to tinker with obsolete tech in my basement, so you can’t have a fast laptop” is rather entitled, wouldn’t you agree? And I’m not inventing this; this is literally what some right-to-repair activists are demanding. They want all computers to be designed around their marginal needs, and for other customers to subsidize their hobbies.

Again, with loaded language and (impressive) projection.

Re-think your tack, please. Cheers.

On upgradeability and computing: these two have gone hand in hand from the inception of not only personal computers, but even earlier, with IBM minicomputers and mainframes. So I’m not wholly sure what your snide commentary is on about regarding consumers — business and/or personal — who prefer to have upgradeability/repairability in their computing equipment.
 
Changed the topic title to better reflect my feelings. Cheers, everyone

Not sure there was anything wrong with the topic title.

I took it to mean that your appreciation for the Intel Mac era was increasing, not that you appreciated it more than the Apple Silicon era.

Incidentally, I had high hopes for Apple Silicon. Now here I am, having built a new PC to run macOS…
 
Did only a small minority of people upgrade *when it was possible* on all computers? I doubt it. Everyone I knew 20 years ago would do upgrades, or their more technically capable relatives would do it for them. It was standard practice! My mum and sister had laptops that lasted years and years longer because I did things like boost the RAM and add SSDs or HDDs. Heck, I even bought a used CPU for my mum's old laptop to make it snappier!

Many people I knew 20 years ago also did upgrades, and there were a bunch of reasons for that. First, my friends and I were computer nerds who loved tinkering with this stuff, and we were also poor. At the end of the day, that aspect was just a hobby. Second (and something I find much more interesting), the nature and benefits of upgrades were different. In the days when 2GB was the standard RAM size and software evolved very quickly, an additional RAM stick could make all the difference. We observed something similar when the SSD revolution came. Things have changed quite a lot since then, though. These days computers come with very large memory pools and very fast storage out of the box. Upgrades are simply much less meaningful.

Whyyyyy? 🧐

Because we want more performance, more energy efficiency, and more reliability, all without exploding the costs. I mean, what would a modular 512-bit RAM interface with the same area and power footprint as what Apple ships today in the Mac series look like? Nobody is offering products like that because it’s not feasible. We even see supercomputers moving to soldered-on RAM because of the performance benefits and implementation improvements it brings.
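
To put rough numbers on that (a back-of-envelope sketch; the LPDDR5-6400 and dual-channel DDR5-5600 figures are my own assumptions for illustration, not official specs):

```python
# Peak-bandwidth comparison: a wide soldered memory interface vs. a
# typical socketed dual-channel setup. All figures are illustrative
# assumptions, not vendor specifications.

def peak_bandwidth_gb_s(bus_width_bits: int, transfers_per_sec: float) -> float:
    """Peak bandwidth = bus width in bytes * transfer rate."""
    return bus_width_bits / 8 * transfers_per_sec / 1e9

soldered = peak_bandwidth_gb_s(512, 6.4e9)  # 512-bit LPDDR5-6400
socketed = peak_bandwidth_gb_s(128, 5.6e9)  # dual-channel DDR5-5600

print(f"512-bit LPDDR5-6400: {soldered:.1f} GB/s")  # ~409.6 GB/s
print(f"128-bit DDR5-5600:   {socketed:.1f} GB/s")  # ~89.6 GB/s
```

That's roughly a 4-5x gap before even counting the signal-integrity and power costs of driving socketed modules.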

Again, with loaded language and (impressive) projection.

Re-think your tack, please. Cheers.

On upgradeability and computing: these two have gone hand in hand from the inception of not only personal computers, but even earlier, with IBM minicomputers and mainframes. So I’m not wholly sure what your snide commentary is on about regarding consumers — business and/or personal — who prefer to have upgradeability/repairability in their computing equipment.

You are quoting ancient history while ignoring the most recent chapter. Modern computing devices are not only miniaturized, they integrate components such as mathematical coprocessors, caches, memory controllers, I/O, graphics, and power management, all things that used to be separate modules at some point. Replaceable memory was never even a topic with GPUs, simply because modularity is incompatible with the performance requirements those devices carry. These days users want to run large language models on their machines, which requires extreme data movement and data-local computing; how can you hope to achieve that kind of capability while insisting on old, super-narrow modular interfaces? The newest generations of supercomputer chips either completely abandon replaceable RAM or phase it out gradually. The current plan appears to be equipping CPUs with a limited amount of very fast local RAM and handling everything else via the I/O bus.
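
To illustrate the data-movement point (a rough sketch; the model size and bandwidth numbers are my own assumptions): LLM inference is largely memory-bandwidth-bound, because generating each token streams essentially all of the weights through the chip once, so throughput is capped near bandwidth divided by model size.

```python
# Rough upper bound on LLM token generation, assuming inference is
# memory-bandwidth-bound: each token reads ~all model weights once.
# Bandwidth and model-size figures are illustrative assumptions.

MODEL_SIZE_GB = 14.0  # e.g. a 7B-parameter model with 16-bit weights

for name, bw_gb_s in [("128-bit DDR5 (~90 GB/s)", 90.0),
                      ("512-bit unified memory (~400 GB/s)", 400.0)]:
    ceiling = bw_gb_s / MODEL_SIZE_GB  # tokens/s upper bound
    print(f"{name}: ~{ceiling:.0f} tokens/s ceiling")
```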
 
How does upgradeability make computing worse for everyone else?
Upgradability requires standardized interfaces, which prevent tighter integration as well as innovation. There are so many immutable requirements that can never be changed, just to maintain a minimum of backward compatibility. A modern PC graphics card can easily cost $2,000, suck up 450 watts, take up two expansion slots on the motherboard, require its own custom cooling solution, and share its graphics RAM with no other system component. None of that is particularly efficient. Not space-, heat-, weight-, energy-, resource-, or cost-efficient. And lack of efficiency always means worse overall performance in the end.
 
How does upgradeability make computing worse for everyone else?

Depends on the device. The push for 'easily swappable batteries' on phones is definitely going to make them worse for everyone, because swappable batteries require hard shells to prevent damage, which reduces capacity, which lowers battery life. (This was a primary reason Samsung eventually followed Apple on its flagships.) Phones will also be thicker and feel cheaper by the simple nature of requiring a removable back panel.

On something like a MacBook, I think Apple's found a happy medium by having in-chassis batteries, but with pull tabs making them easier to replace for more experienced users or repair shops - a definite upgrade over the epoxy-slathered ones of the Retina era.
 
As long as Apple computers were powered by Intel processors, there was no reason for me to switch to Apple. And macOS wasn't that much better either.
It was only with the introduction of Apple Silicon that Macs became interesting to work with, especially in terms of power consumption. A Windows PC with the same performance consumes at least ten times as much power. And macOS only became interesting for me from Big Sur onwards because, together with Apple Silicon, it resulted in a very stable system.
 
And lack of efficiency always means worse overall performance in the end.

Less efficient, sure; worse performance per watt, absolutely. But worse overall performance? Nope.

In terms of pure performance, Apple doesn’t have anything that can touch last-gen AMD GPUs, even in Metal performance, and just barely keeps up with current-gen Intel CPUs.

Is the performance per watt impressive? Sure. But if that integration means this is where performance tops out, then that’s a real shame. Not everyone needs or wants to use a laptop (or a laptop SoC floating inside a big, expensive steel case).
 
Many people I knew 20 years ago also did upgrades, and there were a bunch of reasons for that. First, my friends and I were computer nerds who loved tinkering with this stuff, and we were also poor. At the end of the day, that aspect was just a hobby. Second (and something I find much more interesting), the nature and benefits of upgrades were different. In the days when 2GB was the standard RAM size and software evolved very quickly, an additional RAM stick could make all the difference. We observed something similar when the SSD revolution came. Things have changed quite a lot since then, though. These days computers come with very large memory pools and very fast storage out of the box. Upgrades are simply much less meaningful.
It wasn't just dorky enthusiasts upgrading back then; it was standard in business too. Work desktops and laptops all got upgrades when it was deemed necessary. They certainly had upgrades before they were replaced.

Upgrades might be less impactful now, but a hell of a lot of people run out of the stingy basic SSD space, for example, and regret not buying more as it's not convenient to move everything to external disks.
 
Because we want more performance, more energy efficiency, and more reliability, all without exploding the costs. I mean, what would a modular 512-bit RAM interface with the same area and power footprint as what Apple ships today in the Mac series look like? Nobody is offering products like that because it’s not feasible. We even see supercomputers moving to soldered-on RAM because of the performance benefits and implementation improvements it brings.

I have to correct it. I'm not sure why you said it.

Some companies, like Framework, take a different approach, making parts replaceable on laptop computers that never used to be. I hope Apple notices and does the same thing with its computers. They did this with the 2019 Mac Pro.

I want to see a computer in the M series with one or a few upgradable parts. We can observe the ever-increasing number of parts that can't be swapped even between the same M Mac models. Simple parts like the lid angle sensors.

What is wrong with customers having one computer in the whole series that fulfills their needs?
 
I miss the compatibility x86 offered, but this statement is just bananas. Even the new parts of Windows are surprisingly painful on the eyes.
x86 should have died a long time ago; 4GB RAM limits are pretty insane today for a lot of applications. However, I'm with you on the idea that Windows 11 is still surprisingly painful on the eyes. Windows 11 is in fact Windows 10, 8, 7, Vista, XP, 2000, 98, Me, and so on all the way back to 3.1, and that's a disgrace. Microsoft, whoever is holding you back from deleting 60% of that ridiculously old kernel code, dump them quickly and move on; they're keeping Windows a hodgepodge of utter crap, whereas macOS just willingly cuts old tech and stays slim and efficient.
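
(For context on where the 4GB figure comes from: it's simply the ceiling of a 32-bit address space.)

```python
# The classic 4 GB limit is just 32-bit addressing: a 32-bit pointer
# can name at most 2**32 distinct bytes.
print(2**32 / 2**30)  # 4.0 (GiB addressable with 32-bit pointers)
```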
 
Because we want more performance, more energy efficiency, and more reliability, all without exploding the costs. I mean, what would a modular 512-bit RAM interface with the same area and power footprint as what Apple ships today in the Mac series look like? Nobody is offering products like that because it’s not feasible. We even see supercomputers moving to soldered-on RAM because of the performance benefits and implementation improvements it brings.
Performance: negligible. Demonstrate anything close to a 1% gain over the socketed stuff, please. There is probably some niche task that can demonstrate it, but people on the News board here failed to find one when they investigated.

Energy efficiency: kinda depends on whether socketed LPDDR5 really becomes available. If it truly ships this year, the differences will again be negligible.

Reliability: RAM reliability has improved across the board. As we discussed in previous pages, there's little evidence on the reliability of modern sockets.

Footprint: claims from manufacturers suggest the footprint of socketed RAM is extremely small, and not significant in a MacBook Pro-sized device. Have a Google.

Edit: I'll give you cost. It apparently saves around $1-2 per device to dodge sockets. A small price to pay, one might argue, on a device costing upwards of $2,000.
 
Depends on the device. The push for 'easily swappable batteries' on phones is definitely going to make them worse for everyone, because swappable batteries require hard shells to prevent damage, which reduces capacity, which lowers battery life. (This was a primary reason Samsung eventually followed Apple on its flagships.) Phones will also be thicker and feel cheaper by the simple nature of requiring a removable back panel.

On something like a MacBook, I think Apple's found a happy medium by having in-chassis batteries, but with pull tabs making them easier to replace for more experienced users or repair shops - a definite upgrade over the epoxy-slathered ones of the Retina era.
The pull tabs are great, when they don't snap like they often do on older phones. 😅
 
Anyone else here returning to Intel Macs or still using early/late Intel Macs in 2024? What's your reason?
I got my 2019 16" MBP with long-term use in mind; Apple mildly screwed me on the timing of Apple Silicon (it had been rumored for at least five years by 2019 but just kept not happening). It continues to work fine for me, though.

I'm looking toward something Linux-based as my next desktop computer, the main reason being Apple soldering the RAM and charging insane prices for RAM upgrades. I upgraded the RAM in my late 2015 Intel iMac myself for a third of the cost of Apple's upgrade. Not only was it 32 GB, it was faster than the factory-installed RAM, and it worked fantastically for many years. I'm going to repurpose the iMac as a 5K display using the DIY conversion; I'm still waiting for the parts to arrive.

For what I do, Linux is sufficient. A Windows 11 VM or secondary OS is a possibility, as I should be able to get it for free from my institution, but I'm likely to spend as little time in it as possible. I've seen that Windows 11 looks very Mac-like and copied the macOS Dock. My favorite WWDC quote ever: "Redmond, start your photocopiers!" - Bertrand Serlet (WWDC 2006).
 
In all my 25+ years of using Macs, I don't remember seeing Mac "users" so militant. I don't remember PowerPC users ever attacking 68k users or invading 68k forums back then. What happened to the old, friendly spirit?
 
Many people I knew 20 years ago also did upgrades, and there were a bunch of reasons for that. First, my friends and I were computer nerds who loved tinkering with this stuff, and we were also poor. At the end of the day, that aspect was just a hobby. Second (and something I find much more interesting), the nature and benefits of upgrades were different. In the days when 2GB was the standard RAM size and software evolved very quickly, an additional RAM stick could make all the difference. We observed something similar when the SSD revolution came. Things have changed quite a lot since then, though. These days computers come with very large memory pools and very fast storage out of the box. Upgrades are simply much less meaningful.

Because we want more performance, more energy efficiency, and more reliability, all without exploding the costs. I mean, what would a modular 512-bit RAM interface with the same area and power footprint as what Apple ships today in the Mac series look like? Nobody is offering products like that because it’s not feasible. We even see supercomputers moving to soldered-on RAM because of the performance benefits and implementation improvements it brings.

You are quoting ancient history while ignoring the most recent chapter. Modern computing devices are not only miniaturized, they integrate components such as mathematical coprocessors, caches, memory controllers, I/O, graphics, and power management, all things that used to be separate modules at some point. Replaceable memory was never even a topic with GPUs, simply because modularity is incompatible with the performance requirements those devices carry. These days users want to run large language models on their machines, which requires extreme data movement and data-local computing; how can you hope to achieve that kind of capability while insisting on old, super-narrow modular interfaces? The newest generations of supercomputer chips either completely abandon replaceable RAM or phase it out gradually. The current plan appears to be equipping CPUs with a limited amount of very fast local RAM and handling everything else via the I/O bus.
I'm curious about this "soldered RAM in supercomputers" comment. Admittedly I haven't personally seen inside a supercomputer for nearly 20 years, when I had the opportunity through work, but the memory boards that plugged in were already covered with soldered RAM, and the boards they were soldered onto were removable and replaceable. 🧐 I'm struggling to see the relevance.
 
In all my 25+ years of using Macs, I don't remember seeing Mac "users" so militant. I don't remember PowerPC users ever attacking 68k users back then. What happened to the old, friendly spirit?
These boards do tend to veer kinda angry rather quickly. I'm not sure users are militant in any way, though.
 
I have to correct it. I'm not sure why you said it.

Was what I said incorrect? Apple is the only company I know of that ships laptops with a 512-bit memory bus.

Some companies, like Framework, take a different approach, making parts replaceable on laptop computers that never used to be. I hope Apple notices and does the same thing with its computers. They did this with the 2019 Mac Pro.

I want to see a computer in the M series with one or a few upgradable parts. We can observe the ever-increasing number of parts that can't be swapped even between the same M Mac models. Simple parts like the lid angle sensors.

What is wrong with customers having one computer in the whole series that fulfills their needs?

How do you imagine this working? You are asking them to either make Apple Silicon modular (which would compromise their laptop lineup) or develop a special modular chip for a few selected Macs (which would significantly raise costs).

There are of course other options, like adding an additional (slower) memory pool via PCIe or some other connector. This could be an interesting enough solution for the Mac Pro, and it’s also something I’ve been advocating for. At the same time, this solution is not without problems of its own. For example, it might be fast enough to handle some CPU workloads, but it’d be woefully underwhelming with GPU-intensive workloads.
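
For a sense of the scale involved (my own illustrative figures, not measurements): PCIe 4.0 x16 tops out around 32 GB/s per direction, versus roughly 400 GB/s for a 512-bit unified pool.

```python
# Why a PCIe-attached memory tier can work for CPU-style workloads but
# starves a GPU: the gap to the on-package pool is roughly an order of
# magnitude. Figures are illustrative assumptions.

PCIE4_X16_GB_S = 31.5   # ~theoretical peak, one direction
UNIFIED_GB_S   = 400.0  # e.g. a 512-bit LPDDR5 pool

ratio = UNIFIED_GB_S / PCIE4_X16_GB_S
print(f"bandwidth penalty when spilling to the PCIe tier: ~{ratio:.0f}x")
```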
 
I know it's controversial, and I hope you'll correct me if you don't agree with it.

It looks like I can't say that users buy new M-series Macs with the right to swap parts between the same models. It works like a subscription you can use for a specific period of time, and then the company drops technical support.

People are effectively renting their new Apple computer, with the right to sell it.
 
The PPC era was also great for buying a low-end spec and upgrading it yourself with cheap PC hardware that Apple and Mac retailers were marking up big time.

RAM, hard drives, optical drives, and to some extent video cards were easily upgradeable then, as they were in the first 6-7 years of the Intel era.
The old Power Mac / Mac Pro towers were fantastic. If you could afford a new one, you'd have not just a cool toy but also a dependable computer for many years to come. If you could only afford a used one, you could simply upgrade it with components like a graphics card, more RAM, a better HDD, etc., and it'd still be better than some brand-new systems. Apple killed an entire enthusiast community that had formed around these towers.
 