It's an unsafe assumption to treat people who have reported no issues as 'technically competent'. They might simply not be aware of degradation in performance if they are using the equipment for basic tasks - and many of these so-called 'technically competent' users have admitted they only use their computers for basic tasks.
One reason for not detecting a problem is that, to them, it actually isn't one.

I was kind of a half-decent Linux user until I realised their love for the kitchen sink was "'till death do us part": picking kernels from one distribution, stripping them down and compiling them for another, and so on.

I'm no longer competent, as it's been ages since I've been at that stuff, but the SSDs I used kind of appreciated swap, and as my setups were put together for minimal resource usage and maximum performance for the applications, I loaded up on RAM and moved the swap into RAM. Worked wonderfully. Blazingly fast, back in the day.
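For anyone wanting to recreate that swap-in-RAM trick today, the modern equivalent is compressed swap via zram. A minimal sketch in the zram-generator config format - this assumes the zram-generator package is installed, and the values are illustrative, not a recommendation:

```ini
# /etc/systemd/zram-generator.conf - hypothetical example values
[zram0]
# devote up to half of RAM to compressed swap
zram-size = ram / 2
compression-algorithm = zstd
# prefer zram over any disk-backed swap
swap-priority = 100
```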

Now, a few of the SSD manufacturers tended to report a slightly lower storage capacity than the actual one, so when capacity was lost due to heavy reads and writes, the SSD still kept within the promised spec. These SSDs were not soldered and had their own controller in the package, mind you. There was all this stuff with block sizes as well.

...but I'd argue that you don't lose performance due to read/write wear and tear. I'd argue that what you lose is capacity. The controller keeps track of where the lost capacity sits and makes sure there are no attempts to read from or write to damaged/unusable blocks, so no performance should be lost.

Kindly note that the above is taken directly from RAM and not the SSD. Freely from my own memory, that is.
 
Well, just having some swap used is a non-issue - what counts is the page fault rate, which shows how often data has to be written to or read from swap. That isn’t displayed in Activity Monitor any more, but it is reflected in the “memory pressure” reading, according to some algorithm designed by Apple.

If your system is swapping frequently then the processor is being slowed down by lack of RAM - accessing SSD is orders of magnitude slower than RAM - end of. Apple Silicon is very efficient at using swap, and faster than the old Intel machines in other ways, so whether this leads to something you’d notice is unclear. In the early days there were a lot of reviews comparing 8GB M1 and 16GB Intel using tests that simply didn’t need more than 8GB or which flew through on M1 - swap or not - because they were now being accelerated by the Media Engine.
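To make the "swap used vs. actual swap traffic" distinction concrete, here's a rough Python sketch that pulls the swap-related counters out of vm_stat-style output on macOS. The field names ("Pageouts", "Swapouts", the page size header) are real vm_stat fields; the sample numbers below are made up for illustration.

```python
import re

# Hypothetical sample of `vm_stat` output; the numbers are invented.
SAMPLE = """\
Mach Virtual Memory Statistics: (page size of 16384 bytes)
Pages free:                        12345.
Pageins:                          111111.
Pageouts:                           2222.
Swapins:                             333.
Swapouts:                            444.
"""

def parse_vm_stat(text):
    """Return ({field: page count}, page size in bytes)."""
    page_size = int(re.search(r"page size of (\d+) bytes", text).group(1))
    stats = {}
    # Each counter line looks like "Name:        12345."
    for m in re.finditer(r"^([A-Za-z -]+):\s+(\d+)\.", text, re.M):
        stats[m.group(1).strip()] = int(m.group(2))
    return stats, page_size

stats, page = parse_vm_stat(SAMPLE)
# Swapouts are pages actually written out to the swap files on the SSD -
# this, not "swap used", is what generates write traffic.
swapped_mb = stats["Swapouts"] * page / 1024**2
print(f"{swapped_mb:.1f} MiB written to swap")  # 444 pages * 16 KiB ≈ 6.9 MiB
```

Running `vm_stat` twice a few minutes apart and comparing the Swapouts delta gives a rough feel for how hard a machine is actually hitting swap.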

I can see why people with modest RAM needs don’t want the base price to go up by $200 - the issue is that going to 16GB shouldn’t cost anything like that amount, and a premium-priced system (and all Macs are premium priced) shouldn’t be cutting corners like that.
I’ve been looking at some of the “reviews” of pre-order options for Windows-on-ARM laptops with the Qualcomm Snapdragon X Elite SoC, and noted the difference in both the base RAM and upgrade costs from various vendors. They all start with 16GB, and the upgrade costs are far more reasonable. In the case of the Lenovo Yoga Slim in my country it’s only the equivalent of $72 to upgrade from 16GB to 32GB, and $39 to upgrade the (removable?) PCIe Gen4 SSD from 512GB to 1TB.

Similar MBP upgrades cost about FIVE TIMES these prices!!

[UPDATE: If real-world performance of native/emulated Windows 11 apps on Snapdragon X Elite lives up to the hype and is roughly equivalent to M3 Pro AND vendors keep the upgrade prices much lower than Apple, then I would seriously consider voting with my wallet and moving to a WoA laptop.

However, unless Qualcomm or another ARM vendor can match the M3 Max, I don’t see any competition for heavier workloads, especially those requiring lots of GPU cores…unless of course discrete GPU options are offered]
 
A new mac mini doesn't solve the problem of a lack of FREEDOM with Apple devices.

Let us run Apple TV style software off of our Mac Mini, when connected to a TV.
Let us run Windows for the software that is needed there, ala Bootcamp.
Let us run Linux.

GIVE US FREEDOM Apple.

Government needs to step in and stop Apple from locking down bootloaders.
 
Similar MBP upgrades cost about FIVE TIMES these prices!!

That's really nothing new - it was already true back when Apple RAM upgrades were just bog-standard DDR4 SODIMMs. I bought an 8GB iMac in 2017 and upgraded it to 24GB for less than the £200 (even back then) Apple wanted for an upgrade to 16GB - and that was using a "guaranteed to work with your Mac model" kit from Crucial (i.e. the same Micron RAM that Apple used) and zero shopping around.

It's been increasingly true that "premium" thin'n'light ultrabooks from the big three (Dell, Lenovo, HP) costing much over $1000 either come with 16GB RAM and 512GB SSD as standard or offer those upgrades for a lot less than Apple. Of course, there are thousands of PCs to choose from (and Dell, Lenovo etc. offer a dumpster fire of overlapping models), so you can always find exceptions (or a $400 brick with 4GB) if you're in denial - but you have to assume that buyers will hunt for the best deal, not the worst one...

Plus, of course, it's hard to argue a direct equivalence between an Intel processor and Apple Silicon - but this issue dates back to the good old days when Apple were using Intel processors and AMD GPUs. Since the price points of machines with Apple Silicon haven't really changed much since Intel - and Apple wouldn't have switched to Apple Silicon if they were going to lose money on it - it's pretty clear that these are strategic price points based on the target market, with little to do with the actual marginal cost.

Unfortunately, the target market for Macs seems to be "people who would die rather than consider buying a PC" - which may be great in the short term, but if they don't attract new customers, the Mac won't have a long term.

However, unless Qualcomm or another ARM vendor can match the M3 Max, I don’t see any competition for heavier workloads
That will come. Anybody the size of Qualcomm, Intel or Microsoft can hire good chip designers. Apple had a short-term advantage in that they already had a nearly platform-independent (at the source code level) ecosystem, three rounds of experience in major architecture changes (68k->PPC, PPC->Intel, and don't forget Classic Mac OS -> Mac OS X) and a customer base with low expectations of legacy compatibility. Microsoft has a huge, slow-moving corporate market hooked on ancient "legacy" software and struggles to move customers onto the latest version of Windows, let alone to a different platform. They've had at least one failed attempt to support ARM in the past, and their current attempt has been half-hearted for the last few years - but with these new Snapdragon "AI PCs" they finally seem to be going for it seriously. Once the mountain starts to move, and the "serious computing needs x86" sentiment starts to dispel, the R&D money for faster ARM chips will start to flow.

This announcement is one in the eye for Intel, and I wouldn't be surprised if Intel (once they've finished pouring bleach on Microsoft's clothes) got back into the ARM game soon (I think they did make higher-end ARM chips for a while after they inherited StrongARM from DEC).

Also bear in mind that companies like Ampere, Amazon AWS and NVIDIA are already making ARM chips for "big iron" (the first two are more traditional server chips, NVIDIA's Grace/Hopper takes a more radical, vaguely Apple Silicon-like approach) so there's plenty of tech to trickle down.
 
I've stopped keeping tabs on Windows, as I really don't care, but they have promised paradigm shifts many, many times before. The showstopper was always the technical structure of Windows and the disadvantage it had compared to the simplified, layered build-up of *nix and its derivatives.

The second issue they have had was the so-called "Wintel" thing - among other things, the Intel/AMD compiler scandal.

Both included significant efforts to hamper competition rather than innovate and compete by actually becoming better than the competition. Among other things, they secured advantages through discrimination on the server side, and so on.

The above may sound like a blast from the past, but it shaped the foundation of the issues both Intel/AMD and Microsoft are facing. (AMD was really taking on Intel and was seriously hurt by, among other things, the compiler trick.) Intel and Microsoft spent a lot of energy on restricting others that they could have spent on modernising their own products and gaining real competitive advantage.

So, Apple decided they had to do their own chip stuff, and they've been pretty good at it. Just about everything deskside is about catching up with the M series now. That includes "machine learning" to a certain extent, as it is being used to level out the M series' advantages in "architecture" and rational resource/energy usage.

As far as hardware/device design goes, "everything available from anyone" is made to match or outmatch Apple's device design, and it has been like that since the iPod, Unibody, Air, iPhone and iPad. And now they are frantically tailing the M series.

They will eventually get there; that race ain't over yet. But Microsoft, Intel, AMD, Arm and just about everybody else had to seriously drop everything and turn around with an entirely different focus as a result of the M series. Kernel panic.

At one stage, sooner or later (they are at it by absorbing Linux stuff), Microsoft will have to address the fact that the layered, modularised *nix derivatives are way more rational and efficient, both at runtime and in terms of development, than the Windows design.

Trouble is, they have built a vast population of professionals who get their livelihood by servicing and deploying that stuff everywhere, and that is not an easy mass to flip around.

As for Linux on the user side, they missed out on several levels. One was formats, another was Android, and a third was the challenges with proprietary hardware and drivers. Did I mention the business model for desktops? Another was their focus on delivering everything in one package to everyone - a total lack of focus on the UX and desktop side. Rather than sharp and lean, they provided all kinds of stuff, confusing the general computing audience.

As with Windows, it doesn't really matter to have great hardware if what's facing the user is a mess. Or a hog. So Linux (the kernel and the layers below the UX and software) can be as great as it can be (and it is), but what's facing the user is still crap. It didn't at all have to be, but that's what they chose to provide.

macOS isn't, and that's what gives Apple an edge, combined with increasingly tight control over the entire stack, iron to eye.
 
Getting the booting and hardware drivers sorted out is great, but that still leaves the desktop distributions' lack of sound strategies, and of a rational approach to desktops, to be sorted out. They have most or all of the building blocks, but they are still not going hardcore rational with, e.g., focused desktop kernel iterations and to-the-point desktops providing BOTH intuitive user experiences AND clean environments. Whatever they choose to do: one distribution, one desktop, one software environment, no server stuff, and kernels/systems optimised for the desktop only.

If they want anyone but enthusiasts using the stuff, that is.

I write this using KDE/Arch on an antique ThinkPad, by the way (yeah, I postponed replacing my MBP too long) :eek:
 
I hope they make it even smaller by using Thunderbolt power delivery to power it instead of an internal PSU. GaN chargers are tiny now; it could be powered from the Studio Display with a single cable.
 
I hope they make it even smaller by using Thunderbolt power delivery to power it instead of an internal PSU. GaN chargers are tiny now; it could be powered from the Studio Display with a single cable.
Yes, it seems like the time is right for that. It would still come with a power supply in the box, but it would be similar to the one that comes with the MacBook Air, for example.
 
You might be surprised how much load non-tech users put on their devices. Non-tech users will open up app after app, web page after web page, photos, music players, and on and on, never thinking that maybe they should shut some down.

What does that do to an 8GB RAM device? It starts to cause a lot of memory swapping to disk. What's the result of that? SSD failure much sooner, from the heavy usage it gets. 16GB of RAM cuts the memory-to-disk swapping down dramatically. Who likes to pay an extra $200 for 8GB more RAM, when Apple could include it in the base model?

In 2024, ALL users should be sold a Mac with 16GB of RAM out of the box.
There are no good reasons not to include it.
 
You might be surprised how much load non-tech users put on their devices. Non-tech users will open up app after app, web page after web page, photos, music players, and on and on, never thinking that maybe they should shut some down.

What does that do to an 8GB RAM device? It starts to cause a lot of memory swapping to disk. What's the result of that? SSD failure much sooner, from the heavy usage it gets. 16GB of RAM cuts the memory-to-disk swapping down dramatically. Who likes to pay an extra $200 for 8GB more RAM, when Apple could include it in the base model?

In 2024, ALL users should be sold a Mac with 16GB of RAM out of the box.
There are no good reasons not to include it.
I have a 2020 8GB M1 Mini and the SSD still hasn't blown up.
 
There appears to be a wide consensus that the upgrade to a 16GB baseline is an “about time, overdue update” and/or a general necessity for [generic] AI functionality.

I keep wondering if it is something more specific: that there is a specific basic function (or two) of Apple Intelligence (as opposed to AI in general) whose performance was found unacceptable during testing of OS 18.1.

To me, that would seem a bigger motivator to drive Apple’s update than just “Apple needed to keep up with industry standards”.
 
Oh, I should have guessed, lol. For some reason that escaped me.

Here's a video you might like then:

Ah OK. So what size is your SSD? They have a finite life, you know.

OK - I'll have a rant. It's not aimed at you. Don't take it personally!

In a typical Windoze notebook, in order to add another drive (like with the new M$ Snapdragon units) you have to turn the computer off, open the flap and insert the drive, then restart and format it. Steve Jobs-style user friendly.

I'm guessing many people here think an SSD change in a Mini, MacBook or Studio is really easy? They should have a look at the thread here on trying to upgrade Studio SSDs. And guess what - they are far from normal drives. Inside the MacBook Pros, even the angle sensor for the screen is tied to the serial number of each individual notebook. Same for the screen - if you crack your screen, only Apple can get it fixed. Same for the fingerprint sensor. The list goes on... That I'm tempted to buy a high-end Apple machine is really crazy - the cheapest ones make a lot more sense. You are certainly clever in that regard; the cheap ones make more sense, it seems to me. But the saving grace in a costly one might be annual, perpetual AppleCare. It's probably why Apple have gone to such lengths to stop servicing. The only logical solution is annual AppleCare.
 
Ah OK. So what size is your SSD? They have a finite life, you know.

OK - I'll have a rant. It's not aimed at you. Don't take it personally!

In a typical Windoze notebook, in order to add another drive (like with the new M$ Snapdragon units) you have to turn the computer off, open the flap and insert the drive, then restart and format it. Steve Jobs-style user friendly.

I'm guessing many people here think an SSD change in a Mini, MacBook or Studio is really easy? They should have a look at the thread here on trying to upgrade Studio SSDs. And guess what - they are far from normal drives. Inside the MacBook Pros, even the angle sensor for the screen is tied to the serial number of each individual notebook. Same for the screen - if you crack your screen, only Apple can get it fixed. Same for the fingerprint sensor. The list goes on... That I'm tempted to buy a high-end Apple machine is really crazy - the cheapest ones make a lot more sense. You are certainly clever in that regard; the cheap ones make more sense, it seems to me. But the saving grace in a costly one might be annual, perpetual AppleCare. It's probably why Apple have gone to such lengths to stop servicing. The only logical solution is annual AppleCare.
I don't worry one iota about my Mini blowing up.
 
I don't worry one iota about my Mini blowing up.
But you're looking at the next version, so ...

You didn't give your spec either.

DriveDx is a tool for looking at its usage. If you do video work, the SSD will fail considerably earlier. RAM usage and capacity also affect the life of an SSD. A 500 GB drive is rated for around 150-300 TBW, a 1 TB drive 300-600 and a 2 TB drive 600-1200 TBW. I wouldn't worry either if I hardly used the mini. But my 2011 MacBook Pro's SSD failed when I took it out of storage, and it had only been used for 2.5 years. It was, I think, a 128 GB drive with 4 GB RAM. I needed it for downloading Firewire videos off Firewire digicams. I put in a 1 TB drive and added 16 GB RAM - it took 20 minutes, along with a new battery. Evidently it can run Sonoma using OpenCore (but Firewire wouldn't work, I think, let alone the uploading software).
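For a rough sense of what those TBW ratings mean in practice, here is a back-of-the-envelope sketch. The TBW figures match the ranges quoted above; the daily write volumes are illustrative guesses, not measurements of any real workload.

```python
# Rough lifetime estimate from a drive's TBW (terabytes written) rating.

def years_of_life(tbw_rating_tb, gb_written_per_day):
    """Years until the endurance rating is nominally exhausted."""
    days = (tbw_rating_tb * 1000) / gb_written_per_day
    return days / 365

# A 500 GB drive at the low end of its rating (150 TBW), light use:
print(round(years_of_life(150, 10), 1))   # ≈ 41.1 years at 10 GB/day
# The same drive hammered with 200 GB/day of video scratch plus swap:
print(round(years_of_life(150, 200), 1))  # ≈ 2.1 years
```

The point being: at typical light-use write rates the rating outlives the machine, but heavy video work or constant swapping can compress it into a few years, which is consistent with the "video work kills SSDs sooner" claim above.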

I wonder how our 2020 M1 minis will be going in 2033?
 
But you're looking at the next version, so ...

You didn't give your spec either.

DriveDx is a tool for looking at its usage. If you do video work, the SSD will fail considerably earlier. RAM usage and capacity also affect the life of an SSD. A 500 GB drive is rated for around 150-300 TBW, a 1 TB drive 300-600 and a 2 TB drive 600-1200 TBW. I wouldn't worry either if I hardly used the mini. But my 2011 MacBook Pro's SSD failed when I took it out of storage, and it had only been used for 2.5 years. It was, I think, a 128 GB drive with 4 GB RAM. I needed it for downloading Firewire videos off Firewire digicams. I put in a 1 TB drive and added 16 GB RAM - it took 20 minutes, along with a new battery. Evidently it can run Sonoma using OpenCore (but Firewire wouldn't work, I think, let alone the uploading software).

I wonder how our 2020 M1 minis will be going in 2033?
You do the worrying about your SSD blowing up. I'll remind the doomsayers every year that my Mini still hasn't blown up. Eventually I will be wrong. Maybe in 2033.
 
You do the worrying about your SSD blowing up. I'll remind the doomsayers every year that my Mini still hasn't blown up. Eventually I will be wrong. Maybe in 2033.
It won't be supported in 2033, so Apple doesn't think it matters anyway.

Apple didn't keep older Macs going - that was due to volunteer communities, many people here. OpenCore Legacy Patcher allowed us to install and run the latest macOS versions on our Macs by modifying system files and injecting the necessary drivers - all thanks to the incredible volunteer developers and macOS modding communities who live on this site and some others, like the Dortania guys especially, for OpenCore and their great guides on installing macOS on unsupported Mac hardware. See the OpenCore Legacy Patcher GitHub.

It's best for you to keep your head in the sand; it's more comfortable that way. And your M1 won't be supported in 2033 - its support will stop many years earlier. Apple of course is not alone in this trend, but they lead the non-fixable brigade of hardware makers. "Don't fix it, replace it" is their policy.
 
It won't be supported in 2033, so Apple doesn't think it matters anyway. Apple didn't keep older Macs going - that was due to volunteer communities, many people here. Keep your head in the sand, it's much safer that way.
I can’t help it that my Mini still works.
 
Can the current M-series minis boot from an external drive? If it's got decent Thunderbolt ports I may buy an M4 mini with expanded memory but the default SSD, with the plan to normally just run it off an external. Leave the factory install on the internal as a fallback. I'd probably need some advice on at what point buying a Mac Studio makes more sense, though, assuming that gets updated early next year. Someone remind me where the cut-over point is in the current lineup, anyway, because the current lines might be a lot cheaper after the new ones come out, heh.
 
OK, so we can run an external SSD if something happens internally (hopefully). Also, wouldn't it be more prudent to have an external power supply on the new M4 mini? Why do so many people place so much emphasis on aesthetics on the desktop and whine that they would prefer an internal PSU? Power supplies always go first on any Mac I have had. It drives me nuts.
 
OK, so we can run an external SSD if something happens internally (hopefully). Also, wouldn't it be more prudent to have an external power supply on the new M4 mini? Why do so many people place so much emphasis on aesthetics on the desktop and whine that they would prefer an internal PSU? Power supplies always go first on any Mac I have had. It drives me nuts.
I'd rather have an external brick. Imagine how much hotter our laptops would be, how much extra cooling they would need, and how thick they would be if they had to have the power supply internal. Also imagine having to take the mini in for servicing because of a bad PSU. An external brick for the mini would just be air-cooled, and you already have to run that cable anyway. Of course, if it supported being powered by USB PD then it'd just be that one cable.
 