It's a sad day. Later today a 16in 2021 MacBook Pro will be delivered.
Why so sad, then?
It'll be time to retire my mid-2015 15in MacBook Pro (2.8GHz i7, 16GB RAM).
The best laptop I've ever owned. Hands down. It was the last good Intel MacBook Pro. After this, everything went wrong. Butterfly keyboard. Touch Bar. Hot-running Intel chips. Hashtag donglelife.
I've had it since the model was released, in April 2015. More than seven years. I bought it with future-proofing in mind. And I was right. For example, it can play 4K videos just fine, if I ever need it to (I don't).
Alas, it won't get the Ventura macOS release later this year. And macOS is getting a little sluggish, despite a fresh reinstall recently. I strongly suspect Apple simply isn't optimising for older machines any longer. For example, in the old days I could run an external monitor without the fans spinning up (in fact, at one point I had two external displays running plus mirroring to an iPad via Duet – and no fan spin at all).
Now, the fans get loud if I do just about anything with an external monitor attached other than web browsing. I'm pretty sure this is Apple's poor graphics driver implementation. It's been this way for a year or two.
The new 16in 2021 MBP is my next future-proofed laptop. 32GB of RAM, only because I couldn't bear to get 16GB again. M1 Max. Weirdly, on a pragmatic level, the new MBP is extremely similar to the old one – it's even the same size – just with newer and better tech. Apple drifted away from the ideal laptop, and has come back around just in time for my upgrade.
This new MBP should last me another seven years. Maybe more. I hope so because it's f***ing expensive.
It's weird to imagine what computing will be like in 2029. If it still exists, macOS will be packed full of machine learning for just about everything.
I read a deep technical dive into the M1 Max that said that the 400GB/s bandwidth could only be utilised if the CPU, GPU and media engines were all firing at the same time. Gee, I wonder what could ever require that? Maybe AR? Where the CPU controls the actual computing functions, the GPU draws a complex UI, and the media engines process the video of what you see...? It's like Apple's preparing for a future nobody has yet thought about. Not even the tech YouTuber bores who are always so sure they know what's going on.
So, thanks mid-2015 MBP 15in.