
TheMountainLife

macrumors 6502
May 24, 2015
355
458
Err, why did you even join this forum then?
I was also very late to join this forum. But I did it because I love Apple products, and have added more this year.

It does not make sense to me if you join the community just as you decide to drop out and switch to the PC?
*shrug* I'm a part of different subreddits and forums for things I'm interested in but may not own. It's a great thing when we can enjoy products but also expect more from them and hear how others use them in different ways. Everything doesn't have to be an echo chamber.
 

Kronsteen

macrumors member
Nov 18, 2019
76
66
I can certainly sympathise with some of the points @LambdaTheImpossible makes. The original post was quite thought-provoking, and caused me to wonder whether, all things considered, I am still happy with the Apple kit I own and have owned over the last 15 years. My conclusion is that I am, very much so. So perhaps I might offer an alternative view?

There have been times when I have felt mildly peeved by Apple’s prices and some of their products’ idiosyncrasies. That said, with a combination of 2008 and 2013 Mac Pros, a 2012 retina MacBook Pro, several iPads and, to some extent, a 2019 MacBook Pro, things I’ve happily done include some fun computational number theory on GPUs (quite rudimentary perhaps, but pleasing to me), using Mathematica, prototyping code for real-time capture & analysis of network traffic data and amusing myself with a variety of Python projects. Along with all of the usual, mundane “productivity” stuff, trying not to spoil my photographs in Lightroom or PixInsight and spending far too much time reading and playing chess on iPads. Using Linux virtual machines when appropriate (including, in an act of extreme silliness, running APL on Ubuntu/390 under Hercules in an Ubuntu VM). I find the ability to use such a range of software without any real difficulty on a single platform rather amazing. Perhaps I am unusual in not needing Windows (not even under Fusion or Parallels).

Could I have done all this on a mixture of Windows and Linux machines? Sure, and I would probably have spent less money. Was the 2013 Mac Pro perfect? Far from it, and I realise that for some purposes it was a pretty duff design. The 2019 MBP seems to get hot and to spin up the fans far more than I would like. But I have been able to do everything I wanted — and I still can — without drama and with no hardware problems aside from, eventually, after seven or eight years, cooking the dGPU on the 2012 rMBP (which was probably my fault for treating it like a workstation and letting it get too hot too many times). Oh, and the rMBP needed a new keyboard when I dropped it on hard ground while using it outside in the dark with my telescope. I rather think that was my fault, not Apple’s. It still worked, though (and didn’t mind the dew).

So, overall, I’m pretty satisfied with what the kit has enabled me to do and, perhaps more importantly, with what I have learned using it (although, given how long I tend to keep computers, I’m probably far from the ideal customer; I’m still very happy with my iPhone X, for example). And I am happy not to use Windows. I’m quite excited by the prospect of buying an M3 Max or Ultra machine soon (possibly very soon …. ;)) and am confident that I will get plenty of enjoyment from that, too. That doesn’t preclude the possibility of getting a reasonably high-end Nvidia GPU to experiment with although, given that even the M3 Max’s GPU will be four to five times more powerful than my current machines’ GPUs, I may not bother with that for now. The kit I have been fortunate to own has been more than good enough for me and, as I haven't had major problems, there has been no compelling reason to change. For me, life is too short to spend time worrying about whether the alternatives might be better (even though, in some respects, they may be). YMMV.
If I may, I’d like to add one point, concerning price, to the post I made yesterday (above).

So far as I can recall, my 2008 Mac Pro cost in the region of £2,000 (UKP). To me, fifteen years ago, that seemed like a lot of money. But it was a great machine that served me perfectly for 6+ years. I could say the same about my 2012 retina MacBook Pro (it lasted 7 years). The 2013 Mac Pro was flawed in several respects but, again, it did exactly what I needed it to: it enabled me to do the GPU-based programming that I wanted to and was also perfectly adequate for Linux-based prototyping (for which I would otherwise have needed another machine). And it still works perfectly, despite the amount of dust in my study.

Now, I bought the second Mac Pro in 2014 for around £4,000. If I go ahead and buy an M3 Max 16” MacBook Pro now, the spec I would like will cost …. just over £4,000. The GeekBench 6 numbers — not my actual workloads, granted, but adequate for a rough idea — suggest that, relative to the 2013 MP, this new machine will be over 5 times (CPU) and 4 times (GPU) as powerful. Getting that capability in a portable computer with an excellent display, for essentially the same price — without any inflationary increase — seems pretty reasonable to me.

And, yes, I have heard of Moore’s Law (although I rather think it’s coming to an end). For sure, I could cobble together an Intel/Nvidia machine that would be equally capable for less, but being able to move seamlessly to a new machine, retaining the same development environments and everything else that I’m familiar with, does have some value (for me). Perhaps I’m fortunate that everything I want to use seems to work perfectly well on a Mac (or in a Linux VM) and that I can readily use C++ and Python ….

So, so long as whatever M3 machine I buy is reliable, I’m pretty confident that I’ll get good value from it for 5+ years. Less than the cost of one fancy coffee per day. That’s my perspective, for what it’s worth. As I said yesterday, YMMV.
 

6916494

Cancelled
Jun 16, 2022
105
157
But neither Metal nor Swift is C; they are proprietary, which means Apple can change how this code is translated to C.
Metal is written in C++ and Objective-C.
The Swift compiler is written in C++; it is an LLVM-based compiler, and LLVM itself is written in C++.

Neither Metal nor the Swift compiler is proprietary, and their source code is on GitHub.

Don't know what you mean by "...translated to C". C++, C, Objective-C and Swift are all directly compiled to the respective target executable or library.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
And, yes, I have heard of Moore’s Law (although I rather think it’s coming to an end).
For as long as Apple can throw money at the problem, it's not coming to an end. But the greater advances come from chip design: all the little engines, each for a specialized task.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
Neither Metal nor the Swift compiler is proprietary. Their source code is on GitHub.

Don't know what you mean by "...translated to C". C++, C, Objective-C and Swift are all directly compiled to the respective target executable or library.
Swift 5.9 and Metal 3 can be extended to whatever Apple needs to make the most of their new silicon, and in absolute secrecy, without asking anyone else. Apple can't just unilaterally change C++ and release a new version of it; it's not their own language.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
Neither Metal nor the Swift compiler is proprietary. And their source code is on GitHub.

Metal is proprietary and closed source. Swift is open source, but Apple uses a private version with their own extensions.

As to the rest, I’m also confused what the poster you quoted is trying to say.
 

bgillander

macrumors 65816
Jul 14, 2007
1,024
1,047
Well, the point is my software is all platform-portable. But it breaks on macOS on a semi-regular basis. The same is not true for Windows.

The Docker thing I work around by running a VM in AWS, but that's an inconvenience I was willing to live with for the other gains. Turns out there weren't many.
This sounds like you very obviously made the correct choice for you. So much so that I'm not sure why you would even bother looking back and posting on MacRumors. I hope your current direction works best for you.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
As to the rest, I’m also confused what the poster you quoted is trying to say.
You're much better suited than me to explain how moving to C++ instead of "going their own way" with Swift would've limited Apple in developing their software platform. Good or bad, SwiftUI is how macOS works. Nobody wants to pay a premium for a Mac that looks and feels like any Linux distribution.
I get wanting to modernize everything; Obj-C is old but it works, but they could have just fixed Obj-C, moved to C++, adopted Rust or something, instead of going their own way with the mess that is Swift + SwiftUI right now.
Apple did fix Objective-C, and they gave it a new name: it's called Swift now.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
You're much better suited than me to explain how moving to C++ instead of "going their own way" with Swift would've limited Apple in developing their software platform.

Ah, sorry, I kind of missed the context of this entire discussion.


I get wanting to modernize everything; Obj-C is old but it works, but they could have just fixed Obj-C, moved to C++, adopted Rust or something, instead of going their own way with the mess that is Swift + SwiftUI right now.

I don't get this. C++ and Rust are arguably poor choices for developing applications (with C++ being outright terrible). The only problem with Swift is that it has to maintain runtime and semantic compatibility with Objective-C. Apple would have been better off if they had dropped all the OOP nonsense from the start and focused on value semantics instead (which is what Hylo is doing).

The simple fact is that Apple needed a language to do certain things, and no existing language fit the bill. So they made their own. Nothing wrong with that. That's how Rust came to be as well.
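The value-vs-reference distinction behind that point can be shown in any language. As a minimal sketch (in C++ rather than Swift, purely for illustration; both helper functions are hypothetical), this shows why value semantics keep mutations local while shared references leak them:

```cpp
#include <memory>

struct Point { int x; int y; };

// Value semantics: assignment makes an independent copy,
// so mutating the copy leaves the original untouched.
int mutate_copy(Point a) {
    Point b = a;   // independent copy
    b.x = 99;
    return a.x;    // still the original value
}

// Reference semantics: two handles alias one object,
// so a mutation through one is visible through the other.
int mutate_alias(int x0) {
    auto p = std::make_shared<Point>(Point{x0, 0});
    auto q = p;    // shared ownership of the *same* object
    q->x = 99;
    return p->x;   // sees the mutation
}
```

In Swift terms, `struct` assignment behaves like `mutate_copy` and `class` assignment like `mutate_alias`; making the first behaviour the default everywhere is roughly Hylo's pitch.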
 

jillpygok

macrumors member
Oct 25, 2023
56
81
I’ve gotten as far as 3 pages and I’m confused by the all-or-nothing theme. Yes, Apple keyboards and mice suck. Ergo, I use a trackball and a Logitech keyboard.

Yes, I don’t like Safari for work stuff. That’s why I use Edge on the Mac. I’ve always used Authenticator and used a combination of Parallels/Fusion and RDP to get stuff done in Windows. Plus I rely on homebrew and GCP/AWS.

I think Apple displays are overpriced. So I use LG 4k/5k. I find Xcode meh. So I have VSCode and Visual Studio with Resharper. I love paying for Apple Family plan for the 4 of us. Love News. Love Fitness. Love Music. Don’t like Apple TV. Like iCloud. Pay for M365 Family as well. Hate OneDrive unless it’s for Xbox captures for the kid. Love the productivity stuff. Hate Outlook. But like Outlook for the Web.

The goddamn point is it’s nice to have choice. Going all in with one platform is like putting all your investment eggs in one basket and then learning the hard way. You diversify even with technical choices and end up with a solution that works. For you.

After 20 years of building machines though, I’m done building machines. My old Windows tower is still a silent beast but I’m done with building machines, recompiling kernels and compiling from source. Give me prebuilt stuff. Hardware. Software. Binaries. Granted that Core ML kinda sucks but I’ll figure out a solution and stick to it. Windows isn’t usually bad but Windows 11 made me want to hit myself till I felt numb. That OS combined with Teams was a nightmare to use. On a freaking Lenovo workstation laptop.

The older I get, the more I want stuff to just work. I don’t care for rooting my Android phone or jailbreaking my iPad. I just want to get my stuff done so I can do stuff that matters more. Spend time with loved ones and find some solitude where I do nothing. Not googling for hours and trying to get Teams to work with the webcam that Zoom has no issue with. F that s**t.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
The simple fact is that Apple needed a language to do certain things, and no existing language fit the bill. So they made their own. Nothing wrong with that.
That's what I'm trying to say. All the perceived downsides of Swift are necessities to achieve important requirements like compatibility with Obj-C code and a smooth transition from Intel to Apple Silicon with a Rosetta 2 compatibility layer, and so on. All this only works (or works much better) if they design their own programming language(s) for their own needs. The lock-in effect is completely secondary and unavoidable.
 

simonscheurer

macrumors newbie
Aug 29, 2016
13
16
Interesting read. My main reason to stick with the Mac/iPhone ecosystem was and still is the OS and the ease of migration when moving to a new system.

Basic things simply work, while Windows, for instance, still struggles with proper scaling, and some dialogs still look like they did 10 years ago.

The switch from the Intel MBP 16 to the MBP 14 M1 Max was a game changer for me (software developer as well). No more burning legs, because the Intel machine ran so hot. Silence. Long hours without charging (I never left home without a charger before), and compile/bundle times less than half what they were before.

But you've got a point when it comes to RAM. I find the prices for standard users excellent given the build quality (my wife loves the M2 Air). But as a software developer you need more RAM. And the upgrade prices for RAM are indeed outdated and IMHO quite abusive. Pushing an M3 Max config to 64GB (hey, it's 2023) quickly exceeds 4,000 CHF (in Switzerland). I would have switched to an M3 Max already if not for the hefty price tag on RAM upgrades.
 

falainber

macrumors 68040
Mar 16, 2016
3,539
4,136
Wild West
Ah, sorry, I kind of missed the context of this entire discussion.




I don't get this. C++ and Rust are arguably poor choices for developing applications (with C++ being outright terrible). The only problem with Swift is that it has to maintain runtime and semantic compatibility with Objective-C. Apple would have been better off if they had dropped all the OOP nonsense from the start and focused on value semantics instead (which is what Hylo is doing).

The simple fact is that Apple needed a language to do certain things, and no existing language fit the bill. So they made their own. Nothing wrong with that. That's how Rust came to be as well.
I was under the impression that most great apps (Adobe PS being one example) are written in C++. There is nothing wrong with writing apps in C++ (I've developed quite a few) unless you are doing it wrong. Use the Qt framework (or some other proper framework) and you will avoid most of the typical issues (memory management, etc.).
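To make the memory-management point concrete: frameworks like Qt sidestep most manual `new`/`delete` by having parents own their children. This is not actual Qt code (Qt uses `QObject` parenting; the `Widget` class here is hypothetical), just a plain modern C++ sketch of the same ownership idea using `std::unique_ptr`:

```cpp
#include <memory>
#include <string>
#include <vector>

// Sketch of the parent-owns-children idea frameworks like Qt use:
// the parent holds owning pointers, so destroying the parent reclaims
// every descendant automatically -- no manual delete calls to forget.
struct Widget {
    std::string name;
    std::vector<std::unique_ptr<Widget>> children;

    explicit Widget(std::string n) : name(std::move(n)) {}

    // The parent takes ownership; the caller gets a non-owning handle.
    Widget* add_child(std::string n) {
        children.push_back(std::make_unique<Widget>(std::move(n)));
        return children.back().get();
    }

    // Count this widget plus all of its descendants.
    int count() const {
        int total = 1;
        for (const auto& c : children) total += c->count();
        return total;
    }
};
```

Letting the root widget go out of scope tears down the whole tree, which is most of what "avoiding the typical memory issues" in C++ amounts to in practice.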
 

johnmacward

macrumors 6502
Jul 12, 2011
374
286
My God, I thought this was going to be a whingy rant about how Apple makes crap, bring Steve back from the dead, etc. But I actually strongly agreed with a lot of the points, and the truth is most of what you said affects me too - certainly on the question of prices, which have left orbit in terms of ridiculousness. Same in terms of upgradability, but less so in terms of software, because for me that's rock solid and I don't really have any issues. Great post and thanks for taking the time to write it.
 

kc9hzn

macrumors 68000
Jun 18, 2020
1,824
2,193
The thing is built on top of Qt. Apple breaks Qt regularly with undocumented API changes and internal weirdness. For example, some VNA client software I use was stuck in a UI-thread deadlock due to a bug in Cocoa after a minor update to macOS. That's an actual macOS bug. I submitted it with profiler traces. Still not fixed after 2 years.

Stuff I wrote back in 2002 in C++ on windows still runs now. Unchanged with no problems and no recompile.
API changes to documented APIs where the changes are undocumented (which would be perceived as a minor change, one not expected to break compatibility), or changes to undocumented APIs? There's a big difference: there's no guarantee that the latter will remain consistent between security patches, let alone x.x.1 releases (to say nothing of major releases), and it's a developer's fault for using the latter. If it's the only way to do the job (as might be the case with Qt, I don't know), then you know that breakage will occur and you plan around it.

As for the specific case of UI threads deadlocking, every system I'm aware of uses single-threaded UI processes (it's possible that some obscure system like BeOS or Haiku has multithreaded UI processes). This sounds less like a system bug and more like a bug in the specific application (quite possibly a race condition, one of the trickiest types of bugs to debug). Of course, concurrency in general is not easy.
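For context on why "single-threaded UI" plus background work is tricky: the usual discipline is that worker threads never touch UI state directly and instead post closures for the UI thread's run loop to execute. Here is a deliberately minimal sketch in C++ (this is not how Cocoa or Qt implement their event loops; `EventLoop` is a hypothetical class for illustration only):

```cpp
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

// Minimal sketch of the single-threaded UI model: background threads
// post closures to a queue that one designated thread drains, so all
// "UI" mutations happen on that thread alone.
class EventLoop {
    std::queue<std::function<void()>> tasks_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
public:
    // Called from any thread: enqueue work for the loop thread.
    void post(std::function<void()> task) {
        { std::lock_guard<std::mutex> lk(m_); tasks_.push(std::move(task)); }
        cv_.notify_one();
    }
    // Ask the loop to exit once the queue is drained.
    void stop() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_one();
    }
    // Run on the designated "UI" thread until stopped and drained.
    void run() {
        for (;;) {
            std::unique_lock<std::mutex> lk(m_);
            cv_.wait(lk, [&] { return done_ || !tasks_.empty(); });
            if (tasks_.empty()) return;      // done_ set, nothing left
            auto task = std::move(tasks_.front());
            tasks_.pop();
            lk.unlock();
            task();                          // executed only on this thread
        }
    }
};
```

A deadlock like the one described often happens when the UI thread blocks waiting on a worker that is itself waiting to run something on the UI thread; posting work asynchronously, as above, avoids that cycle.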
 

kc9hzn

macrumors 68000
Jun 18, 2020
1,824
2,193
Yeah for digital art, there is nothing better. I'll give you that.

But for my purposes, which are academic note-taking via GoodNotes, paper worked out to be better than the supposed gains of doing this electronically. I just dump the notepad sheets in my AIO printer/scanner tray and out pops a PDF.

Fundamentally, apart from that, it's a crap laptop that you have to buy a really expensive keyboard for, and then put up with all the compromises of iOS to boot.
Even back in 2010, the iPad with a cheap stylus and a notebook app was already a huge improvement for handwritten notetaking for me, just the ability to tap to add a new page instead of making sure I had a fresh notebook was hugely beneficial. There’s a time and place for paper notes, but I’d rather use an Apple Pencil on an iPad Pro and have it natively digital instead of mucking about with scanners to make a PDF. Your PDFs probably aren’t searchable (unless you insert the extra step of running OCR), but your handwriting on iPadOS almost certainly is if you’ve got a notetaking app that uses the system libraries properly.
 

marstan

macrumors 6502
Nov 13, 2013
302
210
Basic things simply work.
Not in my case.

I have a mac mini m2 pro and several basic things don't work:

- The Apogee Duet audio driver unloads frequently, requiring a reboot
- Screen sharing into the mini from my M2 MBA doesn't work unless you first turn off screen sharing on the mini, reboot, then turn it back on. So if you ever reboot the mini without going through this procedure, screen sharing won't work. PITA
- Night Shift doesn't turn on automatically at sunset

I could go on. The reason I have preferred Apple is the integration of the hardware and software. When that stops working, then the rationale for the Apple ecosystem diminishes. That point is approaching for me.
 

seek3r

macrumors 68030
Aug 16, 2010
2,559
3,770
I don't, so I take it back -- but I first heard it was expiring sometime in 2021, and Microsoft now allows licensing Windows on Arm in an Apple VM, so I assumed it had expired.

It seems nothing has been officially announced one way or the other (except for being able to run WoA on something else).
Looks like the deal ends sometime next year, btw; this is an excerpt from the leak about Nvidia and AMD making ARM chips for future Windows machines:

“The high-level summary is that Microsoft has a contract with Qualcomm to develop ARM chips for Windows, according to Reuters, based on two people familiar with the matter. However, that contract ends in 2024”
 

120FPS

macrumors regular
Oct 26, 2022
174
206
We are in for exciting times. I wonder if history will repeat itself and Apple will find themselves, at some point, outpaced by the competition.
 


kc9hzn

macrumors 68000
Jun 18, 2020
1,824
2,193
We are in for exciting times. I wonder if history will repeat itself and Apple will find themselves, at some point, outpaced by the competition.
I doubt it; the A-series chips definitely still have a commanding lead over Qualcomm's Snapdragon processors in phones in terms of performance, and Apple's not standing still.
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
We are in for exciting times. I wonder if history will repeat itself and Apple will find themselves, at some point, outpaced by the competition.
I won’t say “never”, but as it currently stands, I’d say it’s unlikely for a few gens.

Nvidia is likely to continue outperforming Apple in GPUs and is the industry standard in AI (though how much of that is raw performance vs. library optimization I don't know), and AMD already has considerable success in the x86 space and has reportedly been investigating ARM production for years.

But Nvidia has only used ARM reference designs for its processors, and x86 success doesn't necessarily translate to ARM success. So they'll likely need some time to get up to speed.

Intel is going to have a real squeeze put on them, AMD is eating away at them in the x86 space, their graphics cards are good value but not super competitive, and their own fabs are behind TSMC.

The major advantage they have that Apple doesn't, though, is that OEMs have the choice of mixing and matching parts. There's no reason an AMD ARM CPU (too many acronyms!) couldn't be used with an Nvidia mobile GPU, or Qualcomm, etc. etc.

So we’ll see.
 