
Disappointed with Mac Pro 2023?


  • Total voters
    534

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Yes you did. Not if it already proves you wrong. Clearly, bringing an iGPU to a Mac Pro-grade computer is beside the point.
So it's wrong that I am now able to do some Photoshop work on an iGPU, utilize QuickSync on an iGPU, and play some games on an iGPU? Please research the issue, because this is not wrong; it is a fact. The past few years, even just the last four, have brought a SIGNIFICANT increase in what iGPUs can do.
 
  • Like
Reactions: Colstan and VivienM

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Could how you articulate your position, and your persistence in providing unsolicited help to others, be indications that your opinions on their computing habits are tiring to them?

If they wanted your help, they would have asked for it and acted upon it.

Their polite silence may be a sign of: "I love this fella, but I'm too Canadian to be impolite about his efforts."

I've discovered that if I gift them a new device, use Migration Assistant to move their data, and remove the decommissioned device from their reach, they will gladly use an updated machine.

Microsoft putting a limit on which devices Windows 11 (2021) supports reduces hardware-support overhead to machines made from 2017 to today. Seeing as Windows is now perpetually "free," they need to cut costs elsewhere.

Charging a fee for extended support works with large enterprises, but not with consumers and SMEs, who would likely get a hacked copy that may be compromised, resulting in bad press for Windows 10 or Windows 11.

Again, with no intention of hurting feelings: it isn't worth their time.

Money's better spent on Vision Pro and Bing's ChatGPT integration.

People in their circumstances don't care about support, except for the village nerd who cares about it too much.

Imagine the outrage at TenFourFox's dev abandoning the 2,000+ PowerPC users who depend on his free labor for a modern browser for the modern Internet.
My persistence in debating with you is a separate issue. I don't know where you get off on somehow psychoanalyzing someone you've never met and expressing harsh judgments on situations you have not witnessed based on 100 paragraphs on a forum, simply because... honestly, I don't even know what we are disagreeing on anymore. A day ago, I thought you thought everybody I know should be eagerly buying new computers every 7-10 years. And if anything, you seemed to be saying I should be more persistent in selling them on the benefits of new systems rather than complaining that Apple/Microsoft should offer security updates for their old ones.

Now you seem to be saying the opposite and I am some kind of rude horrible self-centered monster imposing my preferences on others for wanting my friends/family to have computers with security updates!

I don't get it. Yesterday, according to you, the problem was that they didn't see the benefits of new computers or were too poor to act on those obvious benefits; today the problem is that I'm persistently trying to sell them technology they don't want.

Since, apparently, i) it's wrong to hope for OS vendors to provide longer software support, ii) it's self-centered and persistent to suggest to people that they need newer computers for security reasons, and iii) nothing other than hardware failure will make them spontaneously want a new computer before some of those OS vendor deadlines, what is the proper course of conduct, according to you? Shut up until their computer is full of malware two years later, then send them to Geek Squad when they call about the malware? I'd like to think only a sociopath is comfortable sending anyone, let alone friends and family, to Geek Squad!

As for the TenFourFox example, I don't know what to say. I tried running his browser when I got my G4, and I found the performance unusable. Obviously not the developer's fault - the modern web is an insane thing. But I don't understand how someone could i) be relying on it, and ii) be upset that he wouldn't maintain it anymore. Maybe the performance is a lot better on a G5.

That being said... if someone who bought a G5 in late 2005 had bought a spaceheater Pentium D running Windows (XP) the same day instead, I think there's a very good chance they'd be crawling the latest version of Chrome or Firefox on Windows 10 today and not relying on the mad technical wizardry of one dude to have a modern web browser. I don't have a Pentium D around to test, and I am not sure if something would prevent Windows 10 22H2 on a PCI-Express-based Pentium D system. There are Geekbench results for Pentium Ds running Windows 10 of various flavours, so I suspect it works. Something to think about when we ponder technology lifecycles, isn't it? The G5 hit an 'artificial' end over a decade ago by Apple and web browser vendors ending support (or, in Chrome's case, never supporting PPC); the Pentium D would hopefully have reached a natural obsolescence point when it became too slow to be usable, but it's not going to hit any artificial ends until October 2025, twenty years after it was sold, and even then, Chrome/Firefox won't drop Windows 10 support until at least a little while after Microsoft does.

And actually, look at this - https://browser.geekbench.com/v6/cpu/756371 . Someone got Windows 11 running on a Pentium f***ing D. So... I guess the Pentium D could be crawling on the modern web until 2028-2029 at least on Windows, possibly longer on Linux or other operating systems. That's nuts. But no more nuts than expecting TenFourFox to be supported in 2023.

Who knows what will happen with the Apple silicon transition - I note that web browser vendors today seem to support High Sierra, so who knows when they'll drop macOS-on-Intel. Hopefully there won't be a group of Intel diehards reliant on another mad wizard programmer for a web browser in 2029.
 
  • Love
Reactions: turbineseaplane

theluggage

macrumors G3
Jul 29, 2011
8,011
8,444
It's all about the economies of scale. If 'good enough' costs $400 and 'better' costs $600, way more people will buy the 'good enough' for $400.

"Good enough" is also about passing the point of diminishing returns: if a phone battery won't see you through a typical day, you'll pay extra for something better. Once it can get through a day, packing a charger in your overnight bag is no big deal. Human vision & hearing isn't getting more acute: if you're not going to have your living room dominated by a 50"+ TV, 4k is barely worth it, 8k (for TV/movies) is going to be strictly for "serious" home cinema. A 5k 27" monitor isn't night-and-day better than 4k (at least on Windows or Linux GUIs that have properly resolution-independent GUIs). If a video game runs at 12fps it's gonna be jerky - but you've been happily watching 24 fps movies for years and once you get over 50fps you'll only notice improvements in side-to-side comparisons. Distributing audio at over 16bit/48kHz is a waste of space, as is "prosumer" production at more than 24bit/96kHz. Until, maybe, the mid 00s technology was catching up with these thresholds but now, increasingly, they're being left behind and "bleeding edge" tech is for increasingly specialised markets.

Apple's forthcoming goggles might be a good bet to start a new 'arms race' in terms of the level of technology you need just to open a video message. If they're not a complete flop, shaving a few ounces off the weight or adding an extra half-hour of battery life should have punters coming back for annual upgrades for the rest of the decade...

My guess is that the iPhone mini fans are old dinosaurs like us with their iMacs at home, etc, who actually have... fairly low demands...
I would actually expect big phones to be even more popular in Androidland than iPhoneland, simply because I'm sure there are a LOT of people in Androidland for whom the phone is the primary computing device.
On the other hand, Apple has completely ceded the low end of the market to Android - and 4" Android phones tend to go for under $200. If you just want a phone for texts, email alerts (to be read on your laptop), or even (god help you) making phone calls, or maybe as a backup phone, then you're probably going to get a cheap Android.

As for Jobs... well, you cheer for the horse you have backed. I don't think he was alone in being skeptical about Phablets. I remember dithering between the iPhone 5 and a Galaxy Note II and they were certainly both "courageous decisions" by their makers - the iPhone 5 with its unusually tall, skinny screen designed for one-hand operation (I'm sure Steve's inner hippie was pitching for 1:4:9!) vs the huge - for the time - Galaxy. Turns out, Samsung won that game of rock, paper, scissors, but I don't think that was a foregone conclusion at the time.
 

Abazigal

Contributor
Jul 18, 2011
20,392
23,890
Singapore
As for Jobs... well, you cheer for the horse you have backed. I don't think he was alone in being skeptical about Phablets. I remember dithering between the iPhone 5 and a Galaxy Note II and they were certainly both "courageous decisions" by their makers - the iPhone 5 with its unusually tall, skinny screen designed for one-hand operation (I'm sure Steve's inner hippie was pitching for 1:4:9!) vs the huge - for the time - Galaxy. Turns out, Samsung won that game of rock, paper, scissors, but I don't think that was a foregone conclusion at the time.
I think sometimes, you just end up having to follow the market.

For instance, I remember Steve Jobs wasn't a fan of music streaming, and the reasons he gave were spot on, but Apple would eventually have to enter that market as well because Spotify had opened Pandora's box and there was really no going back to individual music downloads. He was right that it's probably better in the long run to just own your music, and that there was little money in streaming for any of the parties involved, but sometimes you just can't fight the tide.
 

chucker23n1

macrumors G3
Dec 7, 2014
9,091
12,112
you cheer for the horse you have backed

Yep.

For example, he publicly argued that watching video on an iPod was pointless because the screen was too small, but at that point (or soon after), they were already developing an iPod that supported video.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
My persistence in debating with you is a separate issue. I don't know where you get off on somehow psychoanalyzing someone you've never met and expressing harsh judgments on situations you have not witnessed based on 100 paragraphs on a forum, simply because... honestly, I don't even know what we are disagreeing on anymore. A day ago, I thought you thought everybody I know should be eagerly buying new computers every 7-10 years. And if anything, you seemed to be saying I should be more persistent in selling them on the benefits of new systems rather than complaining that Apple/Microsoft should offer security updates for their old ones.
Each time I interact with you, I try to think of the myriad reasons why the people who matter to you resist what you want them to do for their own good.

I apologise if it comes off as uncouth, but you have to wonder why you are so preoccupied with something that others consider unimportant to them.
And actually, look at this - https://browser.geekbench.com/v6/cpu/756371 . Someone got Windows 11 running on a Pentium f***ing D. So... I guess the Pentium D could be crawling on the modern web until 2028-2029 at least on Windows, possibly longer on Linux or other operating systems. That's nuts. But no more nuts than expecting TenFourFox to be supported in 2023.

Who knows what will happen with the Apple silicon transition - I note that web browser vendors today seem to support High Sierra, so who knows when they'll drop macOS-on-Intel. Hopefully there won't be a group of Intel diehards reliant on another mad wizard programmer for a web browser in 2029.
Microsoft's simplified cutoff (14nm Intel chips released after Sep 2017 and 12nm AMD chips released after Apr 2018) is in the hope of reducing hardware support costs, excising legacy Windows code relevant only to older PCs, and providing a better Windows 11 (2021) user experience.

As most users cannot distinguish or understand the nuances of a 2009 45nm Core 2 Quad versus a 2006 65nm Pentium D, it is simpler to cut off by chip generation, in a way any layman PC user can understand.

Apple has it easier, as they can point to the year models of a product line. The year-model approach also communicates to the buyer that their product is already about a decade old. Hopefully this translates into the purchase of an Apple Silicon Mac with a useful life of another decade.

Requiring 2017-or-newer hardware for a 2021 OS is not that unreasonable, considering Windows 11's end of support will likely occur by Dec 2031, about 122 months after release. That means hardware more than a dozen years old entering 2032.

Windows Vista (2007) was widely mocked by a lot of Windows users for how awful it ran on "Vista Capable" hardware with raw performance in the ballpark of a 1997 Pentium II. That includes chips with raw performance similar to the 1999 Pentium III, the 1998 Celeron, the AMD Athlon (1999 K7 and 2000 Thunderbird), the 1997 K6, 1998 K6-2, 1999 K6-III, and the 1996 AMD K5, with or without SSE.

If Microsoft had made your 2009 45nm Core 2 Quad the Vista minimum system requirement, odds are Vista would be remembered far, far more favorably.

2007 was when PC replacement cycles were starting to stretch from the every-three-years norm of the '90s to a few months longer than that. The 4-6 year upgrade cycle that became the norm before 2016 had not arrived yet.

Watching this video on Windows Vista being that "bad" reminded me of this conversation, and of why Microsoft and Apple need to put a practical hard limit on what hardware can run their software or face a litany of complaints and bad press. That is incidentally the reason Microsoft offered a free Windows 10 upgrade to everyone: to avoid the bad press of malware on abandoned older Windows versions.


Apple providing 9-11 years of macOS support for 2007-2017 Intel Macs is very, very reasonable. It surpasses Android support, which lasts 2-5 years tops, and that's despite Android shipping ~1 billion units annually while macOS ships <29 million units annually.

Would anyone want to hazard a guess at the user experience of running 2023 macOS Sonoma on a 65nm 2007 iMac via OCLP? A number of Sonoma's features are disabled even on 2020 Intel Macs, as they lack the specialized cores and media engines Sonoma depends on for efficient processing. Who wants to use a Mac whose fans run at full throttle from the moment it powers on because it's that underpowered, old, and full of dust largely made up of dead human skin and dead insects?

My pointing to the PowerPC users on TenFourFox was to emphasize how many actual users are still on the Internet with pre-2006 hardware.

Let us assume there are about 2,000 C2Q chip users online worldwide. Is it worth Microsoft or Apple's time to support them?

That is almost as bad as ~20% of the ~15,000 Mac Pro buyers per year demanding Apple re-engineer the Ultra chips to accommodate swappable SoCs, RAM, dGPUs, eGPUs, SSDs, and logic boards just for them.
 
Last edited:

duffman9000

macrumors 68020
Sep 7, 2003
2,331
8,089
Deep in the Depths of CA
And you are justifying ditching and shrinking the pro market for what? Besides, you don't represent all of us.
I think the writing has been on the wall for many years now. Apple probably knows where the majority of its income is coming from and this isn’t it. If a developer needs a high end Nvidia or AMD GPU this won’t fit their needs and Apple already knows this.

If Apple doesn't fit a developer's needs, then dump Apple.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
I think the writing has been on the wall for many years now. Apple probably knows where the majority of its income is coming from and this isn’t it. If a developer needs a high end Nvidia or AMD GPU this won’t fit their needs and Apple already knows this.

If Apple doesn't fit a developer's needs, then dump Apple.
They could outsource the AMD/Nvidia dGPU requirements to the cloud.

Cashflow-wise it would be cheaper, and when newer dGPUs become available you can easily subscribe to them.

That is how workflows are changing over time, with tech driving prices down.

Why buy a $1,599 dGPU when you can just rent one from the cloud?
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Why buy a $1,599 dGPU when you can just rent one from the cloud?
Latency, transfer times, you name it... No way is it going to be as fast as local, and may not even be cheaper.

I can understand it for short term projects, but long term, no way is that a way to go.
 
  • Like
Reactions: Longplays

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
I think the writing has been on the wall for many years now. Apple probably knows where the majority of its income is coming from and this isn’t it. If a developer needs a high end Nvidia or AMD GPU this won’t fit their needs and Apple already knows this.

If Apple doesn't fit a developer's needs, then dump Apple.
Wow, that's the worst way to answer the problem. Apple is the one who ditched the pro market and made it worse. Is it really wrong to criticize them? Seriously? Just dump Apple? You're only defending Apple.
 

duffman9000

macrumors 68020
Sep 7, 2003
2,331
8,089
Deep in the Depths of CA
Wow, that's the worst way to answer the problem. Apple is the one who ditched the pro market and made it worse. Is it really wrong to criticize them? Seriously? Just dump Apple? You're only defending Apple.
I’m not defending Apple at all. I think there is a major language barrier if you think I’m defending Apple. Pros have been complaining for years about Apple abandoning their Pro users. If you don’t like Apple’s direction dump Apple and look for something else.
 
  • Love
Reactions: Longplays

chucker23n1

macrumors G3
Dec 7, 2014
9,091
12,112
As most users cannot distinguish or understand the nuances of a 2008 45nm Core 2 Duo versus a 2006 65nm Pentium D, it is simpler to cut off by chip generation, in a way any layman PC user can understand.

For the software developers, it also makes more sense. Suppose the C2D introduced AVX2. (It probably didn’t; that’s not the point.) Without the cutoff, developers either a) couldn’t use a compiler’s auto optimization or b) would have had to provide two binaries (which, while somewhat easy in macOS, becomes unwieldy and bloats disk space at some point). But if all Macs can be assumed to have it? You simply compile against that microarch.
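To make that concrete, here's a rough sketch (hypothetical code, GCC/Clang-specific built-ins, and function names I made up): without a guaranteed baseline you carry a runtime check plus a fallback path, while with the cutoff you would just build the whole binary with -mavx2 (or -march=x86-64-v3) and delete the dispatch.

```c
/* Sketch only: what "compile against that microarch" buys you.
 * Without a guaranteed AVX2 baseline, a single binary needs a runtime
 * feature check and a fallback; with the baseline, you build everything
 * with -mavx2 / -march=x86-64-v3 and this dispatch disappears. */
#include <stdio.h>

__attribute__((target("avx2")))          /* compiler may vectorize this with AVX2 */
static long sum_avx2(const int *v, int n) {
    long s = 0;
    for (int i = 0; i < n; i++) s += v[i];
    return s;
}

static long sum_scalar(const int *v, int n) {  /* portable fallback path */
    long s = 0;
    for (int i = 0; i < n; i++) s += v[i];
    return s;
}

int main(void) {
    int v[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    long s = __builtin_cpu_supports("avx2")   /* runtime CPU feature check */
                 ? sum_avx2(v, 8)
                 : sum_scalar(v, 8);
    printf("sum = %ld\n", s);
    return 0;
}
```

Multiply that by every hot loop and you can see why a hard hardware cutoff is attractive to the people shipping the binaries.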
 
  • Like
Reactions: Longplays

mcnallym

macrumors 65816
Oct 28, 2008
1,210
938
Wow, that's the worst way to answer the problem. Apple is the one who ditched the pro market and made it worse. Is it really wrong to criticize them? Seriously? Just dump Apple? You're only defending Apple.
That's the way Western economies work. You buy products that meet your needs.

If Brand X's products don't meet your needs, then you switch to Brand Y, whose products do.

There is a reason Nvidia doesn't still bang on about Apple not signing their drivers anymore.

End users who require software that depends upon Nvidia GPU technologies, i.e. machine learning with CUDA, aren't going to stop buying Nvidia GPUs just because they are not supported in macOS.

Instead, Nvidia knows those users will still buy its GPUs and put them in a Linux or Windows PC, so it redeployed the developers who had spent time on macOS drivers.

It also isn't Apple's problem to fix. Apple doesn't have a problem just because its products don't meet your requirements. There are people whose requirements they do meet, and those people will buy the product.

Steve Jobs said they identify a set of requirements for a product and cater to a target market: develop a solution for those requirements, and if it meets your requirements, fantastic; if it doesn't, then the product is not for you.

Going by the listed uses Apple developed the Mac Pro 2023 for, if your usage isn't in there, the product is not suited to you and you need to find another solution.

That solution may be to hang on to a Mac Pro 2019, but you're eventually going to need a long-term solution.

And yes, it is perfectly fine to criticise Apple; however, it is still down to people to find a solution for their requirements.
 

AlphaCentauri

macrumors 6502
Mar 10, 2019
291
457
Norwich, United Kingdom
Wow, that's the worst way to answer the problem. Apple is the one who ditched the pro market and made it worse. Is it really wrong to criticize them? Seriously? Just dump Apple? You're only defending Apple.
Dumping Apple = criticising them. You vote with your wallet and choose other options, which are better for you, in that case - PC Workstation.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
That's the way Western economies work. You buy products that meet your needs.

If Brand X's products don't meet your needs, then you switch to Brand Y, whose products do.

There is a reason Nvidia doesn't still bang on about Apple not signing their drivers anymore.

End users who require software that depends upon Nvidia GPU technologies, i.e. machine learning with CUDA, aren't going to stop buying Nvidia GPUs just because they are not supported in macOS.

Instead, Nvidia knows those users will still buy its GPUs and put them in a Linux or Windows PC, so it redeployed the developers who had spent time on macOS drivers.

It also isn't Apple's problem to fix. Apple doesn't have a problem just because its products don't meet your requirements. There are people whose requirements they do meet, and those people will buy the product.

Steve Jobs said they identify a set of requirements for a product and cater to a target market: develop a solution for those requirements, and if it meets your requirements, fantastic; if it doesn't, then the product is not for you.

Going by the listed uses Apple developed the Mac Pro 2023 for, if your usage isn't in there, the product is not suited to you and you need to find another solution.

That solution may be to hang on to a Mac Pro 2019, but you're eventually going to need a long-term solution.

And yes, it is perfectly fine to criticise Apple; however, it is still down to people to find a solution for their requirements.
Guess what? A lot of Mac users have already proved that Apple is doing it wrong. The Mac Pro 2013 is a great example, and yet Apple learned nothing. Seriously, is it really that hard to understand their failure?
 

AlphaCentauri

macrumors 6502
Mar 10, 2019
291
457
Norwich, United Kingdom
Guess what? A lot of Mac users have already proved that Apple is doing it wrong. The Mac Pro 2013 is a great example, and yet Apple learned nothing. Seriously, is it really that hard to understand their failure?
We do understand their "failure". I was disappointed with the new Mac Pro myself.

It is you who don't seem to understand that expressing your disappointment on an internet forum about a trillion-dollar company not making a computer for your needs is pointless.

Or perhaps you think that the 0.1% of Apple desktop users affected by this are going to persuade Apple to change its ways if they all shout loudly enough?

If so, you must be smoking some really good stuff 🤣🤣🤣
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,520
19,671
It is you who don't seem to understand that expressing your disappointment on an internet forum about a trillion-dollar company not making a computer for your needs is pointless.

The funniest part is that they don't even need it. I gave up a long time ago trying to understand what sunny5's point actually is. At the end of the day it all seems to boil down to "Apple sucks because they don't use AMD", give or take.
 

AlphaCentauri

macrumors 6502
Mar 10, 2019
291
457
Norwich, United Kingdom
The funniest part is that they don't even need it. I gave up a long time ago trying to understand what sunny5's point actually is. At the end of the day it all seems to boil down to "Apple sucks because they don't use AMD", give or take.
Yeah. They do suck, for the small subset of desktop Mac users who need dGPUs and large amounts of ECC RAM.

Which is such a small percentage of Mac Pro users (who are themselves a very small percentage of Mac desktop users) that Apple is not bothered with them being unhappy at all.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Guess what? A lot of Mac users have already proved that Apple is doing it wrong. The Mac Pro 2013 is a great example, and yet Apple learned nothing. Seriously, is it really that hard to understand their failure?
One could argue that the 2013 Mac Pro was a success because it remained unchanged until 2019.

It helped spawn the 2017 iMac Pro and the wildly successful 2022 & 2023 Mac Studio.
 

Homy

macrumors 68030
Jan 14, 2006
2,506
2,458
Sweden
I’m not defending Apple at all. I think there is a major language barrier if you think I’m defending Apple. Pros have been complaining for years about Apple abandoning their Pro users. If you don’t like Apple’s direction dump Apple and look for something else.

But then people wouldn't have anything to complain about on the MR Mac forums, despite not owning or having any intention of buying such products. Must vent that "crisis/failure/sucks" steam every time there is an opportunity.
 
Last edited:

TechnoMonk

macrumors 68030
Oct 15, 2022
2,604
4,112
They could outsource the AMD/Nvidia dGPU requirements to the cloud.

Cashflow-wise it would be cheaper, and when newer dGPUs become available you can easily subscribe to them.

That is how workflows are changing over time, with tech driving prices down.

Why buy a $1,599 dGPU when you can just rent one from the cloud?
Cloud is expensive if you have a consistent workload. I use the cloud for training, but running inference can get very expensive in the cloud. 3-4 months of cloud costs pays for the GPU.
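As a rough illustration of that break-even point (the $1,599 figure comes from earlier in the thread; the hourly rate and utilization are assumptions for the sake of the sketch, and real cloud pricing varies a lot):

```c
/* Rent-vs-buy break-even sketch with assumed numbers.
 * $1,599 purchase price comes from the thread; the ~$1.50/hour cloud GPU
 * rate and 8 hours/day of steady use are assumptions for illustration. */
#include <stdio.h>

int main(void) {
    double gpu_price      = 1599.0;  /* local dGPU purchase price         */
    double cloud_per_hour = 1.50;    /* assumed on-demand cloud GPU rate  */
    double hours_per_day  = 8.0;     /* assumed steady daily workload     */

    double monthly_cloud = cloud_per_hour * hours_per_day * 30.0;
    printf("cloud cost per month:  $%.0f\n", monthly_cloud);
    printf("months to break even:  %.1f\n", gpu_price / monthly_cloud);
    return 0;
}
```

With those assumptions it lands around four and a half months, in the same ballpark as the 3-4 months quoted above; lighter or bursty workloads push the break-even out and make renting look better.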
 
  • Like
Reactions: singhs.apps