
Disappointed with Mac Pro 2023?


  • Total voters: 534

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
And do they even use the iGPU as a main GPU, or does it replace an external GPU? Seriously?
Yes, a lot of business computers up to about the $1,500 or $2,000 range use the integrated graphics from the processor. Five of my work computers used to have only integrated graphics, and they were not on the cheap side.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
Yes, a lot of business computers up to about the $1,500 or $2,000 range use the integrated graphics from the processor. Five of my work computers used to have only integrated graphics, and they were not on the cheap side.
Those are low-end computers; they're not what I mean by using an iGPU as the main GPU. Seriously, that's not a good example, and iGPUs have already been in use for a long time. What I'm asking is: are there any companies using an iGPU instead of an external GPU, especially for workstations?
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
Hard to tell - obviously, most of those processors that land in gaming/enthusiast machines will have discrete GPUs and the iGPU is possibly never ever used.

But if you looked at, say, all i7 desktops - who knows how many of those don't get a discrete GPU? I presume it must be a significant number, otherwise Intel would have removed the on-processor GPU from the higher-spec processors...

I thought it might have been different in AMD land, but it's not - a Ryzen 7900X3D, for example, also has on-CPU graphics.

I will say, having had to do some GPU troubleshooting on a vintage C2Q machine a little while ago, having on-CPU graphics available is not a bad thing, if only for troubleshooting and emergencies.
We are talking about high-end, high-performance chips. Why even bring up ordinary uses? Clearly you didn't get the point, especially since we are talking about the Mac Pro.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Those are low-end computers; they're not what I mean by using an iGPU as the main GPU. Seriously, that's not a good example, and iGPUs have already been in use for a long time. What I'm asking is: are there any companies using an iGPU instead of an external GPU, especially for workstations?
They do use the iGPU as the main GPU. I literally have one in use at an office site right now; it's connected to a monitor and does some processing on the iGPU. It can even play some older games with reasonable FPS.
 
  • Haha
Reactions: sunny5

Longplays

Suspended
May 30, 2023
1,308
1,158
Those are low-end computers; they're not what I mean by using an iGPU as the main GPU. Seriously, that's not a good example, and iGPUs have already been in use for a long time. What I'm asking is: are there any companies using an iGPU instead of an external GPU, especially for workstations?
The better question would be: are companies in industries where GPUs are important using x86 iGPUs or AMD/Nvidia dGPUs?
 

chucker23n1

macrumors G3
Dec 7, 2014
9,091
12,113
Big phones have bigger batteries.

That, too.

But another big factor is that women's clothing tends to have inadequate pockets, so they use a purse. As a result, half the population is fine using very large phones, since they're not gonna stuff them into their pants anyway.

And do they even use the iGPU as a main GPU, or does it replace an external GPU? Seriously?

Are you asking if Intel integrated a GPU out of boredom?

No, they put it in there because the vast majority of computers don't have a dGPU. (Also, as an anti-competitive measure against nVidia's nForce.)
 

Longplays

Suspended
May 30, 2023
1,308
1,158
That, too.

But another big factor is that women's clothing tends to have inadequate pockets, so they use a purse. As a result, half the population is fine using very large phones, since they're not gonna stuff them into their pants anyway.
Not all women have purses.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
I think SSDs were TOO good when they were introduced; they brought a lot of life back to even a 2010 computer by today's standards. But Intel also stalled out in the early 2010s with their processors. I did not experience as big of a jump as I thought I would going from a mid-range 2010 Mac Pro to the 2019 i9 iMac. It barely moved the needle in saving me time exporting my videos. And that was a 9-year upgrade. I DID however gain the ability to export in HEVC, which was a plus, but the performance was not very much better.

Contrast that to even the base M1 Mac mini at launch, that thing just beat the socks off my 2019 i9 iMac. It was ridiculous how the Mini beat my $5,000 purchase.

I do think people hold on to their computers a bit too long. Why upgrade from a 2012 computer? Why upgrade from a 2011 Office? Security is a big part of it. The other part of this is of course support. What will these people do if their 2012 Mac suddenly dies or needs support, or their 2011 Office experiences a bug, or Office 365 introduces a security feature that requires a more modern Office version (I recall it was even a mess to get Office 365 accounts added to the 2010 version of Office for Windows; you had to configure some advanced settings for it to work)? Apple and Microsoft more than likely won't help now. I make it a rule to NEVER run outdated software or hardware as my primary device.
It's weird because... I was looking up some benchmark numbers earlier, and the benchmark numbers largely show significant improvement in performance on the Intel side until the 14nm issue in the later part of the 2010s. Yet somehow it doesn't feel that way. Somehow, the 45nm C2Qs and their successors hold up better than one thinks they should... (I dug up a 45nm C2Q from the closet a month ago, put a new $35 SSD in it and installed Windows 11, current version of Office, web browsers, etc. If I gave most non-techies that machine plugged into a modern-looking monitor, they would never guess that that motherboard/CPU is 15 years old. They might think it's a little slower than some other things they've used, sure, but they would never think that it is 15 years old. They'd probably think it was just a $300 Worst Buy special. Keep in mind that 15 years is the length of time between the 128K Mac and the first G4.)

Your 2019 i9 iMac, for example, has three times the single-core performance of a typical mid-2010 Mac Pro according to Geekbench, and about twice the multicore performance of my dual-processor 2010 Mac Pro. But somehow, it doesn't feel that way to you, does it? It's like something else in the software/hardware stack is bottlenecking that extra performance for many workloads, and somehow Apple silicon removed that bottleneck.

That being said, as someone who bought a mid-2010 Mac Pro two weeks ago... I will say, what actually shocked me about the Mac Pro is how slow the storage is compared to all the other Macs I've had. The "Verifying..." stage opening up a new app takes FOREVER. And my Mac Pro's original owner splurged big money on the (SATA-300) Apple 512GB SSD. The faster SSDs starting in the retina MBPs make a HUGE difference.

People hold on to their computers because, other than security, no one has given them a good reason not to. It's not like there's a killer feature in Office 2021 that wasn't in Office 2011, most home users use horrific browser-based email not Office 365 email, etc. I was going to say that you could write a job application just as well in Word 5.1a, but that's not true, there were improvements to Word until the early 2000s that are material for that kind of a task. And needs tend to go down - your grandmother who used a G4 iMac to receive photos of the grandkids in 2005, for example, is likely to do that using a smartphone in 2023. People don't really think of computers as something needing "support" (and maintenance), even though that's obviously wrong. And, I think many older non-techies have tech spending fatigue - it's like they're traumatized from buying new computers every 3-4 years at much higher prices between 1990 and 2010, so now they just don't want to do it anymore unless their existing computer is actually broken.

And to be honest, as I've gotten older, I understand it - as you get older, time passes faster, so even though teenage you thought you had a computer forever if you kept it for three years, from the parents' perspective it feels like they opened up their wallet yesterday and yet they're already asked to open their wallet again.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
2023(?) iMac 27" 5nm
If a 2023 iMac 27" 5nm appeared within 4 months I'd buy the base model.

Best I can do is a 5K 27" Studio Display & your choice of a Mac mini or Mac Studio...

Yup - this is something that Steve Jobs was dead wrong about. Or... maybe he was right at the time he said it...

Steve said a lot of things that are contradictory now; but he was first & foremost a salesman, and the things he said should be taken in context, with regard to the time and the product he was pushing at that moment...
 

Longplays

Suspended
May 30, 2023
1,308
1,158
It's weird because... I was looking up some benchmark numbers earlier, and the benchmark numbers largely show significant improvement in performance on the Intel side until the 14nm issue in the later part of the 2010s. Yet somehow it doesn't feel that way. Somehow, the 45nm C2Qs and their successors hold up better than one thinks they should... (I dug up a 45nm C2Q from the closet a month ago, put a new $35 SSD in it and installed Windows 11, current version of Office, web browsers, etc. If I gave most non-techies that machine plugged into a modern-looking monitor, they would never guess that that motherboard/CPU is 15 years old. They might think it's a little slower than some other things they've used, sure, but they would never think that it is 15 years old. They'd probably think it was just a $300 Worst Buy special. Keep in mind that 15 years is the length of time between the 128K Mac and the first G4.)

Your 2019 i9 iMac, for example, has three times the single-core performance of a typical mid-2010 Mac Pro according to Geekbench, and about twice the multicore performance of my dual-processor 2010 Mac Pro. But somehow, it doesn't feel that way to you, does it? It's like something else in the software/hardware stack is bottlenecking that extra performance for many workloads, and somehow Apple silicon removed that bottleneck.

That being said, as someone who bought a mid-2010 Mac Pro two weeks ago... I will say, what actually shocked me about the Mac Pro is how slow the storage is compared to all the other Macs I've had. The "Verifying..." stage opening up a new app takes FOREVER. And my Mac Pro's original owner splurged big money on the (SATA-300) Apple 512GB SSD. The faster SSDs starting in the retina MBPs make a HUGE difference.

People hold on to their computers because, other than security, no one has given them a good reason not to. It's not like there's a killer feature in Office 2021 that wasn't in Office 2011, most home users use horrific browser-based email not Office 365 email, etc. I was going to say that you could write a job application just as well in Word 5.1a, but that's not true, there were improvements to Word until the early 2000s that are material for that kind of a task. And needs tend to go down - your grandmother who used a G4 iMac to receive photos of the grandkids in 2005, for example, is likely to do that using a smartphone in 2023. People don't really think of computers as something needing "support" (and maintenance), even though that's obviously wrong. And, I think many older non-techies have tech spending fatigue - it's like they're traumatized from buying new computers every 3-4 years at much higher prices between 1990 and 2010, so now they just don't want to do it anymore unless their existing computer is actually broken.

And to be honest, as I've gotten older, I understand it - as you get older, time passes faster, so even though teenage you thought you had a computer forever if you kept it for three years, from the parents' perspective it feels like they opened up their wallet yesterday and yet they're already asked to open their wallet again.
Many older persons are on a fixed income. What you personally prioritize matters little to them.

Even if they are single, with no dependents and no terminal illness, a fixed annual income of less than $100k that is not inflation-adjusted becomes a worry for many.

This is how Canadians have saved on average:

[Attached image: table of average savings of Canadians by age]


Source: https://myratecompass.ca/blog/savings/what-is-the-average-savings-of-a-canadian-by-age/
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Best I can do is a 5K 27" Studio Display & your choice of a Mac mini or Mac Studio...
Alas, I can save a bundle by just keeping my 2019 MBP 16" 14nm + an external display ;)

In clamshell mode until 2027. Hopefully by then a larger 2027 iMac with an M5 on 1.4nm (A14) will be out.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
With no Target Display Mode and the appliance nature of Apple silicon, the iMac line-up falls into the "Double Disposable" category of computing appliances; so it is highly doubtful we will ever see anything larger/more powerful than the 4.5K 24" Mn-based iMac...
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Many older persons are on a fixed income. What you personally prioritize matters little to them.

Even if they are single, with no dependents and no terminal illness, a fixed annual income of less than $100k that is not inflation-adjusted becomes a worry for many.

This is how Canadians have saved on average:



Source: https://myratecompass.ca/blog/savings/what-is-the-average-savings-of-a-canadian-by-age/
I don't understand what your point is supposed to be here.

You are the one who was saying that, unless someone is near homeless levels of poverty, they should enthusiastically be buying new computers every 6-8 years, if not sooner. And that someone too poor to do that doesn't warrant any level of attention from Apple or Microsoft - they're "not a customer" so screw them, let them eat malware in their unpatched OS.

Now you are accusing ME of wanting them to spend money on what "I" personally prioritize? You are the one who thought they should be buying new computers, I am the one who thought it was unfortunate that Apple/Microsoft's lifecycle policies were pushing them in the direction of having to buy new computers that they don't feel a need to buy.

Scroll up. I'm not the one who says that seniors should be replacing 45nm C2Qs and Sandy Bridges and late-2013 MacBooks. You are. You are the one who thinks that it's A-OK for Apple/Microsoft to say to a senior "oh, this computer you are happy with, you need to replace it because we won't give/sell you any security updates and the big bad malware will steal all your money and the cybercriminals will kidnap your grandchildren" and they should enthusiastically walk down to the store and buy a $1800 MacBook Air. In fact, you said that anyone unenthusiastic about handing over their credit card to do so must be unable to afford it. That's how obvious the decision to you is - any reasonable person with $1800 in their bank account SHOULD, in your view, unquestioningly hand it over to Apple to get a computer with security updates and a current OS regardless of how functional their current machine is.

If anyone in this discussion is telling seniors to spend money on something technological someone else "personally prioritizes", it is YOU. Not me. You are the one who insulted my friends and family and called them poor, and now you are telling me I want them to spend their money on something I prioritize?????
 

Longplays

Suspended
May 30, 2023
1,308
1,158
I don't understand what your point is supposed to be here.

You are the one who was saying that, unless someone is near homeless levels of poverty, they should enthusiastically be buying new computers every 6-8 years, if not sooner. And that someone too poor to do that doesn't warrant any level of attention from Apple or Microsoft - they're "not a customer" so screw them, let them eat malware in their unpatched OS.

Now you are accusing ME of wanting them to spend money on what "I" personally prioritize? You are the one who thought they should be buying new computers, I am the one who thought it was unfortunate that Apple/Microsoft's lifecycle policies were pushing them in the direction of having to buy new computers that they don't feel a need to buy.

Scroll up. I'm not the one who says that seniors should be replacing 45nm C2Qs and Sandy Bridges and late-2013 MacBooks. You are. You are the one who thinks that it's A-OK for Apple/Microsoft to say to a senior "oh, this computer you are happy with, you need to replace it because we won't give/sell you any security updates and the big bad malware will steal all your money and the cybercriminals will kidnap your grandchildren" and they should enthusiastically walk down to the store and buy a $1800 MacBook Air. In fact, you said that anyone unenthusiastic about handing over their credit card to do so must be unable to afford it. That's how obvious the decision to you is - any reasonable person with $1800 in their bank account SHOULD, in your view, unquestioningly hand it over to Apple to get a computer with security updates and a current OS regardless of how functional their current machine is.

If anyone in this discussion is telling seniors to spend money on something technological someone else "personally prioritizes", it is YOU. Not me. You are the one who insulted my friends and family and called them poor, and now you are telling me I want them to spend their money on something I prioritize?????
I never said everyone should replace at 6-8 years.

I advocate that they replace it after support ends, about a decade in. I've highlighted the upsides of doing that after 10 years.

I pointed out practical options if that wasn't possible. If the options are unusable, then continue using whatever they have. Odds are their circumstances wouldn't change all that much if they got hit by malware.

Your concern for others about unsupported hardware ignores those people's own circumstances and priorities.

No business wants to do business for free beyond its contractual obligations, nor for a shrinking niche. I am sure the millions of people who enjoyed a dozen-plus years of Windows XP support whined about it, but market forces had them upgrading.

I am not insulting anyone. I am pointing to possible reasons why they do not want to spend more, reasons that may not be apparent to you because you have a "horse blinder" love for tech.

Your preoccupation with providing unsolicited help will eventually not make them happy.

For me... it's simple... I change topics, like when my buddy started talking about his new Pentium the day Intel announced it was decommissioning the brand.

Market forces, not a schedule, will force the replacement.
 
Last edited:

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Yes, a lot of business computers up to about the $1,500 or $2,000 range use the integrated graphics from the processor. Five of my work computers used to have only integrated graphics, and they were not on the cheap side.
Agreed, most people just use the iGPU on whatever type of PC they own, and it's more than enough these days (gamers and miners are the biggest exceptions, but video and graphics people still need more). Just one of my Intel desktops has a graphics card, but it's a workstation-level machine. Yes, it's that word again; it's not really what I'd call a workstation, but Lenovo does, and it's beefier and more expandable than a consumer desktop.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
They do use the iGPU as the main GPU. I literally have one in use at an office site right now; it's connected to a monitor and does some processing on the iGPU. It can even play some older games with reasonable FPS.
That's for office use, not GPU-intensive software. Clearly, you have been missing the point from the beginning.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
That's for office use, not GPU-intensive software. Clearly, you have been missing the point from the beginning.
The iGPU is used for some workloads; it's what provides Quick Sync. I use it for some video processing and some Photoshop work.

The conversation was not limited to EXTREME workflows. Some GPU acceleration is possible with integrated graphics, and it is good enough for some professionals. I am not talking about 8K gaming or massive 3D modeling here. That is not what the conversation was about. Someone stated that Intel has had integrated graphics for the past decade, and the reply asked whether that replaces dedicated GPUs. Nothing in the conversation mentioned EXTREME workflows.

Even if you want to go there, there are some iGPUs that are better than a dedicated GTX 1060 graphics card. Just because a card is dedicated doesn't automatically mean it's better. Dedicated graphics cards that don't draw any extra power from the power supply are generally not better than a more modern iGPU.

Not sure where the hate for the iGPU here is coming from.
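
To make the Quick Sync point concrete, here is a minimal sketch (assuming only a stock ffmpeg install; the encoder names are ffmpeg's standard ones, and the file names at the end are hypothetical) of how a transcoding script can prefer whatever hardware H.264 encoder the iGPU or media engine exposes, and only fall back to software x264 when none is available:

```python
import subprocess

def available_encoders():
    """Return the encoder names listed by `ffmpeg -encoders`."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    lines = out.splitlines()
    # Real encoder entries start after the "------" separator in ffmpeg's output.
    start = next((i for i, line in enumerate(lines) if line.strip().startswith("---")), -1)
    return {parts[1] for line in lines[start + 1:] if len(parts := line.split()) >= 2}

def pick_h264_encoder():
    """Prefer a hardware encoder (Quick Sync, VideoToolbox, VA-API); fall back to software x264."""
    encoders = available_encoders()
    for hw in ("h264_qsv", "h264_videotoolbox", "h264_vaapi"):
        if hw in encoders:
            return hw
    return "libx264"

if __name__ == "__main__":
    encoder = pick_h264_encoder()
    print(f"Would encode with: {encoder}")
    # Hypothetical usage with made-up file names:
    #   ffmpeg -i input.mov -c:v <encoder> -b:v 8M output.mp4
```

When one of the hardware encoders is present, the encode runs on the iGPU's media block and the CPU is barely touched, which is exactly the kind of everyday workload where integrated graphics stands in for a dedicated card.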


 
Last edited:
  • Haha
Reactions: sunny5

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
I never said everyone should replace at 6-8 years.

I advocate that they replace it after support ends, about a decade in. I've highlighted the upsides of doing that after 10 years.

I pointed out options if that wasn't possible. If the options are unusable, then continue using whatever they have. Odds are their circumstances wouldn't change all that much if they got hit by malware.

Your concern for others about unsupported hardware ignores those people's own circumstances and priorities.

No business wants to do business for free beyond its contractual obligations, nor for a shrinking niche.

I am not insulting anyone. I am pointing to possible reasons why they do not want to spend more, reasons that may not be apparent to you because you love tech.

Your preoccupation with providing unsolicited help will eventually not make them happy.

For me... it's simple... I change topics, like when my buddy started talking about his new Pentium the day Intel announced it was decommissioning the brand.

Market forces, not a schedule, will force the replacement.
And I'm sorry, but I think it's a bit insulting to suggest that I don't know enough about my friends/family general circumstances to know whether they can afford $1800 or not. In part because, you know, they talk about other purchases. If someone spent the same amount of money on a handbag last week that would have bought a MacBook Air, it tells me that they could have afforded a MacBook Air, but they thought the handbag was more useful to them. Am I supposed to tell them not to buy the handbag because the MacBook they use twice a year is no longer supported and so they need to buy a new one?!

And I'm not sure why you think I am "preoccupied on providing unsolicited help." Do you really think I am calling people every week and suggesting that they go and replace their computers?!? No. But... I do generally try to do some amount of lifecycle planning for the people who rely on me for technology advice, and that includes flagging things that will soon be losing security updates, etc. And so, I notice how tragic it is that hardware that meets their very limited needs perfectly is colliding with these lifecycle issues, and how those collisions are getting worse.

I don't think it's unreasonable, for example, to occasionally mention to my mom that her 2020 Intel MacBook will need to be replaced sooner than her late-2013 did. It certainly does not need to be replaced now, and frankly, I would discourage her from buying a new MacBook today even if she wanted to because it's completely unnecessary. Although that 15" MBA is perfect for her. But I think everybody on this forum agrees that Apple will not support as many macOS versions on the 2020 Intel MacBook Pro as on the late-2013.

And I did warn her about the lower-than-before lifecycle expectation on the 2020 Intel when she got it, but given her late-2013 had experienced hardware failure that was not fixable in peak-covid-conditions, she couldn't wait for an M1 model. Obviously it would have been a lot better if the late-2013 had lived six months longer - an M1 would probably have lasted 2-3 more years than the 2020 Intel.

And for my parents, or my late aunt if she was around, yes, I would probably insist they replace any unsupported machine - fundamentally, I am the primary tech support for them and I don't want to be cleaning up predictable malware on their machine. And if I thought that new hardware had some benefit that would be material to them, I might gently suggest they upgrade, but I haven't seen any new development in Mac or Windowsland meeting that standard in a long time. Even Apple silicon! That is what you seem to also be missing - despite allegedly "loving tech", I actually am unable to identify something other than OS lifecycle (or hardware failure) that might make family/friends/etc want to upgrade some of these lightly used machines.

And sure, I may "love tech", but one thing I love to do is buying quality tech things that last. Not buying tech for the sake of buying tech. I'm too old and have owned and thrown away too many disappointing tech products for that. And that's why I am grumpy when a vendor like Microsoft disrespects my purchase of a quality item with their lifecycle policies and tells me that someone who bought a piece of garbage for $300 one year later is worthy of their newest OS but I'm not.

I don't know why you seem to have some monstrous image of me simply because I don't want people I care about using software with unpatched security bugs... and Apple/Microsoft are making it difficult to do anything about that without those people spending money on new hardware that they don't have any desire for. If you are happy with your existing computer, but you're buying a new one just to get security updates, then honestly, you're going to be about as enthusiastic about your purchase as you are about paying taxes - you seem to think that at least, they'll be wowed by how much better the new machine is after they unbox it, and maybe they will, or maybe that's just you loving tech and they won't even see a difference. Or, what may also happen is that they'll buy a worse machine - in Windowsland, it is not exactly hard to buy a new laptop today that would have a worse screen/keyboard/etc than a decent Windows laptop you bought in 2017. Maybe even a slower processor. Apple does not sell junk so that is not a concern in Macland.

"Market forces" are colliding with support lifecycles. No one even cared about support lifecycles for home/small business 20 years ago because the hardware was obsolete before the OS stopped being patched. One exception - people, I think, are still grumpy about how PowerPC Macs were treated - there were a lot of G5s whose owners did not consider them obsolete in 2009 who sure wish the plug hadn't been pulled on the PPC version of Snow Leopard at the last minute. Most of these people would probably not have been grumpy had PPC support been dropped one release later. But Apple is Apple and Apple will move at a speed that's faster than what the buyers of their high-end desktops might prefer. (Mid-2019 Mac Pro owners, you will re-discover this sooner than you'd like.)

But the problem facing the industry is that this is no longer true. We'll see what happens as Microsoft's October 2025 deadline happens - my guess is that there are going to be a hell of a lot of unpatched Windows 10 systems in active use in 2026. Those are the "market forces" at work. Microsoft already had a horrible time getting people off XP, and i) XP hardware in 2014 was a lot more obsolete than Windows 10 hardware in 2025 will be, and ii) most healthy-and-usable XP machines could be upgraded to 7 with a modest amount of time and a small payment to Microsoft. The problem in 2026 is that there are going to be hundreds of millions of machines running Windows 10 that Microsoft will not take money to upgrade to 11. And no one other than an IT professional or tech enthusiast is going to toss a perfectly functional machine because of the non-availability of security updates. :( At some point, this is going to become a political problem, at least in the EU. Sometimes I wonder if Microsoft is not just playing a political game - see if they get away with this, and if they get too much backlash as the date approaches, well, they can swoop in and say "we've heard you, our engineering teams did some great work, and we can now offer Windows 11 on slightly older systems without some of our newest security tech" when in fact the only work for the engineering team is to remove the CPU check.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
And I'm sorry, but I think it's a bit insulting to suggest that I don't know enough about my friends/family general circumstances to know whether they can afford $1800 or not. In part because, you know, they talk about other purchases. If someone spent the same amount of money on a handbag last week that would have bought a MacBook Air, it tells me that they could have afforded a MacBook Air, but they thought the handbag was more useful to them. Am I supposed to tell them not to buy the handbag because the MacBook they use twice a year is no longer supported and so they need to buy a new one?!

And I'm not sure why you think I am "preoccupied on providing unsolicited help." Do you really think I am calling people every week and suggesting that they go and replace their computers?!? No. But... I do generally try to do some amount of lifecycle planning for the people who rely on me for technology advice, and that includes flagging things that will soon be losing security updates, etc. And so, I notice how tragic it is that hardware that meets their very limited needs perfectly is colliding with these lifecycle issues, and how those collisions are getting worse.

I don't think it's unreasonable, for example, to occasionally mention to my mom that her 2020 Intel MacBook will need to be replaced sooner than her late-2013 did. It certainly does not need to be replaced now, and frankly, I would discourage her from buying a new MacBook today even if she wanted to because it's completely unnecessary. Although that 15" MBA is perfect for her. But I think everybody on this forum agrees that Apple will not support as many macOS versions on the 2020 Intel MacBook Pro as on the late-2013.

And I did warn her about the lower-than-before lifecycle expectation on the 2020 Intel when she got it, but given her late-2013 had experienced hardware failure that was not fixable in peak-covid-conditions, she couldn't wait for an M1 model. Obviously it would have been a lot better if the late-2013 had lived six months longer - an M1 would probably have lasted 2-3 more years than the 2020 Intel.

And for my parents, or my late aunt if she was around, yes, I would probably insist they replace any unsupported machine - fundamentally, I am the primary tech support for them and I don't want to be cleaning up predictable malware on their machine. And if I thought that new hardware had some benefit that would be material to them, I might gently suggest they upgrade, but I haven't seen any new development in Mac or Windowsland meeting that standard in a long time. Even Apple silicon! That is what you seem to also be missing - despite allegedly "loving tech", I actually am unable to identify something other than OS lifecycle (or hardware failure) that might make family/friends/etc want to upgrade some of these lightly used machines.

And sure, I may "love tech", but one thing I love to do is buying quality tech things that last. Not buying tech for the sake of buying tech. I'm too old and have owned and thrown away too many disappointing tech products for that. And that's why I am grumpy when a vendor like Microsoft disrespects my purchase of a quality item with their lifecycle policies and tells me that someone who bought a piece of garbage for $300 one year later is worthy of their newest OS but I'm not.

I don't know why you seem to have some monstrous image of me simply because I don't want people I care about using software with unpatched security bugs... and Apple/Microsoft are making it difficult to do anything about that without those people spending money on new hardware that they don't have any desire for. If you are happy with your existing computer, but you're buying a new one just to get security updates, then honestly, you're going to be about as enthusiastic about your purchase as you are about paying taxes - you seem to think that at least, they'll be wowed by how much better the new machine is after they unbox it, and maybe they will, or maybe that's just you loving tech and they won't even see a difference. Or, what may also happen is that they'll buy a worse machine - in Windowsland, it is not exactly hard to buy a new laptop today that would have a worse screen/keyboard/etc than a decent Windows laptop you bought in 2017. Maybe even a slower processor. Apple does not sell junk so that is not a concern in Macland.

"Market forces" are colliding with support lifecycles. No one even cared about support lifecycles for home/small business 20 years ago because the hardware was obsolete before the OS stopped being patched. One exception - people, I think, are still grumpy about how PowerPC Macs were treated - there were a lot of G5s whose owners did not consider them obsolete in 2009 who sure wish the plug hadn't been pulled on the PPC version of Snow Leopard at the last minute. Most of these people would probably not have been grumpy had PPC support been dropped one release later. But Apple is Apple and Apple will move at a speed that's faster than what the buyers of their high-end desktops might prefer. (Mid-2019 Mac Pro owners, you will re-discover this sooner than you'd like.)

But the problem facing the industry is that this is no longer true. We'll see what happens as Microsoft's October 2025 deadline happens - my guess is that there are going to be a hell of a lot of unpatched Windows 10 systems in active use in 2026. Those are the "market forces" at work. Microsoft already had a horrible time getting people off XP, and i) XP hardware in 2014 was a lot more obsolete than Windows 10 hardware in 2025 will be, and ii) most healthy-and-usable XP machines could be upgraded to 7 with a modest amount of time and a small payment to Microsoft. The problem in 2026 is that there are going to be hundreds of millions of machines running Windows 10 that Microsoft will not take money to upgrade to 11. And no one other than an IT professional or tech enthusiast is going to toss a perfectly functional machine because of the non-availability of security updates. :( At some point, this is going to become a political problem, at least in the EU. Sometimes I wonder if Microsoft is not just playing a political game - see if they get away with this, and if they get too much backlash as the date approaches, well, they can swoop in and say "we've heard you, our engineering teams did some great work, and we can now offer Windows 11 on slightly older systems without some of our newest security tech" when in fact the only work for the engineering team is to remove the CPU check.
If you really want to help, gift them a new device, use Migration Assistant to move their data, and remove the decommissioned device from their reach; they will gladly use the updated machine.

All your worry for people who do not care would evaporate.
 
Last edited:

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
The iGPU is used for some workloads; it's what provides Quick Sync. I use it for some video processing and some Photoshop work.

The conversation was not limited to EXTREME workflows. Some GPU acceleration is possible with integrated graphics, and it is good enough for some professionals. I am not talking about 8K gaming or massive 3D modeling here. That is not what the conversation was about. Someone stated that Intel has had integrated graphics for the past decade, and the reply asked whether that replaces dedicated GPUs. Nothing in the conversation mentioned EXTREME workflows.

Even if you want to go there, there are some iGPUs that are better than a dedicated GTX 1060 graphics card. Just because a card is dedicated doesn't automatically mean it's better. Dedicated graphics cards that don't draw any extra power from the power supply are generally not better than a more modern iGPU.

Not sure where the hate for the iGPU here is coming from.


Doesn't change the fact that we ARE talking about high-end, workstation-grade GPUs. I didn't say I hate it; you are the one who missed the point.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
That's for office use, not GPU-intensive software. Clearly, you have been missing the point from the beginning.
"Office uses" would have been "GPU intensive software" 25 years ago. I remember how you needed to pay attention to what graphics your machine had just to get, say, 1024x768 at 16.8 million colours. There were lots of machines that couldn't even do 16.8 million colours at 640x480!

Then, magically, something happened around 1998 and every machine had enough graphics power to do 2D at any resolution up to about 1920x1200 with 16.8 million colours. Moore's Law!
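
A quick back-of-the-envelope sketch shows why plain 2D desktop graphics "solved itself": the framebuffer is just width x height x bytes per pixel, so once a few megabytes of video memory became ordinary, any common resolution in 16.8 million colours was covered (assuming a simple packed, single-buffered framebuffer; 24-bit colour is often stored as 32 bits per pixel in practice, so real cards need a bit more):

```python
# Framebuffer size for a plain 2D desktop: width x height x bytes per pixel.
# Assumes a packed, single-buffered framebuffer; cards padding 24-bit colour
# to 32 bits per pixel need roughly a third more.
MODES = [
    ("640x480",   640,  480),
    ("1024x768",  1024, 768),
    ("1152x870",  1152, 870),    # the classic 21" two-page Mac display
    ("1920x1200", 1920, 1200),
]

for name, width, height in MODES:
    for depth_bits in (8, 16, 24):           # 256 colours, thousands, 16.8 million
        mib = width * height * (depth_bits / 8) / (1024 * 1024)
        print(f"{name} @ {depth_bits}-bit: {mib:.2f} MiB")
```

640x480 in 24-bit colour needs under 1 MiB, and even 1920x1200 needs under 7 MiB, which is why 2D at any sane resolution stopped being a hardware question once video memory reached a handful of megabytes.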

And over time, integrated graphics have gotten better. In 1992, if you wanted to lay out a print magazine on your Mac, you probably needed/wanted a fancy NuBus video card. In 2020, I suspect you could plug an external monitor into your 13" Intel MacBook Pro and run QuarkXPress/InDesign/Photoshop just fine on your Intel integrated graphics and do your print magazine layout. And actually... the graphics designer's 21" monitor in 1992 was 1152x870, a 13" MacBook Pro is not far from that resolution, so maybe you don't even really "need" the external monitor.

Put another way - in 2000, the first iteration of Intel onboard graphics could drive a screen at a resolution and colour depth that would have required a pricy NuBus card on a Mac less than a decade earlier.

What you are failing to appreciate is that what counts as "GPU intensive software" is largely shrinking over time. And as a result, that is affecting the market structure for GPUs, pushing discrete GPUs to the higher end only and destroying the market for lower-end discrete GPUs. And it is affecting engineering/architectural decisions for Apple Silicon - if their architecture results in better GPU performance for 99.8% of customers, but 0.2% of people are doing heavy GPU-intensive loads that don't work well on the Apple Silicon GPU architecture, well, too bad, they're not going to spend huge amounts in R&D to accommodate a completely different GPU architecture for those people.

And others will probably say that it's less than 0.2%.

Doesn't help that Apple hasn't exactly ever been cultivating a gamer or miner user base, so you don't have those folks clamoring for an RTX 4090-style GPU in a Mac. All you have are a number of professional uses.
 
  • Like
  • Haha
Reactions: sunny5 and Ethosik

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Doesn't change the fact that we ARE talking about high-end, workstation-grade GPUs. I didn't say I hate it; you are the one who missed the point.
I didn't miss the point. It was a generalized statement asking if iGPUs are replacing dedicated GPUs, NOT whether they're used for 3D modeling or 8K gaming. I explained where they replaced dedicated GPUs in some scenarios. Not sure why this conversation needs to be this difficult. You asked if they replaced dGPUs, and I explained some scenarios where they did. Let's just leave it at that, please.
 
  • Haha
Reactions: sunny5

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
"Office uses" would have been "GPU intensive software" 25 years ago. I remember how you needed to pay attention to what graphics your machine had just to get, say, 1024x768 at 16.8 million colours. There were lots of machines that couldn't even do 16.8 million colours at 640x480!

Then, magically, something happened around 1998 and every machine had enough graphics power to do 2D at any resolution up to about 1920x1200 with 16.8 million colours. Moore's Law!

And over time, integrated graphics have gotten better. In 1992, if you wanted to lay out a print magazine on your Mac, you probably needed/wanted a fancy NuBus video card. In 2020, I suspect you could plug an external monitor into your 13" Intel MacBook Pro and run QuarkXPress/InDesign/Photoshop just fine on your Intel integrated graphics and do your print magazine layout. And actually... the graphics designer's 21" monitor in 1992 was 1152x870, a 13" MacBook Pro is not far from that resolution, so maybe you don't even really "need" the external monitor.

Put another way - in 2000, the first iteration of Intel onboard graphics could drive a screen at a resolution and colour depth that would have required a pricy NuBus card on a Mac less than a decade earlier.

What you are failing to appreciate is that what counts as "GPU intensive software" is largely shrinking over time. And as a result, that is affecting the market structure for GPUs, pushing discrete GPUs to the higher end only and destroying the market for lower-end discrete GPUs. And it is affecting engineering/architectural decisions for Apple Silicon - if their architecture results in better GPU performance for 99.8% of customers, but 0.2% of people are doing heavy GPU-intensive loads that don't work well on the Apple Silicon GPU architecture, well, too bad, they're not going to spend huge amounts in R&D to accommodate a completely different GPU architecture for those people.

And others will probably say that it's less than 0.2%.

Doesn't help that Apple hasn't exactly ever been cultivating a gamer or miner user base, so you don't have those folks clamoring for an RTX 4090-style GPU in a Mac. All you have are a number of professional uses.
So what? Is that how you treat professional users? Give me a break.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
I didn't miss the point. It was a generalized statement asking if iGPUs are replacing dedicated GPUs, NOT whether they're used for 3D modeling or 8K gaming. I explained where they replaced dedicated GPUs in some scenarios. Not sure why this conversation needs to be this difficult. You asked if they replaced dGPUs, and I explained some scenarios where they did. Let's just leave it at that, please.
Yes, you did. That "NOT if" already proves you are wrong. Clearly, bringing an iGPU into a Mac Pro-grade computer already misses the point.
 