
Disappointed with Mac Pro 2023?



bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I've been selling my iPhone after 2 years... no e-waste here!
It eventually ends up there, and more devices means more e-waste, even if it's not your personal e-waste anymore. And yes, I'm responsible for a heck of a lot of e-waste over my career. I don't even want to think about it to tell you the truth.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
It eventually ends up there, and more devices means more e-waste, even if it's not your personal e-waste anymore. And yes, I'm responsible for a heck of a lot of e-waste over my career. I don't even want to think about it to tell you the truth.
It tends to get recycled as parts in the end.

There's money in trash.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
I look forward to the day when x86 will just be legacy. Like mainframes.
I know you do. :). I'm a little curious about just why, but only a little curious.

But I hope to be dead and gone before that happens; then it wouldn't be my budget or my employer's very livelihood at risk due to the costs associated with such a thing.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
I know you do. :). I'm a little curious about just why, but only a little curious.

But I hope to be dead and gone before that happens; then it wouldn't be my budget or my employer's very livelihood at risk due to the costs associated with such a thing.
Nothing malicious against anyone who makes a living on x86.

I just want to see all of those gamers getting a Blackberry, Palm, Motorola and Nokia moment.

TBH consumers will easily switch systems.

I want the cost of parts to shoot up to cover worsening economies of scale.

SoC gobbling up dGPU margins.

When Intel was forced to move to Intel 7 after spending 2014-2020 stuck at 14nm... it was vindication that a monopoly was forced to spend on fab improvements.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Random observation: the Xeons in my not-yet-shiny new-to-me Mac Pro 2010 have a Geekbench 6 score of roughly 500 single-core, 3800 multi-core.

My 12 inch MacBook that everyone thinks is a poor performer is 950 single-core, 1750 multi-core.

My 2020 iMac has 1500 single-core, 6000 multi-core.

The 2019 Mac Pro with a W-3245 has 1400 single-core, 11100 multi-core.

An M2 Pro Mac Mini is at 2700 single core, 14500 multi-core. That's comparable to an i7-13700 Intel running Windows.

M2 Ultra is 2700 single core, 21000 multi-core.

That's fundamentally the problem. 2019's high-end workstation has the same single core performance as 2020's lowest-end 27" iMac. And about half the single core performance of the M2 chips.

Oh, and do you want to look up Intel's brand new Sapphire Rapids workstation processors? The w9-3495X (which costs $6000 for the processor alone) seems to get 2380/18000, i.e. less than the M2 Ultra. I tried to look up what the higher-clock-rate, fewer-core versions get, but there doesn't seem to be much data.

I feel like the people disappointed with the 2023 Mac Pro want a workstation benchmarking, say, 3000-3500 single-core and/or 30000 multi-core. How many people are willing to pay how much money to fund the R&D to make such a chip? And who is going to make it?

(Note - I am not endorsing Geekbench as necessarily the greatest benchmark, but I think these numbers make the point.)
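If it helps to see those gaps as ratios rather than raw scores, here's a quick Python sketch that just divides the figures quoted above by the 2019 Mac Pro's numbers (the scores are the approximate ones from this post, not fresh benchmark runs):

```python
# Approximate Geekbench 6 (single-core, multi-core) scores quoted in this post.
scores = {
    "Mac Pro 2010 (Xeon)":   (500, 3800),
    '12" MacBook':           (950, 1750),
    "iMac 2020":             (1500, 6000),
    "Mac Pro 2019 (W-3245)": (1400, 11100),
    "M2 Pro Mac mini":       (2700, 14500),
    "M2 Ultra":              (2700, 21000),
    "Xeon w9-3495X":         (2380, 18000),
}

# Compare everything against the 2019 Mac Pro, the machine this thread keeps coming back to.
base_single, base_multi = scores["Mac Pro 2019 (W-3245)"]
for name, (single, multi) in scores.items():
    print(f"{name:24s} single-core {single / base_single:4.1f}x, "
          f"multi-core {multi / base_multi:4.1f}x")
```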
 

Longplays

Suspended
May 30, 2023
1,308
1,158
People get emotionally attached to what they got used to.

There are old timers who still insist that the mainframes used at banks are more powerful than an iPad.

Macs used to be upgradeable. Now they aren't.

The Mac became a giant iPhone... it performs better than any Power Mac or Mac Pro from a decade or more ago, but people cannot wrap their heads around the fact that, quantifiably speaking, an Apple Watch has more raw performance than a G5.

The M2 Extreme probably did not yield that well, hence the shelving in favour of an M3 Extreme by Q1 2025.

The iPhone released 3 months from now will have more raw performance than this 11-year-old iMac 27" 2.5K Core i7.

I think this Mac Pro M2 Ultra is the bee's knees. It will make anyone with the most popular use case very happy for at least a decade.

I look forward to it being updated in step with the Mac Studio. Y'all deserve Apple Silicon tech.
 
  • Like
Reactions: AlphaCentauri

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
When Intel was forced to move to Intel 7 after spending 2014-2020 stuck at 14nm... it was vindication that a monopoly was forced to spend on fab improvements.
I don't think Intel spent 2014-2020 at 14nm willingly or out of some monopolist strategy. They clearly had major technical problems getting the 10nm process going. Maybe they could have spent more money to overcome those technical problems faster, but that would likely require a time machine.

They were moving to new processes every 2-3 years, regardless of competitive pressures, for many, many years. It's one of the things that built the x86/x64 juggernaut. And failing to continue that march after 2014 has led to an existential threat - I don't think you would have Ampere on the datacenter side, Apple Silicon costing them their best customer for higher-priced laptop chips on the consumer side, or AMD leveraging TSMC and the smartphone economy to beat them in gaming and other spheres, if 10nm/7nm (real 7nm, not "Intel 7") had arrived at Intel's normal pace.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
There are old timers who still insist that the mainframes used at banks are more powerful than an iPad.
Has anybody benchmarked IBM's latest zArchitecture processor (the "Telum" - https://en.wikipedia.org/wiki/IBM_Telum_(microprocessor) ) against an iPad?

The bank may be running a somewhat older generation of IBM mainframes, but I'd like to think a z16 with 200 cores of this Telum thing should be able to outperform an iPad :) ARM is not that amazing!

Now, if you want to talk performance per dollar, I'm sure the iPad wins by a mile...
 
  • Haha
Reactions: Longplays

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Why does Windows identify as "Unknown"?
One thing working in science has taught me is that data is usually unreliable garbage.

You can almost never measure the thing you want to measure directly. You can't measure operating system market shares, but you measure something related (such as user-agent strings from browsers). Then you hope that you can interpret your measurements correctly and that your measurements are not too biased to make the conclusions unreliable.

But then something changes, which is rather common in the software world. Maybe a popular browser starts returning user-agent strings your code can't parse, at least in some situations. This goes on for a while, until you start seeing a spike in measurements you can't interpret. You investigate what's going on and eventually fix your methodology. But you don't always bother correcting the interpretations of old data. Maybe you are lazy, maybe you don't care about data quality, or maybe fixing it would not be worth the effort. You just continue reporting the old measurements you know to be incorrect.
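To make the user-agent example concrete, here's a minimal Python sketch of that kind of bucketing. The patterns and OS categories are illustrative assumptions, not any real analytics pipeline; real user-agent strings are far messier, which is exactly how an "Unknown" bucket ends up growing:

```python
import re
from collections import Counter

# Illustrative OS buckets; real pipelines use far more elaborate rules,
# and the order of the patterns already bakes in interpretation choices.
OS_PATTERNS = [
    ("Windows", re.compile(r"Windows NT")),
    ("iOS",     re.compile(r"iPhone|iPad")),
    ("Android", re.compile(r"Android")),
    ("macOS",   re.compile(r"Macintosh.*Mac OS X")),
    ("Linux",   re.compile(r"Linux")),
]

def classify(user_agent: str) -> str:
    for os_name, pattern in OS_PATTERNS:
        if pattern.search(user_agent):
            return os_name
    # Anything the rules don't recognise lands here -- the bucket that
    # shows up as "Unknown" in published market-share numbers.
    return "Unknown"

def market_share(user_agents):
    counts = Counter(classify(ua) for ua in user_agents)
    total = sum(counts.values())
    return {name: count / total for name, count in counts.items()}
```

Run over a log of user-agent strings, market_share returns a fraction per OS bucket, with anything the patterns fail to parse counted as "Unknown".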
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
That's a good point, and where I'm at. We just don't have the $'s necessary to get an internet connection that could handle all our workers using VDI and basically dumb terminals, and the internet just isn't reliable enough anyway -- my IT spidey sense is yelling in alarm at the thought. Due to what we make, we have to have a lot of water, so by a river in an unpopulated region = poor internet = no VDI unless it's locally served, and that is actually more expensive than consumer PC hardware.
There are a number of MSPs who offer "private cloud" solutions where a VDI setup is running in their data center and you can connect your office to it over a private circuit from your favourite carrier if you (rightly) don't trust using the public Internet...

I'm sure the covid/post-covid world, hybrid work, etc also encourages VDI-type solutions. Very flexible for remote working.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Smaller players than Intel managed to get to die shrinks nearly on schedule.

I really do see Intel staying stuck as a cost-cutting measure, considering they had all the PC OEMs as customers.

Apple started using their chips in 2006 and Intel stayed stuck on 14nm nearly a decade later.

Then in May 2020, the mid-2020 MBP 13" received a 10nm Intel chip, just before WWDC 2020.

As the kids say... that's "sus".

The world's a better place for it. Apple's supply chain masters figured it would be a net gain to have their own chips even when it made the Mac Pro like a giant iPhone with PCIe slots.

What baffled me was Apple's decision not to get into the supercomputer, data center and cloud computing business, considering performance per watt is very important in that field. It would be a way to sell Ultra and Extreme chips, since the desktop workstation market isn't growing at the same pace as laptops.

That was the key reason the ex-Apple engineers behind NUVIA (now part of Qualcomm) left: they wanted to work on workstation/server chips.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
It eventually ends up there, and more devices means more e-waste, even if it's not your personal e-waste anymore. And yes, I'm responsible for a heck of a lot of e-waste over my career. I don't even want to think about it to tell you the truth.
This is now completely off-topic, but I've gotten increasingly interested in vintage computing. Both Macs and more recently on the Windows side. So I've been reading quite a bit about what people in both communities are finding desirable.

It's very interesting when you think about the number of machines/parts/etc that my friends and I e-wasted a decade or a decade and a half ago, and now, today, at least some of those appear to be in real demand. I had a friend with a last-generation titanium G4 and an LGA775 i865 AGP motherboard! And I am pretty sure he e-wasted both; not sure what condition the G4 was in when he replaced it with a unibody MacBook. I e-wasted a Dell desktop I was actually quite emotionally attached to because, well, there was no use for a middling XP PIII box in 2011 or 2012, yet, looking at it from the mindset of the vintage PC community, it turns out that it would have made an outstanding Win98 retro box for anyone masochistic enough to want Win98. (I ran Win98 on it the first six months I had it, and well, I still have the trauma 20 years later.) Hell, I suspect my family's Mac SE that we had trouble getting rid of for $40 CAD in 1996 would be worth a lot more today, especially when its biggest weakness (no internal hard drive) is, if anything, a positive in 2023.

But the problem is, unless you have a large house in the country or something, who is going to store something that seems completely obsolete for two decades on the hope that, two decades later, some nostalgic person will want it? And the time when you want to get rid of something is the time when it's too new for other people to attach any nostalgia/vintage value to it, and too old for anyone to want it for its original purpose. So off to e-waste most stuff goes.

Then there are some things that are insanity, e.g. I am seeing people offer to sell crappy Windows boxes from the mid-late 1990s for real money. If you want a mid-1990s retro box, why wouldn't you get a good one rather than the crappy one that maxed out your parents' budget?!? Is anyone really so emotionally attached to the Windows computer they had in 1995 that they want to buy it again in 2023? (I can understand wanting to run the software you used in 1995... but why not run it on the hardware you couldn't afford in 1995?)

(Also, vintage Windows is very different from vintage Mac. The most desirable parts in vintage Windowsland seem to be determined very differently...)

Anyways, back to pointing out how smartphone processors are going to cause Mac Pros to be e-wasted...
 

Longplays

Suspended
May 30, 2023
1,308
1,158
People are buying their youth back. Some even substitute the PATA drive with a CF card acting as a PATA SSD.

Very weird to turn on a 486 DX2 and not hear the clicking of a PATA drive. It just zooms into Win95 because it's a CF card. Somewhat similar to replacing a 2011 MBP 13" HDD with an SSD: just surreal how few bounces are needed to start an app.

About a decade ago Nintendo started offering mini consoles with 720p HDMI output for the NES & SNES. Sony, Sega and TurboGrafx joined in the mix.

Nice novelties, but essentially e-waste if they end up being shelf porn.

Then came FPGAs whose hardware is programmed to behave like the actual console hardware, so there's zero emulation when inserting a cart.

Ultimately, another novelty contributing to e-waste.

If I could redo some things, I'd have made it a habit to sell off hardware once I replaced it. It takes up space and I do not want to be misidentified as a hoarder.

I also regret upgrading my 2011 MBP 13" (32nm) & 2012 iMac 27" (22nm) to a 2017 Dell 15" (14nm), 2018 MBA (14nm) and 2016 MBP 16" (14nm) when I could have held out for a 2020 MBA M1 (5nm) and 2021 MBP 16" (5nm). Imagine jumping into a decade's worth of advances.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Two things:
1) Look at Intel's history starting in 2006. The Conroe Core 2 Duo basically destroyed everything - AMD, PowerPC, etc. If you compare, say, the C2D E6600 and the i7-6700, Intel quadrupled single-core performance and got 8x the multicore performance. They went from 65nm to 14nm in those, oh, ten or so years.

The reason they had all the PC OEMs as customers, and that AMD was basically left out of every space in which the Opterons and Athlon 64s of 2002-2005 had managed to make inroads, is that their chips were improving consistently.

And I would further add that this is the roadmap that sold Steve Jobs on Intel - at least quadrupling performance per watt over a decade is significant and certainly not something Freescale or IBM could do. And that's taking Conroe as a starting point - the performance per watt there was dramatically better than anything else.

The i7-6700 came out 8 years ago, and it is not clear to me that Intel's performance-per-watt, at least, has improved at all since then. Raw performance, sure, but that's running monster liquid-cooled setups with 240W TDPs in gaming rigs. If Intel had maintained the same pace as they did from Conroe to Skylake, from 65nm to 14nm, they'd be at chips today that would probably be 2.5X the performance of Apple silicon (see the back-of-the-envelope sketch after this post).

I honestly do think they just screwed up, and at the worst possible time when TSMC, fuelled by smartphone money, was able to just keep powering forward. It's worth noting that this stuff is hard and expensive - you only have really... three... semiconductor companies left playing at the leading edge. Compare that to the number of people who had fabs in, say, the mid-late 1990s.

2) I actually think Apple is right not to get into data center/cloud/etc. Why? Because the market is shrinking to a few big customers - Amazon who is working on their own ARM chips, Azure, etc., all of which are using at least semi-custom hardware. They're not going to abandon that for some off-the-shelf thing from Apple of all places.

I think there's plenty of opportunity to use the 'smartphone economy' and the same principles that drove Apple silicon in the data center/cloud sphere, but that product will sell dramatically better if i) it's sold as a chip/motherboard reference design that can be custom integrated into what Amazon/Azure/etc want to build, and ii) it doesn't have the Apple brand. Oh, and perhaps iii) I suspect Ampere's investors expect lower margins than Apple's :)
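On the "if Intel had maintained the same pace" point in 1), here's the back-of-the-envelope sketch referenced above. It only uses the rough figures quoted in this post (about 4x single-core from the 2006 E6600 to the 2015 i7-6700) and compounds that rate forward; it's an illustration of the reasoning, not data about any real chip:

```python
# Rough figures from this post: ~4x single-core gain from the C2D E6600 (2006)
# to the i7-6700 (2015). What would keeping that pace imply by 2023?
years = 2015 - 2006
single_core_gain = 4.0

yearly_rate = single_core_gain ** (1 / years)        # roughly 1.17x per year
print(f"Implied yearly single-core improvement: {yearly_rate - 1:.0%}")

hypothetical_2023 = yearly_rate ** (2023 - 2015)     # compound 8 more years
print(f"Hypothetical 2023 chip vs. i7-6700: {hypothetical_2023:.1f}x")
```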
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Hey, I know the mini-consoles. I own... pretty much all of them, or at least all the Nintendo ones, the Sega ones, and the Sony. I would be surprised if I have spent more than 1-2 hours playing any one of them. In my case, it's somewhat payback for my parents' strict no-consoles-in-the-house rule. But I will note that the Nintendo ones, at least, were super-popular even among very ordinary adults with zero interest in video games otherwise. I probably helped like 8-10 people source the classic Super Nintendo. Nostalgia sells.

I have way too much hardware, certainly enough to be identified as a hoarder sadly. I actually want to get rid of quite a lot of it, but it's so labour-intensive to try and sell off, oh, a low-end 32" TV, a 6X DVD-ROM from an MDD G4, a Thunderbolt to Ethernet adapter, a GPU that would have sold for 3X as much money had I tried to sell it when I stopped using it, etc. And yet so depressing to e-waste it.

One thing the world is missing is a good marketplace for these things. And I understand why there is no such marketplace - the real estate and staffing required to run it are not practical. So... yeah.
 

canadianreader

macrumors 65816
Sep 24, 2014
1,204
3,280
I doubt this will have any relevance in practice. The entire Ultra chip uses less power at full load than some newer x86 mobile CPUs after all. The Studio won't have any problems with dissipating the 100-150W of power needed to operate the M2 Ultra.

There were some rumours that the Ultra in the Mac Pro might be significantly overclocked but judging by Apple product page the performance level is the same as a horizontally scaled M2 Max.

What's the point of saving power and the environment when you have to buy a new machine every time you need more RAM or a faster CPU/GPU?
 
  • Like
Reactions: AlumaMac

JouniS

macrumors 6502a
Nov 22, 2020
638
399
What baffled me was Apple's decision not to get into the supercomputer, data center and cloud computing business, considering performance per watt is very important in that field. It would be a way to sell Ultra and Extreme chips, since the desktop workstation market isn't growing at the same pace as laptops.
Performance per watt is important, but so is configurability. Tight coupling between CPU performance, GPU performance, and the amount of RAM is not cost-effective when user requirements vary widely.

If you look at what kind of hardware AWS is offering today, you will mostly find Xeon-based systems with 64 cores and 256/512/1024 GB RAM, EPYC-based systems with 96 cores and 384/768/1536 GB RAM, and Graviton-based systems with 64 cores and 128/256/512 GB RAM. Those instances don't have any GPUs, because most workloads don't need them. If you want an instance with a GPU, the underlying hardware usually has 8 of them, because most workloads that need a GPU need a lot of GPU performance.

Then there is a lot of weird stuff, such as systems with obscure accelerators and high-memory instances with 6/12/18/24 TB RAM.
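As a rough illustration of that configurability point, here's a small Python sketch computing the memory-per-core ratios implied by the instance shapes mentioned above (ballpark figures from this post, not an authoritative AWS catalogue). Each CPU family ships at several RAM ratios, which is hard to mirror with a fixed SoC memory configuration:

```python
# Core counts and RAM options as quoted above (rough figures, not an AWS price list).
offerings = {
    "Xeon-based":     (64, [256, 512, 1024]),
    "EPYC-based":     (96, [384, 768, 1536]),
    "Graviton-based": (64, [128, 256, 512]),
}

for family, (cores, ram_options) in offerings.items():
    ratios = ", ".join(f"{gb / cores:.0f} GB/core" for gb in ram_options)
    print(f"{family:15s} {cores} cores -> {ratios}")
```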
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
The other thing that would be interesting is to measure the gap between family cars vs sports cars.

When that 1984 Corvette was 7.9 seconds, what was a mainstream family car getting?

And what about in 2023?

I haven't done the research (and don't really want to), but I wonder whether the gap has remained consistent or if it has shrunk.
Depends on the model, but the fastest current stock Corvette is the 2023 Z06 (2.6 s). So 7.9/2.6 => 3.0 x faster since 1984.

For passenger cars, I chose the Accord 4-door sedan, since it's stayed consistently within the "family car" market segment. To be consistent, we should also choose the fastest current Accord, which is the Accord Sport 2.0T (5.4 s). So 11.4/5.4 => 2.1 x faster.

Thus based on this very limited look, the gap between sports car and family sedan 0-60 times has actually widened.

Now that 7.9 s time you quoted for the '84 vette seems a bit slow. But even if I reduce it to 7.0 s, I still get a 7.0/2.6 = 2.7-fold improvement for the vette.
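Written out as a quick sketch (the times are the ones quoted in this post, including the 11.4 s figure used above for the '84 Accord):

```python
# 0-60 mph times in seconds, as quoted in this post.
corvette_1984, corvette_2023_z06 = 7.9, 2.6
accord_1984, accord_2023_sport = 11.4, 5.4

print(f"Corvette: {corvette_1984 / corvette_2023_z06:.1f}x quicker")   # ~3.0x
print(f"Accord:   {accord_1984 / accord_2023_sport:.1f}x quicker")     # ~2.1x
# Even granting the '84 Corvette a quicker 7.0 s time:
print(f"Corvette (7.0 s): {7.0 / corvette_2023_z06:.1f}x quicker")     # ~2.7x
```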
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
What baffled me was Apple's decision not to get into the supercomputer, data center and cloud computing business, considering performance per watt is very important in that field. It would be a way to sell Ultra and Extreme chips, since the desktop workstation market isn't growing at the same pace as laptops.
The key advantage Apple has in the PC market is the relatively high efficiency of its chips. But that advantage goes away in the server market, where efficient ARM-based chips are already being used, like AWS's Graviton series:

"Amazon has thus chosen to use TSMC’s cutting edge 5 nm process to reduce power consumption. TSMC’s 7 nm process already did wonders for low power designs, and 5 nm would take this even further. While Graviton 3 is a beefier core than N1, it’s nowhere near as ambitious as Intel’s Golden Cove, and should still be considered a moderate design. Such a core running at 2.6 GHz on 5 nm should absolutely sip power. That in turn lets AWS pack three of these chips into a single node, increasing compute density. The final result is a chip that lets AWS sell each Graviton 3 core at a lower price, while still delivering a significant performance boost over their previous Graviton 2 chip."


That's not to say a hypothetical AS server chip wouldn't offer advantages over Graviton (for instance, I think Graviton instances typically use discrete NVIDIA GPUs, which are probably much less efficient than Apple's on-die GPUs), but just that the market is already competitive and challenging. Further, server loads are very different from desktop loads, so creating an optimized server chip would require a significant redesign - and, as others have said on this topic, this would divert Apple's attention away from its core market.
 
Last edited:

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
With emphasis I was pointing out where the SoC R&D money is coming from. Macs would not have the revenue to self support their own SoC R&D with how unpopular they are relative to AMD/Intel.
That response completely avoids acknowledging that you used non-representative numbers.
I think someone else answered your analogy succinctly.
Not in any way that invalidated it. Indeed, the info. in the response supported my analogy.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
There are a number of MSPs who offer "private cloud" solutions where a VDI setup is running in their data center and you can connect your office to it over a private circuit from your favourite carrier if you (rightly) don't trust using the public Internet...
Absolutely aware of them, and their cost. First off, you have to have that private circuit, and our plant is in the middle of a swamp; I drive 45 minutes (37 miles) to get there from home. That costs high dollars. It's just not doable at all. It's way more than what we spend now, even if we include our current servers. Even if it's not a private circuit, just a VPN, it's still way too costly.
I'm sure the covid/post-covid world, hybrid work, etc also encourages VDI-type solutions. Very flexible for remote working.
Remote work?? Not a thing around here. I get a kick out of all the workers complaining about going back to work; I just can't have any sympathy for them. For the whole pandemic, our office workers, bosses, and I were sitting at our desks, just like every other year. And then there's the plant floor -- they CAN'T work remotely, not at all, and office workers want to lounge on their couches while the people that actually make things have to drive to work like always. As a somewhat-manager, I just can't abide that difference.

Forgive my rant and off-topicness; just a sign of the times.

The people that may need access off hours use a VPN, but only a couple of people can be termed remote workers, and they're definitely special cases, non-covid-related (their jobs have not changed for a few years). We don't use VDI at all. I use a VPN to get into my LAN and use whatever I need, and users that do occasionally work remotely remote into their existing PCs over that VPN. It actually works pretty well!
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
This is now completely off-topic, but I've gotten increasingly interested in vintage computing. Both Macs and more recently on the Windows side. So I've been reading quite a bit about what people in both communities are finding desirable.
Sorry, and I'm always interested in that kind of thing too.

It's very interesting when you think about the number of machines/parts/etc that my friends and I e-wasted a decade or a decade and a half ago, and now, today, at least some of those appear to be in real demand. I had a friend with a last-generation titanium G4 and an LGA775 i865 AGP motherboard!
And some are buying it for actual usage, like me; I'm always rooting around on eBay to find a part. We have plenty of old tech in testers and in the computers that control them that I need to keep running. It's cheaper that way. And I have a lot of stuff there I could sell to someone who needs it. I swear I have an 80186-based PC at work on the shelf, and I've never actually seen one in use (it predates my working there). My first x86 chip was an 80286. I have an original Pentium running as the control for very important machinery. That actually is in the budget to replace, but it's a BIG multi-year project.

Then there are some things that are insanity, e.g. I am seeing people offer to sell crappy Windows boxes from the mid-late 1990s for real money. If you want a mid-1990s retro box, why wouldn't you get a good one rather than the crappy one that maxed out your parents' budget?!? Is anyone really so emotionally attached to the Windows computer they had in 1995 that they want to buy it again in 2023? (I can understand wanting to run the software you used in 1995... but why not run it on the hardware you couldn't afford in 1995?)
If their goal is to run certain software, they may need the old, cheap hardware just to get it to work; there are certain processor/bus differences that can upset timing-sensitive software. Even dropping in an SSD with the appropriate converters can make it not work if something runs faster than it should! Win98 has a CPU race condition that made it unable to run easily on decent hardware for a long time.
(Also, vintage Windows is very different from vintage Mac. The most desirable parts in vintage Windowsland seem to be determined very differently...)
Just what's available and not. Windows has a much richer and more varied history.

Anyways, back to pointing out how smartphone processors are going to cause Mac Pros to be e-wasted...
I don't think so, but it maybe does affect how fast that happens. :)
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Remote work?? Not a thing around here. I get a kick out of all the workers complaining about going back to work; I just can't have any sympathy for them.
If you're in the middle of a swamp, it's probably different than the big city. The real big city, i.e. metro areas with north of 3-5 million people and skylines full of buildings well north of 20 stories. Especially the big city where a lot of businesses in the past decade decided to set up shop downtown because 'that's what the millennials want'. Including, interestingly, tech companies - the Microsoft Canadas and the Apple Canadas all used to have suburban office parks and they all moved to big tall very downtown towers a few years before the pandemic. Well, guess what, moving a lot of people in and out of a tiny downtown area full of tall towers is not very efficient.

In the big city, there's something absurdly stupid about spending 2+ hours/day, either spending huge amounts of money on parking and being stuck in traffic moving at <10km/h or being stuck on slow moving, unreliable trains/subways/buses, commuting to do a job that you can do just as well (or, frankly, potentially better) from anywhere else.

Frankly, we need more remote work in the big city if we're going to keep packing people into ever denser housing and not have any reasonable ways for them to get out of that housing and to offices. And more remote work also means less need to pack people into dense urban housing, since they can do their jobs just as well from a swamp with reliable high-speed Internet access 150km away.

Not trying to be dismissive of your perspective - if you're in manufacturing in a low-density, rural area where people get to drive at 70km/h+ to and from the plant, get there in 20 minutes and park at the plant for free, I can understand that remote work feels like spoiled whining. But office jobs in the real big city involve very different tradeoffs.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
If you're in the middle of a swamp, it's probably different than the big city.
Definitely, and the nearest "big" city is another 20 minutes the other way for me, and we're only talking 3-4 million.

Well, guess what, moving a lot of people in and out of a tiny downtown area full of tall towers is not very efficient.
Yep. We should have better public transport.

Not trying to be dismissive of your perspective - if you're in manufacturing in a low-density, rural area where people get to drive at 70km/h+ to and from the plant, get there in 20 minutes and park at the plant for free, I can understand that remote work feels like spoiled whining. But office jobs in the real big city involve very different tradeoffs.
No doubt, but I still laugh at them and don't have sympathy. We also get paid a lot less than they do, in addition to having to pay for gas for the commute. Most of our workers are well outside the 20 minute out mark, but they can park free. And no, I wouldn't work in a big city, even to get paid more and work remotely sometimes.
 