
Disappointed with Mac Pro 2023?



VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
If their goal is to run certain software, they may need the old cheap hardware just to get it to work; certain processor/bus differences can upset timing-sensitive software. Even dropping in an SSD with the appropriate converters can make things not work if something runs faster than it should! Win98 has a CPU race condition that, for a long time, made it unable to run easily on decent hardware.

Just what's available and not. Windows has a much richer and more varied history.
There's a difference between old and crap. If you want a 486, say, you can get a good 486 on a motherboard with lots of expansion, VLB or PCI options for good video, etc. Or you can buy my first Windows machine: an AST 486DX2/50 that's actually Cyrix, not Intel (clever, deceptive marketing; the 50MHz Cyrix is apparently closer to a 33MHz Intel in performance), with 8250 UARTs for the serial ports (so forget about any external modems), only one PATA channel (so your 2x CD-ROM and your HDD run on the same channel), no expansion other than 3 ISA slots (two of which were used from the factory by an SB 16, the only good part of the box, and a 14.4 modem), crappy onboard Cirrus Logic video, a BIOS limited to 512 meg HDDs (it shipped with a 420 meg one), etc. Need I keep going? :)

I actually saw an original unmodified version of that machine, complete with the original 4MB of RAM and the 14" CRT that didn't support any good refresh rates, on Facebook Marketplace for CAD$150. I don't get it. Can't you find a decent 486DX2/66 with a real motherboard, 2 PATA channels, etc if you want a 486 DOS box? And if you want a CRT, aren't there any nice 17/19"s that support real refresh rates still floating around?

A few weeks earlier, I saw somebody selling a sibling of my second Windows machine, an IBM "E-series" (i.e. Acer-built/designed) Aptiva with a K6 or K6-2 for something like $300-350. This is a Mac forum so I won't set out all the ways in which this is a crappy system. But it was. Again, if you can find a Gateway or Dell or home-built system from the same era, you'll get a much better system.

These are bottom of the barrel, low-expandability, utterly forgettable clones that were sold to mass market consumers who didn't know better. I would further add that neither AST nor IBM were still selling consumer machines by the turn of the millennium, which might tell you something about the market's reaction to this junk.

And no, it's not just 'what's available and not'. That's what I found interesting. In Windows-land, things that were low-end at the time can sometimes end up more valuable for vintage purposes, because those low-end things have better compatibility with older things. E.g., vintage folks seem to love K6-2s because... I'm not sure exactly, but I think the compatibility with DOS/Win3.1 is much better. No one bought a new K6-2 in 1999 if they could afford a Pentium II/III or Athlon. Similarly, in ~2004-5, the PC world moved to DDR2/SATA/PCI Express, so if you had money and wanted the newest Pentium 4, you bought an i915 chipset board, a PCI-E GPU, DDR2 RAM, and a SATA HDD. If you had no money, you bought an LGA775 i865 board that would let you keep your AGP GPU and DDR1 RAM, and that had two PATA channels. Later variants of a few of those boards even supported Core 2 Duo CPUs. But here's what's interesting (and what almost no one was thinking about in 2004-7 when these things were being purchased): in the vintage community, the i865 board is actually more valuable because it runs Windows 98, which does not support PCI-E, whereas the i915 is useless - if you want something to run XP, well, there are a lot better XP machines than a mid-life single-core Pentium 4.

That doesn't appear to be true in Mac world, or at least I cannot think of any scenario where a lower-end Mac/component/etc would turn out to be more prized in the vintage world than a higher-end alternative from the same time period. (Okay, if you want to stretch it, I can think of one example - the MDD 2003 vs the FW800 MDD, the 2003 is preferable for retro purposes because it can boot OS 9 while the FW800 had the extra FW ports, faster CPUs, and I think better AirPort support. But the MDD 2003 is also the only example in Apple history I can think of where they deliberately built a mildly-'retro' system...)
 

VivienM

macrumors 6502
No doubt, but I still laugh at them and don't have sympathy. We also get paid a lot less than they do, in addition to having to pay for gas for the commute. Most of our workers are well outside the 20-minute mark, but they can park for free. And no, I wouldn't work in a big city, even to get paid more and work remotely sometimes.
Yup - one tragedy of the big city is that, sure, you get paid more, but a lot of your big-city pay goes to very overpriced housing, an expensive commute, etc. There are lots of jobs that, at least before the advent of remote work, only existed in big cities. If you're in those lines of work, you're kind of stuck; but if you work in a sector with opportunities both in big cities and smaller places, it's not at all clear whether the big city's bigger money actually leaves you ahead once you've paid for your overpriced big-city housing.

And it's worth noting that this is one of the things that appeals to people about remote work - if you can have the big city job and the big city pay without having the big city commute and the big city costs, well, that's a net win for a lot of people.

The only cheap thing about the big city is cheap fast Internet :) Oh, and a relatively plentiful supply of used Macs. Except Titanium G4s and beige G3s.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Need I keep going?
No, I understand what you're saying. :)

Can't you find a decent 486DX2/66 with a real motherboard, 2 PATA channels, etc if you want a 486 DOS box?
Probably. I don't really pick and choose to get better components, just ones that will work for the job. I can always add something else to make up for any shortfall. And lower-spec machines are just as reliable, IME.

A few weeks earlier, I saw somebody selling a sibling of my second Windows machine, an IBM "E-series" (i.e. Acer-built/designed) Aptiva with a K6 or K6-2 for something like $300-350. This is a Mac forum so I won't set out all the ways in which this is a crappy system. But it was. Again, if you can find a Gateway or Dell or home-built system from the same era, you'll get a much better system.
I remember those - I even bought a few when new - but there's nothing special about them, so I wouldn't buy them used. Maybe it's all nostalgia driving the purchases.

That doesn't appear to be true in Mac world, or at least I cannot think of any scenario where a lower-end Mac/component/etc would turn out to be more prized in the vintage world than a higher-end alternative from the same time period. (Okay, if you want to stretch it, I can think of one example - the MDD 2003 vs the FW800 MDD, the 2003 is preferable for retro purposes because it can boot OS 9 while the FW800 had the extra FW ports, faster CPUs, and I think better AirPort support. But the MDD 2003 is also the only example in Apple history I can think of where they deliberately built a mildly-'retro' system...)
A 2020 27" iMac will be one of those prized systems, since Apple seems to have discontinued that line. And it's a special machine; I'd buy one if I didn't already own one, and I use it more than my M-series Mac. In fact I'm using it right now. :)
 

VivienM

macrumors 6502
Maybe it's all nostalgia driving the purchases.
It's worth noting that I don't know if they actually get purchased. I've just seen the listings. But it seems odd that you can take a crappy computer from 25-28 years ago, dig it up from your mom's basement or attic or wherever it was stored for the last 20 years, and get it in your mind to sell it for the prices these are listed for.
A 2020 27" iMac will be one of those prized systems, since Apple seems to have discontinued that line. And it's a special machine; I'd buy one if I didn't already own one, and I use it more than my M-series Mac. In fact I'm using it right now. :)
Hey, I have a 2020 27" iMac :) Also typing on it right now, though I may turn my mind back to my 'new' Mac Pro in a second. I didn't buy it for its collectibility, really; I just bought it when the Mac Studio was launched at 2.5X the price. I want to say it's the first T2 iMac, and that might give it some extra longevity, too? My one worry about its value as a collectible is on the battery front - this is not a system I want to open up to change a BR2032 or CR2032 or whatever battery is in this thing. But yes, the 2020 iMac is a machine that seems like it has great collectibility - the last of the 64-bit Intels...
 

Longplays

Suspended
May 30, 2023
1,308
1,158
That actually works pretty well, and it's proving more reliable than old IDE hard drives.
Indeed it does, but many point out that the substitution detracts from the experience.

Personally I am just amazed by how fast a 90s Intel PC flies with a CF card.

Then the joy of maxing out the RAM of the motherboard!

Like, say, 8MB to 64MB!

It is just so weird how the PSU fan and HSF sound without the creak of an HDD.
 

CraigJDuffy

macrumors 6502
Jul 7, 2020
480
780
It’s big if you need to add high-end capture cards or do multi-cam broadcasting. If there is no need for PCIe, the Studio is a damn fine machine.
Didn't say it was a bad thing, just that for the vast majority of people it’s not really that big a deal - especially since it can’t be used to add GPUs.
 

bobcomer

macrumors 601
Hey, I have a 2020 27" iMac :) Also typing on it right now, though I may turn my mind back to my 'new' Mac Pro in a second. I didn't buy it for its collectibility, really, I just bought it when the Mac studio was launched at 2.5X the price. I want to say it's the first T2 iMac and that might give it some extra longevity maybe, too? My one worry about its value as a collectible is on the battery front - this is not a system I want to open up to change a BR2032 or CR2032 or whatever battery is in this thing. But yes, the 2020 iMac is a machine that seems like it has great collectibility - the last of the 64-bit Intels...
I didn't buy it for its collectibility either; I only thought about that from this thread. Yeah, opening it up will be a problem if that time comes. :( It's a great machine though, the best display I've ever owned and good in every other way. Though I do wonder why Apple doesn't like height adjustment.
 

bobcomer

macrumors 601
Indeed it does but many point out that the substitution detracts from the experience.
I'll pass! I hope those old spinny things go the way of the dinosaurs sooner rather than later.

Personally I am just amazed by how fast a 90s Intel PC flies with a CF card.

Then the joy of maxing out the RAM of the motherboard!

Like say 8MB to 64MB!
I know what you mean, they're usable again!
 

VivienM

macrumors 6502
Mac Pro is a niche product; it’s not for the vast majority of people.
And that's where we come back full circle.

A beige Power Mac G3 was not a niche product; it was for the vast majority of people, or at least the vast majority of 'pro' users regardless of what their work was. Probably at least some home users if they could get their hands on one.

Same with, say, all of the G4s with their base US$1699 pricing.

This started to change with the G5 - pricing went up, 'extremeness' went up - and the Mac Pro, which consistently rejected Intel's mainstream chips in favor of serious workstation architectures and also kept upping the base entry price. But still... a low-end 5,1 configuration for US$2499 was for a reasonable number of people. And at least some of those folks were not traditional pro users but enthusiasts who had embraced the G3/G4/G5/Mac Pro as the Mac world's closest equivalent to building your own Windows box.

But now the successor to these machines is US$5999 or US$6999, has lost many of the things that some people found appealing, and is basically aimed at only one set of people: big-budget professionals whose work requires PCI-E cards that are not GPUs, two network interfaces, or a few other very niche things.

Put another way, the 2013/2019/2023 Mac Pros are only for people who, if they were Windows users, would pull out their employer's credit card and order a Dell or Lenovo workstation with expensive, fully-warrantied OEM RAM, a long on-site warranty, and components certified by particular software vendors. Not for people who, if they were Windows users, would go down to the computer store or Newegg, buy a big pile of parts, put them together, and then replace one or two of those parts every year or two. :(

The irony is that Apple has actually delivered a reasonable headless desktop for the first time in at least 10 years... but it has zero, zero modularity.
 

Longplays

Suspended
This started to change with the G5 - pricing went up, 'extremeness' went up - and the Mac Pro, which consistently rejected Intel's mainstream chips in favor of serious workstation architectures and also kept upping the base entry price. But still... a low-end 5,1 configuration for US$2499 was for a reasonable number of people. And at least some of those folks were not traditional pro users but enthusiasts who had embraced the G3/G4/G5/Mac Pro as the Mac world's closest equivalent to building your own Windows box.
This changed because G5 laptops were impossible and the better alternative at the time was Intel.

Steve knew, as far back as the first Power Mac G5, that laptops would be preferred.

As more users upgraded from their Power Macs to MacBook Pros, the economies of scale for the Mac Pro dropped, leaving the 1% of use cases that need it.

Those MacBook Pro users then eventually went to the MacBook Air 11" & 13", and maybe the MacBook 12", because their use cases did not change all that much but there were lighter/cheaper/faster/smaller tools to fulfill their needs.

Which brings us to the M1 and M2 Macs... these are the most popular SoC SKUs, likely present in more than 80% of all Macs shipped, with the Pro and Max SKUs occupying less than 19% and the Ultras less than 1%.

Why is it that way? Likely because the M1 and M2 are "good enough".
 

burgerrecords

macrumors regular
Jun 21, 2020
222
106
I look forward to the day when x86 will just be legacy. Like mainframes.
Though it has some merit, I think there are a number of faults with the x86/mainframe analogy.

I think a better analogy for x86 in personal computing is the internal combustion engine to the SoC's electric motor. At least for the next 5 years, while there's a lot of overlap, they're often different tools for different jobs rather than better or worse.

And for personal-computing SoCs, Apple is like Tesla: first to cross the mainstream finish line and a great leap forward, but far from the best at everything a vehicle needs to do.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Though it has some merit, I think there are a number of faults with the x86/mainframe analogy.

I think a better analogy for x86 in personal computing is the internal combustion engine to the SoC's electric motor. At least for the next 5 years, while there's a lot of overlap, they're often different tools for different jobs rather than better or worse.

And for personal-computing SoCs, Apple is like Tesla: first to cross the mainstream finish line and a great leap forward, but far from the best at everything a vehicle needs to do.
Two things:
1) SoC and x86 are not incompatible.

If you look at x86 over the last 25 years, x86 systems have gotten more and more integrated in a number of subtle waves. Not as integrated as a smartphone, sure, but you've seen what used to be discrete controllers moving into multipurpose chipsets (the so-called north/south bridges from the late 1990s to the late 2000s), and then graphics, PCI-E controllers, memory controllers, etc. all moving onto the processor die, particularly starting in the 2010s.

The rise of ARM is a different phenomenon; in Apple's case, motivated by economies of scale with their mobile platforms.

The two trends happen to converge in the Apple case, but it doesn't have to be that way. Windows-world could easily run x64 on a SoC in a Dell laptop. In fact, doesn't AMD already call some of their x64 CPUs SoCs?

2) I actually think the mainframe analogy makes sense. Mainframes continue to exist, continue to run their own ISA backwards compatible to 1964, but they have adopted some technologies from mini/microcomputers and smartphones over the years (such as moving to CPUs made using the same transistors as microcomputers or smartphones, networking/storage technologies, etc) and they have become relegated to ever more specialized workloads. Even, say, some of the big travel systems (which were one of the killer apps for mainframes in the 50s/60s) are moving away from mainframes - I think it's Amadeus that got rid of all of their mainframes a few years ago.

I suspect IBM, if someone actually wanted it, could design you a SoC that implements zArchitecture and Samsung/TSMC could make it, and then create a smartphone-sized device with full mainframe compatibility. But unless bank software vendors have travelling sales reps who need to pull a demo out of their bag, no one probably wants a zArchitecture SoC.

The thing is - x86 isn't going away. There's too much half-zombified x86 software that businesses rely on written in the 1990s/2000s by companies that have since gone through three mergers and laid off most of the developer teams. But whether that software might run in emulation, in hosted desktops running on public clouds, etc is an open question, and certainly the era of everybody having an x86 system on their desk is ending.
 

Longplays

Suspended
The two trends happen to converge in the Apple case, but it doesn't have to be that way. Windows-world could easily run x64 on a SoC in a Dell laptop. In fact, doesn't AMD already call some of their x64 CPUs SoCs?
I think AMD calls their SoCs APUs?
 

Homy

macrumors 68030
Jan 14, 2006
2,507
2,459
Sweden
Okay, they just confirmed that upgradable memory and GPUs are not coming to Apple Silicon, so now we can put that discussion to rest.

"Fundamentally, we've built our architecture around this shared memory model and that optimization, and so it's not entirely clear to me how you'd bring in another GPU and do so in a way that is optimized for our systems," Ternus told Gruber. "It hasn't been a direction that we wanted to pursue."

 

burgerrecords

macrumors regular
I actually think the mainframe analogy makes sense.
The analogy breaks down for me in the sense that the change from mainframe to x86 personal computer is much more dramatic than x86 to SoC; SoC Macs are still used as personal computers in essentially the same way that personal computers have been for the past 25+ years (whether PowerPC or x86, or even lingering 68k in 1998). I like the low power draw of Apple Silicon, but it isn't having an effect on my productivity.

In the medium term, virtualization of x86 on a SoC doesn't make sense at all when so much good native x86 software exists on such inexpensive hardware.

Also in the medium term, though, for computing that doesn't involve manipulating large high-resolution files on a client machine of some type or another, productivity applications will continue to become more and more browser-based. In that environment Apple could be even more competitive, not necessarily based on actual horsepower, but on ecosystem familiarity and ease of use.
 

VivienM

macrumors 6502
The analogy breaks down for me in the sense that the change from mainframe to x86 personal computer is much more dramatic than x86 to SoC; SoC Macs are still used as personal computers in essentially the same way that personal computers have been for the past 25+ years (whether PowerPC or x86, or even lingering 68k in 1998).
I wasn't really around at the time, but I don't think there was really a change from mainframe to x86 directly.

First you got minicomputers - DEC PDPs, VAXes, etc - and those took some of the mainframe's workloads away. Mainframes kept doing the big stuff.

Then RISC machines (e.g. SPARC) that continued to run *NIX ate the CISC minicomputers.

Then, a good chunk of the RISC guys running *NIX, plus the legacy of the minicomputer era's biggest player, DEC, made a very wrong bet on Itanium.

With most of the RISC guys on Itanium and Sun as the main remaining player, x86 - and more particularly x64, running either NT or Linux - starts seriously eating the RISC machines' lunch. And as that continues, the x64 machines get bigger, moving into bigger workloads that previously would have been reserved for the biggest boxes from Sun or whoever. (To put it another way: if you were starting eBay in the late 1990s and you needed a giant database server, you bought a Sun Enterprise 1000. Some years later, you might have bought a big x64 box, and a few years after that, you would have hosted your database in a public cloud.)

Meanwhile, the mainframes have kept mainframing in at least some of the core segments, e.g. banking, etc., and have been replaced directly in some other cases by virtualized/cloud x86 boxes.

But really, the world has probably been moving away from mainframes slowly starting at some point in the 1970s. And yet, it's worth noting, the IBM mainframes have outlasted all the minicomputer/RISC stuff. When's the last time DEC released a new VAX CPU? Or the last time Sun released a new SPARC chip? IBM released a new zArchitecture one a year or two ago...

I don't know why you keep saying "SoC" - the fact that the Apple Silicon chips are SoCs is... not that significant. The i5-1038NG7 in the 2020 MacBook Pros is not that far from a SoC; it runs a different ISA, it has a different memory architecture, etc. And it has much bigger transistors. The AMD CPU in my Steam Deck is an x64 SoC, I believe - you could have put a chip like that in a MacBook and it would be a SoC, yet it wouldn't be Apple Silicon or ARM or share the underlying philosophy of Apple Silicon.

What's more important is that Apple Silicon indicates that smartphone technology is creeping its way up and replacing traditional PC technology. Just like, say, an x64 server running Solaris or OpenVMS is fundamentally just another x64 IBM PC compatible, an Apple Silicon Mac is fundamentally a smartphone with a big display and a physical keyboard running a Mac's GUI and adding some drivers for some types of peripherals not commonly used with smartphones. And the fact that such a thing... turns out to be much better at being a Mac and doing 98% of Mac workflows than a Mac built out of traditional PC technology says a lot. 15 years ago, a leading smartphone chip was a joke compared to Intel's industry-leading traditional PC technology; today, it's... the other way around.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,606
4,114
And that's where we come back full circle.

The irony is that Apple has actually delivered a reasonable headless desktop for the first time in at least 10 years... but it has zero, zero modularity.
It’s very simple. Apple has gone the way of CPU/GPU in a single SoC with memory unified in a single package. I like the direction Apple is going with unified memory. I couldn’t care less about the G3, the G5, or whatever happened in the past. My last Mac Pro was a 2012; I moved on to an AMD Threadripper and an Nvidia 3090, now upgraded to a 4090. There are certain tasks an M1 Max MBP with 64GB can do much better than my 4090, which runs out of VRAM at its 24GB limit. Hopefully Apple can beef up the tensor cores/GPU and bump unified memory.
 

ADGrant

macrumors 68000
Mar 26, 2018
1,689
1,059
You're right, I don't have an M2 ultra, nor will I. The noise issue is also real for the M1 based Studios, and no way will I ever chance that again without a big redesign. I know most don't have the problem, but I'm "lucky" that way. I just traded my Studio Max in for a Mini Pro. The Mini will have to do.

As for having a thermal problem, you seemed to be falling into that side of the argument, questioning those who said it doesn't have a problem. Remember, you're talking about basically a mobile chip in a desktop with desktop-level cooling.
Since none of us has a Mac Studio with an M2 Ultra, I don't see how we can know if there will be any thermal issues. The Ultra is the one Apple Silicon SoC that is not available in any mobile device and the cooling system in the Studio is upgraded in the Ultra version.

I think we can safely assume that there will be no thermal issues with the Mac Pro, because that cooling system was designed for a 14nm 28-core Intel CPU.
 

VivienM

macrumors 6502
Since none of us has a Mac Studio with an M2 Ultra, I don't see how we can know if there will be any thermal issues. The Ultra is the one Apple Silicon SoC that is not available in any mobile device and the cooling system in the Studio is upgraded in the Ultra version.

I think we can safely assume that there will be no thermal issues with the Mac Pro, because that cooling system was designed for a 14nm 28-core Intel CPU.
One thing I briefly noticed but didn't spend any time looking into - when I was looking for Geekbench M2 Ultra numbers, I got the impression there were two clock rates of M2 Ultras in there.

Is it possible that the variant in the Mac Pro is clocked higher or something?
 

ADGrant

macrumors 68000
But... the fundamental question is why Windows should be around for the ride. No one has been writing any new Windows software for over a decade - Chrome is the new OS for pretty much any new project, with perhaps an Electron version if you want to trick people into thinking you have a native desktop app.

I wrote some new Win32 code just last week (using the classic C APIs and my tower PC with a Threadripper).

Microsoft has a large development team working on Microsoft Office for Windows (the core product, not the abomination known as Teams). Adobe also develops a lot of Windows software.
 

VivienM

macrumors 6502
Microsoft has a large development team working on Microsoft Office for Windows (the core product, not the abomination known as Teams). Adobe also develops a lot of Windows software.
Sure, but those are not new products - they're just newer versions of long-established code bases. Look at new products - things that did not exist 15 years ago. If you had a great idea for software to do X (especially if X has a databasey/client-server aspect) in 2010, went and got a bunch of investor money, and built something to do X, there's a very, very good chance you wrote it for Chrome. Your investors love how web technologies basically guarantee a subscription business model, too.

I will note that Adobe's "Creative Cloud Desktop App" (probably the buggiest piece of Creative Cloud) is written in Electron. And let's not talk about 'the abomination known as Teams' - it amazes me that Microsoft, a company which should still have a top-tier team of Windows developers and certainly used to have a top-tier team of Mac developers, is somehow embracing Electron (and doing a bad job of it) with Teams. It's one thing if some scrappy startup with 3 developers says they don't have the resources for anything but Electron, but Microsoft?!?

And how do you feel about the plan to rewrite Windows Outlook on Edge WebView2? Probably the same person who came up with Windows 8 came up with that one - I still don't understand who thought trashing a big, bloated mess of a program with a broad ecosystem of third-party add-ins that businesses rely on every day was a good idea. Big legacy Windows Outlook is probably half the reason for choosing Office 365 over G Suite...
 