
Adarna

Suspended
Original poster
Jan 1, 2015
685
429
I don't think you understand the target market of the Mac Pro.

Those who buy them are not looking at buying them from the standpoint of "is my computer the latest and greatest"; they look at it strictly as a matter of "will this tool help me do what I need to do, as quickly as I need to do it" and "how long will I have support for my hardware, Apple's software, and the third party tools that I need to run on it".

Mac Pros last WAY longer than MacBook Pros and, for the most part, longer than most iMacs as well. They're tanks. The Xeons that have gone into them are tanks. Most Intel processor technology isn't designed to last as long as your average Xeon, nor is it rated to be supported as long. That's why the Mac Pros have, historically, had the longest support life of any Apple Mac-based product. Someone who bought a 2019 Mac Pro will get 10 years of use out of it, easily. There will surely be those that try and even succeed at getting 20 (despite eventually losing software support for the OS well before that 20 year mark).

Incidentally, there's a LOT of professional-grade software that probably shouldn't be run in Rosetta 2 (despite likely being usable in Rosetta 2) that STILL isn't native and probably won't be native for a few more years to come. Someone who bought a 2019 Mac Pro even as recently as two weeks ago (and really needed one of those over a 27" iMac or anything with the M1/M1 Pro/M1 Max), probably still made the right call in doing so.

Furthermore, something doesn't become useless just because something better than it comes out.
My 2012 iMac has security updates ending in 2022.

The 2009 MBP I researched had security updates until 2018.

From 2013 on, I'd want the Mac Pro to be refreshed every 4-5 years.

So that would be
  • 2013 Intel
  • 2017/2018 Intel
  • 2021/2023 Apple silicon
When the Mac Pro with Apple silicon debuts, keep selling the Intel Mac Pro until its demand dwindles to less than 20% of Apple silicon sales.

The 2017 iMac Pro came out because customers demanded a pro desktop with Xeon chips. Likewise, the 2019 Mac Pro shipped later because users wanted a tower Mac.

It personally bothers me that the refresh was done this way.
 
Last edited:

Analog Kid

macrumors G3
Mar 4, 2003
9,360
12,603
I would think that the AS Mac Pro will have ECC DIMM slots.
Which raises the question of whether the existing M1/Pro/Max memory controllers support ECC RAM, or if this is another indication that there’s another chip architecture in the works. ECC was important enough to Apple in the Intel machines that they were willing to sacrifice some performance and a lot of cost to move to Xeons and ECC. I tend to agree they’ll use ECC in the AS Mac Pros as well.
 

Taz Mangus

macrumors 604
Mar 10, 2011
7,815
3,504
Unified memory, macOS, and Apple Silicon let Macs outperform PCs with more RAM. Even in RAM stress tests, Apple Silicon delivers incredible results, while Windows slows to a crawl.

Pathetic performance on the Surface laptop. Battery life is also not great.
 
  • Like
Reactions: ikir

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Which raises the question of whether the existing M1/Pro/Max memory controllers support ECC RAM, or if this is another indication that there’s another chip architecture in the works. ECC was important enough to Apple in the Intel machines that they were willing to sacrifice some performance and a lot of cost to move to Xeons and ECC. I tend to agree they’ll use ECC in the AS Mac Pros as well.
I would think supporting ECC would not be too difficult, and I don't think it'll require a lot of die real estate. I wouldn't be surprised if the memory controllers in even the M1 already bake in support for LPDDR5 and ECC.

From what I know, ECC is kind of a requirement for systems with a huge amount of RAM, as the probability of getting a flipped bit is higher. In fact, ECC should be mandatory for all computer systems regardless of memory capacity.
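For anyone curious what the controller actually does: here is a toy sketch of a SECDED (single-error-correct, double-error-detect) extended Hamming code in Swift. Real ECC DIMMs do this in hardware over 64-bit words with 8 check bits; this 4-bit version is only meant to show the idea, not how any shipping memory controller is implemented.

```swift
// Toy extended Hamming(8,4) SECDED code: 4 data bits -> 8-bit codeword.
// Bits 1...7 form a Hamming(7,4) codeword (parity at positions 1, 2, 4);
// bit 0 is an overall parity bit that upgrades it to SECDED.

func encode(_ data: UInt8) -> UInt8 {
    let d = [(data >> 0) & 1, (data >> 1) & 1, (data >> 2) & 1, (data >> 3) & 1]
    let p1 = d[0] ^ d[1] ^ d[3]   // covers positions 3, 5, 7
    let p2 = d[0] ^ d[2] ^ d[3]   // covers positions 3, 6, 7
    let p4 = d[1] ^ d[2] ^ d[3]   // covers positions 5, 6, 7
    var word: UInt8 = 0
    for (pos, bit) in [(1, p1), (2, p2), (3, d[0]), (4, p4), (5, d[1]), (6, d[2]), (7, d[3])] {
        word |= bit << pos
    }
    let overall = (0..<8).reduce(UInt8(0)) { $0 ^ ((word >> $1) & 1) }
    return word | overall         // overall parity lands in bit 0
}

enum ECCResult { case ok(UInt8), corrected(UInt8), uncorrectable }

func decode(_ word: UInt8) -> ECCResult {
    // Syndrome = XOR of the positions (1...7) whose bits are set;
    // for a single flipped bit it points straight at the culprit.
    var syndrome = 0
    for pos in 1...7 where (word >> pos) & 1 == 1 { syndrome ^= pos }
    let overallOK = (0..<8).reduce(UInt8(0)) { $0 ^ ((word >> $1) & 1) } == 0

    if syndrome != 0 && overallOK { return .uncorrectable }   // two flips: detect only

    let fixed = overallOK ? word : word ^ (UInt8(1) << syndrome)
    let data = ((fixed >> 3) & 1)
             | (((fixed >> 5) & 1) << 1)
             | (((fixed >> 6) & 1) << 2)
             | (((fixed >> 7) & 1) << 3)
    return overallOK ? .ok(data) : .corrected(data)
}

let word = encode(0b1011)
let hit = word ^ (1 << 5)      // simulate a cosmic-ray flip of one bit
print(decode(hit))             // corrected(11): the data survives
```

The scaling argument above follows directly: with a roughly fixed per-bit flip probability, more RAM means more expected flips per unit time, so correction stops being optional.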
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
I would think supporting ECC would not be too difficult, and I don't think it'll require a lot of die real estate. I wouldn't be surprised if the memory controllers in even the M1 already bake in support for LPDDR5 and ECC.

From what I know, ECC is kind of a requirement for systems with a huge amount of RAM, as the probability of getting a flipped bit is higher. In fact, ECC should be mandatory for all computer systems regardless of memory capacity.
ECC is pretty trivial to support.
 

Analog Kid

macrumors G3
Mar 4, 2003
9,360
12,603
ECC is pretty trivial to support.
I would think supporting ECC would not be too difficult, and I don't think it'll require a lot of die real estate. I wouldn't be surprised if the memory controllers in even the M1 already bake in support for LPDDR5 and ECC.

From what I know, ECC is kind of a requirement for systems with a huge amount of RAM, as the probability of getting a flipped bit is higher. In fact, ECC should be mandatory for all computer systems regardless of memory capacity.
Agreed, it’s trivial, but I haven’t seen any indication one way or the other whether it’s present in the current parts… If it isn’t in the current M1 Max, for example, that would be further evidence that the Pro desktops will be built around a different part.
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
How much RAM do I need for bioinformatics?

In general, 32 cores and 128 GB of RAM is usually sufficient for most common bioinformatics pipelines to run within a reasonable timeframe. With that being said, some programs might require much less than this, while others may have much higher memory requirements or enable greater parallelisation. (Feb 18, 2021)



My son's workplace does this stuff running multiple pipelines simultaneously.

My last job typically allocated servers with 1.4 TB of RAM so I assume that our customers were doing something big to require that much RAM. One of my friends (Senior Manager) at a big semiconductor company told me that this is the kind of configuration that his engineers use for chip design.
Hm, our bioinformatics developers have relatively old MBPs and let a mainframe/supercomputer/cloud setup do the heavy lifting. That is quite common. That means you do not need 32 cores and 128 GB on your desk.
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
I don't think you understand the target market of the Mac Pro.

Those who buy them are not looking at buying them from the standpoint of "is my computer the latest and greatest"; they look at it strictly as a matter of "will this tool help me do what I need to do, as quickly as I need to do it" and "how long will I have support for my hardware, Apple's software, and the third party tools that I need to run on it".

Mac Pros last WAY longer than MacBook Pros and, for the most part, longer than most iMacs as well. They're tanks. The Xeons that have gone into them are tanks. Most Intel processor technology isn't designed to last as long as your average Xeon, nor is it rated to be supported as long. That's why the Mac Pros have, historically, had the longest support life of any Apple Mac-based product. Someone who bought a 2019 Mac Pro will get 10 years of use out of it, easily. There will surely be those that try and even succeed at getting 20 (despite eventually losing software support for the OS well before that 20 year mark).

Incidentally, there's a LOT of professional-grade software that probably shouldn't be run in Rosetta 2 (despite likely being usable in Rosetta 2) that STILL isn't native and probably won't be native for a few more years to come. Someone who bought a 2019 Mac Pro even as recently as two weeks ago (and really needed one of those over a 27" iMac or anything with the M1/M1 Pro/M1 Max), probably still made the right call in doing so.

Furthermore, something doesn't become useless just because something better than it comes out.

A trading friend asked me to help him configure a Mac Pro back in late 2019. His previous Mac Pro was purchased in 2012 or 2013 and had intermittent crash issues. I suspected a faulty PSU, but he wanted to upgrade, so I just explained his options and their benefits, and he placed an order. My guess is that his use of the Mac Pro helps him generate at least $500,000 a year in his regular work, so $10K for a computer is a very small cost compared to the money it helps him make. He had an older Mac Pro before that one.

It's just a tool to him and I think that most people or companies that buy them look at them that way.

I see a lot of people who earn their livings using MacBook Pros and Mac minis. I think many even do that with the base M1 models now. They have so much punch that their uses can span personal, entertainment, and work, or they can be the main component of your business.
 

Taz Mangus

macrumors 604
Mar 10, 2011
7,815
3,504
I think we will see the same situation happen with AS for Macs that has happened with AS for the iPhone/iPad: Apple will progress so far ahead of Intel and AMD that neither will be able to match Apple in performance and power efficiency. This is already the situation with Qualcomm Snapdragon and Samsung Exynos; neither can match the Apple A series in performance and power efficiency.
 

darngooddesign

macrumors P6
Jul 4, 2007
18,366
10,128
Atlanta, GA
If Apple were cool they would call it the M1-Quadra.

The M1-Quadra needs to be a lot more powerful than Alder Lake, because AL isn't a workstation/pro-tower chip.
 
Last edited:
  • Like
Reactions: Technerd108

Tagbert

macrumors 603
Jun 22, 2011
6,259
7,285
Seattle
Not falling for the clickbait, but why compare to a <=$620 consumer desktop Alder Lake? Wait for 3nm Raptor Lake HPC in 2022 for a fair comparison. Plus, what good is fast hardware when hardly any native software exists?
I'm sure that there will be some software running on Alder Lake processors soon.
It only took a few months for most current Mac software to be ready for Apple Silicon.

 
Last edited:

rhysmorgan

macrumors 6502
Dec 14, 2008
317
122
Cardiff, Wales
We don't even know if it will fully be replaced anytime soon. Sure, there are workflows for which the AS MacPro might replace the Intel one, but others probably won't allow that. I'd be a little wowed if Apple pulls it off next year to provide a SoC based MacPro with 1 to 1.5TB of RAM. And if they decide to go "the old way", then that integrated unified memory advantage is gone.
I can see them potentially going down a hybrid route.
Lots and lots of unified memory, plus expansion slots for DDR5 DIMMs.
 

falkon-engine

macrumors 65816
Apr 30, 2010
1,307
3,093
Why do people think that gamers are the target market for Apple pro-level products? Sure, you CAN game on them, but I don't personally know anyone with a Mac who cares about running an FPS. Yeah, the target demo for MacRumors forums users is the type that might try to run a game or two, but I don't think the average customer cares that much. If you're a gamer, you want a machine that's upgradable so you can pop in different video cards, upgrade the RAM, etc., and that's just not possible unless you have a Mac Pro. And who buys a Mac Pro and a $6000 display to run Crysis?
Just because you're limited in the number of people you know personally doesn't mean that there isn't desire among many of us in the Mac community for better gaming on macOS. The new MacBook Pros are amazing, but they're also expensive... being able to run games while simultaneously being able to use the machine for pro tasks like video editing and so on isn't some magical wish. It would put $3000 to good use, rather than needing to have two separate systems (one for pro tasks and another for gaming).

Apple is clearly iterating on its hardware, adding 32 GPU cores and up to 64 GB of unified RAM. What's lacking is investment in the software, which includes AAA game developer support. That is something that Apple could change, if it invested the requisite time and money. Hardware is only one side of the equation. The other important side is software.
 
Last edited:
  • Like
Reactions: ikir and rhysmorgan

danwells

macrumors 6502a
Apr 4, 2015
783
617
Just because you're limited in the number of people you know personally doesn't mean that there isn't desire among many of us in the Mac community for better gaming on macOS. The new MacBook Pros are amazing, but they're also expensive... being able to run games while simultaneously being able to use the machine for pro tasks like video editing and so on isn't some magical wish. It would put $3000 to good use, rather than needing to have two separate systems (one for pro tasks and another for gaming).

Apple is clearly iterating on its hardware, adding 32 GPU cores and up to 64 GB of unified RAM. What's lacking is investment in the software, which includes AAA game developer support. That is something that Apple could change, if it invested the requisite time and money. Hardware is only one side of the equation. The other important side is software.
The Mac Pro isn't the gaming machine... If you were talking about the 27" MBP, MAYBE there needs to be a gaming optimization (a lot of creative pros don't want to see the instability that AAA game support brings).

One of the reasons macOS is generally more stable than Windows is that it doesn't make concessions to games, which want to access the hardware at a lower level than most other software.

I'd say the ideal solution would be to offer an OS-level switch, where you could boot either with gaming off (more stable, but games that want heavy hardware access won't run) or on (allows much-increased game compatibility, but at a stability cost). Of course, little casual games will run either way - those don't have special requirements. Should be switchable with a reboot without having to reinstall anything.
 

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
The Mac Pro isn't the gaming machine... If you were talking about the 27" MBP, MAYBE there needs to be a gaming optimization (a lot of creative pros don't want to see the instability that AAA game support brings).

One of the reasons macOS is generally more stable than Windows is that it doesn't make concessions to games, which want to access the hardware at a lower level than most other software.

I'd say the ideal solution would be to offer an OS-level switch, where you could boot either with gaming off (more stable, but games that want heavy hardware access won't run) or on (allows much-increased game compatibility, but at a stability cost). Of course, little casual games will run either way - those don't have special requirements. Should be switchable with a reboot without having to reinstall anything.

I’m assuming you mean the 27” iMac? Obviously a 27” MBP would be a bit big to lug around. ;) Either way though, the hardware in the 14/16” MBP is quite capable for laptop gaming.

The issue of application/gaming software optimization has little to do with the OS, at least not in the way you’ve written here. The Metal API and macOS frameworks are no lower or higher level than DX12/Vulkan and the Windows frameworks. I’m assuming you’re thinking of drivers? It’s true that Nvidia and AMD will release game-optimized drivers for their GPUs when big AAA titles launch, and sometimes those drivers can be buggy even if they run the game faster. Apple now has full-stack control from the GPU hardware to the drivers to the OS frameworks and APIs, so I suppose Apple could do the same, but it is unlikely to.

The professional software that Apple has been touting their performance in also wants “heavy access” to the bare metal. The issue of gaming software support is the rewriting of renderers and code paths to support aarch64 CPUs with TBDR GPUs and unified memory. That simply takes time to do and optimize, and it has little to do with specific qualities of macOS, other than it not being Windows and therefore requiring different libraries and frameworks.
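To make the unified-memory point concrete, here is a minimal Metal compute sketch in Swift. The kernel and names are made up for illustration; the point is the single .storageModeShared allocation that CPU and GPU both touch with no staging copy, which is exactly the assumption ported renderers have to be rewritten around.

```swift
import Metal

// Hypothetical toy kernel, embedded as source so the sketch is self-contained.
let src = """
#include <metal_stdlib>
using namespace metal;
kernel void doubleValues(device float *data [[buffer(0)]],
                         uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let pipeline = try! device.makeComputePipelineState(
    function: try! device.makeLibrary(source: src, options: nil).makeFunction(name: "doubleValues")!)

var values: [Float] = [1, 2, 3, 4]
// On Apple silicon this is ordinary system RAM, visible to both CPU and GPU.
let buffer = device.makeBuffer(bytes: &values,
                               length: values.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let queue = device.makeCommandQueue()!
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(buffer, offset: 0, index: 0)
enc.dispatchThreads(MTLSize(width: values.count, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: values.count, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

// The CPU reads the GPU's result straight out of the same allocation: no blit, no copy.
let out = buffer.contents().bindMemory(to: Float.self, capacity: values.count)
print((0..<values.count).map { out[$0] })   // [2.0, 4.0, 6.0, 8.0]
```

On a discrete-GPU system the same logic would typically want a .private or .managed buffer plus explicit synchronization, which is the kind of code-path difference being discussed here.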
 

binarysmurf

macrumors member
Jun 2, 2008
79
71
Western Australia
They’d do anything to make a buck on YouTube. Their content is made for monetization purposes only.

Totally agree. That channel is one I have purposely unsubscribed from. They simply regurgitate info from other sources, and it's obviously nothing more than clickbait the vast majority of the time. How he has so many subscribers, I don't know.
 
Last edited:

binarysmurf

macrumors member
Jun 2, 2008
79
71
Western Australia
One of the reasons macOS is generally more stable than Windows is that it doesn't make concessions to games, which want to access the hardware at a lower level than most other software.

This hasn't been true for a long time. There are hardware abstraction layers (DirectX under Windows, and Metal for macOS) that remove the need for game devs to hit the hardware directly while still providing high performance gaming capability.

The problem with gaming on the Mac is quite simply that the GPU hardware hasn't been up to scratch, and there hasn't been enough focus by developers on AAA Mac games because of this. Add to that the Mac's relatively low market share compared to Windows, and there's not much incentive for serious game development on the Mac. Hopefully this will change with the AS hardware and advancements in Metal.

I've had a great time gaming on my Mac system for years. It runs the style of games I love to play, and there are plenty of options. However, I didn't buy my Mac as a gaming rig; the fact that I can game at all is a bonus.
 

jjcs

Cancelled
Oct 18, 2021
317
153
I think we will see the same situation happen with AS for Macs that has happened with AS for the iPhone/iPad: Apple will progress so far ahead of Intel and AMD that neither will be able to match Apple in performance and power efficiency. This is already the situation with Qualcomm Snapdragon and Samsung Exynos; neither can match the Apple A series in performance and power efficiency.

I see the Reality Distortion Field that mostly dissipated after the transition from PowerPC building back up...
 
  • Haha
Reactions: singhs.apps

AetherMass

macrumors newbie
Oct 19, 2017
13
19
  • Large scale 3D rendering. Complex models can consume a lot of RAM.
  • Scientific modeling with large datasets.
I’m sure there are others
Data scientist, computer scientist, programmer here.

Anyone buying a machine costing $5k or more for scientific modeling, machine learning, or statistical work should certainly move to the cloud in 2021. Cloud setups are much more scalable and probably cheaper for 99% of business applications.

I understand that the same goes for 3D rendering in 2021. Also, security is not a legit response to this. The cloud is as secure as, or more secure than, your local workstation if managed properly.

Still, I understand that some individuals may choose to work locally out of habit… if this is for a business, it will be tough to compete with businesses that use the cloud.

Seems like a shiny toy but not really practical or cost effective.
 
  • Like
Reactions: iPadified

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Data scientist, computer scientist, programmer here.

Anyone buying a machine costing $5k or more for scientific modeling, machine learning, or statistical work should certainly move to the cloud in 2021. Cloud setups are much more scalable and probably cheaper for 99% of business applications.

I understand that the same goes for 3D rendering in 2021. Also, security is not a legit response to this. The cloud is as secure as, or more secure than, your local workstation if managed properly.

Still, I understand that some individuals may choose to work locally out of habit… if this is for a business, it will be tough to compete with businesses that use the cloud.

Seems like a shiny toy but not really practical or cost effective.

Security is absolutely a legit response to this. There are federal laws, regulations, and ethical obligations that prevent the use of the cloud in many cases, absent special contracts and arrangements ensuring that data is hosted only in the jurisdiction, that only people subject to special contracts or agreements have physical access to the server (or, stricter still, that only employees of the organization that “owns” the data do), etc. There are entire industries that cannot use the cloud (unless you are including self-hosted clouds).
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Security is absolutely a legit response to this. There are federal laws, regulations, and ethical obligations that prevent the use of the cloud in many cases, absent special contracts and arrangements ensuring that data is hosted only in the jurisdiction, that only people subject to special contracts or agreements have physical access to the server (or, stricter still, that only employees of the organization that “owns” the data do), etc. There are entire industries that cannot use the cloud (unless you are including self-hosted clouds).
Not to mention that more than a few freelance operators probably don't want to render some of their work on other people's machines.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,679
Data scientist, computer scientist, programmer here.

Anyone buying a machine costing $5k or more for scientific modeling, machine learning, or statistical work should certainly move to the cloud in 2021. Cloud setups are much more scalable and probably cheaper for 99% of business applications.

I understand that the same goes for 3D rendering in 2021. Also, security is not a legit response to this. The cloud is as secure as, or more secure than, your local workstation if managed properly.

Still, I understand that some individuals may choose to work locally out of habit… if this is for a business, it will be tough to compete with businesses that use the cloud.

Seems like a shiny toy but not really practical or cost effective.

Couldn't disagree more. First of all, computers are dirt cheap compared to other expenses: a $5k MacBook Pro, amortized over time, is less than 0.5% of the personnel cost of a single employee. Second, you are still doing development and testing on a local machine, and a faster, higher-quality local workstation means I can do my work quicker and with more comfort. Third, of course heavy-duty production work will be done on some sort of high-throughput cluster (be it the cloud, a supercomputer, or a local farm). And frankly, I am not starting a cluster job for something my laptop can do in an hour: I just run it over the lunch break, or run it in the background while I am writing a paper.

So no, what you say does not make any sense to me.
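To put very rough numbers on the cost point, here is a back-of-envelope sketch in Swift. Every figure is an assumption for illustration only; real cloud pricing and machine configurations vary wildly.

```swift
// Back-of-envelope: one-time workstation purchase vs. renting a comparable
// cloud instance. All numbers are assumptions, not quotes from any provider.
let workstationCost = 5_000.0   // USD, one-time (hypothetical config)
let lifetimeYears   = 4.0
let cloudRatePerHr  = 3.00      // assumed on-demand rate for a similar-spec instance

let breakEvenHours = workstationCost / cloudRatePerHr
print("Break-even: \(Int(breakEvenHours)) compute-hours total")
print("Per year over the machine's life: \(Int(breakEvenHours / lifetimeYears)) h")
// ~1,666 h total, ~416 h/yr: under two hours of compute per working day, and
// the local box wins (before even counting power, admin, egress fees, resale).
```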
 
  • Like
Reactions: throAU and bobcomer

theorist9

macrumors 68040
May 28, 2015
3,881
3,060
Data scientist, computer scientist, programmer here.

Anyone buying a machine costing $5k or more for scientific modeling, machine learning, or statistical work should certainly move to the cloud in 2021. Cloud setups are much more scalable and probably cheaper for 99% of business applications.

I understand that the same goes for 3D rendering in 2021. Also, security is not a legit response to this. The cloud is as secure as, or more secure than, your local workstation if managed properly.

Still, I understand that some individuals may choose to work locally out of habit… if this is for a business, it will be tough to compete with businesses that use the cloud.

Seems like a shiny toy but not really practical or cost effective.
Just out of curiosity:

What if your development work is highly interactive? In that case, might cloud latency be an irritant? As you know, cloud latency is about more than just your internet connection. And, IIUC, providers are generally unwilling to guarantee minimum standards for latency, which suggests to me it is a real issue.

And can you configure a cloud setup so that interacting with it is indistinguishable from interacting with your machine locally, including displaying across multiple monitors? In my experience, working through a virtual desktop is quite cumbersome, especially when spread across multiple monitors.

If working in the cloud really were indistinguishable from working locally, that might be interesting. But I suspect it's not.
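One rough way to put numbers behind the latency question, sketched in Swift: time N sequential HTTPS round trips to a remote host. HTTP is not the same as a remote-desktop protocol, but it gives a floor for what every interactive step would pay; example.com is just a stand-in for whatever host you'd actually work against.

```swift
import Foundation

// Time N sequential round trips; sequential because interactive work is a
// chain of request -> response steps, each paying the full latency.
let url = URL(string: "https://example.com/")!   // stand-in remote host
let n = 10
var totalMs = 0.0

for i in 1...n {
    let start = Date()
    let done = DispatchSemaphore(value: 0)
    URLSession.shared.dataTask(with: url) { _, _, _ in done.signal() }.resume()
    done.wait()
    let ms = Date().timeIntervalSince(start) * 1000
    totalMs += ms
    print(String(format: "round trip %2d: %6.1f ms", i, ms))
}
print(String(format: "mean: %.1f ms", totalMs / Double(n)))
// Remote-desktop interactions pay some fraction of this on every keystroke
// echo or redraw; once it creeps past ~50-100 ms it stops feeling "local".
```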
 
Last edited:

Taz Mangus

macrumors 604
Mar 10, 2011
7,815
3,504
I see the Reality Distortion Field that mostly dissipated after the transition from PowerPC building back up...
Except this time Apple is the one in control. This is the holy **** moment, and Intel knows it. Joke all you want, and yet here we are discussing how Apple has put Intel and even AMD on notice. This is the reality.
 