What ECC? On which Quadros? Check the source and educate yourself:
https://devblogs.nvidia.com/inside-pascal/

Don't write like a summer intern in the marketing department...

So does that make you the winter intern? From your own posted link:

ECC Memory
Another HBM2 benefit is native support for error correcting code (ECC) functionality, which provides higher reliability for technical computing applications that are sensitive to data corruption, such as in large-scale clusters and supercomputers, where GPUs process large datasets with long application run times.

ECC technology detects and corrects single-bit soft errors before they affect the system. In comparison, GDDR5 does not provide internal ECC protection of the contents of memory and is limited to error detection of the GDDR5 bus only: Errors in the memory controller or the DRAM itself are not detected.

GK110 Kepler GPUs offered ECC protection for GDDR5 by allocating some of the available memory for explicit ECC storage. 6.25% of the overall GDDR5 is reserved for ECC bits. In the case of a 12 GB Tesla K40 (for example), 750 MB of its total memory is reserved for ECC operation, resulting in 11.25 GB (out of 12 GB) of available memory with ECC turned on for Tesla K40. Also, accessing ECC bits causes a small decrease in memory bandwidth compared to the non-ECC case. Since HBM2 supports ECC natively, Tesla P100 does not suffer from the capacity overhead, and ECC can be active at all times without a bandwidth penalty. Like the GK110 GPU, the GP100 GPU’s register files, shared memories, L1 cache, L2 cache, and the Tesla P100 accelerator’s HBM2 DRAM are protected by a Single‐Error Correct Double‐Error Detect (SECDED) ECC code.
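For what it's worth, here's a quick sanity check of the capacity figures quoted above. It just re-derives Nvidia's own 6.25% / 750 MB / 11.25 GB numbers for the 12 GB Tesla K40 in a couple of lines of Python; nothing here is new information:

```python
# Re-derive the GDDR5-era ECC overhead figures quoted above for a 12 GB Tesla K40.
total_gb = 12.0
ecc_fraction = 0.0625                  # 6.25% of GDDR5 reserved for ECC bits

reserved_gb = total_gb * ecc_fraction  # capacity set aside for ECC storage
usable_gb = total_gb - reserved_gb     # capacity left with ECC enabled

print(f"Reserved for ECC: {reserved_gb:.2f} GB (~750 MB)")   # 0.75 GB
print(f"Usable with ECC on: {usable_gb:.2f} GB")             # 11.25 GB
```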
 
No it is not. That is why it suffers from endemic GPU meltdowns.

Every. Single. One. Will. Fail. When. Pushed. Hard. Enough.

And when those GPUs are replaced under warranty, or even out of warranty, they will fail as well.

If you have a cylinder Mac Pro (especially one with D700s) that has not had a thermal failure, that merely indicates you haven't pushed its duty cycle hard enough, not that you have one which will be "immune" from the problem.

I mean, I stand corrected if that's the case, but I come on MacRumors daily and hadn't really heard of that, so I didn't realise there were issues at all. In contrast to something like the 2011 MBPs, which got a lot of coverage, and I was unfortunately an owner of that model.... Failed on me about half a year before the repair program started. Anyway, that's a bit irrelevant, but the point is just that I can sympathise with anybody holding a grudge against the MP 6,1, since overheating components is a real bummer. I genuinely didn't realise there was an issue though. In fact, I've seen stress test numbers suggesting sustained load temperatures of around 70°C, which is reasonable.
Yes, if you have no fans (Cube 1.0). If you have a fan, natural convection becomes a minor factor.

Although natural convection may be a minor factor with a fan, why not take advantage of it anyway? Even if it only helps airflow by 1%, that's essentially a free 1%. It could also give you lower idle fan speeds when the system isn't being pushed, since the natural airflow means more in those circumstances. What I'm saying is not that you should rely on it, but that it's a free positive worth taking advantage of.

That is the problem. Coupled with the fact that Apple underclocked the components to fit the power envelope - rather than increasing the power envelope to match the day 1 components and perhaps future enhancements.

But if you bought it with those expectations - I mean, it delivered the performance advertised: 7 TFLOPS with two D700s.

Now, as per the above, I've been informed that there apparently were failures due to overheating, so I'll give you that and not argue on that front any more. But if the thermal solution had held up to what was in the computer, I wouldn't have seen it as an issue. Not ideal for the Pro market, at least not as the sole offering, but a good design regardless: it looked sleek and worked well for the components it was offered with. Of course a pro system should be more versatile, but for the very specific market that just wanted what the Mac Pro already offered - again, if it hadn't had overheating problems - it was a good concept.
 
That site is all about Teslas... I don't know what your point was.

Let me quote you Nvidia, in defence of my fellow poster.

For high-precision, data-sensitive applications, Quadro is the only professional graphics solution with ECC memory and fast double precision capabilities to ensure the accuracy and fidelity of your results. From medical imaging to structural analysis applications, data integrity and precision is assured, without sacrificing ...

And here's the source page.
http://www.nvidia.com/object/quadro-fermi-overview.html

Hell, I just Googled "ECC Quadro" and that came up.

Now granted, that's the old Fermi architecture they're talking about, but still.
I guess you really like the marketing talk...
From the link on Pascal, which is the first architecture with HBM2. Read it and educate yourself, and check the magic with the recent Titan drivers, putting the card on par with the Quadros.

“ECC Memory
Another HBM2 benefit is native support for error correcting code (ECC) functionality...
[...]
GDDR5 does not provide internal ECC protection of the contents of memory and is limited to error detection of the GDDR5 bus only: Errors in the memory controller or the DRAM itself are not detected.

GK110 Kepler GPUs offered ECC protection for GDDR5 by allocating some of the available memory for explicit ECC storage. 6.25% of the overall GDDR5 is reserved for ECC bits. In the case of a 12 GB Tesla K40 (for example), 750 MB of its total memory is reserved for ECC operation, resulting in 11.25 GB (out of 12 GB) of available memory with ECC turned on for Tesla K40. Also, accessing ECC bits causes a small decrease in memory bandwidth compared to the non-ECC case. Since HBM2 supports ECC natively, Tesla P100 does not suffer from the capacity overhead, and ECC can be active at all times without a bandwidth penalty.
 
From the link on Pascal, which is the first architecture with HBM2. Read it and educate yourself, and check the magic with the recent Titan drivers, putting the card on par with the Quadros.


That's no different from what the guy you're replying to said. But not all cards have HBM, and the Quadros that use GDDR5 have ECC GDDR5.
 
As a semi-pro user, I DO WANT the old cheese-grater case back. It was nearly perfect, aside from being overly heavy. It had 4 drive bays and made it easy to add/swap drives. The CPU tray was a nice, well-engineered piece; theoretically you could have upgraded only that for newer CPUs/RAM (but that never came about). It had 4 PCIe slots, not enough for some but more than enough for most users. The 2x DVD drive was useful in its day but surely would not be needed now, so that space could be put to better use with more drives, or you could move all the SSDs there and give more space for another PCIe/video card.
 
On the surface, this is a silly question. A pro user is, of course, somebody who makes money using their Mac professionally. That's the literal definition: it's your profession. However, when you dig into it, it's an extremely good and fascinating question.

You cannot put all pros in one box. You are technically a professional Mac user if you use your Mac to order new supplies for your garage as a mechanic, but that's hardly what Apple or any of us think of when we think of a pro Mac user, and that use would never require the kind of hardware in a Pro-level machine.

Generally speaking when we talk of Pro users, we talk of people who require workstation hardware.
This is often the creative professionals - working with 3D modelling software, game development software, high-end animation software, Final Cut Pro (and other NLEs) etc.

It can, however, also be academia. In research, powerful and reliable computing can greatly improve researchers' ability to do their jobs, and many universities even have supercomputers. The issue with supercomputers, though, is that you have to queue up your jobs on them, waiting your turn and likely only getting very limited time. Many researchers would love a computer like an iMac Pro so that they wouldn't have to wait their turn on the supercomputer, but could instead start a computing job on their own machine and get reliable data out. Maybe that job would take a week whereas the supercomputer would do it in 2 hours, but they might have to wait 3 months to get access to the supercomputer.

There are a myriad of other workflows too, like machine learning professionals, and hell, even my girlfriend has a workstation at work. She digitalises old archives for the government for public access, and they need a computer that can be hooked up to all their scanning stations, and handle thousands of pages hourly, cropping them based on preconfigured instructions, compressing them to internet friendly sizes and uploading them to their server. And again, it needs to be able to do this 24/7 constantly.

All in all, we have a relatively narrow-minded perspective of what a pro user is, but what the computer companies mean is somebody who makes money off a workstation computer - one where 24/7 reliability is crucial. It's not even always about speed, either. There are plenty of workstations slower than my 2014 iMac, but they have ECC memory and Xeon chips. With 16GB of RAM and 24/7 operation, you'll have a RAM error every 6 months on average. If you're thinking "I've not experienced that", you're probably lucky enough that the flipped bit landed in an unused part of the RAM or somewhere it did no visible harm - without ECC, the system can't correct it. But it's a fact that it happens. With ECC RAM, the error rate practically goes down to never, and that's the crucial difference. Workstation components are made to be more reliable - in many cases faster as well, but that's because a large part of the Pro market that these companies target is performance-oriented too, like the motion graphics folk.
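To put a rough number on that error-rate claim, here's a minimal back-of-envelope sketch. The soft-error rate used is an assumed, illustrative figure chosen so the result lands near the "one error every ~6 months" claim above; published field studies report rates that differ by orders of magnitude, so treat this purely as a sketch:

```python
# Back-of-envelope: expected memory bit flips for a 16 GB machine running 24/7.
# ASSUMPTION: the rate below (FIT = failures per 10^9 device-hours, expressed
# per Mbit) is illustrative only, picked to roughly match the post's claim.

FIT_PER_MBIT = 2.0            # assumed bit flips per 10^9 hours per Mbit
RAM_GB = 16
HOURS_PER_YEAR = 24 * 365

ram_mbit = RAM_GB * 1024 * 8                                # memory size in megabits
errors_per_year = ram_mbit * FIT_PER_MBIT / 1e9 * HOURS_PER_YEAR

print(f"Expected bit flips per year: {errors_per_year:.1f}")        # ~2.3
print(f"Average time between flips: {12 / errors_per_year:.1f} months")  # ~5 months
```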



Mostly, again, reliability. With a normal Intel Core chip, obviously it will be tested. But they may only test each chip for a few minutes, with a single stress test, and then ship it.

Xeon chips are far more thoroughly tested and, crucially, they are tested with a large variety of pro software, like Maya, AutoCAD and Cinema 4D. (Those are examples of software I know pro GPUs are tested with, by the way; I don't know for certain that Xeons are tested with these exact packages.)
Xeons also typically include more cores, which makes them substantially more expensive, since it's typical in chip production for parts of a die not to be fully functional. The more cores you put on a chip, the more of the die needs to be functional. If you try to make an 18-core chip and 4 of the cores have defects, you can just sell it as a 14-core part, but there are limits.
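As a rough illustration of why the highest core counts cost disproportionately more, here's a toy binning model. The per-core defect probability is made up purely for illustration and isn't a real yield figure:

```python
# Toy yield/binning model: each core on a die independently has some chance of
# being defective, and a die is sold at the highest core count it can still hit
# after disabling bad cores.
# ASSUMPTION: 5% defect chance per core, purely illustrative.
from math import comb

CORES_ON_DIE = 18
P_CORE_BAD = 0.05

def prob_at_least_good(good_needed: int) -> float:
    """Probability that at least `good_needed` of the cores work."""
    return sum(
        comb(CORES_ON_DIE, k) * (1 - P_CORE_BAD) ** k * P_CORE_BAD ** (CORES_ON_DIE - k)
        for k in range(good_needed, CORES_ON_DIE + 1)
    )

for target in (18, 16, 14):
    print(f"Dies usable as {target}-core parts: {prob_at_least_good(target):.1%}")
```

Under those made-up numbers only around 40% of dies come out as full 18-core parts, while nearly all can be sold as 14-core parts, which is the intuition behind the price jumps at the top of the range.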
More important than that, though, is the fact that, as stated in my answer to Q1, pro customers make money off their computer, so price isn't that big a concern. If paying $1k extra for your PC/Mac now saves you an hour every day that you can spend serving more clients, it'll pay for itself in the long run.
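And a quick back-of-envelope on that pay-for-itself argument; the $50/hour billable rate is just an assumed example, not a figure from anywhere:

```python
# Simple break-even: how long until a $1,000 premium pays for itself
# if it saves an hour of billable time per working day.
# ASSUMPTION: $50/hour billable rate and ~21 working days per month.

PREMIUM = 1_000
HOURS_SAVED_PER_DAY = 1
RATE_PER_HOUR = 50
WORK_DAYS_PER_MONTH = 21

value_per_month = HOURS_SAVED_PER_DAY * RATE_PER_HOUR * WORK_DAYS_PER_MONTH
print(f"Premium recovered in about {PREMIUM / value_per_month:.1f} months")  # ~1 month
```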
Xeon chips are also used in servers and data centers, again, an area that will generate a lot of money from the hardware over time. Hell a lot of the time the software licenses cost more than the computers do.

And that's just the CPU! All the components are like this: the ECC RAM, the power supply system, the GPU, and even the GPU drivers! FirePro and Quadro cards come with special pro-level drivers that have been verified against a large suite of pro software. This verification process is expensive.

When it's stated that the Mac Pro and iMac Pro aren't that expensive compared to similarly specced workstations, it's not a lie. In fact, the Macs are quite cheap.
It's a very old example by now, but I've included a screenshot from the Anandtech review of the 2013 Mac Pro when it'd just come out, comparing it to similar offerings from Lenovo and HP. Note how cheap the Mac Pro was in comparison.



It's not really about what it looks like, just that it can be expanded over time. You touched upon it yourself: people are still rocking the cheese-grater and upgrading it to be really powerful. That's what a Pro machine can do. It can take all the expansion you want to throw at it, so it can fit into a wide variety of workflows: Red Rocket X cards and GPUs, Fibre Channel cards, Xsan, capture cards and video interface cards, etc. And a power supply that can deal with anything.

It's not that the Trash Can was bad. It was brilliant, in fact. But it was a short-term solution. If you pay that much money for a computer, it's nice to know that you don't need to replace it in five years, but rather that you can plug in a few extra parts and keep rocking. Furthermore, some components are crucial for certain workflows but too specialised to expect a computer to come with as standard, and the Trash Can didn't offer any way of connecting them either. Apple at the time said "Just use Thunderbolt 2!", and that would've been a great solution, but TB2 wasn't ready for it. Just look at what Apple is doing themselves: eGPUs only work on TB3 officially.
Plus it ruined the aesthetics anyhow.


Edit: Forgot to upload the screenshot I mentioned. Here it is.


Love the overall answers:

Generally speaking when we talk of Pro users, we talk of people who require workstation hardware.
This is often the creative professionals - working with 3D modelling software, game development software, high-end animation software, Final Cut Pro (and other NLEs) etc.


Workstation hardware is a jaded concept in so many cases. In Applespeak it refers to "Xeon CPUs with ECC memory". Several of the creative professionals you listed - and I'll include photographers, graphic designers (independent and agency types) and industrial designers - have primarily been using the 2014-2017 iMac. I've also noticed a lot of photographers and web developers are now migrating to laptops, and I have to believe graphic designers are doing the same.

Corporate in-house departments actually require their employees to take their computers/workstations home after work. I could see the same being true of creative agencies (ad and design). With high-speed internet, it's now easy to work from home and link directly to the company's servers.

Professional Apple users' workstations have been less reliant on Apple's definition of "workstation hardware". In fact, the whole point of the iMac Pro was inspired by these creative professionals already using the i7 iMacs as their primary workstations. The big thing Apple missed was that those users partly chose the iMac because it was less expensive than the Mac Pro. It had a better price-to-performance ratio.

On the Windows side, the professional photographers, videographers and designers who use the same software for their professional work as their Apple counterparts also opt for the 6700K/7700K and now the 8700K. There has even been a migration - me included - of users who left Apple and moved to Windows because Intel offers more than 4 cores in the i7 series there.

With the X299 series - non-Xeon CPUs aimed directly at the 'pro market' - there are a LOT of options to customize your workstation for your needs. With Apple, if you wanted more than 4 cores, you needed to pay $4,000 for 6 cores and $5,000 for 8 cores back in 2013. Right now, the 2013 Mac Pro is $4,000. For the tower alone.

I built my 7820X, 32GB RAM, Radeon RX 580 Capture One Pro workstation for $1,950. I chose the Meshify C case, as it has fantastic airflow. This is exactly the non-Xeon hardware I've wanted from Apple for such a LONG, LONG time.

This is where Apple is missing the boat for so many users. Intel offers CPUs to create a wide array of workstations for professional users of all types. Apple simply doesn't want to cannibalize their high priced Xeon workstations for less expensive solutions.

The machine I'd love to see from Apple, and I've mentioned this numerous times on these forums, is an entirely user-upgradable triple- or quad-height Mac Mini with a 6-core 8700K, up to 64GB of memory, dual NVMe SSD slots and an internal 2.5" drive bay. I could deal with the GPU being proprietary in this smaller form factor, as you can add a secondary GPU via Thunderbolt 3.

However, Apple could also do it this way: 4-core 7740X, 6-core 7800X, 8-core 7820X, 10-core 7900X. They could use the Radeon RX 560 as the base GPU and include upgrade options to the Radeon RX 580, Vega 56 and Vega 64. If you wanted a GTX 1080 Ti, you could add that via Thunderbolt 3.

The 7800X with the Radeon RX 580, an SSD boot drive and 16GB RAM shouldn't cost buyers anything more than $1,800, with an upgrade path to the 7820X being another $300 on top of the $1,800 - so looking at $2,100. Tack on another $100 for the RX 580 upgrade, and it's $2,200. Considering Apple would get their parts at wholesale, that's still a decent margin. I'd easily pay a $200-$300 premium for this over what I have now, as I prefer the Mac OS.


Case wise, even if they went larger than a quad-height Mac Mini, like this size, but obviously a different design...

https://www.newegg.com/Product/Prod..._design_meshify_c_mini-_-11-352-086-_-Product

...which is smaller than my Meshify C mid-tower - you still have plenty of room for internal expansion.

That would be all the rage with the Apple user community - with an incredible amount of flexibility across so many general and pro uses. I'd be back with Apple in a heartbeat. Apple could still offer the Xeon and ECC workstations for those users who need that kind of hardware. But most people simply don't need it, nor want it.

I have absolutely zero hope this will ever happen at Apple - they're too proud/stingy/greedy/stoopid to do it.
 


I essentially agree with you. I still think there's a very, very strong case for Xeon workstations with ECC and Pro-level GPUs, but there's certainly a huge market that Apple is missing that relies on consumer/prosumer parts.
Besides, it wouldn't be unprecedented for Apple to label a device "Pro" and use consumer parts. That's what the MacBook Pro always does. But yeah, not gonna happen like you say.

I personally do video work and software development, though primarily at a non-paying hobbyist level, and I use an iMac for my productions.
 


Oh for sure. Without question there's a market for Xeon workstations beyond just server applications. The problem is, Apple believes every working professional requires the Xeon, even though a huge segment of pro users have proven otherwise.
 
Has Intel changed their Xeon pricing? Back in the Nehalem and Westmere days, the Xeons were priced the same as their “consumer” i7 counterparts.

I’m asking because people seem to think Xeons drive up the price of the Mac Pro? If the prices are still the same, why wouldn’t you want a Xeon?
 
Oh for sure. Without question there's a market for Xeon workstations beyond just server applications. The problem is, Apple believes every working professional requires the Xeon, even though a huge segment of pro users have proven otherwise.

That's patently untrue, considering Apple sells iMacs and MacBook Pros and that's what the majority of pros (their definition in the roundtable) use. The iMac Pro and Mac Pro are the niche computers.

Apple doesn't see the point in offering a headless iMac, and likely never will. They never have and never will attempt to be all things to all people.
 
Has Intel changed their Xeon pricing? Back in the Nehalem and Westmere days, the Xeons were priced the same as their “consumer” i7 counterparts.

I’m asking because people seem to think Xeons drive up the price of the Mac Pro? If the prices are still the same, why wouldn’t you want a Xeon?


There are many Xeons - but generally speaking, yeah, they're more expensive. Even if the CPU itself weren't, though, it would still drive up the price, since it needs a different chipset, which is more expensive.
 
9: Why does Apple ignore gaming on MacOS, but embrace it on iOS?

Fundamentally different markets.

There are roughly 1 billion - with a B - iOS devices in the world. Their hardware is well-known and standardized. They are essentially closer to being a game console like the Switch than a general-purpose PC, so it's much easier for developers to target, and there's a large enough market to make it profitable for them.

While Apple is shipping more Macs per year than ever before, it still accounts for under roughly 10% of the total active personal computer marketplace. Apple's own 2016 estimates said there are roughly 100 million active Mac OS users in the world; there were 500 million Windows 10 users as of 2017, representing half of all Windows users, with Windows 7 making up about another 38% and Windows 8/8.1 the remaining 12%.

On the Steam marketplace, Windows dominates massively, at 95% of the user share, with Mac far behind at roughly 3%.
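Putting those two sets of figures together, purely as a back-of-envelope exercise using only the numbers quoted above (none of this is additional market data):

```python
# Rough relative-market-size comparison using the figures quoted in this post.
windows10_users = 500e6      # Windows 10 users as of 2017 (per the post)
mac_users = 100e6            # active Mac users, Apple's 2016 estimate (per the post)

steam_share_windows = 0.95   # Steam user share (per the post)
steam_share_mac = 0.03

# Ratio of Windows to Mac players a developer can reach on Steam:
print(f"Steam Windows:Mac player ratio ~ {steam_share_windows / steam_share_mac:.0f}:1")  # ~32:1
# Ratio of install bases (Windows 10 alone vs. all Macs):
print(f"Windows 10 vs. Mac install base ~ {windows10_users / mac_users:.0f}:1")           # 5:1
```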

It simply doesn't make economic sense for game developers to pursue a small niche market like MacOS, particularly given that it means supporting many different versions of the OS, different hardware, etc. That makes the economic decision clearer: it's too risky, especially for small developers.

Sure, it'd be nice to have that long-rumored iOS compatibility mode, but Tim Cook has repeatedly said it doesn't really make sense for them to do that.


I want a multi-purpose machine that can run MacOS and Windows 10. I want a full tower (with a separate aftermarket wide-screen monitor) that can be CTO'd with no RAM or hard drive on-board. It should have the latest USB-C/Thunderbolt 3, USB-A, WiFi, Bluetooth. I want it to be able to run common productivity tasks like Microsoft Office and Quickbooks, but also to be able to run Final Cut and/or Adobe CC for 4K video editing, graphics and DTP. I also want the option of being able to do some CAD with it. I want this to be a machine you can CTO just like an iMac Pro (Xeon W) and pay accordingly.

Here's my question: Why do you need it to run both OSes?

To address your needs point by point, as I understand them:
  • All the Adobe apps are cross-platform.
  • All productivity apps are cross-platform or available in web-app / cloud versions, or have equivalents.
  • For CAD, it really depends on which app you prefer. AutoCAD has come back to Mac OS, so at least that is cross-platform.
  • Gaming tilts far more deeply towards Windows, plus the choice of affordable graphics cards + drivers.
  • Thunderbolt 3 / USB-C requires motherboard support, so it has to be built-in (newer MacBook Pros, Mac Pro) or on your choice of PC workstation motherboard (higher end ASUS workstation boards) which either have it onboard, or have the PCIe bandwidth to support Intel-certified TB3 expansion cards. Not cheap.
If you absolutely have to have an expandable tower, and you need Mac OS, your only officially Apple-supported choice right now is a Mac Pro 5,1 cheese-grater tower. Easy enough to create a Boot Camp Windows drive for dual-boot or use VMware. However, you cannot add Thunderbolt to these computers; you'll max out at USB 3 with an expansion card and Firewire 800.

So, to meet all your needs, it seems like the only direction to go in is to build a well-specced Windows PC that you can dual-boot as a Hackintosh, although it's really up to you if you want to go through the very complicated technical steps to accomplish that, and still not have a really well-supported or stable system.

Or, if you can accomplish all these tasks without Mac OS, just go for a Windows workstation!

Or... wait to see what Apple comes up with.
 
1: What is a "PRO" Mac user? What makes them different from any Mac user?

A "Pro" user (in terms of Apple products and not in general) has always been someone who works in the film/television industry and needs the fastest technology available because they're working with the latest and greatest camera equipment.

Yeah, I know people will argue that anyone who uses their Mac for any type of work is considered a "pro" user. But that's not actually why Apple created the Mac Pro. Apple literally made the Mac Pro for high-end filmmakers.

Unfortunately, Tim Cook's Apple has been changing their definition of "Pro" with crap like the iPad Pro and iMac Pro, and heck, even the new MacBook Pro, none of which actually help high-end filmmakers very much.
 
I essentially agree with you. I still think there's a very, very strong case for Xeon workstations with ECC and Pro-level GPUs, but there's certainly a huge market that Apple is missing that relies on consumer/prosumer parts.
Besides, it wouldn't be unprecedented for Apple to label a device "Pro" and use consumer parts. That's what the MacBook Pro always does. But yeah, not gonna happen like you say.

Yes, there are a lot of people wanting pro-grade processors and GPUs, but if you look at what happened with GPUs, there has been a shift where "pros" don't buy expensive Nvidia Quadros but rather a 1080 or two instead. The performance of these cards is great and gives much more bang for the buck compared with a Quadro. (Nvidia even prohibits the use of GeForce cards in data centers for the same reason. https://wccftech.com/nvidia-geforce-eula-prohibits-datacenter-blockchain-allowed/)

And now with AMD Threadripper, you get a lot more cores and performance for your buck than with a Xeon. It's about value, but that doesn't mean the same as cheap; for the Apple platform, the value lies in macOS and the quality of the machines.
We gladly pay more for a fast, high-quality Mac - that's nothing new - but even for Apple there is a limit to how much people are willing to pay "extra" rather than just buy a PC.

The thing with, for example, the iMac Pro is that if you compare it with a similarly specced PC workstation (Xeon, Vega...), it's not too badly priced. But you can get a PC that matches the same performance without Xeons at much lower cost, or a higher-performing computer for the same amount of money. There are a lot of professionals who don't care whether it's a Xeon or has ECC; the thing that matters is that it's fast as hell.

It's hard to forget what happened at the end of the nineties, when SGI, Sun, HP... lost the workstation market. The reason was that PCs had caught up in performance and didn't cost ten arms and twenty legs.
 
At this point, keeping my 4,1 Mac Pro running with High Sierra isn't far off from running a Hackintosh. So I'd be fine if Apple's idea of "modular" was a Mac Mini with a Xeon CPU, ECC RAM, AMD GPU, and a ton of TB3 ports. As long as I have a "blessed" path for running Win10 alongside MacOS, and upgrading the GPU (even if it means buying TB3 disk/eGPU enclosures), I'm set.

Truthfully, proper Boot Camp eGPU support would probably steer me towards just getting a MacBook Pro, so maybe I'm not the intended audience. I mean, I'd prefer having a socketed CPU, replaceable RAM, and a proper PCIe x16 slot, but I'm not going to hold my breath.
 
Yes, there are a lot of people wanting pro-grade processors and GPUs, but if you look at what happened with GPUs, there has been a shift where "pros" don't buy expensive Nvidia Quadros but rather a 1080 or two instead. The performance of these cards is great and gives much more bang for the buck compared with a Quadro. (Nvidia even prohibits the use of GeForce cards in data centers for the same reason. https://wccftech.com/nvidia-geforce-eula-prohibits-datacenter-blockchain-allowed/)

I don't think that's completely true. There are software packages that lock features to "authorized" cards, like the Quadros or FirePros. Does the software really refuse to run on a GeForce? Probably not. But there is still a pretty decent market for "certified" setups, and that usually means a pro card over Thunderbolt or in a card slot.

We just don't see much of that because most people here want to play games, or use something like FCPX where Apple has no interest in pairing their software with special cards.

Nvidia throws around money a lot, and I'm also not entirely convinced they aren't paying software vendors to "optimize" code for Quadros.
 
I don't think that's completely true. There are software packages that lock features to "authorized" cards, like the Quadros or FirePros. Does the software really refuse to run on a GeForce? Probably not. But there is still a pretty decent market for "certified" setups, and that usually means a pro card over Thunderbolt or in a card slot.

I am sure there still are such apps. There were a lot more in the 2000s. The trend seems to be moving away from those cards - they were never a major thing on the Mac anyhow - and the stuff the Quadros offer has become more and more niche (like genlock) even on the other OSes. I recall times when Maya and 3ds Max required, then merely benefited from, those cards. And now almost no one seems to bother.
 
I mean, I stand corrected if that's the case, but I come on MacRumors daily and hadn't really heard of that, so I didn't realise there were issues at all.
There was a defect with the early D500 & D700 cards - Apple has a replacement program. They work fine. It's just the usuals here who want to continue the narrative that the 2013 MP was a bad computer because they didn't personally care for it.

Also, on the whole ECC GPU memory deal, you're getting taken for a ride by these same folks. They get you all twisted up over whether this or that GPU has ECC memory and whether it's "pro" or not, and for 97% of users it doesn't matter... again, they just like to create any narrative where they can point to something and suggest Apple sucks in some way.
A "Pro" user (in terms of Apple products and not in general) has always been someone who works in the film/television industry and needs the fastest technology available because they're working with the latest and greatest camera equipment.

Yeah, I know people will argue that anyone who uses their Mac for any type of work is considered a "pro" user. But that's not actually why Apple created the Mac Pro. Apple literally made the Mac Pro for high-end filmmakers.

Unfortunately, Tim Cook's Apple has been changing their definition of "Pro" with crap like the iPad Pro and iMac Pro, and heck, even the new MacBook Pro, none of which actually help high-end filmmakers very much.
Sorry Arron, but this is just complete nonsense. Whoever fed you that was full of it.
 
That's patently untrue, considering Apple sells iMacs and MacBook Pros and that's what the majority of pros (their definition in the roundtable) use. The iMac Pro and Mac Pro are the niche computers.

Apple doesn't see the point in offering a headless iMac, and likely never will. They never have and never will attempt to be all things to all people.


That's wrong, aside from the laptop. The MacBook Pro is Apple's aberration in this case in terms of product naming/marketing, given that it's an i7 CPU vs. a Xeon.

Apple didn't design the iMac for professional use, much less market it that way. A lot of professionals simply bought the iMac for pro use because it has a beautiful form factor, and it's less expensive and performs reasonably well against its Mac Pro counterparts. The white Intel iMac I bought back in the day for independent design work performed better in many cases than the old Power Mac G5 I used in the studio for my 9-5 day job.

When the 2014 4-core i7 5K iMac was released, the iMac became the preferred computer for most photographers and a load of graphic designers - especially considering the 2014 iMac outperformed the 2013 4-core Mac Pro in most cases, and outperformed the 6- and 8-core Mac Pro in numerous photography and design tasks. The 2015 and 2017 iMacs made those performance disparities even worse.

Apple finally recognized the shift to professionals preferring the iMac over the 2013 Mac Pro. That was the inspiration for the iMac Pro - they said as much, even stating (paraphrasing) "they built an iMac for those pro iMac users".

However, Apple chose to use Xeons for the iMac Pro and set a starting price of $5,000, even though the professionals buying the iMac didn't care about Xeon CPUs, much less need 8 cores, since they were using the 4-core i7 CPUs.

So yes. Clearly Apple believes professional users require Xeon CPUs - as the iMac Pro was inspired by and targeted for the professional iMac user community. Their actions and words simply say so. ;)

Apple at one point offered a quad-core i7 Mac Mini. It cut into $3,000 iMac sales, so they cut the 4-core version and now only offer a dual-core Mac Mini.

Apple's actions and intentions are loud and clear.

No, Apple doesn't need to be all things to all users. But what they're doing is simply sabotaging their own desktop computer success by skimping out and alienating their core "creative users" - many of whom are independent small-business owners with limited budgets, hence the preference for the less expensive iMac over the expensive Xeon Mac Pros.

Coincidentally enough, another friend of mine, a LONG time creative pro Apple user, is finally inquiring about a Windows 10 workstation for his graphic design, photography and video work.
 
A "Pro" user (in terms of Apple products and not in general) has always been someone who works in the film/television industry and needs the fastest technology available because they're working with the latest and greatest camera equipment.

I disagree entirely with that.

The 'Mac Pro' was simply a new name given to the Apple tower when they transitioned from PowerPC to Intel CPUs.

The Powerbook became a 'MacBook Pro' and the G5 Tower became a 'Mac Pro'.
The film and TV thing had absolutely nothing to do with it.

What made the Mac Pro suitable for professionals wasn't just its power - it was the flexibility of a tower design.

Apple's tower was their "pro' machine - long before Apple attached the 'Pro' label to it.

If you made movies, created music, did desktop publishing or edited web content on a Mac prior to Apple introducing the Mac Pro, chances are you did it on a G5 Tower.
Prior to that it was a G4 Tower & prior to that a G3 Tower.

The original Mac Pro had more connections than any other Apple product at the time (multiple USB, Firewire & Ethernet ports), and you could add more ports like eSata when required via PCIe cards.
Same for storage, optical drives, graphics cards & RAM.

That's why it was a 'Pro' machine and prior to that the 'Pro' machine Apple made was the G5 Tower.

The classic Mac Pro was Apple's last Tower and that's why so many 'Pros' and power users yearn for it to return.

They want the latest CPUs of course, but more importantly they need the flexibility that the tower offered them.

What they call the tower is irrelevant.
Call it a Mac Pro or a G5, G4, G3 or Floppy Doppy Dippy - it matters not a jot! It was the tower design and the flexibility it offered that made it attractive to the many different professionals from many different industries who used it.

It shows the Apple Marketing worked though.

When it was called a G3, G4 or G5, anyone was allowed to own an Apple Tower, but after they rebranded it a 'Mac Pro', suddenly it was for 'professionals' only!!!

Nonsense of course, but I suppose it justified the massive price hike that soon followed and would make the 'Pro' feel more important.
 
That's wrong, aside from the laptop. The MacBook Pro is Apple's aberration in this case in terms of product naming/marketing, given that it's an i7 CPU vs. a Xeon.

Apple didn't design the iMac for professional use, much less market it that way. A lot of professionals simply bought the iMac for pro use because it has a beautiful form factor, and it's less expensive and performs reasonably well against its Mac Pro counterparts. The white Intel iMac I bought back in the day for independent design work performed better in many cases than the old Power Mac G5 I used in the studio for my 9-5 day job.

When the 2014 4-core i7 5K iMac was released, the iMac became the preferred computer for most photographers and a load of graphic designers - especially considering the 2014 iMac outperformed the 2013 4-core Mac Pro in most cases, and outperformed the 6- and 8-core Mac Pro in numerous photography and design tasks. The 2015 and 2017 iMacs made those performance disparities even worse.

Apple finally recognized the shift to professionals preferring the iMac over the 2013 Mac Pro. That was the inspiration for the iMac Pro - they said as much, even stating (paraphrasing) "they built an iMac for those pro iMac users".

However, Apple chose to use Xeons for the iMac Pro and set a starting price of $5,000, even though the professionals buying the iMac didn't care about Xeon CPUs, much less need 8 cores, since they were using the 4-core i7 CPUs.

So yes. Clearly Apple believes professional users require Xeon CPUs - as the iMac Pro was inspired by and targeted for the professional iMac user community. Their actions and words simply say so. ;)

Apple at one point offered a quad-core i7 Mac Mini. It cut into $3,000 iMac sales, so they cut the 4-core version and now only offer a dual-core Mac Mini.

Apple's actions and intentions are loud and clear.

No, Apple doesn't need to be all things to all users. But what they're doing is simply sabotaging their own desktop computer success by skimping out and alienating their core "creative users" - many of whom are independent small-business owners with limited budgets, hence the preference for the less expensive iMac over the expensive Xeon Mac Pros.

Coincidentally enough, another friend of mine, a LONG time creative pro Apple user, is finally inquiring about a Windows 10 workstation for his graphic design, photography and video work.

This doesn't track. MacBook Pros have never had Xeons, PowerBooks never had similar features. And Apple has sold far more laptops than desktops for more than a decade at this point. When the majority of your pro-targeted machines don't feature Xeons and ECC, it's clear that those features are not the pro differentiator.

People started using iMacs because iMacs became more powerful; it has very little to do with the Mac Pro. This is a general industry-wide trend; the performance delta between low-end and high-end hardware has shrunk dramatically save for specific applications. At the same time performance requirements have plateaued for many industries; I'm not going to be so much faster for web design or desktop publishing with a new machine than if I was upgrading a decade or two ago. Even in my job, motion graphics, the limitation for me isn't hardware for most projects, but Adobe's ancient creaky software.

The natural result of this, and of Intel's near-monopoly in the high-end space, is that prices have steadily risen for pro machines, and the iMac Pro is a reflection of that. The machine is out there to goose ASP for the Mac line by catering to the few users who would benefit from a workstation CPU and a beefier GPU; that it's not economical for many users isn't the point. Apple never said that "all pros should use the iMac Pro", because Apple has never really tried to create a sharp divide as to what "pro" hardware is besides being more expensive. It's always been the rule that when you get into professional products, you're paying far more for comparatively less gain than with consumer products; why you're expecting Apple to buck that trend is beyond me.

The whole Mac mini i7 thing is nonsense as well. They didn't ship a quad core i7 in 2014 because it would have required designing a different board, and they clearly weren't interested in devoting any more resources than they had to.

As for sabotaging themselves, they'd be doing better if they could keep their product matrix refreshed concurrently, but given that Mac shipments haven't slipped despite all the woe about creative pros leaving, I don't think it makes a difference.

When it was a G3, G4 or G5, anyone was allowed to own an Apple Tower, but after they rebranded it a 'Mac Pro', suddenly it was for 'professionals' only!!!

Nonsense of course, but I suppose it justified the massive price hike that soon followed and would make the 'Pro' feel more important.

Pro Mac prices have been steadily increasing over time, and this was the case even with the PowerMac line.
 

Sure, but the rate of increase has escalated at a time when computers are becoming cheaper.

Also, the prices in the link above are inaccurate. Try this one...
http://mactracker.ca
According to the Mac Tracker app, the prices for base units were as follows...

1999 - G3 Tower $1599
2001 - G4 Tower $1699 (Quicksilver)
2003 - G4 Tower $1699 (Mirror Door)
2005 - G5 Tower $1999 (Dual Core processor)

So that shows a $400 increase in price over 6 years.



Then the Mac Pro comes along...

2006 - Mac Pro 1.1 $2499 - A $500 increase from the off.
2008 - Mac Pro 3.1 $2799
2010 - Mac Pro 5.1 $2499
2013 - Mac Pro 6.1 $2999 (trashcan)

So from 2005 to 2006 the price of an Apple tower jumped by $500.

Then from 2006 to 2013 the price has (more modestly) risen by a further $500.

All this together means the Mac Pro has become $1,000 more expensive than the G5 tower it replaced.

No other product in the Apple range has seen such an increase.

For example...
2006 Powerbook G4 15" $1999
2013 MacBook Pro 15" $1799

2006 iMac 17" $1299
2013 iMac 21" $1099

So both the iMac and the MacBook have become cheaper, yet in the same period the Mac Pro has increased by $1000.
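For what it's worth, here's a quick sketch that recomputes those deltas as percentages, using only the base prices listed above (nominal dollars, no inflation adjustment):

```python
# Recompute the price changes from the base prices listed in the post above.
tower_prices = {
    1999: 1599,  # G3 Tower
    2005: 1999,  # G5 Tower (dual-core)
    2006: 2499,  # Mac Pro 1,1
    2013: 2999,  # Mac Pro 6,1 ("trash can")
}
laptop_prices = {2006: 1999, 2013: 1799}   # 15" pro laptop
imac_prices = {2006: 1299, 2013: 1099}     # entry iMac

def pct_change(prices: dict) -> float:
    """Percentage change from the earliest to the latest listed price."""
    years = sorted(prices)
    first, last = prices[years[0]], prices[years[-1]]
    return (last - first) / first * 100

print(f"Tower, 1999 -> 2013:     {pct_change(tower_prices):+.0f}%")   # about +88%
print(f"Pro laptop, 2006 -> 2013: {pct_change(laptop_prices):+.0f}%") # about -10%
print(f"iMac, 2006 -> 2013:       {pct_change(imac_prices):+.0f}%")   # about -15%
```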
 

I'm not sure what numbers you're looking at, because the ones you quoted are consistent with the link. Either way, Apple's prices aren't out of line with any other OEM's, and the broader point is that Apple's pro Macs were never targeted at price-sensitive consumers. Apple hasn't made a midrange box-with-slots in the Jobs or Cook eras. Those cheap G4s were still $2,200+ in today's dollars (and considering they were vastly outclassed by Intel's chips by the end of that era, they were probably still overpriced).

Even if Intel weren't actually abusing its position to make more money, the reality is that pro computers were *always* going to get more expensive in the current computing landscape. Fewer people need as much comparative power as they used to, and that power costs vastly more to provide (adding more and more cores and cache, since pure frequency is hard to boost like it once was).

This is starting to stray off, but the point is that Apple has not and never will subscribe to some very select definition of pro.
 
I am sure there still are such apps. There were a lot more in 2000's. The trend seems to be going away from those cards, they were never a major thing on Mac anyhow, and the stuff that the Quadros offer has become more and more niche (like genlock) even in rest of the OSes. I recall times when Maya's and 3D Max's required, then benefitted from those cards. And now almost no one seems to bother.

It's niche, but it's an important niche. Apple even acknowledges it now by adding officially certified support for the FirePro series (the real one, not the fake one they put in the 2013 Mac Pro) in 10.13.4.

It would be odd if they didn't have support for real FirePros in the new Mac Pro, considering that for some users those are the real top-end GPUs, and Apple supports them officially for eGPU now.
 