The problem is that even the non-Duo blocks the second PCIe slot. Having 1 full length slot with the MPX slot changes the dynamics a lot, considering you can’t use either of the Promise internal bays, and will be more reliant on external I/O vs the Pro and only 4 rear ports vs the iMac.

the "half width" MPX module doesn't block the slot if just want bus power. A MPX bay consists of two double wide PCI slots.

"
Each MPX bay provides:

.....

Alternatively, each MPX bay can support:

One full-length, double-wide x16 gen 3 slot and one full-length, double-wide x8 gen 3 slot (MPX bay 1)
[/quote]

https://www.apple.com/mac-pro/specs/

Each MPX bay has three connectors: two PCIe slots and one "MPX connector". There is a double-wide gap between the PCIe slots. The two MPX bays cover four double-wide slots.

So yes, the single Vega II soaks up lots of space, and if coupled with the Promise MPX storage module you then have two major slot consumers. I highly doubt, though, that is the configuration Apple is thinking most will go with. Folks with big budgets will get a Duo and the storage MPX module and be done.


Also, the J2i does absolutely nothing to slot capacity at all. It is in a different zone of the system (the CPU zone, not the slot zone). If next-gen HDD density tech trickles down to 2.5" drives, then a future 'cousin' of the J2i could pack four 2.5" drives up in that zone. There is probably a future path there for folks who 'love' HDDs and don't need contemporary fast storage.

Apple didn't optimize this system at all for HDDs. The more HDDs stuffed into the internals the more "problems" will pop up.


Pushing folks to non-Apple GPUs if they need more I/O is a weird statement for Apple to make,

But they aren't pushing folks. At least not those with money/budget. If Apple goes back into a Rip Van Winkle slumber then perhaps that would be a "push". However, if they just do a few things, there isn't much of a push there.

One, replace the 580X card with something better in 6 months or less. It is pretty apparent that card is a bit of a fluke of timing rather than their desired objective. If Apple's base entry card were an RX 5700 based card, there'd be less of a need for third-party stuff. Same issue in 12-14 months if they iterate on the Vega II cards.

Two, expand out robust support for SSD add-in cards. "More I/O" isn't really where HDDs fit in; relatively speaking, it takes much less I/O to keep an HDD pipe filled than an SSD one. HDDs have more capacity.


considering the effort made to keep Thunderbolt 3 relevant for display output in the Mac Pro.

DisplayPort 2.0 is built on the Thunderbolt protocol. The number of GPU cards with "Thunderbolt" is only going up in 2021 and later. The notion that Thunderbolt is going to drop in relevance in the upcoming years is odd when it has been woven into USB4 and DP v2. It isn't going anywhere. And Apple isn't the only one that is going to be keeping it relevant.

The third thing Apple can do is not be quite so proprietary when DP v2.0 rolls out and more folks may be interested in dual-usage output on new generation cards (i.e., work with some folks on using the MPX connector as a pseudo-standard).



You cut out the important bit when you quoted: the Thunderbolt card. Gigabyte produces two (Alpine Ridge and Titan Ridge) variants that include dual DisplayPort inputs. Can’t do it without that, I agree.

Apple isn't going to spend much time putting support in for 3rd party legacy TBv3 cards. The Titan Ridge ones can be put into a "happen to work" state but Apple probably won't spend any more time on that than they do with eGPU on the MP 2013.

Rube Goldberg, external 'loop back' cables? Apple isn't pushing people to that. First, there are tons of folks sitting on older displays that don't have Thunderbolt. They will be happy as pigs in mud with a 3rd party GPU card that is supported with official drivers in the new Mac Pro. For the XDR .... if you can afford it then you can probably afford Apple's GPUs. The Apple-constrained LG UltraFine displays have one, and only one, input, but most other monitors with a TB input have more than one input. Those 3rd party monitors will work with 3rd party GPU cards.

There is only a relatively narrow corner case here where there is a mismatch. Most Mac Pro users don't have Apple "one input" monitors. That isn't going to change with the new Mac Pro.
 
As I said speculation is fine, it’s just annoying to see people constantly complain that Apple don’t make what they personally want, and so they just moan about and slander the machine they have made, or even those who will buy them! It’s like some sort of jealous spite. People need to move on.

Seriously though, consider the fact that at least some of those people moaning are long-time Mac users who feel like they are being priced out of solutions that were available years ago. Not even the 2013 Mac Pro started this high. The PowerMac line was always a bit pricey but still in reach for prosumer/enthusiast types, or folks like me, where the heavy hardware is paid for by work but I still want some reasonable hardware at home and don't need a full Xeon workstation there (though I'd take one if it weren't double the price of the next tiny step down, performance-wise).

I joined this forum as a refugee from another, even older forum. I've been in the ecosystem for a very long time. I get why these folks need to vent. They are passionate about the platform, but shifts in the product matrix make them feel like they aren't part of Apple's market anymore. It's not easy to simply "move on", especially if, like me, you not only prefer working with macOS but your career is in making software for Apple devices.

the "half width" MPX module doesn't block the slot if just want bus power. A MPX bay consists of two double wide PCI slots.

<SNIP A LOT OF STUFF>

You spent a lot of words and missed some context in the quote chain. We were talking about a hypothetical "Mini Mac Pro" that had only a single MPX bay + 1 PCIe slot populated with the 2 TB3, 2 USB-A I/O card. Just go read the chain for context, but we weren't discussing the 2019 itself.

"Half width" is not the same as "half length". You need a "half length" to expose the MPX slot that carries power/DisplayPort. Half width just means it exposes the second PCIe slot that an MPX bay has. And the debate we were getting into was over if there should be one more full-length PCIe slot so that someone could use a full-size MPX module and still add an I/O card of some kind.

Context is important, and there's no need to act like I don't know a PCIe slot from a hole in the ground. It's condescending. And I'll ignore any points you made attacking that little side-thread discussing a hypothetical machine that doesn't exist, since none of the arguments you attacked were even relevant to the 2019 Mac Pro, other than we were discussing "what if Apple removed slots and sold it cheaper".

But they aren't pushing folks. At least those with money/budget. If Apple goes back into Rip Van Winkle slumber then perhaps that would be a "push". However, if they just do a few things there isn't much there.

Again my comment was more about that hypothetical "Mini Mac Pro". My argument was that one half-slot PCIe (with the TB/USB card) and one MPX bay would create weird incentives to avoid using Apple for GPU upgrades if they also wanted I/O expansion, and that such a hypothetical machine should have one additional PCIe slot so that someone would have more flexibility.

Honestly, I think the slot layout on the 2019 is fine. I've even said in this thread (I think) that I think the 2019 is more flexible than the old cMP.

I mostly responded to this comment because you also bring up money/budget, because that is honestly the biggest problem with the 2019 Mac Pro. A $3k price increase for the base model versus both the tcMP and cMP is more than a bit annoying.

DisplayPort 2.0 is built on the Thunderbolt protocol. The number of GPU cards with "Thunderbolt" is only going up in 2021 and later. The notion that Thunderbolt is going to drop in relevance in the upcoming years is odd when it has been woven into USB4 and DP v2. It isn't going anywhere. And Apple isn't the only one that is going to be keeping it relevant.

The third thing Apple can do is not be quite so proprietary when DP v2.0 rolls out and more folks may be interested in dual-usage output on new generation cards (i.e., work with some folks on using the MPX connector as a pseudo-standard).

Yeah, the problem there is it's years away. Uptake of both USB4 and DP2 is going to be pretty slow outside of the Apple sphere. I can only hope that Apple being quick on tech like this means that there will be something like the 5K UltraFine that supports this fairly quickly. But Apple really could benefit from not needing MST to drive 5K/6K displays at 10-bit, so I do see them picking it up fairly quickly once the chipsets exist to route DP2 to Thunderbolt/USB4 ports.

Maybe on the PC side, we can hope that gaming monitors will start adopting DP2 for >60Hz 4K displays, and pushing adoption that way. But since AMD hasn't even shipped GPUs with USB-C ports for VirtualLink yet, who knows.

But one reason I've recently written off a Hackintosh is that TB3 support on the PC side is pretty hit and miss. It's mostly in pre-built laptops, and motherboards have issues providing more than a single USB-C port, let alone TB3.

Apple isn't going to spend much time putting support in for 3rd party legacy TBv3 cards. The Titan Ridge ones can be put into a "happen to work" state but Apple probably won't spend any more time on that than they do with eGPU on the MP 2013.

I wasn't expecting any different. I was pointing out that one of the nice things about the Mac Pro's number of slots is that you can use the loopback method if you wanted a 3rd party GPU and just absolutely had to drive that 5K over TB3 in the near term. It wouldn't be my first choice, but it's an option, and really you just want the Mac Pro slot to provide power to the card for the DisplayPort loopback, and not need an external PCIe box to hold it.

I said nothing about how well they'd work for anything beyond display loopback. The Titan Ridge card in particular couldn't carry actual data on the ports at all anyway; the Mac Pro doesn't have the TB3 header the card needs on the motherboard.

Again, context in the quote chain is somewhat important here.
 
The one exception to "it's hard to get TB3 support with a non-Apple video card" not really mattering is the XDR Display. Very few Mac Pro users are going to use the old Apple "one-input" displays. If you're doing anything in photography, video or graphics, you'll have a high-end display from Eizo, NEC or someone. If you're working with audio or something else where the display matters little enough that you might choose an old Thunderbolt Display you had around, there are many suppliers of 27" non-Retina displays, and a good one for non-color critical use is under $500 - hardly a dealbreaker on a computer that starts at $6000.

The XDR Display makes it more interesting - we'll have to see how it performs against other very high-end displays, but if it really is offering the performance of a $15,000-$30,000 reference monitor for $5000, it will find a market (largely in Hollywood). Some of those users may want to choose another GPU, and a compatibility issue with the only port on the display could be disappointing. Of course, that's a somewhat narrow slice of the market.

$5000 displays are relatively rare - there are very, very good displays from Eizo, NEC and competitors which are mostly in the $1000-$3000 range, although they do top out around $5000 - and there are true cinema reference displays, which are much more than $5000.

If Apple hasn't come up with something that performs like a true reference monitor, they have one of a number of entries in the top-end photo/graphics display market, which is dominated by two trusted names that have been making great displays for decades. No non-Eizo, non-NEC photo/graphics monitor sells for much over $2000 (except for one HP DreamColor Apple will be competing directly with, and a Dell that happens to have an 8K resolution).

Unless it's really reference grade, Apple's monitor is wildly overpriced - and if it is a true reference monitor, it's a steal for the few people who need one.

If Apple delivered a version of the 6K display in the $2000-$3000 range that didn't have quite the performance, but was clearly aimed at the top of the photo/graphics market rather than only at Hollywood color graders, that would make the "one input" problem all the more severe.
 
Seriously though, consider the fact that at least some of those people moaning are long-time Mac users who feel like they are being priced out of solutions that were available years ago. Not even the 2013 Mac Pro started this high. The PowerMac line was always a bit pricey but still in reach for prosumer/enthusiast types, or folks like me, where the heavy hardware is paid for by work but I still want some reasonable hardware at home and don't need a full Xeon workstation there (though I'd take one if it weren't double the price of the next tiny step down, performance-wise).

Then those users are realising that Apple is a giant global corporation who only cares about money and profits, not what a select few users want. There is NO loyalty with Apple I’m afraid. It’s about time some realised that, especially under Tim Cook and co.
Apple see those users as a market too small to be concerned with or spend resources on.
And of course that sucks, but it is what it is. Why complain about it? No global corporation is loyal to its customers in that way.

The new machine is fantastic but aimed at a specific market; it’s also nice to see meaning put back into the word ‘pro’.
And Apple will tell you the iMac Pro is for the people you refer to, and that again comes down to cost; as you say, it’s cheap as it uses the same tooling as the iMac.
 
The new machine is fantastic but aimed at a specific market; it’s also nice to see meaning put back into the word ‘pro’.
And Apple will tell you the iMac Pro is for the people you refer to, and that again comes down to cost; as you say, it’s cheap as it uses the same tooling as the iMac.

If the iMac Pro was for me, I would own one. The only product line that Apple makes that can do 3D art is the Mac Pro. Which gives me 2 options: Hackintosh or leave the Apple ecosystem behind.

My next system will be an AMD based solution; my windows testbed (HP Z210) apparently makes a great hackintosh. My next phone will probably be a Samsung, and I am getting a Kindle for xmas.

I'll miss OSX, but Timmy doesn't want my money.
 
If the iMac Pro was for me, I would own one. The only product line that Apple makes that can do 3D art is the Mac Pro. Which gives me 2 options: Hackintosh or leave the Apple ecosystem behind.

My next system will be an AMD based solution; my windows testbed (HP Z210) apparently makes a great hackintosh. My next phone will probably be a Samsung, and I am getting a Kindle for xmas.

I'll miss OSX, but Timmy doesn't want my money.

Nope he doesn’t, but I’m pretty sure if you’re at the level of 3D art where you need a Mac Pro, you should be spending thousands on equipment, as surely you’ll get that money back from clients; you’ll either have lots of them or charge high prices for high-end work, right?
 
The increase in cost for a non-sealed Mac has occurred simultaneously with the iMac going from a dual-core home computer that used mobile CPUs and GPUs (even the Core 2 "Extreme" in one 2007 iMac was a 44 watt processor) to an 8 core powerhouse that uses one of the fastest desktop CPUs available (albeit with midrange GPUs). Yes, that chip won't turbo as high in an iMac as it will liquid cooled in a big tower. Yes, there are a couple of faster desktop chips around - AMD Ryzens that Apple won't use because they can't get the mobile lineup they want for their laptops except from Intel - and laptops are what drive Macs...

Given all of that, the iMac has still moved way up the power scale since 2006 when the first cheesegrater Mac Pro was introduced. Apple prefers to sell iMacs, for reasons ranging from ease of technical support to pure greed - and Apple won't give you a choice other than an iMac except for niche use cases for which an iMac simply won't work. There are some use cases like gaming (which is a deliberate decision) and applications that absolutely require NVidia graphics (for which Apple doesn't have a driver that meets their standards, and won't bother to write one) that Apple has decided not to support.

As the iMac becomes capable of more and more, Apple makes it harder and harder to avoid. When all iMacs were wimpy home computers, a computer that took on jobs the iMac just couldn't handle didn't have to be all that expensive, and it couldn't be (at the bottom end of its line), because it would leave a gap where no iMac was powerful enough, but the Mac Pro was overkill. Today, Apple figures that iMacs (including the iMac Pro) can handle most jobs, leaving only a tiny market, largely in Hollywood, that can't use an iMac - so they built a Mac Pro for a tiny market, largely in Hollywood.

This has nothing at all to do with which markets might or might not prefer an iMac versus a tower. Apple, quite frankly, doesn't care. It has only to do with which markets can use an iMac if that's what Apple offers... Realistically, pretty much every still photographer can use an iMac (and if your camera can clog up the biggest iMacs, your camera gear costs enough more than a Mac Pro that the Mac Pro isn't the big expense). I'm editing 50 megapixel images and printing 24x36" on a two year old quad-core MacBook Pro with 16 GB of RAM.

That's not ideal, and I'm looking at either an octo-core MBP with 32 (or 64 if they let me) GB when the new design ships or an iMac (maybe an iMac Pro?) with 64 GB. ~50 MP is a huge image size (Sony's 61 MP camera will have very similar requirements) and, while 100 MP cameras exist, the "cheapest" one with a couple of lenses costs as much as a nicely configured Mac Pro with an XDR display (and a high-end iMac, especially an iMac Pro, would work just fine with one - so would an octo-core MacBook Pro with 32 GB of RAM). Cell phones and other pocket cameras with strange ultra high resolution sensor designs will compress their images and employ other tricks so as not to need exotic computers to edit them.

4K video, even in "big", low-compression formats, edits just fine (and very smoothly) on an octo-core with 32 or 64 GB of RAM. Easy for an iMac Pro from its introduction - now easy on a non-pro iMac and even an MBP. Video over 4K is now possible to capture with a $3000 camera instead of a $50,000 camera - due to a couple of affordable 6K cameras released in the last few months. Even 6K should probably edit well on a powerful octo-core system, and certainly on an iMac Pro. If you're shooting 8K, you're using a RED (or something equally expensive), and the Mac Pro fits in your budget.

Apple is ignoring 3D artists, probably because of NVidia. High-end professional 3D shops can easily afford the Mac Pro if they can use AMD cards - it's cheaper than the Crays some of them use... Semi-pros and hobbyists are actively being pushed to Windows, and it may be a small enough market that Apple doesn't care (or thinks that supporting their unique needs will cost them stability elsewhere).

Apple is not ignoring gamers - they're deliberately keeping them out. They don't want to add the hooks they'd need to the OS to allow complex games to function. The routines that allow games to control Windows machines at low level are also beloved by malware authors. The same is true of gaming drivers (mostly for video cards) - they're simply not as stable as drivers written for other uses. AMD and NVidia both charge extra for workstation drivers (and slight hardware modifications) on the PC side, and Apple has written their own AMD driver - all to avoid using gaming drivers for other purposes.

Almost all standard office applications run fine on a MacBook Air, let alone a big iMac - and the exceptions (GIS, CAD, database - all at the high end) tend to be Windows-based anyway...

While scientists loved the old Mac Pro, that was a happy accident for Apple. They've never cared about the scientific market - just not enough of them to be worth their while. They'll get some purchase orders from the physics department for the new one, too - and the scientists they lose to Dell and HP won't register on their trillion dollar radar.

Computational investing has so much money floating around that they don't care what it costs. If they want Mac Pros, they buy them out of petty cash. Quantitative hedge fund investors once drilled a hole through a mountain to get a cable routing a hundred miles shorter between New York and Chicago. They cared enough about a few hundredths of a millisecond to spend probably millions of dollars on the shorter route. If a Mac Pro fits their workflow, they'll buy as many of them as they need and a bunch more for spares (and get them custom-painted to match their Cray).

Most other possible applications fall into one of these broad categories... Either an iMac works (whether you like it or not), there's enough money around that the price of the Mac Pro doesn't matter, or Apple has more or less deliberately abandoned the market to Windows. Apple won't make a machine that lets the user avoid an iMac because Apple prefers to sell iMacs.
 
Nope he doesn’t, but I’m pretty sure if you’re at the level of 3D art where you need a Mac Pro, you should be spending thousands on equipment, as surely you’ll get that money back from clients; you’ll either have lots of them or charge high prices for high-end work, right?

You would be pretty wrong. There are no barriers to entry. The low-end, hobbyist-level software is fully multi-threaded (except for Daz Studio), and will use every byte of RAM and every core we can throw at it. Even at the hobbyist level, some of us have render farms. :D
 
Apple is ignoring 3D artists, probably because of NVidia.

I think this is incorrect - the Redshift devs are porting their 3D GPU renderer to Apple Metal and AMD cards, as are Otoy with their Octane GPU render engine. Mac Pro customers will even be offered an Octane license free of charge with their purchase. Apple highlights performance gains with Cinema 4D on the new Mac Pro product page, and also the rendering speed gains with Arnold renderer and Maya - ie the absolute top-end of 3D art design and rendering.

This suggests to me that Apple are not ignoring 3D artists, and by offering an alternative to Nvidia and pushing devs to port their GPU renderers to Metal, they are creating a healthier market for us buyers.
 
That's a rather extreme hobby! What software is that?

You have no idea :eek:. I stopped counting the costs when I realized I had dropped over $10K on digital assets ($100 - $150 a month adds up over a decade and a half).

To start - Daz Studio is free (and does have a really nice render engine, Iray). Digital assets are pretty inexpensive, and once you hit the software limits (that is pretty easy - it is a VERY poorly written piece of software), there is Poser, or other software to move up to.

If you go over to Amazon, a LOT of self-published book covers (and album covers) are made in either DS or Poser.

I use Vue (outdoor scenes), Poser Pro 11 (heart of the workflow, and the professional 3D world's dirty little secret), ZBrush (my only expensive software), Blender (for modeling), Daz Studio (for exporting DS content out of DS and into Poser), Photoshop (post-work), LuxRender (for rendering), and Acrobat (for final assembly of my graphic novels). Substance Painter is the next piece of software I'll be adding. Since I am moving to Windows, I'll probably be adding Marvelous Designer.

Keep in mind, I am on the low end - there are a lot of hobbyists that own Cinema 4D or Maya.

A Z210 or HP 8200 makes a great little render boxen - for less than the price of an RTX 2080, I can have 6 of them (4 cores/8 threads each, 24 cores/48 threads total) along with both of my 4,1 Mac Pros, controlled through Poser Pro.
 
The AMD EPYC 2 CPU range is gonna murder these Xeons.... way more flexible in the SKU range, with core and clock options more reasonable to suit other purposes.

EPYC 7542

32 core

2.9-3.4ghz

225 W

$3400

Can put in 2 processor boards...
 
The AMD EPYC 2 CPU range is gonna murder these Xeons.... way more flexible in the SKU range, with core and clock options more reasonable to suit other purposes.

EPYC 7542

32 core

2.9-3.4ghz

225 W

$3400

Can put in 2 processor boards...

Yep. And you can do the same with the 64-core Epyc CPUs.
 
I think this is incorrect - the Redshift devs are porting their 3D GPU renderer to Apple Metal and AMD cards, as are Otoy with their Octane GPU render engine. Mac Pro customers will even be offered an Octane license free of charge with their purchase. Apple highlights performance gains with Cinema 4D on the new Mac Pro product page, and also the rendering speed gains with Arnold renderer and Maya - ie the absolute top-end of 3D art design and rendering.

This suggests to me that Apple are not ignoring 3D artists, and by offering an alternative to Nvidia and pushing devs to port their GPU renderers to Metal, they are creating a healthier market for us buyers.

Apple is investing a lot in Metal-based ray tracing as well (which runs on AMD cards). So far, they've completely focused that on rendering in creative apps and not in games.
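
For anyone curious what that looks like in practice, here's a minimal sketch of the MPSRayIntersector path in Metal Performance Shaders (the GPU ray tracing API on 2019-era macOS, which runs on AMD cards) tracing one ray against one triangle. The scene data and the packed struct layouts are my own illustrative assumptions, not anything from Apple's sample code.

```swift
import Metal
import MetalPerformanceShaders

// Layouts chosen to match MPSRayDataType.originMinDistanceDirectionMaxDistance
// and MPSIntersectionDataType.distancePrimitiveIndexCoordinates (packed 12-byte float3s).
struct PackedRay {
    var ox: Float, oy: Float, oz: Float, minDistance: Float
    var dx: Float, dy: Float, dz: Float, maxDistance: Float
}
struct PackedHit {
    var distance: Float
    var primitiveIndex: UInt32
    var u: Float, v: Float
}

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else { fatalError("No Metal device") }

// One triangle sitting in front of the origin, packed as 3 floats per vertex.
let vertices: [Float] = [-1, -1, 5,   1, -1, 5,   0, 1, 5]
let vertexBuffer = device.makeBuffer(bytes: vertices,
                                     length: vertices.count * MemoryLayout<Float>.stride,
                                     options: .storageModeShared)!

// Build an acceleration structure over the geometry.
let accel = MPSTriangleAccelerationStructure(device: device)
accel.vertexBuffer = vertexBuffer
accel.vertexStride = 3 * MemoryLayout<Float>.stride
accel.triangleCount = 1
accel.rebuild()

// One ray fired straight down +Z.
var ray = PackedRay(ox: 0, oy: 0, oz: 0, minDistance: 0,
                    dx: 0, dy: 0, dz: 1, maxDistance: 100)
let rayBuffer = device.makeBuffer(bytes: &ray, length: MemoryLayout<PackedRay>.stride,
                                  options: .storageModeShared)!
let hitBuffer = device.makeBuffer(length: MemoryLayout<PackedHit>.stride,
                                  options: .storageModeShared)!

let intersector = MPSRayIntersector(device: device)
intersector.rayDataType = .originMinDistanceDirectionMaxDistance
intersector.intersectionDataType = .distancePrimitiveIndexCoordinates

let commandBuffer = queue.makeCommandBuffer()!
intersector.encodeIntersection(commandBuffer: commandBuffer,
                               intersectionType: .nearest,
                               rayBuffer: rayBuffer, rayBufferOffset: 0,
                               intersectionBuffer: hitBuffer, intersectionBufferOffset: 0,
                               rayCount: 1,
                               accelerationStructure: accel)
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

// A miss reports a negative distance for nearest-hit queries.
let hit = hitBuffer.contents().load(as: PackedHit.self)
print(hit.distance >= 0 ? "Hit at distance \(hit.distance)" : "Miss")
```

Real renderers obviously generate millions of rays per frame in a compute shader rather than one on the CPU, but the encode/commit flow is the same.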
 
That is what I am preparing to build with.

ASUS has released a workstation X570 board (no gaming stuff, no RGB :D); I'll pair it with a Ryzen 9, 128 GB of RAM, an NVMe SSD boot drive, and a 5700 video card.
 
As I said speculation is fine, it’s just annoying to see people constantly complain that Apple don’t make what they personally want, and so they just moan about and slander the machine they have made, or even those who will buy them! It’s like some sort of jealous spite. People need to move on.
People need to be heard. You've been around (here anyway) long enough to remember the uproar when new GPUs for 3,1 couldn't be used in 1,1 and 2,1?

It was a crap move, and Apple knew it was crap, but they tried it anyway. Sometimes the pitchforks have to come out to get what we want.
 
You have no idea :eek:. I stopped counting the costs when I realized I had dropped over $10K on digital assets ($100 - $150 a month adds up over a decade and a half).

To start - Daz Studio is free (and does have a really nice render engine, Iray). Digital assets are pretty inexpensive, and once you hit the software limits (that is pretty easy - it is a VERY poorly written piece of software), there is Poser, or other software to move up to.

If you go over to Amazon, a LOT of self-published book covers (and album covers) are made in either DS or Poser.

I use Vue (outdoor scenes), Poser Pro 11 (heart of the workflow, and the professional 3D world's dirty little secret), ZBrush (my only expensive software), Blender (for modeling), Daz Studio (for exporting DS content out of DS and into Poser), Photoshop (post-work), LuxRender (for rendering), and Acrobat (for final assembly of my graphic novels). Substance Painter is the next piece of software I'll be adding. Since I am moving to Windows, I'll probably be adding Marvelous Designer.

Keep in mind, I am on the low end - there are a lot of hobbyists that own Cinema 4D or Maya.

A Z210 or HP 8200 makes a great little render boxen - for less than the price of an RTX 2080, I can have 6 of them (4 cores/8 threads each, 24 cores/48 threads total) along with both of my 4,1 Mac Pros, controlled through Poser Pro.

I’ve played around with the demo version of Cinema 4D before; if I remember, it is the cheaper animation software to go for, as 3ds Max and Maya can cost thousands.
Never heard of Daz Studio though.
Certainly seems expensive if you’re putting that much a month into it; still, if it keeps you busy and your mind exercised and you enjoy it, I’m all for that.
People need to be heard. You've been around (here anyway) long enough to remember the uproar when new GPUs for 3,1 couldn't be used in 1,1 and 2,1?

It was a crap move, and Apple knew it was crap, but they tried it anyway. Sometimes the pitchforks have to come out to get what we want.

Haha, but with today’s Apple they aren’t listening... I mean the Pros were really complaining about the lack of an update and leaving the platform, so Apple responds with a machine costing double the previous model!
 
It's not too late for them to do the right thing, but they obviously won't if we're not vocal.

$1 says they have working prototypes of a scaled down version. :D Man, I'd love to get my hands on one of those. Now.
 
The increase in cost for a non-sealed Mac has occurred simultaneously with the iMac going from a dual-core home computer that used mobile CPUs and GPUs (even the Core 2 "Extreme" in one 2007 iMac was a 44 watt processor) to an 8 core powerhouse that uses one of the fastest desktop CPUs available (albeit with midrange GPUs). Yes, that chip won't turbo as high in an iMac as it will liquid cooled in a big tower. Yes, there are a couple of faster desktop chips around - AMD Ryzens that Apple won't use because they can't get the mobile lineup they want for their laptops except from Intel - and laptops are what drive Macs...

Given all of that, the iMac has still moved way up the power scale since 2006 when the first cheesegrater Mac Pro was introduced. Apple prefers to sell iMacs, for reasons ranging from ease of technical support to pure greed - and Apple won't give you a choice other than an iMac except for niche use cases for which an iMac simply won't work. There are some use cases like gaming (which is a deliberate decision) and applications that absolutely require NVidia graphics (for which Apple doesn't have a driver that meets their standards, and won't bother to write one) that Apple has decided not to support.

As the iMac becomes capable of more and more, Apple makes it harder and harder to avoid. When all iMacs were wimpy home computers, a computer that took on jobs the iMac just couldn't handle didn't have to be all that expensive, and it couldn't be (at the bottom end of its line), because it would leave a gap where no iMac was powerful enough, but the Mac Pro was overkill. Today, Apple figures that iMacs (including the iMac Pro) can handle most jobs, leaving only a tiny market, largely in Hollywood, that can't use an iMac - so they built a Mac Pro for a tiny market, largely in Hollywood.

This has nothing at all to do with which markets might or might not prefer an iMac versus a tower. Apple, quite frankly, doesn't care. It has only to do with which markets can use an iMac if that's what Apple offers... Realistically, pretty much every still photographer can use an iMac (and if your camera can clog up the biggest iMacs, your camera gear costs enough more than a Mac Pro that the Mac Pro isn't the big expense). I'm editing 50 megapixel images and printing 24x36" on a two year old quad-core MacBook Pro with 16 GB of RAM.

That's not ideal, and I'm looking at either an octo-core MBP with 32 (or 64 if they let me) GB when the new design ships or an iMac (maybe an iMac Pro?) with 64 GB. ~50 MP is a huge image size (Sony's 61 MP camera will have very similar requirements) and, while 100 MP cameras exist, the "cheapest" one with a couple of lenses costs as much as a nicely configured Mac Pro with an XDR display (and a high-end iMac, especially an iMac Pro, would work just fine with one - so would an octo-core MacBook Pro with 32 GB of RAM). Cell phones and other pocket cameras with strange ultra high resolution sensor designs will compress their images and employ other tricks so as not to need exotic computers to edit them.

4K video, even in "big", low-compression formats, edits just fine (and very smoothly) on an octo-core with 32 or 64 GB of RAM. Easy for an iMac Pro from its introduction - now easy on a non-pro iMac and even an MBP. Video over 4K is now possible to capture with a $3000 camera instead of a $50,000 camera - due to a couple of affordable 6K cameras released in the last few months. Even 6K should probably edit well on a powerful octo-core system, and certainly on an iMac Pro. If you're shooting 8K, you're using a RED (or something equally expensive), and the Mac Pro fits in your budget.

Apple is ignoring 3D artists, probably because of NVidia. High-end professional 3D shops can easily afford the Mac Pro if they can use AMD cards - it's cheaper than the Crays some of them use... Semi-pros and hobbyists are actively being pushed to Windows, and it may be a small enough market that Apple doesn't care (or thinks that supporting their unique needs will cost them stability elsewhere).

Apple is not ignoring gamers - they're deliberately keeping them out. They don't want to add the hooks they'd need to the OS to allow complex games to function. The routines that allow games to control Windows machines at low level are also beloved by malware authors. The same is true of gaming drivers (mostly for video cards) - they're simply not as stable as drivers written for other uses. AMD and NVidia both charge extra for workstation drivers (and slight hardware modifications) on the PC side, and Apple has written their own AMD driver - all to avoid using gaming drivers for other purposes.

Almost all standard office applications run fine on a MacBook Air, let alone a big iMac - and the exceptions (GIS, CAD, database - all at the high end) tend to be Windows-based anyway...

While scientists loved the old Mac Pro, that was a happy accident for Apple. They've never cared about the scientific market - just not enough of them to be worth their while. They'll get some purchase orders from the physics department for the new one, too - and the scientists they lose to Dell and HP won't register on their trillion dollar radar.

Computational investing has so much money floating around that they don't care what it costs. If they want Mac Pros, they buy them out of petty cash. Quantitative hedge fund investors once drilled a hole through a mountain to get a cable routing a hundred miles shorter between New York and Chicago. They cared enough about a few hundredths of a millisecond to spend probably millions of dollars on the shorter route. If a Mac Pro fits their workflow, they'll buy as many of them as they need and a bunch more for spares (and get them custom-painted to match their Cray).

Most other possible applications fall into one of these broad categories... Either an iMac works (whether you like it or not), there's enough money around that the price of the Mac Pro doesn't matter, or Apple has more or less deliberately abandoned the market to Windows. Apple won't make a machine that lets the user avoid an iMac because Apple prefers to sell iMacs.
Apple has not really made much of a push for gamers, other than press stuff at a time when the only really good systems for gaming were $1600+ iMacs and the $2000+ Mac Pro with a weak video card at that price.
 
Apple has not really made much of a push for gamers, other than press stuff at a time when the only really good systems for gaming were $1600+ iMacs and the $2000+ Mac Pro with a weak video card at that price.

And, sounding as I do like a broken record - a "gaming computer" with an Nvidia GPU is a virtual reality workstation. Apple's top-of-the-range workstation is a garbage virtual reality workstation.
 
And, sounding as I do like a broken record - a "gaming computer" with an Nvidia GPU is a virtual reality workstation. Apple's top-of-the-range workstation is a garbage virtual reality workstation.

I don't buy it.

There is a lot of hand wringing here that Nvidia is required for any serious work, and I just don't see it. Even if an Nvidia card is incrementally faster, that doesn't make an AMD card a garbage option.

Designing with an actual gaming card is pretty ehhhh too. Arguably the Vega 2 Duo would be a far superior option with more VRAM too. You'd need to thread rendering across the GPUs, but when you're the author of the content, that's doable.
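
To make "thread rendering across the GPUs" concrete, here's a minimal Swift sketch of how a Metal app sees both halves of a Duo and farms independent tiles out to each one. The tile count and `renderTile` are illustrative stand-ins I made up, not real engine code; a real content engine would encode actual render or compute passes per tile.

```swift
import Metal
import Dispatch

// Stand-in stub for real per-tile work: submits an empty command buffer and waits.
func renderTile(on queue: MTLCommandQueue, tile: Int) {
    guard let commandBuffer = queue.makeCommandBuffer() else { return }
    // A real engine would encode the render/compute passes for this tile here.
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}

// On a Vega II Duo, each GPU shows up as its own MTLDevice.
let devices = MTLCopyAllDevices()
print("Metal devices:", devices.map { $0.name })

let tileCount = 64                       // illustrative frame split
let group = DispatchGroup()

for (index, device) in devices.enumerated() {
    guard let queue = device.makeCommandQueue() else { continue }
    group.enter()
    DispatchQueue.global().async {
        // Interleave tiles across GPUs: device 0 gets 0, N, 2N, ...; device 1 gets 1, N+1, ...
        for tile in stride(from: index, to: tileCount, by: devices.count) {
            renderTile(on: queue, tile: tile)
        }
        group.leave()
    }
}
group.wait()
```

The point is just that the OS hands you each GPU as a separate device, so splitting your own workload across them is plumbing work, not magic.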

If I had the choice between a 2080 or a Vega 2 Duo for writing a 3D content engine, I'd take the Vega 2 Duo.

Heck, if you had the money, two Vega 2 Duos would absolutely destroy a 2080.
 
I don't buy it.

There is a lot of hand wringing here that Nvidia is required for any serious work, and I just don't see it. Even if an Nvidia card is incrementally faster, that doesn't make an AMD card a garbage option.

Designing with an actual gaming card is pretty ehhhh too. Arguably the Vega 2 Duo would be a far superior option with more VRAM too. You'd need to thread rendering across the GPUs, but when you're the author of the content, that's doable.

If I had the choice between a 2080 or a Vega 2 Duo for writing a 3D content engine, I'd take the Vega 2 Duo.

Heck, if you had the money, two Vega 2 Duos would absolutely destroy a 2080.
The main issue with AMD cards is that, even running in Linux (to access SYCL/HIP), they are still far behind Nvidia's CUDA capabilities. Even notwithstanding AMD's huge efforts to create builds for the latest TensorFlow releases, those arrive about 4 months later, and only on Linux. Apple's shameful AI strategy didn't just kick out CUDA, it also kicked SYCL/OpenCL; there are cues that Apple may push back and reinstate both CUDA and OpenCL/AMD HIP to come back to the AI game.
 
The main issue with AMD cards is that, even running in Linux (to access SYCL/HIP), they are still far behind Nvidia's CUDA capabilities. Even notwithstanding AMD's huge efforts to create builds for the latest TensorFlow releases, those arrive about 4 months later, and only on Linux. Apple's shameful AI strategy didn't just kick out CUDA, it also kicked SYCL/OpenCL; there are cues that Apple may push back and reinstate both CUDA and OpenCL/AMD HIP to come back to the AI game.

Well now you’re changing the category. We were talking VR content creation. Now we’re talking about compute.

I’m not convinced Nvidia hardware is faster as much as Metal is just young. Apple is adding more and more performance enhancements to Metal every year, and I’m curious to see what comes out of their work this year.

Can’t speak to Linux drivers, but I don’t think Nvidia is winning significantly in hardware. They have an edge, but especially with stacking multiple AMD GPUs, it’s not much.
 
I don't buy it. Heck, if you had the money, two Vega 2 Duos would absolutely destroy a 2080.

Right, but Nvidia gives you the 2080 Ti, Titan (X), etc. for radically less money than Apple's Vega 2 Duos will likely cost, and probably with radically higher performance. The undeniable problem with AMD is they're simply not good at generating high-end realtime 3D environments. Look at 4K high-framerate gaming scores, which are the best indicator of VR performance: AMD is nowhere to be seen - they've put all their eggs in 1080p/1440p.

AMD isn't just slower on macOS, it's slower on Windows as well. You literally cannot achieve the performance in VR with AMD that you can get with Nvidia. Advancements in Metal will not cure hardware incapacity.
 