This thread seems to contradict the claim that it works:

https://discussions.apple.com/thread/7364710?start=30&tstart=0

At the end of your own linked article, a user also leaves a contradictory comment.
It is difficult to find any discrete graphics cards made in the last 6 years that DO NOT support 10-bit color.

I've tested/confirmed "Deep Color" mode works on several older Macs (hence why I wrote the rough guide). The models tested had discrete graphics cards and ranged from early 2011 to 2015 iMacs and MBPs w/ Thunderbolt displays - they all worked!

The contradictory comment is by someone unwilling to try the 10-bit work-around w/ switchresx.

Try it yourself.

Make sure you download the 10-bit/8-bit gradient test file and use "Preview" to view it,
both before/after setting the color mode to "billions" in switchresx.

NOTE: I have not tested IGP Macs (gimped Macs w/ no discrete graphics). Additionally, some older MBP models only have 8-bit screens, but will display 10-bit color when connected to a Thunderbolt Display.

Basically all you have to do is:
  1. install SwitchResX - http://www.madrau.com/srx_download/download.html
  2. set color to "billions" in SwitchResX
  3. open the 10-bit image test in Preview/Photoshop CC - https://raw.githubusercontent.com/jursonovicst/gradient/master/gradient.png
  4. open the 10-bit video test in Resolve - https://github.com/jursonovicst/gradient/raw/master/gradient.mp4
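If you want to double-check that the gradient test file really stores more than 8 bits per channel before judging what your screen shows, the bit depth can be read straight out of the PNG header. A quick Python sketch (offsets per the PNG spec; the demo header below is hand-built for illustration, not the real gradient.png):

```python
import struct

def png_bit_depth(data: bytes) -> int:
    """Return the per-channel bit depth from a PNG's IHDR chunk."""
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG file")
    # IHDR is always the first chunk after the 8-byte signature:
    # 4-byte length, 4-byte type, then width (4), height (4), depth (1), ...
    _width, _height, depth = struct.unpack(">IIB", data[16:25])
    return depth

# Hand-built 16-bit IHDR header just to demonstrate the offsets;
# run png_bit_depth() against the downloaded gradient.png to test the real file.
demo = (b"\x89PNG\r\n\x1a\n"
        + struct.pack(">I", 13) + b"IHDR"
        + struct.pack(">IIBBBBB", 640, 480, 16, 2, 0, 0, 0))
print(png_bit_depth(demo))  # 16
```

PNG has no 10-bit mode, so 10-bit test imagery is normally stored as 16-bit per channel; seeing a depth of 16 confirms the file itself isn't the bottleneck.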

Currently 10-bit only works in a handful of apps on Mac;
most importantly it works in Resolve & Photoshop CC.

You can read more about Dr. Tamás Jursonovics, who created the above 10bit test files, on his page: http://jursonovics.com/
 
Well, I don't know. Is it a hardware thing or a software thing? Maybe now that GPUs are becoming more powerful, Apple can just hack through their own drivers to support 10-bit even without any dedicated hardware.

You've already said the consumer hardware supports it when you noted that there are plenty of Mac consumer GPUs that output 10-bit color.

So you don't need a "workstation" card to output 10-bit color. There's still no difference in the hardware. The Mac Pro does not have some special 10-bit hardware doohickey on its GPUs.

10-bit is part of the HDMI 1.3 spec. If you can output HDMI 1.3 off a card, you can do 10-bit color.
So, a game that supports HDR (in Windows, mind you) is not the same as an OS supporting a 10-bit color space, which macOS does - but only through a select number of Macs and GPUs. So...

All Macs (and iOS devices) support 10 bit output. An iPhone 7 has a 10 bit panel.

This isn't as special as you seem to think it is. They just have to be running... I think it was a certain version of El Capitan that had this flipped on?

You do need a GPU that can do 10 bit, but that's a lot of GPUs, and pretty much anything new. The actual Mac driving the GPU doesn't matter.
 
From "you need a professional grade gpu to"...."well Macs haven't supported it"
What was your original assertion? Something along the lines of 10bit is a pro feature...
Keep moving the posts! You'll be on target eventually.

Haha. You just wanna make the excuse that I am doing what you think I'm doing so you can use this GIF.

But, I don't think I am moving the goalposts.

This all started, I think, in post #2737 when I commented that the D700 card in the Mac Pro is nothing more and nothing less than a workstation-class card, to counter your comment that the D700 is nothing more and nothing less than a 2011 Tahiti card. And then I casually added a comment about workstation cards supporting 10-bit color, because I have recently been trying to find out about this with my own RX 460 card in my Mac Pro.

Which... the AMD RX 460 does not support 10-bit color. But, it does support HDR. But, HDR is only through games. So, it's not the same.

There might be hackery involved to support 10-bit color from, say, an HD 7950 in a Mac Pro. But even in the link supplied by Daniel Reed, users are having contradictory results. And I'm not willing to try the hack on mine because I don't have a 10-bit capable monitor.

Perhaps 10-bit support will become standard in Macs now, as we have seen with the new MBPs. But previously, it wasn't. And previously, only workstation cards supported 10-bit color. And, yes, Pascal and Polaris gaming cards support HDR, which is 10-bit. But that is only through gaming and DirectX.

I don't think my comments are close to that GIF. So, it's kind of a waste of a GIF. Sorry!
[doublepost=1480561479][/doublepost]
You've already said the consumer hardware supports it when you noted that there are plenty of Mac consumer GPUs that output 10-bit color.

So you don't need a "workstation" card to output 10-bit color. There's still no difference in the hardware. The Mac Pro does not have some special 10-bit hardware doohickey on its GPUs.

10-bit is part of the HDMI 1.3 spec. If you can output HDMI 1.3 off a card, you can do 10-bit color.

All Macs (and iOS devices) support 10 bit output. An iPhone 7 has a 10 bit panel.

This isn't as special as you seem to think it is. They just have to be running... I think it was a certain version of El Capitan that had this flipped on?

You do need a GPU that can do 10 bit, but that's a lot of GPUs, and pretty much anything new. The actual Mac driving the GPU doesn't matter.

I'm not saying it's "special." All I said was that workstation-class cards, OOB, are the only ones that support a 10-bit color space.

And, TBH, I really wanna know this... how do you know there is no "special 10-bit hardware doohickey on its GPUs"? I don't really know this, so I'm curious. But, I doubt you have the answer.

As I said before, Polaris and Pascal supports HDR which is 10-bit, but that is through DirectX and games.

iPhone 10-bit support is probably through metal.

And, are you sure all Macs support 10-bit output?
 
All Macs (and iOS devices) support 10 bit output. An iPhone 7 has a 10 bit panel.

This isn't as special as you seem to think it is. They just have to be running... I think it was a certain version of El Capitan that had this flipped on?

Apple turned on 10-bit in 10.11.2:
https://developer.apple.com/library..._11_2.html#//apple_ref/doc/uid/TP40016630-SW1

Apple set a software flag to keep 10-bit off on some Macs; SwitchResX bypasses the Apple block.

Why did apple do this?
Money!

Want professional 10-bit for free? No way Apple says!
Apple demands you buy another "officially supported" iMac if you want 10-bit.
 
I've tested/confirmed "Deep Color" mode works on several older Macs (hence why I wrote the rough guide). The models tested had discrete graphics cards and ranged from early 2011 to 2015 iMacs and MBPs w/ Thunderbolt displays - they all worked!

The contradictory comment is by someone unwilling to try the 10-bit work-around w/ switchresx.

Try it yourself.

Make sure you download the 10-bit/8-bit gradient test file and use "Preview" to view it,
both before/after setting the color mode to "billions" in switchresx.

NOTE: I have not tested IGP Macs (gimped Macs w/ no discrete graphics). Additionally, some older MBP models only have 8-bit screens, but will display 10-bit color when connected to a Thunderbolt Display.

Basically all you have to do is:
  1. install SwitchResX - http://www.madrau.com/srx_download/download.html
  2. set color to "billions" in SwitchResX
  3. open the 10-bit image test in Preview/Photoshop CC - https://raw.githubusercontent.com/jursonovicst/gradient/master/gradient.png
  4. open the 10-bit video test in Resolve - https://github.com/jursonovicst/gradient/raw/master/gradient.mp4

Currently 10-bit only works in a handful of apps on Mac;
most importantly it works in Resolve & Photoshop CC.

You can read more about Dr. Tamás Jursonovics, who created the above 10bit test files, on his page: http://jursonovics.com/

This is interesting. But, I'm gonna pass. Even if I can hack my RX 460 in my Mac Pro with SwitchResX, I don't have a 10-bit capable monitor. So it would be kind of useless for me.

But, thanks though. This could be useful for future reference!
 
This is interesting. But, I'm gonna pass. Even if I can hack my RX 460 in my Mac Pro with SwitchResX, I don't have a 10-bit capable monitor. So it would be kind of useless for me.

But, thanks though. This could be useful for future reference!

I suspect some use 10-bit panels even when they claim to be only 8-bit.

Thunderbolt Displays claim to be only 8-bit, yet I have tested several and not found a single one that is not 10-bit.
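The banding you'd hunt for with the gradient file is easy to reason about numerically: quantize a smooth ramp to 8 and 10 bits and count how many distinct steps survive across the width of the panel. A small illustrative Python sketch (2560 is just a Thunderbolt Display's horizontal resolution, used as an example):

```python
def distinct_levels(pixels: int, bits: int) -> int:
    """Quantize a smooth 0..1 ramp to n-bit codes and count surviving steps."""
    levels = (1 << bits) - 1
    ramp = [round(i / (pixels - 1) * levels) for i in range(pixels)]
    return len(set(ramp))

width = 2560  # horizontal resolution of a Thunderbolt Display, as an example
print(distinct_levels(width, 8))   # 256 codes -> ~10-pixel-wide bands
print(distinct_levels(width, 10))  # 1024 codes -> steps too fine to see
```

At 8 bits, a full-width gray ramp collapses into 256 bands roughly 10 pixels wide, which is what a "banded" gradient.png looks like on screen; at 10 bits the step width drops below what most eyes can resolve.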
 
As I said before, Polaris and Pascal supports HDR which is 10-bit, but that is through DirectX and games.

This doesn't make any sense. All Windows apps run through DirectX. DirectX is the whole system, not just games. Ever since Vista every app has been a DirectX capable app.

Because every app since Vista is built on DirectX, every app is able to render in 10 bit.

iPhone 10-bit support is probably through metal.

Now you're making less sense. Are you saying anything that supports Metal supports 10 bit output?

That's not true, regardless. OpenGL on the Mac has supported 10 bit output.

http://nativedigital.co.uk/site/2010/11/10-bit-graphics-on-apple-mac/

See. 4870 and 5870 on the Mac have supported 10 bit output through OpenGL.

You're talking yourself in circles. Why can't you just accept there isn't anything special about the Mac Pro's GPUs?
 
This doesn't make any sense. All Windows apps run through DirectX. DirectX is the whole system, not just games. Ever since Vista every app has been a DirectX capable app.

Because every app since Vista is built on DirectX, every app is able to render in 10 bit.



Now you're making less sense. Are you saying anything that supports Metal supports 10 bit output?

That's not true, regardless. OpenGL on the Mac has supported 10 bit output.

http://nativedigital.co.uk/site/2010/11/10-bit-graphics-on-apple-mac/

See. 4870 and 5870 on the Mac have supported 10 bit output through OpenGL.

You're talking yourself in circles. Why can't you just accept there isn't anything special about the Mac Pro's GPUs?

Accept that there isn't anything special about the Mac Pro's GPUs? I assume you mean the nMP D-series GPUs, right? In that case, I think they're special since they're custom and they're "workstation" class GPUs.

What you really wanna ask, though, I think, is whether there's anything special about an AMD FirePro card versus an Nvidia Quadro card, right? Because that's what you're really asking, comparing them to GeForce or Radeon gaming cards? Am I right?

This article here says only supported Macs do "Deep Image":

https://developer.apple.com/library...ion/Intro.html#//apple_ref/doc/uid/TP40016622

If you want me to admit to anything... I admit I'm a noob at this. Not an engineer or anything. So, I'm learning as I go. But, you seem to be accusing me of going in circles, claiming I'm moving goal posts and admitting to something special about something...

The Metal thing and the DirectX thing was a shot in the dark after quick googling. Like I said, I'm a noob.

"Are you saying anything that supports Metal supports 10 bit output?"

Sounds like a trick question. I'm a noob okay at this stuff! I admit it! I don't even have a 10-bit card and a 10-bit monitor. All my 10-bit knowledge is from ogling Quadro and Firepro and expensive NEC and Eizo product pages! Lol!

As to Windows and DirectX. Take a look at this article after like 2 seconds of googling:

http://nvidia.custhelp.com/app/answers/detail/a_id/3011/related/1/session/L2F2LzEvdGltZS8xNDgwNTcyMTEwL3NpZC9jbENRWl80bg==

"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI. For more information on NVIDIA professional line of Quadro GPUs, please visit:"

"4870 and 5870 on the Mac have supported 10 bit output through OpenGL"

Ummm... before I got the RX 460, I had an AMD HD 5770 in my Mac Pro. The HD 5770 is in the same family tree as the HD 5870. And my HD 5770 doesn't support 10-bit. So... are you sure about that, bud?

 
Workstation-class graphics cards usually means ECC memory and firmware tweaks for better full-float math. I forget where I read it, but there was a strong argument that the speed loss from ECC did not compare favorably with the speed loss from the rare memory error on non-ECC. Sure, better full-float performance helps if applications use it. I've bounced between systems with Quadro 6000s and GTX Titans (Quadro vs. gaming firmware, same hardware except the memory type), and the Quadros seemed a few seconds slower with everything else equal for processing 6K-to-HD material, even in Resolve. Not really much of a difference at all, and considering the premium Quadros charge, users are almost always way, way better off with double or triple Titans for the cost of one Quadro. And from real-world experience, most creative applications work better with CUDA than OpenCL. One GTX Titan performs the same or way better than dual D700s, even with the Nvidia card in an external enclosure via the PCIe 4-lane bus on TB2. I know the armchair experts will disagree o_O
 
The focus for the Mac line is to drop Intel CPUs in favor of using Apple's own A chips.
Even as Intel preps to fab the A12 chips for Apple's 2018 products.
Expect the A chip switch-over to start by macOS 10.14 (if not for 10.13).

The 2017 iMac is suspected to be the last Intel based Mac.

And that will be the end of the mac as a software platform.

Software companies aren't going to jump architectures twice in less than 20 years.
 
The bandwidth required to transfer 10-bit on a sub-retina resolution display was never a limiting factor, unless you wanted to run multiple monitors. The power of the GPU had long surpassed that mark and hadn't been the bottleneck for a long time. In practice, the fact that Apple hadn't bothered to properly implement 10-bit into OS X as late as El Crap was one of the main reasons that video professionals at large had to migrate out of the Mac ecosystem.

I remember a 100-dollar Blackmagic PCI card was already capable of pulling 10-bit video out of FCP7 on a MacPro 2,1 more than 10 years ago, but it was limited to just the active video render and not the rest of the workspace such as the canvas, not even for a single static frame. It was OS X as a system, where you want the graphical interface to support 10-bit as well, that was the hindrance.

To say that the nMP, even when it was new in 2013, was a workstation-class computer is a really long stretch. They chose components that had some resemblance to what you can find in a typical workstation, but the overall package they decided to put them in was the biggest factor keeping it from being truly powerful. The form factor, the throttling... now, after 3 years of virtually no updates, we are at a time where a Pascal / CUDA GTX 10xx card can be bought for less than $200. You can spend less than 1/3rd of the price you pay on a nMP to build a Windows PC that runs Adobe CC video-accelerated apps just as well.
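The point about bandwidth never being the limiter is easy to check with back-of-the-envelope numbers: even 10-bit-per-channel at 2560x1440@60 fits comfortably inside DisplayPort 1.2's roughly 17 Gbit/s of payload. A rough Python sketch (the 25% blanking overhead is an approximation, not an exact CVT timing):

```python
def link_gbps(width: int, height: int, hz: int, bits_per_pixel: int,
              blanking: float = 1.25) -> float:
    """Rough video-link bandwidth in Gbit/s, with ~25% blanking overhead."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

# 2560x1440@60, i.e. a Thunderbolt Display-class panel
print(round(link_gbps(2560, 1440, 60, 24), 1))  # 8-bit/channel: ~6.6 Gbit/s
print(round(link_gbps(2560, 1440, 60, 30), 1))  # 10-bit/channel: ~8.3 Gbit/s
```

The jump from 24 to 30 bits per pixel adds only about 1.7 Gbit/s, nowhere near saturating the link, which supports the argument that the 10-bit gap was software policy rather than hardware.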
 
The bandwidth required to transfer 10-bit on a sub-retina resolution display was never a limiting factor, unless you wanted to run multiple monitors. The power of the GPU had long surpassed that mark and hadn't been the bottleneck for a long time. In practice, the fact that Apple hadn't bothered to properly implement 10-bit into OS X as late as El Crap was one of the main reasons that video professionals at large had to migrate out of the Mac ecosystem.

I remember a 100-dollar Blackmagic PCI card was already capable of pulling 10-bit video out of FCP7 on a MacPro 2,1 more than 10 years ago, but it was limited to just the active video render and not the rest of the workspace such as the canvas, not even for a single static frame. It was OS X as a system, where you want the graphical interface to support 10-bit as well, that was the hindrance.

To say that the nMP, even when it was new in 2013, was a workstation-class computer is a really long stretch. They chose components that had some resemblance to what you can find in a typical workstation, but the overall package they decided to put them in was the biggest factor keeping it from being truly powerful. The form factor, the throttling... now, after 3 years of virtually no updates, we are at a time where a Pascal / CUDA GTX 10xx card can be bought for less than $200. You can spend less than 1/3rd of the price you pay on a nMP to build a Windows PC that runs Adobe CC video-accelerated apps just as well.

The caveat there, though, is...

Windoze... :(

...and subscription-based model for Adobe professional apps... :( (Not good for someone with unsteady or low income such as a lot of artists)

Also, regarding 10-bit support, here is a good conversation about it:

https://www.dpreview.com/forums/thread/3538545

I know you're sad about the trashcan form factor. But I see it as buy-and-forget for people who can do so. As an artist and aspiring video editor, I find the nMP powerful enough, today. Even if it's from 2013...

As a creator, the last thing you wanna do is troubleshooting stuff... unplugging stuff... etc. With the nMP (trashcan edition), you don't really have to do that. It's either a software issue, which you can't resolve, or a hardware issue, which you can't resolve either.

This notion that a tower Mac like the one I have (classic Mac Pro 5,1), with PCI slots and whatnot, is "better" because it's more upgradeable is a kind of mirage or illusion that being able to swap components out myself will make my life better.

It doesn't!

" The form factor, the throttling... "

Are you sure the nMP throttles? The thing is basically "designed" in a thermally efficient form factor: one big single fan in the bottom sucking in cool air, cooling the core of the thing, which is the heatsink. And it's vertically oriented to allow hot air to escape and get expelled upwards and out of the enclosure, via convection and that single fan.
 
The caveat there, though, is...

Windoze... :(

...and subscription-based model for Adobe professional apps... :( (Not good for someone with unsteady or low income such as a lot of artists)

Also, regarding 10-bit support, here is a good conversation about it:

https://www.dpreview.com/forums/thread/3538545

I know you're sad about the trashcan form factor. But I see it as buy-and-forget for people who can do so. As an artist and aspiring video editor, I find the nMP powerful enough, today. Even if it's from 2013...

As a creator, the last thing you wanna do is troubleshooting stuff... unplugging stuff... etc. With the nMP (trashcan edition), you don't really have to do that. It's either a software issue, which you can't resolve, or a hardware issue, which you can't resolve either.

This notion that a tower Mac like the one I have (classic Mac Pro 5,1), with PCI slots and whatnot, is "better" because it's more upgradeable is a kind of mirage or illusion that being able to swap components out myself will make my life better.

It doesn't!
The Blackmagic card I mentioned in my previous post was originally sold and advertised as an FCP tool, but for my use case I actually needed it for Photoshop, and only wanted it to display a single frame to correctly judge if gradient banding was occurring in my images (at the time I was dealing with many different 14-bit photos of colored sky). The mere fact that Photoshop under Windows could display it without going through an external card would have saved me a lot of trouble, had OS X behaved the same for me, however glitchy it might have been. Also, I don't see how this was even a "Windows caveat"; the fact was that Apple chose not to do it at all, while MS at least tried. If anything, it was Windows that pushed this forward.

I know a few video professionals whose workflow actually does involve the necessity to see 10-bit colors. In color grading and post, when you are working in the intensely dark shades, artefacts can occur if a drastic filter is used. This is especially important since it is still common for original master footage to be shot in compressed codecs, due to the astronomical file size and bandwidth associated with RAW video codecs.

The nMP form factor was a choice that Apple made, a preference of form over reliability. Even if we put upgradability and user-serviceability aside, the single fact that it heats up more easily than a traditional tower is deal-breaking enough in many professional scenarios. Of course it is imaginable that some users such as you are lucky enough to fall into the target audience of Apple's particular design choice, but it should be equally easy to accept that every specialized design has its share of compromises and limits.
 
The Blackmagic card I mentioned in my previous post was originally sold and advertised as an FCP tool, but for my use case I actually needed it for Photoshop, and only wanted it to display a single frame to correctly judge if gradient banding was occurring in my images (at the time I was dealing with many different 14-bit photos of colored sky). The mere fact that Photoshop under Windows could display it without going through an external card would have saved me a lot of trouble, had OS X behaved the same for me, however glitchy it might have been. Also, I don't see how this was even a "Windows caveat"; the fact was that Apple chose not to do it at all, while MS at least tried. If anything, it was Windows that pushed this forward.

I know a few video professionals whose workflow actually does involve the necessity to see 10-bit colors. In color grading and post, when you are working in the intensely dark shades, artefacts can occur if a drastic filter is used. This is especially important since it is still common for original master footage to be shot in compressed codecs, due to the astronomical file size and bandwidth associated with RAW video codecs.

The nMP form factor was a choice that Apple made, a preference of form over reliability. Even if we put upgradability and user-serviceability aside, the single fact that it heats up more easily than a traditional tower is deal-breaking enough in many professional scenarios. Of course it is imaginable that some users such as you are lucky enough to fall into the target audience of Apple's particular design choice, but it should be equally easy to accept that every specialized design has its share of compromises and limits.

You mean this?

https://www.bhphotovideo.com/c/prod...agic_design_bintspro_4k_intensity_pro_4k.html

I don't know how the form factor affects the reliability of the nMP. It's a vertically oriented, thermally efficient design using one big single fan to push/expel hot air out of the enclosure. The enclosure looks like it was made for natural convection, so no heat is trapped within, thus prolonging lifespan and reliability. But I don't know; I don't have experience with the nMP. It looks thermally right to me, although I am not a thermal engineer, so don't quote me on that. Lol!
 
http://wccftech.com/amd-radeon-rx-490-gpu-listing-benchmarks/
Daily clickbait by WCCF, now featuring Apple Mac Pro
" This card is specially built for use in the next generation of Apple Mac Pro computers. The launch of this card is expected in May 2017 and the leaker mentions that the card would feature Vega architecture."

They're confused/misquoted.

If anything, the talk is about the custom Vega board option for Apple to use in the 2017 iMac (Pro, possibly in black) to be released in Q2.

The article, now edited, no longer lists the word "mac" or "apple" in it.
 
http://www.cultofmac.com/454710/sho...for-third-party-mac-pros-friday-night-fights/




mac-pro-tombstone.jpg


mac-pro-2012-vs-2013-spoof.jpg
 
Interesting commentary. I can't see an official Hackintosh installer happening, seeing how that would open a can of worms and possibly break compatibility with every update. If they licensed it out, it would probably be easier to get it installed on non-authorized PCs, creating a Psystar situation and ultimately cannibalizing Apple hardware.

The ego is costing Apple big.
 
Interesting commentary. I can't see an official Hackintosh installer happening, seeing how that would open a can of worms and possibly break compatibility with every update. If they licensed it out, it would probably be easier to get it installed on non-authorized PCs, creating a Psystar situation and ultimately cannibalizing Apple hardware.

The ego is costing Apple big.

Could Apple develop a distinct version of macOS for a Mac/PC market that would require an onboard ROM chip?
 
Apple won’t license OS X to other mfrs because OS updates with marginal incremental features that benefit mostly the Apple ecosystem are low value. How much do they charge us for an OS upgrade?
 
Apple won’t license OS X to other mfrs because OS updates with marginal incremental features that benefit mostly the Apple ecosystem are low value. How much do they charge us for an OS upgrade?
Back in the day, close to $200 at least in the early OS X days, and prices have progressively dropped to where we are today, where updates are free.

Could Apple develop a distinct version of macOS for a Mac/PC market that would require an onboard ROM chip?

That's a thought, but how hard would it be to keep it under Apple's control? That I don't know, but it would be worth discussing, at least in the context of the article.
 
Again...

Most of what we know about the 2016 (hopefully) nMP 7,1 is personal assumptions and some "anonymous leaks".

Most rumors point to an introduction at WWDC with availability not earlier than Q4'16.

  • Should be based on the Intel C612 server or X99 workstation/prosumer chipset.

  • GPUs x2: most leaks point to AMD GPUs (Polaris and Vega); very few forum speculations still believe an Nvidia-based nMP is possible.

  • Thunderbolt 3 upgrade is a logical evolution, as are USB-C and HDMI 2.

  • CPUs: Xeon E5 v4 family most likely; a few rumors bet on AMD Zen.

  • Storage: also assumed as a logical evolution, either 1x NVMe on PCIe 3 or 2x NVMe on PCIe 2, 2.5 GB/s total throughput...

  • RAM: up to 256 GB possible, as supported by the C612 on 4 RDIMM DDR4 ECC slots.

  • Performance: the Xeon E5 v4 should be up to 630 GFLOPS FP64 compute (CPU only, E5-2699 v4).

  • Performance on upcoming GPUs should speculatively be (per GPU) between 5.5 TFLOPS FP32 (Polaris) and 9 TFLOPS FP32 (Vega); FP64 performance should be from 500 GFLOPS to 4-5 TFLOPS.
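The ~630 GFLOPS FP64 figure for the E5-2699v4 above checks out against the standard peak-FLOPS formula, cores x clock x FLOPs/cycle, if you use an all-core AVX clock of roughly 1.8 GHz rather than the 2.2 GHz base clock. A back-of-the-envelope Python sketch (the 1.8 GHz all-core AVX clock is an assumption for illustration):

```python
def peak_gflops(cores: int, ghz: float, flops_per_cycle: int) -> float:
    """Theoretical peak = cores x clock (GHz) x FLOPs per cycle per core."""
    return cores * ghz * flops_per_cycle

# Xeon E5-2699 v4: 22 cores; AVX2 FMA gives 16 FP64 FLOPs/cycle/core.
print(round(peak_gflops(22, 2.2, 16), 1))  # 774.4 at the 2.2 GHz base clock
print(round(peak_gflops(22, 1.8, 16), 1))  # 633.6 at an assumed ~1.8 GHz AVX clock
```

The lower number lines up with the ~630 GFLOPS estimate quoted in the bullet list, which is why peak-spec marketing numbers and sustained-AVX numbers rarely match.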

The purpose of this thread
is to discuss possible configurations and to share news and discussion about the probable nMP 7,1 components.

Everybody is welcome.
What about Intel Xeon Phi processors? For me, that can be a real one....:)
 
What about Intel Xeon Phi processors? For me, that can be a real one....:)

There will be no new Mac Pro based on Intel-designed CPUs/processors.

Sadly, a 10-core Broadwell-E on Win10 DOUBLES the multi-core Geekbench results of a maxed trashcan.

The Cook only cares about a "future" of iOS domination and short-term profits;
I suspect history will show Apple lost the major OS market share (again) by not listening to customers.

Look at how many folks, especially the younger gen, have switched to Android.
There used to be a time in San Francisco when everyone had an iPhone; now it's far less than 1 in 5.

Apple used to be cool & innovative.
Tim's desperate move to get back the 20s crowd w/ focus on emojis & video memes only backfired.
I now see teens teasing other teens for even having an Apple product - that's what old people use!
Can you imagine Jobs adding emoji touchbars to Macs and built-in video memes to iOS - NEVER!

There is rarely ever a successful business with unhappy customers (Comcast being an exception).
Less than a quarter of the Mac users I've asked are happy with Apple's release history over the last 4 years.
Hopefully the board will realign their priorities and fire The Cook before he burns down the kitchen.

Apple is in desperate need of leadership - not Sculley 2.0

The "Mac Pro" name may be reused in the future;
but the way things are going, it'll be a more drastic change then the nMP was.
Imagine an A chip 8-core iPad base station for example. o_O
 
Could Apple develop a distinct version of macOS for a Mac/PC market that would require an onboard ROM chip?

I think Apple has made it abundantly clear that they no longer care about Pro customers. It's my dream for Apple to release macOS so it can be installed on non-Apple hardware... if I could run it natively on my Hackintosh, without the need for modifying config files, dodgy drivers, and worrying every time you want to do an update, I'd be elated.

But looking at it from a purely business perspective, the cost/time for them to work with vendors, customise/develop the OS, etc. is going to net them next to no profit - when they can just keep pumping out iPhones and iWatches and rake in the profit. I don't think they care any more about "halo" projects or keeping the old faithful customers happy. As much as it galls me to say it.
 