I think that this is not entirely correct.
The 8-bit or 10-bit input/output has to do only with the gradations: how coarse and visible the steps are (8-bit) versus how smooth and invisible they are (10-bit), and this is where dithering plays a role. How fine the steps are between each shade of a colour is the job of the bits.
But this has nothing to do with the ability of a calibrated high-end display to show the full Adobe RGB gamut, or whatever gamut it's capable of. A display can show full Adobe RGB with 8-bit input, it will just have some visible banding; with 10-bit input it will be a lot smoother.
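To make that distinction concrete, here's a minimal sketch in plain Python (my own illustration, not from any of the articles linked below): both bit depths cover exactly the same black-to-white range, i.e. the same endpoints, so the reachable gamut doesn't change; only the number of steps in between does.

```python
# Minimal sketch: bit depth sets the number of steps between shades,
# not the range of colours the panel can reach.

def quantize(value, bits):
    """Map a normalized 0.0-1.0 level to the nearest code at a given bit depth."""
    levels = (1 << bits) - 1      # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

# Sample a smooth black-to-white ramp at 4096 points.
ramp = [i / 4095 for i in range(4096)]

for bits in (8, 10):
    codes = {quantize(v, bits) for v in ramp}
    print(f"{bits}-bit: {len(codes)} distinct steps over the exact same 0.0-1.0 range")

# Prints 256 steps for 8-bit and 1024 for 10-bit: four times finer
# gradation between the very same endpoints.
```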
Yes, I'll just post this and leave it at that.
The iMac 5K is the first Mac in history to support 10-bit colour, and El Capitan enabled it for the 2013 Mac Pro as well. 10-bit output had never existed on the Mac platform before, not even in third-party drivers.
Adobe couldn't support 30-bit display in the Mac version of Photoshop until just a few months ago, and even then it only works if you have the 10-bit display driver enabled.
https://www.dpreview.com/forums/thread/3538545
http://macperformanceguide.com/blog/2015/20151105_2300-OSX_ElCapitan-10bit-dualNEC.html
http://nofilmschool.com/2015/11/apple-paving-way-10-bit-color-its-latest-operating-system-update
http://petapixel.com/2015/10/30/os-x-el-capitan-quietly-unlocked-10-bit-color-in-imacs-and-mac-pros/
https://www.cinema5d.com/5k-imac-10-bit-color/
But then there's the monitor issue. Lots of sub-€500 monitors these days claim 10-bit support, but this is mostly misleading marketing. They also claim to be able to display 1.07 billion colours 'at once', which is nonsense because:
1. There aren't even close to that many pixels on even the highest-resolution screen. That headline figure is the size of the available palette, not how many colours can be seen on screen at one time (see the rough numbers sketched after this list).
2. Humans with normal vision are trichromats and can distinguish about 10 million colours, at most.
3. Most colourful digital images only contain several thousand distinct colours, and this is where it helps to have hardware and drivers truly capable of showing accurate gradation. Without real driver support for 10-bit output you get pseudo-support such as dithering. Remember some years back when MBP owners took Apple to court over the claim of 'millions of colours' on screens that were actually 6-bit panels using dithering? The same thing could happen now to monitor manufacturers who are not honest in their marketing.
4. True wide-gamut monitors have dedicated ASICs and internal LUTs. The cheap Dells and HPs with so-called wide-gamut support don't offer these features. They aren't bad monitors, and they will do a professional job for a lot of people, but they should never be compared to an Eizo or NEC.
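To put rough numbers on points 1 to 3, here's a back-of-the-envelope sketch (again my own illustration; the 5K panel is just an example of a very dense screen, and the 60-frame FRC simulation is a toy model, not any specific monitor's electronics):

```python
# Points 1 & 2: palette size versus what can physically be on screen at once.
palette_10bit = 2 ** (10 * 3)        # 10 bits per RGB channel -> ~1.07 billion
pixels_5k = 5120 * 2880              # iMac 5K panel resolution
human_limit = 10_000_000             # rough estimate for normal trichromat vision

print(f"10-bit palette:   {palette_10bit:>13,} colours")    # 1,073,741,824
print(f"Pixels on a 5K:   {pixels_5k:>13,}")                # 14,745,600
print(f"So at most {pixels_5k:,} distinct colours can appear in one frame, "
      f"roughly {palette_10bit // pixels_5k}x fewer than the palette, "
      f"and human vision tops out around {human_limit:,} anyway.")

# Point 3: pseudo-support via dithering (FRC). An 8-bit panel fakes an
# in-between 10-bit level by alternating the two nearest 8-bit codes over time.
target = 513 / 1023                  # a 10-bit level that 8-bit can't hit exactly
lo, hi = 127, 128                    # the two nearest 8-bit codes
frac = target * 255 - lo             # how far the target sits between them
frames = 60
hi_frames = round(frac * frames)     # show the brighter code this often
average = (hi * hi_frames + lo * (frames - hi_frames)) / frames
print(f"FRC: alternating {lo}/{hi} averages {average:.3f} vs the true "
      f"target {target * 255:.3f}; close, but it's temporal trickery, "
      f"not a real extra level.")
```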
In conclusion, you can certainly produce professional work on an 8-bit monitor with 8-bit drivers; people did it for years and years. But if you want the absolute best and most accurate results, the Mac platform has always had this one downside for people who really like to go deep into the tiny details. Most Macs just don't support 10-bit output, and that includes all these guys with their Mac Pro towers and sometimes expensive graphics card upgrades. They could install Windows on the same machine and, voilà, 10-bit support.
Btw, it would be good if someone could check whether Sierra has enabled 10-bit support on more than just the iMac 5K and the 2013 Mac Pro.