From the way I've heard it described by engineers, the "all displays are driven by a single card" model is actually a pretty fundamental part of the design of the OS X display system, and changing it would more or less require rewriting the entire display system from scratch. While it *does* actually work to varying degrees to have different displays driven by different cards, that's more by accident than by intent.
Mmmmmm no. This isn't really getting at what the problem is.
All operating systems assume a display is only driven by one card. Windows, Mac, Linux, whatever. Even all the hardware assumes it. Unless you're plugging a monitor into two different cards, at the end of the day, only one card is driving it.
Crossfire works by slicing the frame up, dividing the slices among cards, and then sending the rendered slices back to the main card powering the display. So if you split the frame in two, one card draws the top half, the other card draws the bottom half, and then the main card puts the two halves together and sends the finished frame off to the display it's connected to. In the past this transfer was done over a dedicated bridge connector, but now it's fast enough to do in software over PCIe.
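To make the split-frame scheme above concrete, here's a minimal sketch in plain Python. Everything here is a stand-in: `render_half` and `composite` are hypothetical names, and lists of pixels substitute for real GPU framebuffers; the point is just the shape of the data flow, not a real driver.

```python
# Split-frame rendering (SFR) sketch: each "GPU" renders half the
# frame, and the primary card composites the halves before sending
# the result to the display it drives. Plain lists stand in for
# framebuffers; render_half stands in for real GPU work.

WIDTH, HEIGHT = 8, 4

def render_half(gpu_id, y_start, y_end):
    # Each card draws only its assigned rows of the frame.
    # Here a "pixel" is just the id of the GPU that drew it.
    return [[gpu_id for _ in range(WIDTH)] for _ in range(y_start, y_end)]

def composite(top, bottom):
    # The primary card stitches the slices back into one full frame.
    return top + bottom

top_half = render_half(0, 0, HEIGHT // 2)          # GPU 0: top rows
bottom_half = render_half(1, HEIGHT // 2, HEIGHT)  # GPU 1: bottom rows
frame = composite(top_half, bottom_half)           # primary card's job
```

The expensive part in real hardware is the `composite` step: the slices have to travel back to the primary card, which is exactly the traffic that used to go over the bridge.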
There actually isn't a problem here. OS X can totally do partial rendering on one card and then send it to another card for final display (they've demoed this at WWDC before). The issue is that it's not automatic: a developer has to write the code by hand. If Apple or AMD wanted to, nothing is stopping them right now from writing Crossfire support. It's just that no one cares.
They may also already have this working for automatic graphics switching. It's likely that when the discrete GPU turns on in your MacBook Pro, the system is just forwarding its rendered 3D output over to the integrated GPU, which still drives the display.
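The render-on-one-card, display-on-another pattern from the last two paragraphs can be sketched the same way. This is a hedged toy model, not an actual API: `Device`, `render`, and `copy_from` are hypothetical names, where a real implementation would use something like shared IOSurfaces or cross-device texture copies.

```python
# Sketch of "render on the dGPU, scan out on the iGPU": the fast
# card does the 3D work, then the card that owns the display
# receives the finished frame. The copy is a plain assignment
# here; in hardware it would be a transfer over PCIe.

class Device:
    def __init__(self, name):
        self.name = name
        self.framebuffer = None

    def render(self, scene):
        # Heavy 3D work happens on this card.
        self.framebuffer = f"pixels({scene})@{self.name}"
        return self.framebuffer

    def copy_from(self, other):
        # The display-owning card just receives the finished frame.
        self.framebuffer = other.framebuffer

discrete = Device("dGPU")
integrated = Device("iGPU")   # this card is wired to the panel

discrete.render("scene")
integrated.copy_from(discrete)
```

Nothing in this flow requires the two devices to be the same card, which is the point being made: the display system only cares who hands it the final frame.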
The other thing to keep in mind is that Crossfire is complicated. Done the wrong way, shuttling resources back and forth between cards can actually hurt performance. That's why Crossfire and SLI ship per-game profiles, and why DirectX 12 requires developers to get their hands a little dirty. There is no multi-GPU implementation anywhere that is totally automatic and just works with everything.
So can classic MacOS, but not OS X, it seems.
I don't think this is true either. Classic Mac OS barely supported OpenGL, much less multiple cards.
OS X's underlying display system, whatever its other issues, doesn't prevent any of this. Nothing is wrong with OS X here, and it's certainly not the result of some kind of outdated UNIX heritage.