I'd be happy if they'd just play nice and deliver drivers personally; this 8K part I'm rooting for in the professional community is above my pay grade, really, LOL. Photography is my primary creative application, but Excel gets its fair share of workouts. Most people don't realize how data-intensive trend analysis can get when you're talking about 20-200K data points spread across multiple worksheets. I've just always focused on hardware, since software comes second and is "easier" to modify (not by me, but by other, more talented people). There is nothing more beautiful than a driver-update notification ping when you've been limping along with buggy drivers.
I was under the impression 10-bit has been working in Windows for a while.
Now, I don't have a 10-bit monitor and don't handle that kind of media, so I've paid it no heed; I could adjust the setting, but the difference wouldn't be visible or noticeable to me if I did. Are you saying this function doesn't work on your Titan XP? I'll take that with a grain of salt, as well as your claim that you own stock in a company you bash nearly as hard as you do. Just an observation; I don't really care what your portfolio looks like, but why would anyone still own stock that has performed the way NVIDIA's did in Q4 2018 unless a turnaround was expected? I prefer win/win scenarios over win/lose ones, though; they tend to be more productive.
I understand that's a resolution/output setting, but one would conclude that enabling it would enable the instruction set throughout the GPU: if it's capable of outputting 10-bit, it's capable of processing it to one degree or another.
"High Dynamic Range, or HDR, isn't a new concept in photography. It isn't even new to PC gaming, as some of the oldest games with HDR (using simple bloom effects) date back to the Valve Source engine (early 2000s). Those apps, however, used the limited 24-bit (8-bit per color, 16.7 million colors in all) color palette to emulate HDR. Modern bandwidth-rich GPUs such as the GTX 1080 have native support for large color palettes, such as 10-bit (1.07 billion colors) and 12-bit (68.7 billion colors), to accelerate HDR content without software emulation. This includes support for 10-bit and 12-bit HEVC video decoding at resolutions of up to 4K @ 60 Hz, or video encoding at 10-bit for the same resolution." - https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/3.html
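The palette sizes in that quote follow directly from bits per channel across the three RGB channels; here's a quick sketch verifying the arithmetic:

```python
# Color-palette size is 2^(bits_per_channel * 3 RGB channels).
for bits in (8, 10, 12):
    colors = 2 ** (bits * 3)
    print(f"{bits}-bit per channel: {colors:,} colors")
# 8-bit  -> 16,777,216      (~16.7 million)
# 10-bit -> 1,073,741,824   (~1.07 billion)
# 12-bit -> 68,719,476,736  (~68.7 billion)
```

Those totals match the 16.7M / 1.07B / 68.7B figures TechPowerUp cites.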
For me, Turing and Apple speaking natively at both the hardware and software level is an ideal solution to many of our problems, but I have read great things leaking about Navi. We'll see what happens in Vegas next month and on through March (end of Q1 2019). If they want to open the door to both, so much the better. The grass is always greener on the other side, and opening up GPU interchangeability eliminates the fear-of-missing-out BS that drives 90% of the disputes between PC, Mac, and Hackintosh users.