10-bit support works for full screen DirectX applications only.
Don’t blame Apple for those OpenCL bugs in the web drivers. We documented on this forum and on Tony86 exactly when they started happening. Radeons don’t have these issues.
Soy, I do believe you, but I am also confused.
I have a 10-bit monitor (actually it's an 8-bit panel with dithering, but it announces itself as a 10-bit monitor), and in the Windows GeForce drivers for the 1080 Ti I can set the Output Color Depth parameter to 10-bit color (the options are 8 bpc or 10 bpc). Furthermore, it also seems to support 10-bit color for HDR when connected to my TV.
I am certain you know your stuff. Certainly far more than I do, because you work with it and I'm just an enthusiast. But at the very least there seems to be some sort of 10-bit support. Is the GeForce software doing something like my Dell, where it is simply faking 10-bit color by using an 8-bit palette with dithering? Or is the 10-bit support only for something like HEVC video playback? I'm not sure how else to reconcile what I'm seeing with what you are saying, but perhaps you can elaborate.
Personally I leave mine on 8 bpc because I don't do any Pro work and don't need 10-bit color for anything. My understanding is that if you select 10-bit color but use applications that don't support it (which is what I do), the output will actually be less color accurate. But I'm not sure that's the right thing to do either, and I haven't investigated.
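In case it helps anyone check their own setup: here's a rough sketch (my own, not anything from Nvidia) that asks DXGI how many 10-bit-per-channel (R10G10B10A2) display modes the driver reports on each output. It only shows what the driver claims to expose; whether that ends up as true 10-bit or 8-bit plus dithering outside full-screen DirectX is exactly what's being debated here.

```cpp
// Rough sketch: count the 10-bit (R10G10B10A2) display modes each output reports.
// Build on Windows with MSVC; links against dxgi.lib.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        DXGI_ADAPTER_DESC1 ad = {};
        adapter->GetDesc1(&ad);
        wprintf(L"Adapter: %s\n", ad.Description);

        IDXGIOutput* output = nullptr;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            // Ask how many modes exist for the 10-bit back-buffer format.
            UINT modes = 0;
            output->GetDisplayModeList(DXGI_FORMAT_R10G10B10A2_UNORM, 0, &modes, nullptr);
            printf("  Output %u: %u ten-bit (R10G10B10A2) modes reported\n", o, modes);
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```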
Interesting how Nvidia shows an Apple Cinema Display running macOS on its new RTX creators subsite -
https://www.nvidia.com/en-us/design-visualization/creators/
There's no way this is merely a 'marketing blip'
On this sub-forum over the last 4 years I have constantly reported benchmarks, settings, features, and configs of Nvidia cards from the 6 series to the 10 series on Windows and macOS. I have gone into it extremely thoroughly. I’m sorry that you missed it, but I also see lots of cognitive dissonance, and people selling cards who keep burying what I report.
I have had 10-bit monitors for over a decade. GeForce cards only have 8-bit output, or 8-bit with dithering. This is known very well in the graphics community. Nvidia always wanted users to step up to Quadro for TRUE 10-bit support.
The current state of the GeForce drivers for Windows is a mixed bag. Some full-screen games force 10-bit (really 8-bit with dithering), but creative and workstation applications run in 8-bit only. That’s crap if you want to edit HDR videos and photos.
Here’s a recent discussion
https://www.reddit.com/r/nvidia/comments/8qpr0t/10bit_billions_of_colors_on_geforce_gpus_like/
That’s for Windows. On Mac it is 8-bit only.
AMD has no issue with 10-bit output, even on a lowly Polaris 450 mobile GPU.
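If anyone wants to see the GeForce vs. Quadro split for themselves on the windowed OpenGL path that pro apps historically use, a probe along these lines should do it. This is just my own sketch; the WGL token values are the standard ones from wglext.h, and on a GeForce you'd typically expect it to report that no 10-bit format is offered.

```cpp
// Rough sketch: ask the Windows OpenGL driver whether it will offer a
// 10-bit-per-channel pixel format for a normal window.
// Build with MSVC; link opengl32.lib, gdi32.lib, user32.lib.
#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

// WGL_ARB_pixel_format tokens (normally provided by wglext.h)
#define WGL_DRAW_TO_WINDOW_ARB 0x2001
#define WGL_SUPPORT_OPENGL_ARB 0x2010
#define WGL_RED_BITS_ARB       0x2015
#define WGL_GREEN_BITS_ARB     0x2017
#define WGL_BLUE_BITS_ARB      0x2019

typedef BOOL(WINAPI* PFNWGLCHOOSEPIXELFORMATARBPROC)(HDC, const int*, const FLOAT*,
                                                     UINT, int*, UINT*);

int main() {
    // A throwaway window and context are needed before wglGetProcAddress works.
    HWND wnd = CreateWindowA("STATIC", "probe", WS_OVERLAPPEDWINDOW,
                             0, 0, 64, 64, NULL, NULL, NULL, NULL);
    HDC dc = GetDC(wnd);
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1,
                                  PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL,
                                  PFD_TYPE_RGBA, 32 };
    SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);
    HGLRC rc = wglCreateContext(dc);
    wglMakeCurrent(dc, rc);

    auto choose = (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");
    if (!choose) { printf("WGL_ARB_pixel_format not available\n"); return 1; }

    // Request 10 bits per color channel in a windowed, OpenGL-capable format.
    const int attribs[] = { WGL_DRAW_TO_WINDOW_ARB, 1,
                            WGL_SUPPORT_OPENGL_ARB, 1,
                            WGL_RED_BITS_ARB,   10,
                            WGL_GREEN_BITS_ARB, 10,
                            WGL_BLUE_BITS_ARB,  10,
                            0 };
    int format = 0; UINT count = 0;
    BOOL ok = choose(dc, attribs, NULL, 1, &format, &count);
    printf("10-bit windowed OpenGL format: %s\n",
           (ok && count > 0) ? "offered by the driver" : "not offered (8-bit only)");

    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(rc);
    ReleaseDC(wnd, dc);
    DestroyWindow(wnd);
    return 0;
}
```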
Or it's an old publicity shot running a Quadro, or they just think Windows looks too ugly for marketing material.
Or, you know, I’m right and you’re wrong.
Like I said in my post, I highly doubt this is a hiccup. There’s numerous ways to cover up Windows’ ugliness without blatantly using the Apple logo and OS on your own bleeping website.
It's a possibility, if a new Pro machine is coming out and they got inside knowledge of it. But otherwise, these types of promotional graphics with mocked-up OS screens have been around for 20+ years.
so just came across this
https://egpu.io/forums/builds/2010-...16gbps-tb3-mantiz-venus-win10-1803-theitsage/
Looks like the RTXs may, for real, actually provide boot screens. It's interesting to note from the PCI device listing that there's indeed no Mac-specific information.
It's as if you booted OS X on a GOP x86 PC Hackintosh with no video drivers or graphics injectors.
So Nvidia is just providing the bare minimum to give boot screens on the MP51, and is letting the drivers handle the card as if it's an "un-flashed" card.
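For anyone curious to verify that from a ROM dump rather than screenshots: here's a rough sketch (mine, not from the linked build log) that walks the images inside a PCI option ROM file and prints each image's code type from its PCIR data structure; code type 3 is an EFI (GOP) image, which is what gives you boot screens without any Mac-specific payload. The filename is just a placeholder.

```cpp
// Rough sketch: walk the images inside a dumped PCI option ROM and report each
// image's code type. 0x00 = legacy x86 VBIOS, 0x03 = EFI (GOP) image.
// "rom.bin" is a placeholder for your own dump.
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    FILE* f = fopen("rom.bin", "rb");
    if (!f) { perror("rom.bin"); return 1; }
    std::vector<uint8_t> rom;
    uint8_t buf[4096];
    size_t n;
    while ((n = fread(buf, 1, sizeof(buf), f)) > 0) rom.insert(rom.end(), buf, buf + n);
    fclose(f);

    size_t off = 0;
    while (off + 0x1A < rom.size() && rom[off] == 0x55 && rom[off + 1] == 0xAA) {
        // Pointer to the PCI Data Structure lives at offset 0x18 of each image.
        uint16_t pcir = rom[off + 0x18] | (rom[off + 0x19] << 8);
        size_t p = off + pcir;
        if (p + 0x18 > rom.size() ||
            rom[p] != 'P' || rom[p + 1] != 'C' || rom[p + 2] != 'I' || rom[p + 3] != 'R')
            break;

        uint8_t  codeType  = rom[p + 0x14];                        // 0 = x86 BIOS, 3 = EFI
        uint8_t  indicator = rom[p + 0x15];                        // bit 7 set = last image
        uint16_t len512    = rom[p + 0x10] | (rom[p + 0x11] << 8); // length in 512-byte units

        printf("Image at 0x%06zx: code type %u%s\n",
               off, codeType, codeType == 3 ? " (EFI/GOP)" : "");

        if ((indicator & 0x80) || len512 == 0) break;  // last image in the ROM
        off += (size_t)len512 * 512;                   // advance to the next image
    }
    return 0;
}
```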
What's the current list of known NVIDIA OpenCL bugs? The last time I really saw this discussed in detail was several years ago, and at the time, basically all the bugs had been fixed in the latest web drivers from NVIDIA. I believe there were even a few cases of the bugs being in the application, and the app had been updated to resolve the issue.
This is naturally ignoring the fact that OpenCL is effectively dead, as Apple has announced it is deprecated on their platforms.
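Before digging up old bug lists, it's worth checking exactly which OpenCL and driver versions a given install exposes, since several of those bugs were tied to specific web driver releases. A minimal query looks roughly like this (my own sketch; it links against Apple's OpenCL framework on macOS or an OpenCL ICD elsewhere):

```cpp
// Minimal sketch: list each OpenCL platform, its GPU devices, and the
// CL_DEVICE_VERSION / CL_DRIVER_VERSION strings they report.
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint numPlatforms = 0;
    if (clGetPlatformIDs(8, platforms, &numPlatforms) != CL_SUCCESS) return 1;
    if (numPlatforms > 8) numPlatforms = 8;

    for (cl_uint p = 0; p < numPlatforms; ++p) {
        char name[256] = {};
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(name), name, nullptr);
        printf("Platform: %s\n", name);

        cl_device_id devices[8];
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &numDevices) != CL_SUCCESS)
            continue;
        if (numDevices > 8) numDevices = 8;

        for (cl_uint d = 0; d < numDevices; ++d) {
            char dev[256] = {}, clver[256] = {}, drv[256] = {};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,    sizeof(dev),   dev,   nullptr);
            clGetDeviceInfo(devices[d], CL_DEVICE_VERSION, sizeof(clver), clver, nullptr);
            clGetDeviceInfo(devices[d], CL_DRIVER_VERSION, sizeof(drv),   drv,   nullptr);
            printf("  %s | %s | driver %s\n", dev, clver, drv);
        }
    }
    return 0;
}
```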
I have that guy on ignore because three years ago for many months I listed all the OpenCL bugs, Metal bugs in Premiere, iBooks bugs, uploaded images, uploaded videos, wrote exact instructions how to replicate the bugs and which version drivers introduced the bugs.
He kept changing subject, being extremely rude, trying to deflect and bury my evidence. Again trying the same psychological tactics today after three years.
A few members on this board were giving very bad advice to buyers (using creative apps) that the flashed GeForce cards were the best choice for them. Whenever I spoke about the lack of 10-bit support and the OpenCL bugs, these card shills attacked me and tried to divert the subject.
Some of those buyers took the bad advice and bought the GeForce cards and were very angry with the results (we see their posts everywhere). It was dreadful to see this happen. A real black stain on the community.
https://devtalk.nvidia.com/default/topic/1042279/?comment=5286813
Nvidia dev makes clear that they haven’t started on Mojave drivers yet because they are waiting for a green light.
Kind of BS. Apple doesn't control NVIDIA's drivers, only the drivers that are included in macOS. That's why it has been up to NVIDIA to bring us Web Drivers to manually install. The CUDA problem I can see, but drivers for the actual GPU, no way.
Per the discussion in the FAQ thread, my take on this is that Apple hasn't given NVIDIA the information (i.e. private frameworks/headers) to actually build the driver for Mojave. GPU drivers rely on non-public interfaces to work, and if Apple is withholding that information from NVIDIA, then they can't produce a driver for Mojave.
I confirm I have an RTX 2080 (not the Ti) and it has the EFI boot screen; when you press Alt you can now see the drive.
Could you please provide us with a full system profiler dump of it, and also a drivers -b output from an EFI Shell as requested above? Thanks.
Install rEFIt or rEFInd onto a USB key, boot from it, select the EFI Shell, do a drivers -b, and please take a photo. Alternatively, extract the EFI from the chip (model IS25WP080).
Where's the EFI Shell at, or how do I get to it? I missed it. Extracting the EFI from the chip, though...
Install rEFIt or rEFInd onto a USB key, boot from it, select the EFI Shell, do a drivers -b, and please take a photo.