GTX 1080 is working great on my machine! Except for this: In Google Chrome, YouTube videos look really washed out. It's really bizarre. Anyone else experiencing this? Anyone know of a fix?

Don't use Google Chrome:D

There used to be a way of controlling the gamma for video in the nVidia Control Panel. Oh, that's right, Mac users never got that, even though Linux and Windows have had it for 20 years or so.

In reality, nVidia will keep offering beta drivers for PC cards so their board partners have a bigger market that neither they nor nVidia has to support in any way ($$$), until OS X users stop buying their products.
 
Here's another dumb thing: If I turn off Chrome's hardware acceleration, the gamma goes back to normal on YouTube. But the problem is that I need hardware acceleration....
 
Does it happen with HWA in Safari too?

That should tell us whether it's a driver bug or a Chrome bug, though I think Apple still controls the HWA video stack.
 
I'm not sure how to enable/disable hardware acceleration in Safari. But the videos look great in Safari. Not washed out at all. The videos freeze in Safari though, if I change the quality setting. It's annoying; I have to refresh the page just to watch a video in Safari.
 
Could you provide a link to what you are watching?

Is this video freeze in Safari something that predates your video card upgrade, or is it also a new bug?
 
EDIT: Here's a screenshot. Left is Chrome, right is Safari. It's subtle, but it's still really annoying. Chrome looks more milky.
[Attached screenshot: Chrome (left) vs. Safari (right)]

Can it be due to Google using their own Blink rendering engine?
 
The more I look at it, the more I realize that it's not just YouTube. It seems to be the whole monitor. (And for some reason, Safari is just a little more contrasty than Chrome.) It seems that macOS defaults to limited-range RGB (even though I'm using DisplayPort, not HDMI).
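For anyone wondering what limited-range output actually does to the picture: in 8-bit "video level" RGB, black is coded at 16 and white at 235 instead of 0 and 255, so if the display expects full-range input, blacks come out dark grey and the whole image looks milky. A quick illustration of the standard scaling (plain Python, nothing macOS- or NVIDIA-specific):

```python
# Standard 8-bit full-range -> limited-range ("video level") mapping.
# Nothing here is specific to macOS or the NVIDIA driver; it just shows
# why a limited-range signal displayed as full-range looks washed out.

def full_to_limited(v: int) -> int:
    """Map a full-range value (0-255) to limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

for v in (0, 128, 255):
    print(f"full {v:3d} -> limited {full_to_limited(v):3d}")

# full   0 -> limited  16   (pure black displayed as dark grey)
# full 128 -> limited 126
# full 255 -> limited 235   (pure white displayed as light grey)
```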

I also have no way of knowing whether my GPU is outputting 8-bit or 10-bit signals to my monitor (yes, I have a 10-bit monitor). In Windows, there is the NVIDIA Control Panel, and I get these really helpful options:

[Screenshot: NVIDIA Control Panel output options in Windows]


It would be really nice to have these options in macOS too. Full range color is extremely important to my workflow, and if the NVIDIA settings can't even show me what's being outputted to my monitor, how am I supposed to know what's being outputted to my monitor?? There seem to be no controls for this type of thing in macOS, so it appears I'm just stuck...

EDIT: Hmm... Now I'm doubting my second theory again... It's just hard to know what the problem is without having a tool like NVIDIA Control Panel... Weird. :(
 
if the NVIDIA settings can't even show me what's being outputted to my monitor, how am I supposed to know what's being outputted to my monitor??

You shouldn't trust what the Nvidia settings say in Windows anyway, because the only way to be sure that the end result is 10-bit is to test the entire path (OS, GPU, drivers, application, application settings, media file type, physical display, display settings, display cable type). Any single entity in the path not supporting 10-bit will "break the chain" and the final output won't be 10-bit.

I suggest you open Photoshop or whatever you're using, create a gradient from 100% black to 100% white across the width of the screen. If the transition is smooth, you are 10-bit. If you see banding/posterization/dithering, you are not.

If it is a video application you are using, download one of the 10-bit test pattern videos.
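
If you'd rather generate the ramp than draw it in Photoshop, here's a minimal sketch that writes a 16-bit greyscale gradient PNG (Python with NumPy and Pillow assumed installed; the file name and dimensions are arbitrary):

```python
import numpy as np
from PIL import Image

# Make the ramp as wide as your screen so each step spans several pixels.
WIDTH, HEIGHT = 3840, 400

# 16-bit ramp from black (0) to white (65535) across the full width.
ramp = np.round(np.linspace(0, 65535, WIDTH)).astype(np.uint16)
gradient = np.tile(ramp, (HEIGHT, 1))

# Save as a 16-bit greyscale PNG ("I;16" assumes a little-endian machine).
Image.frombytes("I;16", (WIDTH, HEIGHT), gradient.tobytes()).save("ramp16.png")
```

Viewed at 100% in a colour-managed, 10-bit-aware app, the transition should look smooth on a true 10-bit path; on an 8-bit path you'll see roughly 256 visible steps of banding.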
 
Hi all,
I just bought my GTX 1080 (EVGA GeForce GTX 1080 SC GAMING, 08G-P4-6183-KR, 8GB GDDR5X, ACX 3.0 & LED), but I don't get PCIe 2.0 speeds with the OpenCL OceanWave benchmark. Could this be a bug in the OceanWave benchmark?

Yes. I ran OceanWave and got the same BS results with my Gigabyte GTX 1080. It's also reporting the card speed wrong.

[Attached screenshots: OceanWave results for a GTX 780 and the GTX 1080]

Lou
 
Does that mean that the card cannot run at full PCIe 2.0 speed with the macOS Pascal drivers and EFI Mod?

Lou is one of the first ones to have the flashing done, and maybe the MVC code is not 100% ready. Or OceanWave needs an API update to test accurately.
 
No. It means that something is wrong with the OceanWave test. Note that angelsevov got similar results with a non-flashed card. It seems there's a problem with OceanWave and Pascal, not with MVC's EFI. Cuda-Z is reading correctly:

[Attached screenshot: Cuda-Z readout]

And by the time the machine is booted, the EFI is out of the picture (no pun intended); it is used only during the boot process, for the boot screen.

Lou
 
I also have no way of knowing whether my GPU is outputting 8-bit or 10-bit signals to my monitor (yes, I have a 10-bit monitor).

My 4K monitor supports 10-bit (8-bit + FRC). In OS X I can activate the 10-bit mode with SwitchResX (billions of colors).
[Attached screenshot]
 
For instance, I'm a film editor, Avid-qualified on Media Composer... In my line of work, most studios are still running on El Capitan - OK, some are even still on 10.8.5. Well, I like to have choices, and that might involve sticking to a GTX 980 Ti.
Peace.

Now that Script Sync & Phrase Find are back on board, places can finally update from MC 7.0.4 / OS 10.8.5 if they want to! Did you see where Marianna said GPU rendering acceleration is in development? If it comes in MC 8.8 or 8.9, your 980 Ti would become a bigger bang for the buck! :D
 
It seems that macOS defaults to limited-range RGB (even though I'm using DisplayPort, not HDMI).


Try this: http://www.mathewinkson.com/2013/03...ix-the-picture-quality-of-an-external-monitor

I had a Dell DisplayPort monitor that macOS saw as a "TV" in settings and gave me washed-out colors too; this patch fixed everything for me.
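
For the curious, the linked patch works by installing a display override with a modified EDID so macOS stops treating the monitor as a TV and uses full-range RGB. The original is a Ruby script; below is a rough Python sketch of the same idea. The ioreg parsing, byte tweaks, and override location follow the commonly shared patch-edid script, so treat it as illustrative only, verify against the article before using it, and note that SIP blocks writes under /System on newer macOS.

```python
# Illustrative sketch of the EDID-override trick from the linked article.
# It reads the connected display's EDID, forces "RGB 4:4:4 only" in the
# feature-support byte, drops the extension block that makes macOS treat
# the panel as a TV, fixes the checksum, and prints the override plist
# that would go under .../Displays/.../Overrides/. Verify before use.
import plistlib
import re
import subprocess

out = subprocess.check_output(
    ["ioreg", "-l", "-d0", "-w", "0", "-r", "-c", "AppleDisplay"], text=True
)
# Assumes a single external display; with several, match the right node.
edid_hex = re.search(r"IODisplayEDID.*?<([0-9a-f]+)>", out).group(1)
vendor = int(re.search(r"DisplayVendorID.*?(\d+)", out).group(1))
product = int(re.search(r"DisplayProductID.*?(\d+)", out).group(1))

edid = bytearray(bytes.fromhex(edid_hex))[:128]   # keep base block only
edid[24] &= 0b11100111                            # advertise RGB 4:4:4 only
edid[126] = 0                                     # no extension blocks
edid[127] = (256 - sum(edid[:127])) % 256         # recompute checksum

override = {
    "DisplayVendorID": vendor,
    "DisplayProductID": product,
    "DisplayProductName": "Display with forced RGB",
    "IODisplayEDID": bytes(edid),
}
print(f"Override path: DisplayVendorID-{vendor:x}/DisplayProductID-{product:x}")
print(plistlib.dumps(override).decode())
```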
 
Oh that's awesome! I'll have to try SwitchResX!

I've used SwitchResX for a while and it's great. If your monitor supports 10-bit, it will auto-detect it and allow the billions-of-colors option. (I use this with my Titan X and an LG 31MU97 DCI 4K 10-bit monitor.)

You can double-check whether 10-bit output has been activated under System Report in the Displays section, under Pixel Depth.
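
If you'd rather check from Terminal than dig through System Report, here's a small sketch using the stock system_profiler tool (Python just for the filtering; "30-Bit Color (ARGB2101010)" is the value that indicates a 10-bit-per-channel framebuffer):

```python
import subprocess

# Query the Graphics/Displays section of System Information and print any
# depth lines; "30-Bit Color (ARGB2101010)" means a 10-bit-per-channel
# framebuffer, "32-Bit Color (ARGB8888)" an 8-bit one.
report = subprocess.check_output(
    ["system_profiler", "SPDisplaysDataType"], text=True
)
for line in report.splitlines():
    if "Pixel Depth" in line or "Framebuffer Depth" in line:
        print(line.strip())
```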
 

Try the IINA player and, with the right footage, you will immediately see the difference: https://lhc70000.github.io/iina/

It is a pity that I can't activate the 10-bit mode in Windows. The Nvidia drivers are locked to 8-bit with my GTX card... Only Quadro cards are supported: http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus

I thought that too!

But apparently the Nvidia driver on Windows will have an option for 10-bit if your monitor supports it, even if you're using a GeForce Pascal card, such as mine, a GTX 1070 on Windows 10:

[Screenshot: NVIDIA Control Panel on Windows 10 showing a 10 bpc "Output Color Depth" option]


Of course, this is useless for me because I don't have a 10-bit workflow. Lol!

PS: "10 bpc" under "Output Color Depth" is what we're looking at. If your monitor is 8-bit, it won't have a 10-bit option, but if it is 10-bit, it can toggle between 8 bpc and 10 bpc. As I said, I leave mine at 8-bit since I don't have a 10-bit workflow anyway.
 
OceanWave seems to be working correctly (sometimes) in 10.12.5. I tried it this morning after updating and got the following:

[Attached screenshot: OceanWave result on 10.12.5]

I then loaded it again and got the screwed-up number, reloaded it, and it loaded correctly. However, it's still reporting the card speed wrong.

Lou
 