
Ruahrc

macrumors 65816
Original poster
Jun 9, 2009
I have two Macs: a laptop running Snow Leopard and a (newer) Mac mini running Lion.

Before I bought my Mac mini, I used to connect my laptop to my HDTV with a miniDP>DVI adapter feeding a DVI>HDMI cable into the TV. Since I own a colorimeter, I calibrated the TV so its output matched my laptop screen. Everything looked and worked great.

When I got my new Mac mini, I set it up so I could connect it to the TV using the mini's HDMI output. Since my colorimeter did not have Lion-compatible profiling software, I simply took the .icc profile created on my laptop and put it on the Lion machine. The display is the same and it's still a digital connection, so I figured using the old .icc profile in the meantime would produce an identical result. This same approach had worked with my dual monitors, connected via TB>DVI and HDMI>DVI.
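In case it helps anyone replicate the profile move: it's just a file copy into the per-user ColorSync folder, then selecting the profile under System Preferences > Displays > Color. A rough Python sketch (the source volume and filename here are made up):

```python
# Minimal sketch: install a display profile copied from another machine.
# "HDTV-calibrated.icc" and the source volume are placeholders for
# whatever your profiling software actually produced.
import shutil
from pathlib import Path

src = Path("/Volumes/LaptopBackup/HDTV-calibrated.icc")   # hypothetical source
dst = Path.home() / "Library/ColorSync/Profiles" / src.name

dst.parent.mkdir(parents=True, exist_ok=True)
shutil.copy2(src, dst)   # copy2 preserves the file's timestamps
print(f"Installed {dst}; pick it in System Preferences > Displays > Color")
```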

Well, it doesn't. The picture on the TV is noticeably more washed out than the output from the laptop.

I think it has something to do with output levels, and OS X treating the TV as a TV rather than a monitor now that it is hooked up via direct HDMI instead of DVI. The reason I say this is that there are two standards for video output levels: monitor/full range (0-255) and TV/video range (16-235). I suspect the new setup with the Mac mini uses the 16-235 video levels instead of the monitor-based 0-255, hence the washed-out appearance despite using the same .icc profile and the same TV settings.
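To illustrate the mismatch I suspect: if the mini squeezes full-range values into video range but the TV still interprets the signal as full range, black rises and white drops, which is exactly the washed-out look. A quick sketch of the arithmetic (illustrative only):

```python
# Sketch of the full-range -> video-range squeeze suspected above.
# If the source remaps 0-255 into 16-235 but the display still reads
# the signal as full range, contrast collapses toward the middle.

def full_to_video(level: int) -> int:
    """Map a full-range (0-255) level into video range (16-235)."""
    return round(16 + level * (235 - 16) / 255)

for level in (0, 128, 255):
    print(f"full-range {level:3d} -> video-level {full_to_video(level):3d}")
# full-range   0 -> video-level  16   (black is no longer black)
# full-range 128 -> video-level 126
# full-range 255 -> video-level 235   (white is dimmer)
```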

The other reason I suspect this is that when I connect the TV to the laptop, it shows up as a monitor: I can select a "1920x1080" resolution, and there is no overscan slider. When I connect the TV to the Mac mini, however, the resolution options are given in "TV" terms, i.e. 1080p. There is also a slider for overscan, and the refresh rates are listed in "TV" terms as well (NTSC, PAL, etc.).

Is there any way to force OS X to treat the HDMI output as a monitor and not a TV? That way (hopefully) I would get the full 0-255 output levels and a better picture. I am thinking of replacing the HDMI cable I currently use with a DVI>HDMI cable, plugged into the HDMI>DVI adapter that came with my Mac mini. Hopefully that will fool the Mac into thinking the display hooked up to it is a monitor and not a TV, even though this kind of doesn't make sense, since the signal protocol for HDMI and DVI is the same.

Notably, I get no such effect when I connect a standard monitor to the HDMI port using an HDMI>DVI cable. The resolution and refresh-rate options are given in "monitor" terms, like 1280x720 rather than 720p, etc.
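From what I understand (and I'm guessing here), the TV-vs-monitor decision probably comes from the display's EDID rather than the physical connector: TVs usually advertise a CEA-861 extension block after the 128-byte base EDID, while plain DVI monitors often don't. A hypothetical check against a raw EDID dump, just to show the idea (the filename is a placeholder):

```python
# Hedged sketch: look for a CEA-861 extension block (tag 0x02) in a raw
# EDID dump. Its presence is one plausible signal the OS could use to
# decide a display is a TV.

def has_cea_extension(edid: bytes) -> bool:
    """Return True if the EDID carries a CEA-861 extension block."""
    if len(edid) < 128:
        raise ValueError("EDID base block is 128 bytes")
    num_ext = edid[126]                    # extension count lives at byte 126
    for i in range(1, num_ext + 1):
        block = edid[128 * i : 128 * (i + 1)]
        if block and block[0] == 0x02:     # 0x02 = CEA-861 extension tag
            return True
    return False

with open("tv_edid.bin", "rb") as f:       # hypothetical raw EDID dump
    print("Looks like a TV to the OS:", has_cea_extension(f.read()))
```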

Anyone else run into this issue?

I have a 2009 Mac mini, and when I installed Lion I had the same issue. Right now I'm running 720p, which was not something I could select before; otherwise the colors all seem off. I'm connected through a MDP>HDMI adapter that I got from Monoprice. Prior to Lion everything was fine, and I chose the same resolution as you did on my 47" LCD TV. Now I seem to have issues with color and such.

I don't have a solution but I have the same problem even on a different Mac Mini.
 