This has me perplexed. I just visited a couple of websites that test a monitor's gamma value. You do this by looking at a test image: the gamma value is read at the point where the patterns blend together and can't be distinguished when viewed from a distance. For example, you can check your monitor at this website:
http://www.normankoren.com/makingfineprints1A.html
Scroll down more than halfway to where it says "Gamma and Black Level Chart." The test image is on the right and looks like this:

[gamma and black level test chart]

An example of a gamma of 2.0 looks like this:

[example chart at gamma 2.0]
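For anyone curious how such a chart works: a region of alternating 1-pixel black and white lines averages to 50% luminance in linear light, while a solid gray patch of pixel value v displays at roughly (v/255)^gamma. The two match in brightness when v = 255 * 0.5^(1/gamma). Below is a minimal sketch that generates a comparable test patch; it assumes Pillow is installed, and the dimensions and filename are arbitrary choices for illustration, not anything from Koren's page.

```python
from PIL import Image

def gamma_patch(gamma=2.2, width=200, height=100):
    """Left half: alternating 1-px black/white lines (averages to 50%
    luminance in linear light). Right half: solid gray chosen so it
    matches the left half's brightness at the target monitor gamma."""
    img = Image.new("L", (width, height))
    px = img.load()
    # Solve (v/255)^gamma = 0.5  =>  v = 255 * 0.5^(1/gamma)
    gray = round(255 * 0.5 ** (1 / gamma))
    for y in range(height):
        for x in range(width):
            if x < width // 2:
                # Alternating scan lines; only valid when displayed at
                # exactly 100% zoom (1 image pixel = 1 screen pixel).
                px[x, y] = 255 if y % 2 == 0 else 0
            else:
                px[x, y] = gray
    return img

gamma_patch(2.2).save("gamma_2.2_test.png")
```

Note the comment in the code: the striped half only averages correctly when each image pixel maps to exactly one screen pixel, so any scaling of the image invalidates the test.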
What's weird is that when I test my iMac using the image while it's embedded in the web page, the gamma reads about 1.3, which is way off. But when I save the test image to my desktop and run the test on that copy, it reads the expected 2.2. I've tried two different browsers with the same result.
When I tried the same test on a Dell monitor, the gamma read 2.2 for both the embedded and the saved image. What's causing this?