I have two Mid 2010 Mac minis in my possession. One is the Server model, one is the normal non-server one with the optical drive. It is my understanding that on these Mac minis, as well as the 2011 and 2012 models (both server and non-server), the maximum output resolution on the HDMI port is 1920x1200, while the Mini DisplayPort (or Thunderbolt port on the 2011 and 2012 models) can drive up to 2560x1600.
Now, I have both one of the included HDMI to DVI adapters and a Mini DisplayPort to DVI adapter. I currently have the 2010 Server model hooked up to an older ViewSonic monitor with a native resolution of 1680x1050, and with either adapter the Mac mini Server outputs the full 1680x1050; this makes sense, as it is below the HDMI port's maximum.
However, the non-server Mac mini is hooked up to a newer Samsung monitor with a native resolution of 2048x1152. Via the HDMI port with the HDMI to DVI adapter, I'm getting the full 2048x1152 resolution. With the Mini DisplayPort to DVI adapter, I'm also getting the full 2048x1152. The latter result makes sense to me, as it is within that port's maximum output resolution. But the result over the HDMI port is unexpected, as 2048x1152 is greater than 1920x1200.
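Out of curiosity I did a rough back-of-the-envelope check of the two modes. This is only a sketch: the blanking figures below are approximate CVT reduced-blanking values I'm assuming, not timings read from the monitor's EDID, so the pixel clocks are estimates.

```python
# Rough comparison of 1920x1200@60 vs 2048x1152@60.
# Blanking values are approximate CVT reduced-blanking figures (assumed),
# not read from the actual monitor's EDID.

def pixel_clock_mhz(h_active, v_active, h_blank, v_blank, refresh_hz=60):
    """Pixel clock = total pixels per frame (active + blanking) * refresh rate."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

modes = {
    "1920x1200 (RB)": (1920, 1200, 160, 35),
    "2048x1152 (RB)": (2048, 1152, 160, 30),
}

for name, (h, v, hb, vb) in modes.items():
    print(f"{name}: {h * v:,} active pixels, "
          f"~{pixel_clock_mhz(h, v, hb, vb):.0f} MHz pixel clock")

# Single-link DVI tops out around a 165 MHz pixel clock, and both modes
# come in under that; 2048x1152 only has ~2% more pixels than 1920x1200.
```

So if my assumed timings are roughly right, both modes fit under the single-link DVI pixel-clock ceiling, which would at least explain why the adapter can physically carry 2048x1152 even though Apple's stated HDMI maximum is 1920x1200.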
My question is this: is this normal? If I decide to use the HDMI port and adapter for this monitor, will I strain the GPU, assuming it never "wakes up" and realizes it should be outputting a lower resolution? For the time being I'm just going to use the Mini DisplayPort and call it a day, but still, this is curious behavior, no?