
TomKing

macrumors newbie
Original poster
Apr 17, 2011
Hey Guys,

So I've just got my new Mac Pro in the post. I've got to say it is lightning quick, even before I've plugged the new SSD in! Can't wait for that to arrive tomorrow!

But I've run into an issue: I'll be using my LG 1080p LCD as the main screen, but the Mac will only let me set the resolution to 1600x900... Problem. I can't see why.

It's the base model quad-core Mac Pro, connected via a DVI-to-HDMI cable. Now, when I was using this very same cable to connect to the MacBook (with the DisplayPort-to-DVI adaptor), it was happy putting 1080i through.

So can anybody please help? I'm pretty desperate.

Thanks in advance.
 
Use DVI. The problem is the HDMI link. Your Mac can output at least 2560x1600. Are you running a TV or a display? 1080i is not what you want. You want 1920x1080, which is correct 1080p. I've had all kinds of issues with unavailable resolutions through HDMI links depending on display types and support.
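If you want to double-check what mode the Mac is actually driving, here's a quick sketch (my own illustration, using the standard CoreGraphics / Quartz Display Services calls from Swift, nothing specific to your setup) that prints the main display's current mode:

```swift
import CoreGraphics

// Ask Quartz which mode the main display is currently running.
// CGMainDisplayID() and CGDisplayCopyDisplayMode() are standard
// Quartz Display Services calls; this just prints what they report.
if let mode = CGDisplayCopyDisplayMode(CGMainDisplayID()) {
    print("Current mode: \(mode.width)x\(mode.height) @ \(mode.refreshRate) Hz")
}
```

If that prints 1600x900, the GPU really is driving the panel at 1600x900 rather than scaling a higher mode down.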
 
Computers and home theater are both dear to my heart and I hang out in forums for both. Without question HDMI is both great and a complete nightmare. When it works, it is fantastic. When it doesn't, it is a headache and a half.

I have seen many, many problems with HDMI connectivity/compatibility issues. It gets worse when you add HDCP into the picture.

I agree with the poster above: use DVI or DisplayPort if you can.
 
You want 1920x1080, which is correct 1080p

I thought that 1080i and 1080p were both 1920x1080, just one was an interlaced signal (so only half the lines are shown at any one time) and the other progressive (all the lines, all of the time).

But I didn't make myself clear: the DVI-to-HDMI cable that originally connected the MacBook (with the aid of a DisplayPort-to-DVI adaptor) to the HDMI input of the telly (capable of 1080p) was capable of transmitting 1080i, which was the limit set by the MacBook's GPU.

However, when I use the same cable without the adaptor, connecting directly from the DVI port on the Mac Pro's GPU to the same HDMI input on the LG telly, I only get 1600x900.

Surely that can't be the cable's fault, as it's already shown it's capable of 1080i, yet when connected to a far superior machine, without the aid of an adaptor, it's only capable of 1600x900.

I'm aware that 1600x900 and 1920x1080 are both 16:9 aspect ratio, so surely this has to be software or driver based?

Also, a big problem is that the only HD input I have on the telly is HDMI; no DVI there, I'm afraid.
 
I thought that 1080i and 1080p were both 1920x1080, just one was an interlaced signal (so only half the lines are shown at any one time) and the other progressive (all the lines, all of the time).

1080i is technically 1920x540, with the TV trying to do magic to guess what the missing lines should be. Physically you will see 1920x1080, but it's going to be much lower quality as it's hard for the computer to fill in the blanks.

But I didn't make myself clear: the DVI-to-HDMI cable that originally connected the MacBook (with the aid of a DisplayPort-to-DVI adaptor) to the HDMI input of the telly (capable of 1080p) was capable of transmitting 1080i, which was the limit set by the MacBook's GPU.

The MacBook should be able to do 1080p.

A lot of this has to do with the TV. Some TVs are just plain bad at working with the proper resolutions over DVI.

Surely that can't be the cable's fault, as it's already shown it's capable of 1080i, yet when connected to a far superior machine, without the aid of an adaptor, it's only capable of 1600x900.

It's likely a combination of the GPU and the TV. Your TV is supposed to tell the computer what resolutions it can do. When this doesn't happen, things start to go to hell.

I'm aware that 1600x900 and 1920x1080 are both 16:9 aspect ratio, so surely this has to be software or driver based?

Kind of. You can usually override the resolution using something like SwitchResX, but if your TV isn't reporting what resolutions it can do, your GPU just has to start guessing. The two different GPUs or drivers are guessing two different things.
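To see exactly what your GPU thinks the TV can do, here's a rough sketch (my own illustration, not how SwitchResX works internally) that lists every mode Quartz reports for the main display; if the TV's EDID isn't making it across the DVI-to-HDMI link, 1920x1080 simply won't show up in the list:

```swift
import CoreGraphics

// List every mode the GPU believes the main display supports.
// If the TV isn't reporting its capabilities (EDID) properly over
// the link, 1920x1080 won't appear here at all.
let display = CGMainDisplayID()
if let modes = CGDisplayCopyAllDisplayModes(display, nil) as? [CGDisplayMode] {
    for mode in modes {
        print("\(mode.width)x\(mode.height) @ \(mode.refreshRate) Hz")
    }
}
```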

Also, a big problem is that the only HD input I have on the telly is HDMI; no DVI there, I'm afraid.

And this sort of thing is why I gave up using a TV as a display and bought a monitor. :)
 
Haha, twin 24" are in the horizon, but managed to pop an audio monitor the other day, so the tele is having to do the job. while I save for new speakers. :eek: SwitchResX actually did the trick.

I hate to pay for such a simple problem, but well, its cheaper than a monitor right now.

thank you for the advice guys :)

Still surprised this is even an issue. haha.
 
1080i is technically 1920x540, with the TV trying to do magic to guess what the missing lines should be.

This is incorrect: it's two fields of 1920x540. Some of the early TVs couldn't decode the 2nd field properly, and you ended up with the TV showing the same field twice. Some TV stations broadcast improperly flagged content, which also didn't help.
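For anyone keeping score, the arithmetic behind the fields-vs-frames point (just illustrative numbers, not anything from the posters above):

```swift
// 1080i sends two interlaced fields of 1920x540 per full frame;
// 1080p sends complete 1920x1080 frames.
let width = 1920
let fieldLines = 540              // lines in one interlaced field
let frameLines = fieldLines * 2   // 1080 lines once both fields are woven together

let pixelsPerField = width * fieldLines   // 1,036,800 pixels per field
let pixelsPerFrame = width * frameLines   // 2,073,600 pixels per full frame

// At 60 fields per second, 1080i works out to the equivalent of
// 30 full frames per second, split across alternating odd/even fields.
print("1080i: \(pixelsPerField) px per field, \(pixelsPerFrame) px per woven frame")
print("1080p: \(pixelsPerFrame) px per frame")
```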
 
Haha, twin 24" are in the horizon, but managed to pop an audio monitor the other day, so the tele is having to do the job. while I save for new speakers. :eek: SwitchResX actually did the trick.

I hate to pay for such a simple problem, but well, its cheaper than a monitor right now.

thank you for the advice guys :)

Still surprised this is even an issue. haha.

Be glad you don't have a Sony TV. Even worse. Found out the hard way with my father-in-law. He kept on about "You're supposed to be the expert" :mad:
 