
whenpaulsparks

Since I had a hard time finding this information on the forum when trying to decide whether or not to buy the mini-DVI to DVI adapter, I went ahead and bought it and hooked it up over DVI to my RCA 27" tube 1080i HDTV... and it works!

I'm using a black MacBook. With the adapter, just plug it in and go into your Displays preferences. You may need to hit Detect Displays, but after that, the Displays window on your HDTV will say "HDTV" and give you the option of various resolutions, including 1920 x 1080 interlaced, i.e. 1080i. Granted, I wasn't completely impressed with the quality of running a secondary display (not mirrored) at 1080i, so I did mirroring at 1280x800 (the standard MacBook resolution), and it still outputs a 1080i signal at that setting. Hard to explain, I know, but it's NOT 1280x800 that gets sent to the HDTV in this mode, it's 1080 interlaced! It just up-converts the 1280x800 to 1920x1080, much like an upconverting DVD player would.
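
If you want to sanity-check what the Mac is actually reporting after you hit Detect Displays, a quick sketch like this just dumps the display report from the stock system_profiler tool (the exact wording varies by OS X version, and it assumes a reasonably recent Python is installed):

```python
# Rough sketch: dump what OS X currently reports for attached displays, so you
# can confirm the set shows up as "HDTV" with a 1920 x 1080 (interlaced) mode.
# system_profiler ships with OS X; the report wording varies by version.
import subprocess

report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

print(report)
```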

Also, if you want to use Front Row, you'll benefit from mirroring. I also got the mini-optical to optical adapter, hooked it up to my 5.1 unit, and just left my MacBook near the TV. It's nice being able to control all of that from across the room :)

But those of you with 720p HDTVs or standard-def TVs will never understand why us 1080i people get so excited about this! 720p just works, but it takes a certain graphics card to output an interlaced signal, since computers usually use progressive scanning.
 
whenpaulsparks said:
...I went ahead and bought it and hooked it up over DVI to my RCA 27" tube 1080i HDTV... and it works!

What's it like in terms of usability? Does it lag? Do all the OS X graphics work smoothly?
 
whenpaulsparks said:
...but those of you with 720p HDTVs or standard def TVs will never understand why us 1080i people get so excited about this! 720p just works, but it takes a certain graphics card to output an interlaced signal, since computers usually use progressive scanning.
Actually the 720p may be equal or even better than 1080i. It's 1080p that is the "big cheese". ;)
I'm not certain about this, but after reading a lot of this thread and reading around about HD in general, I've come to this conclusion (I may be wrong). The basic reason is that the "i" behind 1080 stands for interlaced. That kind of signal is the same kind we've had all along (much better, but the same kind): an interlaced signal combines two half-frames to "create" the picture. The "p" stands for progressive: a progressive signal is sent in one pass, and this (supposedly) gives a better and more stable picture.

And a question: what do you mean by this?
...a certain kind of graphics card
You do realize that the MacBook doesn't exactly have a graphics card to shout about?

Anyway, it's cool to know it works. :)
 
IEatApples said:
Actually the 720p may be equal or even better than 1080i. It's 1080p that is the "big cheese". ;) ...

Not to say you're wrong, but what makes you think 720p is better than 1080i (other than it being progressive scan)?

/Not trying to argue, honest question
 
What I meant by "720p users won't know why we get so excited about this" is that until recently, graphics cards couldn't output a 1080i signal, but they could do 720p just fine. So most people with 720p HDTVs have no problem configuring it, but it's a PAIN to get a 1080i HDTV working.

As for the graphics card, I mean a certain type of graphics card that supports 1080i output.

As for usability: at the full 1920x1080 it is a bit jumpy. I haven't tried using the DVI output as the main and only display yet; that would probably help performance. But if you do 1280x800 over 1080i, it feels a lot more responsive.

But 1080i does not equal 540p, in much the same way that 480i does not equal 240p. Sure, only 540 lines are drawn per pass, but the full 1080 lines are shown every 30th of a second, which is faster than the eye can resolve, so you do effectively see all 1080 lines (assuming your eye can make out that much detail), and in terms of detail it is higher quality than 720p. Also remember that there are horizontal pixels too: 1080i has 1920 horizontal pixels, while 720p only has 1280. As for motion, there are interlacing artifacts, so 720p is generally preferred for sports and film, but for normal TV broadcasts 1080i is preferred because you get a much sharper picture. Only ESPN and ABC in my area broadcast 720p... everything else (TNT, NBC, CBS, INHD, INHD2, DHD, etc.) is in 1080i. Granted, 1080p is much nicer, but it is not a broadcast-supported format. As for HDTVs and Blu-ray/HD DVD, I'd much rather have 1080p than anything else when I can afford it, but for now I have a $294 Wal-Mart RCA HDTV that does 1080i only :)
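
For anyone who wants to check the arithmetic behind that comparison, here is a rough back-of-the-envelope sketch (assuming the standard 60-fields-per-second 1080i timing):

```python
# Back-of-the-envelope numbers behind the 1080i vs. 720p comparison above,
# assuming the standard 60-fields-per-second 1080i timing.
pixels_1080 = 1920 * 1080               # 2,073,600 pixels in a full 1080-line frame
pixels_720 = 1280 * 720                 #   921,600 pixels in a 720p frame

fields_per_second = 60                  # 1080i sends 540-line fields 60 times a second
frame_interval = 2 / fields_per_second  # two fields make one complete frame: 1/30 s

print(f"1080i frame: {pixels_1080:,} px vs. 720p frame: {pixels_720:,} px "
      f"({pixels_1080 / pixels_720:.2f}x the pixels)")
print(f"A complete interlaced frame arrives every {frame_interval:.4f} s")
```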
 
Don't get too excited, folks... 1080i resolution is 1366x768, so if your TV will only display 1080i, you're only going to get 1366x768. If your TV will display 720p, then you're going to get 1280x720. The only way to get 1920x1080 is to have a 1080p TV, because the resolution of 1920x1080 is 1080p. Check the thread linked above started by Shard if you have any other questions.
 
Hate to tell you, but 1080i is still 1920x1080; it simply scans half the image to the screen per pass rather than the entire thing. The visual resolution is identical; the quality difference would only be noticeable in fast-moving graphics/video sequences.
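
To make the "half the image per pass" point concrete, here is a toy sketch (a deliberate simplification, not how a real deinterlacer works) showing that weaving the two 540-line fields back together gives exactly the same 1080 lines a progressive frame carries:

```python
# Toy model of interlacing: two 540-line fields woven back into one
# 1080-line frame (a simplification; real deinterlacing is more involved).
FRAME_LINES = 1080

odd_field = [f"line {n}" for n in range(1, FRAME_LINES + 1, 2)]    # lines 1, 3, 5, ...
even_field = [f"line {n}" for n in range(2, FRAME_LINES + 1, 2)]   # lines 2, 4, 6, ...

# Interlaced: each pass carries half the lines; weaving the fields rebuilds the frame.
woven = [None] * FRAME_LINES
woven[0::2] = odd_field
woven[1::2] = even_field

# Progressive: all 1080 lines arrive in a single pass.
progressive = [f"line {n}" for n in range(1, FRAME_LINES + 1)]

assert woven == progressive  # same spatial resolution, just delivered in two passes
```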
 
whenpaulsparks said:
...but 1080i does not equal 540p, in much the same way that 480i does not equal 240p. Sure, only 540 lines are drawn per pass, but the full 1080 lines are shown every 30th of a second...
Good stuff! :)
 