It's converted to progressive scan for display, but it still bears all the disadvantages of the interlaced source material; you can't ever restore that. You can blow up a 320x240 GIF image to 1920x1080 too, but that doesn't make it a high-definition image, because you're still stuck with poor source material. 1080i is still inferior to 720p in some respects. OK, so I just hate interlaced video, can you tell?
Sorry, I really do understand the confusion and the baggage inherent to the term "interlaced", but that comes from the 480i days, when each interlaced field carried only half of the image and the two fields were captured at different moments in time. This resulted in combing and other artifacts that are nearly impossible to remove.
1080i video is displayed as full 1920x1080 frames at 30fps on a 1080p display. (True 1080p can run at 60fps, but it usually doesn't, because most sources are recorded at 24 or 30fps.)
Here's how it works...
A single 1080i frame consists of two 1920x540 fields. Both of these interlaced fields are created from a single progressive frame's odd and even lines.
Both fields are received and woven back into a single 1920x1080 progressive frame. This is a lossless process that is part of the 1080i spec. Since the two 1080i fields come from the same temporal frame, there is no combing or any of the other artifacts typically inherent in 480i. The resulting frame is pixel-for-pixel identical to the original 1080p frame*.
Since the 1080i spec runs at 60 fields (half-frames) per second, it can display up to 30 full progressive frames per second.
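Here's a minimal sketch of that split-and-weave process, just to make it concrete (Python with numpy; the frame is made-up data and the variable names are mine, not from any spec):

```python
import numpy as np

# Stand-in for one 1080p progressive frame (height x width, grayscale for brevity).
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

# Split the frame into two 1920x540 fields: the even lines and the odd lines.
top_field = frame[0::2]      # lines 0, 2, 4, ... -> 540 x 1920
bottom_field = frame[1::2]   # lines 1, 3, 5, ... -> 540 x 1920

# Weave the two fields back together into one progressive frame.
woven = np.empty_like(frame)
woven[0::2] = top_field
woven[1::2] = bottom_field

# Both fields came from the same source frame, so the weave is lossless.
assert np.array_equal(woven, frame)
```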
Keep in mind that this is only true for 1080p displays. A 1080i display will show only half of the frame at a time, every 1/60 of a second, leading to a shaky image that I find much worse than 720p.
Another interesting point: while the temporal pixel density (pixels delivered per second) of 720p and 1080i is nearly the same, that assumes 720p at 60fps. Since 720p content is typically at 30fps or less, whereas 1080i runs at 60 fields per second (each full frame is sent in two halves), the typical temporal pixel density of 1080i is about twice that of 720p.
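To put rough numbers on that (a back-of-the-envelope sketch; the frame rates are the typical values assumed above, not measurements):

```python
def pixels_per_second(width, height, frames_per_second):
    """Temporal pixel density: full pixels delivered per second."""
    return width * height * frames_per_second

p720_60 = pixels_per_second(1280, 720, 60)   # 720p at 60fps: ~55.3 million/s
p720_30 = pixels_per_second(1280, 720, 30)   # typical 720p content: ~27.6 million/s
i1080   = pixels_per_second(1920, 1080, 30)  # 1080i: 60 fields = 30 full frames: ~62.2 million/s

print(i1080 / p720_60)  # ~1.13 -> nearly the same as 720p at 60fps
print(i1080 / p720_30)  # ~2.25 -> roughly twice typical 720p content
```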
Did you know that Sony's first 1080p displays only had 1080i inputs? That's because, for movies, it doesn't matter.
I hope this was clear, because I know it's not obvious.
* I'm not 100% sure, but 1080i may be limited to 4:2:2 chroma subsampling, whereas a 1080p signal can be 4:4:4 (though it usually isn't).
** For gaming I would use 720p over 1080i, since frame rates are likely over 30fps.