Actually, a frame in 1080i has only half the pixels of a frame in 1080p; on the other hand, 1080i has twice the field rate. The problem here is interlacing (that's what the "i" stands for): it was developed to get better video quality on CRT displays. But non-CRT displays cannot properly display interlaced video; instead they use progressive scan. So technically every flat-panel HDTV can only display progressive content, and to show 1080i it first has to deinterlace it (something like upscaling).
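
To make that concrete, here's a minimal Python sketch (not anything a real TV actually runs) of the simplest deinterlacing approach, usually called "weave": it just interleaves the even and odd fields of an interlaced signal back into one progressive frame. The field/line layout is made up for illustration.

[code]
# Minimal "weave" deinterlacing sketch: interleave the even-line field
# and the odd-line field of an interlaced frame back into one
# progressive frame. Each "line" is just a list of pixel values here.

def weave(even_field, odd_field):
    """Interleave two fields (e.g. 540 lines each) into one full frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # lines 0, 2, 4, ... of the original frame
        frame.append(odd_line)   # lines 1, 3, 5, ...
    return frame

# Toy example: a 4-line "frame" split into two 2-line fields.
even = [[1, 1], [3, 3]]
odd = [[2, 2], [4, 4]]
print(weave(even, odd))  # [[1, 1], [2, 2], [3, 3], [4, 4]]
[/code]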

Maybe you want to read one or all of these Wikipedia articles; they're quite interesting.

http://en.wikipedia.org/wiki/1080i
http://en.wikipedia.org/wiki/Interlacing
http://en.wikipedia.org/wiki/Deinterlacing

I have read them. They endorse what I said: there is no difference between 1080i and 1080p, since the frame buffer is filled and blasted as one, no matter how it was filled.

-- Mikie
 
It sounds to me like a lot of the misinformation is due to confusing 1080i content with 1080i displays. A 1080p display is superior to a 1080i display. True interlaced displays are only CRTs and some rear-projection screens. An interlaced display shows only half the image at a time, alternating quickly between odd and even lines, which often makes the image appear unstable.
All flat panels and true 1080p displays are progressive. A flat panel that accepts 1080i merely interprets the interlaced signal in order to convert it to the screen's native resolution. Not all TVs are equal at this process. In fact, some low-cost or early screens simply throw out one of the two fields, resulting in what is effectively a 540p image.
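
As a rough illustration of that cheap shortcut (assuming a discard-one-field approach; real scalers vary), here's a Python sketch showing why it lands you at an effective 540 lines:

[code]
# Sketch of a cut-rate deinterlacer: keep only one field and line-double
# it to fill the panel. Vertical detail from the discarded field is lost,
# so a 1080i signal effectively becomes 540p.

def discard_and_double(kept_field):
    """Line-double a single field (e.g. 540 lines) up to full height."""
    frame = []
    for line in kept_field:
        frame.append(line)
        frame.append(line)  # repeat the line instead of using the other field
    return frame

even = [[1, 1], [3, 3]]          # only the even field survives
print(discard_and_double(even))  # [[1, 1], [1, 1], [3, 3], [3, 3]]
[/code]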

1080i content 'can' look the same as 1080p, but only when viewed on a 1080p display that does proper processing to reassemble the fields. Did you know that the first 1080p displays from Sony could only handle 1080i input? That's because the Sony engineers knew this.

The trick is that the 1080i spec carries 60 fields per second, where all of the even lines are in one field and the odd lines are in the next. For 24-frame-per-second content (like movies), the odd and even fields are combined before being displayed on the progressive screen at full resolution. The catch is that 1080i input tops out at 30 full progressive frames per second. (This explanation is simplified; the 3:2 pulldown is described more accurately earlier in the thread.)
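
Here's the back-of-the-envelope version of that pulldown math in Python, just to show how 24 film frames fill 60 fields each second (the 3:2 cadence is the standard one; the code is only an illustration):

[code]
# 3:2 pulldown arithmetic: film frames alternately contribute 3 and 2
# fields, so 24 frames/s become 24 * 2.5 = 60 fields/s of 1080i.

frames_per_second = 24
cadence = [3, 2]  # fields emitted per film frame, alternating

fields_per_second = sum(cadence[i % 2] for i in range(frames_per_second))
print(fields_per_second)  # 60

# A 1080p set that recognises this cadence can pair the fields back up
# and recover all 24 original frames at full 1920x1080 resolution.
[/code]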

True 1080p also runs at up to 60 frames per second, but each frame is a full frame. However, since movie content is only 24 fps, half of the maximum temporal bandwidth potential goes unused.

Long story short: due to current content limitations, the 1080p spec is only able to use half of its potential, allowing 1080i (when pushed to its extreme) to equal the current quality.
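
For anyone who wants the raw numbers behind that, here's the quick pixel-throughput arithmetic (uncompressed pixel counts only, ignoring compression and overhead):

[code]
# Raw pixel throughput of each format, ignoring compression entirely.

width, height = 1920, 1080

p60 = width * height * 60          # 1080p ceiling: 60 full frames/s
i60 = width * (height // 2) * 60   # 1080i: 60 fields/s of 540 lines each
film = width * height * 24         # 24 fps film, full frames

print(p60)   # 124416000 pixels/s -- 1080p's full potential
print(i60)   # 62208000  -- exactly half of it
print(film)  # 49766400  -- 24 fps film fits within the 1080i budget
[/code]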

You might be asking why your Blu-ray disc looks so much better than your 1080i cable content. That's just compression from your provider: the Blu-ray disc is recorded at a MUCH higher bitrate than your TV provider is able to transmit. Your TV provider does this in order to squeeze more channels into the same pipe.
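
To put very rough numbers on it (the Blu-ray figure is near the spec ceiling, and the cable figure is an assumed ballpark, not a measurement of any particular provider):

[code]
# Rough bitrate comparison: how many gigabytes of video data a two-hour
# film gets on disc versus over a typical compressed cable channel.

bluray_mbps = 40   # Blu-ray video can run up to roughly 40 Mbit/s
cable_mbps = 10    # assumed ballpark for a 1080i cable/satellite channel

movie_seconds = 2 * 60 * 60  # two-hour film

def gigabytes(mbps, seconds):
    return mbps * seconds / 8 / 1000  # Mbit/s -> megabytes -> gigabytes

print(gigabytes(bluray_mbps, movie_seconds))  # 36.0 GB of video on disc
print(gigabytes(cable_mbps, movie_seconds))   # 9.0 GB for the same film off cable
[/code]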

As a fun experiment (depending on your setup), try changing your Blu-ray player's output to 1080i and see if you can tell the difference. If all of your components are quality ones, the two will look identical.

Once content begins to exceed 24 frames per second, 1080p will become more and more important.
 
I agree 100%

Seeing the difference between 1080i and 1080p is like hearing the difference between a 320 kbps MP3 and a 400 kbps MP3: no one can tell the difference. 1080p is a marketing ploy and people are buying into it. While technically it makes sense, it doesn't matter, because the difference can't be seen by the naked eye.

I'm so sick of people getting bogged down in this sort of technical cr@p. For anyone who's whinging about the lack of 1080p content: go down to your local TV store and compare the two formats. You simply cannot tell the difference.

To make matters worse, all Plasma & LCD screens are simply not sharp or accurate enough to illustrate the additional detail on the screen. I recently went to my local HiFi store to see how the latest Plasma & LCD screens compare to my Sony Wega CRT screen. They don't. I find it amazing that so many people are accepting such jumpy, blurred images for the sake of larger screens. I've had so many visitors come to our house and stand amazed at the quality of our screen after spending over $5000 on a new 1080i LCD or Plasma screen.

As Nut says, you are getting sucked in by the manufacturers of this equipment.
 
To make matters worse, all Plasma & LCD screens are simply not sharp or accurate enough to illustrate the additional detail on the screen.

Odds are that your computer display is an LCD... Are you saying that your CRT could display the text you're reading right now more sharply or accurately than an LCD? Your 1080i Wega CRT is roughly the same pixel resolution as most modern PC displays. I think it's clear that LCD is the PC display technology of choice, especially when "sharp or accurate" is the objective.
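
Just to put numbers on that resolution point (assuming a 1920x1200 PC monitor as the comparison; plenty of PC LCDs of the time ran at that or 1680x1050):

[code]
# Pixel-count comparison: a 1080 HDTV versus a common PC LCD.

hdtv = 1920 * 1080    # 1080i/1080p HDTV panel
pc_lcd = 1920 * 1200  # assumed example: a typical 24" PC monitor

print(hdtv, pc_lcd)   # 2073600 2304000
print(pc_lcd / hdtv)  # ~1.11 -- only about 11% more pixels than the TV
[/code]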

Wouldn't you agree? I'm sorry, you lost me when you said LCD isn't sharp or accurate enough to display more detail than CRT. That isn't true in a PC environment, so how can it be true in your living room?
 