1080 is literally 2.25 times the resolution of 720P

Not to be picky, but your math is way off... 1080/720 = 1.5. Not to mention that 720p actually contains more vertical detail than a 1080i picture. 1080p is a different story, of course.

Also, in my case I have a native 720p set, so nothing's getting upscaled to 1080. I obviously can't comment on the quality difference between upscaled 720 and native 1080, but one would assume 1080 looks better than upscaled 720 on a native 1080 set. Although I do have to consider the compression artifacts I see in my satellite provider's picture quality (DISH HD). Why would I want to see those in 1080 anyway? I'm thinking they would look even worse if I had a 1080 set. Until the providers up their standards, I'm perfectly content with 720p.
 
I see that there are a number of folks like me with over 200 movies on their ATV.

So there must be quite a lot of you also feeling the pain of Apple not supporting playlists for movies on the ATV. It takes ages to scroll through the entire movie list, and then you always overshoot. You can forget browsing; it takes too long.

(Note that I stream everything, and use MacTheRipper to rip and HandBrake to encode my DVDs)

Cheers, Ed.
 
Yeah eddyg, it seems Apple was thinking we'd sync our movies, rather than stream them. But on my 802.11n network, none of my DVD rips (1500kbps) ever hiccup while streaming. And most of my 720p movies (acquired via "alternate" means :D) can stream without hesitation, as well.
 
How large is the projected screen you are playing this on? Projector hardware? I can't agree with you, as I have an HD-DVD player and have compared the same films in 720P as well. But to each his own. 1080 is literally 2.25 times the resolution of 720P; there is a significant difference on large screens. Especially since everything must be upscaled from 720 to 1080 ANYHOW, and the end result will depend heavily on how good the internal scaler is. How much are you paying for a movie at half the resolution on AppleTV, without any physical medium?

For what it's worth, the lossless audio codecs were a SIGNIFICANT factor in my move to HD-DVD.

I am playing these on a 42" screen... I am by no means saying that 720P looks better than 1080p, just that the average person isn't going to notice a difference. The 720p movies off of AppleTV look better than those off an HD channel and just about as good as an HD-DVD or Blu-ray... For the best experience 1080p is the way to go, but until this format war is decided, for the average consumer an AppleTV with 720p HD movies could be just as good... This is a good article on when 1080p matters: http://www.carltonbale.com/2006/11/1080p-does-matter/
 
Not to be picky, but your math is way off... 1080/720 = 1.5. Not to mention that 720p actually contains more vertical detail than a 1080i picture. 1080p is a different story, of course.
I hope you're joking. For starters, any current display outside of a CRT is progressive by design. A 1080i signal has a frequency of 50/60 Hz, meaning when it's deinterlaced (which it will be) it's a 1080p signal. 1080i at 60 Hz is EXACTLY the same as 1080p at 30 Hz. THERE IS NO LOSS OF DATA OR RESOLUTION. A 720 signal DOES NOT contain more vertical detail than a 1080 signal. How can you even say that? 1080 > 720. The ONLY advantage a 720 signal can have is the ability to be transmitted at 60 Hz, which it definitely is not with movies/film.

What the hell kind of math are you doing? You forgot about the entire horizontal resolution. You know, that number before the vertical one. There is a reason both are there: to give you the effective area of pixel resolution.

1920x1080 = 2073600 pixels
1280x720 = 921600

2073600 / 921600 = 2.250

By your logic I could have a 1x1000000000000000000 resolution display vs a 1920x1080. My god, look! The first one has 925925925925925.92 times the resolution!

Read up on the basics before trying to correct me.
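
For anyone who wants to double-check the arithmetic, here it is as a few lines of Python (just a sketch; the dimensions are the standard 1080p/720p resolutions used above):

    # Total pixel counts for the two resolutions discussed above.
    def pixels(width, height):
        return width * height

    full_hd = pixels(1920, 1080)  # 2,073,600 pixels
    hd_720 = pixels(1280, 720)    # 921,600 pixels

    print(full_hd / hd_720)  # 2.25 -> 1080p has 2.25x the pixels of 720p
    print(1080 / 720)        # 1.5  -> comparing vertical lines alone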
 
I am playing these on a 42" screen... I am by no means saying that 720P looks better than 1080p, just that the average person isn't going to notice a difference. The 720p movies off of AppleTV look better than those off an HD channel and just about as good as an HD-DVD or Blu-ray... For the best experience 1080p is the way to go, but until this format war is decided, for the average consumer an AppleTV with 720p HD movies could be just as good... This is a good article on when 1080p matters: http://www.carltonbale.com/2006/11/1080p-does-matter/
You can't really compare against what you see on TV. While the resolutions may be the same, the compression is totally different, yielding garbage-looking artifacts.
 
All bought from the UK iTunes Store:

  • 3 Pixar short movies
  • 19 music videos
  • 14 TV episodes (Grey's Anatomy, Series 1)


And no AppleTV or iPod with video playback.

:)
 
Just wondering how many movies other AppleTV users have in their iTunes collection. I have 128. This is of course with the help of 'HandBrake'...

2.

I can't see using hard drive space for content I might view once a year or so. How often do you watch the same movie? On the other hand, I use it for TV shows, which I will watch more often.
 
I hope you're joking. For starters, any current display outside of a CRT is progressive by design. A 1080i signal has a frequency of 50/60 Hz, meaning when it's deinterlaced (which it will be) it's a 1080p signal. 1080i at 60 Hz is EXACTLY the same as 1080p at 30 Hz. THERE IS NO LOSS OF DATA OR RESOLUTION. A 720 signal DOES NOT contain more vertical detail than a 1080 signal. How can you even say that? 1080 > 720. The ONLY advantage a 720 signal can have is the ability to be transmitted at 60 Hz, which it definitely is not with movies/film.

What the hell kind of math are you doing? You forgot about the entire horizontal resolution. You know, that number before the vertical one. There is a reason both are there: to give you the effective area of pixel resolution.

1920x1080 = 2073600 pixels
1280x720 = 921600

2073600 / 921600 = 2.250

By your logic I could have a 1x1000000000000000000 resolution display vs a 1920x1080. My god, look! The first one has 925925925925925.92 times the resolution!

Read up on the basics before trying to correct me.

No reason to have an aneurysm, dude. Take a pill or something. I was merely referring to the fact that a single field of 1080i only has 540 lines on the screen at any given time, while a 720p picture always displays 720. But if it makes you feel superior to write a dissertation on why you're more knowledgeable than I am on the subject, be my guest.
 
No reason to have an aneurysm, dude. Take a pill or something. I was merely referring to the fact that a single field of 1080i only has 540 lines on the screen at any given time, while a 720p picture always displays 720. But if it makes you feel superior to write a dissertation on why you're more knowledgeable than I am on the subject, be my guest.
No, a 1080i signal has 1080 lines of information on the screen on ANYTHING but a CRT. So unless you are using an old CRT, you're wrong. ALL modern TVs deinterlace an interlaced signal. You NEVER see an interlaced signal, EVER.

End of story.
 
No, a 1080i signal has 1080 lines of information on the screen on ANYTHING but a CRT. So unless you are using an old CRT, you're wrong. ALL modern TVs deinterlace an interlaced signal. You NEVER see an interlaced signal, EVER.

End of story.

You just couldn't resist, could you? We both knew you had to have the last word, didn't we? Be sure to post one more time, even though there's nothing more to discuss, mmkay? I DARE you to read this, and not reply. We both know you can't do it.
 
No, a 1080i signal has 1080 lines of information on the screen on ANYTHING but a CRT. So unless you are using an old CRT, you're wrong. ALL modern TVs deinterlace an interlaced signal. You NEVER see an interlaced signal, EVER.

End of story.

Not quite the end of story: a 1080i picture has 540 lines per field, two fields per frame, which means that if your screen is deinterlacing (which it is), it has to combine two fields to get the 1080 lines. That means your effective frame rate is halved for the same detail.

So for static to slow moving shots 1080i will be the same as 1080p. However with fast changing material you will notice the frame rate difference between 1080i and 1080p. Depending on the deinterlacer you could even experience combing artefacts.

Therefore for action films and computer games you would be better off with 720p vs 1080i.
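
To put toy numbers on that trade-off, here is a sketch in Python ("weave" and "bob" are the standard deinterlacer names; they aren't terms used in this thread):

    # A 1080i/60 feed delivers 60 fields per second of 540 lines each.
    FIELD_RATE = 60
    LINES_PER_FIELD = 540

    # Weave: pair up fields -> full 1080-line detail at half the motion rate.
    weave = {"fps": FIELD_RATE / 2, "lines": LINES_PER_FIELD * 2}  # 30 fps, 1080 lines

    # Bob: show each field on its own -> full motion rate at half the detail.
    bob = {"fps": FIELD_RATE, "lines": LINES_PER_FIELD}  # 60 fps, 540 lines

    print(weave, bob)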

Cheers, Ed.
 
I hope you're joking. For starters, any current display outside of a CRT is progressive by design. A 1080i signal has a frequency of 50/60 Hz, meaning when it's deinterlaced (which it will be) it's a 1080p signal. 1080i at 60 Hz is EXACTLY the same as 1080p at 30 Hz. THERE IS NO LOSS OF DATA OR RESOLUTION. A 720 signal DOES NOT contain more vertical detail than a 1080 signal. How can you even say that? 1080 > 720. The ONLY advantage a 720 signal can have is the ability to be transmitted at 60 Hz, which it definitely is not with movies/film.

Don't confuse the refresh rate of the screen with the frame rate of the material.

The source material for PAL is 25fps and NTSC is 23.976 or 29.97.

It all comes down to the source material, really. If the source is at 60 Hz and is then interlaced, then you are correct: you are only dropping down to 30 Hz, which is fine. However, if you are playing interlaced source material at 23.976 then you are dropping down to 23.976/2.

Of course there are wrinkles in all of this, depending on who is doing the deinterlacing and where.
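
If it helps, the halving argument in a couple of lines of Python (the rates are the ones quoted in this post):

    # Weaving an interlaced stream halves the rate of whatever was interlaced.
    def full_frames(field_rate):
        return field_rate / 2

    print(full_frames(60))      # 30.0   -- 60 Hz interlaced source: fine
    print(full_frames(23.976))  # 11.988 -- the "23.976/2" case above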

Cheers, Ed.
 
165 movies (54 of them are 720P HD!)
400 TV episodes... I love my AppleTV, and wish they would start selling HD movies, cause they look amazing on this thing!

I have not tried converting 720p material to AppleTV format. I would imagine it must look good, but one thing escapes me: no matter how good the picture is, until AppleTV can output 5.1, we are stuck with stereo (or emulated 5.1 for some). And sound does play a significant role when watching a movie.

Also, I found some 720p material on the internet and just wanted to play with it. I believe it was encoded with x264. Does anyone know if AppleTV/QuickTime will play this, or does it have to be converted using VisualHub/HandBrake?

Thanks for the info. I hope Apple gets its act together, makes a deal with Dolby, and gives us 5.1. I bought this unit on the day it was released; I love it, but it has a lot of work ahead of it.
 
Also, I found some 720p material on the internet and just wanted to play with it. I believe it was encoded with x264. Does anyone know if AppleTV/QuickTime will play this, or does it have to be converted using VisualHub/HandBrake?

As long as it's H.264 (mp4 or m4v) and iTunes can deal with it, :apple:TV should play it.
 
As long as it's H.264 (mp4 or m4v) and iTunes can deal with it, :apple:TV should play it.

The files I have are .mkv (Matroska container). VLC will play them, as will Perian, but QuickTime alone will not, therefore AppleTV will not play them. So I am having to convert the files with VisualHub using the AppleTV setting at insane, 2-pass. The conversion takes some time per movie/file, but they look absolutely AMAZING. For those that have never seen these hi-def (720p) movies, they are incredible on your AppleTV, although the sound does not retain its 5.1 mix when converting to an Apple-friendly format. My audio receiver does simulate and output Dolby Pro Logic II or DTS Neo, but it's not the same as 5.1. Still, the picture is amazing. Naturally not as good as my Blu-ray player, but not that far behind either. Once Apple fixes/agrees to put in Dolby Digital, this will truly replace my DVD player (not Blu-ray). So for now, I am converting all the hi-def movies to m4v format.
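
For anyone doing the same conversions, here is a rough batch sketch in Python driving HandBrakeCLI rather than VisualHub (my assumption, since HandBrake is already mentioned in this thread; its built-in "AppleTV" preset and two-pass flag stand in for VisualHub's AppleTV/insane/2-pass settings):

    # Batch-convert .mkv files to AppleTV-friendly .m4v with HandBrakeCLI.
    import glob
    import subprocess

    for src in glob.glob("*.mkv"):
        dst = src.rsplit(".", 1)[0] + ".m4v"
        subprocess.run(
            ["HandBrakeCLI", "-i", src, "-o", dst,
             "--preset", "AppleTV",  # HandBrake's AppleTV preset
             "-2"],                  # two-pass encode, as described above
            check=True,
        )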
 