If that manual that someone linked to earlier is for your TV, the audio should be working for all the HDMI inputs. Since it isn't working for any of them, I would suggest making sure you've set up the audio for your HDMI inputs correctly (I think I saw that it can be set to Auto, Digital or Analog).

Also, if that manual is correct, this TV is (as already mentioned) 720p native, not 1080i.

Yes, I tried all three options (Auto, Digital, and Analog), and only Analog works for me, which is frustrating. I know you have to match the settings on the units connected to the HDTV. In other words, the DVR has to be set to HDMI and HDMI audio, and the TV has to be set to HDMI and HDMI audio as well, which is where I tried Auto, Digital, and Analog. So maybe now you can see my frustration.

Any suggestions are welcome, by all means.
 
It would be helpful if the TV model were listed. You also say "TV receiver," but in the usual lingo those are two different devices: a TV and an A/V receiver.

Anyway, it sounds like your setup may be too dated for the Apple TV's limited connectivity. You were obviously on the bleeding edge back in the day, but as a result your I/O isn't very typical today. It's also possible your equipment is malfunctioning or set up incorrectly.

That is a possibility, and I plan to call Pioneer to see if they can help with anything, but it could be, as you say, just too old a TV (even though it is only five years old).
 
Your TV is getting a bit long in the tooth, and technology is beginning to pass it by. It's not a reason to be disappointed in the Apple TV, but maybe a reason to be disappointed in the aging equipment.

It's bound to happen sometime and probably with more and more devices.

It may sting to say this, but the 50" plasma you can buy for $800 today will probably have better picture quality than the TV you originally bought for $10,000. It will certainly have better connectivity!

That said, playback on the old Apple TV was never 1080p or 1080i; it had a "1080i" setting, but that didn't apply to video playback.

Also, you can consider buying an A/V receiver with analog outputs and surround sound speakers if you don't want to replace your TV.

Yes, it makes me sick how the prices came down from the MSRP of $9,000. Then again, I only paid $5,000 for it, which was still too much. But you may be right that the older equipment just can't keep up with the technology.
 
OK, I checked the page you mentioned, and you are right: the resolution is 1280 x 768 pixels. My model is actually the PDP-505PU, but it is the same. So I stand corrected on this. But why does my TV show 1080i on the display? When I connect some games it shows 720, and boy, what a difference in quality.

I wrote the post you quoted at the same time you wrote about reading your manual, so I was adding to the conversation, not correcting you. Thanks for understanding that I am just trying to help.

SHORT ANSWER

All it means is that your set is decoding a 1080i signal.

LONG ANSWER

Digital televisions have always been confusing to understand, and manufacturers have taken advantage of the newness of their products.

Most people are confused by two concepts, because manufacturers have made it appear that they say the same thing:

1. The monitor can play 1080p, 1080i, 720p, 480p, and 480i.

2. The monitor has 1080p resolution.

Most people think that the first statement means the second statement is true.

In reality, the first statement ONLY says that the monitor can decode those signals. It says nothing about the resolution it displays them at.

My bedroom set is like your plasma: it can decode 1080i, but it displays it at 720p resolution. Just like your plasma, my set reports that a 1080i signal is being decoded.
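The decode-vs-display distinction can be sketched in a few lines of Python (a toy model with illustrative numbers, not any real TV's API):

```python
# Toy model: the formats a set can *decode* are independent of its
# *native* panel resolution. All names and numbers are illustrative.

NATIVE = (1280, 768)  # e.g. an early "720p" plasma panel
ACCEPTED = {"480i", "480p", "720p", "1080i"}  # signals the set can decode

def displayed_resolution(signal: str) -> tuple:
    """Whatever signal is decoded, it is scaled to the native panel."""
    if signal not in ACCEPTED:
        raise ValueError(f"set cannot decode {signal}")
    # The on-screen indicator reports the *incoming* signal name,
    # but the pixels on the glass are always the native grid.
    return NATIVE

print(displayed_resolution("1080i"))  # (1280, 768)
```

So the "1080i" on the front panel describes the signal coming in, not the pixels going out.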

My main set is a 1080p that plays in 1080p. I waited almost two years to buy it. When 1080p first hit the market, few sets marked 1080p had 1080p resolution and those sets were out of my price range. I bought when the prices came down.

Many people who bought those early sets believe that they have a real 1080p set when all they have is a set that can decode 1080p.

BTW, my first HD set was a high-quality native 1080i rear projector. Your 720p plasma has a far superior picture to that set's. So while resolution is important, it isn't everything.

PS
I sold my first HD set to my brother-in-law cheap. His kids still use it in their basement, and it still has a good picture. There are ways to recycle an out-of-date but functioning set.
 
When I connect some games it shows 720 and boy what a difference in quality.

It took me a while to figure out what is happening.

Your monitor does a much better job of down-converting a 1080i signal to 720p than the game console does, so you get a better picture from the 1080i signal.
 
Didn't even bother reading past the title, nor did I read other replies. But Apple can't please everybody and I'm sure they don't care too much that they can't because they know this. You've heard it a hundred times and you'll hear it a hundred more: If you don't like it, don't buy it.

I, for one, can't wait to receive and use my Apple TV. I'm stoked.


Dale
 
I mock you all with my 54" projection 480i SDTV hooked up to my PS3 for video streaming.
 
I just did a Google search; your set has 1280 × 768 pixels, which means it is 720p native.

http://www.pioneerelectronics.com/p...1/189562688PDP5045HDOperatingInstructions.pdf

page 71

A native 1080i set would have a 1,920 x 1,080 resolution.

There must be some other issue, as a native 720p signal fed to a native 720p set should give the best picture.

With the original ATV set to 1080i, the following happens with your plasma set.

The ATV up-converts 720p to 1080i.

The Pioneer down-converts the 1080i back to 720p and plays the video.

With the new ATV, the following happens.

The ATV sends a 720p signal to the Pioneer, which plays it without any conversion.
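Those two signal paths can be traced with a short sketch (the chain order is the point; the frame sizes are the standard format geometries, and the "native" entry assumes the Pioneer's 1280 x 768 panel):

```python
# Pixel geometry at each stage of the two chains (illustrative sketch).
RES = {"720p": (1280, 720), "1080i": (1920, 1080), "native": (1280, 768)}

def pixels(fmt: str) -> int:
    w, h = RES[fmt]
    return w * h

# Original ATV set to 1080i: content is up-converted on the wire, then
# the set down-converts again to its panel -- two lossy scaling steps.
old_chain = ["720p", "1080i", "native"]

# New ATV: 720p goes over HDMI as-is; only the set's own panel scaling remains.
new_chain = ["720p", "native"]

for name, chain in (("old ATV", old_chain), ("new ATV", new_chain)):
    print(name + ": " + " -> ".join(f"{f} ({pixels(f)} px)" for f in chain))
```

The extra round trip through 1080i in the old chain is exactly the conversion the new ATV avoids.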

All TVs back then advertised themselves as 1080i, incorrectly.

The meanings of the terms have changed.

Today, 720p and 1080p refer to a set's actual native resolution.

Back then, a set was described as being "capable of displaying" a format, i.e., 1080i converted down: taking the 1,920 x 540 pixels of each field and cramming them into whatever your set's resolution was.

Still, things have come a long way, and wondering why a 6-7-year-old set isn't completely compatible with a new device doesn't make sense. You're an early adopter who got burned because the HDMI spec wasn't completely integrated into the set.

You're in the minority; a comparable VT set right now would run you around $700 on sale.
 

TVs are getting SO cheap. Best Buy has a 55-inch LED TV on sale for $1,299, down from the usual $1,499. It's their house brand, Insignia, but apparently they are trying to turn Insignia into a premium brand.

Source: http://www.bestbuy.com/site/Insigni...35&skuId=9896008&st=55 insginia led&cp=1&lp=1

Dale
 
Ugh. I will attempt to give some home theater advice here, but I have a feeling a lot is lost in translation somehow.

Misconception: 1080i is better than 720p

1080i is not better than 720p; interlaced lines are never a good thing unless you have a good deinterlacer, which modern 1080p-capable TVs do. If your TV is not capable of 1080p, then 720p is actually the best option. Your TV most likely runs at 720p natively, so when you feed it 1080i there is extra scaling involved. A 1080i signal delivers only 540 lines of video at a time, interlaced so that alternate lines arrive in alternate fields. Without a good deinterlacer, you are much better off using 720p, because you actually get a full 720 lines of video in every frame. Granted, the two formats have different horizontal resolutions (1280 x 720 vs. 1920 x 540 per field), but you won't see that difference because the TV does not run at 1080i natively. So ultimately feeding it 1080i is a complete waste, and you get a much better picture out of 720p.
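The per-field arithmetic in that paragraph can be checked with a couple of lines (illustrative only; the function name is made up for this sketch):

```python
# Lines of picture delivered in one pass: an interlaced format sends
# half its frame height per field; a progressive format sends it all.

def lines_per_pass(fmt: str) -> int:
    height = int(fmt[:-1])          # "1080i" -> 1080, "720p" -> 720
    return height // 2 if fmt.endswith("i") else height

print(lines_per_pass("1080i"))  # 540 -- each field carries half the lines
print(lines_per_pass("720p"))   # 720 -- every frame is complete
```

Without deinterlacing, 720p's 720 complete lines beat 1080i's 540-line fields, which is why it is the better choice on a non-1080p set.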

Misconception: The new Apple TV requires HDMI audio.

There is also an optical-out port. If you don't have an optical-capable receiver, that is your own fault. If you are out of ports to plug it into, you need to either buy a new receiver or consider an optical audio switcher. Apple is providing the minimum requirements for digital audio; it is your responsibility to meet them, and you should have known this before buying the product.

Short version: your criticisms have nothing to do with the product; you simply don't meet the hardware requirements.
 
My final thoughts on this thread... by OP

It took me a while to figure out what is happening.

Your monitor does a much better job of down-converting a 1080i signal to 720p than the game console does, so you get a better picture from the 1080i signal.

Thanks, philipk; your comments have been very helpful, along with some others in this thread. However, some commented who had no business commenting. Anyway, this thread has opened my eyes, and I will be more cautious when I buy my next flat screen, that is for sure. I don't really blame Apple; if I came across that way, I am sorry, as I LOVE Apple products and own a ton of them. I also love this forum, as it usually gives me good feedback I can use. At times some idiots reply who don't care to answer the posted question and just make some stupid remark, but those can be overlooked, and I can benefit from the good comments.

I thank all who replied with something relevant to the subject at hand.
As for those who comment just to say nothing, or try to say something without actually saying anything real, well, they need to get a life, I guess.

Thanks again for your input.
 

Misconception: The new Apple TV requires HDMI audio.

There is also an optical-out port. If you don't have an optical-capable receiver, that is your own fault. If you are out of ports to plug it into, you need to either buy a new receiver or consider an optical audio switcher. Apple is providing the minimum requirements for digital audio; it is your responsibility to meet them, and you should have known this before buying the product.

Short version: your criticisms have nothing to do with the product; you simply don't meet the hardware requirements.

That is interesting; I did not know they made an optical audio switcher. I knew about the optical option, but mine is taken, going from my HDTV receiver to my speaker amp. So I may check into an optical switch box; thanks for that info. Regarding the criticisms of Apple, I take them back, as it is nothing against Apple. I now know it is my equipment; I thought it was not that old, but apparently it is way out of date already. I just bought way too early, I guess. Thanks for your comments.
 
Yes, I am aware of the optical option, but I only have one optical connection, and that goes from my TV receiver's optical out to my 5.1 speakers. So I cannot hook up the ATV via optical, or I will not hear sound from my Blu-ray player, Dish Network receiver, and DVD player.

Use a splitter.
 
Summary

Great thread guys!!

I am in the UK, and along with everyone else in my country I am being held hostage by a second-rate courier. The Apple TV will only arrive here on Monday. :(

Searching the forums to see how my "American cousins" are getting on, I found this great thread.

WizardHunt - you are a gent! Very calm under the pressure of a bit of a slagging from some quarters!

Great and interesting info from many others!

Trip1ex - you are an eejit!
 
I just got my Apple TV to replace a first-gen one (the first gen is going to the basement), and my initial thought is that this is a major step back.

The major issue I have is that they have made all my content, a lot of which came from iTunes, second class. I now have to go into Computer, select a library, wait for it to load, select photos, wait for it to load, select an album, wait for it to load, and then I can view a photo. Same procedure for my music, movies, and podcasts. The old way of syncing to a main library and then having secondary shared libraries was much better.
 