Hmm, I'll check this out. Inputs/accepts/decodes/whatever: they all mean the same thing here, it's just semantics. I have a first-gen Apple TV running 10.4.9 with a CrystalHD (CHD) decoder card, and yes, the quality is good, but it doesn't compare to my Sony Blu-ray player.
When I said "inputs" I meant that it accepts/decodes a 1080p signal but then outputs 720p (which I don't understand, since the box itself is capable of outputting 1080p aside from the RAM limitation). My Pioneer receiver handles a lot of the processing, since all my components run through it via HDMI and the video passes over a single HDMI cable to my 50" Pioneer Elite. I noticed a significant difference in playback through my B&W 7.1 system, and the picture quality is better than the CHD setup.
My comment about "not true" 1080p was about interlaced versus progressive output (somewhat like the old Samsung DVD players that would "upscale" SD DVD to 720p by doubling pixels, which isn't true HD quality).
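Just to illustrate what I mean by pixel doubling: here's a minimal sketch (plain Python with NumPy, names and sizes are my own assumptions, nothing from any actual player's firmware) showing that a nearest-neighbor "upscale" only repeats pixels, so the frame gets bigger without gaining any real detail.

import numpy as np

def pixel_double(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Naive nearest-neighbor upscale: repeat every pixel `factor` times
    along both axes. The output is larger but carries no extra detail."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# A fake 480-line SD frame (height x width x RGB), "upscaled" toward an HD-ish size.
sd_frame = np.random.randint(0, 256, size=(480, 720, 3), dtype=np.uint8)
fake_hd = pixel_double(sd_frame, factor=2)   # 960 x 1440: bigger, not sharper
print(sd_frame.shape, "->", fake_hd.shape)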
Now of course 54 Mbps is extremely high bandwidth for most home networks at the moment, and MKV files will be rather large (rough numbers below). However, attached storage over Ethernet makes a HUGE difference in quality versus streaming over WiFi. Semantics, pure and simple. As for XBMC, there are a lot of developers out there producing codecs the official engineers aren't even aware of, available through various Cydia repos, and a lot depends on your particular setup and its variables. The fact is, the aTV 2 can process 1080p.
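For the bandwidth/file-size point, here are some back-of-envelope numbers (Python; the bitrates are my own rough assumptions, not measurements, and 54 Mbps is only the 802.11g link rate, real throughput is lower):

# Rough file sizes and whether the stream fits over 802.11g WiFi.

def file_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Approximate size in gigabytes for a stream of the given
    average bitrate (megabits/second) and duration (minutes)."""
    bits = bitrate_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e9

BLURAY_CLASS_MBPS = 30.0   # assumed typical Blu-ray-class 1080p bitrate
MKV_RIP_MBPS = 12.0        # assumed typical 1080p MKV rip bitrate
WIFI_G_REAL_MBPS = 20.0    # assumed real-world 802.11g throughput

for name, rate in [("Blu-ray-class", BLURAY_CLASS_MBPS), ("1080p MKV rip", MKV_RIP_MBPS)]:
    size = file_size_gb(rate, minutes=120)
    fits = "yes" if rate < WIFI_G_REAL_MBPS else "no"
    print(f"{name}: ~{size:.1f} GB for 2 h, streams over 802.11g? {fits}")

With those assumptions a two-hour Blu-ray-class stream is around 27 GB and would choke typical 802.11g WiFi, while a lighter 1080p rip is around 11 GB and squeaks by, which is why wired attached storage behaves so much better.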
And of course you're not being argumentative; I appreciate the info, and I'm still learning a lot, so thanks.