I have to admit I'm rather disappointed. I love the Apple TV; I've owned Gen.2, Gen.3, Gen.4 and now Gen.5 (4K). It's not that the new one is any worse than the previous one, but I was really stoked to have 4K and HDR, and especially Dolby Vision support.
But the new Apple TV 4K gets crippled by the fact that Apple, for some inexplicable reason, forces whatever output format you pick onto every kind of content. Why is there no automatic switching based on the content being played?
First off, I can't use Dolby Vision at all, as that would force me to choose 4K Dolby Vision at 30Hz or 24Hz, since my LG OLED doesn't support 4K Dolby Vision at 60Hz. With no automatic switching, selecting 30Hz or 24Hz forces everything, including the UI, to run at that lower frame rate, which looks horrible compared to running it at 50Hz or 60Hz. It also forces HDR / Dolby Vision "upscaling/rendering" onto the UI, which looks awful because everything becomes eye-piercingly bright. The Apple TV UI itself works out okay, but the UI within apps gets way too bright; it hurts my eyes and strains my brain, giving me headaches and migraines if I look at it for too long. Like, come on?
Opting for 4K HDR 60Hz works out much better, but a lot of content just doesn't look great with forced HDR upscaling/rendering, so things end up looking worse than the same content did on my Gen.4 Apple TV. And locking the output to Dolby Vision 30Hz or 24Hz also means we won't get 60Hz on YouTube (most videos on YouTube are 60fps these days) or Twitch, and playing games locked to 30Hz or 24Hz is not all that great either.
So we've ended up just setting it to 4K SDR 60Hz and switching manually whenever we're going to watch actual HDR or Dolby Vision content. It feels backwards and stupid. How is it even possible that Apple tested the Apple TV 4K themselves and concluded that forcing HDR onto every source was a good idea?