Yes, something like that. You will notice that some videos won't work if they don't have a "static" object in at least some frames. This needs more investigation. Regarding the Video Exporter "does not successfully set the metadata required to make a Live Photo supported as a wallpaper": it does set the metadata, but I think the real issue is the highlight map Apple builds around the detected object. A video will not work if:

1. the object contours are fog, ripples, or otherwise unclear;
2. there is no observable static object at all.

(There are a few more factors, but those are the main ones that make a video unusable as a live wallpaper.) For example, a movie of fish swimming all around will not work, while the same video with a static starfish in most frames (and I mean in frame 0 as well, that's important) will work. So any static object identifiable as an object (moon, star, planet, sun, cloud, tiger, etc.) makes a video more likely to be accepted as a moving wallpaper.
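If you want to pre-filter candidates automatically, here is a minimal sketch of that "identifiable static object" test, using Vision's objectness-based saliency on frame 0. To be clear, this is my stand-in guess for whatever Apple actually runs, not their real check, and `firstFrameHasSalientObject` is just a name I made up:

```swift
import AVFoundation
import Vision

/// Heuristic pre-filter: does frame 0 contain at least one clearly bounded
/// salient object? Per the observations above, that seems to correlate with
/// a video being accepted as a live wallpaper. This is an assumption, not
/// Apple's actual check.
func firstFrameHasSalientObject(in url: URL) throws -> Bool {
    let generator = AVAssetImageGenerator(asset: AVURLAsset(url: url))
    generator.appliesPreferredTrackTransform = true
    // Request frame 0 exactly -- frame 0 is the one that matters.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    let frame0 = try generator.copyCGImage(at: .zero, actualTime: nil)

    let request = VNGenerateObjectnessBasedSaliencyImageRequest()
    try VNImageRequestHandler(cgImage: frame0).perform([request])

    // An empty salientObjects list is what you get for fog, ripples, or
    // uniform motion with nothing static to anchor on.
    guard let observation = request.results?.first as? VNSaliencyImageObservation,
          let objects = observation.salientObjects else { return false }
    return !objects.isEmpty
}
```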
I tested 1200 videos, automatically processed; only half or fewer passed this test. My first concern was colors, objects sharing the same average color, but that's not it: even a video made entirely of shades of blue and gray will pass if it has an identified (static) object. As a future exercise I will try custom movies built around a single static identifiable object plus other moving elements, and reuse the metadata track from those (not soon, though, I'm tied up in other projects).

Most important, Chris, when you have time: extract the segmented metadata track, with its sample intervals, from a camera-captured video, write it over your own video's duration, and that should give you a 100% original working live wallpaper. As I said, I have no time, but it is certainly possible, since all working videos from that app use the same data, just with different sample times. At that point you can work out the full set of formats that work, and use (I suppose) any video format and size that matches the original recorded live wallpaper. One catch: the sample timestamps in a camera-recorded live wallpaper differ from those in a working one, and that is what still makes them unusable as-is. See the metadata-track sketch below.

Until Apple documents custom generation, try this: make a video that shows its own frame index on every frame (so it displays 0, 1, 2, ... 59), and apply it as a live wallpaper. You will see it takes frame 0, then on average frames 17 to 30, to use and show on the lock screen. (There is a sketch for generating such a clip below as well.)

That said, happy coding, you are on the right path. Maybe Apple will explain more in the future.
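Until someone dumps the exact track from a captured video, here is the shape of what community Live Photo generators write: a timed-metadata track under `mdta/com.apple.quicktime.still-image-time`. The key itself is well known; what this sketch cannot answer, and what the timestamp remark above is about, is which sample timing the wallpaper check accepts. Function names are mine:

```swift
import AVFoundation
import CoreMedia

// The mdta key Apple's camera writes into a Live Photo's paired video to mark
// the still frame. (The video and still also need a matching
// com.apple.quicktime.content.identifier, omitted here.)
let stillImageTimeKey = "com.apple.quicktime.still-image-time"

/// Builds a metadata writer input wrapped in an adaptor for a
/// still-image-time track. Add `adaptor.assetWriterInput` to your
/// AVAssetWriter before calling startWriting().
func makeStillImageTimeAdaptor() -> AVAssetWriterInputMetadataAdaptor {
    let spec: [String: Any] = [
        kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as String:
            "mdta/\(stillImageTimeKey)",
        kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as String:
            kCMMetadataBaseDataType_SInt8,
    ]
    var desc: CMMetadataFormatDescription?
    CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
        allocator: kCFAllocatorDefault,
        metadataType: kCMMetadataFormatType_Boxed,
        metadataSpecifications: [spec] as CFArray,
        formatDescriptionOut: &desc)
    let input = AVAssetWriterInput(mediaType: .metadata,
                                   outputSettings: nil,
                                   sourceFormatHint: desc)
    return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
}

/// Appends the single timed group that marks `time` as the still frame.
/// The sample timing here is the knob to experiment with.
func markStillImageTime(on adaptor: AVAssetWriterInputMetadataAdaptor,
                        at time: CMTime, duration: CMTime) {
    let item = AVMutableMetadataItem()
    item.key = stillImageTimeKey as NSString
    item.keySpace = .quickTimeMetadata            // "mdta"
    item.value = 0 as NSNumber
    item.dataType = kCMMetadataBaseDataType_SInt8 as String
    let group = AVTimedMetadataGroup(items: [item],
                                     timeRange: CMTimeRange(start: time,
                                                            duration: duration))
    _ = adaptor.append(group)
}
```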
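And for the frame-index experiment, a minimal sketch that generates the test clip, drawing each frame's number with CoreText into pixel buffers. The size, fps, font, and function name are arbitrary choices of mine:

```swift
import AVFoundation
import CoreGraphics
import CoreText
import Foundation

/// Writes a short clip that displays its own frame index (0, 1, 2, ...) so
/// you can see exactly which frames the lock screen picks.
func writeFrameCounterClip(to url: URL, frames: Int = 60, fps: Int32 = 30) throws {
    let size = CGSize(width: 1080, height: 1920)
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height,
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
            kCVPixelBufferWidthKey as String: size.width,
            kCVPixelBufferHeightKey as String: size.height,
        ])
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for i in 0..<frames {
        // Crude back-pressure handling; fine for a throwaway test clip.
        while !input.isReadyForMoreMediaData { usleep(5_000) }
        var pb: CVPixelBuffer?
        guard let pool = adaptor.pixelBufferPool else { break }
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pb)
        guard let buffer = pb else { break }
        CVPixelBufferLockBaseAddress(buffer, [])
        let ctx = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: Int(size.width), height: Int(size.height),
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)!
        ctx.setFillColor(CGColor(gray: 0.1, alpha: 1))   // dark background
        ctx.fill(CGRect(origin: .zero, size: size))
        // Draw the frame index as big white text near the middle of the frame.
        let text = NSAttributedString(string: "\(i)", attributes: [
            NSAttributedString.Key(kCTFontAttributeName as String):
                CTFontCreateWithName("HelveticaNeue-Bold" as CFString, 240, nil),
            NSAttributedString.Key(kCTForegroundColorAttributeName as String):
                CGColor(gray: 1, alpha: 1),
        ])
        ctx.textPosition = CGPoint(x: size.width / 2 - 120, y: size.height / 2)
        CTLineDraw(CTLineCreateWithAttributedString(text), ctx)
        CVPixelBufferUnlockBaseAddress(buffer, [])
        _ = adaptor.append(buffer,
                           withPresentationTime: CMTime(value: CMTimeValue(i),
                                                        timescale: fps))
    }
    input.markAsFinished()
    let done = DispatchSemaphore(value: 0)
    writer.finishWriting { done.signal() }
    done.wait()
}
```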