Yes, but make sure it is the same image {or frame} you add at still-image time. Anyway, work on the tracks, because I think you are not adding the metadata tracks {from a working live photo video} with 1" length to your 1" video at all. As I said before, you can use any live photo metadata tracks recorded with the camera, but there is extra work to do {like saving patterns for every kind of format; not worth the effort, since it is private API in the end}. Use an existing working one. You need tracks(withMediaType: .metadata)[0] and tracks(withMediaType: .metadata)[1] from a working live wallpaper {lw}. With those in mind, you will soon be there:

let thirdCompositionTrack = mixComposition.addMutableTrack(withMediaType: .metadata, preferredTrackID: kCMPersistentTrackID_Invalid)
do {
    // Insert the working live wallpaper's first metadata track across
    // the full duration of your own video.
    try thirdCompositionTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: videoAssets1.duration), of: videoAssets2.tracks(withMediaType: .metadata)[0], at: .zero)
} catch {
    print("Error = \(error.localizedDescription)")
} // videoAssets1 is your own video; videoAssets2 is a working one
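To pick up both metadata tracks mentioned above ([0] and [1]), here is a minimal sketch, assuming the same mixComposition, videoAssets1, and videoAssets2 objects from the snippet; it loops over the tracks instead of hard-coding the index:

// Copy every metadata track from the working live wallpaper into the
// composition, each spanning the full duration of your own video.
for metadataTrack in videoAssets2.tracks(withMediaType: .metadata) {
    guard let compositionTrack = mixComposition.addMutableTrack(
        withMediaType: .metadata,
        preferredTrackID: kCMPersistentTrackID_Invalid) else { continue }
    do {
        try compositionTrack.insertTimeRange(
            CMTimeRange(start: .zero, duration: videoAssets1.duration),
            of: metadataTrack,
            at: .zero)
    } catch {
        print("Error copying metadata track: \(error.localizedDescription)")
    }
}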

Hope it helps; in the end you will succeed. Ignore my rough coding there, I have really important projects to do and didn't even double-check the variable naming or spelling :) I have been working in the AI sphere for 6 months now :)
Could you please provide the complete code? Maybe some class that would allow you to save Live Photos.
 
My code allows you to save Live Photos. It works if you export a supported live wallpaper and reimport it with the app.
Yes, something like that. You will observe that some videos will not work if they don't have a "static" object in some frames at all. I need to investigate this more. Regarding the Video Exporter note that it "Does not successfully set the metadata required to make a Live Photo supported as a wallpaper": it does set the metadata, but I think the problem is the highlight map Apple builds around the detected object. A video will not work if:

1. the object contours are fog, ripples, or otherwise unclear
2. there is no observable static object at all {there are a few more factors, but those are the main ones that make a video unusable as a lw}

For example, a movie with fish moving all around will not work; the same video with a static starfish in most of the frames {and I mean in frame 0, that's important} will work. So any static object identifiable as an object {moon, star, planet, sun, cloud, tiger, etc.} will make a video more "acceptable" as a moving one :) I tested 1200 videos, automatically processed; only half or fewer passed this test. My first suspicion was colors, objects sharing the same average color, but that is not the case: even a video made from nuances of blue, gray, etc. will pass the test if it has an identified {static} object. As a future exercise I will try custom movies with a single static identifiable object plus other moving elements {but not soon, I am on other projects} and use the metadata track from those.

Most importantly, when you have time Chris, try to take a segmented track of the sample interval from a captured video and write it over the whole n-second duration; that will give you a 100% original lw {as I said, I have no time, but it is certainly possible, since all the working videos from that app use the same data with different sample times}. At that point you can figure out the array of formats that work and use any video size {I suppose any video format that matches the captured one} while respecting the originally recorded lw.

Until Apple gives us details about custom generation, try this: make a video with the timecode drawn into each frame {so it shows 0, 1, 2... 59 across the frames of the video}. Apply it as a lw. You will see it takes frame 0, then on average frames 17-30, to use and show on the lock screen. That said, happy coding, you are on the right path, and maybe Apple will explain more in the future. Because the timestamps of the samples in a lw recorded with the camera differ from those in a working lw, camera recordings are still unusable as-is.
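For the timecode experiment described above, here is a rough sketch that writes a 60-frame HEVC movie with the frame index drawn into each frame; the 30 fps rate, 1080x1920 size, and all names are illustrative assumptions, not anything confirmed by Apple:

import AVFoundation
import UIKit

// Writes a 60-frame test movie where each frame shows its own index (0...59),
// so you can see which frames the lock screen actually uses.
func makeTimecodeMovie(to url: URL) throws {
    let width = 1080, height = 1920
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for frame in 0..<60 {
        while !input.isReadyForMoreMediaData { usleep(10_000) } // crude back-pressure, fine for a sketch
        var pb: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, adaptor.pixelBufferPool!, &pb)
        guard let buffer = pb else { continue }
        CVPixelBufferLockBaseAddress(buffer, [])
        let ctx = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: width, height: height, bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                | CGBitmapInfo.byteOrder32Little.rawValue)!
        ctx.setFillColor(UIColor.black.cgColor)
        ctx.fill(CGRect(x: 0, y: 0, width: width, height: height))
        // Flip the context so UIKit text drawing comes out upright.
        ctx.translateBy(x: 0, y: CGFloat(height))
        ctx.scaleBy(x: 1, y: -1)
        UIGraphicsPushContext(ctx)
        ("\(frame)" as NSString).draw(at: CGPoint(x: 100, y: 100), withAttributes: [
            .font: UIFont.boldSystemFont(ofSize: 300),
            .foregroundColor: UIColor.white])
        UIGraphicsPopContext()
        CVPixelBufferUnlockBaseAddress(buffer, [])
        adaptor.append(buffer, withPresentationTime: CMTime(value: CMTimeValue(frame), timescale: 30))
    }
    input.markAsFinished()
    writer.finishWriting { } // in real code, wait for completion before using the file
}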
 
Regarding your point 2, I find that this is not the case. You can replicate it yourself by taking a picture of complete black or of a static wall. It works as a live wallpaper, and no object is detected.

Please post the code you used to format the videos. I'm out of options here and I didn't understand what you meant above.
 
Straight from the camera it works, yes, but did you try saving the video and photo, re-encoding them, and then running them through this code? Did that work too?
 
I think you missed one step. When you export HEVC from Premiere, the file format will be mp4. So before the AVCapture session, re-format the movie in-app with AVAssetWriterInput(mediaType: .video, outputSettings: [AVVideoCodecKey : AVVideoCodecType.hevc, AVVideoWidthKey : 1080, AVVideoHeightKey : 1920]) and export it as .mov. Do not use QuickTime for that; do it in the app. From this file extract the frame(s) and then proceed to AVMutableComposition(). As I said, not all already-made videos will work with frame 0. Some of them need a search for a valid frame to function as a moving movie :)
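A minimal sketch of that in-app re-encode, assuming the output settings quoted above; the reader/writer structure and the function name are mine, and error handling is skipped:

import AVFoundation

// Re-encode an .mp4 export (e.g. from Premiere) to an HEVC .mov inside the app,
// decoding with AVAssetReader and re-encoding with AVAssetWriter.
func reencodeToHEVCMov(from inputURL: URL, to outputURL: URL) throws {
    let asset = AVURLAsset(url: inputURL)
    let videoTrack = asset.tracks(withMediaType: .video)[0]

    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: 1080,
        AVVideoHeightKey: 1920])
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    // Pull decoded frames off the reader and hand them to the HEVC encoder.
    while let sample = readerOutput.copyNextSampleBuffer() {
        while !writerInput.isReadyForMoreMediaData { usleep(10_000) }
        writerInput.append(sample)
    }
    writerInput.markAsFinished()
    writer.finishWriting { } // in real code, wait for completion
}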
 
This older app that successfully converted video to live photos has been updated for iOS17
It's not perfect, but it does work.


Video clip to live wallpaper
(*The erratic movement shown below was me trying to shrink and move the image)

 
This app has just been updated. I tried it, and the dynamic wallpaper I made can also be used on iOS17. Take a look and study how it is done.
 
Cool, thank you. Yeah it works on my end.
With these updated apps, iOS17 really changed how old video-to-live conversions are handled. I tested one on an old iOS Live animation; it converted, but I can barely see any movement. Maybe if I speed up the original video I'll get a better-looking live wallpaper. Not sure.

Original

Converted to Live
 
This older app that successfully converted video to live photos has been updated for iOS17 [...]
I’d say this is the best one out there. I do end up speeding up the videos most of the time so I can see more motion. This is the only app I’ve come across that allows speeding up and adding clips and stuff.

Just saw this post https://www.instagram.com/reel/C22kWb5Ots4/?igsh=MWl5NnN0Ymhod2N5cg==
 
I think you missed one step. When you export HEVC from Premiere, the file format will be mp4. [...]
Do you have the complete code? I'm a little confused about what you mean. Thanks!
 
Guys, I finally made it! Thanks to @area7 's guidance. Here are some additions:
1. make sure the HEIC picture has content
2. the stillImage is not important, you can ignore it
3. resize your video to the same dimensions as the workable video, and speed it up until its duration matches the workable video's
4. copy EVERY metadata track from the workable video into the new one (see the sketch after this list)

Good luck
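A minimal sketch of steps 3 and 4, assuming myAsset is your video and workingAsset is the known-good live wallpaper (both names are mine; error handling omitted, so this lives inside a throwing function). The resize itself would be an AVMutableVideoComposition whose renderSize matches the working video, applied at export time:

// Step 3: put your video into a composition and stretch/compress it
// until its duration matches the working live wallpaper.
let composition = AVMutableComposition()
let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)
try videoTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: myAsset.duration),
                                of: myAsset.tracks(withMediaType: .video)[0],
                                at: .zero)
composition.scaleTimeRange(CMTimeRange(start: .zero, duration: myAsset.duration),
                           toDuration: workingAsset.duration)

// Step 4: copy EVERY metadata track from the working video,
// as in area7's earlier snippet.
for track in workingAsset.tracks(withMediaType: .metadata) {
    let metaTrack = composition.addMutableTrack(withMediaType: .metadata,
                                                preferredTrackID: kCMPersistentTrackID_Invalid)
    try metaTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: workingAsset.duration),
                                   of: track,
                                   at: .zero)
}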
 
Do you have demo code I can look at? Thank you very much
 
Hi, would you explain more about `resize`? Do you mean changing the video width and height to 1080x1920? Is it possible to share parts of your code?
 
I shot a video myself, then used your demo code to convert it, and it still doesn't work.
Do you know what went wrong? Thanks
You can post your video in the issue and let's check it out.
It may be because the sample time for generating the cover image is improper (as discussed earlier); try changing it to any other time within the duration.
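If that's the problem, here is a minimal sketch of grabbing the cover still at a different sample time with AVAssetImageGenerator; the 0.5 s offset is an arbitrary assumption:

// asset is your video; pick any time within its duration instead of 0.
let generator = AVAssetImageGenerator(asset: asset)
generator.appliesPreferredTrackTransform = true
generator.requestedTimeToleranceBefore = .zero
generator.requestedTimeToleranceAfter = .zero
let time = CMTime(seconds: 0.5, preferredTimescale: 600)
let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
// Encode cgImage to HEIC for the Live Photo still image, as elsewhere in the thread.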
 
Yeah, you can use this video to check it out. Thanks

 
Guys, I finally made it! [...]
4. copy EVERY metadata track from the workable video into the new one
I would like to know what each metadata track in step 4 is used for and how to configure it. Could we generate that metadata in code for the video we want to edit, instead of copying a track from a .mov file into it? That feels more flexible. Thank you.
 