
PhillyGuy72

I know many didn't like Live Photos, but I wanted to try to bring a few back to my lock screen in iOS 17. On these pics I simply get: "Motion Not Available - Motion from this Live Photo is not supported as a wallpaper."

Both some old Live Photos I've had in the past and a few I just took with my camera won't work. 🤷🏻‍♂️

[Attached screenshots: IMG_5902.jpg, IMG_5901.jpeg]
 
Thanks for posting this!

I’m confused why Apple implemented the slow-motion Lock Screen effect for Live Photos this way.
It looks good in the keynote for the splashing-wave photo, but the slow motion gives a creepy effect to a lot of regular photos of people.

But to your point, it’s very frustrating that you can’t just pick any Live Photo.
It seems more complicated than it used to be, when you could choose any Live Photo and it just played at normal speed. No advanced Bionic processing needed to interpolate frames.
 
Interesting that you mentioned "slow motion". So I tried a Live Photo on my camera (a horrible photo, LOL), but moved the phone very slowly to the left as a test. And that slow movement seemed to be "acceptable" when it came to creating it as a live wallpaper? Odd!

Maybe a slight workaround... MAYBE? Take an old Live Photo, export it to MP4, slow down the animation in Adobe Premiere, export it as a 3-4 second MP4, then convert it back to a Live Photo. Could this work?? Sounds like a huge process, and yes, it is just for this, but I am curious whether it works. It just seems silly that after a year without Live Wallpaper, this all changed.
 
It’s a great proposal, but I don’t have anything that important for my background that would justify the effort. I tried some Live Photos with minimal motion and it denied those. I submitted feedback to Apple asking to let us disable the slow motion; I probably should have just asked to bring back Live Wallpaper. It may sound strange, but this was one of the features I was most looking forward to in iOS 17. I assumed you could also do a photo shuffle of Live Photos and have them all animate; it just seemed so obvious, at least to me. Why the effort was put into adding frames is a mystery to me.
 
Going to try to finish figuring out how to export from Ae/Pr later tonight. But in case this helps anyone, here are a few things I’ve found testing Live Photos with the motion wallpaper.

1. Using flash will disable motion.

2. Using 1:1 or a non-standard portrait aspect ratio or dimensions will disable motion.

3. With the few sites and apps I’ve tried for converting GIF/video -> Live Photo, when you import the result into the Photos app and then save a copy as a video, the frame rate is double what a Live Photo from the camera would be. I’m assuming this is also disabling the motion (see the sketch below for a quick way to check the frame rate).
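
For anyone who wants to check that themselves, here is a rough, untested sketch that reads the nominal frame rate of the exported .mov part with AVFoundation (the file URL is just a placeholder):

import AVFoundation

// Print the nominal frame rate of a Live Photo's paired movie.
// Camera-captured Live Photo movies are usually around 30 fps; converted ones often differ.
func printFrameRate(of movieURL: URL) async throws {
    let asset = AVURLAsset(url: movieURL)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else {
        print("No video track found")
        return
    }
    let fps = try await track.load(.nominalFrameRate)
    print("Nominal frame rate: \(fps) fps")
}

// Example (e.g. in a command-line tool or Playground):
// try await printFrameRate(of: URL(fileURLWithPath: "/path/to/exported.mov"))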
 
I converted a GIF animation to a Live Photo years ago. It worked when Apple's iOS at the time allowed Live Photos as live wallpaper, but you had to touch the screen to see the effect. Now with iOS 17, it runs automatically. However, the same Live Photo doesn't work anymore; it can be used as a still image showing just the first frame, and that's it. I will have to investigate more. Keep discussing. We will find solutions for it.
 
I have also tried using third-party apps to convert MP4s to Live Photos, with no success. The videos I tried had very little motion and no occlusion of the main subject. I tried 1) creating live wallpapers from a 15 fps Live Photo with an attached JPG created with LoveLiver, and 2) from a 30 fps Live Photo with an attached HEIC created with IntoLive. Both photos were cropped to the exact dimensions of my phone screen.

I talked with Apple Support about this and they are escalating my ticket about what qualifies as a compatible Live Photo to check with their engineers. I'll update this thread once I hear back, but I expect they won't release any technical details, in true Apple fashion.
 
Following. I also tried with VideoToLive and it didn’t take.
 
Additionally, adding a Live Photo to a shared album makes the shared copy unsupported as a wallpaper, though only for certain photos. Adding the Live Photo to a regular album does not have this effect. AirDropping a Live Photo from one person to another keeps it compatible as a live wallpaper.

It seems this is not something Apple intended. Legitimately made Live Photos should be able to be shared on Apple's own platform and still be usable as a live wallpaper.
 
Summary if you’re just googling for a solution: Live Photos are just two files being displayed as one, so there probably will not be a solution (that isn’t extremely complicated) until they update iOS past 17.0.1.



Long Explanation of Troubleshooting Process:

I captured a new Live Photo and set it as wallpaper to verify functionality. I tried various export methods in Photos and Shortcuts, saving it from my Mac’s Photos. The file format depends on your phone’s camera settings, either JPEG or HEIC. Regardless of the format, importing it into Photos doesn’t retain the Live Photo functionality, even when imported alongside the accompanying video file. I observed that when Live Photos are AirDropped or sent via messages, they may appear as .PVT files, an uncommon format that I couldn’t find information on. Even if you manage to re-import them, as Live Wallpapers they remain non-functional.

However, if I sent a live photo to somebody in iMessage and saved it (after deleting the original from Photos), it would save correctly and the motion would work. So, since my Mac is synced to my Messages, I checked in "Library -> Messages -> Attachments," which is where your Mac stores your message attachments.

So, even in the most basic format, it's saving a live photo as two separate files and then displaying them as one. I duplicated the files and experimented with some basic modifications like overlaying text and flipping the video (in Preview) without changing anything else, and replaced the originals. Nothing changed in my Messages app on the Mac, but when I forwarded the messages to myself again, the ones that came through on my iPhone were the edited versions, and they displayed as a live photo with all the correct metadata. But they still didn't work for wallpaper (I also tried renaming them to the original file names to trick Photos into treating them as duplicates, but no luck).

So, I'm going off the fact that often when live photos are shared, they end up being shared as .PVT, which is some weird format, and otherwise it's just two files being made to appear as one. There must be some kind of tag that only allows those two files to be combined and used as a wallpaper if they were taken directly from the Camera app on your phone and have not been shared. Possible candidates for this flag are "PFVideoComplementMetadataVersionKey" and "Scene Type: A directly photographed image." Note that they only seem visible when viewing the data on a Mac; I didn't see them on a PC.
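
If you want to poke at the still half yourself, here is a rough sketch (untested, path is a placeholder) that dumps the Exif and Apple maker-note dictionaries with ImageIO so you can look for "SceneType" and friends; whether any of these properties is the actual gate is still unconfirmed:

import Foundation
import ImageIO

// Print the Exif and Apple MakerNote dictionaries of an exported Live Photo still.
func dumpImageMetadata(at url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
        print("Could not read image properties")
        return
    }
    if let exif = props[kCGImagePropertyExifDictionary as String] as? [String: Any] {
        print("Exif:", exif)            // look for SceneType == 1 ("directly photographed")
    }
    if let maker = props[kCGImagePropertyMakerAppleDictionary as String] as? [String: Any] {
        print("MakerApple:", maker)     // Apple's maker-note keys are numeric and undocumented
    }
}

// Example:
// dumpImageMetadata(at: URL(fileURLWithPath: "/path/to/exported.HEIC"))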

From my many years of being an Apple fanboy and a few years of developing apps using only Xcode and Swift, I've found that most of the time, if there's something super peculiar going on with Apple stuff, the reason is either related to accessibility (flash might disable motion because of epilepsy concerns) or proprietary "Apple magic" stuff (if it's only meant to work within Apple-made apps). I have a feeling that IF there is a solution, it's buried somewhere in archived developer documentation around the first release of Live Photos.
 
Great analysis. I don't think the solution is in the Apple Developer documentation, though. There seem to be three metadata properties for media that relate to Live Photos: quickTimeMetadataAutoLivePhoto, quickTimeMetadataLivePhotoVitalityScore, and quickTimeMetadataLivePhotoVitalityScoringVersion.

The ExifTool documentation shows these metadata tags for the .mov part and the .jpg part. That documentation was updated around 2022. While it could be out of date, I don't think it is, because Live Photos from previous iOS versions are supported.

I took two photos with the Camera app: one of a still wall, and one of the same still wall but with me shaking my hand in front of it. The still wall was supported as a live wallpaper while the one with me shaking my hand was not. I exported the jpg/mov combinations from the Photos app on Mac. One would expect the hand-shaking one to have a higher vitality score, but after inspecting the metadata with ExifTool, both videos had scores around 0.939, with the hand-shaking one only 0.0002 higher. Considering that the documentation says media with low vitality has a score below 0.5, and the Live Photo of the wall was extremely still while the hand one wasn't, I think this score is messed up and doesn't have anything to do with determining whether the photo is supported as a live wallpaper.
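
For reference, here is a rough, untested sketch of reading those same items straight from the .mov part with AVFoundation, assuming the three identifiers above exist as AVMetadataIdentifier constants in the SDK you're building against (the URL is a placeholder):

import AVFoundation

// Print the auto-Live-Photo flag and vitality-score items from a Live Photo's paired movie.
func printVitalityMetadata(movieURL: URL) async throws {
    let asset = AVURLAsset(url: movieURL)
    let metadata = try await asset.load(.metadata)   // top-level (mdta) metadata
    let identifiers: [AVMetadataIdentifier] = [
        .quickTimeMetadataAutoLivePhoto,
        .quickTimeMetadataLivePhotoVitalityScore,
        .quickTimeMetadataLivePhotoVitalityScoringVersion
    ]
    for id in identifiers {
        for item in AVMetadataItem.metadataItems(from: metadata, filteredByIdentifier: id) {
            print(id.rawValue, "=", item.stringValue ?? String(describing: item.value))
        }
    }
}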
 
I tried a wallpaper app I grabbed for free from the App Store (called Wallpaper Now). It had a few free "new iOS 17 live wallpapers" in the app. I downloaded them, and they totally worked.

The Live Photo moves much slower when you create the lock screen from it; when viewing it in your Photos album, it moves much faster. Not sure what any of this means. EXIF data is very limited on my end, only showing file sizes: HEIF/JPG 1.2 MB, MP4 3.3 MB.

 

Well, the good news is that someone has figured out how to create and import custom live wallpapers so eventually this knowledge will trickle down to us. Until we learn how to export a Live Photo as its two parts and then import it as a valid live wallpaper, I'll be of no help.
 
So I think I figured out how the apps are doing this.

Source: https://developer.apple.com/documen...live_photos/capturing_and_saving_live_photos/


func saveLivePhotoToPhotosLibrary(stillImageData: Data, livePhotoMovieURL: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        // ... (the rest of Apple's sample continues from here)

Honestly, I’ve just skimmed the page for now, but based on this bit alone I’m thinking the guard status might be why shared photos aren’t working, and the MovieURL might be why it’s so hard to import.

At some point I’m gonna sit down and try to work out a Playgrounds project that’s simple and can handle the saving part if nothing else. That way at least anyone with an iPad/Mac can copy the text from mine and run their own custom app for it in a few minutes. There's a rough first pass at the saving piece below.
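
A rough, untested sketch of just that saving step, using PhotoKit's documented pairing API (the file URLs are placeholders; the still and the movie also need to share the same content identifier, which this snippet does not add):

import Photos

// Save an existing still + movie pair to the library as a Live Photo.
func saveLivePhoto(stillImageURL: URL, movieURL: URL) {
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            let request = PHAssetCreationRequest.forAsset()
            request.addResource(with: .photo, fileURL: stillImageURL, options: nil)
            request.addResource(with: .pairedVideo, fileURL: movieURL, options: nil)
        }, completionHandler: { success, error in
            print(success ? "Saved Live Photo" : "Failed: \(String(describing: error))")
        })
    }
}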

Side note: I don’t have anywhere else to mention this, but I think it’s genuinely cool that you can develop, maintain, and publish a full app to the App Store solely from an iPad.
 
Thanks for sharing that developer documentation link. About the guard status, I'm leaning towards it not being the root cause of our issue here. That piece of code is more about basic app permissions and making sure things run smoothly in that regard.

I'm with you on the use of Apple's methods potentially being a key factor here, but it is also worth noting that apps like intoLive do not create supported Live Photo wallpapers. If we can at least get to a point where we can handle the saving part correctly, it might bring us a step closer to understanding this puzzle.

I see that there are some Live Photo settings you can configure. In the livePhotoMovieMetadata AVMetadataItem array, there are some AVMetadataIdentifiers of interest: quickTimeMetadataAutoLivePhoto, quickTimeMetadataLivePhotoVitalityScore, and quickTimeMetadataLivePhotoVitalityScoringVersion, which I found earlier. From my look at the EXIF data, I can't tell whether these affect whether a Live Photo is supported. What I can tell is that Live Photo Info is present in all supported .mov parts. I can't find any more documentation online about it, but maybe it is created when you use addResource on a pairedVideo PHAssetResourceType. On the other hand, reimporting a live photo with this metadata via the Photos app on Mac still doesn't make it work.
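
For the capture-time side, here is a rough, untested sketch of where that metadata would be attached via livePhotoMovieMetadata, assuming photoOutput is an AVCapturePhotoOutput already configured with Live Photo capture enabled; whether the wallpaper picker actually honors metadata added this way is exactly what's unproven:

import AVFoundation

// Build capture settings that attach a custom metadata item to the Live Photo's movie.
func makeLivePhotoSettings(for photoOutput: AVCapturePhotoOutput, movieURL: URL) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.livePhotoMovieFileURL = movieURL            // temporary file for the paired movie

    let item = AVMutableMetadataItem()
    item.identifier = .quickTimeMetadataAutoLivePhoto    // one of the identifiers discussed above
    item.value = NSNumber(value: true)                   // illustrative value
    settings.livePhotoMovieMetadata = [item]
    return settings
}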
 
Have you looked at the Live Photo from the Wallpaper Now app with ExifTool to see if it shows anything new? I would look myself, but I am new to all this and can’t even seem to find how to correctly export it to my Mac…
 
I think I figured out how to export to my Mac and use ExifTool. When I looked at the metadata of the movie files from the Live Photos that work (ones I took and ones I downloaded from the Wallpaper Now app), I noticed that they had extra tags that were absent from the ones that don't work. Two tags that jumped out at me are Live Photo Info and Live Photo Still Image Transform.

Sample Time : 0 s
Sample Duration : 0.02 s
Live Photo Info : 3 0.0145119996741414 1844164067 128 86.7509384155273 14.2610721588135 0.396545708179474 -0.0815095826983452 1.92900002002716 4 4 0 -1 0 0 0 0 0 0 0 0 0 9.80908925027372e-45 0.264391481876373 3209850598 10779 50282
...
(These three tags repeat until the end time. Sample Time and Duration change but the Live Photo Info value remains the same. I looked at another file from Wallpaper Now and its Live Photo Info values were identical, but the value changes for each instance when I look at the files that I took. I wonder if Wallpaper Now just hardcoded it.)

Live Photo Still Image Transform: ??????
(value is unreadable and I don't think this tag is listed here.)

Can anyone make sense of these? I am hoping this is just a bug and Apple makes it backwards compatible...
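
If anyone wants to dump these per-sample values without ExifTool, here is a rough, untested sketch that walks the timed metadata tracks of the .mov with AVFoundation (the URL is a placeholder):

import AVFoundation

// Print every timed metadata group in a Live Photo's paired movie,
// which is where the per-sample "Live Photo Info" payload lives.
func dumpTimedMetadata(movieURL: URL) async throws {
    let asset = AVURLAsset(url: movieURL)
    let tracks = try await asset.loadTracks(withMediaType: .metadata)
    for track in tracks {
        let reader = try AVAssetReader(asset: asset)     // one reader per track keeps it simple
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
        reader.add(output)
        let adaptor = AVAssetReaderOutputMetadataAdaptor(assetReaderTrackOutput: output)
        reader.startReading()
        while let group = adaptor.nextTimedMetadataGroup() {
            let start = group.timeRange.start.seconds
            for item in group.items {
                print(String(format: "%.2f s", start),
                      item.identifier?.rawValue ?? "(no identifier)",
                      item.stringValue ?? "(binary value)")
            }
        }
    }
}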
 
I can corroborate that my supported Live Photos all have these tags but not all Live Photos that have these tags are supported. In a Live Photo I tested where motion is not supported, the foreground object's motion occludes the background and it still has those tags.

The custom Live Photos from the Wallpaper Now app also have these tags but their Live Photo Info tags are simply duplicated instead of being unique like a photo taken with the Camera App! This means that the developer probably manually added those tags, because creating a Live Photo from a video usually doesn't include any of this.

I can't do any more testing until I get a demo app on my phone to create live photos using this metadata through PhotoKit. Hopefully PhotoKit is the key to getting the supported Live Photos to stay supported after reimporting them into the Photos Library.

I had a follow-up call on my Apple Support case and I showed them Wallpaper Now's custom supported Live Photos. Hopefully the engineering team comments on that when they get back to me.
 
These timed metadata values are present in every live photo taken with the camera, e.g.:
Sample Time : 2.88 s
Sample Duration : 0.03 s
Live Photo Info : 3 0.00823200028389692 2960405551 105 -32.6897506713867 -277.827728271484 -0.792463481426239 -2.89210414886475 1.92900002002716 3.41015625 2 4 -1 0 0 0 0 0 0 0 0 0 9.80908925027372e-45 -0.149214282631874 3212876178 22057 49828

You can hardcode them / add them to your video's metadata, but that will not make the movie be accepted as valid motion.
The other key contains an undocumented metadata format, e.g.:
Live Photo Still Image Transform: ???.?q}.??.????֯??﫶@@KS@??.y????????6?@
Live Photo Still Image Transform Reference Dimensions: 1920 1440
and I think this is something we will need to wait for Apple docs on, for now.
 
Have you tried hardcoding them? Do you mind sharing your code?
 
My point was that maybe you have to hardcode them and then use PhotoKit to get them into your photo library. I say this because I have been unsuccessful at exporting the two parts of a supported Live Photo via the Photos app on Mac and then reimporting them by dragging them back in, syncing, shared albums, or AirDrop.
 
@chris01b hey I was actually replying to area7's quote.

"You can hardcode them/add in metadata for your video but will not make that movie being accepted as valid motion."

He sounded like he had already tried, so I wanted to see how he was adding the metadata. As for testing with PhotoKit: take a look at this file if you haven't already. That project uses PHAssetCreationRequest to save to the album as a Live Photo. It still doesn't work.
 
The project you linked will not work as a live wallpaper in iOS 17, since it uses the old pairing of jpg and mov via the content identifier only, which is not enough now (other keys were added for the motion feature).

Regarding adding hardcoded metadata, you can do it simply, e.g. by setting assetWriter?.metadata = [addone, addtwo, addthree, ...]. You just iterate through an array of sample times [0, 0.02, 0.03, etc.] and add values in order to cover the video's total duration.

let addone = addSampleTime()
let addtwo = addSampleTime2()
let addthree = addLiveTime()
let Allmediatimed = "3 0.00823200028389692 16939217 155 5.94521753062658e-15 -6.44374300689783e-15 0.27380958199501 0.575768828392029 1.92900002002716 3.30649042129517 4 0 -1 0 0 0 0 0 0 0 0 0 9.80908925027372e-45 -0.17872442305088 3212927435 33811 49646"

private func addSampleTime() -> AVMetadataItem {
    // You can of course take the sample time as a Float (0.0 ... 0.99) input and format it as "<float> s".
    let item = AVMutableMetadataItem()
    let key = "Sample Time"                       // key name as reported by ExifTool
    let keySpaceQuickTimeMetadata = "mdta"
    item.key = key as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
    item.value = "0 s" as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}

private func addSampleTime2() -> AVMetadataItem {
    // Same idea for the duration.
    let item = AVMutableMetadataItem()
    let key = "Sample Duration"                   // key name as reported by ExifTool
    let keySpaceQuickTimeMetadata = "mdta"
    item.key = key as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
    item.value = "0.03 s" as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}

private func addLiveTime() -> AVMetadataItem {
    // This one is repetitive: add it every time you add the two keys above.
    let item = AVMutableMetadataItem()
    let key = "Live Photo Info"                   // key name as reported by ExifTool
    let keySpaceQuickTimeMetadata = "mdta"
    item.key = key as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
    item.value = self.Allmediatimed as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}
The code can be simplified with a single function and an enum as the input type; I have limited time to work on this.
ExifTool will report the tags exactly as you hardcoded them, so there is no problem adding them to your asset metadata.
 
Again, this has nothing to do with the AVCam project. We need docs for AVAssetWriter (for example) covering the accepted format, codec, kCVPixelBufferPixelFormatTypeKey, etc. of the movie, and the metadata keys that identify it as valid "movement". I hope the engineering team responds to you, because if that app uses a private API and is published, Apple may remove it...
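
In the meantime, here is a rough, untested sketch of the kind of AVAssetWriter setup being discussed; the codec, pixel format, and dimensions are assumptions based on what camera-captured Live Photo movies look like, not confirmed requirements:

import AVFoundation

// Create a writer, video input and pixel-buffer adaptor for building a Live Photo style .mov.
func makeLivePhotoMovieWriter(outputURL: URL, width: Int, height: Int) throws
    -> (writer: AVAssetWriter, input: AVAssetWriterInput, adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.hevc,   // recent camera Live Photos use HEVC
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    input.expectsMediaDataInRealTime = false
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
        ])
    writer.add(input)
    // Top-level metadata (content identifier, plus hardcoded items like the ones above) would go here:
    // writer.metadata = [ ... ]
    return (writer, input, adaptor)
}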
 