That project you linked will not work as a live wallpaper on iOS 17, since it uses the old pairing of JPG and MOV via the content identifier only, which is no longer enough (other keys were added for the motion feature).

Regarding adding hard-coded metadata, you can do it simply, e.g. by assigning assetWriter?.metadata =
[addone, addtwo, addthree, ... n]. You just iterate through an array of time values [0, 0.02, 0.03, etc.] and add values until you cover the video's total duration.

let addone = addSampleTime()
let addtwo = addSampleTime2()
let addthree = addLiveTime()
let Allmediatimed = "3 0.00823200028389692 16939217 155 5.94521753062658e-15 -6.44374300689783e-15 0.27380958199501 0.575768828392029 1.92900002002716 3.30649042129517 4 0 -1 0 0 0 0 0 0 0 0 0 9.80908925027372e-45 -0.17872442305088 3212927435 33811 49646"


private func addSampleTime() -> AVMetadataItem { // input can of course be a Float in 0.0...0.99, formatted as a String: "\(value) s"
    let item = AVMutableMetadataItem()
    let keySampleTime = "Sample Time" // mdta key name
    let keySpaceQuickTimeMetadata = "mdta"
    item.key = keySampleTime as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
    item.value = "0 s" as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}

private func addSampleTime2() -> AVMetadataItem { // input can of course be a Float in 0.0...0.99, formatted as a String: "\(value) s"
    let item = AVMutableMetadataItem()
    let keySampleDuration = "Sample Duration" // mdta key name
    let keySpaceQuickTimeMetadata = "mdta"
    item.key = keySampleDuration as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
    item.value = "0.03 s" as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}

private func addLiveTime() -> AVMetadataItem { // repetitive: add this item whenever you add the two items above
    let item = AVMutableMetadataItem()
    let keyLivePhotoInfo = "Live Photo Info" // mdta key name
    let keySpaceQuickTimeMetadata = "mdta"
    item.key = keyLivePhotoInfo as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
    item.value = self.Allmediatimed as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}
The code can be simplified with a single function and an enum as the input type; I have limited time to work on this.
exiftool will report the values exactly as you hard coded them, so there is no problem adding them to your asset metadata.
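For what it's worth, here is a minimal sketch of that single-function version; the enum cases and key strings just mirror the three helpers above, so treat it as illustration only, not a verified recipe:

import AVFoundation

// Sketch: one builder for the three hard-coded mdta items used above.
enum LiveWallpaperKey: String {
    case sampleTime = "Sample Time"
    case sampleDuration = "Sample Duration"
    case livePhotoInfo = "Live Photo Info"
}

func metadataItem(for key: LiveWallpaperKey, value: String) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.key = key.rawValue as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: "mdta")
    item.value = value as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}

// Usage, matching the hard-coded examples above:
// assetWriter?.metadata = [
//     metadataItem(for: .sampleTime, value: "0 s"),
//     metadataItem(for: .sampleDuration, value: "0.03 s"),
//     metadataItem(for: .livePhotoInfo, value: Allmediatimed)
// ]
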
It doesn't look like the Live Photo Info value gets added properly. I don't think the datatype is UTF-8?

I used:
let Allmediatimed = "3 0.0145119996741414 1844164067 128 86.7509384155273 14.2610721588135 0.396545708179474 -0.0815095826983452 1.92900002002716 4 4 0 -1 0 0 0 0 0 0 0 0 0 9.80908925027372e-45 0.264391481876373 3209850598 10779 50282"

exiftool reports:
774905907 6.7126802605344e-07 960049457 876033593 1.6503396693679e-07 1.71542069438146e-07 1.6967970850601e-07 1.55186320745308e-19 1.56020131727458e-19 1.03838319773786e-05 53 48 57 51 892417080 53 50 55 51 775172384 6.44692410567416e-10 6.60107104977214e-07 4.12621545820002e-08 4.00463724681277e-11 892746035 13620 12343
 
No, because it is a string, and for me it is displayed exactly as I wrote it. Just log the metadata for the movie asset after writing, e.g.

<AVMetadataItem: 0x28311d090, identifier=mdta/Live%20Photo%20Info, keySpace=mdta, key class = __NSCFString, key=Live Photo Info, commonKey=(null), extendedLanguageTag=(null), dataType=com.apple.metadata.datatype.UTF-8, time={INVALID}, duration={INVALID}, startDate=(null), extras={
dataType = 1;
dataTypeNamespace = "com.apple.quicktime.mdta";
}, value class=__NSCFString, value=3 0.00823200028389692 16939217 155 5.94521753062658e-15 -6.44374300689783e-15 0.27380958199501 0.575768828392029 1.92900002002716 3.30649042129517 4 0 -1 0 0 0 0 0 0 0 0 0 9.80908925027372e-45 -0.17872442305088 3212927435 33811 49646>]
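For reference, a quick way to dump that list yourself (sketch only; outputURL stands in for wherever your asset writer wrote the .mov):

import AVFoundation

let asset = AVURLAsset(url: outputURL)
Task {
    // Loads and prints every top-level AVMetadataItem, including the mdta items added above.
    let items = try await asset.load(.metadata)
    items.forEach { print($0) }
}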
 
Hmmm... it's a strange issue. exiftool still reported Live Photo Info metadata when I used com.apple.quicktime.live-photo-info, but with a different value.

Thanks. It was my bad. I was using com.apple.quicktime.live-photo-info instead of Live Photo Info. Looks like we need to figure out what Live Photo Still Image Transform metadata is next...
 
Tested with these values; compose a full array of [AVMetadataItem]:


let Allmediatimed = "3 0.00823200028389692 16939217 155 5.94521753062658e-15 -6.44374300689783e-15 0.27380958199501 0.575768828392029 1.92900002002716 3.30649042129517 4 0 -1 0 0 0 0 0 0 0 0 0 9.80908925027372e-45 -0.17872442305088 3212927435 33811 49646"

var sstime = [" 0 s", " 0.03 s", " 0.07 s", " 0.10 s", " 0.13 s", " 0.17 s", " 0.20 s", " 0.23 s", " 0.27 s", " 0.30 s", " 0.33 s", " 0.37 s", " 0.40 s", " 0.43 s", " 0.47 s", " 0.50 s", " 0.53 s", " 0.57 s", " 0.60 s", " 0.63 s", " 0.67 s", " 0.70 s", " 0.73 s", " 0.77 s", " 0.80 s", " 0.83 s", " 0.87 s", " 0.90 s", " 0.93 s", " 0.97 s", " 1.00 s", " 1.03 s", " 1.07 s", " 1.10 s", " 1.13 s", " 1.17 s", " 1.20 s", " 1.23 s", " 1.27 s", " 1.30 s", " 1.33 s", " 1.40 s", " 1.47 s", " 1.53 s", " 1.60 s", " 1.67 s", " 1.73 s", " 1.80 s", " 1.87 s", " 1.93 s", " 2.00 s", " 2.07 s", " 2.13 s", " 2.20 s", " 2.27 s", " 2.30 s", " 2.33 s", " 2.37 s", " 2.40 s", " 2.43 s", " 2.47 s", " 2.50 s", " 2.53 s", " 2.57 s", " 2.60 s", " 2.63 s", " 2.67 s", " 2.70 s", " 2.73 s"]

var ssduration = [" 0 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.07 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s", " 0.03 s"]
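Here is a sketch of how those two arrays might be zipped into the full [AVMetadataItem] list; the mdtaItem closure is a made-up stand-in for the helper functions earlier in the thread, and whether the wallpaper picker actually accepts the result is exactly the open question:

import AVFoundation

let mdtaItem: (String, String) -> AVMetadataItem = { key, value in
    let item = AVMutableMetadataItem()
    item.key = key as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: "mdta")
    item.value = value as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}

var allItems: [AVMetadataItem] = []
for (time, duration) in zip(sstime, ssduration) {
    // One triplet per sample: time, duration, and the repeated Live Photo Info blob.
    allItems.append(mdtaItem("Sample Time", time.trimmingCharacters(in: .whitespaces)))
    allItems.append(mdtaItem("Sample Duration", duration.trimmingCharacters(in: .whitespaces)))
    allItems.append(mdtaItem("Live Photo Info", Allmediatimed))
}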
 
That Live Photo Info value is not a string; it's [key:value] pairs. Furthermore, it's timed metadata, not regular metadata. Just pitching in to be useful.
 
Hmm. How can you tell it is supposed to be key:value pairs? Looking at the metadata types, only two datatypes take pairs of numbers:

@const kCMMetadataBaseDataType_PolygonF32
Three or more pairs of 32-bit floating point numbers (x and y values) that define the vertices of a polygon.

@const kCMMetadataBaseDataType_PolylineF32
Two or more pairs of 32-bit floating point numbers (x and y values) that define a multi-segmented line.

As for the timed metadata, would you add it using an AVAssetWriterInputMetadataAdaptor? Appreciate the help!
 
Figured out that the movie attached to a live photo has just 4 metadata tags at the time the camera composes it. So the Motion Not Available message is related to the HEIC container, I mean the main image, which contains the preview, the static image and frames from the video. As I tested, when you record with the camera there are multiple frames contained in the HEIC image. After a capture you can use PHLivePhotoEditingContext(livePhotoEditingInput:), then context.frameProcessor = { [self] frame, _ in ... }, and see that there are more images in there with different types, photo and video. The problem is that if you alter the images there, let's say you add some filter to each frame, then when you save the output it will again say motion is not supported. This bug also exists in the Photos app: if you change the key image of a live photo and save the edit, you can no longer apply it as a live wallpaper.

Now we must figure out what exactly the edit breaks when saving the edited HEIC [it loses the original metadata there], and of course how to programmatically compose the main HEIC file. For multiple images in one container, CGImageDestinationCreateWithURL followed by CGImageDestinationAddImage works to generate the HEIC file, but the relation to the video content seems hard to find well documented.
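As a sketch of the CGImageDestinationCreateWithURL / CGImageDestinationAddImage part mentioned above (the frames and output URL are placeholders; this only produces a multi-image HEIC container, it does not recreate the undocumented relation to the video):

import Foundation
import CoreGraphics
import ImageIO
import UniformTypeIdentifiers

// Sketch: write several CGImages into one .heic container.
func writeMultiImageHEIC(frames: [CGImage], to url: URL) -> Bool {
    guard let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            UTType.heic.identifier as CFString,
                                                            frames.count,
                                                            nil) else { return false }
    for frame in frames {
        CGImageDestinationAddImage(destination, frame, nil) // an options dictionary could set compression quality, etc.
    }
    return CGImageDestinationFinalize(destination)
}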
 
Not sure what the datatype is, but when broken down it's led by a plist; not sure if that's a traditional plist or just refers to a list. Since we have Sample Time: 0.08 s, Sample Duration: 0.02 s and Live Photo Info: 3 0.01451..., reading it as [key:value] pairs makes sense.

Yes, using AVAssetWriterInputMetadataAdaptor and the time range.

There are also custom types that can be created, though they still have to conform to existing datatypes. The RawDataType would allow something custom. Not sure what Apple is using since there isn't much documentation.
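For what it's worth, the usual timed-metadata plumbing looks roughly like this; the "mdta/Live Photo Info" identifier and the UTF-8 datatype are guesses taken from the AVMetadataItem log earlier in the thread, not anything documented:

import AVFoundation
import CoreMedia

// Sketch: a metadata track input + adaptor for an AVAssetWriter.
// Caller should writer.add(adaptor.assetWriterInput) before startWriting().
func makeLivePhotoInfoAdaptor() -> AVAssetWriterInputMetadataAdaptor {
    let spec: NSDictionary = [
        kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as NSString: "mdta/Live Photo Info",
        kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as NSString: "com.apple.metadata.datatype.UTF-8"
    ]
    var desc: CMFormatDescription? = nil
    CMMetadataFormatDescriptionCreateWithMetadataSpecifications(allocator: kCFAllocatorDefault,
                                                                metadataType: kCMMetadataFormatType_Boxed,
                                                                metadataSpecifications: [spec] as CFArray,
                                                                formatDescriptionOut: &desc)
    let input = AVAssetWriterInput(mediaType: .metadata, outputSettings: nil, sourceFormatHint: desc)
    return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
}

// While writing, append the item for the range it should cover:
// adaptor.append(AVTimedMetadataGroup(items: [livePhotoInfoItem], timeRange: timeRange))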
 
How are you extracting/seeing those frames inside the HEIC? Also, JPGs seem to work as well, even though they have much less flexibility.
 
If I convert the HEIC to JPEG, it still works fine. I tried copying the metadata from a working video file to another similar video file (same length, frame rate, dimensions, etc.), but it doesn't work. According to exiftool the metadata is the same except for timescale, bitrate and track volume. It was a direct metadata copy from a working video to a newly created video.

A few weeks ago I tried to add the keys one by one, but it didn't work. I managed to add the com.apple.quicktime.live-photo-info key like below, and it showed up as expected in exiftool.
let tag4 = metadataItemGeneric(key: "com.apple.quicktime.live-photo-info", dataType: "com.apple.metadata.datatype.utf8" , value: "3 0.0200059991329908 483877380 66 9.36382810618483e-15 -3.63274575622841e-14 1.42329120635986 -0.880580961704254 1.97099995613098 3.14510226249695 2 4 -1 0 0 0 0 0 0 0 0 0 9.80908925027372e-45 0.714458882808685 3212747775 58550 49602")

private func metadataItemGeneric(key: String, dataType: String, value: Any) -> AVMetadataItem? {
    guard let value = value as? (NSCopying & NSObjectProtocol) else {
        print("Value doesn't conform to (NSCopying & NSObjectProtocol)")
        return nil
    }
    let item = AVMutableMetadataItem()
    item.key = key as (NSCopying & NSObjectProtocol)?
    item.keySpace = .quickTimeMetadata
    item.value = value
    item.dataType = dataType
    return item.copy() as? AVMetadataItem
}
 
Ok, I re-encoded the original video file with Premiere Pro and then added back the metadata, and it worked fine as a live wallpaper. In other words, the LivePhotoInfo tag has the information that makes or breaks it. It is an undocumented tag, so I have no idea what it describes. Any ideas?

Edit:
The only thing I found on the internet is in the ExifTool GitHub repo:

# (mdta)com.apple.quicktime.live-photo-info (dtyp=com.apple.quicktime.com.apple.quicktime.live-photo-info)
'live-photo-info' => {
Name => 'LivePhotoInfo',
Writable => 0,
# not sure what these values mean, but unpack them anyway - PH
# (ignore the fact that the "f" and "l" unpacks won't work on a big-endian machine)
ValueConv => 'join " ",unpack "VfVVf6c4lCCcclf4Vvv", $val',
},
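Purely as a sketch, reading that unpack string back (V = UInt32 little-endian, f = Float32, c/C = signed/unsigned byte, l = Int32, v = UInt16 little-endian), the 27 space-separated numbers could be repacked into a raw blob like this; the field meanings are still unknown, and whether the wallpaper picker wants the raw bytes or the printable string is exactly what is unresolved in this thread:

import Foundation

// Sketch: pack the 27 LivePhotoInfo fields per ExifTool's "VfVVf6c4lCCcclf4Vvv" layout (little-endian).
func packLivePhotoInfo(_ fields: [String]) -> Data? {
    let layout = "V f V V f f f f f f c c c c l C C c c l f f f f V v v".split(separator: " ")
    guard fields.count == layout.count else { return nil }
    var data = Data()
    for (kind, field) in zip(layout, fields) {
        switch kind {
        case "V":
            guard let v = UInt32(field) else { return nil }
            withUnsafeBytes(of: v.littleEndian) { data.append(contentsOf: $0) }
        case "f":
            guard let v = Float(field) else { return nil }
            withUnsafeBytes(of: v.bitPattern.littleEndian) { data.append(contentsOf: $0) }
        case "c":
            guard let v = Int8(field) else { return nil }
            data.append(UInt8(bitPattern: v))
        case "C":
            guard let v = UInt8(field) else { return nil }
            data.append(v)
        case "l":
            guard let v = Int32(field) else { return nil }
            withUnsafeBytes(of: v.littleEndian) { data.append(contentsOf: $0) }
        case "v":
            guard let v = UInt16(field) else { return nil }
            withUnsafeBytes(of: v.littleEndian) { data.append(contentsOf: $0) }
        default:
            return nil
        }
    }
    return data
}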
 
So the different encoding settings posed no issues. Did you try creating a new video with similar settings and duration, then adding the metadata from the original video? Also, are you just using exiftool to transfer all the metadata?

I also tried moving the metadata over, but the LivePhotoInfo values didn't get added for each section for whatever reason; Sample Time and Sample Duration did. I will poke around one more time when I get the chance and check whether there is a difference in the QuickTime boxes that hold the LivePhotoInfo tag.
 
Can someone share a gist with the Swift class for saving these new Live Photos? It just doesn't work for me, although the metadata is the same.
 
@matadors mentioned that LivePhotoInfo is timed metadata and needs to be added using an AVAssetWriterInputMetadataAdaptor. I am having trouble creating the timed metadata for the LivePhotoInfo tag.
 
Apple support just got back to me and said that the engineering team is aware that some Live Photos are not supported as wallpapers and they are trying to fix the issue.

They gave absolutely no more info than that so we don't know if their goal is to get all wallpapers to work like iOS 15 (including manually-created ones) or to just support more vigorous motion created in the Camera App.
 
Yes, the same video re-encoded with Premiere Pro with similar settings (HEVC, Main 10, Level 5.1, 40 Mbps, 1 s, 60 fps, no audio). I used AVMutableComposition to add the video track metadata and asset metadata back, and AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough) to create a third video file that has the Premiere Pro encoding but the original metadata. It works fine. If I use a different video file with the same length, settings, etc., it doesn't work. The fact that there is some processing going on when you press the live photo button means iOS is analyzing the live photo. I will try some other videos; maybe it needs more motion or something.
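Roughly what that composition plus passthrough step looks like (a sketch with made-up names using the older synchronous accessors; error handling trimmed, and it only carries the asset-level metadata, the track metadata being a separate job):

import AVFoundation

// Sketch: put the re-encoded video track into a composition, reattach the original
// top-level metadata, and export with a passthrough preset so nothing gets re-encoded.
func copyMetadata(from original: AVAsset, onto reencoded: AVAsset, to outputURL: URL) {
    let composition = AVMutableComposition()
    guard
        let sourceTrack = reencoded.tracks(withMediaType: .video).first,
        let compTrack = composition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return }
    try? compTrack.insertTimeRange(CMTimeRange(start: .zero, duration: reencoded.duration),
                                   of: sourceTrack, at: .zero)

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetPassthrough) else { return }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.metadata = original.metadata
    export.exportAsynchronously {
        print("export status:", export.status.rawValue)
    }
}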
 
Hello. Could you show how and what metadata you add to the video to make the live wallpaper work?
 
Could you give a procedure for the re-encode and metadata steps so we can try to reproduce it?
 
So I think I figured out how the apps are doing this.

Source: https://developer.apple.com/documen...live_photos/capturing_and_saving_live_photos/


func saveLivePhotoToPhotosLibrary(stillImageData: Data, livePhotoMovieURL: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }

Honestly, I've only skimmed the page for right now, but just based on this bit alone I'm thinking the guard status might be why shared photos aren't working, and the MovieURL might be why it's so hard to import.

At some point I'm gonna sit down and try to work out a Playgrounds project that's simple & can handle the saving part if nothing else. That way at least anyone with an iPad / Mac can copy the text from mine and run their own custom app for it in a few minutes.
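For the saving half, here is a minimal sketch of how that sample typically continues; the paired-resource calls below follow the standard Photos pattern, so treat it as a starting point rather than a guarantee the result passes the iOS 17 wallpaper check:

import Photos

// Sketch: save a paired still + movie as one Live Photo asset.
func saveLivePhoto(stillImageData: Data, livePhotoMovieURL: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            let request = PHAssetCreationRequest.forAsset()
            let options = PHAssetResourceCreationOptions()
            request.addResource(with: .photo, data: stillImageData, options: options)
            request.addResource(with: .pairedVideo, fileURL: livePhotoMovieURL, options: options)
        }, completionHandler: { success, error in
            if !success { print("Live Photo save failed:", error?.localizedDescription ?? "unknown error") }
        })
    }
}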

Side Note: I don't have anywhere else to mention this, but I think it's genuinely cool that you can develop, maintain, and publish a full app to the App Store solely from an iPad.
I agree with your side note. I love the iPad, and it's become my primary computer. It does a good job of balancing simplicity with functionality, something that macOS doesn't pull off quite as well. I like macOS too, but it is more complex, and often the same functionality is a little less intuitive and more complicated. I think next year will be a really big year for the iPad. 👍🏻
 
Read this entire thread. Does anyone know if there was software that worked on iOS 15 for making live wallpapers on your PC or Mac? I am wondering if that software would just update to support iOS 17. There are plenty of app makers making money on their wallpapers as we speak, and they have it all figured out. Sadly there are no how-tos yet.
 
I've also been trying to figure out how to run live wallpapers on iOS 17, and recently came across this repository with a working example of wallpaper generation for iOS 17. I haven't gone into the implementation details yet, just checked out the example.

 
It works on iOS 17 because the sample HEIC and MOV already have the necessary tags, and the example is just combining them into a Live Photo object. Looks like those files are downloaded from the wallpaper now app. QuickTimeHelper.swift will not produce a Live Photo that works as an iOS 17 wallpaper.
 
Interestingly enough, everything seems buggy: many live photos taken with an iPhone don't work, or the motion really changes. Wondering what the limitations would be for custom ones?
 
I also submitted a bug report with this content: "Using the Photos app in iOS 17, record a live photo. Try to apply it as a wallpaper and it works. Now edit the same live photo, move to a different frame and press Make Key Photo, then save [tap Done]. Try to apply this edited live photo as a wallpaper and you get the annoying message: Motion Not Available. This bug can also be reproduced in Swift with the Photos framework: PHLivePhotoEditingContext(livePhotoEditingInput:), apply a filter in context.frameProcessor, and save with context.saveLivePhoto(to: output). The same Motion Not Available message will appear. I think this is a major bug." The culprit seems to be inside the Photos framework itself. On my iPad, for example, no recorded live photo can be applied as a wallpaper.
 
If you analyze a recorded live photo you see it has 2 frames of image type and more of video type [I counted around 60 on average]. So the old format [jpg + mov] will not work now no matter how you try it. Even if the new files have a .jpg extension, they are generated from HEIC and contain relations to the video segments. We need to find a way to run two writers: one for the HEIC image container and one for the MOV with some metadata. Being undocumented, we have a low chance of finding a way to compose those files programmatically. I suppose those who succeeded wrote the atom part of a recorded live photo {the mebx atom, which contains the timed metadata}, but I have no idea how this was done. I wrote the HEIC container part for multiple frames in the same file, but need more documentation beyond this. Some of the new movies we see in wallpapers now were compressed with MainConcept, as exiftool reports, but I have no interest in buying it, just saying. So in my opinion we first need to decode the right relation between the images and frames in the HEIC of a live photo recorded with the camera, and then build a writer with those rules, because the whole "secret" stays in the HEIC file and its one-to-one relation to the video frames.
 