The recent iOS 16 Beta 2 release indicates Apple have been testing a 3:2 aspect ratio stills mode for the Camera app. This setting isn't exposed to the user and hasn't been listed as an official feature of iOS 16 (as far as I'm aware), so either Apple will quietly introduce it in a later update, or they're working on something for the iPhone 14. Note that the current default aspect ratio of iPhone shots is 4:3.
I've long been requesting that the next generation of Photographic Styles move beyond simple contrast and saturation/warmth tweaks into the realm of film simulation profiles, much like Fujifilm's digital cameras offer today. Film simulation is far more advanced (and opinionated) than what Apple currently offers: it also plays with artificial grain, halation, and fundamental shifts in color tone (think Lightroom's Color Calibration) rather than simple saturation adjustments. Artificial grain, for example, isn't as simple as slapping a grain filter onto an image the way many editing apps do today, because real film grain changes in appearance depending on how much light the film receives in a given portion of the frame.
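To illustrate that last point, here's a minimal toy sketch (in Python, with a made-up amplitude curve, not any real film stock's measured response) of luminance-dependent grain: the noise amplitude peaks in the midtones and vanishes in pure blacks and pure whites, instead of being layered on uniformly like a flat grain overlay.

```python
import random

def add_film_grain(pixels, strength=0.08, seed=42):
    """Toy luminance-dependent grain on a 2D grid of floats in [0, 1].

    The grain amplitude follows an inverted parabola: zero at pure
    black and pure white, maximal at mid-gray. This is a stand-in for
    the real (and more complex) way grain visibility tracks exposure.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    out = []
    for row in pixels:
        new_row = []
        for lum in row:
            # Amplitude peaks at lum = 0.5 and falls to 0 at the extremes.
            amplitude = strength * 4.0 * lum * (1.0 - lum)
            noise = rng.gauss(0.0, amplitude)
            # Clamp the result back into the valid [0, 1] range.
            new_row.append(min(1.0, max(0.0, lum + noise)))
        out.append(new_row)
    return out
```

A flat grain filter would perturb every pixel equally; here a pure-black or pure-white pixel passes through untouched while midtones pick up the most noise, which is (very roughly) the behavior the post is describing.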
One of the most commonly agreed-upon rumors regarding the 14 Pro series is the move to a 48MP sensor after seven years of 12MP sensors (dating back to the 6S in 2015), so there's no doubt the camera will, as usual, be a massive focus of this year's Pro model. Beyond camera hardware upgrades, Apple like to introduce new camera software features to accompany the hardware improvements...
So what does the 3:2 aspect ratio have to do with all this? Well, that's the aspect ratio of 35mm still film, famously popularized by the Leica camera (and we all know how much Steve Jobs and Apple love Leica).
I believe this year's Pro series software feature will be the introduction of "AI powered" film simulation profiles built right into the Camera app, baked in at capture time much like Photographic Styles are today.
It will be "AI powered" because they'll say the neural engine is figuring out how to perfectly replicate film signatures (such as the accurate application of grain I mentioned) across the image using "raw camera data", as opposed to "dumb" flat filters that look janky. This will also be their excuse for why older iPhones like the 13 Pro can't have it, just as they claimed with Photographic Styles.
I think Apple will continue to build customizable, unique, at-capture-time photographic styles like this, not only because they render better than post-capture filters in 3rd party apps but also because it shifts the yearly camera comparison of "Here's what iPhone 14 vs. Pixel 7 photos look like" into something less binary. You're no longer comparing one company's single algorithm to another; instead, the customer is creating their own camera by choosing how they want everything to look. Photographic Styles already does this, but film simulation takes it to another level.
Thoughts? Of course there's no guarantee they'll do this, and film simulation has long been on my list of "things I want from Apple that likely won't happen", but seeing the 3:2 aspect ratio leak instantly made me think of film cameras.