So, what should Apple do now? They’ve provided Photographic Styles, yet many still shoot with the default settings and complain.
Photographic Styles don’t override the underlying processing, though. To me they’re little more than filters applied on top of an already-processed image.
I chose my old iPhone 7 over the iPhone 15 for my European trip specifically for the photos. The difference is striking, especially in color science. The iPhone 7 captures moments accurately, with realistic colors that would look very muted to an iPhone 15 user, and proper skin tones. Overcast skies are allowed to stay overcast — it doesn’t do the fake brightening of faces and objects, and it keeps more natural contrast and depth when it’s cloudy. The 15 brightens everything regardless of which Photographic Style is applied. Even if you decrease the saturation, turn off Night mode, etc., it still produces messy images and trashes the lighting.
The 7 is just far more neutral and transparent, willing to let the scene set the tone instead of wrestling with it. The 15 over-lights everything, ruining the vibe, the faces, and the nostalgia when looking back at the photos. And for me, the whole reason I take a photo is so it can bring me back somewhere. Especially on a large display, the iPhone 15’s scenes look surreal, like a realistic watercolor painting or an AI-generated image. I just don’t connect with its photos as well. To be fair, the 15 is a notable improvement over the 13 in exactly these respects — the 13 was an absolute mess of a camera in those aspects (worse than the 12, imo).
I suppose you trade the grain and noise of earlier iPhones for this, but I’m not sure that’s a good trade. I prefer the grain to what looks like AI-generated crisp “fill” that replaces the missing detail, where noise used to be, with random artifacts. From what I’m seeing, the 16 is a step backwards from the 15, looking even more AI-generated, aggressive, and flat.