For example, looking up the hill from Lawrence Hall of Science. Zoom in on the leaves above the fountain, the top-left building's window frames, and the grass/foliage below it: you can actually make out an acceptable amount of detail. The colors are a little deeper too, less washed out than what I've come to expect from iPhone. At first glance there's slightly less yellow/grey dominance than I'm used to, but I need to take a lot more shots to be sure.
There's a confidence in the way this image renders, like the sensor is struggling less than the 13 Pro's did. It's not mind-blowing, but it's enough for me to call it a slight improvement. The biggest downside? I shouldn't have to tell you: those gross artifacts/flares from the lights (visible just above the buildings). Still an unresolved issue carried over from the 12 Pro and 13 Pro.
1x Wide camera, stock Camera app, Night mode at 9 second exposure, no edits.
Crop of the above for reference.
A closeup of the fountain. Everything looks mostly the same as the 13 Pro Max to my eye except for detail. Yes, I think there's a microscopic improvement in color depth, but take a look at the texture of the stone and the water droplets: they're a little more defined than what I would otherwise get from the 13 Pro -- importantly, with less of a perceptible oil-painting effect.
1x Wide camera, stock Camera app, Night mode at 3 second exposure, no edits.
Crop of the above for reference.
Here's a telephoto shot of the same scene. The telephoto hardware didn't get upgraded this year as far as I know, so this is a real test of Photonic Engine. To my eye the water droplet detail has improved significantly, with a lot less oil-painting effect than before. You can make out more individual droplets than the 13 Pro would capture, and the result is a more satisfying fountain texture. There's a little more texture in the stone at the bottom of the frame too. This must be a result of using the raw, uncompressed camera data to kick off the computational photography process, because this kind of micro detail/contrast in the droplets is what I'm more used to seeing in Halide RAW photos (not entirely, but you get the point). There's a sketch of how that kind of RAW capture works after the crop below.
Reminder that this is NOT a ProRAW image.
3x Telephoto camera, stock Camera app, Night mode at 3 second exposure, no edits.
Crop of the above for reference.
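For anyone curious how apps like Halide get at that unprocessed sensor data in the first place, here's a minimal sketch of requesting a Bayer RAW capture through AVFoundation. The session setup and delegate are assumed boilerplate; the point is simply that a third-party app can ask for the raw Bayer output instead of the fully processed photo the stock Camera app saves.

```swift
import AVFoundation

// Minimal sketch (assumes a configured AVCaptureSession with this photo
// output already attached, and a delegate that writes the DNG to disk).
func captureBayerRAW(with photoOutput: AVCapturePhotoOutput,
                     delegate: AVCapturePhotoCaptureDelegate) {
    // Find a Bayer RAW pixel format -- this is the unprocessed sensor data,
    // as opposed to the processed (or ProRAW) output.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isBayerRAWPixelFormat($0)
    }) else {
        print("Bayer RAW capture not available on this device/configuration")
        return
    }

    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```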
Here's an image taken in ProRAW. In my settings I have 48MP mode enabled, but the final image came out at 12MP, so Apple is clearly overriding your 48MP preference in low-light scenarios (there's a quick way to check this yourself after the crop below). That's fine by me given the better detail I'm getting in this shot anyway vs. what the 13 Pro would capture. Look at the wall surrounding the glowing letters. Again, roughly speaking: better detail from Photonic Engine + pixel binning = better image texture = more satisfying photo that looks less "smartphony."
1x Wide camera, stock Camera app, ProRAW, "Magic Wand" one-button edit in Photos app (it primarily increased contrast).
Crop of the above for reference.
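If you want to confirm for yourself whether a given ProRAW shot came out at 48MP or got binned down to 12MP, the quickest check I know of is reading the pixel dimensions out of the DNG. A rough sketch using Image I/O (the file path here is hypothetical):

```swift
import Foundation
import ImageIO

// Rough sketch: read the pixel dimensions from a ProRAW DNG to see whether
// the 48MP preference was honored or the shot was binned down to 12MP.
// The file path is hypothetical.
let url = URL(fileURLWithPath: "/path/to/IMG_0001.DNG")

if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
   let width = props[kCGImagePropertyPixelWidth] as? Int,
   let height = props[kCGImagePropertyPixelHeight] as? Int {
    let megapixels = Double(width * height) / 1_000_000
    // Roughly 8064 x 6048 (~48MP) vs 4032 x 3024 (~12MP) on the 14 Pro's main camera.
    print("Delivered resolution: \(width) x \(height) (~\(Int(megapixels.rounded())) MP)")
} else {
    print("Could not read image properties")
}
```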
Looking down at the lab. An overall cleaner image than what my 13 Pro Max would give me. There are no miracles being performed in terms of bringing up the pitch-black areas of the image (something a Pixel would try to do), but the areas with light are a lot more detailed, as is the sky.
1x Wide camera, stock Camera app, Night mode at 6 second exposure, no edits.