Another Issue surfaces with the iPhone 13 Cameras...

Update: I took a picture of a small insect and DELIBERATELY used the 77mm (3x) lens to see what I would get. The results were smeary and horrid; the resulting image looked like a poor digital zoom. To my surprise, the EXIF data showed that I had shot this with the 1x lens. The aperture used is what gave it away. I assumed that perhaps the camera had switched lenses, but that was just a guess...

On a photographic forum I follow, a fellow iPhone 13 shooter has just mentioned the following to me: "[I] don't really appreciate it, but have found that occasionally in really low light if I choose the 77mm camera especially at close distances, the camera will switch to the 26mm lens with the 1.5 aperture and digitally zoom to 77mm."

[Attached image]

This reinforces my own experience, although I was shooting outdoors in good light on a sunny day when I shot that bug. What the heck is going on with these iPhone 13 camera modules? It would seem that the iPhone will switch between the 3x and 1x lenses when pushed... and then use a digital zoom to make up the 77mm equivalent. I have another example of this: I photographed a close-up of a dried-out dead lizard on the footpath with the 77mm lens, but the phone switched on me again and used the 1x lens with an applied digital zoom (see attached images). The resulting image that was supposed to be from the 3x (77mm) lens was smeared and devoid of detail, just like the bug. It was annoying enough trying to get a macro shot with the lenses constantly switching back and forth; fortunately, Apple sorted that out with a new setting added in the update.

Regards,

Marco
Yes, in the middle of the thread, other posters noticed this same behavior and posted photos with metadata. I experienced this myself: I took a photo using the 3x [telephoto] lens in the stock Camera app, and when I checked the metadata, the phone had switched to the 1x [wide] lens, overriding my selection.
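
For anyone who wants to check their own shots, the giveaway is in the EXIF, just as it was for me: the recorded lens model and f-number belong to the wide camera even though 3x was selected. Here is a rough sketch of reading those fields with Apple's ImageIO framework (the function name and file URL are placeholders for illustration, not anything from the Camera app itself):

```swift
import ImageIO
import Foundation

/// Prints the lens model, focal length and aperture recorded in a photo's EXIF.
/// Point it at any image exported from the Photos app.
func printLensInfo(for url: URL) {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any] else {
        print("Could not read EXIF metadata")
        return
    }

    let lens = exif[kCGImagePropertyExifLensModel] as? String ?? "unknown lens"
    let focal = exif[kCGImagePropertyExifFocalLength] as? Double ?? 0
    let fNumber = exif[kCGImagePropertyExifFNumber] as? Double ?? 0

    // On a 13 Pro, f/1.5 means the wide (1x) module fired; the 3x telephoto is f/2.8.
    print("Lens: \(lens)")
    print("Actual focal length: \(focal) mm, aperture: f/\(fNumber)")
}
```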
 
The beauty of the iPhone and its cameras is that there is not supposed to be much thinking involved unless you want there to be. If I have to start "thinking" about pointing the camera here, or adjusting something else there, or anything in between, I'll have a good mind to chuck this thing through Tim Cook's office window :mad: That's not the experience most people are wanting, I'd wager...
 
The beauty of the iPhone and its cameras is that there is not supposed to be much thinking involved unless you want there to be. If I have to start "thinking" about pointing the camera here, or adjusting something else there, or anything in between, I'll have a good mind to chuck this thing through Tim Cook's office window :mad: That's not the experience most people are wanting, I'd wager...
Precisely this.
 
Lens flare?
Definitely a lens flare arc. Apple appears to have optically coated the lenses on the iPhone 13 Pro, and these coatings may be on the internal elements as well to reduce ghosting. However, the rims of the outer lens elements appear to create arc-shaped internal reflections that usually show up when the sun is near the edge of the frame. This is why some filter companies today use black-coated rims to reduce the chance of internal flares. The flares on the iPhone 13 Pro models (and presumably the regular 13 models) are usually subtle if the sun isn't too far into the frame. They tend to have a distinct arc shape with a wide curve when generated by the iPhone 13.
 
If I remember correctly, lens flare is a result of the outer sapphire glass.

If we put anti-flare coatings on it, that would solve one problem, but those coatings would easily get scratched or damaged, which would hurt the image quality of every photo we took.

So it's a trade-off: uncoated sapphire glass will flare, but it won't scratch.
 
Yes. Hugely disappointed by the camera in my 13 Pro Max. I get that strange thing you describe of seeing the photo for a fraction of a second without changes, and then it sharpens so much it becomes almost unusable. I have tried to salvage some in Photoshop, but most photos can't be used.
And the wide lens results are also really bad. They look OK on the phone, and I was really excited to use it, but when I zoom in on the photos they have no detail at all. Very blocky. Again, photos you can't use anywhere.
I am really shocked that the photos I got on my previous phone, an XS Max, were so much better. Confused about the reviews I see. Perhaps the phone is faulty?
Oddly, the video so far seems really great!
Here is one photo taken with the 1x lens; see the excessive sharpening. The second photo shows the opposite problem: a lack of any detail in wide shots.
ADA35062-D308-48D7-AFA6-C0D186408176.jpeg
 

Attachments

  • E617611C-C495-454A-B6DE-E00EE0D3A1F2.jpeg
If I remember correctly, lens flare is a result of the outer sapphire glass.

If we put anti-flare coatings on it, that would solve one problem, but those coatings would easily get scratched or damaged, which would hurt the image quality of every photo we took.

So it's a trade-off: uncoated sapphire glass will flare, but it won't scratch.
Lens flare can occur any time the sun hits the sensor at the right angle. Removing the sapphire glass will not completely remove lens flare; it's just physics and has been around forever. It occurs with expensive DSLR lenses as well. That's why you get a lens hood when you purchase a lens: it helps reduce the effect if desired. Sometimes lens flare can actually add to the character of an image.

Each lens reacts to flare in its own way, even lenses of the same model, because of imperfections and small differences in the structure of every lens.

You can eliminate or reduce flare by moving the camera slightly or by shading the lens with a hand. Apple did not create the problem and won't be able to eliminate it, at least not completely. Maybe someone will create a lens hood for the iPhone. :cool:
 
Here's a shot I took with my Pixel 6 Pro and another from my 13 Pro. Which do you guys prefer?

I can tell you guys right now the Pixel 6 Pro nailed the color temperature here; it was a warm-colored afternoon as the sun was going down. I do prefer the way my boy Gotti looks in the iPhone shot, since it emphasizes his black fur better, but the overall photo didn't represent how the scene looked color-wise.

If only we could get the Pixel's processing/camera on an iPhone... One can dream, right? Lol
 

Attachments

  • PXL_20211104_215411662.jpg
  • 3F2AA236-F239-4988-9179-B78527D10E77.jpeg
Here's a shot I took with my Pixel 6 Pro and another from my 13 Pro. Which do you guys prefer?

I can tell you guys right now the Pixel 6 Pro nailed the color temperature here; it was a warm-colored afternoon as the sun was going down. I do prefer the way my boy Gotti looks in the iPhone shot, since it emphasizes his black fur better, but the overall photo didn't represent how the scene looked color-wise.

If only we could get the Pixel's processing/camera on an iPhone... One can dream, right? Lol

For shots of your pets, the Pixel is the best.
 
Here's a shot I took with my Pixel 6 Pro and another from my 13 Pro. Which do you guys prefer?

I can tell you guys right now the Pixel 6 Pro nailed the color temperature here; it was a warm-colored afternoon as the sun was going down. I do prefer the way my boy Gotti looks in the iPhone shot, since it emphasizes his black fur better, but the overall photo didn't represent how the scene looked color-wise.

If only we could get the Pixel's processing/camera on an iPhone... One can dream, right? Lol

A lot of calculations need to be made when converting the signal that hits the sensor into a representation of a scene (or... a photo as we know it). White balance and exposure are really tricky when you think about it, especially since the camera is not only trying to figure out what is actually out there, but also trying to do it in a way that replicates what we see with our own eyes and makes it 'aesthetically pleasing'.
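
To make the white balance part concrete, here is a toy version of the kind of guess a camera has to make: the classic 'grey world' heuristic, written out by hand. This is only an illustration, not what Apple or Google actually ship; real pipelines use far richer scene statistics.

```swift
/// Toy "grey world" auto white balance: assume the scene's average colour
/// should be neutral grey, and scale red and blue so their averages match green.
struct RGB { var r: Double; var g: Double; var b: Double }   // values in 0...1

func greyWorldWhiteBalance(_ pixels: [RGB]) -> [RGB] {
    guard !pixels.isEmpty else { return pixels }
    let n = Double(pixels.count)
    let avgR = pixels.reduce(0) { $0 + $1.r } / n
    let avgG = pixels.reduce(0) { $0 + $1.g } / n
    let avgB = pixels.reduce(0) { $0 + $1.b } / n
    guard avgR > 0, avgB > 0 else { return pixels }

    // Gains that pull the red and blue averages towards the green average.
    let gainR = avgG / avgR
    let gainB = avgG / avgB

    return pixels.map { RGB(r: min($0.r * gainR, 1.0),
                            g: $0.g,
                            b: min($0.b * gainB, 1.0)) }
}
```

A warm sunset is exactly the case that breaks the grey-world assumption (the scene average really is orange), which is why two phones can reasonably disagree about the 'correct' temperature.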

For the last bit, the way our eyes see light, focus and give us an image in our brains has as much to do with physics as it does with neuroscience. We can 'see' a huge number of f-stops when it comes to dynamic range, but what is also happening is that our brain is filling in the gaps and making huge assumptions (kind of like a natural signal-processing system). The blind spot where the optic nerve meets the retina is the most obvious example of this.

As for how this relates to your photos:

Firstly, yes, the Pixel image is much warmer and more aesthetically pleasing to me. How much it truly represented the real-world conditions is something only you would know (and the mind does like to play tricks). You mentioned that Gotti's fur is black and better represented in the iPhone photo, yet the rest of the scene's colours didn't match? That feels slightly contradictory. That being said, iPhones have traditionally shot a bit 'cool' in their colour tone.

I don't like how the front of the nose/mouth is out of focus on the Pixel; it seems to detract from the subject. If we look at the tiny hairs on the upper back of your dog against the background, on the Pixel they seem soft and out of focus, while on the iPhone there is clearer definition. However, the iPhone has done a worse job keeping the hairs on the ears in focus. You have cropped in a little tighter on the iPhone photo, though (I'm not sure of the focal length/aperture of the Pixel), so comparisons are tricky.

With regards to exposure, the Pixel is clearly trying to keep the shadows of the fur visible, while the iPhone is choosing to crush those blacks a lot more. In real life, the fur would probably feel about as visible as in the Pixel shot, but that's likely because our eyes are very quickly adjusting 'exposure' as we dart them over a scene. It would be interesting to see what is recoverable from the original file if you shot in ProRAW, but I sometimes quite like the darker shadows for portrait-type photos.
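
For what it's worth, 'crushing the blacks' versus 'keeping the shadows' is largely a choice of tone curve applied to the same sensor data. A rough, made-up illustration (these are not the curves either phone actually uses):

```swift
import Foundation

/// Two made-up tone curves applied to normalised luminance (0...1).
/// "Crush" pushes shadow values towards black; "lift" keeps them visible.
func crushBlacks(_ y: Double) -> Double {
    // Anything below 0.08 goes to black; the rest is stretched back to 0...1.
    return max(0, (y - 0.08) / (1 - 0.08))
}

func liftShadows(_ y: Double) -> Double {
    // A gentle gamma < 1 brightens the shadows more than the highlights.
    return pow(y, 0.8)
}

let shadowValue = 0.10                 // dark fur, just above black
print(crushBlacks(shadowValue))        // ~0.022, detail nearly gone
print(liftShadows(shadowValue))        // ~0.158, detail retained
```

Both curves are applied before you ever see the image; a ProRAW file keeps enough headroom that you can pick the curve yourself afterwards.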

All in all, both cameras have made different decisions in producing a final photo. As ever, the big question is what the end goal really is in an age where the majority of people expect a photo to look a certain way and not necessarily like real life.

By the way, have you looked into the Photographic Styles that Apple introduced with the 13? Maybe you can set things to be warmer by default and more to your liking. Remember, this bakes the temperature adjustment in quite early in the image-capture pipeline, and I 'think' it's not something you can retrospectively adjust, so be warned.

I find all of this stuff super interesting and always try to learn more about how photography works on the iPhone. As others have said, 'point and shoot' via the Camera app has its pros and cons. You are handing yourself over to Apple and their processing algorithms, which are designed to be the best average solution for everyone. But for specific shooting conditions, if you know what you are doing and really care about the quality of the shot, switch to a dedicated camera app like Halide.
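
On that last point, the reason apps like Halide can behave so differently is that AVFoundation exposes manual controls the stock Camera app never surfaces. A minimal sketch, assuming you already have a configured capture session; the 1/120 s and ISO 200 values are arbitrary examples:

```swift
import AVFoundation

/// Lock shutter speed and ISO instead of letting the system pick them.
/// Session setup, permissions and error handling are omitted for brevity.
func lockExposure(on device: AVCaptureDevice) throws {
    guard device.isExposureModeSupported(.custom) else { return }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // 1/120 s, with the ISO clamped to what the active format supports.
    let duration = CMTime(value: 1, timescale: 120)
    let iso = min(max(200, device.activeFormat.minISO), device.activeFormat.maxISO)
    device.setExposureModeCustom(duration: duration, iso: iso, completionHandler: nil)
}

// Usage: pick the physical back wide camera explicitly.
if let wide = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    try? lockExposure(on: wide)
}
```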
 
I am having a similar issue with my 13PM, but my problem is that certain ProRAW photos seem to self-destruct as soon as I attempt to edit or export them. The issue is particularly bad with Night Mode images, but I have noticed it is affecting shots taken in good light as well.

I wrote about the issue in full here… https://discussions.apple.com/thread/253309940

I am a pixel peeper, but as a rule I tend to focus on the appearance of an image as a whole, and recently I have been trying to focus more on the artistic merits of a shot and allow myself to look past slight technical issues. The one thing I would like is consistency. In my opinion, nothing is more annoying than accepting a shot, only to have it appear completely different when I come to edit it. I agree that the 13 is over-sharpening to quite a degree, which is also a gripe of mine, but this does not offend me nearly as much as my photos having split personalities!

Examples to follow…
 
Screenshot of unedited preview - full frame:
309F2C6D-A19A-4082-8BB6-EA8E6F3D913E.jpeg

Screenshot of UNedited image, full frame, just after pressing “Edit” (using iOS Photos app). Notice there is now less noise and the image has a softer and more natural appearance:
13BBA812-6DF4-4541-8136-6DA2F1E9ACAC.jpeg
Preview - 100% crop
A47A632B-C9A9-4A13-BA53-713A027DE93C.png
Edit window - 100% crop:
FAB4CA7D-828C-42C6-AC2D-9ADB3F9770B0.png

220873A9-4691-4619-8972-08AB3380D078.png
 
IMG_0241.jpeg · 1636127722911.png

IMG_0242.jpeg · 1636127702583.png

IMG_0243.jpeg · 1636127685813.png

My comments: horribly lit restaurant, and I could have screwed up on the meat pic, but for some reason it looks horrible. It could have been me zooming in, but I didn't think I did! (Note the mm difference.) I am not a pro; I'm just posting because I was asked to in the past and figured I'd contribute.

I thought the salad and noodle pics were very nice in quality, but the meat pic was horrible.
 
What I don't understand: if the problem is software-related, as it seems to be, would using another camera app like ProCamera solve it?
 
I've commented on another thread about this issue with my iPhone 13PM, and sad to say it's still going on and happening to others. I know there's such a thing as being overly critical and pixel peeping, but I never had this feeling of uncertainty with my 11PM or my 12PM.

It seems as if the iPhone is making incorrect computational choices. In quite a few scenarios, I have to take multiple shots to get the result I want. It's a bit uncomfortable to deal with, and I guess the hassle is there, but I can take a few more pictures if I care that much. The only real issue is when it's a photo I only have moments to capture and the result is less than ideal.

I also have an issue with the over-processed sharpness and the HDR being way too aggressive. It ruins a few shots where the highlights could have been dialed down. I know shooting with ProRAW can help alleviate this, but do I want to have to go back and edit all my shots? Not really.

This isn't to say that I haven't taken amazing photos with my 13PM; it has happened. I've included two of my favorites that really capture the atmosphere exactly as it was in the moment. I just hope this is all sorted out soon.

I've already submitted feedback to Apple via this form on their website:

I suggest anyone else with this issue, or with feedback on their photos/camera, do the same. It will give Apple more information as they sort through the feedback, and maybe an upcoming software update could fix this up.
 

Attachments

  • IMG_2697.jpg
  • IMG_2412.jpg
Other posters commented that turning off “View Full HDR” in the settings of the Photos app helps. I’ve tried it out, and it does! Photos look more natural [particularly with highlights and shadows], but they’re still on the sharp side.
 
I've commented on another thread about this issue with my iPhone 13PM, and sad to say it's still going on and happening to others. I know there's such a thing as being overly critical and pixel peeping, but I never had this feeling of uncertainty with my 11PM or my 12PM.

It seems as if the iPhone is making incorrect computational choices. In quite a few scenarios, I have to take multiple shots to get the result I want. It's a bit uncomfortable to deal with, and I guess the hassle is there, but I can take a few more pictures if I care that much. The only real issue is when it's a photo I only have moments to capture and the result is less than ideal.

I also have an issue with the over-processed sharpness and the HDR being way too aggressive. It ruins a few shots where the highlights could have been dialed down. I know shooting with ProRAW can help alleviate this, but do I want to have to go back and edit all my shots? Not really.

This isn't to say that I haven't taken amazing photos with my 13PM; it has happened. I've included two of my favorites that really capture the atmosphere exactly as it was in the moment. I just hope this is all sorted out soon.

I've already submitted feedback to Apple via this form on their website:

I suggest anyone else with this issue, or with feedback on their photos/camera, do the same. It will give Apple more information as they sort through the feedback, and maybe an upcoming software update could fix this up.
Really good, informative post. Thanks.
To my eyes, those shots look really good, especially the first one. Tack sharp but not overdone. The second seems a bit underexposed in the darker areas, but that's a tough lighting situation. Still a really good shot. Would you mind sharing the EXIF for these images?
 
My comments: horribly lit restaurant, and I could have screwed up on the meat pic, but for some reason it looks horrible. It could have been me zooming in, but I didn't think I did! (Note the mm difference.) I am not a pro; I'm just posting because I was asked to in the past and figured I'd contribute.

On the meat pic, the native camera app decided to switch settings considerably. You are shooting at 1/30 (i.e. more prone to motion blur), and it looks like you have somehow switched lenses too (notice how, below, the info doesn't say 26mm like the others). There are a few possible reasons for this; one is that the mainly dark background of the meat pic made your phone think it was an even lower-light situation, and so it switched things up. The result is the photo you have.

Yes, it looks bad, and in an ideal world it shouldn't be happening.

But it's probably a case where Apple has found that if they don't switch cameras, you get an unusable image. Maybe the phone detected you were shaking as well?

The solution is to switch to Halide or another app and play around with the settings a bit more manually.
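
To expand on that a little: the stock Camera app shoots through the 'virtual' triple-camera device, which is what allows iOS to quietly substitute a cropped shot from the wide module. A third-party app can opt out by requesting the physical telephoto device directly. A rough sketch, assuming an existing AVCaptureSession:

```swift
import AVFoundation

/// Ask for the physical 3x telephoto module rather than the virtual triple
/// camera, so iOS cannot silently swap in a cropped shot from the wide lens.
/// The trade-off: no automatic fallback when you get closer than the
/// telephoto's minimum focus distance.
func addTelephotoInput(to session: AVCaptureSession) {
    guard let telephoto = AVCaptureDevice.default(.builtInTelephotoCamera,
                                                  for: .video,
                                                  position: .back),
          let input = try? AVCaptureDeviceInput(device: telephoto) else {
        print("No telephoto module on this device")   // e.g. non-Pro models
        return
    }
    session.beginConfiguration()
    if session.canAddInput(input) {
        session.addInput(input)
    }
    session.commitConfiguration()
}
```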
 
Other posters commented that turning off “View Full HDR” in the settings of the Photos app helps. I’ve tried it out, and it does! Photos look more natural [particularly with highlights and shadows], but they’re still on the sharp side.

But in that case, that only changes how photos are viewed on that device, right? Not how they look when you share them or view them on other devices.
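
As far as I understand it, yes: the toggle only changes how that device's screen renders the extra highlight range. The HDR information itself is a gain map embedded in the HEIC and travels with the file. If you want to confirm it is still there after sharing, something like this should work (a sketch using ImageIO on iOS 14.1 or later; the URL is just a placeholder):

```swift
import ImageIO
import Foundation

/// Returns true if the image file still carries Apple's HDR gain map,
/// regardless of how "View Full HDR" is set on the viewing device.
func containsHDRGainMap(at url: URL) -> Bool {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return false }
    let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0,
                                                         kCGImageAuxiliaryDataTypeHDRGainMap)
    return info != nil
}
```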
 