I suspect that you are correct. The problem seems to come back to this again and again. By forcing HDR on some shots, the camera module and processor inside the iPhone are layering images that capture various tonal values within the scene, sometimes miscalculating when combining the frames that carry detail with the frames captured for illumination and lighting. There DOES seem to be an additional issue of the camera AF refusing to work properly...

Something else I noticed while using the ProCam app is that I could feel the OIS mechanism vibrating inside the camera when I shifted the phone in my hand. It's not something I could feel when using the native Camera app. I'm curious to figure out why that is happening as well.

View attachment 1902227

Last night I conducted a comparison from my desk. It wasn't particularly scientific, but I think it translated well to the real world. Every single picture was taken from a seated position with both hands holding the phone very carefully. The AF reticle was allowed to settle and lock exposure and focus before I took each picture. The target zone was the face of this wristwatch. I was surprised to see that every single picture taken with the 3x lens in the Camera app was smeared. Some were worse (far worse) than others. The same applied when JPEG was selected instead of Apple's preferred HEIC format. Here's an example of one of the cropped JPEG images...

View attachment 1902241

Switching to the ProCam app instead, I took three pictures in TIFF format with the SMRT feature turned both off and on. All of these shots were close to ideal. Certainly what I would expect from the iPhone 13's cameras, especially the 3x (77mm) f/2.8 camera.

As far as I am concerned, the problem lies with Apple's algorithms and computational software. If I can get virtually perfect shots with the ProCam app, or at least shots that match the best images from the Apple Camera app, then I'm happy to keep using this smartphone... but Apple needs to SORT THIS OUT A.S.A.P. You can see the full scene below - but the method used to try to nail the AF on each shot was perhaps even more stable and careful than the methods used by the average shooter. I'm both disappointed that I can't trust the Apple software/app and happy that I can at least be assured of reasonably good results from the ProCam app that I bought years ago. But as for Apple, I find this completely unacceptable. (Apple is welcome to reach out to me on this.)

View attachment 1902236

Regards,

Marco
I appreciate the work you put into this!

While I am disappointed in Apple that they shipped such great hardware with poorly tuned software, I am also relieved to know that this issue can be resolved with a software update.

So the next step is getting Apple’s attention. Because this will all be for naught if Apple doesn’t act on it.
 
Just to reiterate: I hope everyone is fully aware that the iPhone's native Camera app does not always stick to the lens you think it does. Shooting tele can actually use a cropped image from the wide camera; Apple simply decided that in some circumstances it's better to do it this way.
 
Because of some trouble with my Android phone (S10+), I was considering a 13 Pro. Unfortunately, it seems the photography situation is the same: to get anywhere near decent photo quality, avoid the stock camera app like the plague and move to a third-party app. Google may be the exception, but after they killed the Nexus line just when I bought into it, I will never buy from Google again. They have a tendency to kill things off at random, just when you're invested in them.
 
Who is going to return the 13P/PM? The clock is ticking for me... Verizon allows 30-day returns, and I am not looking forward to leaving the Appleverse, in terms of phones at least. This is frustrating. I could revert to my XR and hope the 14 is better, or maybe get a 12.
Any chance Apple can fix this with an update?

The issue is software, not hardware.
I’m keeping my 13PM - it is a work phone so not really an option.

I have several workarounds I am trying; however, I really hope Apple fixes this.
For many photos I am using the Halide app.
 
Just to reiterate: I hope everyone is fully aware that the iPhone's native Camera app does not always stick to the lens you think it does. Shooting tele can actually use a cropped image from the wide camera; Apple simply decided that in some circumstances it's better to do it this way.

They have been doing this, as far as I know, since the 11 Pro Max.
PITA!
 
Because of some trouble with my Android phone (S10+), I was considering a 13 Pro. Unfortunately, it seems the photography situation is the same: to get anywhere near decent photo quality, avoid the stock camera app like the plague and move to a third-party app. Google may be the exception, but after they killed the Nexus line just when I bought into it, I will never buy from Google again. They have a tendency to kill things off at random, just when you're invested in them.

IMO that was a good thing though.
Nexus was a Google / partner collaboration while Pixel is all Google.
 
In good light the telephoto produces very good results!
F987EEA0-4E2D-484D-8C0A-DB29EF9BCBFA.jpeg
 
IMO that was a good thing though.
Nexus was a Google / partner collaboration while Pixel is all Google.
Officially, yes. Behind the scenes, of course, companies like LG or HTC still produce them, but there's more Google control over what ends up in the phone, indeed. Still, I have no confidence in Google's determination to continue projects for the long term: https://killedbygoogle.com/
 
Just to reiterate: I hope everyone is fully aware that the iPhone's native Camera app does not always stick to the lens you think it does. Shooting tele can actually use a cropped image from the wide camera; Apple simply decided that in some circumstances it's better to do it this way.
This. I wish people would post the photo metadata along with their photos so that we can see exactly what is going on. It tells us which lens was used and whether the shot was digitally zoomed.

Just as an example, a person early on in this thread thought a bad picture was taken with the telephoto lens (3x), when it was actually taken with the ultra wide lens (0.5x) and digitally zoomed in 6x. We would not have known that without the metadata info.
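Two EXIF fields are enough to reconstruct what happened in cases like that one. Here is a minimal Python sketch; the aperture and 35mm-equivalent focal length pairs are assumptions based on the numbers quoted in this thread (wide f/1.5 / 26mm, telephoto f/2.8 / 77mm) plus an assumed f/1.8 / 13mm ultra wide, not authoritative specs:

```python
# Sketch: identify which iPhone 13 Pro camera actually fired, and how
# much digital zoom was applied on top of it, from two EXIF fields.
# Aperture/focal-length pairs are assumptions, not official specs.
LENSES = {
    1.8: ("ultra wide (0.5x)", 13.0),
    1.5: ("wide (1x)", 26.0),
    2.8: ("telephoto (3x)", 77.0),
}

def identify_lens(exif: dict) -> tuple[str, float]:
    """Return (lens name, digital zoom factor) for an EXIF-like dict
    containing 'FNumber' and 'FocalLengthIn35mmFilm'."""
    f_number = round(float(exif["FNumber"]), 1)
    if f_number not in LENSES:
        raise ValueError(f"unexpected FNumber: {f_number}")
    name, native_35mm = LENSES[f_number]
    # Any framing longer than the lens's native equivalent is a crop.
    zoom = float(exif["FocalLengthIn35mmFilm"]) / native_35mm
    return name, round(zoom, 1)

# The example above: a '3x' shot that was really the ultra wide camera
# digitally zoomed 6x (13mm x 6 = 78mm equivalent).
print(identify_lens({"FNumber": 1.8, "FocalLengthIn35mmFilm": 78}))
# -> ('ultra wide (0.5x)', 6.0)
```

With a real photo, the same two fields can be pulled out with any EXIF reader (exiftool, or the metadata panel in the Photos app); the plain dict above just stands in for that.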
 
If enough of us send feedback, or if someone high-profile [a news site or app developer] makes a report, they will.

Honestly, based on the anecdotal reports on how Apple responds to feedback, I think the latter option is what really gets Apple’s attention.
Where is @gordonkelly from Forbes when you need him? He bashes Apple for every little trivial issue for clickbait. Here is an alarming issue, and he is quiet as a church mouse.
 
I suspect that you are correct. The problem seems to come back to this again and again. By forcing HDR on some shots, the camera module and processor inside the iPhone are layering images that capture various tonal values within the scene, sometimes miscalculating when combining the frames that carry detail with the frames captured for illumination and lighting. There DOES seem to be an additional issue of the camera AF refusing to work properly...

Something else I noticed while using the ProCam app is that I could feel the OIS mechanism vibrating inside the camera when I shifted the phone in my hand. It's not something I could feel when using the native Camera app. I'm curious to figure out why that is happening as well.

View attachment 1902227

Last night I conducted a comparison from my desk. It wasn't particularly scientific, but I think it translated well to the real world. Every single picture was taken from a seated position with both hands holding the phone very carefully. The AF reticle was allowed to settle and lock exposure and focus before I took each picture. The target zone was the face of this wristwatch. I was surprised to see that every single picture taken with the 3x lens in the Camera app was smeared. Some were worse (far worse) than others. The same applied when JPEG was selected instead of Apple's preferred HEIC format. Here's an example of one of the cropped JPEG images...

View attachment 1902241

Switching to the ProCam app instead, I took three pictures in TIFF format with the SMRT feature turned both off and on. All of these shots were close to ideal. Certainly what I would expect from the iPhone 13's cameras, especially the 3x (77mm) f/2.8 camera.

As far as I am concerned, the problem lies with Apple's algorithms and computational software. If I can get virtually perfect shots with the ProCam app, or at least shots that match the best images from the Apple Camera app, then I'm happy to keep using this smartphone... but Apple needs to SORT THIS OUT A.S.A.P. You can see the full scene below - but the method used to try to nail the AF on each shot was perhaps even more stable and careful than the methods used by the average shooter. I'm both disappointed that I can't trust the Apple software/app and happy that I can at least be assured of reasonably good results from the ProCam app that I bought years ago. But as for Apple, I find this completely unacceptable. (Apple is welcome to reach out to me on this.)

View attachment 1902236

Regards,

Marco
Nice Tissot!
 

Attachments

  • IMG_3689.JPG
Nice Tissot!
HDR sucks and isn't needed. I have gotten good pics from my 13 PM, but it does do some odd things when "stacking" the file during processing - the image looks blurry and then fine, but it's unnecessary.

I do have to say that for my use so far it has not been terrible. Ultra wide and wide are fine for me, and I don't go above 3x.
 
Last edited:
This. I wish people would post the photo metadata along with their photos so that we can see exactly what is going on. It tells us which lens was used and whether the shot was digitally zoomed.

Just as an example, a person early on in this thread thought a bad picture was taken with the telephoto lens (3x), when it was actually taken with the ultra wide lens (0.5x) and digitally zoomed in 6x. We would not have known that without the metadata info.

It's exhausting to see people go into such 'technical depth' over and over about the issue here while not understanding the fundamental ways in which Apple's image capture pipeline works.
 
This. I wish people would post the photo metadata along with their photos so that we can see exactly what is going on. It tells us which lens was used and whether the shot was digitally zoomed.

I posted an example of the issue in another thread here, where I zoomed in fully and the main lens was still used.

More examples of the issue below, with metadata. These are screenshots from the gallery.
To get the correct lens in use, I had to tap a bright area.
The last example is weird because the distance to the subject is the same as with the 77mm shots, yet it says 26mm, and the ultra wide is producing a smeared effect even in RAW.

IMG_4063.JPEG

IMG_4064.JPEG

IMG_4065.JPG

IMG_4066.JPEG

IMG_4085.JPG
 
My biggest issue with the 13 Pro lineup is how it over-sharpens everything and how severely lacking the dynamic range is.

In tough lighting conditions the highlights are totally blown out, which makes a shot useless. I highly doubt Apple will fix any of these issues; they'll just save the advancements for the 14 Pro instead.

I take a lot of pictures, not so many videos, so I'm definitely thinking of making my Pixel 6 Pro my daily driver now. So disappointed in Apple; they have a lot of catching up to do when it comes to computational photography.
 
My biggest issue with the 13 Pro lineup is how it over-sharpens everything and how severely lacking the dynamic range is.

In tough lighting conditions the highlights are totally blown out, which makes a shot useless. I highly doubt Apple will fix any of these issues; they'll just save the advancements for the 14 Pro instead.

I take a lot of pictures, not so many videos, so I'm definitely thinking of making my Pixel 6 Pro my daily driver now. So disappointed in Apple; they have a lot of catching up to do when it comes to computational photography.
If it's a software issue, I think Apple will fix it. They won't want the iPhone 13 Pro to get a bad reputation. If it's hardware, then yes, they'll probably save the fix for the 14 Pro.

Does anyone remember 'Beauty-gate' on the iPhone XS? They fixed that a couple of months later, with iOS 12.1.
 
If it's a software issue, I think Apple will fix it. They won't want the iPhone 13 Pro to get a bad reputation. If it's hardware, then yes, they'll probably save the fix for the 14 Pro.

Does anyone remember 'Beauty-gate' on the iPhone XS? They fixed that a couple of months later, with iOS 12.1.

I truly hope so, because the photos I've taken with my 13 Pro in harsh conditions do not look good at all. Highlights are blown out, shadows are totally crushed, and everything looks like someone maxed out the sharpness slider.

I can see how some people may find a super-punchy photo appealing, but you've got to have balance. The Pixel 6 Pro at times brings the shadows up way too much, but you can actually see what you're shooting - that's the difference. The Pixel 6 Pro also adds sharpening, and at times it's too much as well, but not the way the 13 Pro does it, where everything ends up looking artificial.
 
Does anyone want to share the methods (or settings) they use to ensure the best results from their iPhone 13 Pro cameras? In ideal lighting, the 77mm (3x) lens can still produce decent results, but I'll avoid it now unless I really have a need for it.

Are you all mostly shooting RAW, and is this for image integrity? I can tell the self-described "Pros" on YouTube were happy with all their RAW shots... but Portrait Mode (which uses computational manipulation and depth mapping from the LiDAR to produce superb-looking 'fake bokeh') sure comes in handy from time to time... and I'm a photographer who usually shoots with f/1.2 lenses. It's pretty darned decent, and you can imagine folks going nuts with this feature just because of the effect it produces.

The standard wide-angle lens is likewise just fine; I like all the shots from that lens. The ultra wide lens is a bit of a novelty, though again, it's routinely useful and practical. The Macro capability (a feature of the ultra wide lens) is limited precisely because it's an ultra wide lens, but it's fun to have on hand. I was just able to make out the serial number on a diamond with it... and those are laser etched so small that the characters are narrower than the diameter of a human hair. My Canon EF 100mm L Macro lens can't quite resolve it, so that was a surprise.

I find that RAW images (via the ProRAW setting) are still quite removed from the RAW images from a dedicated camera. I've even had an odd shading effect in the sky in a ProRAW image that I can't quite explain (see image below). And the HEIC images are still not close enough to JPEGs when the light is low. But overall, the iPhone 13 is capable of so much more than the older models.

I'm wondering if it's worth creating a YouTube video to get more traction on the shortcomings this thread has addressed. It would have helped if the reviewers fawning over this new smartphone had noticed these issues, because it seems to be a universal problem... and Apple might have taken early steps to resolve it.

IMG_6477X.JPG

(IMAGE) iPhone 13 Pro Max - image shot with ProRAW - odd shading artifact on the right which didn't show up on the HEIC images shot alongside it. I might add that it was visible in the unedited RAW image and is not the result of pushing the Dynamic Range during editing and conversion.
 
Last edited:
If it's a software issue, I think Apple will fix it. They won't want the iPhone 13 Pro to get a bad reputation. If it's hardware, then yes, they'll probably save the fix for the 14 Pro.

Does anyone remember 'Beauty-gate' on the iPhone XS? They fixed that a couple of months later, with iOS 12.1.
I remember Beauty-gate, and it's happening on my 13 Pro. I did numerous comparison shots of my ugly mug using the 11 and the 13P. You can see the pores on my forehead with the 11; with the 13P it's as if a filter has been applied - less detail and an almost 'smoothed or smeared' effect. Disappointing.
 
Does anyone want to share the methods (or settings) they use to ensure the best results from their iPhone 13 Pro cameras? In ideal lighting, the 77mm (3x) lens can still produce decent results, but I'll avoid it now unless I really have a need for it.

Are you all mostly shooting RAW, and is this for image integrity? I can tell the self-described "Pros" on YouTube were happy with all their RAW shots... but Portrait Mode (which uses computational manipulation and depth mapping from the LiDAR to produce superb-looking 'fake bokeh') sure comes in handy from time to time... and I'm a photographer who usually shoots with f/1.2 lenses. It's pretty darned decent, and you can imagine folks going nuts with this feature just because of the effect it produces.

The standard wide-angle lens is likewise just fine; I like all the shots from that lens. The ultra wide lens is a bit of a novelty, though again, it's routinely useful and practical. The Macro capability (a feature of the ultra wide lens) is limited precisely because it's an ultra wide lens, but it's fun to have on hand. I was just able to make out the serial number on a diamond with it... and those are laser etched so small that the characters are narrower than the diameter of a human hair. My Canon EF 100mm L Macro lens can't quite resolve it, so that was a surprise.

I find that RAW images (via the ProRAW setting) are still quite removed from the RAW images from a dedicated camera. I've even had an odd shading effect in the sky in a ProRAW image that I can't quite explain (see image below). And the HEIC images are still not close enough to JPEGs when the light is low. But overall, the iPhone 13 is capable of so much more than the older models.

I'm wondering if it's worth creating a YouTube video to get more traction on the shortcomings this thread has addressed. It would have helped if the reviewers fawning over this new smartphone had noticed these issues, because it seems to be a universal problem... and Apple might have taken early steps to resolve it.

View attachment 1902532
(IMAGE) iPhone 13 Pro Max - image shot with ProRAW - odd shading artifact on the right which didn't show up on the HEIC images shot alongside it. I might add that it was visible in the unedited RAW image and is not the result of pushing the Dynamic Range during editing and conversion.
I noticed the same thing in my first sets of photos. I got a weird shading effect when I took a shot with the ultra wide camera.

In the attachment, you’ll see the weird shading thing around the subject’s hand. [I cropped the photo.]
 

Attachments

  • 7C8E06E4-29B8-4AF7-9899-530B3321C76E.jpeg
Another Issue surfaces with the iPhone 13 Cameras...

Update: I took a picture of a small insect and DELIBERATELY used the 77mm (3x) to see what I would get. The results were smeary and horrid - the image looked like a poor digital zoom. To my surprise, the EXIF data showed that I had shot it with the 1x lens; the aperture recorded is what gave it away. I assumed that perhaps the camera had switched lenses, but that was just a guess...

On a photographic forum I follow, a fellow iPhone 13 shooter has just mentioned the following: "[I] don't really appreciate it, but have found that occasionally in really low light, if I choose the 77mm camera, especially at close distances, the camera will switch to the 26mm lens with the f/1.5 aperture and digitally zoom to 77mm."

IMG_6194X.JPG


This reinforces my own experience, although I was shooting outdoors in good light on a sunny day when I shot that bug. What the heck is going on with these iPhone 13 camera modules? It would seem that the iPhone will switch between the 3x and 1x lenses when pushed... and then use a digital zoom to make up the 77mm equivalent. I have another example of this: I photographed a close-up of a dried-out dead lizard on the footpath with the 77mm lens, but it switched on me again and used the 1x lens with digital zoom applied (see attached images). The resulting image that was supposed to be from the 3x (77mm) lens was smeared and devoid of detail, just like the bug. It was annoying enough trying to get a macro shot with the lenses constantly switching back and forth; fortunately, Apple sorted that out with a new setting added in an update.
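The behaviour described in these reports can be summarised as a decision rule: if the scene is too dark or the subject too close for the telephoto, fall back to the wide camera and digitally crop to fake the 77mm framing. Here is a toy model in Python - the threshold values are invented placeholders for illustration, since Apple's real heuristics are not public:

```python
# Toy model of the lens-switching behaviour described in this thread.
# Threshold values are invented placeholders - Apple's actual heuristics
# are not public - but the structure matches what the EXIF data suggests.
TELE_MIN_LUX = 50.0      # assumed: below this, the tele is "too dark"
TELE_MIN_FOCUS_M = 0.6   # assumed telephoto minimum focus distance

def choose_camera(scene_lux: float, subject_distance_m: float) -> tuple[str, float]:
    """Return (camera, digital zoom factor) for a requested 3x shot."""
    if scene_lux >= TELE_MIN_LUX and subject_distance_m >= TELE_MIN_FOCUS_M:
        return "telephoto 77mm f/2.8", 1.0
    # Fall back to the wide camera and crop to a 77mm-equivalent field
    # of view: the case that produces the smeared 'fake 77mm' images.
    return "wide 26mm f/1.5", round(77 / 26, 2)

print(choose_camera(10_000, 0.15))  # bright macro: still falls back to wide
```

A model like this would also explain the sunny-day bug shot: a close focus distance alone is enough to trigger the fallback, regardless of how much light there is.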

Regards,

Marco
 