
Attachment: 73F539C8-197E-4E90-9339-9BD586C20AD6.jpeg
Your example clearly shows one of the issues with the main lens. There is clear “ghosting” on the P key and the bracket key.
The ultra wide is sharper.
Do you still have the 12 to compare with the same setup?
Yes. Same distance; the only light was a mini LED lamp in the evening.
13 Pro (left) vs 12 (right).
A little more ghosting on the arrow and square symbols for the 13 Pro. Maybe a shallower zone of sharpness because of ƒ/1.5?

13pro-vs12.jpg
 
That's almost certainly because of ƒ/1.5 as it has less depth of field. The rest looks... almost identical?
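To put a rough number on the aperture difference, here is a quick back-of-the-envelope depth-of-field sketch (my own illustration, assuming a ~5.7 mm actual focal length and a 0.002 mm circle of confusion, and ignoring the different sensor sizes of the two phones), using the standard hyperfocal-distance formulas:

```swift
import Foundation

// Back-of-the-envelope depth-of-field comparison for f/1.5 vs f/1.6.
// The focal length and circle of confusion below are assumptions for
// illustration, not official specs, and the same sensor is assumed
// for both apertures.
func depthOfField(focalLength f: Double,        // mm
                  fNumber n: Double,
                  circleOfConfusion c: Double,  // mm
                  subjectDistance s: Double) -> Double {  // mm
    let hyperfocal = (f * f) / (n * c) + f
    let near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    let far  = s * (hyperfocal - f) / (hyperfocal - s)
    return far - near
}

let subject = 300.0  // mm, roughly a keyboard shot at arm's length
let dofF15 = depthOfField(focalLength: 5.7, fNumber: 1.5,
                          circleOfConfusion: 0.002, subjectDistance: subject)
let dofF16 = depthOfField(focalLength: 5.7, fNumber: 1.6,
                          circleOfConfusion: 0.002, subjectDistance: subject)
print(String(format: "f/1.5: %.1f mm in focus, f/1.6: %.1f mm in focus", dofF15, dofF16))
// With these assumptions the in-focus zone is only about 1 mm shallower at f/1.5.
```

With those assumptions the f-number alone is a small effect; the 13 Pro's larger sensor (and therefore longer actual focal length) shrinks the in-focus zone further, which fits what you're both seeing.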
 
Do any of you check out the shots on the iPhoneography subreddit? How do they compare to yours?

 
I might have found a workaround for this blurry/oversharpening issue.
E131051C-DF67-462D-AC2E-67D7E197E2C5.jpeg

I took one picture using the iPhone Camera app with the wide lens, every setting left at its default, except RAW enabled.

I looked at the photo in the standard iOS Photos app: awful results. It looks like a bad digitally zoomed picture taken with a 2010 smartphone…

Then I opened the exact same RAW picture in the Lightroom mobile app, and the result is much more appealing. I mean, there is real quality in the picture; it is noticeable. No filter or setting modifications applied.

The pic in the attachment is from the iOS Photos app; the pic above is from Lightroom.

The issue is therefore more on the software side than the hardware. Nonetheless, this cannot be ignored or denied! Please, Apple engineers, take my humble investigation as a hint of what’s going wrong with this post-processing…

You're welcome ;)
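For anyone wondering why the same file can look so different: with "raw enabled" the Camera app saves an Apple ProRAW DNG, and a DNG has to be rendered by whatever app opens it, so Lightroom's rendering is simply different from the default rendering in Photos. If you were writing your own capture code, the ProRAW opt-in looks roughly like this (a minimal AVFoundation sketch of my own, not Apple's or Adobe's code):

```swift
import AVFoundation

// Minimal sketch: request an Apple ProRAW capture.
// Assumes `output` is an AVCapturePhotoOutput already attached to a
// configured AVCaptureSession; delegates and error handling are omitted.
func makeProRAWSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard output.isAppleProRAWSupported else { return nil }
    output.isAppleProRAWEnabled = true

    // Pick the first ProRAW pixel format the output advertises.
    guard let rawFormat = output.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) }) else { return nil }

    // Capture the RAW plus a processed companion image, like the built-in app does.
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat,
                                  processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc])
}
```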
 

Attachment: BF9E87E3-DFA8-4F79-B2EE-38D03D45089C.jpeg
That cushion on the right of your lovely dog looks terrible with the sharpening effect. It looks like spaghetti!
 
I've just cancelled my order for an iPhone 13 Pro (Graphite/128GB) as I'm not comfortable with paying £1,000 for a phone when so many people are raising these issues. If it is a software issue and is fixed in a future update I'll probably still get one, but right now, no thanks.
 
I'm fairly sure it's a software thing because I can use Halide to force the use of the telephoto lens and it looks fine.

I just took these two photos out of my office window. No modifications beyond the Photos app's JPEG conversion. As you can see, it's brilliant sunlight, so it's not as if the smaller aperture of the zoom lens should be an issue.

Native camera app, 3x zoom

IMG_0281.jpeg


Halide, 3x zoom

IMG_0282.jpeg


Look at the details in the trees and on the roof. It's night and day.

And, sure enough, looking at the metadata in the Photos app, the native app took it with the wide lens and digitally cropped it.

IMG_0283.pngIMG_0284.png

So I think it's 100% a software issue, with the native camera app being far too aggressive in choosing the wide lens and digitally cropping. Personally I don't think it should be doing that at all, but if it's going to do that it should only be in really adverse lighting conditions, when the zoom physically can't pull in enough light.
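For context on how an app like Halide can pin a lens (a rough sketch of my own, not Halide's actual code): AVFoundation exposes both the virtual triple camera, which does the automatic lens switching and digital cropping, and the individual physical modules, so a third-party app can simply bind to the telephoto and nothing else:

```swift
import AVFoundation

// Sketch: prefer the physical 3x telephoto module over the virtual
// triple camera (which is free to fall back to the wide lens and crop).
// Assumes camera permission has already been granted.
func makeTelephotoInput() throws -> AVCaptureDeviceInput? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInTelephotoCamera],  // physical tele lens only
        mediaType: .video,
        position: .back)
    guard let tele = discovery.devices.first else { return nil }
    return try AVCaptureDeviceInput(device: tele)
}

// The "auto" behaviour comes from the virtual device instead:
// AVCaptureDevice.default(.builtInTripleCamera, for: .video, position: .back)
// switches lenses on your behalf based on light level and focus distance.
```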

As a final experiment, I used the native app, 3x zoom, and put my finger over the wide lens so that it couldn't use it. The photo came out fine:

IMG_0285.jpeg


Edit: And here's the metadata for the final photo:

IMG_0286.png
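If you want to double-check which lens was actually used without relying on the Photos info panel, the lens model is in the EXIF data. Here is a small ImageIO sketch (my own; "photo.heic" is a placeholder path) that prints it:

```swift
import Foundation
import ImageIO

// Sketch: read the lens model straight out of a photo's EXIF data.
// "photo.heic" is a placeholder; point it at an exported original.
func lensModel(of url: URL) -> String? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any]
    else { return nil }
    return exif[kCGImagePropertyExifLensModel] as? String
}

let url = URL(fileURLWithPath: "photo.heic")
print(lensModel(of: url) ?? "no EXIF lens info")
// The string names the module that fired, e.g. a "...telephoto..." entry
// with a longer focal length versus the wide camera's shorter one.
```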
 
So, if one uses this app ‘Halide’, it should solve the issue? Any downsides to using third-party apps?
 
Great investigative work!
Dumb question, but does Halide still use Apple's computational photography (Smart HDR) and all of that on the back end, so it's really just another tool letting you control the camera, with no difference otherwise in image quality?

Sounds like a fantastic work-around option until Apple can fix this issue.
 
Over the last few weeks I’ve noticed the over-processed images my 13 Pro is producing, especially when it comes to skin tones.

I tested Halide and the pictures are toned down and look more like the actual shot. This tells me it’s a software issue that will be fixed in time.

For me, Halide isn’t the solution right now either. No night mode and no video means you’ve got two, possibly three, camera apps if you want to add long-exposure shots.
 
Maybe use a real professional camera instead of buying into the iPhone camera crap?
Some of us who are really into photography do have DSLR cameras, and are not giving those up. With the touted advancements in the iPhone cameras, though, we expect them to work as advertised, and hope to sometimes put down the DSLR and use the phone cameras in certain situations and still get good shots.
 
I get that, but any smartphone picture (no matter how fancy those lenses are) is always going to be hit or miss.
 

Well, no. Most high-end phones do get better with every iteration.
I’m certainly not expecting my iPhone photos to compete with my mirrorless camera, but I’d at least expect images as good as or better than the iPhone 11’s.

Most issues seem to be software related, and hopefully Apple can give us the option to reduce or turn off some of the processing of the images.
 
So, if one uses this app ‘Halide’, it should solve the issue? Any downsides to using third-party apps?

Halide doesn’t support video and some other photo features like night mode. It does support portrait mode, though. So you’ll need to keep using the default app for some things.

Great investigative work!
Dumb question, but does Halide still use Apple's computational photography (Smart HDR) and all of that on the back end, so it's really just another tool letting you control the camera, with no difference otherwise in image quality?

Sounds like a fantastic work-around option until Apple can fix this issue.

It supports a lot of the computational stuff including Deep Fusion, but I know it’s not 100% identical to whatever the built-in app does due to API limitations. I believe the way it handles RAW files differs quite a lot too. If you check their blog they do pretty in-depth breakdowns of new features.
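For what it's worth, the main knob third-party apps get here is the photo quality prioritization setting. Something like the sketch below (my own illustration, not Halide's code) is how an app opts into the heavier processing such as Deep Fusion, or dials it back with .speed:

```swift
import AVFoundation

// Sketch: how a third-party camera app opts into (or out of) Apple's
// heavier processing. .quality enables features like Deep Fusion where
// supported; .speed favours a faster, less processed capture.
// Assumes `output` is attached to a configured AVCaptureSession.
func makeSettings(for output: AVCapturePhotoOutput,
                  heavyProcessing: Bool) -> AVCapturePhotoSettings {
    // The output's ceiling must allow .quality before a capture can request it.
    output.maxPhotoQualityPrioritization = .quality

    let settings = AVCapturePhotoSettings()
    settings.photoQualityPrioritization = heavyProcessing ? .quality : .speed
    return settings
}
```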
 
This thread makes me a little nervous. I’ll be going on vacation in less than a couple of months and was thinking of getting the iPhone 13 Pro, since it should have the better camera. Some of the pics here are just awful with the watercolor effect, though. If this is a software issue, I’m not sure I can count on Apple having it fixed before I leave. Would I be better off getting the regular iPhone 13? Most of my vacation photos will probably be landscape/scenery with some portrait/people pics.
 
I'm having this same problem. I am a glass artist and use my iPhone to take pictures of my work. I have been having to use my old 11 Pro for pics because the results from this iPhone 13 Pro are complete crap: over-sharpening, white balance off, blacks faded. It is very strange. As I'm taking the photos, they look great in the preview, then the final saved photo looks like complete crap. This needs to be fixed...
 
80F9F1C3-F74A-47B0-B0AF-4EF9649FF1A2.jpeg
Hi there. While I agree that smartphones are not a perfect replacement for dedicated cameras, they are the cameras we always have with us, and they are not exactly cheap. Given the marketing (best camera in an iPhone ever…), I think we can expect the photos to be no worse than those from older iPhones. But harsh contrast is where the 13 Pro struggles compared to its predecessors: I get a lot of double contours around edges between dark and light areas. Here is a collage of a scene I shot with my old 11 Pro Max and the new 13 Pro to show this issue. It’s not a single incident but an example of a general flaw of this camera.
 
Given that someone else found the image looks MUCH better when taken in Halide, it seems likely that this isn't a hardware issue; it's something Apple can patch with a software update in the near future.
 