
Bobout

macrumors 65816
Apr 6, 2017
1,415
2,481
Apple Valley
Wow, I really like the detailing on that Ferris wheel. That’s cool.

I’ve been a photographer for a long time and I know a lot about sensors, megapixels, etc. I think some of the issues you all are describing can be attributed to the fact that the new, larger sensor in the iPhone 13 Pro Max has 1.9 µm pixels. Pixels that large, spread over a 12-megapixel sensor, pick up a lot of light but are sometimes too large to really resolve a lot of detail. If this were a 24-megapixel sensor with 1.7 µm pixels, things would look a lot sharper and cleaner. On the professional level, I had a Nikon D3, a 12-megapixel camera whose pixels were quite large for a full-frame sensor: nearly 8 µm across, which is big. Great for low light, but not so good for distant detail and high resolution. Using a 50 mm lens, I remember taking a photo of a tree where the leaves looked blotchy instead of like leaves, simply because the camera didn’t have the resolution, and I think that’s what we’re seeing with the iPhone’s wide camera: the largest iPhone sensor ever, with big pixels.
Sounds like you know what you’re talking about. Yes, I like the Ferris wheel pic too. I’m lucky to be close to Disney and Knott’s, so most of my pics are of that and family. We don’t have beautiful trees that change those amazing colors.
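The pixel-size arithmetic in the quoted explanation is easy to check: pitch is roughly the square root of sensor area divided by pixel count. A quick sketch in Python (the Nikon D3 figures, a 36 × 23.9 mm full-frame sensor at 12.1 MP, are from its published specifications):

```python
import math

def pixel_pitch_um(width_mm, height_mm, megapixels):
    """Approximate pixel pitch in micrometers for a sensor of the
    given dimensions divided into `megapixels` million square pixels."""
    area_um2 = (width_mm * 1000.0) * (height_mm * 1000.0)
    return math.sqrt(area_um2 / (megapixels * 1e6))

# Nikon D3: 36 x 23.9 mm full-frame sensor, 12.1 MP
print(round(pixel_pitch_um(36.0, 23.9, 12.1), 1))  # -> 8.4, the "nearly 8 um" above

# Doubling the pixel count on the same area shrinks the pitch by sqrt(2)
print(round(pixel_pitch_um(36.0, 23.9, 24.2), 1))  # -> 6.0
```

The same arithmetic explains the trade-off being discussed: for a fixed sensor size, more megapixels means smaller pixels, which buys resolution at the cost of per-pixel light gathering.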
 

spksys

macrumors member
Jan 12, 2021
33
18
Wow, I really like the detailing on that Ferris wheel. That’s cool. […] I think that’s what we’re seeing with the iPhone wide camera. That’s the largest iPhone sensor ever and big pixels.
Thanks a lot for that explanation; it makes sense to someone like me who isn’t camera savvy. I usually just go by how the photo looks compared to what my eyes see. I received the 13 Pro Max about two weeks ago after a long wait, coming from the 11 Pro Max, which I still have and use for comparison. Compared to my 11 PM, the 13 PM struggles with pictures that contain text, which I put down to its aggressive de-noising and over-sharpening, and of course the loss of distant detail, which seemed odd to me until now. It also chooses the wide camera instead of the telephoto more often than the 11 PM, and both look like paintings when using digital zoom. Night mode is better on the 13 PM, miles sharper, but I am still getting used to it looking more realistic rather than bright as on the 11 PM. Highlights are also often blown on the 13 PM, and I found this is down to its display, since transferring the photo to my 11 PM looks fine.
 

Fred Zed

macrumors 603
Aug 15, 2019
5,886
6,561
Upstate NY . Was FL.

Attachments

  • 8EAA4BB5-7856-42B2-AE74-BAE63EE34FF4.jpeg (189.2 KB)

Pandyone

macrumors regular
Sep 30, 2021
248
330
Wow, I really like the detailing on that Ferris wheel. That’s cool. […] I think that’s what we’re seeing with the iPhone wide camera. That’s the largest iPhone sensor ever and big pixels.

I understand that you know a lot about photography, and on a hobbyist level so do I, but it’s not only the hardware that is at fault.
This thread seems to branch out into other discussions, but the issues are still there: over-processed photos, photos getting cropped from the main lens instead of using the telephoto, and sometimes blurry images.

Obviously not all camera issues can be fixed with software updates, lens flare for example. But an option to, say, turn off Smart HDR could probably improve the experience for a lot of users.
 

ToddH

macrumors 68030
Jul 5, 2010
2,902
5,895
Central Tx
I understand that you know a lot about photography, and on a hobbyist level so do I, and it’s not only the hardware that is at fault. […] But options to, for example, turn off Smart HDR could probably improve the experience for a lot of users.
Yeah, I can understand that. I use an app called Moment; it doesn’t use Smart HDR but will shoot ProRAW or JPEG, and it seems to produce very likable images. Halide is another good app that incorporates little or no Smart HDR. I’m sure I can provide you with some samples.
 

ght56

macrumors 6502a
Aug 31, 2020
839
815
Yeah I can understand that. I do use an app called Moment and it doesn’t use smart HDR but will use ProRAW or JPEG. […]
Does that get rid of or at least reduce the blurriness? I like this phone but the camera kind of sucks.
 

ToddH

macrumors 68030
Jul 5, 2010
2,902
5,895
Central Tx
Glad I found this. Still trying to figure out why my 11 Pro takes better photos than my 13 Pro.
I don’t see how the 11 Pro takes better photos than the 13 Pro; I guess a JPEG-to-JPEG comparison would tell you. I think the 11 Pro has 1.4 µm pixels and a much smaller sensor. The photo here shows the 12 Pro Max sensor compared to the 11 Pro Max sensor; the 11 Pro uses a smaller sensor than the 11 Pro Max. Anyway, I am sure Apple will fix the issue you guys are talking about with a software update. They will fix it and probably not tell us; the photos will just suddenly get better. That’s my guess.

13527B69-4634-4304-B184-953643F9A496.png
 

ght56

macrumors 6502a
Aug 31, 2020
839
815
According to our own Ansel Adams, who posts in this thread, it’s because:
“Most iPhone owners here aren’t trained photographers, which results in badly executed shots. I’m beginning to think you guys are purposely looking for issues now.”
Well, I am definitely not a subject matter expert in photography.

I don’t see how the 11 pro takes better photos than the 13 pro […] Anyway, I am sure Apple will fix the issue you guys are talking about with a software update.

I am not doubting your technical expertise, but the photos from my 13 Pro, especially up close, kind of suck. They look like they’re straight from 2010, whereas they look great from my 11 Pro. At 2x zoom, and especially at closer distances, the 11 Pro is sharp whereas the 13 Pro is blurry. Text in a picture also looks much better on the 11 Pro.
 

ToddH

macrumors 68030
Jul 5, 2010
2,902
5,895
Central Tx
I am not doubting your technical expertise, but the photos with my 13 Pro, especially up close, kind of suck. […] At 2x Zoom and especially at closer distances, the 11 Pro is sharp whereas the 13 Pro is blurry.
Your closeup shots kind of suck? I hope not. Tell me, are you all viewing your photos beyond 100% on screen, like 300% or even 400%? If so, you will see all kinds of flaws. Someone should post photos to a website in full resolution for the rest of us to look at. Is it a Deep Fusion issue? Apple says Deep Fusion takes four photos before you press the shutter, then another five to blend with the previous four: nine shots altogether. Maybe they aren’t lining up properly all the time. Something to think on…
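Apple hasn’t published how Deep Fusion’s merge actually works (the real pipeline is machine-learned and detail-weighted), but the general idea of blending a burst into one frame, and why misaligned frames would soften it, can be illustrated with a deliberately naive per-pixel average in plain Python:

```python
def merge_burst(frames):
    """Average a burst of equally sized grayscale frames (nested lists
    of pixel rows). A crude stand-in for a multi-frame merge: if the
    frames don't line up, the average smears, which is one way such a
    pipeline can produce soft results."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(w)]
            for y in range(h)]

# Nine identical 2x2 frames (four pre-shutter plus five more, per the
# description above) average back to the original values:
burst = [[[10, 20], [30, 40]] for _ in range(9)]
print(merge_burst(burst))  # -> [[10.0, 20.0], [30.0, 40.0]]
```

With perfectly registered frames the merge is lossless and noise averages out; shift one frame by a pixel and edges blur, which matches the “maybe they aren’t lining up” hypothesis.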
 

Fred Zed

macrumors 603
Aug 15, 2019
5,886
6,561
Upstate NY . Was FL.
I don’t see how the 11 pro takes better photos than the 13 pro […]

View attachment 1922757
Is this your dog?

2CBA67ED-3B59-44B5-A7F7-CB76B449E2E8.jpeg
 

MacLappy

macrumors 6502a
Jul 28, 2011
530
394
Singapore
What the hell bug is that? Yikes!


I know, right! Scary-looking bugger, like some kind of kaiju in a monster flick. Could be an assassin bug or a sweet potato bug or something else altogether. Lol. Really no idea what it is.


Yeah, I understand; not everybody has an eye for it, and that’s OK. I think that’s an assassin bug. I think it’s really cool having a macro mode on the iPhone to be able to take such cool shots up close. I bet you can go into the default editing app and raise the shadows a little and see more of that bug’s belly. If you’re interested in learning a little more about iPhone photography, there’s some free material on the Internet, like iPhonephotographyschool.com and some other spots that would give you tips, and I’d be willing to help you myself if you’re interested.


Thank you for your suggestion!

I took the picture without turning on the lights in my room; I was on a phone call and basically not thinking.

It was about 1 in the afternoon on a bright sunny day, so a bad angle and backlit. I kind of expected that the subject would look really dark.

I never knew that so much information was retained that could be brought out by playing with the post-editing options.

Now I can see how its legs are connected to the body, the full undercarriage of the creature, and even the full length of that mouth-like part. Still scary, but I think unique enough to share on the “Macro” thread.

Thanks again, ToddH.
 

ian6969

macrumors member
Dec 31, 2019
78
56
I’ve just been doing some back-to-back shots to test different settings. Turning off Lens Correction (Settings > Camera) definitely results in sharper shots on my 13 Pro. This setting apparently only applies to the ultra-wide and front-facing lenses. The feature is supposed to ”make images look more natural” according to Apple, but I couldn’t see any obvious difference apart from everything being slightly blurred!

I also found that the ”Warm” photographic style seems more accurate in many of my photos, with the standard setting looking slightly too green.
 

babyexercise

macrumors 65816
Oct 1, 2021
1,247
684
The 13’s camera AI still has serious bugs when it deals with multiple light sources. My iPhone 11 is much better than my family member’s 13 Pro for this kind of photo; some of these 13 Pro shots just look like nonsense.
 

PeterJP

macrumors 65816
Feb 2, 2012
1,136
896
Leuven, Belgium
30” Apple ProRAW & Lightroom mobile

Lots of light pollution though. ISO 640

A photo of the stars is a good test for optical quality.
Nice photo!

A photo of stars is a good test in some ways. There are Leica lenses that are brilliant for people photography but that produce horrible coma in astrophotography.
 

H3LL5P4WN

macrumors 68040
Jun 19, 2010
3,459
4,020
Pittsburgh PA
Wow, I really like the detailing on that Ferris wheel. That’s cool. […] I think that’s what we’re seeing with the iPhone wide camera. That’s the largest iPhone sensor ever and big pixels.
I was completely unaware that there could ever be a downside to larger pixels. I remember that the HTC One (M7) was praised for its 2 µm pixel size and optical image stabilization (but it was only 4 MP).
 

babyexercise

macrumors 65816
Oct 1, 2021
1,247
684
The 13 Pro overexposes a lot whenever there is strong sunshine or a strong light in the dark. I always take photos with the 11 too, and it almost never overexposes like the 13 Pro does.
 

Kutfe

macrumors newbie
Oct 17, 2017
18
7
The 13 Pro overexposes a lot whenever there is strong sunshine or a strong light in the dark. I always take photos with the 11 too, and it almost never overexposes like the 13 Pro does.
I usually set exposure correction to −0.3 to −0.7; the iPhone 13 PM pulls information from shadows better than from highlights.
iOS 15.1.1
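For context, EV compensation is logarithmic: each −1 EV halves the recorded light, so small negative values like those above only trim exposure modestly. A quick Python check:

```python
def exposure_factor(ev):
    """Relative amount of light recorded for a given EV compensation;
    each -1 EV halves the exposure (factor = 2**ev)."""
    return 2.0 ** ev

for ev in (-0.3, -0.7, -1.0):
    print(f"{ev:+.1f} EV -> {exposure_factor(ev):.2f}x light")
# -0.3 EV -> 0.81x, -0.7 EV -> 0.62x, -1.0 EV -> 0.50x
```

So −0.3 to −0.7 EV cuts exposure by roughly 20–40%, enough to protect highlights while leaving shadows recoverable in editing.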
 