Seems photos are still just as dark, but no more oversaturation. iOS 15.3. Waiting for the 15.4 beta to be available on my iPhone.
> Great news. Can someone confirm this?

Well, keep in mind that the sensors are completely different between the two iPhones. The 12 Pro has a sensor that's 47% smaller than the 12 Pro Max and 13 Pro Max, so that's probably why you're getting a little better detail in your photos: the pixels are smaller and grouped closer together. Larger pixels are great for low light but not the best for fine detail, especially with distant subjects like leaves and branches on trees, but big pixels are awesome for portraits and anything fairly close.
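The pixel-size argument above can be sketched with a quick back-of-the-envelope calculation. The sensor dimensions and the 47% area figure below are illustrative assumptions taken from the post, not Apple's published specs; the point is just how pixel pitch scales with sensor area at a fixed megapixel count:

```python
import math

def pixel_pitch_um(sensor_width_mm, sensor_height_mm, megapixels):
    """Approximate pixel pitch in micrometres for a sensor of the given size."""
    area_um2 = (sensor_width_mm * 1000) * (sensor_height_mm * 1000)
    return math.sqrt(area_um2 / (megapixels * 1e6))

# Hypothetical 4:3 sensors, both 12 MP; the smaller one has 53% of the area
# (i.e. "47% smaller"), as claimed in the post above.
large = pixel_pitch_um(7.6, 5.7, 12)          # larger sensor (assumed dimensions)
small = large * math.sqrt(0.53)               # pitch scales with sqrt of area

print(f"large-sensor pitch ~ {large:.2f} um, small-sensor pitch ~ {small:.2f} um")
```

So a 47% smaller sensor at the same resolution means roughly 27% smaller pixel pitch, which is the trade-off being described: tighter pixels, less light per pixel.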
I still think the biggest issue is this horrible photo sharpening / HDR, but oversaturation was a factor too.
You can easily see all the post-processing kick in after taking a photo in the Instagram app. It's clearly visible.
Btw, I took a couple of similar photos with the 13 Pro and the 12 Pro (both on 15.1.1). The 13 had much better white balance and the photos were more, hmm, polished? But the 12 Pro captured a lot more detail; the photos were more natural and overall sharper in a good way. The 13 Pro tried to achieve that with post-processing, and it was a disaster.
And from what I could see, photos from the 13 Pro were somehow out of focus and had some kind of ghosting (?).
They were clearly worse than the 12 Pro in terms of detail/focus, and I'm talking here about the main 26mm sensor.
The rest of the lenses are fine. I think f/1.5 instead of f/1.6 is a factor.
I've had the iPhone 13 Pro Max for 2 weeks, as a secondary phone while the Galaxy S21 Ultra is my main phone.
I have been completely surprised by how bad the post processing is on the pictures it takes. I'm not comparing this to the Galaxy pictures, I am comparing it to the pictures I took when I had the iPhone 11. It looks so bad in comparison to what I was used to from the iPhone 11.
On that phone, the pictures had a natural look to them and there was nothing wrong with them. I know there's a great picture hidden in the iPhone 13's shots; it's just buried under all the contrast, noise reduction, sharpening, and oversaturated colors.
I was surprised to see this on the iPhone. Thankfully I wanted to use the iPhone more so for the video recording and with the Dolby Vision HDR, I don't have that kind of problem when I play the videos on the TV. Comparing the pics on the 13 to the 11, it's baffling to see how bad the iPhone 13 PM pics look.
I thought the HDR might be causing this, but the option to disable it is gone. I hope they bring back the option to turn off HDR in pictures if this is where the mess comes from.
> How about comparing to your S21 Ultra?

Well, I couldn't get past the way the iPhone 13 Pro Max pictures look, and my main focus was on the iPhone 11.
> Well take in mind that the sensors are completely different between the two iPhones. …

Please tell me you are joking.
> Please tell me you are joking.

I didn't say anything about the Sony. Not sure where you got that from.
So you're saying the 13 Pro Max has better detail than a Sony a7S III?
C'mon. Tell me more??
> Does anyone else use Halide? I can't figure out how to take macro shots with it. In the viewfinder, I can see the subject close up like I do in the stock camera app, but when I take the photo with Halide, it defaults back to the ultra-wide view as if I hadn't used the macro setting at all. I bought Halide because it allows more manual settings, but this issue is kind of a dealbreaker for me.

Yes… open the app, select the 0.5x camera, and press the AF button on the left; you'll see a flower icon. Tap that to begin.
Hi

Hello again, one more thing:
How do you guys deal with flares in photos at night? They're not only green/white dots from light sources but whole mirrored shapes.
Oh my…
Flares and ghosting/mirroring make night shots almost unusable.
I had an iPhone from the pre-Face ID era and night photos had nothing like this. I checked, and the iPhone 12 Pro has the same problem… I'm almost 100% sure it's a hardware issue. I know a lot of smartphones have similar issues, but I've seen photos from a Samsung Galaxy and it wasn't as big a problem there.
I've been a photographer for a long time and I've shot mostly Nikon, and now Sony, with various lenses that have very good optical coatings on every element inside the lens to get rid of ghosting and flaring when photographs are taken with the sun or other bright lights in the shot. I think the iPhone's optical elements are multi-coated to some extent, but nothing like the coatings you get on the pro lenses from Nikon or Canon, etc.
If you haven't done so yet, report your experience via Apple Feedback. The more people who do, the better the chance Apple will correct the situation.