hey, if any of you have an older iPhone that you can compare against the orange 17 Pro/Max, please try to recreate this macro test in a dark room with flash:
I'm very interested to see other results. Something is up with this camera.
This shot doesn’t pass the sniff test. Why is the shadow off to the right on the 14, and centered on the 17? The shadow placement on both phones is exactly the opposite of what would occur in a photo with the flash at close range.

Edit: I suppose if the phones were held in landscape the shadows could appear like that, so I stand corrected. Even so, I just did a quick shot and my 17 Pro looks like the “14 Pro” in the example.
 
Wow, so I am not imagining that!

I’ve actually seen many examples of 14 Pro performance too, and back then could only say “it is a good camera.” But now I was really underwhelmed by the 17 Pro, which by definition should cater to “Pros.” The noise-reduction trend was already gaining ground in the 16 Pro, but at least 16 Pro images look sharp and “punchy,” while the 17 Pro’s processing has changed to something duller and the images are somewhat “blurry.”

As for your example: YES, that is what I see! The 14 Pro is razor sharp. And I don’t understand why bloggers keep silent about it.

I don’t think it is some sort of lens defect; worst case, it is the lens design, and similar across all models. If it is in the processing, then Apple must know about it; maybe it is just a bug or something.
I think what everyone is noticing is that Apple might be toning down their over-sharpening. Traditional cameras have a very slight softness to them. Since the beginning, Apple has applied heavy sharpening to compensate for noise reduction. Now that the lenses are getting better, they can tone those algorithms down some.
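For readers curious what “sharpening to compensate for noise reduction” means mechanically: it is usually some variant of an unsharp mask, where a blurred copy is subtracted from the image and the difference is added back, scaled by an amount. Here is a minimal pure-Python sketch on a 1-D row of pixel values (the function names and amounts are illustrative, not Apple's actual pipeline):

```python
def box_blur(signal, radius=1):
    """Simple box blur: average each sample with its neighbors."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, amount=0.8, radius=1):
    """Sharpen by adding back the detail lost to blurring.
    amount=0 returns the input unchanged; larger values look 'crunchier'."""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# An edge: sharpening exaggerates the step, creating overshoot (halos)
edge = [10, 10, 10, 50, 50, 50]
print(unsharp_mask(edge, amount=1.0))
```

Turning the amount down is exactly the “less crunchy” look: edges keep values closer to their true levels instead of gaining overshoot halos on either side of the step.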
 
I’m a bit confused by this thread. I do prefer “softer” photos produced by 17 Pro’s camera rather than over-sharpened pics from previous models (especially 13 and 14 series which were the worst when it comes to post-processing, in my opinion).
 
I switched to 24MP HEIC and it for sure got a lot better. Shouldn’t have to do that though, IMO. Hopefully an update addresses the issue.
Interesting. I wonder if that’s a sign there is something wrong in the ProRaw pipeline rather than a hardware defect. I generally shoot in HEIC 24 unless there is a situation that specifically calls for raw, like Milky Way shots. That’s probably why I wasn’t seeing issues in the 17 Pro shots I posted on the last page. They were all shot at 24 MP.

If it is indeed a processing issue for raw shots, then it should be fixable in a software update down the line.
 
A little bit of softness is okay.

Destroying all the fine detail/information when zooming to maximum is not okay and, to me, is not justified.

In my case, it's not a question of a lack of sharpness; it's a question of not applying a sufficient amount of clarity and definition.

The pictures are now unusable. Also, I shoot in RAW on both my 16PM and 17PM—an uncompressed format with the same hardware. The results should be identical, with minor variations, not this abysmal difference.
 
Can you re-sharpen the image in Lightroom? I'm not sure we should judge image quality by looking at an image at 600-700% zoom instead of 100%. But I suppose at that level, individual pixels should be visible rather than object blur. I think a digital image starts to break down past 100% but still shows some detail at 300%. However, if your raw image is still blurry at 100% and can't be re-sharpened in Lightroom, then I agree there is an issue, especially if the camera can't be manually focused with precision.

It doesn't look like this person is having ProRaw issues; we should invite him to the conversation:

https://www.instagram.com/n.muif?igsh=b21xNGVtbGVzZDVp

First entry, shot on iPhone 17 Pro Max in ProRaw.
 
The slight softness I think people are describing here is a decrease in the intense sharpening Apple was applying, as noted above. The images feel more film-like to me, like my Fuji X100 does. Less crunchy; it may look worse if you’re pixel peeping, but the overall image quality is quite pleasant.
 
I've just upgraded from the base 11 to the 17.

Some of my photos were blurred. I went through my settings and unticked the "AF/AE Lock" option when pressing the camera control button. (I'm still getting used to this button - it seems like a bad implementation but maybe I'll get used to it.)

After stopping my camera from accidentally locking the focus on the wrong thing, I appear to be getting in-focus pictures every time. However, this is based on limited testing.
 
I'm in contact with a senior assistant, and my situation has already been escalated to the engineers.
I will be contacted again this week.
That's good news. I did some more tests over the weekend with my 17P and I can confirm what you guys have been showing here. For the first time in years I had to trash some photos due to the lack of detail in those 48MP pictures; they were almost unusable even with heavy editing in Lightroom. It only happened in some situations, though; other shots are detailed enough for me (even if softer than on my 14P). The good news is that it seems this can be fixed via SW updates. Fingers crossed.
 
The main camera on the 14 Pro was the first with the quad Bayer sensor. But the processing wasn’t there yet to produce 24 MP images (that came with the 15 Pro the following year). So the 14 Pro always produced 12 MP HEIFs unless you shot Raw.
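As background on the quad Bayer pipeline mentioned above: the 48 MP sensor reaches a 12 MP output by 2x2 pixel binning, averaging each block of four photosites into one pixel. A toy sketch of the geometry in plain Python (real binning happens on-sensor and is color-filter-aware; this just shows the resolution arithmetic):

```python
def bin_2x2(pixels):
    """Average each 2x2 block of a grid into one pixel.
    A 2H x 2W input becomes H x W -- e.g. 8064x6048 (48 MP) -> 4032x3024 (12 MP)."""
    h, w = len(pixels), len(pixels[0])
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    return [
        [(pixels[r][c] + pixels[r][c+1] + pixels[r+1][c] + pixels[r+1][c+1]) / 4
         for c in range(0, w, 2)]
        for r in range(0, h, 2)
    ]

grid = [[1, 1, 3, 3],
        [1, 1, 3, 3],
        [5, 5, 7, 7],
        [5, 5, 7, 7]]
print(bin_2x2(grid))  # [[1.0, 3.0], [5.0, 7.0]]
```

Each output pixel pools the light from four photosites, which is why the binned 12 MP shots hold up better in low light than a straight 48 MP readout.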
 
I’m thinking that, after all, the Air might have the lightest and best camera of the whole 17 series.

What are you even talking about? The Air's sensor is the same as the base 17's, and its lens is even less wide than the base 17's because of the thickness limitation.
 
Good information, but not totally accurate: the 14 Pro is able to shoot and process 48MP photos with the main sensor in both Raw and HEIC. It didn’t ship with that feature, but it was added a year later with the next version of iOS.
 
The 13 Pro Max was OK, the 14 Pro Max was better; I always bought the gold. The shiny gold sides were beautiful to me... Anyway, as for astrophotography, the 13 Pro series was terrible; the 14 Pro was much better, reaching ISO 12,500. A bit noisy, but a nice improvement. Then the 15 Pro was better still. I'm guessing the 17 Pro is going to look exactly the same as the 16 and 15 Pro. The 30-second night mode is somewhat useless, especially when the Google Pixel and my Galaxy S25 Ultra can do so much better. I realize I'm in the minority when it comes to astronomy with the iPhone, but Apple is so far behind regarding total manual control of the cameras. I wish Apple would put more "Pro" into the iPhone, especially with the cameras. For example, this image of the Milky Way was taken with my Samsung S25 Ultra in the Expert RAW app as a 12-minute exposure at ISO 400. The 12-minute exposure consists of 30.5 exposures of 23.5" each, combined.

The iPhone can only provide three 10-second exposures to finalize a 30" shot by aligning the three images. Again, they are behind. I still have my 16 Pro Max; I probably won't upgrade to the 17 because nothing significant has changed: the same wide and ultra-wide cameras, and a decent 4x camera. I'd be buying the same cameras in a different body. I've had an iPhone since 2009; I hope it gets better next year. Probably the same specs in different colors with a slightly faster chipset.

20250821_223306.jpg
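For what it's worth, the exposure figures in the post above are self-consistent if read as stacked sub-exposures: 30.5 frames of 23.5 seconds is about 12 minutes of total integration, versus 3 x 10 s = 30 s on the iPhone. Under the standard idealized model, averaging N aligned frames improves signal-to-noise by roughly sqrt(N). A quick back-of-envelope check (figures from the post; the helper functions are illustrative):

```python
import math

def total_integration(frames, seconds_each):
    """Total light-gathering time across all stacked sub-exposures."""
    return frames * seconds_each

def snr_gain(frames):
    """Idealized SNR improvement from averaging N aligned frames (~sqrt N)."""
    return math.sqrt(frames)

galaxy = total_integration(30.5, 23.5)  # Samsung Expert RAW session from the post
iphone = total_integration(3, 10)       # iPhone's three 10-second night-mode frames

print(galaxy / 60)                   # ~11.9 minutes, matching the quoted 12 minutes
print(galaxy / iphone)               # ~24x more total light gathered
print(snr_gain(30.5) / snr_gain(3))  # ~3.2x relative SNR edge from stacking alone
```

The real-world difference also depends on per-frame exposure, sensor size, and processing, but the raw integration gap alone explains much of the Milky Way result.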
 
Somebody showed on YouTube that you need to mount the iPhone on a tripod to trigger the 30-second exposure; handheld, you only get 10 seconds. Not sure of the details beyond that.
 
Seems like a fair amount of user error in this video. The first thing he did was try to use the telephoto to take a picture of a close-up object. It's well known that when you try to photograph an object inside the minimum focus distance of your chosen lens, the iPhone will drop back to the next-widest lens and then crop and upscale the image to match the equivalent field of view. So when he took a picture of his shoes (which was definitely too close for the 4x to focus on), the phone dropped back to the 1x main camera and gave him an upscaled image. That image was 12 MP by necessity, and of course it didn't look great. BTW, had he tried the same thing with the 5x telephoto on the 15 Pro Max, the exact same thing would have happened.
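The crop-and-upscale fallback described above is easy to quantify: matching the 4x field of view from the 1x camera means keeping only the central quarter of the frame in each dimension, so pixel count drops by the square of the zoom ratio. Rough arithmetic (the helper is illustrative, not Apple's actual pipeline):

```python
def cropped_megapixels(sensor_mp, zoom_ratio):
    """Megapixels left after center-cropping a sensor to simulate `zoom_ratio`.
    Both image dimensions shrink by the ratio, so area shrinks by its square."""
    return sensor_mp / zoom_ratio ** 2

# 4x framing delivered by the 48 MP 1x main camera (telephoto too close to focus):
real_pixels = cropped_megapixels(48, 4)
print(real_pixels)  # 3.0 MP of real data, upscaled to fill the 12 MP output
```

Only about 3 MP of genuine sensor data survives that fallback, which is why those shoe shots look so mushy compared to true telephoto captures.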

My experience with the telephoto on the 17 Pro is that it is a step up from the 5x on the 16 Pro, but not a dramatic one. The improvements are much more subtle than what you would expect from Apple's marketing (and I suspect the marketing is driving some of the disappointment we are seeing). My observations (in no particular order):
  • The 4x lens on the 17 Pro zoomed in to 5x captures more detail than the optical 5x lens on the 16 Pro. I attribute this to the higher res sensor on the 17 capturing more information. The 17 Pro also still produces a 24 MP image at 5x, which makes me think it's doing the same processing trickery that allows the 1x lens to still get 24 MP images at 28mm and 35mm equivalents. That's a nice bonus since the 16 Pro still captures a 12 MP image at 5x. Here's a comparison shot of the 17 Pro 4x zoomed to 120mm equivalent (the same field of view on the 16 Pro 5x). At first glance they look similar. But then as you zoom in, you see the 17 Pro was able to pick up things like the texture of the paper on the Da Vinci biography. It was able to resolve that the word Atomic is made of small dots in the next book to the right. And smaller letters and fine details are just crisper throughout the image. Much of that is probably due to the extra resolution of the 17 Pro sensor. There is also just less noise in the 17 Pro image. I couldn't post the full 17 Pro image here because it's 24 MP but here's the comparison of a sampled area of the image.
View attachment 2558884 View attachment 2558886
  • If you zoom both phones in to 8x in camera, the 17 Pro really pulls away. Both phones produce a 12 MP image at this point. But the 17 Pro again has more detail and less noise, likely due to the extra resolution and the physically larger sensor. The 16 Pro has to crop the smaller 12 MP sensor and then upscale and you can really see how much it has to rely on over-sharpening at this point. The 17 Pro goes a little too far in noise reduction, leading to a slightly soft image. But I still greatly prefer it to the 16 Pro image. The 17 Pro is simply working with more information. Here's a comparison of the 8x shot from both cameras focusing on the same set of books.
View attachment 2558885 View attachment 2558887
  • The 4x can give some decent optical bokeh if you focus on a subject right near the minimum focus distance of the lens. It's not DSLR quality by any means. But you can see some benefit from the larger sensor.
View attachment 2558892
  • I am finding the 4x on the 17 Pro to be soft some of the time. But it's not consistently so. So, I am not sure if it is a case of the lens not having enough resolving power for the higher res sensor or if it is an issue with the sensor stabilization or the image processing pipeline that could be corrected in software later on.
  • In general, I'm finding the telephoto on the 17 Pro to be an upgrade. It captures more detail and the 4x/8x option is more versatile than the 5x on the 16 Pro. I think people heard 48 MP in Apple's marketing and thought they were getting a DSLR. It's still a phone camera. But it can produce very nice images if you work with it.
I don't have a 14 Pro to compare the 1x cameras. In general, I have found my 1x shots to be about the same as the 16 Pro, which is to say very good. The 1x camera continues to be noticeably superior to the other cameras, though the gap has shrunk a little this year.
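The 8x observation above follows from the same crop arithmetic discussed earlier in the thread: the 17 Pro crops its 48 MP telephoto sensor by 2x (from 4x to 8x), while the 16 Pro crops its 12 MP sensor by 1.6x (from 5x to 8x) and then upscales. A sketch using the sensor figures from this thread (the helper function is illustrative):

```python
def real_mp_at_zoom(sensor_mp, native_zoom, target_zoom):
    """Real (non-upscaled) megapixels when digitally zooming a lens whose
    optical magnification is `native_zoom` out to `target_zoom`.
    The crop factor applies to both dimensions, so area falls by its square."""
    crop = target_zoom / native_zoom
    return sensor_mp / crop ** 2

print(real_mp_at_zoom(48, 4, 8))  # 17 Pro: 12.0 MP of real data behind the 8x shot
print(real_mp_at_zoom(12, 5, 8))  # 16 Pro: ~4.7 MP, upscaled to the 12 MP output
```

That roughly 2.5x gap in real pixel data is consistent with the detail and noise differences visible in the 8x comparison shots above.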

Thank you for providing clarity on all this possible misinformation. You're the man 👍
 

What you do sounds neat, but it always makes me wonder why people like you complain about the iPhone when you have perfectly competent devices at your disposal that can fulfill all your on-the-go camera needs, i.e. the Pixel and whatever Samsung. They're good, no argument here. My advice: stick with them, and don't keep waiting for the iPhone to do something it may never do for you. I mean, is Android really THAT bad that you care?
 
True: you'll get 10 seconds maximum handheld. Mount your iPhone on a tripod and the 30" option shows up, because the accelerometer in the phone senses complete stillness.
 
I'm complaining because I've always had an iPhone and I'm deeply committed to the Apple ecosystem (iPad Pro, MacBook Pro, Apple HomeKit, etc.), and using an Android phone I can't take advantage of AirDrop or anything Apple-related. Photography is my passion, and because Apple won't change the way the camera can be used professionally and give me and other photographers more control, I chose the Samsung phone only for its camera system, which better suits my needs; honestly, Android is a nice change. So, until Apple updates the camera software for us photographers and app developers, I'll be using my Samsung, which, by the way, I do love. I'll switch back to the iPhone when they upgrade the camera software. Sure, I could carry my Sony A7RV and multiple lenses around with me, but I don't want to because of the bulk and possible theft. Does this help you understand my situation?
 