I'm kinda sad to say it, but I think my "old" 12PM had better camera performance. I wonder if they'll ever fix these issues.
All photos seem watercolored, and they lack detail in difficult scenarios.
In order to see this watercolored pixelation that you guys are talking about, you have to zoom in way past 100%, to where the image starts to break down. There can't be this many 13 Pro Max iPhones with bad image quality. I just purchased the new alpine green 13 Pro Max and it has the best camera I have seen by far. When I get a brand new iPhone I do an optics test by photographing the night sky to see how pinpoint the stars look and how well the camera focuses on them. Edge-to-edge clarity is what I have, so I am highly impressed. So I really don't understand what you guys are saying. I've been a photographer for many years and I pixel peep like most of you do. So far I haven't been displeased at all with this iPhone. Posting samples on this forum doesn't really do them justice because the image quality and size are much smaller. If photos can be emailed or sent to us individually via our mailbox on the forum, post an example and let me see.

If I shoot a JPEG photo with the 1x or 3x of some trees in the distance and zoom in, I don't see a watercolor effect at all. Instead I see down to the square pixel. This is why I'm wondering if you guys are zooming in way past 100% when looking at the image. Keep in mind that you're dealing with a 12-megapixel camera with larger pixels than normal, which are great for low light and high ISO and not so great for high resolution of small objects in the distance. I can definitely see a small difference when using the default camera versus a third-party app like the Moment app or Halide, or any other. So if you guys want to put some samples in a Dropbox account and send everybody a link so we can observe, then do so.
 
I think the majority of users bringing up the over-sharpening watercolour effect are seeing the default app's aggressive use of the 1x camera + digital zoom instead of the native 3x camera when shooting at 3x. Whatever algorithm Apple has developed for the ISP, it heavily favours the former for most random snapshots (which I bet is most iPhone 13 Pro shooting). I notice it more when I don't have control over the lighting and don't have a moment to compose a shot properly. Most of my "Snapchat worthy" pics (random snaps) using what I expect to be the 3x lens almost always end up using the 1x lens + digital crop (when I view the metadata). In the default app, the phone never indicates which lens it is using (between 1x and 3x) at any given moment, so what I, the user, perceive to be the selected lens isn't always the one the phone chooses to use. That confusion and lack of transparency from Apple is throwing everyone into a debate. To suggest that users simply purchase another app is sidestepping the issue and counterproductive to the discussion of figuring out why the algo is so aggressive.

It's quite clear from the sample photos you've posted that you're one to control your variables and bring the best out of your camera equipment. So your experience with the 13P is going to be much different from the typical owner's, hence why there's such a disparity between your experience and everyone else's.

I personally really enjoy using this camera for shots I'm thoughtful about. However, the algo isn't consistent enough to be all that reliable at anything 3x. And at 12MP, anything between 1x and 3x, or greater than 3x, is a non-starter. Hope that adds some clarity.
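The metadata check mentioned above can be scripted. Here's a rough sketch (the `lens_used` helper and its thresholds are my own illustration, not anything from Apple): the EXIF 35mm-equivalent focal length records what the phone actually shot with, so a "3x" shot that silently fell back to the 1x module shows up as something well under a 77mm equivalent.

```python
# Sketch: classify which iPhone 13 Pro module likely took a shot, from its
# EXIF 35mm-equivalent focal length. Thresholds are illustrative guesses:
# the 1x wide is roughly a 26mm equivalent, the 3x tele roughly 77mm.
def lens_used(focal_35mm_equiv: int) -> str:
    if focal_35mm_equiv >= 77:
        return "3x telephoto"
    if focal_35mm_equiv <= 26:
        return "1x wide"
    return "1x wide + digital crop"

# With Pillow installed, the tag can be read from a saved photo roughly
# like this (0x8769 is the Exif IFD pointer, 41989 is
# FocalLengthIn35mmFilm) -- untested illustration:
#   from PIL import Image
#   exif = Image.open("IMG_1234.jpeg").getexif().get_ifd(0x8769)
#   print(lens_used(exif.get(41989, 0)))
```

A "3x" snap that reports, say, a 40mm equivalent was cropped from the wide camera rather than taken with the telephoto.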
 
Cameras are great, don't get me wrong, but coming from the 12PM all the photos, except in normal conditions, seem overly processed. I just updated to 15.5; I'll do some tests today.
 
While some of the complaints are certainly from the camera switching to digital zoom instead of using the 3x lens, I've been really careful to ensure I'm shooting with the 77mm-equivalent lens. The problem I have with the image processing on the 13 Pro is that Apple has it set up to show minimal noise in the photos. Meaning it applies noise reduction to get rid of most of the noise, which causes the watercolor painting effect. Personally, I would prefer to have the option to turn this off when shooting ProRaw, as the NR cannot be undone in post with ProRaw, unlike sharpening. I'd rather have some noise than a watercolor effect.
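The point that baked-in noise reduction can't be undone can be seen with a toy example (my own illustration, nothing to do with Apple's actual pipeline): averaging neighbouring pixels, the crudest form of NR, maps different inputs to the same output, so no amount of post-processing can tell which original you had. Sharpening, by contrast, is a boost that an editor can dial back.

```python
def box_nr(pixels):
    """2-pixel moving average: the crudest possible noise reduction."""
    return [(a + b) / 2 for a, b in zip(pixels, pixels[1:])]

detail = [0.0, 2.0, 0.0, 2.0]   # scanline with fine detail
flat   = [1.0, 1.0, 1.0, 1.0]   # scanline with none

# Both inputs collapse to the same output, so the operation has no
# inverse: once NR is applied, the fine detail is unrecoverable.
print(box_nr(detail))  # [1.0, 1.0, 1.0]
print(box_nr(flat))    # [1.0, 1.0, 1.0]
```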
 
We're talking about the oil painting AI editing that takes place after the shot... often very visible whether you zoom into the pic or not. Whether you post it in the original size/quality on a drive or right here, it will still have that painted look. You can ignore the examples people are showing you, but it is happening, not some conspiracy... I take a photo; I see a flash of it looking normal for a second, before it is painted over. There's ridiculously no option to turn this auto editing off even though it distorts people's faces. As the above poster said, I would much rather have a grainy photo (that I can fix myself in PS or similar, if I even care) over an AI edit that can't render faces properly. I've been to multiple concerts w/ this phone and have entirely stopped taking photos because there's just no point when they're automatically ruined, and Halide can't zoom enough. I'm not expecting $5000 camera type photos, just ones that aren't destroyed by software after I take them.
 
I have seen some improvements in the iPhone 13 Pro Max's pictures after a lot of updates, but they are still minor. When I pick up an iPhone, I'm choosing the iPhone 11, as the processing is much more natural and the images look great in comparison to the 13 Pro Max.

For pictures I mainly use the GS21 Ultra and the iPhone 11. For videos, hands down I choose the 13 Pro Max, especially with the Dolby Vision HDR recording.
 
What you are seeing after the photo is taken, that quick flash or shake of the photo, is Deep Fusion taking place. Before you press the shutter the iPhone automatically takes four shots, then completes the other five when you press the shutter, and stacks all of those images together into one file; that's why you see it shake like you do. Whenever I shoot ProRAW, I see it as well, but it takes longer because it's a bigger file. I've mentioned this time and time again: the only way you can possibly get rid of that is to use a third-party app like Moment or Halide. It doesn't take any longer to open that app than it does the regular Camera app.
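The stacking idea described above is standard computational photography, whatever the exact frame counts Apple uses. A toy sketch (my own illustration, with invented numbers): averaging N noisy exposures of the same scene cuts the noise by roughly a factor of the square root of N, which is the payoff for all that buffering and the brief "shake" while frames are merged.

```python
import random

random.seed(0)

TRUE_LEVEL = 100.0   # the scene's real brightness at one pixel
NOISE = 10.0         # per-frame sensor noise (standard deviation)
FRAMES = 9           # number of exposures merged into one file

def rms_error(samples):
    """Root-mean-square deviation from the true brightness."""
    return (sum((s - TRUE_LEVEL) ** 2 for s in samples) / len(samples)) ** 0.5

# Simulate the same pixel across many shots, with and without stacking.
single = [random.gauss(TRUE_LEVEL, NOISE) for _ in range(2000)]
stacked = [
    sum(random.gauss(TRUE_LEVEL, NOISE) for _ in range(FRAMES)) / FRAMES
    for _ in range(2000)
]

# Stacking 9 frames should cut the noise roughly threefold (sqrt of 9).
print(rms_error(single))
print(rms_error(stacked))
```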
 
You're telling me something I already said. You were trying to say us noticing this horrible effect is because we're zooming, that there can't possibly be iPhone 13s with bad photo quality... when it's present regardless. I shouldn't have to buy and open a third-party app to not take ****** AI-edited photos on my $1000 phone.
 
Well, go take some photography classes and improve your skills instead of relying on the camera itself to give you a magic photo. I find it amusing that people think an expensive camera is going to give them fabulous photos. Photography is an art that you have to learn; sure, there are some happy accidents, but they are few when you don't understand how to use a camera.

A third-party camera app typically doesn't take any longer to open and use than the default camera app. You still have to compose your shot, adjust the exposure to compensate for the highlights or shadows, avoid distracting elements and take the shot. It all comes down to what type of photographer you want to be: one who takes his time to come away with a winning shot, or one who settles for an average shot because no effort was applied before taking the photo.
 
Facts, you can't simply point a camera and expect excellent results. I mean, I understand that that's why most of the folks here use their iPhones, as they typically just take great pictures without you having to do much.

However, you still need to understand that it's the photographer, not the device, that takes great photos. Since I last posted here, I've had nothing but excellent results with my 13 Pro. Again, I'm not a pro, but I know how to manually adjust my camera settings and frame a shot.
 

I would also add that a phone is a lot harder to hold steady than a DSLR. On top of that, sensor-shift image stabilization is not as good as SLR lens-based stabilization, and it can only go so far when shooting at 3x (77mm) or more, especially in lower light.
 
Upgraded to a 13 Pro recently from a 12P, thinking it would be the smart move before doing some traveling, and boy was I wrong. An incredibly disappointing and aggressive HDR effect has ruined a bunch of my photos. The Live version of them shows the difference. Really hope Apple fixes this; it is unacceptable to not have the option to turn it off. Never had these issues prior to the 13. It creates a literal outline around the legs, like it's a cel-shaded cartoon. Awful.
 

Attachments

  • EA26323E-C011-40F0-AA46-46CED9FF1611.jpeg
  • 59988ED2-3B77-49CB-978A-9EE3CE2F1706.jpeg
Yes, that is bad! There is no way to spin that any other way. As for the watercolor/painting effect on faces, etc., there may be an argument that many people, including the vast number of influencers, may actually want or prefer those kinds of photos. They may not come here and say so, but their preference shows in their Instagram photos, for example.


It would be cool to get input from some members here who are not male and over 60 like I am, to give us the other side of the argument. I'll admit the phone that gets rid of "chicken neck" was amazing to see (a Lew Later episode), and any pro photographer would be doing a disservice to their clients if they can't smooth out wrinkles and shrink "chicken neck". However, I would agree fully that for a pro phone, Apple should allow you to turn everything off. This must be AI at work, as it only gets worse each year, but mostly for people photos and not landscapes; a tree has no vanity…
 
Upgraded to 13 Pro recently from a 12P, thinking it would be the smart move before doing some traveling, and boy was I wrong. Incredibly disappointing and aggressive HDR effect has ruined a bunch of my photos. The Live version of them shows the difference. Really hope Apple fixes this, this is unacceptable to not have the option to turn it off. Never had these issues prior to 13. It creates a literal outline around the legs, like it's a cel shaded cartoon. Awful.
I see… the other issue I see is that you didn't adjust the exposure correctly. This is why the legs and shoes are blown out and the HDR software is trying to compensate for those overexposed areas. Once a JPEG file is overexposed, there is no fixing it; the highlights cannot be recovered. Touch to focus next time and drag the sunburst down to lower the exposure. Y'all are expecting the iPhone camera to do "everything" for you. That's not how the camera works. Most people that I talk to about the iPhone camera have no clue about any of the features, how the camera works, or how to adjust the exposure; they just take the shot and settle for what they get. How are your Photographic Styles set up? That could be the issue: over-saturation. You have to control the camera so it gives you the results you want "before" you take the shot. What you see on the screen before the photo is exactly what you get after you take the photo. I get excellent results from my iPhone 13 Pro Max every time because I know how to control the camera, and I have learned how the sensor "sees" a scene and I compensate for that…
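The claim about blown highlights holds for any 8-bit capture, iPhone or otherwise. A minimal sketch (values invented for illustration): once the sensor or JPEG encoder clips a value to 255, every brighter value maps to the same number, so pulling exposure down in post just darkens a flat white patch instead of restoring detail.

```python
def capture_8bit(linear_light):
    """Clip scene brightness into 8-bit JPEG-style values."""
    return [min(int(v), 255) for v in linear_light]

# A leg in direct sun: the real scene has gradation above the clip point.
scene = [180, 240, 300, 400]
jpeg = capture_8bit(scene)
print(jpeg)  # [180, 240, 255, 255]

# "Lowering exposure" in post scales the stored values, but the two
# brightest pixels are now identical -- their difference is gone for good.
darkened = [v * 0.7 for v in jpeg]
print(darkened[2] == darkened[3])  # True
```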
 
I'm very aware of how to adjust the exposure on the iPhone, and on my real camera. That's not the issue. I've taken some great photos with the 13. The problem is that Apple's AI is aggressive and unpredictable, and when it doesn't "work" the results are disastrous. If it's unpredictable to this point, then it should be optional to turn on or off.
 
Btw, I included a shot of "what I saw in camera" and that's what I expected it to look like in the end. Instead it "processed" it and decided that the skin tone needed to be unnaturally enhanced.
 
I understand that you are very aware of how to adjust the exposure; you just didn't do it on this example. Sometimes you need to underexpose your shots and then lift shadows when editing. Apple can't be blamed for stuff like this. It's not their problem. Only the handful of iPhone users in this thread are having the issue, probably because they may not know how to use the camera. Instagram's #shotoniphone13promax is loaded with 45k photos or more, and none of those folks are complaining… just saying.

The Halide app, Moment app, Beastcam app, Reeflex app… all of these skip the Smart HDR that the default camera uses, but they do use Deep Fusion, and the colors and details are more natural looking when they are used. They don't take any longer to open and use than the default camera. I suggest trying them, or pick one to use. I've posted so many shots showing what the iPhone 13 Pro camera is capable of, and I've offered my help, so I guess I'll not offer any more. Apparently you all are too set in your ways or just not willing to learn new apps or photo tricks. You have to be really interested in photography if you want better results. Taking snapshots and hoping for a masterpiece isn't going to happen.
 
Yeah yeah, I know all the workarounds, the other apps, how to expose, yada yada. I own Pro Camera by Moment and I've tried Halide. I've taken some nice photos I like with the phone, too. And then sometimes Apple's AI makes a bizarre decision about what I'm shooting, and it's irreversible and looks nothing like the viewfinder. This has never been an issue for me with any other iPhone. You can lecture me all day about how great you think it is or what you think I should be doing differently, but it doesn't change the fact that it's unpredictable and should be toggleable.
 
Btw, the photo on the left (the completely overexposed legs) is courtesy of Apple.
I think you're misunderstanding the photos as well. The overexposed one is from Apple's Smart HDR, and the one that's slightly underexposed is the actual photo I tried to take. I know not to blow out highlights.
 
Well, the default camera app on Apple's iPhone is designed to help people that have no idea how to take a photo, and it tries to produce the best results possible to make that person happy. That's why apps like Halide and Moment etc. are meant for professionals or photographers who want more out of their photographs and want more control over the camera. That's what those apps are designed for. Even those of you (myself included) that own a professional DSLR like a Nikon, Sony or Canon don't use the program mode on the camera and let it do everything for you. You frequently use aperture priority or manual and shoot in raw, so I don't see why applying those principles on the iPhone would be any different. If you want great photos from the iPhone, you have to use a third-party app and shoot in ProRAW or highest-quality JPEG. Only use the default camera for a quick shot when you don't have time to take a great photo. Those of you that do any professional work or want the best from your DSLR should already know to apply those same techniques to the iPhone for the best results… Once that gets accomplished, there won't be any more complaining.
 
I think it’s pretty well established that the issue is indeed with the default camera app, it’s an issue which didn’t exist in past iPhones, and “just pay $$ for another camera app” isn’t the guidance that folks are looking for.
 

Lol, or they just buy an iPhone 11PM, or wait for better AI on the iPhone 14... That is the point of this topic; don't continue with the same excuses for Apple. Have a nice day.
 
It feels like this thread often descends into photographic technique critique. It's not about that to me. Having a phone with a camera means I can pull it out of my pocket, quickly consider the shot, then take it. That's what I've done with every camera phone I've had. I could do that on my iPhone X and I was very happy with the results. However, within minutes of using the 13 I could tell something looked very different. This is why I returned it.

Having now upgraded to a refurb 11 Pro, I can see there's a difference from the X in terms of post-processing, but in most cases it looks better and it suits my requirements. I love the things they brought, like Night mode and the ultra-wide lens. The 11 Pro over the X feels like a huge jump for me.
 
One well-known fashion photographer (covers for Vogue, etc.) continues to shoot her social media content on her old 11PM and does not want to upgrade. The problems with the newer phones are very real.
 
Well, that just depends on what type of photographer you want to be and how you want the photos to look. I use the default camera all the time; however, I use ProRAW exclusively, so I don't have any issues with my photos or the camera. To get great shots, you have to go beyond the JPEG! Apparently you folks are going to continue to complain and blame Apple for everything regarding the camera instead of applying yourselves to do better at photography. This thread is a big waste of my time and probably others' as well… Take your mediocrity and run with it lol…
 