
kirk.vino

macrumors 6502a
Oct 27, 2017
667
1,013
That’s a good point. Though the larger aperture will always allow a faster shutter speed to avoid blur in moving subjects.
Shutter speed has nothing to do with the size/speed of the aperture. These are separate settings. Moreover, a slightly slower aperture is actually better for having sharper objects. The more you open the aperture, the more light you let in, which leads to some blurriness sometimes as well. That might be great for portraits, as there is more separation and bokeh, but for moving objects that might result in more blurriness. With a professional camera, you can control all these settings manually. With the iPhone you have no control over the shutter speed, that’s something that the algorithm determines itself for you. There might be some 3rd party apps that can let you control that manually though.
 
  • Like
Reactions: nemofish

ToddH

macrumors 68030
Jul 5, 2010
2,889
5,843
Central Tx
I did at first, but I regretted it, several times. Once the often-atrocious AI makes the photo, there’s no coming back. LivePhoto literally made the photos look better than the AI does. 13 Pro makes some great videos, especially in FilmicPro, but I can’t depend on it as a camera. iPhone 11 and 8 (I can go further back) made more consistent photos.
Interesting. I haven’t had that problem so far. Maybe because I’ve been a photographer for so long, I just know how to control the camera and adjust it the way I want ahead of time, before the shot. I’d love to get together with individuals like yourself and teach an iPhone photography class. That would be very fun and enlightening for me. Live Photos just don’t do that well for me. I’ve seen the loss of quality when using it. I shoot ProRAW nearly 100% of the time anyway, even more now with this amazing new 48MP sensor… wow!
 

ToddH

macrumors 68030
Jul 5, 2010
2,889
5,843
Central Tx
I just noticed that the ProCam app (ProCam 8) will now let you shoot 48 megapixels in JPEG instead of 12, and it looks pretty impressive so far.


48 MP JPEG and a huge crop from ProCam 8. I believe Moment, Halide, and others will follow suit soon as well.

[Attached images: 48 MP JPEG sample and crop]
 
Last edited:
  • Like
Reactions: I7guy

_gst_

macrumors newbie
Sep 16, 2022
4
7
if you’re shooting indoors and you get a shutter speed of 1/15 of a second, that’s not gonna cut it.

That's exactly my problem with the direction in which the iPhone Pros are moving. A lot of the new features seem to be optimized for studio-like photos (3x zoom, etc.), and given that there are tradeoffs, this means that other features more useful for everyday use are lacking.

One example between the iPhone 12 Pro and the 13 (not sure about the 14) is that the 3x tele lens on the 13 seems a lot more dependent on good lighting. Without enough light, the camera falls back to the main lens and uses digital zoom, which results in terrible image quality. This is not an issue on the iPhone 12 Pro: 2x tele photos pretty much always use the tele lens and never fall back to the main lens.

I understand what Apple is trying to achieve (reducing the gap between the iPhone and a DSLR), but if I were in a studio with good lighting conditions I could just as well use a DSLR instead of the iPhone. I'm not in a studio 99% of the time, though, and would benefit a lot more from a camera that performs well in everyday conditions. Unfortunately, for my personal usage the old 12 Pro seems to be a much better choice than the 13 Pro and the 14 Pro.
 
  • Like
Reactions: Saturn1217

ToddH

macrumors 68030
Jul 5, 2010
2,889
5,843
Central Tx
That's exactly my problem with the direction in which the iPhone Pros are moving. A lot of the new features seem to be optimized for studio-like photos (3x zoom, etc.), and given that there are tradeoffs, this means that other features more useful for everyday use are lacking.

One example between the iPhone 12 Pro and the 13 (not sure about the 14) is that the 3x tele lens on the 13 seems a lot more dependent on good lighting. Without enough light, the camera falls back to the main lens and uses digital zoom, which results in terrible image quality. This is not an issue on the iPhone 12 Pro: 2x tele photos pretty much always use the tele lens and never fall back to the main lens.

I understand what Apple is trying to achieve (reducing the gap between the iPhone and a DSLR), but if I were in a studio with good lighting conditions I could just as well use a DSLR instead of the iPhone. I'm not in a studio 99% of the time, though, and would benefit a lot more from a camera that performs well in everyday conditions. Unfortunately, for my personal usage the old 12 Pro seems to be a much better choice than the 13 Pro and the 14 Pro.
Well, keep in mind that when the first 36-megapixel and 50-megapixel Canon cameras came out, they warned photographers that motion blur would be more evident because of the higher resolution. A higher-resolution sensor shows more flaws, and it’s no different on the iPhone versus a DSLR. That was back when image stabilization wasn’t widespread yet, or only some lenses had it. The iPhone will cut down on that with sensor-shift image stabilization and get rid of a lot of that motion blur. The problem I see is people hitting the shutter button too hard when taking pictures with their iPhone; they act like it’s a physical button that needs to be pressed down, and that causes a lot of camera shake. In low light, taking it slow and holding your phone with a really steady grip is what makes all the difference when you have a high-resolution sensor like this.
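
To put rough numbers on the resolution-vs-motion-blur point above, here is a minimal sketch of how far a handheld shake smears across the pixel grid during one exposure. The shake rate, focal length, and sensor figures below are assumptions in the ballpark of a 48 MP phone main camera, not measured iPhone values, and stabilization is ignored.

```python
import math

def blur_in_pixels(shutter_s, shake_deg_per_s, focal_mm, sensor_width_mm, horizontal_px):
    """Estimate how many pixels a handheld angular shake smears across
    during one exposure. Purely geometric; ignores stabilization."""
    swept_rad = math.radians(shake_deg_per_s * shutter_s)   # angle swept during the exposure
    displacement_mm = focal_mm * math.tan(swept_rad)        # shift of the image on the sensor
    pixel_pitch_mm = sensor_width_mm / horizontal_px        # width of one pixel
    return displacement_mm / pixel_pitch_mm

# Assumed example values, roughly in the ballpark of a 48 MP phone main camera:
# ~6.9 mm focal length, ~9.8 mm sensor width, 8064 px across, 0.3 deg/s of hand shake.
for shutter in (1/15, 1/60, 1/250):
    px = blur_in_pixels(shutter, shake_deg_per_s=0.3,
                        focal_mm=6.9, sensor_width_mm=9.8, horizontal_px=8064)
    print(f"1/{round(1/shutter)} s -> ~{px:.1f} px of blur")
```

The same angular shake costs more pixels on a denser sensor simply because each pixel covers a smaller slice of the scene, which is the "higher resolution shows the flaws" effect described above.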
 

OriginalAppleGuy

Suspended
Sep 25, 2016
968
1,137
Virginia
It’s crazy, but 14 Pro photos look to be even more oversharpened than 13 Pro ones (and those were crazy oversharpened, the AI sometimes separating people and objects from the background, making them look like cardboard cutouts).
You get more detail this way with 14 Pro, yes, but it’s not pretty to look at.

Found some (not very positive) video about it

Watched the video and I'm just not impressed with that guy. When you get a new phone and restore from your last one, it can take a while for everything to make it over. The important stuff is copied over, but apps can take time. It could also be file optimization and other housekeeping items. And because it's new, you tend to use it more while you check things out. Those things impact battery life. The always-on display isn't it. And no, the 13 does not have the same display. The 14's display is capable of a 1 Hz refresh rate, which allows the screen to reduce battery drain. Also, the processor is an upgrade; I'm not sure the 13 could handle the computations needed to achieve the software Action Mode capability in a workable manner.

I came from the 12 that's just about 2 years old. After a few days, I'll be able to tell how the 14 compares with battery life. As for the cameras, can't say how they compare to my 12 - not really a fair comparison. I do know the lenses are larger than the 13. People, including this guy, have shown the video is a nice improvement. If there are some software tweaks to be had, I'm sure Apple will execute them.
 

ToddH

macrumors 68030
Jul 5, 2010
2,889
5,843
Central Tx
I absolutely hate the pics my 14 pro is taking. 13 pro's come out better every single time. Not really sure what is going on with the new 48mp sensor but it's terrible.
Samples? I need to see samples to see what’s going on in order to help you. My photos don’t come out horrible at all and they are much improved over the 13 pro. So why is it that I get fantastic photos and everybody else doesn’t? It’s either a setting or a technique issue.
 
  • Like
Reactions: I7guy

zakarhino

Contributor
Sep 13, 2014
2,607
6,958
I have no idea what you’re talking about or why you think this way. I don’t know why you think they’re oversharpened compared to the 13 Pro. The new larger sensor is fantastic and the optics are a lot sharper. Depending on your lighting conditions, you really have to pay attention to your shutter speed. The new sensor is much larger now with a lot more pixels, so any moving subject is going to blur if the shutter speed is slow. The new sensor is probably going to require higher shutter speeds so that your objects and people will look sharp and clean. If you’re shooting indoors and you get a shutter speed of 1/15 of a second, that’s not gonna cut it. I can’t believe people are bashing the image quality of the new iPhone 14 Pro, that’s just crazy. I’m sure a lot of this possibly comes down to user error.

Software oversharpening is user error?
Incorrect white balance is user error?
Horrific video lens flares are user error?
Poor skin tones are user error?
Washed out colors are user error?
Overexposure and poor dynamic range are user errors?
Poor portrait mode hair detection is user error?

Did you even watch the guy's video? What he's demonstrating is so plainly obvious, especially when he zooms in on his daughter's face. 13 Pro = less sharp but balanced. 14 Pro = oversharpened so much around the eyes she kinda looks like an alien. oversharpened hair. overexposed.

Guy in the video needs to take some photography lessons. All he is doing is pointing the camera and tripping the shutter to get a shot. There’s no control over the highlights or anything on his part. He’s just complaining because he doesn’t know enough about photography.

Mate you're saying the guy is taking photos wrong because he's pointing the camera and pressing the shutter button -- that's literally the freaking point of a smartphone camera smh. Real "you're holding it wrong" energy. The point of iPhone is that you DON'T need to know about photography: ISO, shutter speed, aperture, white balance, etc. are all things you're supposed to not think about which is why Apple absolutely do not expose them to you in the Camera app viewfinder.

If your response to abysmal results with the 14 Pro is that people need to download third party apps, take RAW photos, and edit everything every time they want a usable photo then you're entirely lost.

"user error"
"doesn't know about photography"
"technique issue"

all because he dared to use the Stock camera app and take a normal photo just like Apple advertise and optimize for, the same as every other manufacturer. bloody hell sometimes I can't believe this forum 😂😂😂
 

zakarhino

Contributor
Sep 13, 2014
2,607
6,958
I did at first, but I regretted it, several times. Once the often-atrocious AI makes the photo, there’s no coming back. LivePhoto literally made the photos look better than the AI does. 13 Pro makes some great videos, especially in FilmicPro, but I can’t depend on it as a camera. iPhone 11 and 8 (I can go further back) made more consistent photos.

The 11 Pro Max was one of the best iPhone camera systems ever made because it had the most refined algorithm before the oil-painting effect came around. I remember watching comparisons between the 12 Pro and 11 Pro and couldn't believe my eyes at how much oversharpening and smoothing Apple were doing with the new 12 Pro Smart HDR stuff. The 12 Pro was also the first iPhone to remove the Smart HDR toggle, unlike the 11 Pro.

If the Pixel 7 Pro performs another hat trick and improves significantly vs. the 6 Pro I will have to start pulling a Marques: carrying both an iPhone and Pixel, the iPhone for video, browsing, media, emails, etc., and the Pixel for photos.
 

ToddH

macrumors 68030
Jul 5, 2010
2,889
5,843
Central Tx
Software oversharpening is user error?
Incorrect white balance is user error?
Horrific video lens flares are user error?
Poor skin tones are user error?
Washed out colors are user error?
Overexposure and poor dynamic range are user errors?
Poor portrait mode hair detection is user error?

Did you even watch the guy's video? What he's demonstrating is so plainly obvious, especially when he zooms in on his daughter's face. 13 Pro = less sharp but balanced. 14 Pro = oversharpened so much around the eyes she kinda looks like an alien. oversharpened hair. overexposed.



Mate you're saying the guy is taking photos wrong because he's pointing the camera and pressing the shutter button -- that's literally the freaking point of a smartphone camera smh. Real "you're holding it wrong" energy. The point of iPhone is that you DON'T need to know about photography: ISO, shutter speed, aperture, white balance, etc. are all things you're supposed to not think about which is why Apple absolutely do not expose them to you in the Camera app viewfinder.

If your response to abysmal results with the 14 Pro is that people need to download third party apps, take RAW photos, and edit everything every time they want a usable photo then you're entirely lost.

"user error"
"doesn't know about photography"
"technique issue"

all because he dared to use the Stock camera app and take a normal photo just like Apple advertise and optimize for, the same as every other manufacturer. bloody hell sometimes I can't believe this forum 😂😂😂
What? You can’t believe this forum? The iPhone default camera is a point-and-shoot camera! You can’t rely on it to give you proper skin tones, white balance, and everything else you mentioned. So yes, if you don’t know how to adjust the camera for those issues you mentioned, it’s user error.

Did you watch the Apple event for the iPhone 14 Pro? Remember how great the photos looked as they were presented? The images they showed from the iPhone 14 Pro were taken by seasoned professionals who were pushing the camera to its limits. They weren’t taken by random people, or in point-and-shoot JPEG mode. They didn’t rely on the iPhone to give them a great shot; they had to create, adjust, and make the camera do what they wanted. You have to learn to adjust these things for yourself. That’s what photography is all about: creating an image. Apple’s default camera is designed for people who know “nothing” about photography… it’s designed to give the average person a good photo without any adjustments, and the majority are happy with that.

The camera is just a tool; it will do basic photography in the hands of those who don’t know how to control it. You have to learn how to “see” like the camera does. Example: point the camera at a black cloth and what does it do? It renders that cloth overexposed, because the exposure meter in the camera is calibrated to 18% gray and it is a reflective meter instead of an incident meter. A reflective meter measures the light bouncing off an object, whereas an incident meter measures the light falling on it. You’ll have to adjust the exposure compensation to fix that (there’s a small numeric sketch of this after this post).

Everyone relies too heavily on the camera to do everything for them instead of creating a better photo. You have to work at it; it isn’t going to just magically give you a fabulous photo. And don’t say that it should because you paid $1,200 or more for the iPhone. A $5,000 Nikon is the same: it will not produce a winning photo just because you expect it to. Like I said, the camera is a tool.

Say I bought a full mechanic’s tool set to work on a car, had just enough mechanical knowledge to get by, turned some nuts and bolts, added this piece and tightened that piece, and the vehicle still didn’t get fixed. Would I say, “Man, those tools suck! They can’t even fix this car and I paid a lot of money for them”? The same goes for photography. An expensive camera won’t put your images in National Geographic. I know that was an extreme example, but you get my point.

Yes, third-party apps give you way more control over the camera. The question is, are you willing to put forth the effort to get a better image instead of settling for one produced automatically? Yes, sometimes happy accidents happen and an award-winning photo is captured with no effort, but photography requires skill and patience, and most people don’t make the effort. Sorry, but those are just the facts.
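
As a concrete illustration of the 18% gray metering point above, here is a small sketch of how a reflective meter's middle-gray assumption pushes a black subject toward overexposure and how exposure compensation counters it. The reflectance values are rough textbook assumptions, not anything measured from an iPhone.

```python
import math

MID_GRAY = 0.18  # reflectance a reflective meter assumes the scene averages to

def metering_error_stops(subject_reflectance):
    """Stops of over-/underexposure a naive reflective meter introduces when it
    maps the subject's average reflectance to middle gray (positive = overexposed)."""
    return math.log2(MID_GRAY / subject_reflectance)

def residual_error_stops(subject_reflectance, exposure_compensation):
    """Error left over after the photographer dials in exposure compensation (in stops)."""
    return metering_error_stops(subject_reflectance) + exposure_compensation

black_cloth = 0.04  # assumed ~4% reflectance for a dark fabric
white_wall = 0.80   # assumed ~80% reflectance

print(f"Black cloth, no EC : {metering_error_stops(black_cloth):+.1f} stops")
print(f"Black cloth, -2 EC : {residual_error_stops(black_cloth, -2):+.1f} stops")
print(f"White wall, no EC  : {metering_error_stops(white_wall):+.1f} stops")
print(f"White wall, +2 EC  : {residual_error_stops(white_wall, +2):+.1f} stops")
```

The same logic in reverse explains why snow or a white wall comes out gray and dull unless you dial in positive compensation.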
 
Last edited:

Sweatypalms557

Suspended
Sep 18, 2022
71
135
I was sure they would fix the Camera app AI often making cardboard cutouts out of people, but they didn’t. View attachment 2071379
I mean you shoot into the sun and expect the image to come out magically perfect? If I shot the same image on my Sony A7 full frame I wouldn’t get anything that nice.

Please practice photography and stop blaming equipment.
 

slplss

macrumors 6502a
Nov 2, 2011
946
1,010
EU
The images they showed from the iPhone 14 Pro were taken by seasoned professionals who were pushing the camera to its limits. They weren’t taken by random people, or in point-and-shoot JPEG mode. They didn’t rely on the iPhone to give them a great shot; they had to create, adjust, and make the camera do what they wanted. You have to learn to adjust these things for yourself. That’s what photography is all about: creating an image. Apple’s default camera is designed for people who know “nothing” about photography… it’s designed to give the average person a good photo without any adjustments, and the majority are happy with that.
Several of the key photos from my road trip are ruined forever by the atrocious AI. My mistake that, after I climbed the mountain in time before sunrise, I didn’t think to enable Live Photo, wasn’t a professional photographer, didn’t have space for ProRAW or the patience to learn photography, or even the time to fiddle with setting up the photo if I had.
Or… I could just have taken these photos with an iPhone 11 or earlier, or some Android that doesn’t do this.
You have to understand, the majority of people don’t use ProRAW or Halide. Nobody in my group has ever heard of Filmic. When we compare cameras, iPhone with iPhone or with an Android phone, we’re comparing the photos from the default app and the settings most people will use. So, the iPhone 12 and above can occasionally take some **** photos, and that’s a fact.
 
  • Like
Reactions: jamesrick80

slplss

macrumors 6502a
Nov 2, 2011
946
1,010
EU
I mean you shoot into the sun and expect the image to come out magically perfect? If I shot the same image on my Sony A7 full frame I wouldn’t get anything that nice.

Please practice photography and stop blaming equipment.
I have plenty of other pictures where I am not facing the sun and the iPhone produces the same cardboard-cutout effect; I don’t want to share group photos though.
 
Last edited:

Sweatypalms557

Suspended
Sep 18, 2022
71
135
I could show you plenty of other pictures when I am not facing the sun and iPhone produces the same cardboard cutout effect, but thank you for your patronising.
Then show them. All we see is a shot into the sun and expectations of a clean image.
 
  • Like
Reactions: JM

slplss

macrumors 6502a
Nov 2, 2011
946
1,010
EU
Then show them. All we see is a shot into the sun and expectations of a clean image.
Edited, sorry, it’s mostly photos of other people and I respect their privacy. You’ll just have to believe my disappointment with iPhone 13 Pro photography is real. It really is the first time I am truly disappointed with my camera. When I was in a group of people, I asked others to take the group photos, because I knew I couldn’t rely on my phone.
 
  • Like
Reactions: lkalliance

Sweatypalms557

Suspended
Sep 18, 2022
71
135
Edited, sorry, it’s mostly photos of other people and I respect their privacy. You’ll just have to believe my disappointment with iPhone 13 Pro photography is real. It really is the first time I am truly disappointed with my camera. When I was in a group of people, I asked others to take the group photos, because I knew I couldn’t rely on my phone.
Ok. Unfortunate you took all of your photos into the sun. It is ok if you don’t want to show the simple mistakes you made. Try to be more aware of where the sun is and have better expectations.
 

slplss

macrumors 6502a
Nov 2, 2011
946
1,010
EU
Ok. Unfortunate you took all of your photos into the sun. It is ok if you don’t want to show the simple mistakes you made. Try to be more aware of where the sun is and have better expectations.
You can't be serious. Re-read. I have taken plenty of photos where I was not facing the sun and the iPhone makes OUTLINES of people, aka makes them look like cardboard cutouts. The iPhone 8 did not make photos like these, the iPhone 11 didn't make photos like these. This comment section is ridiculous.
 
Last edited:

Jackbequickly

macrumors 68040
Aug 6, 2022
3,159
3,258
You can't be serious. Re-read. I have taken plenty of photos where I was not facing the sun and the iPhone makes OUTLINES of people, aka makes them look like cardboard cutouts. The iPhone 8 did not make photos like these, the iPhone 11 didn't make photos like these. This comment section is ridiculous.

If you could, take some additional, non-private images today and post them. I have used my 14 PM camera a good bit and am extremely happy with the results, especially in lower-light situations. Thanks
 
  • Like
Reactions: JM

slplss

macrumors 6502a
Nov 2, 2011
946
1,010
EU
If you could, take some additional, non-private images today and post them. I have used my 14 PM camera a good bit and am extremely happy with the results, especially in lower-light situations. Thanks
Glad you're happy with your low light. This is my low light: noisy, oversharpened, overexposed. Simply unacceptable and unusable as a photo, facing the sunset or not. No other phone or camera would make it look like this.
IMG_9197.jpeg
 

ToddH

macrumors 68030
Jul 5, 2010
2,889
5,843
Central Tx
Several of the key photos from my road trip are ruined forever by the atrocious AI. My mistake that, after I climbed the mountain in time before sunrise, I didn’t think to enable Live Photo, wasn’t a professional photographer, didn’t have space for ProRAW or the patience to learn photography, or even the time to fiddle with setting up the photo if I had.
Or… I could just have taken these photos with an iPhone 11 or earlier, or some Android that doesn’t do this.
You have to understand, the majority of people don’t use ProRAW or Halide. Nobody in my group has ever heard of Filmic. When we compare cameras, iPhone with iPhone or with an Android phone, we’re comparing the photos from the default app and the settings most people will use. So, the iPhone 12 and above can occasionally take some **** photos, and that’s a fact.
I can understand that… but I also think most people know what they are doing photography-wise when buying a “Pro” model iPhone. There is much to learn when using professional equipment, or equipment designed for the pros. I’m not saying you shouldn’t buy a 14 Pro; go for it. Just think about it this way: once you learn a little bit more about photography, that Pro model iPhone is going to be more valuable to you than it was when you first purchased it. There is a lot of satisfaction in realizing that you captured an image you created yourself instead of relying on the iPhone to do it for you. Of course, when the iPhone does create a remarkable image on its own without any input from the person taking the photo, there’s some gratification and excitement there as well.
 

Dented

macrumors 65816
Oct 16, 2009
1,126
909
Shutter speed has nothing to do with the size/speed of the aperture. These are separate settings. Moreover, a slightly slower aperture is actually better for having sharper objects. The more you open the aperture, the more light you let in, which leads to some blurriness sometimes as well. That might be great for portraits, as there is more separation and bokeh, but for moving objects that might result in more blurriness. With a professional camera, you can control all these settings manually. With the iPhone you have no control over the shutter speed, that’s something that the algorithm determines itself for you. There might be some 3rd party apps that can let you control that manually though.
This is all pretty garbled. There’s no such thing as a “slower aperture”, it’s simply a hole that gets bigger or smaller and lets more or less light in, and letting more light in does not “lead to blurriness” for moving objects, although it will affect bokeh and make sharp focus more critical.

Exposure is controlled by aperture (the amount of light coming through the lens), shutter speed (the amount of time there is for that light to hit the sensor) and ISO (the sensitivity of the sensor - more sensitivity means more noise).

To get a sharp image of a moving object, you need to reduce the risk of motion blur by using a faster shutter speed. That reduces the amount of time there is for light to hit the sensor, so to get a good exposure that must be compensated for either by widening the aperture to let more light in, or ramping up the ISO. An automated camera like the iPhone will do either or both of those things depending on its algorithm and the amount it needs to compensate for that faster shutter vs the available light. Similarly, reducing the aperture may require a slower shutter speed, and consequently more motion blur (it’s important btw to understand the distinction between motion blur and lack of focus - the camera can be perfectly focussed but still have a blurred subject if it, or the camera, is moving too quickly in relation to the shutter speed).

So in short, there is a relationship between aperture and shutter speed, and increasing or decreasing one will affect the other.
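
To make the aperture/shutter/ISO trade-off above concrete, here is a minimal sketch using the standard EV100 formula: speeding up the shutter has to be paid for with a wider aperture or a higher ISO. The f-number, shutter speeds, and ISO below are just example inputs, not anything specific to the iPhone.

```python
import math

def exposure_value(f_number, shutter_s, iso):
    """Standard exposure value referenced to ISO 100: EV100 = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

def iso_for_same_exposure(f_number, shutter_s, target_ev):
    """ISO needed to hold target_ev once the aperture and shutter speed are fixed."""
    return 100 * 2 ** (math.log2(f_number ** 2 / shutter_s) - target_ev)

# Baseline: a dim indoor scene metered at f/1.8, 1/15 s, ISO 100 (example numbers).
base_ev = exposure_value(1.8, 1 / 15, 100)

# Freeze motion at 1/250 s with the same fixed f/1.8 aperture: how much ISO is needed?
needed_iso = iso_for_same_exposure(1.8, 1 / 250, base_ev)

print(f"Scene brightness: EV100 {base_ev:.1f}")
print(f"Keeping f/1.8 at 1/250 s needs roughly ISO {needed_iso:.0f} for the same exposure")
```

With these inputs, going from 1/15 s to 1/250 s costs about four stops, which a phone with a fixed aperture has to recover through ISO, at the price of more noise.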
 
  • Like
Reactions: PlayUltimate

kirk.vino

macrumors 6502a
Oct 27, 2017
667
1,013
This is all pretty garbled. There’s no such thing as a “slower aperture”, it’s simply a hole that gets bigger or smaller and lets more or less light in, and letting more light in does not “lead to blurriness” for moving objects, although it will affect bokeh and make sharp focus more critical.

Exposure is controlled by aperture (the amount of light coming through the lens), shutter speed (the amount of time there is for that light to hit the sensor) and ISO (the sensitivity of the sensor - more sensitivity means more noise).

To get a sharp image of a moving object, you need to reduce the risk of motion blur by using a faster shutter speed. That reduces the amount of time there is for light to hit the sensor, so to get a good exposure that must be compensated for either by widening the aperture to let more light in, or ramping up the ISO. An automated camera like the iPhone will do either or both of those things depending on its algorithm and the amount it needs to compensate for that faster shutter vs the available light. Similarly, reducing the aperture may require a slower shutter speed, and consequently more motion blur (it’s important btw to understand the distinction between motion blur and lack of focus - the camera can be perfectly focussed but still have a blurred subject if it, or the camera, is moving too quickly in relation to the shutter speed).

So in short, there is a relationship between aperture and shutter speed, and increasing or decreasing one will affect the other.
A slower or faster aperture is another way of saying a smaller or bigger aperture in the photography world. I’ve been shooting on my Sony A7 III for many years now, so when I say that letting more light in and opening the aperture more creates some blurriness as well, I know what I’m talking about. When the aperture is open all the way, only your main subject is in focus, the rest is all blurred out, and even your main subject may appear softer with less detail. In the same way, closing the aperture creates an overall sharper image with most of the frame in focus and the main subject showing more detail.
Not sure why you even brought up ISO here, no one was talking about that in the first place, speaking of “garbled” LOL
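
For reference, the depth-of-field effect being described above can be put in numbers with the standard hyperfocal-distance approximation. This is a generic textbook sketch; the 50 mm focal length, 0.03 mm circle of confusion, and subject distance are assumed full-frame example values, not Sony A7 III or iPhone specifics.

```python
def depth_of_field_m(focal_mm, f_number, subject_m, coc_mm=0.03):
    """Approximate near/far limits of acceptable sharpness using the standard
    hyperfocal-distance formulas (thin lens, full-frame circle of confusion)."""
    f = focal_mm
    s = subject_m * 1000.0                     # work in millimetres
    hyperfocal = f ** 2 / (f_number * coc_mm) + f
    near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    far = s * (hyperfocal - f) / (hyperfocal - s) if s < hyperfocal else float("inf")
    return near / 1000.0, far / 1000.0         # back to metres

# Assumed example: a 50 mm lens focused at 3 m, wide open vs stopped down.
for f_number in (1.8, 8.0):
    near, far = depth_of_field_m(50, f_number, 3.0)
    far_txt = "infinity" if far == float("inf") else f"{far:.2f} m"
    print(f"f/{f_number}: acceptably sharp from {near:.2f} m to {far_txt}")
```

In this example, stopping down from f/1.8 to f/8 grows the in-focus zone from a few tens of centimetres to nearly two metres, which is the sharpness difference being described.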
 
Last edited:
  • Like
Reactions: Snot Rox

MrAperture

macrumors 6502a
Sep 16, 2017
738
922
SF, CA
Glad you're happy with your low light. This is my low light: noisy, oversharpened, overexposed. Simply unacceptable and unusable as a photo, facing the sunset or not. No other phone or camera would make it look like this.
View attachment 2071888
You just showed two pictures with the sunrise or sunset in the background. With Smart HDR there’s a tradeoff: it wants to expose every detail in the shadows, your human subject included.
 
Last edited:
  • Like
Reactions: JM