
Hunter5117

macrumors 6502a
Mar 17, 2010
569
401
I’m not an expert on this but I believe the larger the sensor, the more it compensates for a slower aperture. That applies to low light as well as bokeh.
Not necessarily. A larger sensor with the same number of pixels will be more sensitive, but in general it depends on the size of the pixels and how many photons they can intercept in the same time frame, i.e. the same shutter speed, assuming the ISO remains the same. The iPhone 14PM has a larger sensor (9.8x7.3mm) vs. (7.5x5.7mm) in the 13PM. However, it also has 48 megapixels vs. 12MP in the 13PM, so the pixels are considerably smaller. However, in "normal" shooting, when staying at 12MP, the 14PM will bin together 4 of the smaller pixels to make a single larger pixel, which is then roughly 1.25 times larger than the 13PM's. So, those binned "quad" pixels should have greater light sensitivity. If you shoot in ProRAW mode at 48MP, then you will have much less light sensitivity per pixel and poorer performance in low light conditions, again at the same ISO.
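For anyone who wants to sanity-check that, here is a rough back-of-the-envelope sketch (Python) using the sensor dimensions and megapixel counts quoted above; the exact figures Apple uses internally may differ slightly:

```python
import math

def pixel_pitch_um(width_mm, height_mm, megapixels):
    """Approximate pixel pitch in micrometres for a sensor of the given size."""
    pixel_area_mm2 = (width_mm * height_mm) / (megapixels * 1e6)
    return math.sqrt(pixel_area_mm2) * 1000  # mm -> um

pitch_13pm = pixel_pitch_um(7.5, 5.7, 12)   # 13PM main sensor, 12MP
pitch_14pm = pixel_pitch_um(9.8, 7.3, 48)   # 14PM main sensor, 48MP
pitch_14pm_binned = pitch_14pm * 2          # 2x2 binning doubles the linear pitch

print(f"13PM pixel pitch:        {pitch_13pm:.2f} um")
print(f"14PM native pixel pitch: {pitch_14pm:.2f} um")
print(f"14PM binned pixel pitch: {pitch_14pm_binned:.2f} um "
      f"(~{pitch_14pm_binned / pitch_13pm:.2f}x the 13PM)")
```

That comes out to roughly 1.9 µm for the 13PM, 1.2 µm native for the 14PM, and about 2.4 µm for the binned quad pixel, i.e. in the same ballpark as the "roughly 1.25 times larger" figure above.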

Regarding the slower aperture, that is a function of the larger sensor. As sensors get larger, it takes a longer focal length lens to give a "normal" appearance to the photo; i.e., a 50mm lens is "normal" on a full-frame 35mm sensor, but it is a 2x telephoto on a micro 4/3 sensor. Longer focal length lenses are more difficult to make in large (aka fast) apertures, hence the slightly lower max aperture on the 14PM. If you stay in 12MP mode, this should not be a problem since the larger quad pixels will more than compensate.
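If it helps, the "normal lens" point is just arithmetic: the 35mm-equivalent focal length is the actual focal length multiplied by the diagonal crop factor. A minimal sketch, using Micro Four Thirds (17.3x13mm, chosen here purely as an example format) to show the 2x effect mentioned above:

```python
import math

FULL_FRAME_DIAG_MM = math.hypot(36, 24)  # ~43.3 mm

def equivalent_focal_length(actual_fl_mm, sensor_w_mm, sensor_h_mm):
    """35mm-equivalent focal length via the diagonal crop factor."""
    crop = FULL_FRAME_DIAG_MM / math.hypot(sensor_w_mm, sensor_h_mm)
    return actual_fl_mm * crop

# A 50mm lens is "normal" on full frame, but on Micro Four Thirds it frames
# like a ~100mm short telephoto:
print(f"{equivalent_focal_length(50, 17.3, 13.0):.0f} mm equivalent")  # ~100 mm
```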

Moreover, a slightly slower aperture is actually better for having sharper objects. The more you open the aperture, the more light you let in, which sometimes leads to some blurriness as well.
I’ve been shooting on my Sony A7III for many years now, so when I say that letting more light in and opening the aperture more creates some blurriness as well, I know what I’m talking about.
Letting in more light does not make things less sharp; that is a function of the quality of the lens. Yes, smaller apertures will give sharper overall focus due to depth of field. However, that does not have anything to do with the sharpness at the point of focus. Larger apertures have a shallower depth of field, making accurate focus more important and producing the out-of-focus highlights often described as "bokeh".
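To put rough numbers on the depth-of-field point, here is a small sketch of the standard hyperfocal-distance formulas, assuming a full-frame camera and the conventional 0.03mm circle of confusion (real lenses and focus systems add their own wrinkles):

```python
def depth_of_field_mm(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Near/far limits of acceptable focus from the standard hyperfocal formulas."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
    far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
    return near, far

# 50mm lens focused at 2 m on full frame:
for f_number in (1.8, 8):
    near, far = depth_of_field_mm(50, f_number, 2000)
    print(f"f/{f_number}: in focus from {near / 1000:.2f} m to {far / 1000:.2f} m "
          f"(~{(far - near) / 10:.0f} cm deep)")
```

Stopping down from f/1.8 to f/8 in that example takes the zone of acceptable focus from roughly 17 cm deep to nearly 80 cm; sharpness at the exact point of focus is a separate, lens-quality question, as noted above.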
 
  • Like
Reactions: IT Provisor

kirk.vino

macrumors 6502a
Oct 27, 2017
667
1,013
Not necessarily. A larger sensor with the same number of pixels will be more sensitive, but in general it depends on the size of the pixels and how many photons they can intercept in the same time frame, i.e. the same shutter speed, assuming the ISO remains the same. The iPhone 14PM has a larger sensor (9.8x7.3mm) vs. (7.5x5.7mm) in the 13PM. However, it also has 48 megapixels vs. 12MP in the 13PM, so the pixels are considerably smaller. However, in "normal" shooting, when staying at 12MP, the 14PM will bin together 4 of the smaller pixels to make a single larger pixel, which is then roughly 1.25 times larger than the 13PM's. So, those binned "quad" pixels should have greater light sensitivity. If you shoot in ProRAW mode at 48MP, then you will have much less light sensitivity per pixel and poorer performance in low light conditions, again at the same ISO.

Regarding the slower aperture, that is a function of the larger sensor. As sensors get larger, it takes a longer focal length lens to give a "normal" appearance to the photo; i.e., a 50mm lens is "normal" on a full-frame 35mm sensor, but it is a 2x telephoto on a micro 4/3 sensor. Longer focal length lenses are more difficult to make in large (aka fast) apertures, hence the slightly lower max aperture on the 14PM. If you stay in 12MP mode, this should not be a problem since the larger quad pixels will more than compensate.



Letting in more light does not make things less sharp; that is a function of the quality of the lens. Yes, smaller apertures will give sharper overall focus due to depth of field. However, that does not have anything to do with the sharpness at the point of focus. Larger apertures have a shallower depth of field, making accurate focus more important and producing the out-of-focus highlights often described as "bokeh".
Note how I used the word “sometimes” in my post without going into much detail on different types of lenses, to avoid getting too meandering in my response.
Yes, the quality of the lens would, of course, affect that as well. However, having worked with different lenses, I can tell that most of them behave in a similar manner in that respect.
 

sylox

macrumors newbie
Nov 13, 2020
4
4
I have to disagree. Could you provide a sample with EXIF data? You probably have night mode turned off. If you don’t, then we need a sample. There is a possibility that a setting is disabled. How low are the lighting conditions you’re talking about?
The video posted by @slplss is exactly what I meant by low light conditions. The segment at ~3:56, where the video creator shows his children's playhouse, shows the noise. In the video he talks about AI oversharpening and noise, and I think it is mostly the AI.

When comparing the test images of the iPhone 13 Pro and iPhone 14 Pro, the image file size is always ~50% smaller: iPhone 13 Pro (12MP, 26mm, f1.5, ISO 1000) vs. iPhone 14 Pro (12MP, 24mm, f1.78, ISO 1000), with an increase in the brightness of the light source in order to match the same ISO values. I can't explain where and why ~50% of the data is lost.

@ToddH sorry, I can't/don't want to upload the test pictures of my messy 🙈 room. Rather, I wanted to find out if others also recognized the noise in the pictures and it seems so.

Can someone confirm the smaller file size in low light?
 

The Game 161

macrumors Nehalem
Dec 15, 2010
30,966
20,163
UK
The main lens of my iPhone 14 Pro Max takes horrible 👿 low light pictures compared to my previous iPhone 13 Pro Max. This is because the f1.78 lens takes pictures at a higher ISO compared to the f1.5 lens. Noise and washed-out colors are the result.
That shouldn't be happening at all. Low light is where the main improvements are, and they're quite obvious based on my testing. In daylight shots you won't find much difference.
 

The Game 161

macrumors Nehalem
Dec 15, 2010
30,966
20,163
UK
This brand new sensor, I'm sure, will get some more optimisation as updates go on, but the real difference will be if you shoot in ProRAW. That will give you the results you really can be proud of.
 
  • Like
Reactions: I7guy

gtg465x

macrumors 6502a
Sep 12, 2016
754
883
The 11 Pro Max was one of the best iPhone camera systems ever made because it had the most refined algorithm before the oil-painting effect came around. I remember watching comparisons between the 12 Pro and 11 Pro and couldn't believe my eyes at how much oversharpening and smoothing Apple were doing with the new 12 Pro Smart HDR stuff. The 12 Pro was also the first iPhone that removed the Smart HDR toggle, unlike the 11 Pro.

If the Pixel 7 Pro performs another hat trick and improves significantly vs. the 6 Pro I will have to start pulling a Marques: carrying both an iPhone and Pixel, the iPhone for video, browsing, media, emails, etc., and the Pixel for photos.

I have the Smart HDR toggle on my 12 Pro Max and can turn it off.
 

Alfieg

macrumors regular
Feb 5, 2011
213
266
My 14 Pro Max has been taking horrible photos in 1x and with moderate light (compared to my 13 Pro).

After some testing, it seems that it is sluggish to switch in and out of macro; even with significant distance from the subject, it either doesn’t switch or takes bloomin’ ages.

I think this is causing the issue for me.

Hope this can be fixed in software.
 

Alfieg

macrumors regular
Feb 5, 2011
213
266
Ah, hang on, it’s not switching in and out of macro as I turned macro off.

It’s switching between something else, and the difference in noise is huge.

Didn’t have this with the 13 Pro.
 

enthawizeguy

macrumors 6502
Jun 10, 2007
494
54
North Hollywood, CA
While Apple has touted the better iPhone 14 Pro cameras, I noticed that the maximum aperture for the main camera has actually been downgraded compared to the 13 Pro. It has gone from a max aperture of f1.5 to f1.78. Even the 12 and 13 non-Pros have a larger max aperture at f1.6. I appreciate that there are probably physical limitations to a larger aperture given there is now a larger 48 megapixel sensor (and the lenses are already rather… substantial…)

This could have an impact on a) low light shots and b) isolating the foreground when going for the background blur look. It's a similar story with the ultra-wide lens, though not with the telephoto. Probably not an issue for most users and low impact in day-to-day use - but maybe something for the ‘pro’ users to be aware of.
The large sensor compensates. Also, the higher the f-number used to take the photo, the sharper your photo will be; if it's wide open you will get better bokeh, but that doesn't make a good photo. I shoot most photos between f6 and f8 on a normal camera.
 

JM

macrumors 601
Nov 23, 2014
4,086
6,381
Glad you're happy with your low light. This is my low light: noisy, oversharpened, overexposed. Simply unacceptable and unusable as a photo, facing the sunset or not. No other phone or camera would make it look like this.
[Attached sample photo]
That’s a really good software compensation for a backlit subject.

I find the worst cardboard cutouts happen when the person is backlit, as Apple assumes that you want to see the person you’re taking a photo of… so it kind of HAS to make a bad-looking, Photoshop-job cardboard cutout.
 
  • Like
Reactions: lkalliance

OriginalAppleGuy

Suspended
Sep 25, 2016
968
1,137
Virginia
Glad you're happy with your low light. This is my low light: noisy, oversharpened, overexposed. Simply unacceptable and unusable as a photo, facing the sunset or not. No other phone or camera would make it look like this.
[Attached sample photo]

Good lord man, you need to get that phone replaced. I agree - look at what it did to that person's face and the other one you posted. It's like your sensor has missing pixel collectors or something.
 

nemofish

macrumors regular
Mar 11, 2019
142
129
I have no idea what you’re talking about and why you think this way. I don’t know why you think they’re oversharpened compared to the 13 Pro. The new larger sensor is fantastic and the optics are a lot sharper. Depending on your lighting conditions, you really have to pay attention to your shutter speed. The new sensor is much larger now with a lot more pixels, so any slow-moving subject is gonna blur. The new sensor is probably going to require higher shutter speeds so that your objects and people will look sharp and clean. If you’re shooting indoors and you get a shutter speed of 1/15 of a second, that’s not gonna cut it. I can’t believe people are bashing the image quality of the new iPhone 14 Pro, that’s just crazy. I’m sure a lot of this possibly comes down to user error.
The sensor is good, that’s shown in the ProRAW samples, but regular 12MP JPEGs are almost indistinguishable from the 13 Pro and even the 11 Pro. In some cases it’s even worse, as Apple’s JPEG engine is terrible for oversharpening and heavy noise reduction.
 

Dented

macrumors 65816
Oct 16, 2009
1,126
909
Slower or faster aperture is another way of saying smaller or bigger aperture in the photography world. I’ve been shooting on my Sony A7III for many years now, so when I say that letting more light in and opening the aperture more creates some blurriness as well, I know what I’m talking about.
No, you don’t. Well done for knowing the name of a big boy camera, but you’re talking here about the optical attributes of certain lenses you’ve used (probably the kit lens by the sound of things), not anything to do with letting in more light per se.

Yes, a wider aperture will narrow the depth of field - the amount of the image front to back in focus - but it doesn’t introduce or “create” blurriness of the subject itself, and isn’t inherently bad for moving subjects as you originally said - quite the opposite. Often, to ensure a moving subject is sharp and not subject to motion blur, you (or the camera’s automatic exposure system) must set a wider aperture in order to allow for a faster shutter speed.

Not sure why you even brought up ISO here, no one was talking about that in the first place, speaking of “garbled” LOL
Because it’s another element of exposure. You originally said that shutter speed has nothing to do with aperture, when in reality shutter speed, aperture and ISO are all inextricably linked, and you can’t change one without the others being affected (assuming you want to expose correctly). Your “slower” aperture may lead to both a slower shutter speed and a higher ISO, meaning more noise in the image.
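For concreteness, the gap between two apertures (or shutter speeds, or ISOs) can be expressed in stops. A quick sketch of that arithmetic using the f1.5 vs f1.78 figures discussed in this thread; the ISO 500 starting point is just an illustrative assumption:

```python
import math

def stops_lost(f_old, f_new):
    """Light lost, in stops, when moving from f_old to a higher f-number f_new."""
    return 2 * math.log2(f_new / f_old)

loss = stops_lost(1.5, 1.78)
print(f"f/1.5 -> f/1.78 loses about {loss:.2f} stops of light")

# To keep the same exposure, that ~half stop has to come back from somewhere:
# a longer shutter speed (risking motion blur) or a higher ISO (more noise).
print(f"e.g. ISO 500 would need to become roughly ISO {500 * 2 ** loss:.0f}")
```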
 
  • Like
Reactions: scrtagntman

nemofish

macrumors regular
Mar 11, 2019
142
129
What? You can’t believe this forum? The iPhone default camera is a point-and-shoot camera! You can’t rely on it to give you proper skin tones and white balance and everything else you mentioned. So yes… if you don’t know how to adjust the camera for those issues you mentioned, it’s user error. Did you watch the Apple event for the iPhone 14 Pro? Remember how great the photos looked as they presented them? The images they showed from the iPhone 14 Pro were taken by seasoned professionals as they were pushing the camera to its limits. They weren’t taken by random people, or in point-and-shoot JPEG mode. They didn’t rely on the iPhone to give them a great shot; they had to create, adjust, and make the camera do what they wanted. You have to adjust these things for yourself. That’s what photography is all about: creating an image. Apple’s default camera is designed for the people who know “nothing” about photography… it’s designed to give the average person a good photo without making any adjustments, and the majority are happy with that. The camera is just a tool; it will do basic photography in the hands of those who don’t know how to control it. You have to learn how to “see” like the camera does. Example: point the camera at a black cloth and what does it do? It renders that cloth overexposed, because the exposure meter in the camera is calibrated to 18% gray and is a reflective meter instead of an incident meter, meaning it measures the light bouncing off an object, whereas an incident meter measures light falling on an object. You’ll have to adjust the exposure compensation to fix that. Everyone relies too heavily on the camera to do everything for them instead of creating a better photo. You have to work at it; it isn’t going to just magically give you a fabulous photo. And don’t say that it should because you paid $1200 or more for the iPhone. A $5000 Nikon will do the same; it will not produce a winning photo just because you expect it to. Like I said, the camera is a tool.

If I were to buy a full mechanic’s tool set to work on a car, and I had just enough mechanic knowledge to get by, and I took that tool set and turned some nuts and bolts and added this piece and tightened that piece but the vehicle didn’t get fixed, would I say, “Man, those tools suck! They can’t even fix this car and I paid a lot of money for them”? The same goes for photography. An expensive camera won’t put your images in National Geographic. I know that was an extreme example, but you get my point.

Yes, third-party apps give you way more control over the camera. The question is, are you willing to put forth the effort to get better images instead of settling for an image produced automatically? Yes, sometimes happy accidents happen and an award-winning photo is captured with no effort. But photography requires skill and patience, and most people don’t make the effort. Sorry, but those are just the facts.
Whilst all this is true, it’s missing the point, which is that Apple’s JPEG engine and their photo AI are atrocious.

Most people just want to open their camera app and press the shutter button to take a photo, and the phone is designed for that shooting style; hence it should be reviewed and compared to other phones such as Google's Pixels, which are light years ahead of iPhones in that regard, as they process photos much better and retain more detail.

Apple has shown with ProRAW that they have the sensor to keep up with Google, but for some reason they can't work out how to capture and retain some of that in standard shooting.
 
Last edited:
  • Like
Reactions: Wizec

kirk.vino

macrumors 6502a
Oct 27, 2017
667
1,013
No, you don’t. Well done for knowing the name of a big boy camera, but you’re talking here about the optical attributes of certain lenses you’ve used (probably the kit lens by the sound of things), not anything to do with letting in more light per se.

Yes, a wider aperture will narrow the depth of field - the amount of the image front to back in focus - but it doesn’t introduce or “create” blurriness of the subject itself, and isn’t inherently bad for moving subjects as you originally said - quite the opposite. Often, to ensure a moving subject is sharp and not subject to motion blur, you (or the camera’s automatic exposure system) must set a wider aperture in order to allow for a faster shutter speed.


Because it’s another element of exposure. You originally said that shutter speed has nothing to do with aperture, when in reality shutter speed, aperture and ISO are all inextricably linked, and you can’t change one without the others being affected (assuming you want to expose correctly). Your “slower” aperture may lead to both a slower shutter speed and a higher ISO, meaning more noise in the image.
These are all separate elements that are not related to each other and yet they control the picture settings/quality. The initial conversation was about the aperture settings. You’re the one who decided to throw ISO in the mix here to make it even more meandering. And, no, I’m not using a kit lens. I work with different types. Many lenses tend to act in a similar way: softer image at their widest aperture. Especially if it’s not a prime lens, i.e. a zoom lens.
Also, while we’re on the subject of ISO: the “noise” you’re talking about is mostly a thing of the past, as full-frame cameras have such high-quality sensors now that they don’t introduce any noise unless it’s at some crazy high levels. The aforementioned Sony A7III doesn’t exhibit any up to ISO 3200.
Also, your very last sentence is super confusing: why would you want a slower shutter speed and a higher ISO at the same time? The slower the shutter speed, the brighter the image is. You would want to lower your ISO unless you want to completely overexpose your image.
 
Last edited:

winterny

macrumors 6502
Jul 5, 2010
433
239
Aperture is based on the ratio of the size of the rear element of the lens to the size of the sensor... Aperture is not a fixed variable that can be used to compare all lenses to each other ignoring all other specs.

An f/1.8 lens on a smartphone will not be "faster" than an f/2.8 lens on a full-frame 35mm DSLR.

When you combine the much larger sensor of the 14 Pro vs the 13 Pro with a lens that overall lets more light in (even with a higher f-number), and a quad-Bayer sensor that allows you to bin 4 pixels together into one larger pixel, the overall effect is about 2x the light per pixel when in 12MP mode for the same shutter speed.

The ultra-wide and main camera packages are both clearly better for low light than what existed in the 13 Pro. The "3x" camera, unfortunately, is the same as the 13 Pro's, with apparent improvements to the signal processing which may or may not make much of a difference.

I do a lot of low-light video, and would expect that most likely the 0.5x, 1x and 2x modes will all be better than the equivalent on the 13 Pro, but I probably still would not use the 3x mode on the 14 Pro in low light situations.
 

spacemonkee

macrumors newbie
Sep 19, 2022
1
2
The video posted by @slplss is exactly what I meant by low light conditions. The segment at ~3:56, where the video creator shows his children's playhouse, shows the noise. In the video he talks about AI oversharpening and noise, and I think it is mostly the AI.

When comparing the test images of the iPhone 13 Pro and iPhone 14 Pro, the image file size is always ~50% smaller: iPhone 13 Pro (12MP, 26mm, f1.5, ISO 1000) vs. iPhone 14 Pro (12MP, 24mm, f1.78, ISO 1000), with an increase in the brightness of the light source in order to match the same ISO values. I can't explain where and why ~50% of the data is lost.

@ToddH sorry, I can't/don't want to upload the test pictures of my messy 🙈 room. Rather, I wanted to find out if others also recognized the noise in the pictures and it seems so.

Can someone confirm the smaller file size in low light?
I can confirm the smaller file size in low light. I also noticed that the ISO tends to be higher (often twice as high) compared to my iPhone 13 Pro Max for exactly the same shots.

So the iPhone 13 Pro Max would take the shot with ISO 500 while the iPhone 14 Pro uses ISO 1000. As expected the resulting image is worse.
 
  • Like
Reactions: Wizec and sylox

Dented

macrumors 65816
Oct 16, 2009
1,126
909
These are all separate elements that are not related to each other and yet they control the picture settings/quality. The initial conversation was about the aperture settings. You’re the one who decided to throw ISO in the mix here to make it even more meandering.
The three are interrelated, and it helps to understand each of them when talking about exposure. Changing one without understanding the effect that will have on another is likely to result in disappointment. In your original post you basically suggested using a slower (higher f-number) aperture to ensure less “blurriness” of a moving subject, when the combined effect of that aperture on both shutter speed and ISO is likely to guarantee the opposite.
And, no, I’m not using a kit lens. I work with different types. Many lenses tend to act in a similar way: softer image at their widest aperture. Especially if it’s not a prime lens, i.e. a zoom lens.
There’s no zoom lens in an iPhone though, just three primes in effect.
Also, while we’re on the subject of ISO: the “noise” you’re talking about is mostly a thing of the past, as full-frame cameras have such high-quality sensors now that they don’t introduce any noise unless it’s at some crazy high levels. The aforementioned Sony A7III doesn’t exhibit any up to ISO 3200.
Nobody but you is talking about an A7III here (not that 3200 exactly counts as a crazy high ISO these days - clearly noise still is a thing even on your Sony). On a much smaller iPhone sensor, noise is apparent at much lower levels and causes the phone to apply aggressive noise reduction and lose a lot of detail, so situations which ramp up the ISO (like blindly narrowing the aperture) are absolutely to be avoided.
Also, your very last sentence is super confusing: why would you want a slower shutter speed and a higher ISO at the same time? The slower the shutter speed, the brighter the image is. You would want to lower your ISO unless you want to completely overexpose your image.
It’s only confusing to you. You probably don’t want a slower shutter and higher ISO, but if as you suggest you go to a slower aperture and reduce the light coming through the lens, you (or the camera) may have no choice but to both slow the shutter and raise the ISO in order to capture an image. Again, this is an example of how shutter, aperture and ISO relate to each other in regard to exposure, something you seem unclear on. All of this depends on light levels, but again that’s not something you gave any consideration to in your original post.
 
Last edited:
  • Like
Reactions: scrtagntman

nemofish

macrumors regular
Mar 11, 2019
142
129
The video posted by @slplss is exactly what I meant by low light conditions. The segment at ~3:56, where the video creator shows his children's playhouse, shows the noise. In the video he talks about AI oversharpening and noise, and I think it is mostly the AI.

When comparing the test images of the iPhone 13 Pro and iPhone 14 Pro, the image file size is always ~50% smaller: iPhone 13 Pro (12MP, 26mm, f1.5, ISO 1000) vs. iPhone 14 Pro (12MP, 24mm, f1.78, ISO 1000), with an increase in the brightness of the light source in order to match the same ISO values. I can't explain where and why ~50% of the data is lost.

@ToddH sorry, I can't/don't want to upload the test pictures of my messy 🙈 room. Rather, I wanted to find out if others also recognized the noise in the pictures and it seems so.

Can someone confirm the smaller file size in low light?
This is the same when you compare a Google Pixel 6 Pro to the iPhone 14 Pro. The JPEGs lack so much detail on the iPhone, and the file size is 50% smaller. For some reason Apple are heavily compressing JPEGs and, as a result, dumping loads of detail.
 

Hunter5117

macrumors 6502a
Mar 17, 2010
569
401
Yes, the quality of the lens would, of course, affect that as well. However, having worked with different lenses, I can tell that most of them behave in a similar manner in that respect.
There are only a couple of things that can degrade sharpness when you open up the aperture on a lens. The most likely is increased aberration, when the lens is not corrected well enough to focus all the wavelengths at the same point. This is not so much because of "more light" but rather the path the light follows while passing through the lens. Aberration decreases as the iris is closed down, forcing the light to follow the most well-corrected path through the lens. Conversely, light diffracts more as you close down the aperture, and this also causes a lack of sharpness, as the light waves spread out more after passing through the iris and cause interference patterns. This is why most lenses have a "sweet spot" f-stop where sharpness is optimized. You mentioned that you are shooting with a Sony camera and possibly Sony lenses. That is top-quality equipment and should be well corrected for both problems, especially aberration.
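The diffraction half of that trade-off is easy to put numbers on, since the Airy disk grows roughly linearly with the f-number (about 2.44·λ·N to the first minimum). A small sketch, assuming green light at 550 nm:

```python
def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Approximate Airy disk diameter (to the first minimum) in micrometres."""
    return 2.44 * (wavelength_nm / 1000) * f_number  # nm -> um

for n in (2, 5.6, 8, 16):
    print(f"f/{n}: Airy disk ~{airy_disk_diameter_um(n):.1f} um")
```

Once that disk gets noticeably bigger than a pixel (a few microns on full frame, closer to 1-2 µm on a phone sensor), stopping down further costs resolution, which is why aberration and diffraction meet somewhere in the middle at the lens's "sweet spot".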
 

Hunter5117

macrumors 6502a
Mar 17, 2010
569
401
Aperture is based on the ratio of the size of the rear element of the lens to the size of the sensor... Aperture is not a fixed variable that can be used to compare all lenses to each other ignoring all other specs.
Sorry, but that is not correct. Aperture, or f-stop, is the ratio of the focal length to the diameter of the lens aperture. That's why fast (aka large aperture) long focal length lenses are so big and expensive. A 400mm f2.8 lens has a roughly 143mm maximum aperture, which makes for a really large and expensive lens. It can be loosely used to compare one lens to another with a similar maximum aperture, although T-stops (transmission stops) are more accurate because they take into account the light loss going through the lens.
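Put another way, the physical aperture diameter falls straight out of that ratio; a one-function sketch:

```python
def aperture_diameter_mm(focal_length_mm, f_number):
    """Entrance pupil diameter implied by a focal length and f-number."""
    return focal_length_mm / f_number

print(f"{aperture_diameter_mm(400, 2.8):.0f} mm")  # ~143 mm for a 400mm f2.8
print(f"{aperture_diameter_mm(50, 1.8):.0f} mm")   # ~28 mm for a 50mm f1.8
```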

FYI, in the midst of all this discussion: smartphones do not have variable apertures. If it is listed as an f1.8 lens, then it shoots everything at f1.8. I think maybe Samsung did have a variable-aperture smartphone, but I'm not sure. The only way smartphone cameras adjust exposure is through shutter speed and ISO. With the relatively tiny sensors and the appropriate lens for that sensor, what looks like a "fast" aperture is really not all that fast and does not have very shallow depth of field. That's why Apple has made such a big deal about their software that can simulate out-of-focus (OOF) effects aka bokeh, and their "cinematic" video mode that looks like push-and-pull focus but is really just a software trick.
 

winterny

macrumors 6502
Jul 5, 2010
433
239
Sorry, but that is not correct. Aperture, or f-stop, is the ratio of the focal length to the diameter of the lens aperture. That's why fast (aka large aperture) long focal length lenses are so big and expensive.
You are ignoring one variable here... When they say 26mm (for the 13 Pro) or 24mm (for the 14 Pro), the camera doesn't actually have a 26mm lens ... it has a 26mm "equivalent" lens, based on what 26mm would look like on a 35mm camera ... Bigger sensor, bigger lens, more light. The f-stop is not expressed as "equivalent", but rather based on the actual size of the lens.

Once you do the math to figure out the f-stop equivalent for a 35mm sensor, you find that the 14 Pro is equivalent to f/6.2 vs f/6.8 on the 13 Pro, which is almost 1/3 of a stop improvement.

You can read up on "Equivalence" here: https://www.dpreview.com/articles/2666934640/what-is-equivalence-and-why-should-i-care

There are of course other factors that impact the 'speed', including the sensitivity and noise levels of the sensor ... which, it would be fair to assume, in 12MP mode is *probably* improved over the 13 Pro Max's main camera, and certainly performs at least 'as well' as it.
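For anyone curious where figures like f/6.2 and f/6.8 come from, the usual approach is to scale the f-number by the diagonal crop factor. A sketch using the sensor dimensions quoted earlier in the thread (Apple doesn't publish exact dimensions, so treat the outputs as approximate; rounding is why they land at ~f/6.3 and ~f/6.9 rather than exactly the numbers above):

```python
import math

FULL_FRAME_DIAG_MM = math.hypot(36, 24)  # ~43.3 mm

def ff_equivalent_f_number(f_number, sensor_w_mm, sensor_h_mm):
    """Full-frame 'equivalent' f-number via the diagonal crop factor."""
    crop = FULL_FRAME_DIAG_MM / math.hypot(sensor_w_mm, sensor_h_mm)
    return f_number * crop

print(f"14 Pro main: f/{ff_equivalent_f_number(1.78, 9.8, 7.3):.1f}")  # ~f/6.3
print(f"13 Pro main: f/{ff_equivalent_f_number(1.5, 7.5, 5.7):.1f}")   # ~f/6.9
```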
 
Last edited:

Hunter5117

macrumors 6502a
Mar 17, 2010
569
401
You are ignoring one variable here... When they say 26mm (for the 13 Pro) or 24mm (for the 14 Pro), the camera doesn't actually have a 26mm lens ... it has a 26mm "equivalent" lens, based on what 26mm would look like on a 35mm camera ... Bigger sensor, bigger lens, more light. The f-stop is not expressed as "equivalent", but rather based on the actual size of the lens.
Agree with you in concept but I think your assumption is wrong.

Yes, Apple, along with most of the point-and-shoot camera community, states focal length in "35mm equivalent" numbers so that the average consumer can understand.

However, the aperture value is always the ratio of the focal length to the aperture diameter.

So, regardless of whether Apple is stating the max aperture as "equivalent" to f1.8 on a 24mm (in 35mm terms) lens, or if it is actually f1.8 on a 6.8mm lens (if I did my math correctly), it is still an f1.8 lens. The concept of equivalence does not come into play here. The relative size of the aperture obviously changes depending on the focal length of the lens, but the ratio stays constant. That has been a basic concept of photography for the past 100+ years.
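As a quick check on that 6.8mm figure: dividing the 24mm-equivalent focal length by the diagonal crop factor (using the 9.8x7.3mm sensor size quoted earlier in the thread, which is itself only approximate) lands in the same place:

```python
import math

crop = math.hypot(36, 24) / math.hypot(9.8, 7.3)  # full-frame diag / sensor diag, ~3.5
print(f"Actual focal length: {24 / crop:.1f} mm")  # ~6.8 mm for the 24mm-equivalent main camera
```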

Over the years I have shot with a number of different film and sensor formats: 35mm, 35mm half frame, 2 1/4" square, and others. All that time I have had to keep in my head the change in field of view that a particular focal length delivers on each format, based on the size of the film or sensor. However, never have I had to calculate "equivalent" aperture values; regardless of how they are stated, they are the same.

Regarding the DPReview article, their discussion of "equivalent" apertures is in regard to depth of field, or more correctly background blur. I remember learning this lesson very well many years ago in the DPReview forums. At that time the smaller digital sensors were relatively new, and I was not familiar with the fact that depth of field is not the same across all the formats. The article you referenced describes that fact: in order to get the same apparent background blur with different formats, even when using an "equivalent" focal length lens, you also need to use the appropriate "equivalent" aperture. However, this does not translate to the actual aperture of the lens; that, again, is constant, based on the ratio of focal length to aperture size.
 