
Phrasikleia

macrumors 601
Original poster
Feb 24, 2008
4,082
403
Over there------->
I frequently read this sort of thing in lens reviews:

"Diffraction limiting begins to set in at ƒ/8, though it doesn't impact on sharpness until ƒ/16; even at ƒ/16, we're seeing only 2 blur units across the frame. Using the ƒ/22 aperture setting produces a softer image, at 3 units across the frame."

That's a review of a Canon superzoom lens on a 20D.

But my understanding is that where diffraction sets in is entirely a matter of which camera body you're using. It's all about pixel size and aperture, regardless of the lens, as explained here. On that page, they state, "the Canon EOS 20D begins to show diffraction at around f/11." How do I reconcile this statement with the lens review above?

Now I understand that some lenses just aren't that sharp and will show their softness before the camera itself is experiencing diffraction, but that situation doesn't seem to be what the review above is describing. Or is it? Here is an excerpt from another review:

"As you stop down, diffraction limiting seems to begin setting in somewhere between f/8 and f/11, depending on focal length and the size of the pixels on your DSLR body."

Depending on focal length?? Are they saying that (in this case) the Canon 70-200mm f/2.8L is not sharp enough to out-resolve a 20D at f/8-f/11 at some focal lengths?

I'm confused.
 

dmz

macrumors regular
Jan 29, 2007
139
0
Canada
More confusing...

Diffraction blurring is primarily about focal length and aperture; the camera body and pixel array have very little to do with it. f/22 on a 300mm lens is a much larger opening than f/22 on a 50mm lens, but it is much further from the film plane, so the effect on diffraction blurring somewhat cancels out. Diffraction blurring only depends on the f-stop - or more accurately (I think) the numerical aperture, which is the inverse of twice the f-stop, i.e. f/22 = 1/44, f/2 = 1/4. At some point on any lens, stopping down to increase depth-of-field begins to increase the amount of diffraction blurring - in the first example you gave, the 300mm lens gets blurrier past f/8.
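
A quick numeric sketch of that claim (just arithmetic on the figures above, written out in Python):

# Aperture diameter = focal length / f-stop: the same f/22 is a very
# different physical opening at different focal lengths.
for focal_mm in (50, 300):
    print(f"{focal_mm}mm at f/22 -> opening of about {focal_mm / 22:.1f} mm")

# Numerical aperture as defined above: the inverse of twice the f-stop.
for f_stop in (2, 22):
    print(f"f/{f_stop}: NA = 1/{2 * f_stop} = {1 / (2 * f_stop):.4f}")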

The pixel sensor does have something to do with it, as the size of the array helps determine the effective focal length, and the individual "pixels" must be small enough to resolve the smallest "spot" (technically, the Airy disk) before they start showing blurring that comes from the lens rather than from the pixel array.

All lenses show this error; even a pinhole camera has a resolving limit based on this criterion. Better lenses do not escape this phenomenon, and all lenses are about equal in this regard.

Hoping I haven't confused you further...

dmz
 

anubis

macrumors 6502a
Feb 7, 2003
937
50
This is about to be your authoritative guide on diffraction.

The diffraction pattern of a circular aperture is a sinc-squared function. The central part of the sinc-squared function is the "airy disk", which contains almost all of the light (there's harmonic ringing out to infinity, but it doesn't really matter here). The size of the airy disk is what you would call the size of the "blurring" of a single point of light, and so we compare the size of the airy disk to the size of a pixel on the detector to decide if we're "diffraction limited". An airy disk that is much larger than the size of a pixel means your image will be blurry.

Diameter of airy disk = 2.44 * wavelength * f/#. Call the wavelength constant, and you'll see the only parameter affecting the airy disk size is f/#.

Cropped sensors typically have smaller pixels and so would hit the diffraction limit sooner than a full frame sensor.
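
To make that concrete, here is a minimal sketch of the formula above, assuming green light at 550 nm (the wavelength is an assumption, not something from this post):

# Airy disk diameter = 2.44 * wavelength * f/#; only the f/# varies here.
WAVELENGTH_UM = 0.55  # 550 nm green light, in micrometres

for f_number in (2.8, 5.6, 8, 11, 16, 22):
    airy_um = 2.44 * WAVELENGTH_UM * f_number
    print(f"f/{f_number:<4} -> Airy disk diameter of about {airy_um:.1f} um")

At f/11 the disk is already about 15 um across - more than twice the roughly 6.4 um pixel pitch of a 20D-class cropped sensor, which is why such a body starts to look diffraction limited around there.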
 

anubis

macrumors 6502a
Feb 7, 2003
937
50
But my understanding is that where diffraction sets in is entirely a matter of which camera body you're using. It's all about pixel size and aperture, regardless of the lens

The first sentence is incorrect. "Where diffraction sets in" is a function of both the pixel size and the F/#, regardless of lens. Remember that F/# is a property of the LENS, not the camera body. What you're dialing in is the size of the aperture, which is a part of the LENS, not a part of the camera body.

"As you stop down, diffraction limiting seems to begin setting in somewhere between f/8 and f/11, depending on focal length and the size of the pixels on your DSLR body."

The diffraction limit only depends on focal length inasmuch as f/# = focal length divided by the entrance pupil diameter.
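
A one-liner makes that dependency explicit (the 200mm/25mm numbers are hypothetical, purely for illustration):

# f/# = focal length / entrance pupil diameter
focal_mm, pupil_mm = 200, 25
print(f"{focal_mm}mm lens with a {pupil_mm}mm entrance pupil -> f/{focal_mm / pupil_mm:.0f}")  # f/8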
 

H2Ockey

macrumors regular
Aug 25, 2008
216
0
This is about to be your authoritative guide on diffraction.

The diffraction pattern of a circular aperture is a sinc-squared function. The central part of the sinc-squared function is the "airy disk", which contains almost all of the light (there's harmonic ringing out to infinity, but it doesn't really matter here). The size of the airy disk is what you would call the size of the "blurring" of a single point of light, and so we compare the size of the airy disk to the size of a pixel on the detector to decide if we're "diffraction limited". An airy disk that is much larger than the size of a pixel means your image will be blurry.

Diameter of airy disk = 2.44 * wavelength * f/#. Call the wavelength constant, and you'll see the only parameter affecting the airy disk size is f/#.

Cropped sensors typically have smaller pixels and so would hit the diffraction limit sooner than a full frame sensor.

Thank you. This is perhaps the best post I've ever read. For whatever reason, I had only ever partially understood the concept. I wanted to comment on the earlier posts, as I knew what was wrong, but I couldn't even begin to say what was correct, let alone back up what I would have stated. This post really clears up my fuzzy understanding.
 

Phrasikleia

macrumors 601
Original poster
Feb 24, 2008
4,082
403
Over there------->
The first sentence is incorrect. "Where diffraction sets in" is a function of both the pixel size and the F/#, regardless of lens. Remember that F/# is a property of the LENS, not the camera body. What you're dialing in is the size of the aperture, which is a part of the LENS, not a part of the camera body.

Yes, yes...perhaps I should rephrase. Where diffraction sets in is determined by the aperture setting of any given lens...plus the properties of the sensor that determine the pixel size (which in turn establishes the level of correspondence with the Airy disk).

I wasn't suggesting that aperture and lens are independent of each other, but that any given aperture will have the same theoretical result regardless of the particular lens.

So is that correct?

Thanks to Anubis for the pithy summary, which directly contradicts dmz's assertion that the "camera body and pixel array" are essentially irrelevant.

What I'm trying to determine is whether or not one lens at f/11 would experience diffraction sooner than any other lens at f/11 on the same camera body.
 

dmz

macrumors regular
Jan 29, 2007
139
0
Canada
Don't flame me bro!

Sorry if you misunderstood my post, but Anubis agrees with me - the camera body is irrelevant, and as for the sensor:

The pixel sensor does have something to do with it, as the size of the array helps determine the effective focal length, and the individual "pixels" must be small enough to resolve the smallest "spot" (technically, the Airy disk) before they start showing blurring that comes from the lens rather than from the pixel array.

Obviously, as with film grain, if the sensor is smaller than the feature being recorded, it is not only irrelevant, but invisible. Diffraction blurring is the phenomenon we are dealing with, and that's what I am talking about here. Diffraction limiting is the inability of the medium to resolve the blurring, whether it's film or pixels.

I'm just trying to help!

And yes, to answer your question more directly, no two lenses of the same focal length but of different makes will exhibit the same diffraction blurring, again because no two lenses like that would have the exact same pupil/focal length ratio.

dmz
 

Phrasikleia

macrumors 601
Original poster
Feb 24, 2008
4,082
403
Over there------->
Sorry if you misunderstood my post, but Anubis agrees with me - the camera body is irrelevant, and as for the sensor:

By saying "camera body" I suppose I was being too vague, but I was of course referring to the sensor, as the two are hardly separable.

Obviously, as with film grain, if the sensor is smaller than the feature being recorded, it is not only irrelevant, but invisible. Diffraction blurring is the phenomenon we are dealing with, and that's what I am talking about here. Diffraction limiting is the inability of the medium to resolve the blurring, whether it's film or pixels.

I'm just trying to help!

Yes, and I appreciate it. Was your title "Don't flame me bro" directed at me? I had no intention of flaming you, and I'm not a bro (I'm female).

And yes, to answer your question more directly, no two lenses of the same focal length but of different makes will exhibit the same diffraction blurring, again because no two lenses like that would have the exact same pupil/focal length ratio.

dmz

Perfect. That makes it very clear and is very helpful. Many thanks!
 

ChrisA

macrumors G5
Jan 5, 2006
12,834
2,040
Redondo Beach, California
Diameter of airy disk = 2.44 * wavelength * f/#. Call the wavelength constant, and you'll see the only parameter affecting the airy disk size is f/#.

This is correct as far as it goes. It describes the image that is projected onto the sensor. But the next step is sampling the image, and that's not so simple.

First off, the concept of the "Airy disk" really only applies to images of points of light, like stars. A lens will make a point of light look like a disk, but real images can be thought of as having an infinite number of points (yes, truly "infinite", not just "a whole lot"). So the image is an infinite number of overlapping disks. Now you ask, "If the disks overlap, why can't we see detail smaller than the diameter of a disk?"

A better way to think about this is that the lens has what's called a PSF, or "point spread function", which maps a geometric point on the subject to a kind of "blob" on the image plane. So far, not much different from the Airy disk idea. But the way to understand the image is that it is a two-dimensional convolution of the PSF with the subject. Convolution lets you get out of the problem of dealing with an infinite number of disks. Then, a couple of centuries ago, some smart person figured out that convolution in image space is the same as multiplication in frequency space. In other words, the Fourier transform of the PSF times the FT of the subject equals the FT of the image on the sensor.

Multiplication is easy to understand: look at the frequencies in the FT of the PSF. The product can't contain any frequencies beyond the PSF's cutoff.

I think the above is a rough outline of a proof that the maximum resolvable spatial frequency is related to the f/#.
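
That convolution-equals-multiplication step is easy to verify numerically. A minimal 1-D sketch with NumPy, using a made-up Gaussian PSF in place of a real one:

import numpy as np

rng = np.random.default_rng(0)
scene = rng.random(64)                      # a row of "subject" intensities
psf = np.exp(-np.linspace(-4, 4, 15) ** 2)  # toy Gaussian point spread function
psf /= psf.sum()

# Blur the scene directly in image space...
direct = np.convolve(scene, psf)
# ...and via the convolution theorem: FT, multiply, inverse FT
# (zero-padded so circular convolution matches linear convolution).
n = len(scene) + len(psf) - 1
via_fft = np.real(np.fft.ifft(np.fft.fft(scene, n) * np.fft.fft(psf, n)))

print(np.allclose(direct, via_fft))  # True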
 

Phrasikleia

macrumors 601
Original poster
Feb 24, 2008
4,082
403
Over there------->
I think the above is a rough outline of a proof that the maximum resolvable spatial frequency is related to the f/#.

Right, everyone is in agreement that the f/# matters in the equation. Rereading dmz's last post, though, I'm not sure where the pupil-to-focal-length ratio fits in. Assuming that two lenses are equally sharp, the f/# should be all that matters; otherwise we need to plug in some other variable. Or is the lens quality (aside from f/#) included in the "wavelength" portion of the equation?

This calculator may be one source of my confusion, since it has no variable for a specific lens, only an f/# for any random lens:

[attached screenshot: DiffLimitCalc.png]
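
For what it's worth, the logic behind such a calculator is easy to reproduce. A toy version, assuming the common rule of thumb that diffraction becomes visible once the Airy disk spans roughly two pixels (the 550 nm wavelength and the ~6.4 um pixel pitch for the 20D are my approximations, not values from the site):

def diffraction_limited(f_number, pixel_pitch_um, wavelength_um=0.55):
    # Visible once the Airy disk (2.44 * wavelength * f/#) spans ~2 pixels.
    airy_um = 2.44 * wavelength_um * f_number
    return airy_um > 2 * pixel_pitch_um

for f_number in (5.6, 8, 11, 16, 22):
    verdict = "diffraction limited" if diffraction_limited(f_number, 6.4) else "ok"
    print(f"f/{f_number}: {verdict}")

With those numbers the crossover falls between f/8 and f/11, matching the site's claim that the 20D begins to show diffraction at around f/11 - and note there is still no variable for the lens, only the f/#.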
 

MisterMe

macrumors G4
Jul 17, 2002
10,709
69
USA
This is about to be your authoritative guide on diffraction.

The diffraction pattern of a circular aperture is a sinc-squared function. ...
No. The diffraction pattern due to a circular aperture is governed by the square of a Bessel function. The diffraction pattern for a long straight slit is determined by the square of the sinc function.
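
Incidentally, that Bessel function is where the 2.44 in anubis's earlier formula comes from. A minimal sketch with SciPy: the first zero of J1 sets the radius of the Airy disk:

import numpy as np
from scipy.special import j1       # Bessel function of the first kind, order 1
from scipy.optimize import brentq

# The circular-aperture pattern is [2*J1(x)/x]^2; its first zero sits near x = 3.83.
x1 = brentq(j1, 3.0, 4.5)
print(x1 / np.pi)      # ~1.22 -> angular radius = 1.22 * wavelength / aperture diameter
print(2 * x1 / np.pi)  # ~2.44 -> the 2.44 in "2.44 * wavelength * f/#"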
 

dmz

macrumors regular
Jan 29, 2007
139
0
Canada
Don't flame me Sis!

Sorry, I shouldn't assume! And, as you can see, this gets very technical, and it's complicated by the fact that we're talking about image sensors, not film. It's also complicated by the terminology, which overlaps somewhat - Circles of Confusion (where we are all walking in this thread), diffraction limiting, point spread functions (isn't that how you bet on sports?), and the good ol' Airy disk are all related; they describe what is essentially the same phenomenon and help us determine two things: a) how sharp can an image be, and b) when does an image begin to appear out of focus? And this little corner of optics has wider application than one might imagine at first blush - everything from eyeglasses to cinema screens uses these criteria to determine applicability in different areas.

As to your other question - there will be slight differences in the construction of lenses, i.e. the number and type of elements, even of the same apparent focal length and at the same aperture, which cause the diffraction blurring to vary from lens to lens. Not much, but trade-offs are consciously made by the lens-maker when she is formulating her lens.

Happy lensing!

dmz
 

wheelhot

macrumors 68020
Nov 23, 2007
2,084
269
Interesting choice; sadly it's not as good as the Nikon equivalent. But I gotta admit, if it's Canon, I will only look at their primes, the EF-S 17-55 f/2.8, some EF-S wide-angle zoom, and the L lenses. Other than that, I will just ignore it and look for alternatives. I'm sure to get a lot of flaming for saying this, but it's just my opinion.
 