
Thread: Diffraction and Depth of Field

  1. #1
    Still Developing
    Join Date
    Jul 2007
    Location
    Leeds, UK
    Posts
    582

    Diffraction and Depth of Field

    I'm trying to come up with an equation to graph defocus against distance for a typical lens given aperture, focal length, etc.

    The thing that is giving me problems is working out how to combine the effects of diffraction and depth of field. Obviously diffraction doesn't just 'stop' once you get to a certain defocus based on depth of field and so you have to combine the two.

Now, using MTFs is probably the best way of combining the two, but I don't want to jump in that deep immediately.

    My current method of combining the two is 1/(1/R1 + 1/R2) using the circles of confusion, but some of the articles I've read suggest a root-mean-square approach.

    Any ideas which may be right?

    Tim
    Still Developing at http://www.timparkin.co.uk and scanning at http://cheapdrumscanning.com
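[Editorial sketch: the two candidate rules Tim mentions can be compared numerically. This is an illustration only; the function names and the 60/80 lp/mm test values are assumptions, not figures from the thread.]

```python
import math

def combine_resolutions(r1, r2):
    """The linear rule on resolutions (lp/mm): 1/R = 1/r1 + 1/r2."""
    return 1.0 / (1.0 / r1 + 1.0 / r2)

def combine_blur_rss(c1, c2):
    """Root-sum-square on blur-spot diameters: c = sqrt(c1^2 + c2^2)."""
    return math.sqrt(c1 ** 2 + c2 ** 2)

# Suppose defocus alone resolves ~60 lp/mm and diffraction alone ~80 lp/mm.
print(round(combine_resolutions(60.0, 80.0), 1))      # ~34.3 lp/mm

# The same pair expressed as blur diameters in mm (roughly 1/r):
print(round(combine_blur_rss(1 / 60.0, 1 / 80.0), 4))  # ~0.0208 mm, i.e. ~48 lp/mm
```

Note that the two rules genuinely disagree (about 34 lp/mm versus about 48 lp/mm here), which is exactly the discrepancy the thread goes on to discuss.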

  2. #2

    Join Date
    Aug 2007
    Location
    Indianapolis, Ind.
    Posts
    590

    Re: Diffraction and Depth of Field

    You may want to read through the following thread:

    http://www.largeformatphotography.in...ad.php?t=36745

  3. #3
    Still Developing
    Join Date
    Jul 2007
    Location
    Leeds, UK
    Posts
    582

    Re: Diffraction and Depth of Field

    Quote Originally Posted by aduncanson View Post
    You may want to read through the following thread:

    http://www.largeformatphotography.in...ad.php?t=36745
    Done that. The question is whether it applies to diffraction. The usual MTFs deal with simple linear degradation of the image. Diffraction appears to behave differently, and I wanted to check what people think, as various sources use either the root mean square of the disc diameters or the combined MTFs, both of which give different figures. I don't know what a diffraction MTF would look like either.

    The RMS calculation gives the following graph, where the red line is no diffraction and the blue line includes diffraction:



    Tim

  4. #4

    Join Date
    Apr 2004
    Location
    SF Bay Area, California, USA
    Posts
    331

    Re: Diffraction and Depth of Field

    You might take a look at Depth of Field in Depth, under Diffraction. The MTFs don’t include the effects of aberrations, so don’t take the results at large apertures too seriously.

  5. #5

    Join Date
    Aug 2007
    Location
    Indianapolis, Ind.
    Posts
    590

    Re: Diffraction and Depth of Field

    Am I misreading your chart, or do I see diffraction actually improving resolution in certain ranges? That would seem to suggest that you have a sign error somewhere.

    I think that Struan was suggesting that because the Airy disk is steeper than a Gaussian, 1/r corresponds better to measured results than 1/r^2. The CoC due to defocus is somewhat complex, but would certainly be non-Gaussian. I think that is further reason to use 1/r.

    Best of luck, I am interested in what you come up with - Alan

  6. #6

    Join Date
    Apr 2004
    Location
    SF Bay Area, California, USA
    Posts
    331

    Re: Diffraction and Depth of Field

    Recognize that both linear and root-square (no mean is involved) combinations are rules of thumb that have little theoretical basis, though they’ve been around for a long time—H. Lou Gibson of Kodak discussed this in the 1950s. The calculated MTF for combined defocus and diffraction was developed by H.H. Hopkins in 1955, and as far as I know, it remains the accepted approach. As I mentioned, unless a calculation is made for a specific lens with known aberrations, aberrations are necessarily ignored, so the results at large apertures usually aren’t meaningful. If we’re looking to maximize DoF, though, we’re usually interested in small apertures, so this isn’t a problem.

    Certainly, diffraction cannot improve overall sharpness. But it’s not realistic to compare calculations for pure defocus with calculations for combined defocus and diffraction—you can have diffraction without defocus, but you cannot have defocus without diffraction. My Figures 7–10 show pure defocus, and seem to suggest that in some cases the combined MTF is better, but the pure defocus is an impossible condition—I’ve shown it only because everyone else does. A more realistic take is to note that the combined curves (in black) always have lower MTFs than those for pure diffraction (in green)—in other words, defocus always decreases sharpness. Presumably, this is not a surprise to anyone here.

    Whatever the approach, using root-square combinations of defocus and diffraction blur spots (as Gibson and Hansma did), or using calculated MTFs for combined defocus and diffraction (as I have done), there is usually an optimal aperture. Hansma and I assumed that in most cases, the focus spread is fixed—the camera position is chosen for the best composition, the lens is selected to provide the desired framing, and the focus spread is then determined by the required DoF. This leaves only one control—the aperture. In the plane of focus, there is a tradeoff between aberrations and diffraction—once the lens is “diffraction limited” at moderate to middle apertures, additional stopping down softens the image because of increased diffraction. At the DoF limits, defocus additionally softens the image, and the tradeoff is usually between defocus and diffraction. Initially, decreasing the aperture decreases defocus, and increases overall sharpness. But at some point, the softening from diffraction exceeds the gains from decreasing defocus, and stopping down further decreases sharpness even at the DoF limits. If you look at my Figures 11, 12, 15, and 16, diffraction places an upper bound on sharpness. And this is for an ideal lens with no aberrations—with any real lens, the sharpness will be less than that shown.

    In summary: again, for fixed focus spread, up to a point, stopping down improves sharpness at the DoF limits by decreasing defocus blur. But at some point, the losses from diffraction exceed the gains from decreased defocus, so that further stopping down decreases sharpness. The effect is illustrated in my Figure 13—there is an optimal aperture for every focus spread, and as focus spread increases, the resolving power at the optimal aperture decreases. The results aren’t radically different from Hansma’s using root-square combinations of defocus and diffraction. Obviously, anything that can be done, especially use of tilt or swing, to reduce the focus spread will reduce the required f-number and increase the maximum possible sharpness.
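[Editorial sketch: the optimal-aperture tradeoff Jeff describes can be illustrated with a root-square combination in the spirit of Hansma's approach. The 555 nm wavelength, the defocus-blur approximation (focus spread divided by 2N), and all names here are my assumptions, not figures from Jeff's article or MTF calculation.]

```python
import math

WAVELENGTH_MM = 0.000555  # green light, 555 nm (an assumed value)

def blur_at_dof_limit(focus_spread_mm, f_number):
    """Root-square combination of defocus and diffraction blur spots.

    Defocus blur at the DoF limits ~ focus_spread / (2 N);
    diffraction (Airy) diameter ~ 2.44 * lambda * N.
    """
    defocus = focus_spread_mm / (2.0 * f_number)
    diffraction = 2.44 * WAVELENGTH_MM * f_number
    return math.sqrt(defocus ** 2 + diffraction ** 2)

def optimal_f_number(focus_spread_mm):
    """Minimize the combined blur: N* = sqrt(spread / (2 * 2.44 * lambda))."""
    return math.sqrt(focus_spread_mm / (2.0 * 2.44 * WAVELENGTH_MM))

# A 10 mm focus spread on the rail suggests roughly f/61; stopping down
# further (or less) than this only increases the blur at the DoF limits.
print(round(optimal_f_number(10.0), 1))
```

The optimum falls where the two blur contributions are equal, which is the usual outcome of minimizing a root-square sum of one term falling with N and one rising with N.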

  7. #7

    Join Date
    Jul 1998
    Location
    Lund, Sweden
    Posts
    2,214

    Re: Diffraction and Depth of Field

    Hi all :-)

    The fundamental reason why you inevitably ratchet downwards in resolution is that the blurs introduced by aberrations and defocus and by diffraction are uncorrelated. The two physical processes spreading the light out are independent and do not influence each other. One just blurs the blurred result of the other, and you always get extra blur, even if only by a little bit.

    You can imagine taking some ideal Platonic capture of the image, blurring it a bit to represent aberrations, and then blurring it some more for diffraction. Mathematically the blurring is done with a convolution: imagine taking some of the light in each pixel and spreading it out into neighbouring pixels. Do the same for all the pixels in the image and each new pixel becomes a weighted sum of itself and its surroundings (there are, of course, analytical approaches which handle the non-digitised continuous analogue case).

    The finicky details are in the weightings of the surrounding pixels, or, equivalently, in the kernel of the blurring function. Very, very often, it is assumed that the kernel is a Gaussian bell curve shape. That is because a lot of physical processes do indeed produce a Gaussian shape, but also because it's a good approximation to many other shapes, and because a wondrous piece of maths called the Central Limit Theorem means that combining repeated measurements tends to make the overall shape of the kernel converge to a Gaussian.

    There is also laziness and convenience: a Gaussian can be handled analytically, since you can prove all sorts of useful general theorems about how convolutions of Gaussians give new Gaussians with widths which are simply related to the ones you started with. That is where the 1/R1^2 + 1/R2^2 rule comes in. In real life things are not that simple: for example, the commonest lineshape for atomic spectra is a Lorentzian, and that doesn't even have a defined variance. You can't come up with a definition of 'width', and you have no choice but to do the convolution explicitly (or cheat, and use a Gaussian).

    Note that in real life, none of the functions affecting blur in photographs is a Gaussian. Aberrations produce the complex functions seen in spot diagrams, pure defocus is a simple geometric shape, and diffraction gives an Airy pattern (for a circular aperture). There is no reason whatsoever to assume that the combination of blurs should follow a 1/R^2 rule.

    MTFs come in because they are one part of the Fourier Transform of the error kernel. Another useful theorem says that instead of convolving two functions (which is time consuming, even for a computer) you can instead just multiply their Fourier Transforms together. Thus combining errors, or adding the effects of multiple blurring mechanisms, becomes a simple matter of multiplying the MTFs. The only issue is that you need to keep track of phase, and MTFs only handle magnitude - in 'real' calculations you use the full Optical Transfer Function, which includes phase.
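[Editorial sketch: Struan's two points (Gaussian widths combining in quadrature under convolution, and convolution being multiplication in the Fourier domain) can be checked numerically. A rough NumPy illustration; the grid size and sigmas are arbitrary choices of mine.]

```python
import numpy as np

x = np.linspace(-50, 50, 2001)
dx = x[1] - x[0]

def gaussian(sigma):
    """Unit-area Gaussian blur kernel on the grid x."""
    g = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return g / (g.sum() * dx)

# Blur once with sigma 3, then again with sigma 4: a direct convolution.
combined = np.convolve(gaussian(3.0), gaussian(4.0), mode="same") * dx

# The width of the result follows the quadrature rule: sqrt(3^2 + 4^2) = 5.
width = np.sqrt((combined * x ** 2).sum() * dx)
print(round(width, 2))  # ~5.0

# The same combination in the Fourier domain: multiply the transforms.
spectrum = (np.fft.fft(np.fft.ifftshift(gaussian(3.0)))
            * np.fft.fft(np.fft.ifftshift(gaussian(4.0))))
via_fft = np.fft.fftshift(np.real(np.fft.ifft(spectrum))) * dx
print(np.allclose(via_fft, combined, atol=1e-8))  # True
```

The quadrature rule recovered here is a special property of Gaussians; for the Airy pattern or a defocus disc it does not hold, which is Struan's point.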

  8. #8
    Still Developing
    Join Date
    Jul 2007
    Location
    Leeds, UK
    Posts
    582

    Re: Diffraction and Depth of Field

    Quote Originally Posted by Jeff Conrad View Post
    You might take a look at Depth of Field in Depth, under Diffraction. The MTFs don’t include the effects of aberrations, so don’t take the results at large apertures too seriously.
    I shall go away and read - I may be some time.. :-)
    Still Developing at http://www.timparkin.co.uk and scanning at http://cheapdrumscanning.com

  9. #9
    Still Developing
    Join Date
    Jul 2007
    Location
    Leeds, UK
    Posts
    582

    Re: Diffraction and Depth of Field

    Quote Originally Posted by Struan Gray View Post
    The fundamental reason why you inevitably ratchet downwards in resolution is that the blurs introduced by aberrations and defocus and by diffraction are uncorrelated. ...
    Hi Struan,

    in the case of diffraction and defocus, would you expect the end result to always be worse than the worse of the two? My RMS calculation gives a better result than straight defocus for areas far away from the focus point. I'm presuming this is why 1/R might be better? (Actually, is my mistake using RMS, i.e. d = sqrt(a^2/2 + b^2/2), when it should be the root sum of squares, d = sqrt(a^2 + b^2)?)

    Tim "just trying to come up with an approximation but would love to know the facts" Parkin

  10. #10

    Join Date
    Jul 1998
    Location
    Lund, Sweden
    Posts
    2,214

    Re: Diffraction and Depth of Field

    Quote Originally Posted by timparkin View Post
    in the case of diffraction and defocus, would you expect the end result to always be worse than the worse of the two?
    Yes.

    If a is the width of the blur introduced by the film, and b is the width of the blur for diffraction, then the combined width d is given crudely by the root-square rule:

    d = sqrt(a^2 + b^2)

    Empirical testing has shown that the linear rule is closer to the truth:

    d = a + b

    Both give an answer which is larger than a or b individually.

    If a' and b' are 'resolutions' in lp/mm or similar units, you have already taken the reciprocal, and the formulae become:

    d' = a'b' / sqrt(a'^2 + b'^2)

    d' = a'b' / (a' + b')   (i.e. 1/d' = 1/a' + 1/b')

    Again, the latter has been found to give a better fit to optics shining light onto film.


    Note that digital and analogue light capture can both lead to an MTF from the recording medium which is higher than 1, i.e. contrast is increased at some spatial frequencies. The integral of the MTFs over the whole passband is limited (otherwise energy would not be conserved during capture), but some parts can be higher than unity if others are less to compensate. Film does this through adjacency effects in development, digital usually with aliasing.

    But. When you combine such an MTF with the effects of diffraction, it always reduces. Not necessarily to less than unity, but certainly to less than the value without diffraction.

    Note also that many published MTFs are downright vague about normalisation, even if they got it right in testing. I wouldn't get too hung up about the *value* of the MTF, more with how it varies across the passband, and spatially across the image frame.
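[Editorial sketch: the diffraction MTF doing the reducing here has a standard closed form for an ideal lens with a circular aperture. A short illustration; the 555 nm wavelength and the above-unity film MTF value of 1.1 are assumed values for demonstration, not measurements.]

```python
import math

def diffraction_mtf(freq_lpmm, f_number, wavelength_mm=0.000555):
    """Standard diffraction-limited MTF for a circular aperture.

    freq_lpmm is spatial frequency in cycles/mm; the cutoff frequency
    is 1 / (wavelength * N), beyond which the MTF is zero.
    """
    cutoff = 1.0 / (wavelength_mm * f_number)
    if freq_lpmm >= cutoff:
        return 0.0
    v = freq_lpmm / cutoff
    return (2.0 / math.pi) * (math.acos(v) - v * math.sqrt(1.0 - v * v))

# At f/22 the cutoff is ~82 cycles/mm; at 40 cycles/mm the lens alone gives:
lens = diffraction_mtf(40.0, 22.0)
print(round(lens, 2))  # ~0.4

# A film MTF boosted above unity by adjacency effects (assumed value)
# still comes down once the diffraction MTF multiplies it:
film = 1.1
print(round(film * lens, 2))  # always less than the film MTF alone
```

Since this MTF never exceeds 1, multiplying any recording-medium MTF by it can only reduce the system response, which is the point made above.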
