I'm throwing this up as a new thread, even though it arises out of a concurrent thread on Quickloads. (I don't want to disrupt the flow of deep philosophy dominating that series of exchanges...)

On a side discussion of edge diffraction, Jorge noted, in part, "But.....when I was working with 4x5 I had some instances where difraction ruined my shots. For you and your printing method it might not be a consideration, but for others it might be. IMO it is good to be aware of the possibilities."

I'm familiar with edge diffraction in theory, and am trying to get a feel for what practical impact it should actually have. Mind you, I'm just doing the math off the top of my head, and I'm an ignorant fool besides... But assume an extremely small aperture of 1mm diameter (that's roughly f/64 on a 65mm lens, or f/90 on a 90mm lens) and a wavelength of light around 500 nanometers (visible light is 400-700 nm). The band along the circumference that would affect light (pi x 0.0005, where pi mm is the circumference of a 1mm circle and 0.0005 is the mm value of 500 nm) has an area of about 0.00157 square mm, out of a total aperture area of 0.785 square mm. That comes out to 1/500 of the passing light possibly being affected by edge diffraction at the aperture.
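For anyone who wants to check my arithmetic, here's the same back-of-envelope calculation as a short Python sketch. The assumptions are just the ones above (1mm aperture, 500 nm light, and the "affected" zone taken as a band one wavelength wide along the aperture edge), so take it as a sketch of my reasoning, not optics gospel:

```python
import math

# Assumptions from the text (not physics gospel):
diameter_mm = 1.0              # aperture diameter: 1 mm
wavelength_mm = 500e-9 * 1000  # 500 nm expressed in mm = 0.0005 mm

# Band one wavelength wide along the aperture edge.
circumference = math.pi * diameter_mm             # ~3.14 mm
edge_band_area = circumference * wavelength_mm    # ~0.00157 sq mm

# Total open aperture area.
aperture_area = math.pi * (diameter_mm / 2) ** 2  # ~0.785 sq mm

fraction = edge_band_area / aperture_area
print(f"edge band area:   {edge_band_area:.5f} sq mm")
print(f"aperture area:    {aperture_area:.3f} sq mm")
print(f"fraction affected: {fraction:.4f} (about 1/{1/fraction:.0f})")
```

Note the pi's cancel, so the fraction reduces to 4 x wavelength / diameter = 4 x 0.0005 / 1 = 0.002, i.e. 1/500, regardless of how you slice the areas.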

I don't think 1/500 of the light being affected by diffraction would ruin an exposure. Anyone a bit more familiar with optics/physics/math care to clue me in on where my logic is failing?

BTW, I think my high school students changed my posting name to "William Mortensen", and I've been trying to change it back. We'll see how it comes out this time...