
Optics: resolution versus contrast?



Ken Lee
13-May-2014, 06:44
I have an amateur physics question about scanner resolution, and perhaps about all lenses for that matter.

It seems that there's a reciprocal relationship between resolution and contrast. Is this due to the wave nature of light, or to the uncertainty principle?

With regard to making actual prints, what does it mean when a given capture device gives us high resolution but low contrast? Can judicious use of sharpening tools help?

paulr
13-May-2014, 07:37
I think the reciprocal relationship is more anecdotal than anything else. It just doesn't make much sense.

Resolution and contrast are always related, because when we talk about lens resolution in the old-fashioned sense, we mean extinction resolution, which is the point at which the contrast drops so low we can't see it anymore. No contrast, no resolution. This relationship is direct.

You can compare MTF curves and see that some lenses have a longer, shallower tail ... meaning, higher extinction resolution but lower contrast at moderately high frequencies. Others have a steeper drop-off ... they may have higher contrast at moderately high frequencies, but then drop to zero very quickly. I think the existence of lenses like this, with such different characteristics, gives rise to the idea that you can have one without the other. But plenty of good lenses have both, and plenty of lousy ones have neither.

Nathan Potter
13-May-2014, 10:19
Definitely a reciprocal relationship between resolution and contrast, but the value of the reciprocal is not consistent between all lenses. This is fundamentally due to the wave nature of light, but from a practical point of view the loss of contrast with increasing spatial frequency is due to diffraction effects (higher-order interference fringes) combined with failure of the optical design to completely eliminate all aberrations such as spherical, astigmatism, coma and, for color, spherochromatism.

Scanners use lenses, so they contribute to what Paul above calls extinction resolution, but the finite spacing of a row of pixel sensors contributes at least as much to the degradation of contrast at higher spatial frequencies. It should be intuitive that as the detail in the image (or the line pitch of a target) becomes finer than the pixel pitch of the sensor array, the contrast washes out.

The Heisenberg uncertainty principle is not directly relevant to the resolution/contrast discussion, since it is confined to the explanation that it is not possible to precisely measure the state of a particle, because the act of measuring disturbs the state being measured.

Sharpening tools look at small details in an image and then increase the contrast within the tiny sampled area to yield an apparent increase in sharpness. I think this is useful in some instances, depending on the proclivities of the artist, but the process does not contribute any new information from the original negative.
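
As a rough illustration of the local-contrast boost described here, a minimal one-dimensional sketch of a classic unsharp mask (the blur radius and amount are arbitrary illustration values, not taken from any particular product):

```python
import numpy as np

def unsharp_mask(signal, radius=5, amount=1.5):
    """Classic unsharp mask on a 1-D signal: blur a copy, subtract it from
    the original, and add the difference back. Contrast around edges is
    exaggerated, but no detail absent from the original is created."""
    kernel = np.ones(2 * radius + 1)
    kernel /= kernel.sum()
    blurred = np.convolve(signal, kernel, mode='same')
    return signal + amount * (signal - blurred)

# Toy example: a plain step edge acquires the familiar overshoot/undershoot halo.
edge = np.r_[np.zeros(50), np.ones(50)]
sharpened = unsharp_mask(edge)
print(edge.min(), edge.max())             # 0.0 1.0
print(sharpened.min(), sharpened.max())   # dips below 0 and overshoots above 1 around the step
```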

Nate Potter, Austin TX.

paulr
13-May-2014, 11:44
Nathan and I might be interpreting the question differently. Maybe I misunderstood it. I thought Ken was asking why some lenses seem to be high-contrast / low-resolution, or the opposite. And my answer was that there's no fundamental reason for that trade-off.

Nathan I think is pointing out that with any given lens, as the spatial frequency of image detail goes up, the contrast goes down. This is indeed a reciprocal relationship, and not confined to photography or optical systems. You'll see the same kind of curve in any representation of frequency response in any signal system.

I couldn't tell you the exact physical principles at work here, except to say that they're unrelated to the uncertainty principle or anything quantum. Sound waves and waves in water behave the same way. I bet someone here can explain it concisely. An extra dollar if they can do it without the phrase "fourier transform."

The Heisenberg principle, btw, is not actually about measurement. It often gets conflated with the observer effect, which is another idea in physics, that some phenomena can't be observed by some means without being influenced. But Heisenberg was talking about the probabilistic relationship between position and momentum that exists because of the wave nature of particles, regardless of whether or not they're being observed.

Peter De Smidt
13-May-2014, 12:54
I can't speak to the theoretical matters, but good use of sharpening can bring out detail that was obscured without it. You can see this with photos of high-resolution targets. You might gain a group, or so, with sharpening.

Struan Gray
13-May-2014, 13:22
I am not an optical designer, or even someone who has spent any real time looking at 'thick' lenses in a mathematical way. However, the giants whose shoulders I find myself standing on are perhaps slightly taller, and more likely to be facing in the right direction, than those stood on by the more pure artists among us here.

I think that when internet lore talks of a tradeoff between resolution and contrast, it is either full of it (it happens :-), or it is referring obliquely to how an optical designer balances residual aberrations against each other to achieve a final look in how the lens renders. However, the effects this produces are all grouped at spatial frequencies near the resolution limit, so this is certainly not a case where overall contrast is related to resolution - rather, the ultimate possible resolution is generally reduced in favour of higher contrast at frequencies just below the resolution limit. Usually this makes the bokeh uglier too, but it does give lovely sharp edges for small features.

The reason is that the MTF cannot just have any old shape. In particular, if it is normalised properly, then conservation of energy means that a reduction somewhere in the MTF-vs frequency plot must be balanced by a boost somewhere else. So you can squish the response out at the end of the tail of the function at very high frequencies, and add that to other frequencies nearby.

One factor not often considered by photographers is that the MTF is only half the story. The full story, the Optical Transfer Function, or OTF, includes a phase part as well. Note that this refers to the phase of the spatial features, not the phase of the light waves. Squidgeing the tail of the response to boost slightly lower frequencies can have dramatic phase effects because the MTF and phase plots are hard-wired to each other: squish one, and you distort the other. This can sometimes be seen in USAF resolution charts where the four black bars of groups near the resolution limit of the lens are replaced by only three, often quite distinctly-defined, but fewer in number. In this case the spatial phase has inverted so that black becomes white and vice versa.
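
A crude one-dimensional sketch of this spurious-resolution effect, with a box blur standing in for defocus (the bar frequency and blur width are arbitrary, chosen only so that the blur's transfer function is negative at the bar frequency):

```python
import numpy as np

# A square-wave bar target blurred by a box (defocus-like) kernel. The box's
# transfer function is sinc(w*f); for 1/w < f < 2/w it goes negative, so the
# bars come through contrast-reversed: bright bars render darker than dark ones.
dx = 0.001                                   # mm per sample
x = np.arange(0.0, 10.0, dx)                 # 10 mm of target
f = 15.0                                     # bar frequency, cycles/mm
target = (np.sin(2 * np.pi * f * x) > 0).astype(float)

w = 0.1                                      # blur width in mm: sinc(w*f) = sinc(1.5) < 0
kernel = np.ones(int(w / dx)) / int(w / dx)
image = np.convolve(target, kernel, mode='same')

core = slice(1000, 9000)                     # ignore convolution edge effects
bright = image[core][target[core] > 0.5].mean()
dark = image[core][target[core] < 0.5].mean()
print(round(bright, 2), round(dark, 2))      # ~0.42 vs ~0.58: the pattern is phase-reversed
```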

It would be fun to mount a Cooke on an MTF machine and measure the MTF as various degrees of diffusion are dialled in. Perhaps someone can get Roger Cicala at Lens Rentals interested. This would be one of the few cases where aberrations (often known ones) can be added in to the lens design in a controlled way without introducing too many new variables at once.

The Heisenberg principle does crop up in measurement, because the wave-mechanics considerations from which it was derived relate the spatial and frequency widths of 'wavepackets' or groups of similar waves. It describes exactly how diffraction spots get broader at smaller apertures, in a way which has a direct correspondence to the usual momentum-position formulation of the quantum mechanical version.

But the consideration here has more to do with the functions you use to describe the spatial distribution of light, and those that describe how that distribution is changed by passing through an optical system. Both sorts of functions are bound by rules. These include the conservation of energy and momentum familiar from school physics, but also more complex and less intuitive constraints such as the link between the phase and amplitude response of the OTF. Fourier transforms are indeed involved, but also more exotic stuff like Hilbert and Zernike transforms.

sanking
13-May-2014, 13:38
"One factor not often considered by photographers is that the MTF is only half the story. The full story, the Optical Transfer Function, or OTF, includes a phase part as well. Note that this refers to the phase of the spatial features, not the phase of the light waves. Squidgeing the tail of the response to boost slightly lower frequencies can have dramatic phase effects because the MTF and phase plots are hard-wired to each other: squish one, and you distort the other. This can sometimes be seen in USAF resolution charts where the four black bars of groups near the resolution limit of the lens are replaced by only three, often quite distinctly-defined, but fewer in number. In this case the spatial phase has inverted so that black becomes white and vice versa."

I have done a lot of tests with the USAF resolution target and never observed this specific kind of inversion. Would one be more likely to see it with film or digital tests, or cameras or scanners?

Struan Gray
13-May-2014, 13:50
I have done a lot of tests with the USAF resolution target and never observed this specific kind of inversion. Would one be more likely to see it with film or digital tests, or cameras or scanners?

In most digital tests, the lens and the digital sensor have similar resolution limits, and this kind of phase effect gets lost in the (equally odd-looking) phase effects of aliasing. You need a sensor which handily out-resolves the lens. I remember seeing it in (other people's) tests using USAF charts on Tech Pan. It was common enough that testers warned that the group defining the resolution limit of a lens had to have four observable bars, and not three.

I was reading photo.net fairly regularly back in those days, I'll see if I can dig up anything there. If you want to recreate it, a poor-resolving lens with bad nisen bokeh on a high pixel density sensor should be able to show it, if the sensor is dense enough not to contribute aliasing artifacts.


PS: I was misremembering how many bars the USAF chart has. For four read three, and for three read two. :-)

PPS: http://toothwalker.org/optics/spurious.html is a nice demonstration. Any blurring function (aberrations, diffraction, defocus) can produce it, but just how much and how dramatic it is depends on the how the pattern and the blurring interact.

Drew Wiley
13-May-2014, 14:01
Different film/dev combinations obviously create a distinction between mere detail and perceptible acutance, largely related to edge effect. But lenses also sometimes exhibit an analogous contradiction. An allegedly "sharper" lens per MTF might handle detail less successfully than a lens of a different design, but lower hypothetical MTF. A lot of this can get pretty subjective. But it is also related to the degree of enlargement in the print, to whether color stimuli are in effect, such as simultaneous color contrast, all kinds of things. I don't personally regard PS sharpening following a scan to be even related, because it can be applied to any or all of the above. So can unsharp masking tricks in fully analog technique. But in terms of hard physics, blue light obviously scatters more than red light.... Anything beyond that simple fact gets involved with the optics, engineering, and software of the specific equipment. There's a world of difference between how a drum scanner works and the average amateur flatbed. The practical implications are a lot less forgiving for small format work than large.

IanG
13-May-2014, 14:47
An interesting question, but in my experience far more noticeable with 35mm, which like 120 is a miniature format (according to pre-WWII and early 1950s books).

If you compare Leitz lenses to comparable Japanese lenses, there's not a reciprocal relationship between resolution and contrast: the Leitz lenses have greater resolution, the Japanese greater (micro) contrast. In prints on the same film you can see a subtle difference.

Ian

Emmanuel BIGLER
13-May-2014, 15:10
A few remarks about physics since the original question mentions basic principles of physics.
Heisenberg's uncertainty principle is actually closely related to Fourier analysis.
In classical signal processing theory, e.g. in electronics, the width of a certain function in the time domain is inversely proportional to its width in the Fourier (frequency) domain, and the same applies to a wave-function modelling a particle in micro-physics.
This also applies to the analysis of image quality delivered by a diffraction-limited lens, for very small fields close to the optical axis.
In this very restricted case, the product of the aperture width in the exit pupil plane by the diffraction spot width in the image plane is constant, and this is nothing but a basic Fourier transform property, since the distribution of light in the focal plane is computed through the Fourier transform of the distribution of light in the exit pupil plane.
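
A small numerical sketch of that reciprocal Fourier property, using a one-dimensional slit pupil (the units and sizes are arbitrary; the point is only that halving the pupil doubles the diffraction spot, so the product of the two widths stays constant):

```python
import numpy as np

def psf_fwhm(pupil_samples, n_pad=1 << 16):
    """FWHM (in frequency bins) of the intensity point-spread function obtained
    as the Fourier transform of a 1-D slit pupil 'pupil_samples' wide."""
    pupil = np.zeros(n_pad)
    pupil[:pupil_samples] = 1.0                    # a slit of the given width
    psf = np.fft.fftshift(np.abs(np.fft.fft(pupil)) ** 2)
    above = np.flatnonzero(psf >= psf.max() / 2)   # samples of the central lobe above half max
    return above[-1] - above[0] + 1

for width in (100, 200, 400):
    fwhm = psf_fwhm(width)
    print(width, fwhm, width * fwhm)               # the product stays essentially constant
```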
This approach through a Fourier analysis in optical systems was pioneered by a French professor, Pierre-Michel Duffieux, as early as 1946. But there was no laser at the time and Fourier optics actually started in the sixties with the use of laser sources and coherent optical imaging.

A gaussian laser beam also satisfies this kind of relationship, the product of the beam minimum width ("beam waist") by the beam angular divergence cannot be smaller than something like the wavelength of light, and this again is similar to any Fourier property.

BUT in our real photographic lenses, except when we stop them down so heavily that they are close to the diffraction-limited model, but not quite as poor as a pinhole ;), as mentioned above we can have almost any kind of MTF curve shape, except that all MTF curves HAVE to go to zero at the ultimate diffraction limit.
There is an interesting example in Stroebel's book "View Camera Technique" showing macro shots of playing cards, one with a lens with poor ultimate resolution but good contrast for large-size details of the image, and the other where the resolution limit is very high but where contrast for large-size details is poor. The first kind of "bad" lens delivers a more pleasant image than the second type. (Leslie Stroebel, "View Camera Technique", 7th ed., page 116 - ISBN 0240803450)

And regarding Peter's remark :
a good use of sharpening can bring out detail that was obscured without it.

yes, but definitely
- not beyond the ultimate diffraction limit: the ultimate analog period that can be detected behind any lens is N.Lambda, where N is the effective f-number (the nominal f-number times (1+M), M being the magnification ratio) and Lambda the wavelength of light;
- not beyond Shannon's sampling limit defined by the pitch of the digital sensor: ideally the sensor pitch should be 2 times smaller than the ultimate optical analog period, since 2 sampling points are mandatory per analog period (a quick numeric check follows below).
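
Here is that numeric check, a minimal sketch assuming a green wavelength of 0.55 micron and a few example working f-numbers:

```python
wavelength_um = 0.55                                    # green light, assumed for illustration
for f_number in (5.6, 11, 22, 45):
    cutoff_period_um = f_number * wavelength_um         # smallest analog period, N * lambda
    required_pitch_um = cutoff_period_um / 2            # Shannon: two samples per period
    print(f"f/{f_number}: cutoff period {cutoff_period_um:.1f} um, "
          f"sensor pitch <= {required_pitch_um:.1f} um")
```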

Needless to say, in most film or digital camera systems of today we are still, in most cases, quite far from those ultimate limits.
And this is the reason why sharpening tools do have a proven visual effect ;)
But if you discuss with friends who recently acquired a high-end 35 mm digital SLR or, even better, a high-end medium format silicon back, they will complain about diffraction: hence if we trust our good friends, their faithful lenses are now diffraction-limited ;-)

And regarding the inversion of contrast in a defocused image, it is very easy to display the phenomenon to a large audience using an overhead projector that you gradually defocus.
Do not choose a USAF-like target (and again a Frenchman will refer to the Foucault target, used by the French physicist Léon Foucault in the 19th century, many decades before the USAF actually was created ;)) but prefer a radial target known in Germany as a "Siemens star" (http://en.wikipedia.org/wiki/Siemens_star). This object is terrific at revealing defocusing. And printed on a piece of paper it gives you a headache, since you can see the radial lines "vibrating".
This is due to the constant fluctuations of accommodation of our eyes, combined with the various amounts of defocusing & astigmatism of our vision.
Or simply, if you are short-sighted, look at the printed radial target with your naked eyes, and from a distance where the target is blurred, get closer and see the inversions of contrast moving from the periphery of the target to the center !! A headache, I promise ;)

Jim Jones
13-May-2014, 15:24
. . . One factor not often considered by photographers is that the MTF is only half the story. The full story, the Optical Transfer Function, or OTF, includes a phase part as well. Note that this refers to the phase of the spatial features, not the phase of the light waves. Squidgeing the tail of the response to boost slightly lower frequencies can have dramatic phase effects because the MTF and phase plots are hard-wired to each other: squish one, and you distort the other. This can sometimes be seen in USAF resolution charts where the four black bars of groups near the resolution limit of the lens are replaced by only three, often quite distinctly-defined, but fewer in number. In this case the spatial phase has inverted so that black becomes white and vice versa. . . .

In the enclosed file of a test chart more compact than the USAF 1951 chart, the thumbnail exhibits only one or two bars instead of three. Downsizing digital files has an effect similar to lenses near their resolution limit. The full GIF file is correct. Rudolph Kingslake, on pages 61 and 62 of the 1951 Lenses in Photography, shows examples of three- and four-bar charts that survive the printing process in decent condition, and an example of another four-bar chart that a lens has distorted to exhibit only three and two bars.

Drew Wiley
13-May-2014, 15:27
The way all these different lenses handle color, then color in relation to specific kinds of dye clouds, then, alas, in relation to digital sensors... all gets pretty complicated. First it was achromatic correction, then apochromatic... But even that has relative standards of definition, and not always with both longitudinal and tangential correction in mind. It is microphotographers who sweat over this stuff. And for serious budgets, they have some pretty sophisticated ways of correcting things that we do not (and really do not even need in the first place). But even when using small cameras, there are times when the "look" of an older lens outweighs its more modern super-crisp counterparts. I realize I'm not hitting the nail on the head at all when it comes to Ken's question. Sorry, Ken. But it would probably take a few graduate courses in optical engineering to do that.

onnect17
13-May-2014, 18:51
Interesting thread. Perhaps that's the reason the film specs include resolution data at two contrast levels, 1.6:1 and 1000:1.

I wish the approach to evaluating resolution were a better fit to the chart. Instead of using Fourier, I would use square waveforms, like the Walsh transform.

Also, Shannon is necessary but not sufficient. Here comes the phase of the sampling (that's one of the reasons why Canon does not want to remove the filter in front of the sensor). Besides, assuming the sampling frequency is simply set by the distance between pixels implies evaluating the response to white light, so as to include all the colors of the Bayer design when extracting luminance, unless, of course, we use other sensor designs like Foveon or a simple monochrome sensor.

In the case of scanners, the sources of error are linked to the technology behind the device. In most flatbed scanners, light ending up in the neighboring pixels of the linear CCD due to dispersion and cheap optics is quite common. To make it worse, the pixels are contiguous in the direction perpendicular to the scan, so the effective resolution already differs between the two axes.

You can always print an RGB version of the USAF chart at different contrasts and take a shot with Provia (or dupe) then scan it and evaluate the results.

Leigh
13-May-2014, 19:01
Digital resolution, be it in-camera or in-scanner, cannot be evaluated the same as resolution on film.

The digital world is divided into distinct elements, while film is not.

You can easily create a black/white line array that will resolve perfectly at one specific position relative to the sensor, but which will turn into uniform gray when the image is moved a distance equal to one-half of the sensor element pitch at the sensor plane.
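
A tiny sketch of the effect Leigh describes, averaging a line pattern whose period is exactly two pixels over each pixel, first aligned with the grid and then shifted by half a pixel (the pattern and pixel sizes are arbitrary):

```python
import numpy as np

def sample_line_pairs(phase):
    """Average a black/white line pattern (period = 2 pixels, i.e. one line pair
    per two pixels) over each of 8 pixels, for a given shift of the pattern
    relative to the pixel grid, measured in pixel widths."""
    n_sub = 100                                   # sub-samples per pixel
    x = np.arange(8 * n_sub) / n_sub + phase      # position across 8 pixels
    pattern = np.floor(x) % 2                     # alternating 0/1 stripes, 1 pixel wide
    return pattern.reshape(8, n_sub).mean(axis=1)

print(sample_line_pairs(0.0))   # aligned: 0, 1, 0, 1, ... full contrast
print(sample_line_pairs(0.5))   # shifted half a pixel: every pixel reads 0.5, uniform gray
```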

- Leigh

Brian C. Miller
13-May-2014, 19:39
I have an amateur Physics question about about scanner resolution, and perhaps about all lenses for that matter.

You ran out of film, didn't you? Or it has been raining too much for you to go outside. You need to go and expose more film. You know you need to do it. Your camera is calling for you. "Use me! Please! It's dark in here! Let me see the light!" But you've been ignoring it. Letting it waste away. Alone. Lonely. It's there, pining to be used.

Show your camera some love. Go and photograph!

Emmanuel BIGLER
14-May-2014, 02:05
Optics: resolution versus contrast, a fascinating topic!

A few words more before Ken runs out of film. And as soon as he runs out of film, he should run in (if he is doing outdoor photography) to his computer and get a copy of this most valuable reference book:

Image clarity: high-resolution photography, by John B. Williams
Focal Press, 1990 - ISBN 0240800338

http://books.google.fr/books/about/Image_clarity.html?id=BdpTAAAAMAAJ&redir_esc=y

1990! The Good Old days! The end of the Cold War!

Imagine all those "armies" of scientists & engineers working 24/7 on military contracts, on both sides of the Iron Curtain, and pushing resolution and contrast of their aerial or satellite reconnaissance imaging systems to the best of the best!
As a civilian taxpayer, I rejoice that all this effort is now partially de-classified and that we can get the benefits of top-class lenses, films, silicon image detectors, digital imaging systems and digital enhancement software that were initially developed prior to 1990.

-------------------------------------

MTF curves for film and lenses are good tools and good excuses for endless discussions on our forums, but actually MTF curves fail to supply a global model explaining what humans actually see.

MTF theory implies a transfer of small modulations, but humans in general, and we LF aficionados in particular, prefer to look at a real image in the real world, and our human LFer's eyes are extremely sensitive to abrupt contrast changes and edge effects, combining a global vision with a scanning-like analysis.

Abrupt contrast changes are typically the kind of feature not covered by a theory dealing with a linear transfer of small intensity or density modulations.
The human eye likes to evaluate luminances on a logarithmic scale; Saint Ansel's Holy Zones are extremely non-linear in terms of intensity transfer! Our detector is so non-linear that silicon sensors, which are stubbornly linear, must eventually deliver an image file to be re-computed through some non-linear gray scale, i.e. through a non-linear transfer function!
Exit MTF theory, since linearity is no longer valid!

And MTF curves never take veiling glare into account. MTF curves are re-normalized to 100% contrast at zero spatial frequency, and they can represent either the transfer of small intensity modulations or of small optical density modulations. A perfectly sharp image on which you superimpose some uniform amount of stray light will appear less sharp, even if the small details are still resolved.
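
A two-line illustration of that last point, using Michelson contrast and a made-up 10% veiling-glare pedestal:

```python
def michelson(lo, hi):
    """Michelson contrast of a small bright/dark detail."""
    return (hi - lo) / (hi + lo)

i_min, i_max, glare = 0.05, 1.00, 0.10                    # illustrative values only
print(round(michelson(i_min, i_max), 2))                  # ~0.90: crisp detail, no glare
print(round(michelson(i_min + glare, i_max + glare), 2))  # ~0.76: same detail, washed out by stray light
```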

Contrast and sharpness in the image are not an intrinsic property of the lens and proper focusing set-up, the way the object is illuminated plays an important role.

It is well known that fine-art B&W printers prefer diffuse enlarger light heads to point sources. Point sources are mandatory in microfilm reading systems, since they deliver better image contrast and better resolution (combined, of course, with a highly specialized microfilm enlarging lens); but for fine art prints, point sources are a pain, since all scratches and dust are enhanced.

There is a general rule of optical transfer stating that incoherent light systems can never enhance high spatial frequencies in the image; they can just smear them out more or less rapidly - see the superb example on page 116 of Stroebel's book.

However, partially coherent imaging systems play in another league, and enhancement of the contrast of fine features is possible - for example the "dark field" setup in a conventional optical microscope. Hence it is possible both in principle and in practice (microscopes) to enhance the contrast of fine details without any digital post-processing.
But progress in digital image processing makes digital enhancements so easy that the only thing the old grumbling analog guy can say is "Eh, young boys & girls, believe me, you'll never be able to recover everything which is blurred" ;) It's like the story of the bumblebee: according to physicists, it is impossible for the bumblebee to fly. Aficionados of digital post-processing do not care for the warnings of old physicists, and continue to post-process digitally, like the bumblebee continues to fly.

The situation in regular photography is close to the enlarger setting with a diffuse source, i.e. perfectly spatially incoherent illumination. Point sources belong to the category of partially coherent illumination systems, and MTF curves differ when the same lens is used with various kinds of illumination setups.

My understanding is that drum scanners like the Heidelberg Tango use an illumination setup close to the classical micro-densitometer setup, with a pair of conjugate slits and a pair of microscope lenses, one for illuminating a tiny portion of the film and one for recording the light after it crosses the film.
This kind of illumination system with conjugate slits is one of the most complex in terms of optical physics; it is not like the diffuse head, and not like the point source used in microfilm readers - it is something else.
But it is extremely efficient in terms of resolution (provided that focus is always perfectly maintained) and rejection of stray light. Again, here, contrast and resolution are intricately linked.

BUT the drum scanner only records a very tiny object field at a time, much smaller even than the field actually covered by a microscope lens, since the field is defined by the conjugate pair of slits.
At the other extreme, consider the basic setup of taking a picture of your 4x5" film with a 35 mm DSLR. All image points have to "cross the lens" at the same time. In this setup, even if we do not speak about stray light and the minimum achievable recorded density, the maximum number of points actually resolved is ultimately limited by ... the diffraction cut-off period, as stated above for incoherent illumination.
This has been well known since the 1950s: simply put together Duffieux's ideas (1946) with Nyquist's (1928) and Shannon's (1949) and you get the number of resolved points.
For example, if the image field is square and represents 20x20 degrees of field angle (about f/3 by f/3 in size), the maximum number of resolved points at lambda = 0.7 micron for an N = f/4 theoretical diffraction-limited lens with f = 100 mm is around 550 million: (1000 × 33 / 1.4)².
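
Redoing that arithmetic as a quick sanity check (all values are taken from the paragraph above, with the field width approximated as f/3):

```python
wavelength_um = 0.7
f_number = 4
focal_mm = 100
field_mm = focal_mm / 3                              # ~20 degrees of field is about f/3 across

cutoff_period_um = f_number * wavelength_um          # diffraction-limited analog period
sample_um = cutoff_period_um / 2                     # Shannon: two samples per period
points_per_side = field_mm * 1000 / sample_um
print(f"{points_per_side ** 2:.2e} resolved points") # ~5.7e8, in line with the ~550 million above
```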
This figure should make us optimistic regarding the future development of digital imaging systems and the durability of one of our preferred discussions here about DSLR scanners, initiated by Peter J. de Smidt ;)

Drew Wiley
15-May-2014, 15:37
Interesting, Emmanuel. But part of the whole complexity of this question is how to weave together the aspects of the measurable physics of optics and the consequences of the physiology of human vision, which is largely ignored in these kinds of debates. I had quite an expert in microscopy in here yesterday, who is also a collector of microscopes, hundreds of them it seems. I was chatting with him about simple film photography with sheet film adapters and Zeiss microscopes, and was shocked to learn that he once had to literally throw away dozens of such units at a govt facility because nobody wanted to buy them. Dang, those things were worth at least ten grand apiece. Nowadays these kinds of rigs really operate hybrid, cost five times as much, but don't have the mechanical build quality of traditional scopes from the 60's and 70's. It's those kinds of folks who really understand these questions better than most. Nikon has an excellent site devoted strictly to microscopy and photomicroscopy (now mostly video), since they supply a lot of the current medical industry. My contact here at Tinsley Labs has passed away. They made some of the most expensive lenses in the world.

Struan Gray
16-May-2014, 00:06
One caveat Drew: a high-end research microscope of today is considerably 'better' built than the equivalents from the 60s. Less backlash in the focussing, more precise repeatability when changing objectives, better light sources, and lenses which are miles better in terms of quality of image across the whole field. That's before you get into the advantages of infinity conjugates (the lack of which is the likely reason your friend's models were 'obsolete'), phase contrast and DIC imaging, and the ability to work well with piped-in light sources like lasers for fluorescence. The up-to-date Nikon and Zeiss models I have worked on are much, much better than the similar stuff from thirty years ago.

Which is not to say that amateurs and schools would not have welcomed them with open arms, if only they had known. Just the focussing mechanisms would be worth scavenging for the focus stackers amongst us. But institutions think in terms of administrative load, not the abstruse benefits of recycling useful stuff. It's why I used to haunt the dumpsters at the University :-)

Emmanuel BIGLER
16-May-2014, 09:31
From Drew:
how to weave together the aspects of the measurable physics of optics and the consequences of the physiology of human vision, which is largely ignored in these kinds of debates.

Yes, I really agree, we share the same point of view. This is the reason why when we organise our informal LF gatherings, we put the emphasis on sharing real images, real prints, real slides (I try not to bring with me my catalogues of MTF charts ;-) )

Regarding the use of optical microscopes, I try to do my best in my practical hands-on microfab courses for engineering students to train them in the minimal skills for using an optical microscope. They have to prepare a report on what they have done, and they have to take pictures of the small objects they have fabricated during the course. Few of them actually have even a minimum knowledge of how to use an optical microscope, so they have to manipulate the microscope before playing with computer images ;)
In several industries in the field of precision engineering, instruments (like mask aligners) that do not rely on the operator's visual skills (thanks to computer monitors) are now preferred to direct optical-viewing positioning systems.
In modern CNC machines or modern sub-micron mask aligners, I doubt that operators ever look into an eyepiece,
unlike in the good old days of manually operated milling machines, when you sometimes had to mount a kind of microscope to position the tool precisely.

However, today in micro-surgery surgeons do have to use a binocular microscope, and watchmakers do not hand-mount a complicated watch by looking at a computer screen! (Most watchmakers now prefer the binocular loupe to the traditional "one eye" watchmaker's loupe; in jewelry, stone setting is done with the help of a binocular microscope - no computer screen yet! (http://alexandreschool.com/cms/))
But these might be the exceptions? I do not know.

Drew Wiley
16-May-2014, 10:00
Oh I agree with you, Struan... but the older pure optical scopes are just so much more fun, esp if you're just peeping at a drop of pond water with dark field. I'm interested in the visual experience at this point in life, just like I was in childhood, not the research aspects. What I do suspect, however, is that some of the specialized methods of hybrid correction being used in modern microscopy have been adapted to large format film use too, namely in surveillance applications - obviously not the kind of lenses and cameras we personally used. While I've never touched any of that actual classified equipment, I have seen some of the shots from time to time, and they seem to break all the limitations of conventional conversations about MTF, apochromaticity, spectral sensitivity, etc. And I'm talking about true-color (human vision) chemical color prints - nothing digitally manipulated, but perhaps electronically corrected somewhere in the light path itself, then optically reassembled. Fun to guess. But yes, I do know about those surgeries. My wife would never drink coffee on a day she had to do the stitching. And the neurosurgeon would use an obsidian-tipped knife, cause steel still isn't sharp enough. Some things still haven't changed since the ice ages. And I wonder if people once learned to make magnifiers out of stone too. There were gift baskets woven by coastal Miwok women less than an inch wide overall, with intricate feathered and fiber designs in them so tiny the unaided human eye can't even make them out. In the museum, they are displayed behind magnifiers. And I was personally the first person in the W Hemisphere to discover ice age microliths - incredibly precisely made tiny, tiny obsidian points which were used in a serial manner, aligned into a shaft in mammoth ivory or bone made using a grooving burin. Nobody else even bothered to examine stuff that small. But old world paleo researchers instantly recognized what I had when I showed it to them. Somewhere around here I also have a tiny little block of polished crystal with a laser sculpture inside it (sculpted in the negative sense, by removing material inside the little polished block) that you need a magnifier to see. It was a stunt piece from some laser outfit.

paulr
16-May-2014, 11:31
The physiology of human vision may get left out of these discussions, but it's been well studied, especially in the areas of subjective sharpness, clarity, and detail. The factors that correspond with these perceptions correlate very well with information you can get from MTF data.

The caveat is that you have to look at the MTF of the whole system, which includes viewing distance of the final print. We create our sense of sharpness based on MTF at a specific range of spatial frequencies—measured at the retina. Which means that the difference between looking at a print from 1 foot and 2 feet changes those frequencies 100%. Fortunately the relevant frequencies encompass a range of about 1:5, so optimum conditions aren't terribly brittle. But it means that you'll want to sharpen an image differently for printing at 10" wide vs. 24" wide vs. 50" wide. And if you're printing for a site-specific installation, you'll want to know how close people will get.
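
A back-of-the-envelope conversion showing the viewing-distance dependence, under a small-angle approximation (the 5 lp/mm figure is just an example):

```python
import math

def cycles_per_degree(print_freq_lp_mm, viewing_distance_mm):
    """Spatial frequency presented to the eye by a detail of the given
    frequency on the print, seen from the given distance."""
    mm_per_degree = viewing_distance_mm * math.tan(math.radians(1.0))
    return print_freq_lp_mm * mm_per_degree

print(round(cycles_per_degree(5, 305)))   # 5 lp/mm at ~1 ft   -> ~27 cycles/degree
print(round(cycles_per_degree(5, 610)))   # same detail at 2 ft -> ~53 cycles/degree (doubled)
```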

MTF curves perfectly explain why velvia transparencies look so sharp on a lightbox. The MTF at lower frequencies rises well above 100%, indicating strong edge effects at the exact range of resolutions that would matter when looking at an image unmagnified (and to a somewhat lesser degree through a 4X loupe.) In a big print it will no longer have this advantage over many films that look less sharp unmagnified.

On the subject of veiling flare, I don't really think it shows a limitation. An optical lab can reduce it through normalizing to the same degree as you could in the darkroom / computer through increasing global contrast.

The more important reason MTF curves don't tell us everything is that we care about more than just perceived sharpness and detail.

sanking
16-May-2014, 13:41
......
This figure should make us optimistic regarding the future development of digital imaging systems and the durability of one of our preferred discussions here about DSLR scanners, initiated by Peter J. de Smidt ;)

We have certainly seen a lot of progress in digital with dynamic range. The announced Sony a7s is said to have a dynamic range of 15.3 stops. Seems only yesterday ten or so stops was considered good.

Sandy

Emmanuel BIGLER
16-May-2014, 15:40
From Paul R.:
The factors that correspond with these perceptions correlate very well with information you can get from MTF data.
Well, I am certainly convinced as far as lens performance is concerned, but for film resolution and granularity I still have doubts regarding the high-frequency side of film MTFs.
The example for Velvia is very demonstrative, though, at low frequencies; but Velvia is not the only film to exhibit such a bump over 100% at low frequencies - maybe Velvia has a bigger bump than other films?

which includes viewing distance of the final print.
Agreed 100%, and this is a reason why many people asking questions on our forums regarding depth of field are often disappointed: before answering, we often ask them to define the viewing distance of their final prints.
And this variability of DOF makes the good ol' DOF scales engraved on our old lenses useless (fortunately, modern DSLR lenses and LF lenses do not show any DOF scales ;) )


From Sandy K.:
The announced Sony a7s is said to have a dynamic range of 15.3 stops
Thanks for the info, this is amazingly good news, but it brings back the delicate question of stray light.
15.3 stops corresponds to a dynamic range of 1 to 40,000; the ten zones (or is it eleven? ah, the old trick of counting tick marks vs. intervals ;) ) of Saint Ansel correspond to only 1 to 1000.
In order to actually benefit from such a dynamic range in a single shot, the amount of stray light generated by parasitic reflections inside the lens and inside the camera itself would have to be kept really low.
And this is a question on which manufacturers give few details.
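
A simplified model of that concern, treating veiling glare as a uniform pedestal equal to a fraction of the brightest scene luminance (the flare fractions below are made-up illustration values):

```python
import math

sensor_stops = 15.3
print(round(2 ** sensor_stops))            # ~40,000:1, the sensor's quoted range

# If flare adds a uniform pedestal equal to a fraction k of the brightest scene
# luminance, no shadow in a single shot can record below that pedestal, so the
# usable scene range is capped near 1/k no matter how good the sensor is:
for k in (0.01, 0.001, 0.0001):
    print(f"flare {k:.2%}: usable range ~{1 / k:,.0f}:1 "
          f"({math.log2(1 / k):.1f} stops)")
```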

Bernice Loui
17-May-2014, 08:56
Off topic on Microscopes, still some interesting bits.

"Older" microscopes had optical systems designed and built to an Royal Microscope Society standard of 160mm mechanical tube length which was set a long time ago and they went after any manufacture that deviated from this standard. The first to move away from the 160mm tube length was Leitz who introduced a 170mm tube length for some of their optical systems. RMS sent Leitz a formal letter to stop. In time the entire microscope industry move on to an infinity optical system in use to this day. Giving up that mandated 160mm mechanical tube length allowed important developments for microscope optical systems. From magnification changers to much improved illumination systems and a lot more.

The other significant change was moving away from RMS threaded objectives, which was another specification mandated by the Royal Microscope Society.

160mm optical systems can be absolutely excellent optically, but they have specific limitations due to this mandated optical requirement. It does depend on what specifically the microscope is intended to be used for.

As for mechanical backlash in the focus stage, know that the mechanical focuser in the Leitz Orthoplan has none by design, as its fine focus is achieved by shifting a worm gear that does not depend on mechanical precision alone. This Leitz focus mechanism was chosen by NASA for more than one of their in-space experiments that required an extremely precise stage mechanism.

"Older" microscope have their place and it is highly dependent on what the microscopy needs are.

There is a Leitz Ergolux with Fluotar BF/DF objectives and a large x-y stage (6" x 6") that includes a digital position read-out, which I use for measuring tiny stuff. While this is one of the speciality microscopes in the pile, it is an example of a speciality microscope that is excellent to this day.

One cannot deny the beauty and excellence in craft of older microscopes like this Zeiss Universal..
https://www.youtube.com/watch?v=sQW5ttGJxpM

Like too many items produced and sold today, they might be easier to use and lower cost, but most are designed to be tossed once their usefulness is done. Newer is not always better.



Bernice




paulr
17-May-2014, 10:02
The example for Velvia is very demonstrative, though, at low frequencies, but Velvia is not the only film to exhibit such a bump over 100% at low frequencies; may be Velvia has a bigger bump than other films?.

It's the biggest bump I've seen. But I've never seen curves for b+w films developed with really edgy developers.

Ken Lee
18-May-2014, 05:30
http://www.kenleegallery.com/images/forum/mtf.jpg

Here's an MTF chart for a Rodenstock 150mm APO Sironar S.

If the lens covers 231mm, why would they only show the performance out to 126.7 mm, the width of a 4x5 sheet of film ?

If performance drops at 127mm, then what about the performance at another 100mm out ?

I'm not trying to split hairs here, just trying to understand and I greatly appreciate the expertise available on this forum.

Arne Croell
18-May-2014, 06:02
http://www.kenleegallery.com/images/forum/mtf.jpg

Here's an MTF chart for a Rodenstock 150mm APO Sironar S.

If the lens covers 231mm, why would they only show the performance out to 126.7 mm, the width of a 4x5 sheet of film ?

If performance drops at 127mm, then what about the performance at another 100mm out ?

I'm not trying to split hairs here, just trying to understand and I greatly appreciate the expertise available on this forum.
Ken, the x-axis of the MTF chart is the radius from the image center, while the coverage is quoted as a diameter: 2 x 126.7mm = 253.4mm, or 231mm/2 = 115.5mm. That is about where the 10 lp/mm line goes below 50%, but I do not know for sure whether that is Rodenstock's criterion for the image circle.

Ken Lee
18-May-2014, 06:22
Excellent - Thank you very much !

Jim Jones
18-May-2014, 06:27
. . . The MTF at lower frequencies rises well above 100%, indicating strong edge effects at the exact range of resolutions that would matter when looking at an image unmagnified (and to a somewhat lesser degree through a 4X loupe.) . . .

I wonder if there might not also be an analogy in optical systems. Despite what a few theorists say, a pinhole image of a standard lens testing chart can resolve line pairs about 2/3 as wide as the pinhole diameter. This constructive interference occurs as the Airy disc approaches the pinhole diameter.

paulr
18-May-2014, 16:41

Here's that Velvia curve.

If you compare to a lot of color neg films, the performance at high resolutions isn't so great. But that contrast bump in the low frequencies is ginormous.

Images on Velvia should look sharpest at close view distances in a 2X enlargement.

Struan Gray
19-May-2014, 01:08
Bernice: I have the greatest respect for well-made instrumentation of any era. I just wanted to twit Drew a little. The just plain gorgeous Axio-Imager I used extensively until recently (which outclassed every other optical microscope I have ever peered into for imaging flat subjects) was downright cludgy when I wanted to manipulate polarisation. Any good 50s metallurgical microscope would have been better for that task.

But, quite apart from the optics, I stand by my view that precision mechanics is a very different game than it was thirty or forty years ago. Both machine tools and metrology have improved tremendously, as have materials, coatings, and control over processes like annealing. Some of that capability has gone into making things more cheaply - which itself can be a worthy goal - but some has also gone into making the high end even better than it was. And that's before you get into drive-by-wire stuff like piezoelectric positioners.

I have spent a lot of time trying to persuade people to use eyepieces instead of a live view on a monitor - but kids today automatically regard an on-screen image as having more validity. Kids have always had to be trained. I have booked an especially warm corner of Hell for supposedly grown-up administrators who connect multi-tens-of-thousands-grade microscopes to the cheapest monitor in the catalogue.

Arne Croell
19-May-2014, 05:03
Bernice: I have the greatest respect for well-made instrumentation of any era. I just wanted to twit Drew a little. The just plain gorgeous Axio-Imager I used extensively until recently (which outclassed every other optical microscope I have ever peered into for imaging flat subjects) was downright cludgy when I wanted to manipulate polarisation. Any good 50s metallurgical microscope would have been better for that task.

But, quite apart from the optics, I stand by my view that precision mechanics is a very different game than it was thirty or forty years ago. Both machine tools and metrology have improved tremendously, as have materials, coatings, and control over processes like annealing. Some of that capability has gone into making things more cheaply - which itself can be a worthy goal - but some has also gone into making the high end even better than it was. And that's before you get into drive-by-wire stuff like piezoelectric positioners.

I have spent a lot of time trying to persuade people to use eyepieces instead of a live view on a monitor - but kids today automatically regard an on-screen image as having more validity. Kids have always had to be trained. I have booked an especially warm corner of Hell for supposedly grown-up administrators who connect multi-tens-of-thousands-grade microscopes to the cheapest monitor in the catalogue.
Struan, I am with you there. When we got our Axio Imager eight years ago, I was astonished that the optical difference from its predecessor, a Leitz Aristomet 17 years old at the time (which was and is not a slouch either), was clearly visible. That was using the eyepiece, of course; on a screen the difference would have been lost. With respect to the mechanics, one has to differentiate. Pieces made in comparatively larger numbers by CNC machines are more precise than their predecessors. It is not always true for rare pieces like, e.g., a universal stage for polarization microscopy. The last one, made by Zeiss until maybe 10 years ago, was a pretty simple affair compared to the ones made by them or by Leitz in the 1950's-1970's. And as for cost cutting, course microscopes for students now typically have nylon gears, which get stripped after a few years of hard use. Not an improvement in my book.

sanking
19-May-2014, 08:55
It's the biggest bump I've seen. But I've never seen curves for b+w films developed with really edgy developers.

What is an "edgy" developer?

Sandy

Drew Wiley
19-May-2014, 09:02
Most student and mid-grade microscopes in the world, regardless of brand label, are made in exactly the same factory in Xian, China, overall the largest optical factory in the world. That is where nearly all optical survey equipment is made too, including theodolites. This doesn't mean that there aren't brand distinctions in quality or features, but it does kinda dictate the ceiling of quality. This is of course the famous city where the terra cotta sculptures are; but the optics operation in its own right employs over 30,000 people. That's a stunning figure. But my youthful research days are long gone. I'm well aware of certain technical advances in microscope optics. I'm also well aware of optical advances which could hypothetically apply to view camera lenses. And I don't give a damn about either except in a curiosity sense, cause I could never afford either. But I'm probably a lot like a number of people on this forum. I appreciate good build quality, even if it does imply something relatively antique. But I'm getting to be relatively antique too. And 8x10 film has such a big footprint, you don't need the uttermost MTF figure or whatever. Other optical characteristics, or a certain look, take priority. I have nothing against nylon gears. It just depends on what kind of nylon is involved. Delrin will hold up better than most brass alloys. I still haven't worn out the nylon gears on my Sinar backs after thirty years of use. Ordinary cheap nylon is a completely different story. It might as well be papier mâché.

paulr
19-May-2014, 12:46
What is an "edgy" developer?

Sandy

I just mean developers that encourage edge effects (pyro, etc.) or development techniques that do the same (stand development). I don't mean a developer with provocative tattoos.

Bernice Loui
19-May-2014, 20:24
The Zeiss Axio Imager is their current research microscope offering, or one of Zeiss's best configurable microscopes. It does much to make complex microscopy setups easier and has the ability to produce excellent images. It is what the market expects from a very expensive modern microscope. Its results still depend on the specific stand/illumination/optics/stage/condenser/eyepiece pair/video/camera setup, driven by specific imaging needs. Not every microscope user needs an extreme research grade microscope. These improvements do come from technology progress, as expected. But the diffraction limits still apply, as does the numerical aperture for a given magnification of objective, along with objective-to-air-to-subject limitations (the different refractive indices that light must contend with in any optical system).

Piezo positioners did not become common in microscopes until after the 60's-70's. These were used as part of the Hubble Telescope correction mirror fix. Without these piezo-driven mechanical devices, correction for the Hubble problem would have been much more difficult. Millionths of an inch degree of difficulty.

While modern CNC machine tools have made extremely complex metal parts producible, extreme precision using CNC alone is not going to achieve this goal; it is part of the solution and not THE solution. Same with modern materials used in nearly every aspect of technology.

Given the excellent optical imaging possible with modern high-quality microscopes, viewing the image only on a screen is simply not acceptable. It is possible that the challenge of setting up the eyepieces for the proper interpupillary distance, getting the proper diopter correction, and learning how to view the image formed by the eyepiece optics are very real obstacles for new microscope users. Lacking this skill, the high quality image produced by a high quality optical system is mostly lost.

Learning to use a microscope properly and interpreting the images is a skill that requires training and much development. Using only the screen for viewing is a serious handicap IMO.

What happens when working at lower magnification under a stereo microscope becomes a requirement? There is nil depth perception if a viewing screen is used.

As for the Axio Imager being better than the Aristoplan, it does depend on the specific optical system and subject involved. The Leitz Aristoplan, aka Orthoplan 2, was originally set up with and used 160mm or 170mm (greater than 16X) mechanical-tube-length optics, compared to the modern infinity objectives. Beyond this, objectives were offered and used from standard PL/NPL to Fluotar to APO, along with various eyepiece types and illumination/condenser/filter configurations.
The possible options and combinations depend much on imaging needs.

Beyond this, we get into electron microscopes.

Then we have the astronomy/physics folks who are tinkering with getting around the problem of diffraction by using entangled photons.

....and this is now way far beyond off topic.


:)
Bernice




Arne Croell
19-May-2014, 22:25
Bernice, the Leitz Aristomet already had infinity optics (new at the time, 1989), as opposed to the Aristoplan. At the time, the Zeiss competition was the Axioplan/Axiophot (also infinity corrected), and the Leitz was better for Nomarski Differential Interference Contrast, which is our main use. That changed with the Axio Imager.

Bernice Loui
19-May-2014, 22:54
Yes, by the time the Zeiss AXIO imager appeared, it had all the options covered.

For quite a while Leitz had little over Zeiss, and with them becoming owned by Danaher and now being Leica Microsystems, they are not the company they once were.

The Aristoplan was intended for transmitted light, while the later Aristomet was intended for metallurgical / epi-illumination work with BF/DF/DIC etc. They share much in common with each other, and some of the bits interchange with the Ergolux, which is even more specialized.

Still have the Leitz brochure somewhere from back in the day.

Even within this group of infinity optics, Leitz offered NPL and Fluotar, which were "semi-APO". As far as I'm aware, Leitz never made an APO version of these BF/DF objectives in RMS thread; APO came only in the 30mm thread after they gave up on the RMS threaded objectives.

For some reason Leitz made a number of BF/DF objectives, including Wahlstrom prisms, with RMS threads. In time they moved on and went to a 30mm threaded objective for BF/DF and offered them in APO.

A bit of microscope history here..


Bernice




Struan Gray
19-May-2014, 23:49
I think we are all vigorously agreeing with each other :-)

I meant it when I said I had the greatest respect for the instrumentation of the past. I have a soft spot for early C20th electric equipment, which often had to be highly precise in both its electric and mechanical properties. My favourite room in the London Science Museum (now there's an institution which is but a shadow of its former self) is the one full of mechanical computers. I itch to turn the handle on Kelvin's tide analyser.

I have a 100 year old Zeiss wide field refractor telescope. It is not as comfortable or convenient to use as a modern scope on a robotic platform, but there is the thrill of physically interacting with the device. With many modern telescopes you might as well be calling up images from a star catalogue.

I think this is part of the attraction of large format photography in general, and older lenses in particular, so I don't think we are that far off topic. The great lens designers understood aberration control better than anyone in this thread, even if they didn't have cheap calculation tools able to handle higher order wavefront errors. They also had craftsmen who could make the most amazingly precise equipment, and they had the understanding needed to calibrate and use it.

But not everything was better, even in the purely technical sense.

FWIW, my field was scanning tunnelling microscopy, so all this optical stuff seems very coarse grained :-)

Emmanuel BIGLER
20-May-2014, 10:47
From Struan G.
FWIW, my field was scanning tunnelling microscopy, so all this optical stuff seems very coarse grained :)

.. this is because you never experienced scanning optical near-field microscopy (http://link.springer.com/chapter/10.1007/978-94-011-1978-8_12), another specialty of Besançon, an optical imaging technique which easily breaks the infamous "pc=N.lambda" barrier ;)

---

From Paul R.
Here's that Velvia curve...that contrast bump in the low frequencies is ginormous.


Many Thanks, Paul ! This does make sense, Velvia's bump is ... whatever you call it ;)

Provia 100F also exhibits an MTF curve definitely above 100% up to 20 cy/mm; see page 6 of the official data sheet. However, with Provia 100F this does not look like a bump, but like a flat ;-)
http://www.fujifilm.com/products/professional_films/pdf/provia_100f_datasheet.pdf

Struan Gray
20-May-2014, 13:33
you never experienced scanning optical near-field microscopy (http://link.springer.com/chapter/10.1007/978-94-011-1978-8_12), another specialty of Besançon, an optical imaging technique which easily breaks the infamous "pc=N.lambda" barrier ;)

Never say never. I've done a little bit of pure SNOM, and quite a lot of near-field plasmon stuff. I invented a variant using low workfunction surfaces to do near-field photoemission spectroscopy. All quite useless, of course.

sanking
20-May-2014, 15:02
I just mean developers that encourage edge effects (pyro, etc.) or development techniques that do the same (stand development). I don't mean a developer with provocative tattoos.

How does one test the MTF of a film/developer combination? The width of typical edge effects is usually too narrow to be discriminated by scanners with a real effective resolution of less than 3000-4000 dpi.

Sandy

jb7
20-May-2014, 17:35
Wouldn't edge effects diminish resolution? While increasing sharpness? Surely a sharpened image has less resolution, if density is being built up around an edge?

paulr
20-May-2014, 18:27
Wouldn't edge effects diminish resolution? While increasing sharpness? Surely a sharpened image has less resolution, if density is being built up around an edge?

Yes. More specifically, you're increasing contrast (modulation) at a certain frequency, at the expense of contrast at higher frequencies. This is one reason it's so important to know the final print size and approximate viewing distance before doing any sharpening.

To make a print that looks as sharp as possible at a 10" viewing distance, you want a sharpening radius that's about 0.1mm. This corresponds to 5lp/mm. You will be sacrificing the contrast of detail at higher frequencies, but it doesn't matter. Detail 10 lp/mm and higher is basically irrelevant to our subjective sense of image quality. People don't even miss it if it's gone.

Detail at slightly lower frequencies matters. We get big cues about image quality from detail up to the 1 lp/mm range. But with good optics and a good workflow, that usually takes care of itself.

Sandy, scanners are plenty good enough to pick up on the edge effects. If the scanner couldn't see it, we sure couldn't. Notice on the Velvia MTF charts that the edge effects peak at 10lp/mm. This corresponds to a .05 mm radius. Or around 500 LPI. Probably the more subtle effects people get with BW film have a smaller radius, but I doubt 4X smaller.
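In rough numbers (a minimal sketch of the arithmetic above; the 30" figure below is just an illustrative viewing distance, not a recommendation):

def radius_to_lp_per_mm(radius_mm):
    # one "line" is roughly one radius wide, so a line pair is two radii
    return 1.0 / (2.0 * radius_mm)

def scale_radius_for_distance(radius_mm_at_10in, distance_in):
    # detail that subtends the same angle scales linearly with viewing distance,
    # so a print viewed from farther away tolerates a larger sharpening radius
    return radius_mm_at_10in * (distance_in / 10.0)

r10 = 0.1                                  # 0.1 mm radius for a 10" viewing distance
print(radius_to_lp_per_mm(r10))            # -> 5.0 lp/mm
r30 = scale_radius_for_distance(r10, 30.0)
print(r30, radius_to_lp_per_mm(r30))       # -> 0.3 mm, about 1.7 lp/mm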

Bernice Loui
20-May-2014, 21:08
"I think we are all vigorously agreeing with each other :-)"

Very much indeed. This is not a debate over who has the better microscope or any other such battle; it is a sharing of knowledge and experience with these speciality optical instruments. State-of-the-art optical microscopes like the Zeiss AXIO and many others would not exist without all those highly skilled, talented and knowledgeable individuals who designed them and the highly skilled craftsfolk who built and produced them. The best current optical instruments are built on a sound scientific foundation and are the culmination of many decades of hard and creative work by all involved. Yet what is possible is limited by our current understanding of optics, light and their interactions to form a viewable image.

For those who have never seen a Zeiss AXIO Imager, here is the link. Note the number of options and possible configurations for specific needs.
https://www.micro-shop.zeiss.com/index.php?s=100519235dfedea&l=en&p=es&f=e&i=10217&o=&h=25&n=0&si=490016-0002-000#490016-0002-000

Prices can be eyebrow-lifting...


Here is an image of the Leitz Aristomet. I would like to have one for the collection:
http://www.krist.uni-freiburg.de/Forschung/Einrichtungen/GeraeteBilder/Einric3_g.jpg

The Leitz Aristoplan looks quite similar except for the transmitted light system at the lower part of the microscope stand.

Microscopic imaging needs can be met with research grade microscopes that are less than state of the art, whether current or from the past. It really depends on the specific needs and applications. Knowing what is available and how to get the very most out of any given optical instrument is key to achieving useful information, data and results.

One of my hobbies is collecting high quality microscopes from Zeiss, Leitz and Wild Heerbrugg. They do get used for various things, including inspecting the image quality of view camera optics as produced on film, learning more about how semiconductors have failed in more than a few electronic devices I have created, and viewing metals and other materials that are used to make stuff and what happened when they failed. The stereo microscopes allow working on small stuff that is otherwise not possible with the naked eye. By day, I'm involved with designing techno stuff (all that analog electronics stuff) that is also a blend of science, art and technology.

Another reason for collecting this optical stuff is to preserve some of technology's history. Many folks who work in engineering, science and technology tend to discard the past and adopt the most recent techno stuff as quickly as possible.. even when more than a few of these technology masterworks remain viable for a given task and are in some cases better than the devices that replaced them. It also helps to remind those interested that the current generation of techno devices has a history and there is much to be learned from how these folks solved technical problems and difficulties in their time. It is a story and history of how to solve technical problems with a given set of tools, materials and abilities from that time.. and more than a few of those solutions are extremely creative and clever.

Which brings us to my affection for vintage optics vs. state-of-the-art optics for view cameras. From my point of view, lens designers from the past understood optical design well. This understanding went beyond correcting for nth-order aberrations and the inherent optical problems of a given lens; they artfully used it, along with highly skilled optics craftsfolk, to create lenses that produce pleasing images on film, or what I term lens personality. Great view camera lenses are an interesting blend of science, technology, highly skilled craft and art, in the same way that expressive images are created.

One can easily be caught up in MTF curves, the physics, the math, the testing and all else involved in this view camera image making stuff, but once one looks beyond all this at what one needs as creative image making tools, none of the previous techno stuff matters; it is merely a means to an end for creating expressive images. Understanding and learning their strengths, problems, weaknesses, and care & feeding is more of what matters to achieve fruitful results.

For those who might be interested, here is an electron microscope video with increasing magnification of concrete:

https://www.youtube.com/watch?v=UUcQSw7oO0k



:)
Bernice





Struan Gray
21-May-2014, 00:42
Bernice, that would be a fun collection to have a play with.

I saw your pictures in the 5x7 thread. I shared an office with one of the people responsible for renovating and restoring the Alexanderson Alternator (http://en.wikipedia.org/wiki/File:Alexanderson_Alternator.jpg) at Grimeton. A mechanical radio wave generator - still used when the Swedish Navy are prepared to free up the antenna for frivolous transmissions.

Paul: you probably know this, but the effect you are describing is what makes the Einstein/Monroe hybrid image (http://en.wikipedia.org/wiki/Hybrid_image) work. I've done similar double portraits of our twins. I don't know of anyone using the effect for art purposes, but I'm sure it's a factor (beyond the mere revelation of more detail) in many large scale abstract and figurative works.

Arne Croell
21-May-2014, 08:24
"I think we are all vigorously agreeing with each other :-)"

Here is an image of the Leitz Aristomet. I would like to have one for the collection:
http://www.krist.uni-freiburg.de/Forschung/Einrichtungen/GeraeteBilder/Einric3_g.jpg

The Leitz Aristoplan looks quite similar except for the transmitted light system at the lower part of the microscope stand.

Bernice
Struan is right, of course, we are in full agreement, it is just fun to talk about and appreciate these wonderful instruments. I actually use the Aristomet and some of my scientific samples for my personal photography from time to time: http://www.arnecroell.com/p457195038

But sorry Bernice, you can't have our Aristomet. We still use it regularly, besides having the Axio Imager sitting next to it. Actually, a few months ago there was a full Aristomet setup on the German ebay for €8000. It even had a 4x5" attachment for the Variophot photo unit. I was quite tempted at the price, but since I have access to the one here, I abstained. Btw, in that linked image the transmitted light system is missing a part of the condenser system; it is sitting in its box in the drawer as a cautionary measure. We use reflected light (DIC) 95% of the time, and when the Ph.D. students have thicker samples and move the stage down, they sometimes forget that the condenser parts can hit the polarization filter unit below (Arrrgh...).

In terms of first rate collectable microscopes of the 1950's- 1980's, Reichert is also worth mentioning. My personal dream microscope for collecting would be a Zeiss Ultraphot III though...

Bruce Watson
21-May-2014, 13:28
It seems that there's a reciprocal relationship between resolution and contrast. Is this due to the wave nature of light - or to the Uncertainty principle?

You might be asking the wrong question.

Without contrast, there's nothing to resolve. Think about a solid white (255, 255, 255) surface. What's to resolve? OK, now think about that same surface, but put a spot in it somewhere that's not quite white (say, 255, 255, 254). If that spot is big enough in your field of view, you can just barely see it -- resolve it if you will. But as you back away, the white "background" overwhelms it, and you can't see it anymore. There's a limit of how much of your field of view it has to take up before you can see it.

With a scanner, it's relatively easy to define when that point occurs. It occurs when that spot on the white background takes up enough of the scanner's field of view that a pixel in the scanner "averages" that 255, 255, 254 color. So that it's recorded as 255, 255, 254, and not 255, 255, 255. Yes?

Clearly, getting the scanner pixel to record something other than 255, 255, 255 will be easier as the spot you are trying to resolve moves farther away from the values of the background. So, 255, 255, 253 will be easier to resolve than 255, 255, 254. And 0, 0, 0 will be way easier to resolve. Clearly, more contrast makes resolution easier.
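A toy version of that averaging, just to put numbers on it (the plain box-average model here is a simplification of what a real scanner actually does):

def pixel_value(background, spot, fraction_covered):
    # a pixel reports the average of everything in its footprint;
    # fraction_covered is how much of that footprint the spot occupies (0..1)
    return round(background * (1 - fraction_covered) + spot * fraction_covered)

for spot in (254, 253, 0):              # progressively more contrast against 255
    for frac in (0.1, 0.5, 1.0):
        print(spot, frac, pixel_value(255, spot, frac))

# The barely-off-white spot (254) still reads as 255 at low coverage, while the
# black spot (0) pulls the pixel away from 255 even at 10% coverage.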

Do you really need to know the name of the mathematical theories to understand the concept? No. But if you really want to know the names, I'm not your guy. Sorry. ;)

Ken Lee
21-May-2014, 14:33
The MTF charts I've seen show that (in general) as the target gets smaller (demanding greater resolution from the lens), the contrast of the image delivered by the lens decreases.

That's the reciprocal relationship to which I referred.

I'm an amateur and appreciate all the help available here.

paulr
21-May-2014, 16:46
Paul: you probably know this, but the effect you are describing is what makes the Einstein/Monroe hybrid image (http://en.wikipedia.org/wiki/Hybrid_image) work. I've done similar double portraits of our twins. I don't know of anyone using the effect for art purposes, but I'm sure it's a factor (beyond the mere revelation of more detail) in many large scale abstract and figurative works.

I don't know it ... I don't even know of it (had to check if there was another paul here ...)

Thanks for the link ...

paulr
21-May-2014, 19:00
The MTF charts I've seen show that (in general) as the target gets smaller (demanding greater resolution from the lens), the contrast of the image delivered by the lens decreases.

That's the reciprocal relationship to which I referred.

I'm an amateur and appreciate all the help available here.

This is a non-technical answer, but I think the simplest explanation is that all the natural forces that limit resolution do so by reducing contrast. Optical aberrations, focus errors, and diffraction all introduce blur functions. The higher the spatial frequency of the detail, the more it's degraded.
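A quick numeric sketch of the same idea, using a Gaussian blur as a stand-in for the combined aberration/defocus/diffraction blur (the 0.02 mm blur width is just an illustrative value):

import math

def gaussian_mtf(f_lp_per_mm, sigma_mm):
    # contrast transfer of a Gaussian blur of width sigma: exp(-2*(pi*sigma*f)^2)
    return math.exp(-2 * (math.pi * sigma_mm * f_lp_per_mm) ** 2)

sigma = 0.02   # illustrative blur width in mm
for f in (1, 5, 10, 20, 40):
    print(f, round(gaussian_mtf(f, sigma), 3))

# 1 lp/mm keeps ~99% of its contrast, 10 lp/mm keeps ~45%, and 40 lp/mm is
# essentially gone - the same blur hurts high frequencies far more than low ones.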

Bernice Loui
21-May-2014, 22:08
Fun indeed. Microscopes are really a gateway into another world, in similar ways to telescopes and other technology devices that extend human vision. It is an adventure waiting to be discovered.

The Alexanderson Alternator was one of the first attempts at generating high-power wireless transmissions, around the early 1900s. These were basically alternators with lots of poles on the rotor, run at high speeds to generate high frequency energy. An analogy would be a single-tone (frequency) bell that is constantly struck by a rotary hammer at the same rate as the bell's tone, with the sound of the bell as its output.
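In round numbers, it is the ordinary synchronous-alternator relation; the pole count and rpm below are purely illustrative, not the actual Grimeton machine's figures:

def alternator_frequency_hz(poles, rpm):
    # electrical frequency of a multi-pole alternator: f = (poles / 2) * (rpm / 60)
    return (poles / 2) * (rpm / 60)

print(alternator_frequency_hz(4, 1800))      # ordinary power alternator: 60 Hz
print(alternator_frequency_hz(1000, 2000))   # illustrative VLF machine: ~16.7 kHz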

No small task to keep one of these devices operating. Still it is very much worth the effort as these devices are bits of wireless history.

The rest of those images from the Point Reyes radio station can be found at the link below. These images are all of the film used that day.
http://www.meetup.com/SFBayAreaLFers/photos/all_photos/?photoAlbumId=20282972

A curious thing did happen during that visit. The curators and volunteers of that historic facility figured out I knew something about the technology and contents of the place. They began to share many, many interesting stories about the goings-on with the transmitters and other technological bits at this particular facility. It became a tour within a tour, including places in the facility that are not often open to the general public. In turn, I shared the how and why of what this stuff does and why the innards of these things were designed the way they were, both good and bad. One of the last things that happened: they had a Fluke frequency synthesizer on the bench being repaired and were having difficulty with the phase detector. I spent a moment looking at the service manual schematic and explained to the guy working on it how the phase detector worked, why it could be misbehaving, what to check for, and other key items that must be in place for it to function properly, which they greatly appreciated.

-They really wanted me to volunteer and spend time helping them keep this facility up and running and help with the restoration.

Another interesting moment in the journey of life :)



Bernice




Bernice Loui
21-May-2014, 22:37
Arne, those microscope images are beautiful. Another wonderful example of the beauty found in nature.

What happens to those who run into the lower polarizer unit (OUCH)? Do they suffer the same fate as running the objective into the specimen, or worse (REALLY OUCH)?

There appear to be more Aristomet systems offered on ebay Germany than on ebay US. The last time one of these Aristomet systems appeared on ebay Germany, it was a very complete setup including the very desirable Variophot with 4x5, ergo head, four sets of objectives, full DIC and polarized light setup, and more.. I kept pondering how to get the whole thing shipped to the US damage free and what the shipping cost might be.. Maybe one day...

The Zeiss Ultraphot is a classic in many ways. It is extremely versatile and remains one of the great microscopes made to date. Zeiss produced these for a long time.
For those who have never seen a Zeiss Ultraphot here is just one image of many on the web. Note the size and heft of this microscope. Worth noting, the Zeiss Luminars were part of the objective offerings for the Ultraphot.

http://www.tianjixing.com/tbbs/UploadFile/2012-1/201211011292853765.jpg


To Arne, who has impeccable taste in collectable microscopes..

Bernice




Struan Gray
22-May-2014, 12:25
The rest of those images from the Point Reyes radio station can be found at the link below. These images are all of the film used that day.
http://www.meetup.com/SFBayAreaLFers/photos/all_photos/?photoAlbumId=20282972

What a great place for a LF excursion. I'm not surprised that array of globe-like valves worked like catnip :-)



-They really wanted me to volunteer and spend time helping them keep this facility up and running and with the restoration.

Nice to be appreciated. A bit like with attempts to resurrect Autochrome, there's a difference between understanding the principles of operation, and having the deep practical knowledge needed to actually do something.

Drew Wiley
22-May-2014, 12:48
Yes... this certainly has been a pleasant thread, even if a slight detour from the original topic. And it's nice to become aware of the range of micro expertise available
on this forum, just in case I try something nutty like modifying a scope for 8x10 film.

Arne Croell
23-May-2014, 06:31
Yes... this certainly has been a pleasant thread, even if a slight detour from the original topic. And it's nice to become aware of the range of micro expertise available
on this forum, just in case I try something nutty like modifying a scope for 8x10 film.

Drew, the only reason to use 8x10" on a microscope would be if you want to get the "je ne sais quois" of contact prints, and that is certainly a valid reason.

Otherwise, a larger film format does not buy you anything on a microscope due to diffraction (with a conventional microscope - not Emmanuel's near-field microscope which makes an end run around diffraction, or Struan's STM/AFM instruments, but those images use scanning and can only be viewed on a screen). The influence of diffraction is in my opinion overrated in regular photography, but in microscopy it's an everyday fact of life, easily visible. There is a reason why there were no 8x10 options available on microscopes, at least in the last 80 years.

Here is a little off the cuff calculation: The 20x objective on our Aristomet has a numerical aperture of NA=0.45, quite typical, and would show an object field of approximately 0.75mm x 0.9375mm for a 4x5 aspect ratio of the image. Provided you could find a photo eyepiece that gives you a good 8x10" image right away, that is an overall magnification on the negative of about 250x. The normal criterion for a "sharp" contact print is 5 lp/mm, i.e. 200µm for the line pair width. Divide by 250, and the resolution in the object plane needs a width of 800nm for the line pair, or 400nm for the line.

The microscope resolution based on the (slightly simplified) Rayleigh criterion is R=lambda/2NA (if the condenser aperture and objective aperture are the same, which is the case for reflected light), with lambda the wavelength of light. If we assume lambda=550nm (green) and the aperture above, you end up with 610nm, already 50% above your allowable line width. One can get a bit better by using blue light and/or an oil immersion objective (larger effective numerical aperture), but the basic gist is that an 8x10 print is usually the end of the line, if not already too much. My personal microscope image prints linked in the earlier post are usually around 5x7" for that reason.

Using objectives with smaller magnifications helps a bit, but not as much as one thinks, because the numerical aperture typically goes down with decreasing power. Using a 10x objective would only be a 125x enlargement, of course, so your allowable line width would be 800nm instead of 400nm, but the NA of our 10x lens is 0.25, so the resolution is then 1100nm.
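The same figures in a few lines of arithmetic (the NA values, field size, ~250x magnification, and the 5 lp/mm print criterion are taken from the paragraph above):

def required_line_width_nm(print_lp_per_mm, magnification):
    # width of a single line in the object plane (nm) needed for a "sharp" print
    line_pair_um = 1000.0 / print_lp_per_mm          # 5 lp/mm -> 200 µm per line pair
    return line_pair_um / magnification * 1000 / 2   # single line, in nm

def rayleigh_resolution_nm(wavelength_nm, na):
    # simplified Rayleigh criterion, R = lambda / (2 NA)
    return wavelength_nm / (2 * na)

# 20x objective, NA 0.45, about 250x on an 8x10" print
print(required_line_width_nm(5, 250))      # -> 400 nm allowable line width
print(rayleigh_resolution_nm(550, 0.45))   # -> ~611 nm: already too coarse

# 10x objective, NA 0.25, about 125x enlargement
print(required_line_width_nm(5, 125))      # -> 800 nm allowable line width
print(rayleigh_resolution_nm(550, 0.25))   # -> 1100 nm: still too coarse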

When we got the Leitz Aristomet new, it came with a 35mm camera attachment for the Variophot unit as well as a Polaroid pack film attachment. In the early 1990's I did a comparison of prints from 35mm fine grained film such as Agfa APX 25 vs. Polaroid 665 P/N (essentially type 55 in pack format), and there was no visible resolution or other advantage of the Polaroid negatives vs. the 35mm ones in prints of about 5x7" size - and larger prints are just not sharp due to diffraction. Again, the tonality of contact printing is another story, in this comparison both negatives were enlarged.

And we have not yet talked about film flatness issues - most camera attachments on microscopes look straight down...

Struan Gray
23-May-2014, 06:59
I was looking for a chance to drop my 1920s Zeiss camera lucida into the conversation :-)

Allows reproductions onto large paper sizes (limited by arm length - your arm length) and works well with watercolour and other handmade art papers.

Simply bursting with indexicality.

Arne Croell
23-May-2014, 07:49
I was looking for a chance to drop my 1920s Zeiss camera lucida into the conversation :-)

Allows reproductions onto large paper sizes (limited by arm length - your arm length) and works well with watercolour and other handmade art papers.

Simply bursting with indexicality.

Yes, of course! ;-) I actually had to do drawings of mineral thin sections when I had my first microscopy courses - one eye looking into the (single) eyepiece, one looking at the paper. I wasn't very good at it...