
Thread: There must be a limit somewhere

  1. #21
    Stefan
    Join Date
    Apr 2010
    Posts
    463

    Re: There must be a limit somewhere

    Quote Originally Posted by JeffKohn View Post
    Oversampling is not a bad thing, or even pointless. Nyquist theorem says you want 2x over-sampling, so for an 80lp/mm lens you would want a 160lp/mm sensor with no AA filter to maximize resolution without artifacts. Actually with Bayer-filtered sensor I guess there would even be some potential benefit from going higher than 2x, because each sensel is R, G, or B and therefore not a "full" sample.
    Some of the sharpest lenses for medium format digital have over 65% modulation at 80lp/mm, so in practice they will have very usable detail above this frequency. With a bit of sharpening, 10-20% modulation is still valuable for digital photographic use, and I'm sure the lenses with 65% modulation at 80lp/mm could at least do 120lp/mm at something like 20%.

    Additionally, oversampling is needed or you will lose detail to aliasing. The Nyquist theorem states twice the signal frequency, but that is only the theoretical minimum needed to reconstruct the signal; in practice you want more (try looking at a 10 Hz sine sampled at 20.1 Hz).
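    A quick sketch of why sampling just above Nyquist is not enough in practice (the 10 Hz / 20.1 Hz figures come from the paragraph above; the 10-second window is my own choice for illustration):

    ```python
    import math

    f_signal = 10.0   # Hz: the sine we want to capture
    f_sample = 20.1   # Hz: barely above the Nyquist rate of 20 Hz

    n_samples = 201   # about 10 seconds of samples
    samples = [math.sin(2 * math.pi * f_signal * n / f_sample) for n in range(n_samples)]

    # The true sine has constant unit amplitude, but the samples trace a slow
    # beat envelope at f_sample/2 - f_signal = 0.05 Hz: for long stretches they
    # land near the sine's zero crossings and the recorded amplitude collapses.
    early = max(abs(s) for s in samples[:40])
    overall = max(abs(s) for s in samples)
    print(f"peak over first 2 s: {early:.2f}, peak over 10 s: {overall:.2f}")
    ```

    The first two seconds of samples never exceed about 0.57, even though the signal itself never drops below unit amplitude; only over the full beat period does the sampled peak recover to 1.00.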

    300lp/mm on the sensor would probably be good for really sharp lenses, on sensors with the Bayer filter removed (black and white), but only for detail along the X and Y axes. For the same detail on the diagonal you'll need over 400lp/mm along the X/Y axes.

    Then add the Bayer filter, and you will need to double this again, otherwise you will lose detail, mostly on saturated red/blue subjects. So something like 800lp/mm is desirable to see what the lenses have to give. That is 1600 sensels per millimeter (two to a black/white pair). At the 54x40 mm sensor size used in the latest 80-megapixel backs, that would be a resolution of 86400x64000, or about 5500 megapixels.
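    The arithmetic behind that figure (all numbers taken from the paragraph above) checks out:

    ```python
    # Back-of-the-envelope check of the sensel count: 800 lp/mm target on a
    # 54x40 mm sensor, two sensels (one line, one space) per line pair.
    lp_per_mm = 800
    sensels_per_mm = 2 * lp_per_mm   # 1600 sensels per millimetre
    width_mm, height_mm = 54, 40

    w = width_mm * sensels_per_mm    # sensels across the width
    h = height_mm * sensels_per_mm   # sensels across the height
    print(w, h, w * h / 1e6)         # 86400 64000 5529.6 (about 5500 MP)
    ```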

    Of course, in practice not everyone uses the best lenses (or the apertures and precise focusing needed), and few would notice if faint diagonal detail in saturated reds was lacking, but I don't think the megapixel race is going to slow down until we reach about 1000 MP on medium format and 200-300 MP on full-frame sensors (no AA).

    This is not a bad thing: in the high-end segment, pixel density increases have always gone hand in hand with improvements in dynamic range and real detail. Sure, some people like to extrapolate from noticing that a 14-megapixel compact might be worse than an 8-megapixel one, but for high-end low-ISO work, dense is good.

  2. #22

    Join Date
    Jan 2002
    Location
    Besançon, France
    Posts
    1,617

    Re: There must be a limit somewhere

    Hello all!
    I realize that I did not follow this discussion after my post dated June 1, and I apologize for not answering earlier.
    To Nathan:
    I'm not clear on the point of quantum efficiency of film vs sensor.

    Well, I'm quoting a value of 0.5% for Tri-X from memory, read in a classical book on photographic chemistry and physics by Pierre Glafkidès, a former Kodak scientist in France.

    Of course film does not behave like a counter, and what you can define, with great difficulty and complexity, is an equivalent quantum efficiency. The fact that there is a chemical amplification of course degrades the noise figure, like in any electronic amplification stage. I am unable to explain in detail how this equivalent efficiency is measured, but I know that it is meaningful in astrophysics at very low levels of illumination.

    Tim Povlick has mentioned hypersensitized film. I know the procedure was used in astrophysics, but the improved sensitivity did not last very long; you had to expose your film immediately. Tim can explain this better than me.

    There was an imaging device developed by a French astrophysicist named Lallemand, where you detected an electron image with film in a vacuum electron-beam camera. The optical image was converted into electrons by a photocathode, and the electronic image was detected by a film that had to be inserted inside the vacuum chamber of the camera. I do not remember whether there was an electron-multiplying stage like in an image intensifier, but such a multiplier can never improve the quantum efficiency.
    Of course, this strange mixed photoelectric-electronic-film camera delivered monochrome images.
    The quantum efficiency of the whole system was much better than direct optical detection on film. I have in mind a figure of 40% efficiency for the photocathode, and as an electron detector the film in use was good enough that the combination of the two devices exceeded film alone directly exposed to low light levels.

    For many years, outside imaging devices, the photomultiplier was the only detector capable of counting photons one by one. But as far as I know, the efficiency of photocathodes is limited to the range of 40%, so as soon as new silicon detectors reached and even exceeded this figure, the supremacy of the photomultiplier was challenged.
    What we are seeing now is that all those technologies, previously limited to military or scientific use, reach us at an incredible price, and instead of a single photon detector we get an array of 80 million ... all of them with an incredibly high quantum efficiency.

    For the kind of photography we are interested in, the real problem, as mentioned, is the cost of large silicon image sensors above the 4.5x6 cm size. Bigger detectors have been fabricated for astrophysics, so it can be done ... too bad that the economic factor limits our enthusiasm.

    As far as I'm concerned, for the kind of hand-crafted images I like to make, film and the large format camera are the ideal choice, since I have no client to satisfy, no delivery schedules, and so on ...
    I can wait 1/30 s for my shutter to open and close. And now that I know that film is wasting so many good photons that I could efficiently capture with silicon, I do not really care, and I easily forgive my silver-halide layer; for the kind of landscape or architecture shots I like with the view camera, I get no benefit from 1/8000 s for the same shot!
    Nobody would object that van Gogh painted his famous 'Sunflowers' much more slowly than 1/30 s.

    But, kidding aside, I agree with Steve McLevie: my favourite camera would be a 4x5" with a full-size 4x5" silicon sensor, where the pitch of the individual sensing elements would exactly match the diffraction limit, say @ f/11, somewhat like current top-notch 'film' view camera lenses. Down with anti-aliasing filters!!

    Being optimistic, we could believe that the diffraction limit would be reached over the whole field, and classical textbooks tell us that the limit period in the optical-analogue image is about 11 wavelengths @ f/11. If we take 0.7 microns as the worst case for visible light, we eventually get 8 microns, i.e. 125 cy/mm, for this ultimate image quality. I doubt that even the best 150mm lenses covering 75 degrees could pass 125 cy/mm for a retail price of $999.99.
    But the situation would be clear and simple. The pixel pitch would be 4 microns (= 8/2), exactly matching the sampling theorem for a diffraction-limited image @ f/11, and the total number of pixels would be about
    100,000/4 by 120,000/4 = 25,000 by 30,000 = 750 million, about 10 times more than we have today in current top-class medium format sensors. Shall we see it some day?
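    Redoing that arithmetic as a sketch (all figures from the paragraphs above; the 100x120 mm frame is the usable 4x5" area assumed there):

    ```python
    # Diffraction-limited sensor arithmetic: limit period ~ 11 wavelengths at
    # f/11, worst-case visible wavelength 0.7 um, frame taken as 100 x 120 mm.
    wavelength_um = 0.7
    f_number = 11
    period_um = wavelength_um * f_number   # ~7.7 um, rounded up to 8 um above

    cutoff_cy_mm = 1000 / 8                # 125 cycles per millimetre
    pitch_um = 8 / 2                       # sampling theorem: two pixels per period

    pixels_w = 100_000 / pitch_um          # 25,000 pixels across 100 mm
    pixels_h = 120_000 / pitch_um          # 30,000 pixels across 120 mm
    print(cutoff_cy_mm, pixels_w * pixels_h / 1e6)   # 125.0 and 750.0 megapixels
    ```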
    I remember colleagues proudly showing me the images they got with a first-generation consumer digital camera @ 500 kilopixels...

    In fact the maths involved here are much, much simpler than all I have read about the theory of the silver halide process!
    This is one of the reasons why we love film: nobody can explain in a few words what's going on in our tiny silver grains. It's magic; the engineers cannot understand it.

  3. #23
    JoeV's Avatar
    Join Date
    Oct 2006
    Location
    Albuquerque, NM, USA
    Posts
    242

    Re: There must be a limit somewhere

    Technically, Moore's Law (or, more accurately, Moore's Observation) only relates to digital photography in the point-and-shoot arena, where the sensor isn't fixed to a specific physical size, and thus transistor density can continue to increase, doubling every 18 months or so according to the rule.

    The thing with any sensor defined by a specific format size (like micro 4/3, APS-C and its variants, FF, medium format, etc.) is that the economics of semiconductor manufacture can't scale over time to reduce costs, as they do with circuits whose physical size isn't constrained to a predefined format. Thus, Moore's Law doesn't apply in such cases.

    Moore's Law is all about the economics of semiconductor manufacture: the cost to process silicon wafers is fixed, as is the "real estate" size of said wafers. The goal is thus to maximize the revenue per unit area of each "chip" (or die) within the wafer by shrinking the transistor. Increasing transistor density not only yields more transistors per unit wafer area (and higher profit per wafer) but has the secondary effect of producing faster circuits (charges have shorter distances to travel and there is less heat loss). Newer chips can therefore be sold at higher prices, or their improved performance can offset the inevitable decline in price over time through commodification. This is how Moore's Law plays out in actual fact.

    Example: compare an early Pentium processor with a current Atom processor. The early Pentium had a die about the size of a US postage stamp and was originally processed on 6-inch silicon wafers; only several dozen chips were usable at end of line, and each one sold for over a thousand US dollars. The current Atom processor is the size of a grain of long-grain rice and is processed on 300mm wafers (you can fit 2500-3000 such Atoms on a single 300mm wafer); although each sells for $10-20, the economics of scale imply higher profits for the smaller processor sold at the cheaper price. Also, because the transistor density is higher with the Atom, the circuit operates more efficiently: charges move faster and create less heat loss.
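    A crude sketch of that dies-per-wafer comparison. The die areas below are my own rough guesses chosen to illustrate the point (roughly 300 mm² for the early Pentium, 25 mm² for a small Atom-class die), not real specifications:

    ```python
    import math

    def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
        """Crude estimate: wafer area / die area, ignoring edge loss and scribe lines."""
        wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
        return int(wafer_area / die_area_mm2)

    # Early Pentium: ~300 mm^2 die (assumed) on a 150 mm (6-inch) wafer.
    print(dies_per_wafer(150, 300))   # a few dozen candidate dies

    # Small Atom-class die: ~25 mm^2 (assumed) on a 300 mm wafer.
    print(dies_per_wafer(300, 25))    # thousands of candidate dies
    ```

    Even this naive estimate reproduces the orders of magnitude in the post: dozens of large dies per small wafer versus well over two thousand small dies per large wafer.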

    The benefits of pursuing increased transistor density, as per Moore's Law with logic chips, don't apply to image sensors whose overall size is predefined by a format specification and whose individual photodiode size is constrained in performance by physics. The cost to operate a chip fab is divided into each square millimeter of usable silicon wafer area, which then defines the necessary break-even point between die size and retail price. Hence prices can only decline, and profits improve over time, through manufacturing efficiencies like keeping fabs running at full volume and amortizing equipment costs. Such cost improvements are minuscule compared to the benefits of shrinking transistor size over time, which you can do with logic chips but not with image sensors fixed to specific sizes.

    Because size-specific image sensors are exempt from Moore's Law, cost improvements will only come from reducing the cost of manufacturing the rest of the camera: replacing mechanical mirror optics with live-view displays, reducing mechanical controls in favour of LCD touch-screen controls, replacing the last stage of a lens design with software correction in firmware. All of these cost reductions are already in effect in newer camera formats like micro 4/3. That's where the future of large-sensor photography is headed.

    ~Joe
    The photograph and the thing being photographed are not the same thing.

  4. #24

    Join Date
    Jul 2007
    Location
    Austin TX
    Posts
    2,049

    Re: There must be a limit somewhere

    Emmanuel, thanks for your response, and I like your comment about silver multiplication in film defying easy physical characterization - part of the charm and intrigue of the silver halide process, I think.

    JoeV, excellent discussion of the limitations of using Moore's Law for scaling. The opportunity for large-area sensors might be realized by using flat-panel manufacturing technology, where the substrate size is tens of square feet per panel and the manufacturing cost is as low as $200 per square foot. I've not looked at this rigorously, and of course the performance of the photoreceptor (thin-film transistor) would need to be analyzed; the lithography may also not be adequate for the pixel size needed. The issue of projected volume looms large unless ancillary applications can be found and made to run in the same facility; X-ray sensors come to mind, among other possibilities.

    Nate Potter, Austin TX.

  5. #25

    Join Date
    Sep 1998
    Location
    Loganville , GA
    Posts
    14,410

    Re: There must be a limit somewhere

    "A Rodenstock HR-Digaron-S 28mm lens has a flange focal distance of 53mm, considerably more than its focal length. It also shows the typical retrofocus problems, distortion and performance well below the sharpest lenses for the format.

    A Rodenstock Grandagon-N 90mm has a flange focal distance of 94-98mm (f/6.8, f/4.5), nearly the same as the focal length. Less retrofocus, less distortion, better performance relative to other lenses for the format."

    Looking at the latest Rodenstock literature for the HR Digaron series and the Apo Grandagon/Grandagon series I find your statement a bit confusing.

    The distortion graphs for the 28mm HR Digaron-S show distortion between roughly -0.6% and +0.6% (depending on the reproduction scale and the coverage).

    The distortion graphs for the 90mm f/4.5 Grandagon-N show distortion between 0 and +1.2%, again depending on image ratio and coverage.

    The fall-off in illumination is lower with the 28mm, and the 28mm's longitudinal chromatic aberration is under 0.1%, compared to the 90's 0.3% at full coverage.

    If you were to compare the curves of the 90mm HR Digaron-W to the 90mm f/4.5 Grandagon-N, except for coverage the digital 90mm beats it by a long way for fall-off, distortion, and longitudinal chromatic aberration. Even more dramatic would be the curves for the 100mm f/4.0 HR Digaron-S vs the 90mm f/4.5 Grandagon-N.

    Would you like a set of the latest curves and specs?

  6. #26

    Join Date
    Aug 2009
    Posts
    1,176

    Re: There must be a limit somewhere

    Quote Originally Posted by Steve McLevie View Post
    I know my 2.4GHz PC is nearly 10 years old and the new ones are around 3.2GHz. Moore's Law is breaking down, is it not? So too, I think, in the digital photography realm.
    With computers, more performance was squeezed out by focusing on parallelism, bus speeds, memory speed, cache sizes and types, off-loading certain operations to separate processors, and so on.

    The relevant comparison here is that eventually megapixels might take a back seat to super-high-ISO performance, maybe something like variable exposures that account for highlights and shadows (a kind of intelligent HDR, or darkroom manipulation that looks natural), or any number of things.

    I think one thing that would actually kill film for almost everyone would be the ability to take an ISO 64,000 image that looks like ISO 100 slide film, even if it came from only a 12MP camera.
