
Thread: Tutorial: Illustrated Guide to B&W scan & processing

  1. #41

    Re: Tutorial: Illustrated Guide to B&W scan & processing

    Quote Originally Posted by Ed Richards View Post
    > Downsampling to reduce noise does average out, or perhaps a better term would be smooth out, the noise. But in doing so it also reduces sharpness.

    Only if you had 4800 real DPI that you were averaging. With these scanners you have about 1800 real DPI, so when you average the data down from 4800 to 2400 you are not losing detail, because there was no real 4800 DPI (or, really, 2400 DPI) detail there in the first place.
    Ed,

    You still lose detail, as the downsampling process is not perfect....the errors introduced in that process do impact detail and apparent sharpness.

    I would have had less of an issue with this whole process had he been working with 16 bit TIFF files. Once I saw that he was scanning in 8 bit and saving his master file as a JPEG, he lost all credibility with me.

  2. #42
    Abuser of God's Sunlight
    Join Date
    Aug 2004
    Location
    brooklyn, nyc
    Posts
    5,796

    Re: Tutorial: Illustrated Guide to B&W scan & processing

    Quote Originally Posted by David Luttmann View Post
    Ed,

    You still lose detail, as the downsampling process is not perfect....the errors introduced in that process do impact detail and apparent sharpness.

    I would have had less of an issue with this whole process had he been working with 16 bit TIFF files. Once I saw that he was scanning in 8 bit and saving his master file as a JPEG, he lost all credibility with me.
    I find that with an Epson-type scanner (one whose real optical resolution is around half its stated sampling frequency), I get small but occasionally noticeable reductions in noise from downsampling: scanning at 4800, then downsampling to 2400 (which is still slightly above the optical resolution under ideal circumstances). I don't see any reduction in sharpness from this method, and I wouldn't expect to, since it's really just averaging four oversampled pixels into one. Scanning directly at 2400 ppi is much more flawed--the scanner simply throws out every other pixel and every other scan line.
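
    A rough sketch of the difference in Python with numpy (the gradient image and noise level are made up for illustration): averaging 2x2 blocks of an oversampled scan cuts the grain roughly in half, while simply keeping every other pixel leaves it untouched.

    Code:
    import numpy as np

    def downsample_by_averaging(scan):
        """Average each 2x2 block -- roughly what resampling 4800 ppi down to 2400 ppi does."""
        h, w = scan.shape
        trimmed = scan[: h - h % 2, : w - w % 2]
        return trimmed.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    def downsample_by_decimation(scan):
        """Keep every other pixel and every other scan line -- what a 2400 ppi scan effectively does."""
        return scan[::2, ::2]

    # Synthetic test: a smooth gradient plus grain-like noise.
    rng = np.random.default_rng(0)
    clean = np.tile(np.linspace(0.0, 1.0, 400), (400, 1))
    noisy = clean + rng.normal(0.0, 0.05, clean.shape)

    print("noise left after averaging: ", (downsample_by_averaging(noisy) - downsample_by_averaging(clean)).std())
    print("noise left after decimation:", (downsample_by_decimation(noisy) - downsample_by_decimation(clean)).std())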

    I agree that there's no good reason to use jpeg compression. It might be that the highest jpeg setting is actually lossless, but even then its benefits would be no different from LZW's (and it adds the considerable disadvantage of 8 bit only encoding).
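
    For what it's worth, a minimal sketch of that choice using the Pillow library in Python (the file names are hypothetical): an LZW-compressed TIFF keeps every pixel value exactly and can hold 16 bit data, while even a maximum-quality JPEG re-encodes the image and is limited to 8 bits per channel.

    Code:
    from PIL import Image

    scan = Image.open("negative_scan.tif")   # hypothetical master scan

    # Lossless archive copy: TIFF with LZW compression.
    scan.save("master_archive.tif", compression="tiff_lzw")

    # Lossy copy: even quality=95 is a lossy re-encoding,
    # and JPEG tops out at 8 bits per channel.
    scan.convert("L").save("web_copy.jpg", quality=95)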

    On general principle it's foolish to throw out any bit depth early in the game, although there are in fact benefits to increasing the bit depth of a file before any image processing. It's not about creating information that isn't there; it's about making a file that's more resistant to degradation from the processing algorithms. You can demonstrate this yourself: take two copies of any 8 bit image file, convert one to 16 bits, then abuse both files by repeatedly increasing and decreasing the contrast, and look at the images (and the histograms) afterwards.
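
    A quick sketch of that experiment in Python with numpy (random values stand in for an image, and the contrast factor is arbitrary): both copies start from the same 8 bit data, one is abused in 8 bits and the other in 16.

    Code:
    import numpy as np

    def contrast(values, factor, mid, lo, hi, dtype):
        """Scale around the midpoint, then round and clip back into range."""
        stretched = (values.astype(np.float64) - mid) * factor + mid
        return np.clip(np.rint(stretched), lo, hi).astype(dtype)

    rng = np.random.default_rng(1)
    original = rng.integers(0, 256, size=100_000, dtype=np.uint8)  # stand-in for an 8 bit image

    img8 = original.copy()
    img16 = original.astype(np.uint16) << 8   # the same data promoted to 16 bits

    for _ in range(30):
        # Decrease the contrast, then restore it -- a no-op in exact arithmetic.
        img8 = contrast(img8, 0.7, 128, 0, 255, np.uint8)
        img8 = contrast(img8, 1 / 0.7, 128, 0, 255, np.uint8)
        img16 = contrast(img16, 0.7, 32768, 0, 65535, np.uint16)
        img16 = contrast(img16, 1 / 0.7, 32768, 0, 65535, np.uint16)

    # The 8 bit copy ends up with far fewer distinct tonal levels (a combed
    # histogram); the 16 bit copy still covers essentially all 256 of them.
    print("distinct levels, 8 bit copy: ", np.unique(img8).size)
    print("distinct levels, 16 bit copy:", np.unique(img16 >> 8).size)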

    This principle is well known in audio. Most digital audio workstation software actually processes the signal in a 32 bit space, even though the files themselves are typically 16 or 24 bit. The idea is that all the processing artifacts end up several decimal places farther out, where they are harmlessly truncated once the file is reduced to its final bit depth.
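
    A sketch of the same idea in code, assuming numpy and a made-up chain of gain changes: the signal stays in 32 bit floating point through the whole chain and only comes back to 16 bit integers once, at the end, so the arithmetic error never reaches the last bit of the finished file.

    Code:
    import numpy as np

    # A one-second 440 Hz test tone as 16 bit samples.
    sr = 48_000
    t = np.arange(sr) / sr
    original = np.rint(np.sin(2 * np.pi * 440 * t) * 32767).astype(np.int16)

    # Process in 32 bit float: apply and then undo a chain of gain changes.
    work = original.astype(np.float32)
    for gain in (0.5, 0.9, 0.33):
        work *= gain
        work /= gain
    float_pipeline = np.clip(np.rint(work), -32768, 32767).astype(np.int16)

    # For contrast, round back to 16 bit integers after every single step.
    int_pipeline = original.copy()
    for gain in (0.5, 0.9, 0.33):
        int_pipeline = np.clip(np.rint(int_pipeline * gain), -32768, 32767).astype(np.int16)
        int_pipeline = np.clip(np.rint(int_pipeline / gain), -32768, 32767).astype(np.int16)

    print("worst error, float workspace:  ", np.abs(float_pipeline.astype(np.int32) - original).max())
    print("worst error, 16 bit every step:", np.abs(int_pipeline.astype(np.int32) - original).max())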

    In real life, you should never see that kind of damage. There's rarely a reason to subject a file to more than one or two total adjustments (everyone's using adjustment layers, right?), and there's also little reason to start with anything besides a 16 bit file ... even if it's really only a 15 bit file under the hood.
    Last edited by paulr; 26-Oct-2006 at 10:52.

  3. #43

    Re: Tutorial: Illustrated Guide to B&W scan & processing

    True Paul,

    Converting from 8 to 16 bit doesn't add anything, but it does allow for less error over multiple corrections to an image. Of course, starting off with more information from an original 16 bit scan is preferable.

    The JPG format is still lossy at its highest quality setting and thus is not an option for high quality work.

    I agree with the audio example. We get fewer linearity errors when using a 24 bit master and then downsampling to 16 bit....just like we'd have fewer errors starting with a 16 bit scan and working down to 8 bit.....it appears Buze just doesn't understand this.

    However, whenever I do any serious listening to jazz, etc., it's an LP on the VPI....not a CD ;-) I know.....odd for a truly digital guy like me.

  4. #44
    Abuser of God's Sunlight
    Join Date
    Aug 2004
    Location
    brooklyn, nyc
    Posts
    5,796

    Re: Tutorial: Illustrated Guide to B&W scan & processing

    Quote Originally Posted by David Luttmann View Post
    However, whenever I do any serious listening to jazz, etc., it's an LP on the VPI....not a CD ;-) I know.....odd for a truly digital guy like me.
    It's true, and I don't think it's so mysterious. Both the CD and the LP are quirky media with their own fingerprints. A producer I used to work for gave me a pretty succinct impression ... "an LP sounds like an LP, a CD sounds like a CD, and neither one sounds anything at all like the master tape."
    Last edited by paulr; 26-Oct-2006 at 12:02.

  5. #45

    Re: Tutorial: Illustrated Guide to B&W scan & processing

    Quote Originally Posted by paulr View Post
    It's true, and I don't think it's so mysterious. Both the CD and the LP are quirky media with their own fingerprints. A producer I used to work for gave me a pretty succinct impression ... "an LP sounds like an LP, a CD sounds like a CD, and neither one sounds anything at all like the master tape."
    Amen!
