PDA

View Full Version : Convert large 48bit TIFFs to JPEG2000



Remigius
7-Jan-2007, 05:29
Hi,

After several hours of searching, and after trying and uninstalling several trial programs, I decided to ask here.

I get large TIFFs (<500MB) with 48bit color from my scanner (Epson Perf 4990), and as they are so large, I'd like to compress them while preserving the 48 bits of color. So the best idea I had so far was to use JPEG2000 (other suggestions welcome). Unfortunately, there does not seem to be a lightweight piece of software around that does this job - or at least it's well hidden (I've got PS CS2, but it's in my office and far more heavyweight than I'd like). If you know of such a tool (freeware or shareware for not more than ~50-100 USD), please let me know.

Ted Harris
7-Jan-2007, 06:48
Depends on what you mean by heavyweight and on your platform. If you are on a Mac, the answer is Graphic Converter from Lemkesoft (http://www.lemkesoft.com). My recollection is that it is only 30 bucks, and it works flawlessly; I've been using it for years.

OTOH, remember that the JPEG2000 format has never been well accepted and is now almost a 'legacy' format so that if you ever need to send a file to someone else you may not be able to use it.

Finally, if you aren't going to use PS2 then why scan at such high resolution and why save 16 bit color? Curious as to what you are doing with the files that needs that level of detail and resolution but doesn't need PS2. You can use a lower setting on your scanner.

Leonard Evens
7-Jan-2007, 07:46
I would recommend using ImageMagick and converting to png format. You can also use it to convert to jpeg-2000. ImageMagick is available for Linux and Windows at

www.imagemagick.org/script/index.php

Clearly, you want the 16 bit version for Windows. It is free.

You would use the convert function in a command window.
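For the record, a sketch of what that command-window invocation might look like (the filename is the one from later in this thread; assumes the 16-bit "Q16" build of ImageMagick is installed and on the PATH):

```shell
# Convert a 48-bit (16 bits/channel) TIFF to PNG, keeping the full bit depth.
# -depth 16 asks ImageMagick to keep 16 bits per channel in the output.
convert 2006-12-29-008.tif -depth 16 2006-12-29-008.png

# Or to JPEG-2000 (lossy); -quality controls how aggressively it compresses.
convert 2006-12-29-008.tif -depth 16 -quality 50 2006-12-29-008.jp2
```

The exact quality-to-size tradeoff depends on the build and the image, so some experimentation is needed.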

Remigius
7-Jan-2007, 08:24
Finally, if you aren't going to use PS2 then why scan at such high resolution and why save 16 bit color? Curious as to what you are doing with the files that needs that level of detail and resolution but doesn't need PS2. You can use a lower setting on your scanner.
The scanner is at home, PS in my office, and my memory stick only 1gig (OK, I have a couple of USB harddisks etc., but still, 500meg is a lot to archive the images). If necessary, I can still convert the files to something else before sending them out.

Remigius
7-Jan-2007, 09:39
I would recommend using ImageMagick and converting to png format. You can also use it to convert to jpeg-2000. ImageMagick is available for Linux and Windows at

www.imagemagick.org/script/index.php

Clearly, you want the 16 bit version for Windows. It is free.

You would use the convert function in a command window.
The convert function displays the following error message after producing a jp2 file of only 251 bytes:

"This application has requested the Runtime to terminate it in an unusual way.
Please contact the application's support team for more information."

I assume this is because my notebook has only 1 GB of memory. IM did finish producing a (working, 48-bit) 467 MB PNG from my 571 MB test TIFF, but printed the following error messages:

convert: incorrect count for field "MinSampleValue" (1, expecting 3); tag ignored. `2006-12-29-008.tif'.
convert: incorrect count for field "MaxSampleValue" (1, expecting 3); tag ignored. `2006-12-29-008.tif'.

So, thanks for your info, although this does not seem to solve my problem (yet - maybe I should experiment with some settings).

Ted Harris
7-Jan-2007, 11:10
You are absolutely correct that 500MB is a large file to archive. OTOH, it all depends on why you are scanning and archiving in the first place. I routinely create and archive files of 500MB and much larger. I archive the raw scans for future use on a hard drive dedicated to that. When the hard drive is full I remove it and store it. At less than $100 for a bare 250GB hard drive, it is by far the cheapest way to save files.

So, why do I use such large files? Quality, pure and simple. I shoot LF for many reasons, but one of the major ones is the quality of the image when printed at larger sizes. You can't get that quality if you are starting from small digital files.

If, OTOH, you are simply scanning for printing no larger than 8x10 you may not need such large files but don't go too much smaller or you will begin to defeat the purpose of shooting 4x5 in the first place.

Leonard Evens
7-Jan-2007, 13:26
I work under Linux primarily, so I can't say for sure why you had a problem. It might be memory. I have 3 GB of RAM. When I had only 1.5 GB, I occasionally ran into problems. This issue aside, you should get as much memory as can fit in your computer if you want to do large format digital photography.

It could also be Windows, which is hardly the most efficient operating system, or it could be that fitting ImageMagick into Windows limited its usefulness. (You might let the ImageMagick people know of your problem, and perhaps they can fix it in a later release.)

I just did a conversion of a 350 MB TIFF file to PNG, and I had no trouble. The PNG file actually ended up larger than the TIFF file because I didn't specify a high degree of compression and the TIFF was already compressed with LZW. But I see no reason why it wouldn't also have worked had I specified a high degree of compression.

I have an Epson 3200, and I routinely reduce my files to something like 8000 x 10,000. That seems adequate for my needs, and I can store four such files on one CD using png and moderate compression. There isn't much point with the Epson scanners in files much larger than that because the optics of these scanners won't provide nearly the resolution that the high sampling resolutions suggest. If you keep your negatives or transparencies, you can always rescan them at a later date if you have a need for significantly higher resolution and a scanner which can deliver it.

schngrg
7-Jan-2007, 21:37
So the best idea I had so far was to use JPEG2000 (other suggestions welcome).

OTOH, remember that the JPEG2000 format has never been well accepted and is now almost a 'legacy' format so that if you ever need to send a file to someone else you may not be able to use it.

You can get a good and fast Jpeg2000 converter at http://www.kakadusoftware.com/

I don't think Jpeg2000 is dead or legacy; it was never meant to become mainstream for general photography/archiving - at least that wasn't a short-term goal for the format. The reason is that compressing files as Jpeg2000 is slower than Jpeg (which means it needs more battery power in a camera); the speed is about the same as PNG's, though. So if the speed is OK with you, go ahead and use it.

AFAIK it is starting to become the default format for HD movies (only for initial capture; DVDs will not have jpeg2000 movies :-) and is also popular where small parts of large images must be viewed over slow internet connections, as it has very flexible 'progressive' decompression modes (I can explain this in more detail if anyone is interested).


convert: incorrect count for field "MinSampleValue" (1, expecting 3); tag ignored.

ImageMagick uses libtiff for TIFF support, and this error comes up because the tool you used to create the TIFF image (I guess your scanner software) writes an 'old-style' TIFF whose tag usage changed in a newer revision of the format specification.

It is not a fatal error, and ImageMagick could have handled this TIFF had it been smart enough to just ignore this warning reported by libtiff; you might want to tell them this if you really want to use ImageMagick. (They are probably playing it safe by quitting on all warnings so as not to accidentally corrupt your data.)
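If libtiff's own command-line tools are installed, one can inspect the offending tags directly (a sketch; `tiffinfo` ships with libtiff, and the filename is the one from the error messages above):

```shell
# Dump the TIFF directory and look at the sample-value tags that libtiff
# complained about (the exact label text may vary between libtiff versions).
tiffinfo 2006-12-29-008.tif | grep -i 'sample value'
```

That at least confirms whether the scanner software wrote single-value tags where the spec now expects one value per channel.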

I hope this helps :-)

Sachin Garg [India]
www.sachingarg.com | www.c10n.info

Remigius
8-Jan-2007, 10:08
Ted,

I also use LF for image quality, hence also the desire to archive high quality images (4x5 @ 2400dpi in 48bit color, resulting in images of about 9000x11000 pixels - target print format currently 17x22"). Of course, using hard disks to archive images is a good choice, but I'd still like to get down to something smaller than 500 megs per image. I prefer to have my images online (i.e. on a disk connected to the LAN at my office). Jpegs are about 13 megs, but the difference in image quality is clearly noticeable (even on my notebook screen). I think the highest impact stems from the reduction to 8 bits per channel, not so much from compression artifacts. So I expect the image quality of a JPEG 2000 to be good enough to replace a TIFF. As some examples I have found on web sites show, the quality of a JPEG 2000 is superior to that of a JPEG of the same size, and furthermore, it can preserve the 16 bits per channel.

schngrg,

(what a name!) I'll give Kakadu a try. The progressive thing comes from the use of wavelets and the particular order in which the subband images are stored in the file. Of course, I noticed that the errors weren't fatal, as the resulting PNG was OK. OTOH, the achieved compression rate was not worth the time spent performing the compression. I have experimented with some (very few) settings, but the result was even worse than with the default settings, so I decided to drop PNG.

Thanks for all the answers. I'll let you know if I get to something usable.

Remigius
8-Jan-2007, 13:23
First results: for a first try, I downloaded a demo version of the LuraWave software from http://www.luratech.com/ . Compression with a quality setting of 80 (whatever that means) resulted in a jp2 file of about 5 MB instead of the 571 MB TIFF. Unfortunately, the resulting image has only 24-bit color. Amazingly, there are no differences between the two images that I could visually detect at 1:1 size. The software can display the difference between two images, which in this case is just uniform neutral grey (confirming my visual impression). At the moment I'm quite satisfied with the results. I think that after some more experiments to determine workable settings, and a final choice of software (one that supports 48-bit color), I'll start to archive my scans as jp2s. If something goes wrong, I still have my negatives to rescan if necessary.

Remigius
8-Jan-2007, 13:57
Next experiment: I have downloaded the Kakadu software. I remember having found this site already last night, but due to the confusing number of options of the compression program I left it aside at first. Now I have tried compressing my test image with default parameters, resulting in a file of 1/10 the original size with the 48-bit color preserved. Comparison to the original again showed no visible differences and a blank difference image. So this looks really promising. Getting the size down a bit more might be helpful, but even with default parameters the results are quite acceptable to me.

Thanks again and best regards, Remigius.

Remigius
8-Jan-2007, 14:38
Experiments with the -rate parameter: I have performed a few compressions with varying -rate parameter (2.0, 1.0, 0.5 and 0.25). The rate parameter sets the "bit rate", which is the number of bits per pixel of the original image. So a rate of 1.0 for a 48-bit image corresponds to a compression of 1:48. My impressions were as follows:

default setting: compression of 1:10, no visible differences to the original

-rate 2.0: compression of 1:24, no visible differences to the original

-rate 1.0: compression of 1:48, no visible differences to the original

-rate 0.5: compression of 1:96, slight artifacts become visible (in the sky), compressed image seems slightly softer than original, difference image blank

-rate 0.25: compression of 1:192, artifacts clearly visible in the sky, but not in other parts (stones of a wall), image seems softer than original, difference image has a few regions with a slight veil.
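The arithmetic behind those ratios can be sketched in a few lines of shell (the 9000x11000 dimensions are the approximate scan size mentioned elsewhere in the thread; assumes a 16-bit RGB original, i.e. 48 bits per pixel):

```shell
#!/bin/sh
# -rate is the target bits per pixel of the compressed codestream,
# so the compression ratio is (original bits/pixel) / rate.
orig_bpp=48     # 3 channels x 16 bits
rate=1          # as in: kdu_compress ... -rate 1.0
echo "compression 1:$((orig_bpp / rate))"        # prints: compression 1:48
echo "size: $((9000 * 11000 * rate / 8)) bytes"  # prints: size: 12375000 bytes
```

At -rate 1.0 a 9000x11000 scan comes out around 12 MB, which is in the same ballpark as the ~13 MB JPEGs mentioned earlier in the thread.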

This is far from a scientific experiment, but hopefully useful for some of you...

Best regards, Remigius.

schngrg
11-Jan-2007, 03:30
To do a completely pixel-lossless compression, use the 'Creversible=yes' parameter without any rate parameter:

kdu_compress -i infile -o outfile Creversible=yes


You can specify a rate of choice with 'Creversible=yes' also, although that won't serve any practical purpose.

And for lossy compression, specifying even a very high rate can still give some extra loss; to specify complete-codestream retention in lossy mode, use: '-rate -'
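Putting the variants described above side by side (filenames hypothetical; these are the kdu_compress parameters discussed in this thread):

```shell
# Pixel-lossless: reversible wavelet, no rate limit.
kdu_compress -i scan.tif -o scan_lossless.jp2 Creversible=yes

# Lossy at a chosen bit rate (bits per pixel of the original image).
kdu_compress -i scan.tif -o scan_rate1.jp2 -rate 1.0

# Lossy, but keep the complete codestream (no rate truncation).
kdu_compress -i scan.tif -o scan_full.jp2 -rate -
```
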