How did Genuine Fractals compare to Gigapixel AI, Peter?
With my image, Gigapixel AI was better.
“You often feel tired, not because you've done too much, but because you've done too little of what sparks a light in you.”
― Alexander Den Heijer, Nothing You Don't Already Know
That likely explains it, Alan. I've only used GF to test that one image.
I scan at 6000 dpi (partly because it is a multiple of 300, since I use an iPF6400). I found the scans at 2400-4200 to be, well, not that great. From 4200 to about 5000-5500 they are much improved. At 6000 the sharpness, etc., of the image is far greater than at 2400 or 4800. I may typically downsize the scan for editing to 5700, I think it is, so that I can use ACR at times; it might be 5400, I'm not sure. But typically I edit at 6000 dpi, and my Mac Pro can handle it.

I then downsize to the print size I want, sharpen if needed, then save the file as the print master for that size. Note that before I downsize, I flatten the file first. Also, I save the full-size file as a working copy from which I make all my prints. There is a method that Ken Lee outlines on his website for "turbo-charging" PS that I played with and that seems to work: it allows you to work on a small file at greater speed, and when done you create the full-size file at whatever dpi you started with.
You also use the same method for downsizing that I do, except that I choose "smoother gradients," as I am sharpening after resizing anyway, virtually as my last step.
The above works for me and is not intended to create a scanner war thing.
Also, I posted all of my resolution testing a while back, with links to the scanned targets at all the various resolutions from 2100 to 6300. I don't use 6300 because it generates a file too big for TIFF format. And all my files are raw, linear scans.
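For readers wondering why a resolution step from 6000 to 6300 can break TIFF: classic TIFF uses 32-bit offsets, so an uncompressed file tops out around 4 GiB. A quick sketch of the arithmetic, using a hypothetical 8x10 original in 16-bit RGB (the poster's actual film size, bit depth, and channel count are not stated):

```python
# Why very high scan resolutions can overflow classic TIFF.
# Film size, bit depth, and channels below are illustrative assumptions.

TIFF_LIMIT = 2**32  # classic TIFF: 32-bit offsets -> ~4 GiB ceiling

def scan_bytes(width_in, height_in, ppi, channels=3, bytes_per_sample=2):
    """Uncompressed raster size for a scan of a width_in x height_in original."""
    return int(width_in * ppi) * int(height_in * ppi) * channels * bytes_per_sample

# hypothetical 8x10 negative, 16-bit RGB
for ppi in (2400, 4800, 6300):
    size = scan_bytes(8, 10, ppi)
    status = "over limit" if size > TIFF_LIMIT else "fits"
    print(ppi, round(size / 2**30, 2), "GiB:", status)
```

With these assumed settings, the file clears 4 GiB well before 6300 ppi; lower bit depth, grayscale, or a smaller original pushes the crossover higher, and BigTIFF removes the limit entirely.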
Photoshop allows you to choose the downsampling scheme you prefer. I prefer "smoother gradients," and that has worked well for sizing files from 6000 down to 300, with final sharpening afterward. I used GF and tried Nik, but I don't like having yet another program to deal with. Photoshop, if you are experienced with all the methods available, provides very good results. I also sharpen selectively, and at a level that is almost not detectable (my scan files can pretty much stand on their own with no sharpening).
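The resize-first, sharpen-last order described above can be sketched in one dimension: area-average downsampling followed by an unsharp mask. This is a pure-Python stand-in to show the order of operations, not Photoshop's actual resampling or sharpening code:

```python
# Minimal 1-D sketch of "resize first, sharpen last".

def downsample(samples, factor):
    """Average each block of `factor` samples (a simple box filter)."""
    return [sum(samples[i:i + factor]) / factor
            for i in range(0, len(samples) - factor + 1, factor)]

def unsharp(samples, amount=0.6):
    """Unsharp mask: add back the difference from a 3-tap blur."""
    out = []
    for i, s in enumerate(samples):
        left = samples[max(i - 1, 0)]
        right = samples[min(i + 1, len(samples) - 1)]
        blur = (left + s + right) / 3
        out.append(s + amount * (s - blur))
    return out

edge = [0.0] * 16 + [1.0] * 16   # a test edge at full "scan" resolution
small = downsample(edge, 4)      # resize to the output size first...
final = unsharp(small)           # ...then sharpen as the very last step
```

Sharpening after the resize means the halo the unsharp mask creates is scaled to the output pixels, which is why doing it as the last step, and only as much as needed, tends to look cleaner.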
Yes.. nothing like having "native sharpness" !!!
Perhaps we should look a bit in the rear-view mirror.
I like the sharpening algorithms used by Sally Mann for her impressive prints. You place an 8x10 collodion plate in the enlarger's carrier... then we execute the algorithm:
/////////////////////////////////////////////
//GNU General Public License
/////////////////////////////////////////////
#include <iostream>
#include "enlarger.h"
int main()
{
    Load_Default_Gear();
    if (!sharp_negative()) return GO_TO_SHOT;

    for (int i = 0; i < MAX_FOCUS_OPS; i++) {
        CImage* pImg = pEnlarger->Focus_Procedure(the_negative, loupe); // pImg owned by the CEnlarger instance
        if (Check_Sharpness(pImg)) {
            Print_Nice_Image(pImg);
            return YOU_HAVE_A_NICE_PRINT;
        }
    }
    return GO_ALIGN_ENLARGER;
}
Change the units from inches to pixels and don't resample at scan time.
Deal with pixels and not inches until you're ready to print, then consider them both.
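The pixels-not-inches advice boils down to two conversions: a scan has a fixed pixel count, and inches only appear once you divide by a chosen print resolution. A small worked example (the 4x5 film size and 6000 ppi scan here are illustrative):

```python
# Pixels vs. inches: pixels are fixed at scan time; inches depend on print ppi.

def pixel_dims(width_in, height_in, scan_ppi):
    """Pixel dimensions of a scan of a width_in x height_in original."""
    return int(width_in * scan_ppi), int(height_in * scan_ppi)

def print_size(width_px, height_px, print_ppi):
    """Print dimensions in inches at a given output resolution."""
    return width_px / print_ppi, height_px / print_ppi

w_px, h_px = pixel_dims(4, 5, 6000)   # hypothetical 4x5 scanned at 6000 ppi
print(w_px, h_px)                     # 24000 30000
print(print_size(w_px, h_px, 300))    # (80.0, 100.0) -> 80x100 inches at 300 ppi
```

Since 300 ppi native output from such a scan would be an 80x100-inch print, any smaller print means downsampling, which is where the resampling-scheme choice above comes into play.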
Don't know about Mr. Lee, but this sounds a lot like the "Guide File Workflow" that West Coast Imaging outlined some 10-15 years ago. The only real restriction to working this way is that all layers must be adjustment layers; you cannot have any pixel-based layers. Once you've finished your edit, you simply copy all the adjustment layers to the full-resolution file, then finish for final output. I would guess that nowadays, with all the desktop computing power we enjoy, this workflow is not needed so much.