Hi all. I've made a few posts touching on the use of IR in the darkroom and those conversations have opened up more questions that I think would be best dealt with in their own thread.
As a primer, I'm interested in using IR technology in the darkroom. Not at all a new idea. I'm trying to work out the details of the materials and tools I'll be using, and I could use some help with the things I haven't been able to figure out on my own; hence these posts, which have been very helpful so far.
I have a pair of IR goggles on the way (Nyte Vu). From my research, I believe these use IR LEDs that output at 850nm, which is low enough that the LEDs should produce a visible red glow. I also have some IR LEDs (from RadioShack) that I've been experimenting with to see the differences in what can be found locally: one 850nm and one 940nm. The 850nm LED produces a much more noticeable red glow than the 940nm LED.

The packaging describes the 850nm and 940nm figures as the "wavelength at peak emission," which I take to mean the farthest into the infrared that the light reaches. Is there a way to measure the full range of wavelengths emitted (low to high)? Do I need to find someone with an infrared spectrometer?

Basically, I'd like to know the lowest wavelength of light each LED emits so that I can compare it to the spectral sensitivity data in various film datasheets. For example, if I'm reading it correctly, Ilford's datasheet for Delta 100 ("Wedge sensitivity to tungsten light (2850K)") shows the film's sensitivity tapering off to essentially nothing around 660nm. I want to know whether the lowest wavelength produced by the red glow of an 850nm IR LED comes anywhere near that 660nm mark. As I understand it, 850nm is the maximum wavelength of this LED, but I don't know what the minimum (red glow) wavelength is or whether it approaches the film's sensitivity range.
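One thing I've read since starting down this road: "peak emission" apparently means the wavelength where the LED's output is strongest, with energy falling off on both sides of the peak, rather than an upper limit. If that's right, a very rough way to sanity-check the tail is to model the LED's spectrum as a Gaussian around its peak. This is only a back-of-the-envelope sketch: real LED spectra are asymmetric with longer tails than a Gaussian, the 40nm FWHM below is my assumption for a typical IR LED (not from any datasheet), and the function name is just something I made up for illustration:

    import math

    def led_relative_intensity(wavelength_nm, peak_nm, fwhm_nm):
        """Crudely model an LED's spectral output as a Gaussian around its peak.

        Real LED spectra are asymmetric with heavier tails than a Gaussian,
        so treat the result as an order-of-magnitude estimate at best.
        """
        # Convert full width at half maximum to the Gaussian standard deviation.
        sigma = fwhm_nm / (2 * math.sqrt(2 * math.log(2)))
        return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * sigma ** 2))

    # Assumed FWHM of ~40nm; check the actual LED datasheet if it lists one.
    for led_peak in (850, 940):
        rel = led_relative_intensity(660, peak_nm=led_peak, fwhm_nm=40)
        print(f"{led_peak}nm LED, relative output at 660nm: {rel:.2e}")

By this crude model, the output anywhere near 660nm is vanishingly small for both LEDs, which would suggest the red glow I'm seeing comes from the eye's faint residual sensitivity to the LED's short-wavelength tail (out toward 700-780nm) rather than from emission down near the film's cutoff. But that's exactly the kind of assumption I'd like to verify against a real measurement, which brings me back to my question above.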
Can anyone offer any insight into this? Thank you very much!!