All kinds of homemade light boxes are fine for sorting out transparencies etc. But critical color evaluation or backlit copy work just isn't possible with current LED illumination. The spectrum is far too uneven, and attempting to fool the eye by mixing bulbs won't save you from serious metamerism errors. Appropriate liner paints can be purchased, or made by an expert who knows how to correctly use a spectrophotometer, and people like that (including me) are few and far between. Then one has to factor in the inherent bias of any diffuser and surface glass. And finally, a good color temp meter is invaluable. There's way more to it than just buying properly rated bulbs, and the real deal are specialized, uncommon, relatively costly, and products of mature technology rather than of an adolescent LED nature. I don't like raining on your parade, Jim, but you're apparently unaware of what kind of equipment and training serious color evaluation actually involves. Outfits like Macbeth and X-Rite specialize in it. Your concept of RGB producing white light is theoretically correct, but like I hinted, there's a lot more to it
than just fooling the eye. CRI is based upon the percentage of specially selected color samples that can be rendered without noticeable error or metamerism, not just approximate RGB peaks, which probably aren't accurate with current LED options anyway. You need something much more closely mimicking a blackbody continuous light source. And correctly measuring such things requires a helluva lot more than digital camera histograms can provide. Incidentally, the industry standard for color evaluation, as used by Macbeth etc., is 5000K. The 5500K option is given by Kodak in relation to mean daylight color temp for film EXPOSURE purposes, though there seems to be some deviation from this among film manufacturers. But the problem in question here is not unexposed film, but how to best analyze already exposed and developed film. You're confusing these two kinds of problems. Simple tri-color breakdown of color is basically WWII technology and terribly inaccurate. True continuous spectrophotometers read the entire spectrum, but were fussy to maintain.
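To put a number on what "mimicking a blackbody" means: a true blackbody at 5000K has a perfectly smooth spectrum with no gaps between peaks, which is exactly what narrow-band LED mixes lack. Here's a rough Python sketch of my own (not anything Macbeth or Kodak publishes) using Planck's law:

```python
import numpy as np

# Physical constants (SI units)
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
kB = 1.380649e-23    # Boltzmann constant, J/K

def planck_spd(wavelength_nm, T):
    """Blackbody spectral radiance at temperature T (kelvin)."""
    lam = wavelength_nm * 1e-9
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

wavelengths = np.arange(360, 831)           # visible range, 1 nm steps
spd_5000 = planck_spd(wavelengths, 5000.0)

# The curve is continuous and smooth everywhere -- no spikes, no holes.
peak_nm = wavelengths[np.argmax(spd_5000)]
print(peak_nm)   # Wien's displacement law: ~2.898e6 / 5000 = ~580 nm
```

The point is that a 5000K blackbody puts energy at every visible wavelength, so no color sample can "fall between" emission peaks the way it can under a three-peak LED source.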
Modern spectrophotometers typically use xenon flash tubes and take an evenly distributed set of readings across the spectrum (typically 12, 16, or 24 points), then interpolate between them. In principle, the more points the better; but engineering-wise and cost-wise, certain compromises are inevitable. The first spectrophotometer I worked with cost around $55,000. My wife operated a custom X-Rite model in biotech that cost six million; but at $40K per ml of end product, that was well justified.
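The measure-a-few-bands-then-interpolate compromise is easy to demonstrate. This sketch (the 16-band count matches what I described above; the smooth curve and the narrow blue LED spike are hypothetical shapes of my own, just for illustration) shows why sparse sampling is fine for continuous sources but falls apart on spiky ones:

```python
import numpy as np

fine_nm = np.arange(400, 701)             # 1 nm "ground truth" grid
coarse_nm = np.linspace(400, 700, 16)     # 16 measurement bands, 20 nm apart

def measure_and_interpolate(true_spd):
    """Sample the spectrum at the coarse bands, then linearly interpolate back."""
    readings = np.interp(coarse_nm, fine_nm, true_spd)
    return np.interp(fine_nm, coarse_nm, readings)

# Smooth, blackbody-like spectrum: 16 bands recover it almost perfectly.
smooth = 1.0 + 0.5 * np.sin(fine_nm / 60.0)
smooth_err = np.max(np.abs(measure_and_interpolate(smooth) - smooth))

# Narrow LED-like emission spike (~10 nm wide): it lands between bands
# and the interpolated reconstruction badly underestimates it.
spiky = np.exp(-((fine_nm - 455.0) ** 2) / (2 * 4.0**2))
spiky_err = np.max(np.abs(measure_and_interpolate(spiky) - spiky))

print(smooth_err, spiky_err)   # spike error is orders of magnitude worse
```

That's the engineering trade-off in a nutshell: for the continuous sources these instruments were designed around, 16 interpolated bands are plenty; for narrow-peak sources you'd need far denser sampling, and the cost climbs accordingly.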