Quote Originally Posted by RalphLambrecht
Using 5000 K as a compromise makes a lot of sense to me, and in my view, it puts the 6500 K monitor-calibration standard in question. How, do you think, can they co-exist?
The colour calibration manual that came with my NEC SpectraView Reference 271 monitor explains that D50 is the standard for press and print work, while D65 is used for web development. From my very limited experience of calibrating other people's screens (Mac and PC), I can see those have mostly been set around D65 or even higher (I've occasionally seen D70 and D80). Personally, I always thought D65 was a compromise between what a casual web user would see and what was better for sound judgement (D50). What you choose depends on what you aim to produce, I suppose. Mind you, I always felt D50 on iMacs and Apple monitors did not look quite right compared with other D50 monitors, like NEC, Eizo, or LaCie. D65 was fine.
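For anyone curious what D50 and D65 actually mean as calibration targets: they are points on the CIE daylight locus, and their chromaticity coordinates can be computed from the correlated colour temperature with the standard CIE polynomial. A small sketch in Python (the 5003 K and 6504 K values are the nominal CCTs the standards use, slightly off the round numbers because of a historical revision of a physical constant):

```python
def daylight_chromaticity(cct):
    """CIE daylight locus: chromaticity (x, y) from correlated
    colour temperature in kelvin (valid for 4000-25000 K)."""
    if 4000 <= cct <= 7000:
        x = (-4.6070e9 / cct**3 + 2.9678e6 / cct**2
             + 0.09911e3 / cct + 0.244063)
    elif 7000 < cct <= 25000:
        x = (-2.0064e9 / cct**3 + 1.9018e6 / cct**2
             + 0.24748e3 / cct + 0.237040)
    else:
        raise ValueError("CCT outside the daylight-locus range")
    # y follows x along the daylight locus
    y = -3.000 * x**2 + 2.870 * x - 0.275
    return x, y

# Nominal CCTs of the standard illuminants:
print(daylight_chromaticity(5003))  # D50: x ~ 0.3457, y ~ 0.3585
print(daylight_chromaticity(6504))  # D65: x ~ 0.3127, y ~ 0.3290
```

So the two calibration targets sit at measurably different white points, which is why a print judged under one can look off under the other.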

As I was preparing to print my book, which contains tritone offset reproductions of my selenium toned black and white prints, I found D50 helped me find a common language with the designer and prepress. When I was adjusting those prints so anyone could see them nicely on my web site, I used D65 instead.

In the end, I am pleased with the results. The book really looks like the prints, and the web site comes as close to the spirit of the selenium tone as I could have hoped for.

As for the original question, I use a single 100 W tungsten Lucci milky bulb about 2 m from the trays, and I always evaluate the strips and the print hanging over the tray, not while floating in it. I found this matches the illumination the gallery used very well. Tungsten seems to have a much better CRI (colour rendering index) than any fluorescent strips I have found, and it does not affect the visual perception of the selenium tone as much.

PS. The monitor's manual also stressed that the room illumination ought to match the chosen white-point target, and that a compromise may be needed should this be an issue.