Interesting: Scanner sees densities differently
I have been using a scanner as a densitometer. Before that I used a spot meter, which worked fine, but the scanner is easier.
A scanner is not linear, and you have no clue what densities it gives as output. So I checked the densities the scanner reports from a Stouffer step tablet and created a correction function that converts the scanner's output into proper density.
So far, so good...
I used that method successfully for a couple of years; then I started experimenting with tanning and staining developers. During those experiments and Zone System calibrations for Pyrocat-HD, I had access to a real densitometer.
It showed that all my readings were off: the higher the density, the larger the error.
Up to about Zone V my scanner method had given sufficiently accurate densities, but above that the error began to grow.
I first found this with tanned and stained negatives, but soon I realized that the error was similar with all negatives, regardless of developer.
So what could be the reason?
I tested the Stouffer step tablet with the densitometer - all readings were exact.
I have a correction function for the scanner which gives exactly the right density readings from the Stouffer tablet.
But from a negative it is different. Why? Does the scanner somehow see the density of silver film differently than the test tablet?
Photographic density is the logarithm of the ratio of input illumination to output illumination. If you plot your results on semilog graph paper, or the log of your results on ordinary graph paper, you may see the light. Your spot meter shows a number proportional to the logarithm of the illumination at its sensor.
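That definition can be written down in a couple of lines of Python. A minimal sketch; the 1%-transmission and one-stop examples are mine, not from the thread:

```python
import math

def density(incident, transmitted):
    """Optical density D = log10(incident light / transmitted light)."""
    return math.log10(incident / transmitted)

# A patch that passes only 1% of the light has density 2.0,
# and one full stop (half the light) is about 0.30 density:
print(density(100.0, 1.0))           # 2.0
print(round(density(2.0, 1.0), 2))   # 0.3
```

This is also why each stop read on a spot meter corresponds to roughly 0.30 in density: log10(2) ≈ 0.301.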
Yes, I know that density is a logarithm.
But that does not explain why I get a different density reading from B/W film than from the calibration step tablet.
To the best of my knowledge, both should pass light equally when both have equal density.
For example, when a step of the transmission step wedge has a defined density of 1.25, it reads 1.26 on the densitometer and 1.44 on my scanner.
With that information I know that when the scanner 'densitometer' gives a reading of 1.44, the density is actually 1.25.
I have even created a correction table. The Y axis is the density given by the scanner and the X axis is the real density (measured from the Stouffer transmission step wedge).
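A table like that can be applied as a lookup with linear interpolation between calibration points. A minimal sketch in Python - the 1.44 → 1.25 pair is from this thread, the other pairs are invented placeholders:

```python
from bisect import bisect_left

# Hypothetical calibration pairs: (scanner reading, real density from the
# Stouffer wedge). Only the 1.44 -> 1.25 pair is real; the rest are made up.
CAL = [(0.10, 0.10), (0.75, 0.70), (1.44, 1.25), (2.20, 1.80)]

def corrected_density(scanner_reading):
    """Linearly interpolate the real density from a scanner reading."""
    xs = [x for x, _ in CAL]
    ys = [y for _, y in CAL]
    if scanner_reading <= xs[0]:
        return ys[0]
    if scanner_reading >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, scanner_reading)
    x0, x1 = xs[i - 1], xs[i]
    y0, y1 = ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (scanner_reading - x0) / (x1 - x0)

print(round(corrected_density(1.44), 2))  # 1.25
```

The more wedge steps you measure, the less the interpolation has to guess between points; readings beyond the calibrated range are simply clamped here.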
This is really accurate with the transmission wedge. Thus, it should also be accurate with any kind of transmissive material.
But for B/W film it seems not to be. For example, with Neopan 400 the scanner gives a density of 1.25 + FBF for Zone VIII, which sounds really good. But the real densitometer gives 1.17 + FBF.
Why such a difference? Both instruments, the scanner and the densitometer, read the transmission wedge accurately (the scanner, of course, after applying the correction). So why not the negative?
What developer are you using for your film? If you are using a staining developer, you may get different densities, depending on how you measure. If the densitometer has a UV channel, for instance, you might get a higher UV density than a white light density from a scanner. Or if you scan in RGB, the green channel might have a higher density than the other channels. And then the fluorescent light source on the scanner has a different spectral output than the light source in the densitometer, so that could also cause differences in readings with a staining developer.
Read the following which may have the answers for you:
No silver image from a standard developer is precisely neutral, so unless the densitometer and the scanner match in spectral response when reading, there are apt to be differences.
We all know that you can get warm or cold tones from different developer/emulsion combinations and the two instruments can read densities on these quite differently.
That's probably the reason.
I have tested my "scanner densitometer" with various developers: Pyrocat-HD, DiXactol, Rodinal, XTOL.
I even tested with a Cokin ND filter - which I found not to be very neutral...
So my final thought is that a transmission wedge can be used to create a calibration function for a scanner, and the scanner can then be used as a densitometer. At least it shows roughly where the densities are.
But what about real densitometers? There are lots of folks who use densitometers for creating characteristic curves, calibrating the Zone System, and so on. Are they aware that a densitometer may not be really accurate with the material used (film + developer combination)?
Mostly people are aware that staining developers need to be measured with blue light, or better, with UV. But for other materials I really don't know.
Sensitometry seems to be a can of worms... one that I just opened and lost the lid to.
Thanks, that gave me some information. At least it can be used as a rough correction table for stained negatives.
Originally Posted by rob champagne
It is more like opening a can of live grasshoppers.
The list of variables that could cause such a difference is quite long, but it is likely that the point light source of the densitometer scatters within dense negatives differently than the diffused light source of a scanner. This could be exacerbated by different film types, grain patterns, stain, etc. One way to check this is to make the densitometer see like a scanner. Some densitometers (older Macbeths) have mesh filters under the plastic base which can be adjusted between the point source and the aperture to diffuse the light and flatten out the spectrum.