Originally Posted by Helen B
However, the relative log values of the step tablet densities, which are plotted on the X-axis, are being used to determine the configuration, or slope, of the curve on the Y-axis. If the two units are not calibrated by measurement with the same mode, or color of light, it would appear to me that the CI of the curve would be either artificially expanded or contracted (slope increased or decreased), assuming of course a difference in measurement between the different modes.
This situation does not exist when we work with traditional developers that yield neutral tones, because the measurement system for both the original step tablet and the test strips is the same.
And just for the record, there is often a difference between UV and Visual channel measurements of reference step tablets. For example, one of my Stouffer TP 45 step tablets measures as follows. In Visual mode, Step 1 = 0.05, Step 11 = 1.50, and Step 21 = 3.05. In UV mode the measurement is, Step 1 = 0.10, Step 11 = 1.45, and Step 21 = 2.87.
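For what it's worth, the gap between the two modes is easy to tabulate. A minimal sketch using only the three readings quoted above (the dictionary layout is just my illustration; substitute your own tablet's measurements):

```python
# Visual vs UV readings of the same Stouffer TP 45 tablet,
# values taken from the post above.
visual = {1: 0.05, 11: 1.50, 21: 3.05}
uv     = {1: 0.10, 11: 1.45, 21: 2.87}

for step in (1, 11, 21):
    print(f"Step {step:2d}: Visual {visual[step]:.2f}  "
          f"UV {uv[step]:.2f}  diff {uv[step] - visual[step]:+.2f}")

# Total density range (DR) of the wedge in each mode:
dr_visual = visual[21] - visual[1]   # 3.00 once rounded
dr_uv     = uv[21] - uv[1]           # 2.77 once rounded
print(f"Density range: Visual {dr_visual:.2f}, UV {dr_uv:.2f}")
```

The point of the table is simply that the UV channel sees a compressed range (2.77 vs 3.00), which is what matters for the plotting question discussed below.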
Last edited by sanking; 08-06-2004 at 11:12 PM.
O.K. I only had time to develop three sheets tonight. I have three more that I exposed and will develop them tomorrow when I get home from work and add the curves.
Super SideKick processor 75Deg 20RPM
All negatives read using "UV" mode.
Step Wedge readings taken in "Visual" and "UV" mode.
WinPlotter "Default" table.
OK, I was very curious about this so I went back and compared results with an existing film test, which was FP4+ in Pyrocat 1:1:100, using two different step tablet readings, both from the same step tablet but one made with a Visual reading, the other with a UV reading. The reading in Visual mode ranges from 0.05 at Step 1 to 3.05 at Step 21, and the UV reading ranges from 0.10 at Step 1 to 2.87 at Step 21.
Originally Posted by sanking
As I suspected, there was a significant difference in the curves as plotted, depending on which of the step tablet readings was used. The differences affected effective film speed, CI, and SBR values. To be precise, here is the difference when both calculations are based on 10 minutes of development.
Step Tablet One, or the Default, made with the Visual reading: EFS = 160, CI = 0.69, and SBR = 8.3
Step Tablet Two, or the one made with the UV reading: EFS = 100, CI = 0.76, and SBR = 7.5.
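As a rough sanity check (my own back-of-envelope arithmetic, not from the thread), the size of the CI shift is about what the mismatched X-axis scale would predict, assuming the distortion is roughly uniform across the wedge:

```python
# If the only change between the two plots is the X-axis scale of the
# step-tablet readings, CI should shift by roughly the ratio of the
# two density ranges quoted earlier in the thread.
dr_visual = 3.05 - 0.05           # 3.00, Visual-mode wedge range
dr_uv     = 2.87 - 0.10           # 2.77, UV-mode wedge range

ci_visual = 0.69                  # CI plotted from the Visual reading
ci_uv_est = ci_visual * dr_visual / dr_uv
print(f"Estimated UV-based CI: {ci_uv_est:.2f}")  # ~0.75, vs the 0.76 actually plotted
```

The estimate is not exact because CI is fitted over a specific portion of the curve rather than the whole wedge, but it is close enough to show the shift is a scaling artifact, not a development effect.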
So my conclusion is that you do need to read the densities of the step tablet that will be used to expose your test negatives in the same mode that will be used to measure those negatives, and substitute these values in the WinPlotter program for the Default step tablet if the values are different.
How did you know which results were correct? I'm still having difficulty seeing why the step wedge density should not be measured in the mode that most closely represents the way in which the film is exposed, i.e. a measurement of the relative amounts of actinic light falling on the film.
If the step wedge measurements are being used to calibrate the UV measurements (which is unlikely to be as accurate as the standard densitometer calibration procedure), how does the software know what relative exposures the film received?
The question is: 'What is the purpose of the step wedge measurements - is it to tell the software what relative exposures the film had, or is it to calibrate the densitometer against the known UV transmission of a Stouffer step wedge?'
Can the software be used with other step wedges?
Interesting dialogue on this subject. I also do not understand why one would measure the step wedge density in UV mode. The only time this would appear to be a valid consideration would be when the step wedge is exposed onto film by UV light.
The step wedge density would seem to be valid when measured by the densitometer channel most closely approximating the spectral qualities of the light that would be exposing the step wedge onto film.
What am I missing here?
Originally Posted by Helen B
Forgive me if my explanations lack clarity but this is the first time I have considered this particular issue, and/or tried to explain it. My initial assumption would have been that the measurement of a neutral tone step wedge should be the same with both Visual and UV mode and in fact I don't fully understand why that is not the case.
However, as to which results are correct, I again offer the suggestion that the log density units of measurement on the X-axis and Y-axis need to be the same, i.e. taken with the same measuring instrument or mode of measurement. Otherwise the distance between equal units of log measurement would be different on the X and Y axes, so you would have apples-to-oranges log units. In the case of the example cited, the step tablet measured in UV mode had a total DR of 2.77 (2.87 - 0.10), in contrast to the DR of 3.00 (3.05 - 0.05) for the Visual mode reading. But that actual range of 2.77 is being expanded to a physical range of 3.00 log units on the X-axis. As one could predict, this expansion of the X-axis will result in a shallower slope, i.e. a lower CI, if the plotting is based on the Visual mode reading.
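To put a number on that expansion, here is a minimal sketch (the arithmetic is mine, derived only from the two ranges quoted above; WinPlotter's actual fit will differ somewhat because CI is measured over a fixed log-exposure interval, not the whole wedge):

```python
# Negative densities (Y, read in UV) plotted against Visual-mode wedge
# values: the X-axis spans 3.00 log units even though the wedge actually
# passed a UV range of 2.77, so every X interval is stretched by
# 3.00/2.77 and the fitted slope (CI) drops by the reciprocal factor.
stretch = 3.00 / 2.77             # X-axis stretch factor, ~1.083
true_ci = 0.76                    # CI from the matched (UV/UV) plot
plotted_ci = true_ci / stretch    # CI if X uses Visual-mode units
print(f"Slope shrinks by {1/stretch:.3f}x -> plotted CI ~ {plotted_ci:.2f}")
```

The shrunken value (about 0.70) lands close to the 0.69 reported for the Visual-based plot, which supports the apples-to-oranges explanation.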
With respect to your two questions.
"What is the purpose of the step wedge measurements - is it to tell the software what relative exposures the film had, or is it to calibrate the densitometer against the known UV transmission of a Stouffer step wedge?"
The purpose of the step wedge measurement is to establish a common unit of log density for the X and Y axes.
"Can the software be used with other step wedges?"
Sure, but you need to measure the step wedge with the same mode that you will use to measure the test strips that are made from it.
The color of the light being used to expose the film is irrelevant. The density could come from red, green, blue or UV light. The color might affect the contrast or density of the test strips but that would make no difference to the plotting.
Originally Posted by Donald Miller
The key with the plot is that you must have the same measurement of density on the X and Y axes. If not, the curve will be artificially distorted because the units of measurement are of different length on the two axes.
Thanks Sandy...I understand now.
The purpose of the H&D curve you get by exposing the step wedge on film is to see how the image will print on paper. The film should be exposed by the light you expect to use when exposing actual photographs. The step tablet as developed on film is what the paper will see and so should be read by blue light for graded paper. VC paper can be a problem as it is sensitive to blue through yellow but gives different contrasts with blue than with yellow. I have not found a really good way to measure the printable density range of stained negatives so as to get a good working print on VC without test strips.
Originally Posted by Donald Miller
"The film should be exposed by the light you expect to use when exposing actual photographs. The step tablet as developed on film is what the paper will see and so should be read by blue light for graded paper."
'Just out of interest,' how do you think the step wedge densities (rather than the step wedge image on film) should be read? By the mode that most closely represents the light used to expose the film (e.g. visual), or by the light that will be used to measure the film negative densities (e.g. blue or UV)?