Ralph Lambrecht's Film Testing Spreadsheet

Discussion in 'B&W: Film, Paper, Chemistry' started by CPorter, Jun 13, 2012.

  1. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    ...has anyone ever used it? I finished tests with 4x5 TXP in HC-110 (1:63) at 68°F. I don't like the results. I'm not attributing anything to the spreadsheet; I'm sure it's something I did, but I'm not sure what. I did not use a sixth sheet for film speed, as I'm not sure how Ralph recommends doing the actual film speed test (I haven't got his book yet). I input a speed consistent with his statistically typical speed of about 2/3 of box speed.

    The spread between the 4 min and 16 min average gradients seems quite narrow to me, resulting in a narrow range of development times from just below "N" to just below N+2. I certainly did not expect this, as I was hoping for clear +2 to -2 development times. I developed in the Combi-Plan tank, using a 4/10/1 inversion agitation cycle (4 inversions in 10 seconds every minute); there was no temperature drift, and the processing is very consistent. The .xls attachments (I hope you can view them) contain the TXP test I finished recently, using the 4, 5.5, 8, 11, and 16 min times, and a TMX (XTOL 1+1) test I did last year. The TMX test was done before I had this spreadsheet; the times are my own, determined from testing as described here, and I simply input the densities from those development times into the spreadsheet. The XTOL results look much more normal to me, an obvious difference being that it was an ISO 100 film versus ISO 320 TXP.

    So this TXP test seems strange to me, and I exposed the step tablet the same way in both tests, i.e., a Zone X exposure. One difference is that I used a middle-gray test target in the XTOL test and a white test target in the TXP test, though I don't see why that should matter. The TXP test results simply do not look right. I welcome any thoughts, especially Ralph's.

    Thanks.
     

    Attached Files:

    Last edited by a moderator: Jun 13, 2012
  2. lxdude

    lxdude Member

    Messages:
    6,937
    Joined:
    Apr 8, 2009
    Location:
    Redlands, So
    Shooter:
    Multi Format
    I would imagine Ralph has used it.
     
  3. wildbill

    wildbill Member

    Messages:
    2,851
    Joined:
    Nov 28, 2004
    Location:
    Grand Rapids
    Shooter:
    Multi Format
    :0
    That guy knows his shit, and I'm sure he'll chime in here.
     
  4. Bill Burk

    Bill Burk Subscriber

    Messages:
    4,890
    Joined:
    Feb 9, 2010
    Shooter:
    4x5 Format
    In 4 min of HC-110 (1+63, a.k.a. Dilution H), your raw data shows you get 1.0 NDR over 8 stops.

    That's "N-1" the way I work.

    It is not a huge difference that the spreadsheet calls this "N-0.4" (if I read the formula correctly, Ralph uses the constants of 1.2 NDR over 7 stops to define "N").

    Relatively speaking, I think you are getting more development activity than you might want... since you got a lot of density in 4 minutes.

    If development results are even, then there you have it: You have a rapid process. If you are getting uneven results, then you might pick XTOL for any film you mark for "N-1" or less development.
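    For what it's worth, here is a minimal sketch of that arithmetic (my reading of the constants above, not Ralph's actual spreadsheet formula):

    ```python
    # Rough sketch (not Ralph's formula): average gradient is just the
    # negative density range divided by the log-H range it spans,
    # assuming one stop = 0.3 log-H and "N" = 1.2 NDR over 7 stops.

    STOP = 0.3  # one stop of exposure in log-H units

    def average_gradient(ndr, stops):
        """Negative density range divided by the log-H range it spans."""
        return ndr / (stops * STOP)

    print(f"'N' per the spreadsheet constants: {average_gradient(1.2, 7):.2f}")  # ~0.57
    print(f"4-minute raw data (1.0 over 8):    {average_gradient(1.0, 8):.2f}")  # ~0.42
    ```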

    p.s. I work for Kodak but the opinions and positions I take are my own and not necessarily those of EKC.
     
  5. RalphLambrecht

    RalphLambrecht Member

    Messages:
    8,080
    Joined:
    Sep 19, 2003
    Location:
    Central flor
    Shooter:
    Multi Format
    Give me a day or two to look at your data. I've used, or helped people use, the spreadsheet in over 50 cases, and it has worked well every time!
     
  6. Usagi

    Usagi Member

    Messages:
    360
    Joined:
    Apr 17, 2007
    Location:
    Turku, Finla
    Shooter:
    Multi Format
    I have used Ralph's spreadsheets a lot. The spreadsheet has always given the same, or almost the same, CI values as my previous paper calculations and measurements, including BTZS and WBM rulers (and calculations). When there has been a noticeable difference, it has usually been caused by the chosen location of the speed point (I used 0.1 earlier) and the Zone VIII target density used. If the curve isn't linear, these can introduce a lot of variation.

    View attachment 52460

    In the case of low-contrast results like those in the old picture above (curves for 4, 5.5, and 8 minutes), the measured curve won't reach the Zone VIII density.
    Where do you get the right CI then? You can extract it from the measured part of the curve, or extend the curve by approximating its shape, etc. If the low-contrast curve is upswept, the CI calculation may vary a lot, unless the test is done by extending the test target's range with an ND filter or with two exposures of the step wedge.
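    Here is a rough sketch of the "extend the curve" approach with made-up numbers (which points to fit is a judgment call, so treat it as an approximation):

    ```python
    # Rough sketch (hypothetical data) of extending a low-contrast curve that
    # never reaches the Zone VIII target density: fit a straight line to the
    # upper portion of the measured curve and extrapolate it.

    import numpy as np

    log_h   = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5, 1.8])          # relative log exposure
    density = np.array([0.05, 0.12, 0.24, 0.38, 0.52, 0.66, 0.80])   # net density (made up)

    speed_point_d = 0.10   # speed point density (I used 0.1 earlier)
    target_viii_d = 1.20   # example Zone VIII target density

    # Fit the upper part of the curve and extrapolate to the target density.
    slope, intercept = np.polyfit(log_h[-4:], density[-4:], 1)
    log_h_at_viii = (target_viii_d - intercept) / slope

    # Locate the speed point by interpolation, then compute an approximate gradient.
    log_h_at_speed = np.interp(speed_point_d, density, log_h)
    gradient = (target_viii_d - speed_point_d) / (log_h_at_viii - log_h_at_speed)
    print(f"approximate gradient: {gradient:.2f}")
    ```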
     
  7. Bill Burk

    Bill Burk Subscriber

    Messages:
    4,890
    Joined:
    Feb 9, 2010
    Shooter:
    4x5 Format
    I think I was wrong about the HC-110 being overactive. I now think it is just about right.

    I graphed this on paper and overlaid a transparent Contrast Index meter.

    It was difficult to find the Base + Fog point to zero the meter, so I fudged by moving it up.

    When I fudged the zero, I came out with CI's:
    4 Min > CI 0.4
    5.5 Min > CI 0.45
    8 Min > CI 0.5
    11 Min > CI 0.6
    16 Min > CI 0.7

    For a 1.0 LER, this set came very close to being:
    4 Min > N-2
    5.5 Min > N-1
    8 Min > N
    11 Min > N+1
    16 Min > N+2

    When I input my CIs as average gradients into the blue cells of Ralph's spreadsheet:
    4 Min > N-3
    5.5 Min > N-2
    8 Min > N-1
    11 Min > N
    16 Min > N+1

    Summary:
    1) You expected and got very clear N-2 to N+2.
    2) The data points are hard to fit to curves.
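    If it helps, here is the back-of-envelope arithmetic I used to line those CIs up with N numbers (a rough sketch assuming "N" covers 7 whole stops at 0.3 log-H per stop, and that each N step adds or removes one stop; this is my shorthand, not Ralph's spreadsheet formula):

    ```python
    # Back-of-the-envelope sketch: the gradient needed to fit an N+/-k subject
    # range into a chosen LER, assuming "N" spans 7 stops (2.1 log-H) and each
    # N step is one stop (0.3 log-H). Not the spreadsheet's formula.

    def required_gradient(ler, n_offset):
        """Gradient that compresses a (7 - n_offset)-stop range into the LER."""
        return ler / ((7 - n_offset) * 0.3)

    for n in range(-3, 3):
        g10 = required_gradient(1.0, n)   # my LER of 1.0
        g12 = required_gradient(1.2, n)   # Ralph's 1.2
        print(f"N{n:+d}: ~{g10:.2f} for LER 1.0, ~{g12:.2f} for LER 1.2")
    ```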
     
  8. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    Well, thanks Bill, but what am I missing here? Why are you using an LER or log-H of 1.0 to make these points (not that I don't appreciate them :smile:) when the sheet specifies 2.1? So, just considering what the sheet is telling me with my data, I don't have clear +2 to -2 dev times. I guess I don't follow you.

    I should probably state that I am not really interested in the average gradient results of the spreadsheet, just the "N" dev times and the associated effective film speeds. I adjust the gradients as Ralph suggests to tweak the curves as needed.
     
  9. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    Bill,

    Did you mean to say an NDR of 1.2 over about 8 stops? I see a log-H range of 2.23 (7 1/2 stops) presented in the 4 min results of the "Curve 1" tab. By raw data, do you mean the "Input Data" tab? In that case, I see a 4 min density range of 1.05 over a log-H range, generated from the tablet, of 1.95 (6.5 stops). I'm just curious about your reference to an NDR of 1.0. Like I said, I must be missing something.
     
  10. Bill Burk

    Bill Burk Subscriber

    Messages:
    4,890
    Joined:
    Feb 9, 2010
    Shooter:
    4x5 Format
    I think this picture will be better than a thousand words...

    CPorter HC110 Chart

    Exposure (the step wedge) goes horizontally over 2.1.
    That makes "N" 7 whole stops, which fits the Zone System.

    Negative Density Range goes vertically.
    I use 1.0 for personal reasons to hit between Grade 2 and 3 Galerie.
    Ralph goes for 1.2 and that is fine. I drew them both.

    The curves are accurate. But I was very hasty drawing the marks "NDR 1.2" ... "NDR 1.0" and "N+, N, N-" - I just wanted to see approximately where the "N" was falling.

    It's without flare, to keep it simple.

    When I take my densitometer measurements, I measure and note Base + Fog for each sheet. Then I zero the densitometer on Base + Fog for each sheet, so all my measurements are "above B+F". It may help us if you describe how you zeroed the densitometer or kept tabs on Base + Fog.
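    In case it helps to see the bookkeeping I mean, here is a tiny sketch (made-up densities; zeroing on air and subtracting each sheet's B+F gets you to the same "above B+F" numbers as zeroing on B+F directly):

    ```python
    # Tiny sketch of the bookkeeping: read total density with the densitometer
    # zeroed on air, note B+F per sheet, and subtract it so everything is
    # "above B+F". Numbers are made up for illustration.

    base_plus_fog = {"4 min": 0.28, "5.5 min": 0.29, "8 min": 0.31}  # per-sheet readings

    def above_b_plus_f(total_readings, b_plus_f):
        """Convert total (gross) densities to net densities above base + fog."""
        return [round(d - b_plus_f, 2) for d in total_readings]

    print(above_b_plus_f([0.31, 0.45, 0.78, 1.12], base_plus_fog["4 min"]))
    # -> [0.03, 0.17, 0.5, 0.84]
    ```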
     
  11. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    Thanks for that, Bill. Per Ralph, the density readings for the "Input Data" tab are to be input as total density, so they include b+f; I sent him a message a while back asking specifically about that. I asked because I have always graphed the curve based on net density; that's how I learned it in The Negative, anyway. Per that resource, the b+f level is simply "printed through" and isn't given any weight in reading the curve. Ralph's sheet includes it in the totality of the curve, which I find interesting to see. In his literature, he specifically states to zero the densitometer with nothing in the light path, so the fb+f of each sheet can be read. I'm looking forward to Ralph's comments. My "net" density curves produce the same narrow range of resulting "N" development times, as they should.

    I have input "net" densities into the spreadsheet, and all that seems to differ, as expected, is that the "cross hairs" indicating Dmin and Dmax drop to the actual Dmin and Dmax densities specified at the input tab. On the "curve family" tab, "net" density data shows up as a level red line at Dmin across all curves, as opposed to a slanted red line indicating the fluctuation of fb+f from 4 min to 16 min of development.

    On the individual curve tabs, using total density, I placed a red line (the eyeball method) at the Dmin and Dmax densities; the cross hairs are above them, and the difference between the two is the b+f level, which of course differs with each sheet.

    My densitometer calibrates perfectly with its calibration tablet and indicates good linearity, so I have no doubt that the curves as measured are right. But I question the high densities I generated that resulted in the curves shown; I believe something is amiss. If I have, as you say, clear +2 to -2 times, then there is the immediate question of why that is not indicated in the spreadsheet, at least the way I am interpreting it.
     
    Last edited by a moderator: Jun 14, 2012
  12. Bill Burk

    Bill Burk Subscriber

    Messages:
    4,890
    Joined:
    Feb 9, 2010
    Shooter:
    4x5 Format
    Hi Chuck,

    That is fine to do it the way Ralph said. His spreadsheet is designed to take net densities.

    So that means my drawings with straight lines across would ideally need to be slanted lines as well.

    It may come down to a scenario where the curve fitting is difficult for the computer. Or maybe it illustrates the difference between the points selected to calculate "Contrast Index" versus "Average Gradient".
     
    Last edited by a moderator: Jun 14, 2012
  13. Stephen Benskin

    Stephen Benskin Member

    Messages:
    1,633
    Joined:
    Jan 7, 2005
    Location:
    Los Angeles
    Shooter:
    4x5 Format
    I've made a simple Time / Gradient graph in Excel using the data from Chuck's spreadsheet. While the rate of development isn't very steep, except for the 8 minute test, there doesn't appear to be a problem with the overall progression.

    Time gradient Curve - Chuck.jpg
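    If anyone wants to redraw it outside of Excel, here is a quick sketch (the gradient values below are placeholders, not the exact numbers from Chuck's sheet; assumes matplotlib is available):

    ```python
    # Roughly the graph described above, sketched in Python instead of Excel.
    # Gradient values are placeholders, not the exact ones from the sheet.

    import matplotlib.pyplot as plt

    times     = [4, 5.5, 8, 11, 16]             # development time, minutes
    gradients = [0.55, 0.58, 0.66, 0.68, 0.78]  # illustrative only

    plt.plot(times, gradients, "o-")
    plt.xlabel("Development time (min)")
    plt.ylabel("Average gradient")
    plt.title("Time / gradient curve")
    plt.show()
    ```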
     
    Last edited by a moderator: Jun 16, 2012
  15. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    That's good to know, but I can't look at the "development compensation vs. development time" graph and think that it is right, at least at the moment.
     
  16. Stephen Benskin

    Stephen Benskin Member

    Messages:
    1,633
    Joined:
    Jan 7, 2005
    Location:
    Los Angeles
    Shooter:
    4x5 Format
    Chuck,

    If Ralph's spreadsheet calculates gradient the way he describes in his book, its base point is higher on the curve than what is normally used. Placing the base point higher on the curve mostly factors in the straight-line portion of the curve and produces a higher gradient value than a reading that incorporates more of the toe. The two shorter-timed tests of yours show a bit of a toe. There's a good chance this is the reason for the higher-than-expected gradients. Like Bill indicated, if you measure the curves using a different methodology, like CI or an average gradient with a 0.10 base point, you will get different results. I believe you might find them to be closer to what you are expecting.
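    Here is a rough sketch of that effect on a hypothetical toe-shaped curve (not Chuck's data, and not the spreadsheet's exact method; it just compares a low and a higher base point over the same 1.2 density span):

    ```python
    # Rough sketch (hypothetical toe-shaped curve) of why a higher base point
    # yields a higher gradient: the toe drags the low-base measurement down.

    import numpy as np

    log_h = np.linspace(0.0, 2.4, 25)
    density = 0.8 * log_h - 0.25 * (1 - np.exp(-2.5 * log_h))  # toe + straight line

    def gradient_from_base(base_density, span=1.2):
        """Average gradient from the point at `base_density` up a fixed 1.2 NDR."""
        h0 = np.interp(base_density, density, log_h)
        h1 = np.interp(base_density + span, density, log_h)
        return span / (h1 - h0)

    print(f"base 0.10: {gradient_from_base(0.10):.2f}")   # lower value
    print(f"base 0.25: {gradient_from_base(0.25):.2f}")   # higher value
    ```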
     
  17. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    I thought of this last night and changed to 0.1 for Relative Dmin and 1.3 for Relative Dmax. The range of development compensations I got was from N-1 to just over N+1, as seen in the attachment. I admit I don't understand why +2 and -2 are not obtained in the final summary; I guess it's over my head, IDK. Ralph mentions that a density range of 1.2 runs from the beginning of Zone II to the end of Zone VIII (presumably 0.17 and 1.37). Since the zone locations are the midpoints of the zones, I see that as actually running from Zone I 1/2 to Zone VIII 1/2. However, I don't know if that still holds when lowering the relative Dmin and Dmax values; 0.1 and 1.3 are what I have traditionally used, and they range from Zone I to Zone VIII.

    Also, I tooled around with the charts, just to add things graphically that help me when evaluating them, but that's just me.
     

    Attached Files:

  18. Lee L

    Lee L Member

    Messages:
    3,246
    Joined:
    Nov 17, 2004
    Shooter:
    Multi Format
    Hope you don't mind but I'm attaching a version in Open Document Format for those who haven't paid Microsoft to use spreadsheets. Let me know if you want it deleted.

    Lee

     

    Attached Files:

  19. Stephen Benskin

    Stephen Benskin Member

    Messages:
    1,633
    Joined:
    Jan 7, 2005
    Location:
    Los Angeles
    Shooter:
    4x5 Format
    Chuck, could you give 0.10 and 1.20 a try and see if there's any difference?

    How something is measured affects the results. This is one of the reasons why I tend to emphasize theory. Most of the time the variance in results will be inconsequential, but under certain circumstances there can be significant differences. The idea of the "best" methodology isn't about whether it works or not, but whether it produces the most consistent results under the greatest number of situations. Take gamma, for instance. It works very well for short-toed, straight-line curves, but not for longer-toed curves or curves without a significant straight-line portion. Defining what Normal development is, and the pluses and minuses, is another example. Most methods work great for determining normal but begin to differ in results when working with the more extreme ranges of scenes encountered. Oftentimes an alternative method can be superior to what is considered "the best" method under a certain set of conditions, but will fail under others. And this doesn't even begin to address how difficult it is to compare results from different people who use different methods.

    If you take a look at the gradient/time curve, you will notice your results don't exactly create a smooth arc. There's nothing wrong with that; there will always be slight variations in any test, and that is what your results indicate. But if you add 0.02 to the 8 minute test and subtract 0.02 from the 4 minute test, a more "realistic" depiction results. From there, you can project the times you need for +2 and -2. I get about 18.5 minutes for the gradient Ralph uses for +2 and maybe a little under 3 minutes for -2.
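    A minimal sketch of that projection step (placeholder gradients and targets, and the smoothing here is just a quadratic fit of log time against gradient, not what my program actually does):

    ```python
    # Rough sketch: interpolate gradient against log(time) and read off a time
    # for a target gradient. Gradients and targets are placeholders, not the
    # actual values from Chuck's spreadsheet.

    import numpy as np

    times     = np.array([4.0, 5.5, 8.0, 11.0, 16.0])     # minutes
    gradients = np.array([0.41, 0.44, 0.50, 0.57, 0.68])  # placeholder values

    def time_for_gradient(target):
        """Fit log(time) vs gradient with a smooth curve and evaluate at `target`."""
        coeffs = np.polyfit(gradients, np.log(times), 2)
        return float(np.exp(np.polyval(coeffs, target)))

    print(f"target gradient 0.70 -> ~{time_for_gradient(0.70):.1f} min")
    print(f"target gradient 0.40 -> ~{time_for_gradient(0.40):.1f} min")
    ```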
     
  20. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    I've no problem with that..........
     
  21. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    Not much difference there... the offset is due to the sheet's formulas being set up for an NDR of 1.2, while your suggestion was an NDR of 1.1. Maybe Ralph will chime in on it soon.
     

    Attached Files:

  22. Stephen Benskin

    Stephen Benskin Member

    Messages:
    1,633
    Joined:
    Jan 7, 2005
    Location:
    Los Angeles
    Shooter:
    4x5 Format
    That's interesting. Thanks for taking the time to do that. I decided to plug your numbers into my program. I got values very close to those Bill got with his hand drawn curves and CI overlay.

    Results from my program:

    4 min 0.41
    5.5 min 0.44
    8 min 0.50
    11 min 0.57
    16 min 0.68

    My program uses a variation of contrast index that uses 0.10 as a base and draws an arc 2.0 log-H units to the right, as described in Photographic Materials and Processes. Just to help illustrate how the various methods of measurement can affect the results, my program also does Ilford's G-bar with a log-H range of 1.50 and an average gradient with a log-H range of 1.80. I got 0.72 for the 1.80 average gradient method.
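    Just to make that point concrete, here is a rough sketch (not my actual program, and a made-up curve) isolating only the effect of the measurement span; real CI and G-bar also define their end points differently:

    ```python
    # Rough sketch: average gradient taken from a 0.10 net-density point over
    # different fixed log-H spans, on a made-up, slightly upswept curve.
    # This only isolates the effect of the span, nothing else.

    import numpy as np

    log_h = np.linspace(0.0, 2.6, 27)
    density = 0.10 + 0.55 * log_h + 0.05 * log_h**2   # hypothetical net densities

    h0 = np.interp(0.10, density, log_h)   # point at 0.10 net density

    def avg_gradient(span):
        """Gradient between the 0.10 point and the point `span` log-H to its right."""
        d1 = np.interp(h0 + span, log_h, density)
        return (d1 - 0.10) / span

    for span in (1.5, 1.8, 2.0):
        print(f"log-H span {span:.1f}: gradient {avg_gradient(span):.2f}")
    ```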

    And here's a Time/CI Graph. As you can see, it's smoother than the one derived from the values from Ralph's spreadsheet, which is a positive indication as to the validity of the results.

    Time gradient Curve - Chucks data - My program.jpg
     
    Last edited by a moderator: Jun 16, 2012
  23. CPorter

    CPorter Member

    Messages:
    1,662
    Joined:
    Feb 2, 2004
    Location:
    West KY
    Shooter:
    4x5 Format
    That's good to know too, though I have a difficult time believing that the "N" dev times are right. I think I did something wrong; it's not due to the spreadsheet. So, while I'm waiting for Ralph to have a say on why the results are what they are, I welcome others. I just can't believe that 4 minutes of development is almost "N"! That would make -1 or -2 ridiculously short, IMO. For the life of me, I can't see what aspect of my process could be the reason for these results. I lean toward the tablet getting too much exposure, but IDK. I exposed the tablet the way I have for all my previous testing, via Shaeffer. Perhaps that does not work well with Ralph's spreadsheet, IDK.
     
    Last edited by a moderator: Jun 17, 2012
  24. Bill Burk

    Bill Burk Subscriber

    Messages:
    4,890
    Joined:
    Feb 9, 2010
    Shooter:
    4x5 Format
    I have a very important observation to make here.

    Because of the long toe and upswept highlights in the curve family from your HC-110 and Tri-X test, it is very important to your prints where you place your shadow exposures.

    The lower on the toe you place your exposure, the less likely you are to get chalky highlights. (So do what Ansel Adams would do, not what Bruce Barnbaum would do.)
     
  25. Stephen Benskin

    Stephen Benskin Member

    Messages:
    1,633
    Joined:
    Jan 7, 2005
    Location:
    Los Angeles
    Shooter:
    4x5 Format
    I don't know if the spreadsheet plots the film curves in sufficient detail to do this, but one way to check if the gradient values are accurate is to make some mathematically derived density range projections and see if they match up with the actual curves.

    All you need is the Rise / Run = Gradient equation. To determine the density range of any given log-H range, you take the gradient from the spreadsheet and multiply it by a desired log-H range.

    For example:

    5.5 min test
    Gradient: 0.57
    log-H range: 2.10

    Gradient * Run = Rise
    0.57 * 2.10 = 1.20

    or you can use NDR / Gradient = log-H range
    1.20 / 0.57 ≈ 2.10
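    A tiny sketch of that check, if you want to script it (the small difference from 2.10 in the second result is just rounding of the 0.57 gradient):

    ```python
    # Tiny sketch of the projection check described above.

    def projected_ndr(gradient, log_h_range):
        """Rise = gradient * run."""
        return gradient * log_h_range

    def required_log_h_range(ndr, gradient):
        """Run = rise / gradient."""
        return ndr / gradient

    print(f"{projected_ndr(0.57, 2.10):.2f}")          # 1.20
    print(f"{required_log_h_range(1.20, 0.57):.2f}")   # 2.11
    ```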

    May I suggest testing both 0.10 and 0.19 as the base for the log-H range. Don't expect it to be a perfect fit. Curve shape will have an effect, but this should give you an idea if there is a problem with the gradient values or not.

    Here's an example with one of my own tests. I use an NDR of 1.05 and, for normal, a luminance range of 2.20 with 0.40 flare, for an aim log-H range of 1.80. With a film developed to a CI of 0.58, the NDR of 1.05 should be achieved over a log-H range of 1.81.

    NDR check from projection.jpg

    Just for the sake of clarity, there is no definitive set of gradient values for the different stages of development. Not only is there the choice of a diffusion or condenser enlarger, but the choice of paper (including alternative methods), personal taste for the NDR, what is considered the average luminance range, and flare also play their part. The following example illustrates the results from four different models for diffusion enlargers. While three of the four results are very similar, they are determined using different variables. The Practical Flare Model is mine and is determined using a combination of a fixed-flare and variable-flare approach.

    CI Development Model Comparison.jpg
     
    Last edited by a moderator: Jun 17, 2012
  26. RalphLambrecht

    RalphLambrecht Member

    Messages:
    8,080
    Joined:
    Sep 19, 2003
    Location:
    Central flor
    Shooter:
    Multi Format
    Not surprisingly, I get the same results as Steve when I work my spreadsheet with the OP's data! All is good. Congrats on an obviously successful test!