Great thread thanks.
After my first attempt to conduct this film test (where there was way too much flare in my test setup) I have now repeated and remeasured the 5 negatives and filled out the spreadsheet again.
In summary I think I have decent results, though I have some questions about the toe area of the film. Contrary to my first effort, where for some reason I didn't give the step wedge sufficient exposure and ended up with a long toe and low Dmax, this time I have barely any toe present!
My plan now is to replace some of the points, say from 28-32, with a toe interpolated from one more reading at D=3.8 (see below). I will also need to space out the y-axis values over that region, because the default step is 0.1 and I will need it to be about 0.2.
I think this points to a flaw in the spreadsheet as it assumes you have sufficient toe present in the data.
Anyway, any comments on my plan and spreadsheet would be appreciated.
This lack of toe is despite me bracketing from a nominal exposure (metered on the mid-grey step) down to 3 stops below. I also 'nominally' rated the HP5+ at 200ASA. If I had rated it at 400ASA and bracketed +/- 1 stop I would probably have sufficient toe. Fortunately I also included an X-rite cal step wedge, whose darkest step is D=3.8 (the Stouffer 31-step only goes to D=3.05). That permitted me to get ONE point in the toe region, and I can then interpolate/guess the others between D=3.05 and D=3.8.
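For what it's worth, here's a minimal sketch of the interpolation step I have in mind, in Python. The density values are hypothetical placeholders, not my actual readings, and straight-line interpolation is only a rough stand-in for the real toe curvature:

```python
# Linear interpolation of toe densities between the last Stouffer step
# (D = 3.05) and the single X-rite reading (D = 3.80).
# The negative-density endpoints (0.22, 0.14) are made-up placeholders.

def interp_toe(x0, y0, x1, y1, xs):
    """Linearly interpolate negative densities ys at step-wedge densities xs."""
    slope = (y1 - y0) / (x1 - x0)
    return [round(y0 + slope * (x - x0), 3) for x in xs]

# endpoints: (step-wedge density, measured negative density)
ys = interp_toe(3.05, 0.22, 3.80, 0.14, [3.2, 3.4, 3.6])
print(ys)
```

A real toe flattens towards Fb+f, so the true values would sit a little above a straight line; this just fills the gap until I have better data.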
It does look like you've overexposed the step tablet, although you got results good enough to work with. I am a little concerned about the second Fb+f reading, if I am correct in assuming that your Fb+f readings are 0.16, 0.13, 0.16, 0.19, 0.20? Film base plus fog doesn't decrease with increased processing, and the drop is too big to attribute to densitometer accuracy alone. As the next step (31) appears to be consistent with the data from the other test wedges, it probably isn't operator error either. The 0.03 density drop for the Fb+f could be pointing to an error in the testing procedure. Don't discount the possibility that it might not be the one with the error; if there is a problem, perhaps it is with the other tests.
Thanks for looking, Stephen. All is OK with the FB+F readings; I didn't make it sufficiently clear. The FB+F readings are stored as comments in the xls spreadsheet in cells G41 to K41 (those comments aren't visible in the screen capture, but are in the actual xls file). They are 0.11, 0.12, 0.14, 0.15 and 0.14. All readings have a tolerance of +/- 0.01, so I am happy they are monotonically increasing after taking the tolerance into account.
The density values you read off were from cells G42 to K42 and are in fact those for the darkest step on the X-rite tablet (D=3.80), which I exposed alongside the 31-step Stouffer tablet. The value I entered into cell G42 is actually out of place and is understandably confusing (just ignore it if you can't follow what I've done after reading the rest of this). It results from a discontinuity of 2 stops, as explained below.

The reason the film developed for 8 minutes has a higher measured density for all its steps, including the one at cell G42, is that I was forced to read the entire set of values off a neg frame exposed 2 stops higher (1/15s rather than the nominal 1/60s I used for the remaining films) and then shift everything by two stops along the y-axis (vertical transmission density axis). This happened because one of the 5 films was ruined during development: as I tapped the tank to release the air bubbles, the lid popped off and let stray light in, fogging most of it! I managed to recover by cutting another roll in half and developing one half for 8 mins and the other half for another time. I had filled all rolls by bracketing with 4 shots, and the nominal exposure I chose to use in the spreadsheet (1/60s) ended up on one half of the cut roll, not the other, so I had to use the 1/15s frame and compensate by shifting the y-axis values by 2 stops = 0.6 density units. I had to guess/interpolate some of the numbers because this method meant I was missing values for steps 26-31.

Additionally, the value of 0.16 in G41 was actually for step ((38.5-2x0.3)=32.5), not step 31, so the shifted value in G41 is slightly lower than what it really was; for the sake of not upsetting the automated curve-fitting equations Ralph set up, I shaved a bit off it. It was the only point where I compromised on data integrity, and the discrepancy could easily be explained by the fact that the ratio of my two shutter speeds, (1/15)/(1/60), wasn't exactly 2 stops.
Given the 8 min graph looks well placed compared to the others in the Family of Curves then I am happy my recovery steps were successful.
So, I would like to try this test with my medium format film... However, my only controllable light source is my condenser enlarger. Is there a way to determine how long my exposure time (i.e., the time the enlarger is switched on) should be for contact printing at a given enlarger height and lens aperture? I have a spot meter and a gray card, and for some reason that seems like all I should really need to determine a length of time for which to expose the film. But I'm clueless about how to determine a contact exposure time given a light meter's reading of f-stop and shutter speed.
This Google search will show you various people's experiences with that.
I'm unsure if your light meter and grey card are sufficient. I think you need to use an integrating light meter or perform some preliminary tests, but I neither read nor performed the testing as described in BTZS so you'd best wait for replies from those who have used it or alternatively read through the search results I gave you.
Phil Davis has a really nice approach using an enlarger. I use a calibrated sensitometer, so my approach is slightly different, but fundamentally they are the same. You need to determine how much light is required to produce the required amount of exposure through a chosen step tablet density to produce a target density.
The speed equation is 0.8 / Hm (exposure in mcs at the point where the film has a density of 0.10 over Fb+f). For a 400 speed film, the aim exposure to produce a density of 0.10 over Fb+f is 0.8 / 400 or 0.0020 mcs. Generally you want to have this fall on the third or fourth step of the step tablet (0.15 step). The step tablet's D-Max is 3.05 – 0.45 (3 steps) = 2.60 density. The equation to find Transmittance is:
Transmittance = Transmitted Light / Incident Light
Converting the equation to find Incident Light:
Incident Light = Transmitted / Transmittance
We already have the required transmitted light for a 400 speed film – 0.0020 mcs
Transmittance is the reciprocal of opacity or 1/ 10^density: 1/10^2.60 = 0.0025
0.002 / 0.0025 = 0.8 mcs
I like to use footcandles to measure incident light, so for a shutter speed of 1/125 you will need:
(0.8 / 10.76) * 125 = 9.29 fc.
9.29 fc * 10.76 (convert to metercandles) * 1/125 (shutter speed) = 0.8 mcs
Transmitted light = Transmittance * Incident
.0025 * 0.8 = 0.0020
Equation for film speed 0.8 / Hm (mcs at 0.10 over Fb+f)
0.8 / 0.0020 = 400
For a 400 speed film, exposing a step tablet with 9.29 footcandles for 1/125 second should produce an exposure through the 2.60 density step of 0.0020 mcs, producing a density of 0.10 over Fb+f.
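The calculation above can be strung together as a few lines of Python (a sketch of the arithmetic only; rounding transmittance to 4 places matches the worked figures):

```python
# Find the incident light (in footcandles) needed so a 400-speed film gets
# its speed-point exposure (0.8/400 mcs) through the 2.60 density step.

film_speed = 400
speed_point_mcs = 0.8 / film_speed               # 0.0020 mcs aim exposure
step_density = 3.05 - 3 * 0.15                   # 2.60, three steps up from D-Max
transmittance = round(1 / 10 ** step_density, 4) # ~0.0025
incident_mcs = speed_point_mcs / transmittance   # 0.8 mcs
shutter = 1 / 125
footcandles = incident_mcs / 10.76 / shutter     # mcs -> fc, over the exposure time
print(round(incident_mcs, 2), round(footcandles, 2))
```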
You can use the reflected meter’s user manual to calculate how to determine footcandles or meter candles using the meter and a gray card.
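If it helps, here's a hedged sketch of one way to go from a reflected spot-meter EV reading off an 18% gray card to footcandles. The calibration constant K = 12.5 cd·s/m² is an assumption; meters vary (some use 14), so check your meter's manual rather than trust these numbers:

```python
import math

# Reflected meter off an 18% gray card -> footcandles falling on the card.
# Assumes reflected-meter calibration constant K = 12.5 cd*s/m^2 (meter-dependent)
# and a Lambertian card, so illuminance E = pi * luminance / reflectance.

def ev_to_footcandles(ev, iso=100, K=12.5, reflectance=0.18):
    luminance = K * 2 ** ev / iso             # cd/m^2 read off the card
    lux = math.pi * luminance / reflectance   # illuminance on the card
    return lux / 10.76                        # 1 fc = 10.76 lux

print(round(ev_to_footcandles(10), 1))
```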
I’m terrible with unit conversions, so I hope I got these right.
OK, after a bit more work with the spreadsheet, I have now shifted the relative placement of my curves up to simulate reducing the exposure (if you remember, I had overexposed the Stouffer step wedge, thereby not getting into the toe region).
This shifting permitted me to create the toe for each curve. I already had two points in the toes on the curves, the FB+F and the density of one step from an X-Rite cal film having a Tx_density of 3.80.
Now having done this, I notice that I probably didn't develop for long enough. Given I was using Xtol 1+2, even though I extended out to nearly 32 min @ 20degC, it looks like I needed to take it out to perhaps >50 minutes in order to get out to N+3 (avg. gradient = 1.0) on the curves. Does that sound reasonable? If so, then to save me exposing yet another film, can I assume that it is now OK for me to extrapolate my existing curve of Zone System [N] vs. development time [min] (bottom left of the 4 curves) out to say t=50 minutes?
One thing which makes me wary of doing this is that Ralph's example had him reaching N+3, and his corresponding curve is much more linear above N=0 rather than concave down (like mine). Now of course he used Tmax100 in ID-11 (I used HP5+ in Xtol 1+2), which could account for the different shapes, but I'm not familiar enough with the shapes to know if my shape is OK and whether I can extrapolate it or not.
My test summary:
Ralph's Test Summary:
My Family of Curves:
There's nothing wrong with the shape of the Time/Gradient curve. The shape depends on the film/developer/methodology combination.
One of the advantages of plotting a Time/Gradient curve is that you can extrapolate data. One potential problem of attempting to extrapolate too much above the available data is there's no information about when the film hits gamma infinity.
Looking at your curve, I'm not sure if the film will make a gradient of 1.00.
I have done some research and the maximum contrast of HP5 plus (independent of the chosen dev or concentration or agitation etc) is about 0.9. 
This isn't a big problem for me, as I don't live in England where the sun seldom shines; most of my plans for this film are outdoors with high SBR, and worst-case scenarios will certainly require no more than N+2 development.
 Roger Hicks: "I don't think that Ilford can get much above about 0.9 for gamma infinity on HP5, and that was in D19R, but equally, many films of the 30s could go well above 1. A figure I seem to remember is 1.3 to 1.5, but I can't verify that without effort. " link.
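To illustrate why extrapolating past the data is risky: a common way to model the time/gradient curve is a saturating exponential that levels off at gamma infinity. With gamma infinity around 0.9 for HP5+, no development time ever reaches 1.0. This is purely illustrative; the rate constant below is made up, not fitted to my data:

```python
import math

# Illustrative time/gradient model: gamma(t) = gamma_inf * (1 - exp(-k*t)).
# gamma_inf ~= 0.9 is the reported maximum contrast for HP5+; k is a
# hypothetical rate constant chosen only to show the shape of the curve.

gamma_inf = 0.9   # assumed maximum contrast (gamma infinity)
k = 0.08          # made-up rate constant, per minute

def gamma(t):
    return gamma_inf * (1 - math.exp(-k * t))

for t in (8, 16, 32, 64):
    print(t, round(gamma(t), 3))
```

However long you develop, the modelled gradient only approaches 0.9 asymptotically, which is exactly the gamma-infinity limit Stephen warned extrapolation can't see.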