  1. #11 PeterB (Sydney, Australia; Medium Format)
    Quote Originally Posted by Bill Burk
    There is no definition correlating density to a particular Zone,
    Thanks Bill. When I initially tested/calibrated my film as per WBM, I established a relationship between the N+/- steps, my film speed, film density and zone number (etc.).

    I actually exposed a Process test roll last night using the steps I outlined above and found that zones III to VIII were within half a zone of my target. This is moderately encouraging. I noticed that zone III was a bit high in density and zone VIII a bit low, and I calculated the gamma/CI to be 0.4 rather than the target gamma of 0.5 for those conditions.
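    (In case the arithmetic is useful to anyone, here is a rough sketch of how a gamma/CI figure can be pulled out of zone/density readings, taking gamma simply as the least-squares slope of density against log exposure. The density numbers below are made-up placeholders, not my actual readings, and the 0.30 per-zone spacing assumes exactly one stop per zone.)

        # Rough sketch only: gamma/CI taken as the least-squares slope of
        # density vs. relative log10 exposure.  Densities are placeholder
        # values, not real measurements; each zone is assumed to be one
        # stop (0.30 in log10 exposure) apart.
        zones = [3, 4, 5, 6, 7, 8]
        densities = [0.35, 0.50, 0.62, 0.74, 0.86, 0.98]   # net density above base+fog (hypothetical)
        log_e = [0.30 * z for z in zones]                  # relative log exposure

        n = len(zones)
        mx = sum(log_e) / n
        my = sum(densities) / n
        gamma = sum((x - mx) * (y - my) for x, y in zip(log_e, densities)) \
                / sum((x - mx) ** 2 for x in log_e)
        print(f"gamma = {gamma:.2f}")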

    I have a number of possible explanations for this. The most likely is that my XTOL stock (mixed with distilled H2O) is slightly less active than it should be, as it is a year old; checking whether it was still "OK" was the main reason I wanted to run this test. I will either extend my development to compensate or mix up another batch. I hate wasting half of what I mix up! (I know I need to shoot more film!)

    The other, less likely, explanation is that the formula I use to adjust the development time is not sufficiently accurate (I have never validated it). The temperature I initially calibrated my film at was 20°C; my Process roll was developed at 23.1°C. I use the formula in the attached image (Dev time adjustment with temperature.png) to determine the new dev time. I obtained the formula from this relatively comprehensive webpage on XTOL.
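    (I won't retype the formula itself, but for anyone wanting to script this kind of temperature adjustment, a simple exponential model is sketched below. The roughly 9% per degree factor and the base time are purely illustrative assumptions on my part; they are not necessarily the formula or constant the XTOL page uses.)

        # Illustrative only: a generic exponential temperature-compensation
        # model, NOT necessarily the formula from the XTOL webpage.
        # Assumption: development time shortens by roughly 9% per degree C
        # above the calibration temperature (k = 0.09 is a placeholder).
        import math

        def adjusted_dev_time(base_time_min, base_temp_c, actual_temp_c, k=0.09):
            """Scale a development time calibrated at base_temp_c to actual_temp_c."""
            return base_time_min * math.exp(-k * (actual_temp_c - base_temp_c))

        # Hypothetical example: 8.5 min at 20 C, tank actually at 23.1 C
        print(f"{adjusted_dev_time(8.5, 20.0, 23.1):.2f} minutes")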

    regards
    Peter

  2. #12 PeterB
    I have redone this test with modifications and come up with even more confusing results: the exact opposite of what I was expecting!

    To summarise my initial problem: I was getting a lower gamma (0.42 compared to the target 0.5) and I thought this was due to the target I photographed introducing too much flare (in spite of my intending that it wouldn't; I suspect I wasn't thinking straight). The initial target was a matt cream-coloured wall, but I only framed a small portion, roughly 10%, of the centre of the wall, and in hindsight I think the rest of the wall might have contributed to flare both in and out of the frame, thus reducing my contrast. So, to rule that cause in or out, I repeated this test and found something even more strange which I need help understanding.
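    (To make the flare argument concrete, here is a minimal sketch of why a constant veiling exposure drags the apparent gamma down in an in-camera test. The 2% flare level and the 0.5 "true" gamma are arbitrary assumptions for illustration, not measurements.)

        # Flare adds a roughly constant veiling exposure to every frame,
        # lifting the shadow frames proportionally more than the highlights,
        # so the film sees a smaller log-E range than the aperture steps imply
        # and the measured slope comes out lower than the true gamma.
        import math

        zones = [3, 4, 5, 6, 7]
        nominal = [2.0 ** z for z in zones]        # relative exposure, one stop per zone
        flare = 0.02 * max(nominal)                # assumed 2% veiling exposure
        flared = [e + flare for e in nominal]

        nominal_range = math.log10(nominal[-1] / nominal[0])
        actual_range = math.log10(flared[-1] / flared[0])

        true_gamma = 0.5                           # assumed film/developer contrast
        apparent = true_gamma * actual_range / nominal_range
        print(f"apparent gamma with 2% flare: {apparent:.2f} (vs {true_gamma:.2f} without)")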


    I decided to find a target with much less opportunity for creating flare, so I draped a white towel, folded in half, in front of a dark green hedge. The towel sat in the centre and filled about 30% of the frame. I exposed the towel to sit on Zones III through VII. The hedge and the area out of the frame in front of the camera were a few stops darker than the towel. It was in shade and I couldn't see the sun.

    So with another roll of film I separately shot two targets: the first five frames had the new towel target and another five frames had the original cream wall target. I fully expected the gamma calculated from the towel shots to be higher than that from the wall shots. It was in fact the exact opposite: towel gamma = 0.46, wall gamma = 0.55. The towel gamma should be equal to, and probably higher than, the wall gamma. (The target gamma in both cases was 0.6.)

    So what can possibly explain this difference? Experimental error?


    Here are some misc notes in case you have read this far and might be thinking about an explanation:
    - It turned out by chance that the exposure reading of each target was within 1/4 of a stop of the other (the cream wall was in a slightly higher light level). Accordingly I used the same shutter speed for both targets and simply varied the aperture from f/5.6 to f/22 in steps of 1 stop (the fit is sketched after these notes).
    - I also took the opportunity to expose and develop for N+1 rather than N (as per the first check roll I did). This at least proved my gamma increased by about the expected 0.1.
    - The 5 data points from each target all lay quite well on the straight-line portion of my HD curve. This rules out any light level or shutter speed variations.
    - I double-checked my densitometer calibration before and after measuring the densities.
    - Because both targets were on the same roll of film, this rules out the possibility of dev strength/temp/timing errors.
    - I also used a second spot meter this time to double-check the exposure readings.
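    (For anyone who wants to check the bookkeeping, the sketch below fits the slope for both targets from five frames one stop apart, which is essentially what I did. The density lists are made-up placeholders standing in for my densitometer readings, chosen only to show how the two gammas fall out of the fit.)

        # Same shutter speed for every frame; apertures f/5.6 -> f/22 in whole
        # stops, so each frame sits 0.30 lower in relative log10 exposure.
        # Density values are hypothetical stand-ins for densitometer readings.
        import numpy as np

        apertures = [5.6, 8, 11, 16, 22]
        log_e = np.array([-0.30 * i for i in range(len(apertures))])   # f/5.6 frame = 0

        towel_d = np.array([1.05, 0.92, 0.78, 0.64, 0.51])   # hypothetical
        wall_d  = np.array([1.10, 0.94, 0.77, 0.61, 0.44])   # hypothetical

        for name, d in (("towel", towel_d), ("wall", wall_d)):
            slope, _ = np.polyfit(log_e, d, 1)
            print(f"{name} gamma = {slope:.2f}")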

  3. #13 Bill Burk (4x5 Format)
    First thing that comes to mind is... Color separation negatives shot through the blue filter have to be developed longer than the other separations, because blue light results in lower gamma. (That's an explanation I've read; others may corroborate or dispute it.) Maybe the light on the white towel had more blue in it than the light on the cream wall.

    Second thing that comes to mind is... 15% isn't a great difference; I occasionally have that much difference between test results that are supposed to be the same. (Of course, something always turns up to be at fault, like trying to re-use a tray of developer.)

  4. #14 PeterB
    Quote Originally Posted by Bill Burk
    First thing that comes to mind is... Color separation negatives shot through the blue filter have to be developed longer than the other separations, because blue light results in lower gamma. (That's an explanation I've read; others may corroborate or dispute it.) Maybe the light on the white towel had more blue in it than the light on the cream wall.
    Thanks for your thoughts, Bill. I am not using any colour separation filters. My densitometer is reading the greyscale/"visual" density of a B&W negative.

    Quote Originally Posted by Bill Burk
    Second thing that comes to mind is... 15% isn't a great difference; I occasionally have that much difference between test results that are supposed to be the same. (Of course, something always turns up to be at fault, like trying to re-use a tray of developer.)
    Possible, but something must have caused this deviation and I can't figure out what. My first light meter is a Sekonic L328; my second was the spot meter in a Nikon D90.

  5. #15 Bill Burk
    Quote Originally Posted by PeterB
    Thanks for your thoughts, Bill. I am not using any colour separation filters. My densitometer is reading the greyscale/"visual" density of a B&W negative.
    Not that you used a blue filter, but that the distribution of spectral energy on the towel might be predominantly blue, because you are in shadow, taking light mostly from the blue sky. I only mention blue-filter exposure to support my idea.

  6. #16 PeterB
    Quote Originally Posted by Bill Burk
    Not that you used a blue filter, but that the distribution of spectral energy on the towel might be predominantly blue, because you are in shadow, taking light mostly from the blue sky. I only mention blue-filter exposure to support my idea.
    Wow Bill. Now I understand what you are saying and it is a very plausible explanation. Thanks for your insight.

    This raises my next question: why aren't (some?) other people concerned with compensating for the colour temperature of either the incident light or the colour of the reflected light when exposing their B&W films? In my accidental discovery I was able to show the equivalent of one non-trivial N step in contrast (for want of a better term) between two different colour combinations, without even trying to find the largest possible deviation/worst case.

  7. #17 Bill Burk
    There's meter spectral response, human eye spectral response, film spectral response, the spectral distribution of the light, and the "color" of the test target. Lots of variables.

    Many people inadvertently obtain a "Tungsten" speed rating from their tests without realizing it.

    Others put an 80B filter over the light or lens to (at least partially) simulate daylight.

    More important, in my mind, is to stick with a test plan whose limitations you understand, and to strive for consistency. Yes, it's significant in your case, but if your test-to-test results stay within 2/3 stop (or 2/3 of an N step), then you can use your test to control your processes.

  8. #18 Stephen Benskin (Los Angeles, 4x5 Format)
    There's a reason why there are guidelines for testing. Why try to reinvent the wheel? You're making assumptions that may or may not be correct. Bad testing can be worse than no testing at all, and other similar clichés. I recently ran across a quote by Phil Davis on in-camera testing:

    "Traditionalists defend this testing method — some vehemently — on the grounds that involving the camera in the test simulates the conditions of practical use and is, therefore, not only convenient but desirable. Similarly, they are apt to argue emphatically that, after all, the purpose of this whole thing is to produce prints, so appraising print values must therefore be the most appropriate way to judge the materials' performance.

    In fact, that's a technical non sequitur. These traditional testing procedures can't supply material-specific information any more than driving your car around the block can inform you about the comparative quality of your motor oil. You can obviously tell whether the car runs satisfactorily or not, but you can't know for sure what part the oil has played in that performance. There are simply too many unrecognized or uncontrolled variables in the procedure; there is no accurate way to quantify the results of such subjective tests, and you have no logical basis for assuming that the conclusions drawn are valid."

  9. #19 (Montreal, Canada; Multi Format)
    I'm not defending the OP's particular test method, but I think Phil Davis overstates the case slightly. After all, he had to sell books too.

  10. #20 PeterB
    Thanks Stephen.

    I hold a very different position from the one you suggest. I am a professional engineer and have worked in the medical device industry for 20 years now. I have had plenty of experience with (among other things) proving designs are fit for their intended purpose - a requirement of regulatory bodies worldwide.

    Attempting to test a product or a process by focusing only on the individual subsystems would never cut the mustard. Along with low-level testing (known as Verification testing) to prove that subsystems meet their functional/engineering requirements, it is imperative also to perform testing at the system level - this is known as Validation testing. Validation testing is performed without needing to know any internal implementation details; it is akin to black-box testing. It proves the product meets the needs of a variety of stakeholders, such as the customer and regulatory bodies (e.g. the FDA).

    As a photographer, if all I did was individually test the subsections of my photographic process, I would only ever be performing verification testing, never validation testing. Both are necessary and valuable.

    regards
    Peter
