Quote Originally Posted by Photo Engineer
I usually shoot all negative films about 1/3 stop slower than box speed so as to get better overall latitude. So, a 400 speed to me is a 320.
Latitude on the low end, to me, suggests you are giving yourself an additional safety factor because of how many variables there are in metering, etc.
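
For what it's worth, the arithmetic behind "a 400 is a 320" is just a 1/3-stop shift of the exposure index: EI = box speed x 2^(-1/3). A quick sketch in Python (the helper names and the third-stop list are mine, purely illustrative):

[code]
import math

def ei_for_stop_offset(box_speed, stops):
    """Exposure index after shifting from box speed by 'stops'.
    Negative stops = more exposure = lower EI."""
    return box_speed * 2 ** stops

# Standard third-stop speed series for rounding (partial list, assumed)
THIRD_STOPS = [100, 125, 160, 200, 250, 320, 400, 500, 640, 800]

def nearest_standard_ei(ei):
    # Pick the series value closest in log2 (stop) distance
    return min(THIRD_STOPS, key=lambda s: abs(math.log2(s / ei)))

raw = ei_for_stop_offset(400, -1/3)          # about 317.5
print(round(raw), nearest_standard_ei(raw))  # 317 320
[/code]

Rounding to the standard third-stop series is why ~317 becomes the familiar 320.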

This is different from why most people down-rate their B&W negative films. The common argument is that they get better shadow detail, implying better overall print quality. This is where I'd repeat that people likely don't know what kind of negatives they are actually making vs. what they think is happening. It's an issue with how people read/interpret/mis-interpret all the various methods out there in the Zone System world, etc. We tend to come away from those methodologies thinking we are "smarter" than we actually are. Like "aha, now I know how to find the real speed of my film, not the BS Kodak tells me, which must be why my prints always sucked".

So this goes back to my earlier question: how do people know D-76 doesn't produce "full rated speed"? How are they even defining "full rated speed"? It isn't just a density above B+F; there is also a required gradient over an exposure range.
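
For concreteness, here is roughly what "full rated speed" means under the ISO standard as I understand it, sketched in Python (the function and the characteristic-curve data are made up for illustration, not anything official):

[code]
import numpy as np

def iso_speed(log_h, density, base_fog, tol=0.05):
    """Sketch of the ISO 6 criterion for B&W negative film: the speed point
    Hm is the exposure giving density 0.10 above base+fog, development must
    produce a density rise of 0.80 over the next 1.30 log H units, and the
    speed is S = 0.8 / Hm (Hm in lux-seconds)."""
    # Locate log Hm where the curve crosses B+F + 0.10 (density must be increasing)
    log_hm = np.interp(base_fog + 0.10, density, log_h)
    # The required gradient: density 1.30 log H further up must be 0.80 higher
    rise = np.interp(log_hm + 1.30, log_h, density) - (base_fog + 0.10)
    if abs(rise - 0.80) > tol:
        raise ValueError(f"curve is off the ISO contrast condition (rise = {rise:.2f})")
    return 0.8 / 10 ** log_hm

# Made-up straight-line curve developed exactly to the ISO gradient (0.80/1.30)
log_h = np.linspace(-2.86, -1.0, 20)   # log10 exposure in lux-seconds
base_fog = 0.25
density = base_fog + (0.80 / 1.30) * (log_h + 2.86)
print(iso_speed(log_h, density, base_fog))   # ~400 for this synthetic curve
[/code]

The gradient check is the part people tend to skip: if your development isn't near that contrast condition, the "speed" you measure from shadow densities isn't comparable to the box ISO in the first place.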

I'm not saying it is wrong to shoot at half box speed. That's fine. I just think very few people really know why they are doing it. For a long time I certainly didn't.

I'm not saying people need to learn about any of this to make good prints either. There is enough latitude and error throughout the end-to-end process for lots of approaches to lead to excellent results. The only reason I initially brought all this up is the title of the thread: "Box ISO rate and Real ISO". Barring extreme procedures, baloney developers, and the like, if you say the real speed of your ISO 400 film is 160 (that's a full 1 1/3 stops below box speed), you should be able to back it up.

PS: Glad to see Stephen post to the thread!