• Originally Posted by AndreasT
Ok, here's the issue, because I was reading another thread and got very, very confused. When I look at the BTZS way of working, an average scene of SBR 7 is developed to an average gradient of 0.5, but in most other publications I read it's 0.57.
Why?
When pushing or pulling the values are very different as well.
What am I missing?
All the various forms of average gradient, including CI, are ways of determining the slope of the curve: rise over run. The target value depends on a combination of three major variables: the log subject luminance range, the paper LER, and flare. The statistically average subject luminance range is 2.20 logs. Grade 2 paper printed with a diffusion enlarger has an LER of around 1.05. Average flare for a statistically average scene is considered to be 0.34 for large format and 0.40 for smaller formats. Flare reduces the apparent subject luminance range:

2.20 - 0.40 = 1.80

1.05 / 1.80 = 0.58

This is what Kodak considers normal processing for an average scene.
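The arithmetic above can be put into a short snippet. This is just an illustrative sketch of the calculation from this post; the function name and parameter names are my own, not from any BTZS or Kodak publication:

```python
def average_gradient(paper_ler, log_luminance_range, flare):
    """Target average gradient = paper LER / (log subject range - flare)."""
    # Flare compresses the apparent subject luminance range at the film plane.
    effective_range = log_luminance_range - flare
    return paper_ler / effective_range

# Statistically average scene, grade 2 paper on a diffusion enlarger,
# small-format flare value from the post:
g = average_gradient(paper_ler=1.05, log_luminance_range=2.20, flare=0.40)
print(round(g, 2))  # 0.58
```

Plugging in the large-format flare figure (0.34) instead gives 1.05 / 1.86, or roughly 0.56, which shows how sensitive the target gradient is to the flare assumption alone.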

When it comes to compensating for longer or shorter luminance ranges, there are two basic ways to deal with flare: use the same flare value throughout, or use a variable flare model. Flare does decrease with shorter luminance ranges and increase with longer ones, but since flare can also vary greatly within a given luminance range, there's a question of whether the extra effort of the variable method is worth it.
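To make the fixed-versus-variable distinction concrete, here is a toy comparison. The linear flare model below is my own invention purely to illustrate how the two approaches agree at the average scene and diverge at the extremes; it is not a published flare formula:

```python
PAPER_LER = 1.05   # grade 2 paper, diffusion enlarger
AVG_FLARE = 0.40   # small-format flare for an average scene
AVG_RANGE = 2.20   # statistically average log subject luminance range

def gradient_fixed(log_range):
    # Fixed model: subtract the same flare value at every luminance range.
    return PAPER_LER / (log_range - AVG_FLARE)

def gradient_variable(log_range):
    # Toy variable model: assume flare scales in proportion to the
    # luminance range (illustrative assumption only).
    flare = AVG_FLARE * (log_range / AVG_RANGE)
    return PAPER_LER / (log_range - flare)

for log_range in (1.60, 2.20, 2.80):
    print(log_range,
          round(gradient_fixed(log_range), 2),
          round(gradient_variable(log_range), 2))
```

Both models return 0.58 for the 2.20 average scene; the differences only show up for the shorter and longer ranges, which is exactly where the chart below compares the developmental models.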

Here is a comparison of developmental models differing only in the application of flare.

[Attached image: CI Chart of Developmental Models.jpg]

Almost all methodologies will give you a workable normal. Scene luminance ranges fall along a bell curve, which means statistically average scenes occur in the majority of situations. What distinguishes an effective method is how successfully it applies to the more extreme conditions. One of the difficulties with extreme conditions is distinguishing operator error from failed methodology.