Quote Originally Posted by AndreasT
To tell the truth I am a bit lost here, reading all this.
All I can say is that I use the 0.1 speed point as my first reference. I plot my curves to see what shape they have; this is important for me. When I use my adopted method (a bit of this, a bit of that) I can usually make my print at about Grade 2 (to simplify, speaking in grades). I increase the development times slightly because I like to use unsharp masks, and the funny thing is I land at a contrast of about 0.62, which is what most published times use as a standard. However, my actual development times are always less than the published ones, and I always have to overexpose.
I am always amazed, and left wondering, when people say they use box speed. It doesn't make sense to me.
My point is that it is good practice to critically question the details and theory of the testing method people use or are considering using. I believe the testing results most people produce create a false sense of accuracy; they basically aren't getting what they think they are getting.

Would you mind breaking down your testing process? You say you use the 0.10 speed point. How do you test for this? Do you have a reference for the CI of 0.62 that you say most published sources use as a standard? Are you determining it yourself, or have you chosen it based on a source?
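Just so I know we're talking about the same quantities: below is a rough sketch (Python, with invented step-wedge readings) of how I would pull a 0.10-over-fog speed point and an average gradient off a curve numerically. The 0.8/Hm speed formula and the 1.30 log H span are my assumptions about what you mean, and a plain average gradient is not the same construction as Kodak's Contrast Index, so please correct me where your method differs.

[CODE]
import numpy as np

# Relative log exposure (log H) and measured density for one development time.
# These readings are invented purely for illustration.
log_h   = np.array([-3.0, -2.8, -2.6, -2.4, -2.2, -2.0, -1.8, -1.6, -1.4, -1.2, -1.0])
density = np.array([0.05, 0.06, 0.09, 0.15, 0.24, 0.36, 0.49, 0.62, 0.75, 0.88, 1.00])

base_plus_fog = density[0]                 # lowest reading taken as base + fog
speed_criterion = base_plus_fog + 0.10     # the "0.10 speed point" density

# Log exposure at which the curve first reaches fog + 0.10
log_h_m = np.interp(speed_criterion, density, log_h)

# Only meaningful if log H is absolute log lux-seconds, not relative steps
h_m = 10.0 ** log_h_m
effective_speed = 0.8 / h_m                # the usual 0.8/Hm speed formula

# Simple average gradient over 1.30 log H to the right of the speed point
# (not Kodak's Contrast Index construction, just a stand-in for it).
d_at_speed_point = np.interp(log_h_m, log_h, density)
d_right = np.interp(log_h_m + 1.30, log_h, density)
avg_gradient = (d_right - d_at_speed_point) / 1.30

print(f"speed point at log H = {log_h_m:.2f}, effective speed ~ {effective_speed:.0f}")
print(f"average gradient over 1.30 log H = {avg_gradient:.2f}")
[/CODE]

With those made-up numbers it lands at a gradient of about 0.61, which is why I'm curious whether your 0.62 is measured the same way.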

Why are you amazed people use box speed?