Quote Originally Posted by RalphLambrecht
Regardless of what the average scene reflects or what light meters are calibrated to, it all has to do with how our eyes compare different brightness levels. The human response to reflection (lightness) is not linear. For example, a surface reflecting 18% of the light that falls onto it is perceived as being only half as bright (50%) as the illumination itself. The response follows this equation:

L = 116 * R^(1/3) - 16

where L is the lightness on a 0–100 scale and R is the reflectance expressed as a fraction (0–1).

For example, with R = 0.18 (18%), L works out to about 50.
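
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the formula above, assuming R is entered as a fraction (0.18 for an 18% grey card) and L comes out on the usual 0–100 lightness scale:

```python
# Lightness formula from the quoted post: L = 116 * R^(1/3) - 16
# R is reflectance as a fraction (0-1); L is lightness on a 0-100 scale.
def lightness(R: float) -> float:
    return 116 * R ** (1 / 3) - 16

print(lightness(0.18))  # ~49.5, i.e. roughly 50: an 18% grey reads as "half as bright"
print(lightness(1.00))  # 100, a perfectly reflecting white surface
```
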
I'm always curious about how we can say that the light is "perceived as being only half as bright". Any idea how these tests are conducted?

If I were asked to look at two stimuli, I don't think I could say, "ahh yes, that looks exactly twice as bright".

Same thing with decibels: an increase of about 10 dB is said to be perceived as "twice as loud", but how are these kinds of qualitative judgements reliably measured?