</span><table border='0' align='center' width='95%' cellpadding='3' cellspacing='1'><tr><td>QUOTE (Mark in SD @ Mar 24 2003, 02:17 PM)</td></tr><tr><td id='QUOTE'>So, given the same picture, it looks like the European meters will read about 1 stop less exposure than the Japanese meters (it was never stated directly which way the difference was, so I've inferred it from other comments).
Basically correct?</td></tr></table><span class='postcolor'>
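For reference on what that claimed difference amounts to: a 1-stop disagreement between two meters is a factor of two in light. A minimal sketch of the arithmetic using the standard exposure-value relation EV = log2(N²/t); the function name and the example readings are my own illustration, not anyone's meter specification:

```python
import math

def exposure_value(aperture_n: float, shutter_s: float) -> float:
    """EV at a fixed ISO: EV = log2(N^2 / t), N = f-number, t = shutter time in s."""
    return math.log2(aperture_n ** 2 / shutter_s)

# Same scene, two hypothetical meters at f/8 whose suggested shutter
# speeds differ by a factor of two:
ev_a = exposure_value(8, 1 / 250)  # ~13.97
ev_b = exposure_value(8, 1 / 125)  # ~12.97
print(round(ev_a - ev_b, 2))  # 1.0 -> exactly one stop apart
```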
I've been following this. I have two meters now: a Gossen Ultra Pro and the internal meter in my Olympus OM 4.
Both agree with each other - near as I can tell, dead-on.
Must be me. I seem to be eternally the "odd one". I don't subtract 10%-15% of the time in the JOBO processor, and now my Japanese and German meters agree with each other.
I have *no* idea why they should be different. I've worked with some sophisticated light-measuring equipment, cascade photomultiplier-based systems, and there was *no* bias as far as color temperature goes. There couldn't be, where light energy at discrete frequencies was being measured. I *suppose* one could deliberately introduce filters to create a bias, à la Fred Picker, but for a meter intended for *universal* use, why would you?
Is everyone SURE of that 3200K vs. 5500K calibration difference? Or are we looking at a calibration problem?
--- Uh ... tell me we're NOT trying to verify sophisticated light meters by the "Sunny 16 Rule" ...
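For what it's worth, the Sunny 16 rule alluded to above is easy to state in code: in full sun, correct exposure is roughly f/16 at a shutter speed of 1/ISO. A rough sketch with illustrative names of my own (it's a rule of thumb, not a calibration procedure):

```python
import math

def sunny_16_shutter(iso: int) -> float:
    """Sunny 16 rule: in full sun at f/16, shutter time is about 1/ISO seconds."""
    return 1.0 / iso

def stops_off(metered_shutter: float, iso: int) -> float:
    """How many stops a metered f/16 shutter time differs from Sunny 16
    (positive = meter calls for more exposure than the rule)."""
    return math.log2(metered_shutter / sunny_16_shutter(iso))

# ISO 100 film: the rule says f/16 at 1/100 s in full sun.
# A meter suggesting 1/50 s at f/16 reads one stop "hotter" than the rule:
print(stops_off(1 / 50, 100))  # 1.0
```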