Yeah, I've made a zillion screw-ups trying to calibrate mine last night.

In the short time that I've used it, I've enjoyed the Analyser Pro a lot. It's really great to be able to get a usable print in one try without making any test strips.
Even when you use it straight out of the box, without calibrating, the thing is still pretty darned accurate. I'd say 90% or better, depending on conditions.

Quote Originally Posted by mr rusty View Post
... The biggest thing I have "learned" is that the "highlight" reading must be on a not-quite-white highlight...
I find the main trick with the Analyser Pro is picking the right spots to sample. You don't sample the absolute brightest highlight or the deepest shadow. You sample the brightest highlight and the darkest shadow where you want to see detail. Some shadows or highlights will fall beyond the range where they hold detail, and depending on how you want the print to look, those sample locations might differ. Once you have those samples, the gray scale display helps you place tones where you want them, but unless you pick the right tones to sample in the first place, you won't get the best results.

This isn't hard to do. It just takes a couple of practice runs to understand what information the machine is asking you for. Once you do it a couple of times it practically becomes second nature.

I'd say, bottom line, the Analyser Pro teaches you to better visualize your print before you start.

Quote Originally Posted by MattPC View Post
... The calibration process is a pain. I made several stupid mistakes on each paper/developer combo (forgot to change the filter, and similar stuff) but once you get through that it really is great...
I didn't find the calibration procedure hard, per se. It just takes time, and there are a lot of little details to trip you up. I don't know how many times I forgot to change the filter.

From my short experience with the Analyser Pro, I'd suggest using it straight out of the box for a day or two without worrying about calibration unless you get really wonky results. Then, when you see how the unit works and are more used to operating it, go ahead and calibrate. You don't need to get all wrapped up in calibration right off the bat.

Quote Originally Posted by mr rusty View Post
... Holding the print button down for >1 sec gets "Diff" on the display, and the unit then prints the difference time to burn in.
I made a stupid mix-up in my top post. I said to press "focus"+"expose" to get to the "diff" display in order to make a burn-in. That isn't right.
To make a burn-in, you just hold the expose button down until the display says, "diff."
(Sorry, I must have dyxlesia. I always seem to get my mords wixed.)

So, when you make a burn-in, I'm guessing whatever time is on the display before you press and hold "print" gets stored in temporary memory to be used for the difference calculation. As long as you don't quickly press "print" or press "X" (clear), that number stays in temporary memory, and whatever time is on the display when you press and hold "print" again is used to make the "diff" calculation.
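Just to make my guess concrete, here's a little sketch of the behavior I think I'm seeing, written as code. To be clear: this is purely my mental model of the unit, not anything from the RH Designs manual, and the names ("stored_time", "press_hold_print", "press_clear") are mine, not the machine's.

```python
class AnalyserDiffGuess:
    """My guess at how the Analyser Pro's burn-in "diff" mode behaves.
    This models a hypothesis, not documented behavior."""

    def __init__(self):
        self.stored_time = None  # the "temporary memory" I'm guessing at

    def press_hold_print(self, display_time):
        """Holding "print" (>1 sec): if nothing is stored yet, store the
        current display time; if something is stored, show the difference,
        which is the burn-in time."""
        if self.stored_time is None:
            self.stored_time = display_time
            return None  # nothing to diff against yet
        return round(display_time - self.stored_time, 1)

    def press_clear(self):
        """Pressing "X" (clear) wipes the temporary memory."""
        self.stored_time = None


# Example: base print metered at 12.0 s, burn area metered at 18.0 s.
unit = AnalyserDiffGuess()
unit.press_hold_print(12.0)          # base time goes into temporary memory
burn = unit.press_hold_print(18.0)   # "diff" shows 6.0 s of extra exposure
print(burn)
```

If that's right, it also explains why pressing "X" in between loses the diff: the clear wipes the stored base time, so there's nothing left to subtract from.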

That's the way it seems to work to me. As I said, I like to understand the way things work instead of just memorizing steps from a manual.