Getting set up with Analyser Pro
I just got an Analyser Pro. I've used it to make a few prints. It's a great piece of equipment!
It seems to be about 90% on the spot right out of the box but, now that I've had some time to experiment, it's time to do the calibration. I've got a couple of related questions about getting set up and using it.
1) Getting it calibrated.
2) Using it to make burn-ins.
I understand the steps for calibrating and using it but, to best wrap my head around it, I'd like to know more about how it works. Specifically, what goes on inside its electronic brain.
Question #1 - Calibration:
When you calibrate the Analyser Pro, you are basically making 7 incremental test strips, one for each contrast grade. Then you make 7 contact prints of the step wedge, one for each contrast grade. Once you make and develop those 14 strips, you compare them to the color chip and create a list of offsets which are entered into the analyser's memory.
So, when you do all this, you have essentially created a lookup table that the microprocessor uses to calculate exposure times, based on the densitometer readings you make under the enlarger.
Am I right, so far?
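If I've got that right, the idea can be sketched in a few lines of code. To be clear, the grade labels, offset values, and 1/12-stop step size below are all my own invention for illustration, not anything from RH Designs' actual firmware:

```python
# Hypothetical sketch of what the calibration lookup table might look
# like. The real firmware is not public; the grade labels, offset
# values, and 1/12-stop step size are all assumptions.

# One correction per contrast grade, entered during calibration from
# the 7 incremental strips and the 7 step-wedge contact prints.
calibration_offsets = {"00": +3, "0": +2, "1": +1, "2": 0,
                       "3": -1, "4": -2, "5": -3}

def exposure_time(base_time_s, grade, offsets=calibration_offsets):
    """Apply the stored per-grade offset (in assumed 1/12-stop steps)
    to the time derived from the densitometer readings."""
    steps = offsets[grade]
    return base_time_s * 2 ** (steps / 12)

print(round(exposure_time(10.0, "2"), 2))  # 10.0 (zero offset)
print(round(exposure_time(10.0, "5"), 2))  # 8.41 (-3/12 of a stop)
```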
Question #2 - Burn-ins:
Assuming your calibration is all set, as above, when you make an exposure, after you press the "expose" button, that exposure time is stored in a temporary memory location. Now, when you alter your time setting and press the "focus" button, followed by the "expose" button, the microprocessor subtracts the time on the display from the time in the memory to get the "difference" time in order to make the burn.
Am I still on track?
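In code terms, the diff step I'm describing would be nothing more than a subtraction (purely my illustration of the arithmetic, not the actual firmware):

```python
# Illustrative only: the burn-in "diff" is just the new time minus the
# time stored from the previous exposure.

stored_time = 12.0  # seconds, saved when the base exposure was made
new_time = 18.0     # time now on the display after adjusting for the burn

diff_time = new_time - stored_time
print(diff_time)  # 6.0 -> the lamp runs for only the extra 6 s
```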
Now, if I understand correctly, the time changes that occur from altering the contrast setting come from the lookup table we created in Question #1 above. Right?
So, to follow that up, let's take the classic example of the landscape photo where the sky needs to be burned at a different contrast than the foreground: First, I burn in the ground at one exposure and contrast. Then, what must I do in order to burn in the sky at another contrast?
My understanding is that you make your alterations to the time and/or contrast and the microprocessor performs the calculations via the lookup table to get a new time. Then, when you press "focus"+"expose," the "diff" time is calculated from the last time stored in temporary memory.
When does that time in temporary memory get lost or altered?
Is it updated when you press "expose" again without pressing "focus" first?
Or, does it stay in memory until you press "X" to clear?
Good questions, and I hope there are enough users here to supply answers that I can use as well, as a new owner who has yet to use the analyser.
On the RH Designs site there are a number of video tutorials on the analyser which might help answer your questions. Have you had a look at them?
I got the videos. They're good.
Actually watched most of them before I ordered the unit.
I understand the manuals and the videos pretty well. I'll be starting a calibration run this evening after dinner settles.
I am the kind of person who likes to understand the principles on which something works instead of learning the steps by rote.
If you just learn the steps, it's harder to figure out a problem you were never taught to handle. If you learn the principles, you can figure out how to handle anything that comes your way.
The instruction manuals are good and they are easy enough to understand. I figured, instead of asking questions via e-mail, I'd ask in the forum.
First, everybody can share the knowledge gained instead of just one or two people.
Second, getting input from more than one source is almost always better. That's what forums are for. Right?
For burn-in, what I do is set up for the print at the base time, print, then adjust the time until the indicator measured from the bit I want to burn has moved darker to the point I want it. The time has increased. Holding the print button down for >1 sec gets "Diff" on the display, and the unit then prints the difference time to burn in.
I am no guru - still learning with this unit
From my limited experience (maybe a year with mine?) you're pretty well on track. As I understand it, your lookup table gives the unit an offset from the 'firmware' defaults. The calibration process is a pain. I made several stupid mistakes on each paper/developer combo (forgot to change the filter, and similar stuff), but once you get through that it really is great. The burn-in method is much simpler in practice than it seems in the instructions.
The only thing I have difficulty with is extreme exposure times at large (for me) print sizes on my presently preferred slow paper. There's 'only' 240 (?) seconds available (unless there's a simple setting I've overlooked; anybody like to correct me here?), so I often have to expose twice or try some other manipulation to obtain an extra stop or so. I think it's time to find a faster paper or lens.
I sometimes have trouble with developer temperature control (hot in my crappy little darkroom). Last summer I had a go with lith and snatched when my temperatures became too difficult to control. I have larger trays now so I may get more stable temps, fingers crossed.
I hope you enjoy your new tool,
As I said, I'm still learning, but it helps me get decent prints without making loads of test strips. I tend to make a print, then possibly make another if I decide it can be "improved". The biggest thing I have "learned" is that the "highlight" reading must be on a not-quite-white highlight. If you take the highlight reading from a very dense, blown-out part of the negative, the time comes out waaaaay too long. However, with a bit of practice it is a VERY useful tool, and I certainly wouldn't want to be without mine!
Having used the Analyser Pro for a number of years: for a burn-in exposure, I take my base exposure as normal, then take readings of the areas that I think may need a burn or dodge, place the LEDs on the grey scale at the tone I want, and note the given time. For dodging, I subtract the dodge reading from the base exposure and that gives me the dodge time. For a burn, having noted the reading, after the base exposure and dodges I set the burn time on the timer, press the exposure button for a second or two, and the machine subtracts the base from the burn and exposes the paper. It takes a lot more time to type all this into the computer than to actually do the exposure.
For split grade: if, say, the base is 20 seconds at grade 3, then I set the machine to grade 1, line up the grade 1 LED onto the grey scale point I want, and the time is there on the display. Note the time, set it in the timer, and the machine does the rest. Best piece of darkroom gear I ever bought; it saves me a fortune in wasted paper. 90% of the time it is right first time; 10% I need an extra sheet.
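To make that arithmetic concrete, here's a worked example. The numbers are invented purely for illustration:

```python
# Worked example of the dodge/burn arithmetic described above.
# All numbers are invented for illustration.

base = 20.0           # base exposure, seconds
dodge_reading = 16.0  # time metered for the area to be dodged
burn_reading = 28.0   # time metered for the area to be burned

dodge_time = base - dodge_reading  # hold the light back for this long
burn_time = burn_reading - base    # extra exposure after the base print

print(dodge_time, burn_time)  # 4.0 8.0
```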
Yeah, I've made a zillion screw-ups trying to calibrate mine last night.
In the short time that I've used it, I've enjoyed the Analyser Pro a lot. It's really great to be able to get a usable print in one try without making any test strips.
Even when you use it straight out of the box, without calibrating, the thing is still pretty darned accurate. I'd say 90% or better, depending on conditions.
I find the main trick in using the Analyser Pro is to pick the right spots to sample. You don't sample the brightest highlight or the darkest shadow. You sample the brightest highlight or the darkest shadow where you want to see detail. Some shadows or highlights will be beyond the range where they show detail and, depending on how you want the print to look, those sample locations might be different. Once you get those samples, the gray scale display helps you place tones where you want but, unless you pick the right tones to sample in the first place, you won't get the best results.
Originally Posted by mr rusty
This isn't hard to do. It just takes a couple of practice runs to understand what information the machine is asking you for. Once you do it a couple of times it practically becomes second nature.
I'd say, bottom line, the Analyser Pro teaches you to better visualize your print before you start.
I didn't find the calibration procedure to be hard, per se. It just takes time and there are a lot of little details to trip you up. I don't know how many times I forgot to change the filter.
Originally Posted by MattPC
From my short experience with the Analyser Pro, I'd suggest using it straight out of the box for a day or two without worrying about calibration unless you get really wonky results. Then, when you see how the unit works and are more used to operating it, go ahead and calibrate. You don't need to get all wrapped up in calibration right off the bat.
I made a stupid mix-up when I made my top post. I said to press "focus"+"expose" to get to the "diff" display in order to make a burn-in. That isn't right.
Originally Posted by mr rusty
To make a burn-in, you just hold the expose button down until the display says, "diff."
(Sorry, I must have dyxlesia. I always seem to get my mords wixed. )
So, when you make a burn-in, I'm guessing whatever time is on the display before you press and hold "print" gets stored in temporary memory to be used to make the difference calculation. As long as you don't quickly press "print" or press "X" (clear), that number stays in temporary memory and any numbers on the display when you press/hold "print" again are used to make the "diff" calculation.
That's the way it seems to work to me. As I said, I like to understand the way things work instead of just memorizing steps from a manual.
When you press the "Print" button briefly, the currently-set exposure time is stored. This stored time is subtracted from the set time when you press-and-hold "Print" to give you the burn-in time. It is stored until the next short press of "Print" at which point the stored time is updated, so you can make as many burn-in exposures as you like.
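That behaviour can be modelled as a tiny state machine. This is only my sketch of the logic described above, not the actual firmware:

```python
# Toy model of the stored-time behaviour described above: a short press
# of "Print" exposes and stores the set time; a long press exposes only
# the difference between the set time and the stored time, and the
# stored time is kept until the next short press updates it.

class TimerModel:
    def __init__(self):
        self.stored = None

    def short_press(self, set_time):
        """Expose for set_time and update the stored time."""
        self.stored = set_time
        return set_time

    def long_press(self, set_time):
        """Expose only the difference ("diff"); stored time unchanged."""
        return set_time - self.stored

t = TimerModel()
print(t.short_press(12.0))  # base exposure: 12.0, now stored
print(t.long_press(18.0))   # first burn: 6.0
print(t.long_press(21.0))   # second burn: 9.0 (still relative to 12.0)
t.short_press(18.0)         # a new base print updates the stored time
print(t.long_press(24.0))   # 6.0
```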
Calibration of exposure does indeed introduce an offset or correction factor to the basic settings, just as you might calibrate a camera exposure meter to get the results you want. Contrast calibration is done in ISO(R) units; if your paper data sheet specifies contrast in ISO(R) units you can enter those figures into the Analyser unchanged as a starting point, though I'd still recommend doing a full calibration for best results.
Here's another question:
After you've completed your first round of calibration, when you make the finer calibration with smaller step sizes, are the results relative to the settings that are already in the calibration table or are they absolute?
I assume they are relative. Correct?
For example, if you have already calibrated for Grade 5 at 1/4-step intervals and your test strip showed you needed a "+1" correction, you would enter "+3" into that grade of the paper channel you are working on. (When working in 1/4-steps, multiply the test-strip result by 3 to convert it to the table's 1/12-step units.)
Now, if you made a second test strip at 1/12-step intervals which showed a "-2" you would click the "down" button two times from the value that you already entered. Right?
If you entered an absolute value you would end up setting Grade 5 for that channel at "-2" when it should have been "+1."
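To put numbers on it, assuming the table really is stored in 1/12-step units and corrections are entered relative to the current value (my assumption based on the thread, not confirmed behaviour):

```python
# Illustration of entering relative calibration corrections, assuming
# the table is stored in 1/12-step units.

table_value = 0  # current entry for Grade 5 in this channel

# First pass: test strip at 1/4-step intervals shows "+1",
# which is +3 in 1/12-step units.
table_value += 1 * 3
print(table_value)  # 3

# Second pass: finer test strip at 1/12-step intervals shows "-2",
# applied relative to the value already entered.
table_value += -2
print(table_value)  # 1 (not the absolute "-2" reading)
```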