


Does the program you wrote for R let you solve the resulting spline equation easily for various x values, like x = 0.1, and give the resulting y value? This has been the stumbling block with the software I had been using, hence my conversion to point-to-point interpolation.

Originally Posted by Rafal Lukawiecki
Indeed, Rudi—the upsweep towards less-than-none exposure looked humorous. Bear in mind, however, that if you increase the number of degrees of freedom on the Bezier spline to 7 or more, the toe looks just fine, and the calculation still gets the CI—what we miss then is the smoothing that neatly takes care of the statistical measurement error. Having said that, I have yet to play with other types of splines, including log-influenced ones, which might have that "perfect" look.
Splines are commonly used to describe arbitrarily curved shapes, because they can be made to fit almost any point set with a smooth-looking curve. In the same fashion you can make them LMS-fit a point cloud, and that's what you did with your LOESS approach, or to say it more accurately: that's what the LOESS algorithm did for you. The more degrees of freedom you allow the LOESS algorithm, the closer the result will fit your point cloud, but remember that your point cloud is still noisy data!
What you really want is the following:
- Find a model that fits the characteristic curves you expect. Allow this model to use as few parameters as possible, because the more parameters you allow, the more data points you need to get meaningful and reliable numbers for each parameter. With curves of the log(1+exp(a*x)) or log(1+exp(a*x + b*x^2)) type you can describe most characteristic curves that a reasonable developer will give you. The first type describes straight slopes, while the second type also allows for upswept (b>0) and downswept (b<0) curves. Use shifting to place the toe where you need it, and use scaling to match the size of the toe area.
- Use an LMS fit to match that model to your noisy data set. Since you have very few parameters with very distinct effects, numerical stability should be fine, so no more NaN results. (A sketch of such a fit in R follows this list.)
- Extract the CI, or whatever you were looking for, from the best-fit model. Since all data points have their impact on the final result, you should get very reliable results even with very noisy data. And yes, densitometer readings can be quite noisy.
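For illustration, here is a minimal R sketch of such a fit. The simulated data, parameter names and start values are all my own invention, purely to show the mechanics:

# Simulated noisy densitometer readings around an idealised curve
set.seed(42)
logE <- seq(0, 3, by = 0.15)
dens <- 0.6 * log(1 + exp(2.5 * (logE - 0.8))) + rnorm(length(logE), sd = 0.02)
# Least-squares fit of the few-parameter model: s scales density,
# x0 shifts the toe, a sets the slope, b the up/downsweep
fit <- nls(dens ~ s * log(1 + exp(a * (logE - x0) + b * (logE - x0)^2)),
           start = list(s = 0.5, a = 2, b = 0, x0 = 1))
summary(fit)   # estimates with standard errors for each parameter

Fix b at 0 and you have the first, straight-slope model; leave it free and the fit can pick up an upswept or downswept curve.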
Trying to be the best of whatever I am, even if what I am is no good.

That's rather complicated. That's why I use a simple average gradient, as Phil Davis shows in his book 'Beyond the Zone System'. That's all you need, in my opinion.

Originally Posted by icracer
Does the program you wrote for R let you solve the resulting spline equation easily for various x values, like x = 0.1, and give the resulting y value? This has been the stumbling block with the software I had been using, hence my conversion to point-to-point interpolation.
Dale, indeed you can go both ways. If you know x (say log rel E) and want to get y (say density), you just use predict(), which takes the curve model as its first parameter, and a vector of x values for which you would like to find y—it can be just a single-valued vector, as in:
predict(my.curve, data.frame(He=1.4))
which will compute density at a logE of 1.4. Since I use this a few times, I define a function D that takes an x and the curve and returns the y (see around line 130 in the code). However, to go the other way, that is, to find the x for which y has a certain value, say to find the logE at which density is 0.1 over fb+f, you use uniroot(), which takes a function, including nonlinear ones that may result from using splines, and looks for an x at which y is 0. So if you want to find the x for y = 0.1, you just subtract 0.1 from the function's value. The call would be something like:
uniroot(function (x, curve) D(x, curve) - 0.1, c(0, 3), curve.model)
assuming D was the function mentioned earlier. Let me know if you would like me to add code for this into the script. I might extend the code, at a later point, to look for ISO triangles, and possibly the other gradient calculations.
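Putting both directions together, here is a small self-contained illustration; note that smooth.spline() is just a stand-in for the actual curve model in my script, and the data are made up:

# Made-up sensitometric data and a stand-in smoothing model
set.seed(1)
logE <- seq(0, 3, by = 0.2)
dens <- 0.6 * log(1 + exp(2.5 * (logE - 0.8))) + rnorm(length(logE), sd = 0.01)
my.curve <- smooth.spline(logE, dens)
# Forward: density at a given log exposure
D <- function(x, curve) predict(curve, x)$y
D(1.4, my.curve)
# Inverse: log exposure at which density reaches 0.1 over fb+f
fbf <- min(dens)
uniroot(function(x, curve) D(x, curve) - (fbf + 0.1), c(0, 3), my.curve)$root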

Originally Posted by RalphLambrecht
That's rather complicated. That's why I use a simple average gradient, as Phil Davis shows in his book 'Beyond the Zone System'. That's all you need, in my opinion.
I would not disagree with you, Ralph—after all, it was your great book that got me onto the path of material testing, many thanks for your very useful Excel spreadsheets.
However, I have found that I wanted to understand the various ways of parametrising the curves, and their gradients, as I was getting into it all. I suppose it is personal. Also, at this stage, I am finding that, for my work, CI matches my needs a little better than other gradients. Since there were no easily available tools for computing it, I just wrote one yesterday, so it is quite easy—for me at least—to have a CI at the push of a button.


Rafal,
I'm glad to hear that my suggestions were helpful and that you were able to implement them in R. You might want to try fitting your data to the function I suggested in the earlier thread:
D(x) = a1*ln(1+exp(a2*x + a3))
where D is density, x is exposure (on a logarithmic scale) and a1, a2, a3 are constants; ln is the natural logarithm and exp is the number e raised to the power in the parentheses.
I didn't realize it when I first suggested this function, but I later discovered that it had been introduced for sensitometry in the 1920s by Robert Luther, a professor at the University of Dresden. Over the years, it was found to work remarkably well for lots of film-developer combinations, except for the shoulder region, where it is not applicable at all. There is also a bit of theoretical basis to justify its use.
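As a quick sketch of what fitting this function in R might look like (invented data and start values, just to show the mechanics; R's log() is the natural logarithm):

# Invented data roughly following the function above
set.seed(7)
logE <- seq(0, 3, by = 0.15)
dens <- 0.55 * log(1 + exp(3 * logE - 4)) + rnorm(length(logE), sd = 0.02)
# Least-squares estimates of a1, a2, a3
fit <- nls(dens ~ a1 * log(1 + exp(a2 * logE + a3)),
           start = list(a1 = 0.5, a2 = 2, a3 = -3))
coef(fit)
predict(fit, newdata = data.frame(logE = c(0.5, 1, 2)))   # fitted densities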
David

Originally Posted by Rudeofus
Splines are commonly used to describe arbitrarily curved shapes, because they can be made to fit almost any point set with a smooth-looking curve. In the same fashion you can make them LMS-fit a point cloud, and that's what you did with your LOESS approach, or to say it more accurately: that's what the LOESS algorithm did for you. The more degrees of freedom you allow the LOESS algorithm, the closer the result will fit your point cloud, but remember that your point cloud is still noisy data!
Rudi, you make a very good point, and I have no doubt that spending more time writing code to fit a more correct curve, one that follows the log relationship which you, and David, have mentioned, would be a better way to get accuracy. However, my original goal was to calculate the CI in a more automated way than what I did previously using graph paper, having picked up the technique from Bill (thanks!). When doing it by hand, I already had to create a "spline" by rotating the French curve (Bezier) two or three times as I drew a smooth line through my points plotted off the densitometer. Then I used the CI ruler to approximately find the 3 points that met the condition. I have a feeling that the Bezier spline approximation, calculated in my code, does as good, or perhaps even a better, job than what I did by hand. Perhaps you have a set of logE/D points for which you know the CI, that we could use to further test my code? I wonder how much more (useful) precision I would gain by finding a better, more "film-like" log-based smoother.
Originally Posted by dpgoldenberg
Rafal,
I'm glad to hear that my suggestions were helpful and that you were able to implement them in R. You might want to try fitting your data to the function I suggested in the earlier thread:
D(x) = a1*ln(1+exp(a2*x + a3))
where D is density, x is exposure (on a logarithmic scale) and a1, a2, a3 are constants; ln is the natural logarithm and exp is the number e raised to the power in the parentheses.
Many thanks, David and Rudi, for your suggestions—I wonder what would be able to cope with a touch of shouldering, should it be present. I will use them the next time I have a chance to spend a day coding filmtesting utilities.

Originally Posted by RalphLambrecht
That's rather complicated. That's why I use a simple average gradient, as Phil Davis shows in his book 'Beyond the Zone System'. That's all you need, in my opinion.
The question is how much information you can extract from a limited and noisy set of data points. If advanced methods and modern computing give you results that are much more statistically sound and are therefore much more accurate and reliable, why wouldn't you want to use them? Not everybody is scared to death by math ...
Trying to be the best of whatever I am, even if what I am is no good.

Originally Posted by Rafal Lukawiecki
I wonder what would be able to cope with a touch of shouldering, should it be present. I will use them the next time I have a chance to spend a day coding filmtesting utilities.
If you look at my two model functions, the second one should cope very well with shouldering, given negative values for b.
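A one-liner in R shows it, with arbitrary parameter values:

# b < 0 makes the exponent level off, producing a shoulder
curve(log(1 + exp(2 * x - 0.35 * x^2)), from = 0, to = 3,
      xlab = "log exposure", ylab = "density")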
Trying to be the best of whatever I am, even if what I am is no good.

Originally Posted by Rafal Lukawiecki
Many thanks, David and Rudi, for your suggestions—I wonder what would be able to cope with a touch of shouldering, should it be present. I will use them the next time I have a chance to spend a day coding filmtesting utilities.
Rafal,
Here is another function that does include a shoulder:
f(x) = a1*log(1+exp(a2*x+a3))/(a4+log(1+exp(a2*x+a3)))
This is based on my original function for the toe and linear regions, but then modifies the result of that function by a rectangular hyperbolic function to form the shoulder at the larger values. I suggested this in the megathread on curve fitting. Here is a link to an image showing a fit of this function to a set of data for enlarging paper that Ralph Lambrecht provided.
http://www.apug.org/forums/attachmen...perfit.png.att
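If you want to explore the shape in R, something like this works; the parameter values here are arbitrary, chosen only to give a plausible-looking curve:

# Toe and straight-line portion, rolled off by a rectangular hyperbola
f <- function(x, a1, a2, a3, a4) {
  g <- log(1 + exp(a2 * x + a3))   # the original toe/linear function
  a1 * g / (a4 + g)                # hyperbolic modification forms the shoulder
}
curve(f(x, a1 = 4, a2 = 3, a3 = -3, a4 = 4), from = 0, to = 4,
      xlab = "log exposure", ylab = "density")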
David

