Page 3 of 4
Results 21 to 30 of 32
  1. #21
    ic-racer's Avatar
    Join Date
    Feb 2007
    Location
    Midwest USA
    Shooter
    Multi Format
    Posts
    6,097
    Does the program you wrote for R let you solve the resulting spline equation easily for various x values; like x = 0.1 and give the resulting y value? This has been the stumbling block with the software I had been using, thus my conversion to point-to-point interpolation.
Last edited by ic-racer; 09-01-2013 at 06:10 AM.

  2. #22
    Rudeofus's Avatar
    Join Date
    Aug 2009
    Shooter
    Medium Format
    Posts
    1,531
    Images
    10
    Quote Originally Posted by Rafal Lukawiecki View Post
Indeed, Rudi—the upsweep towards less-than-none exposure looked humorous. Bear in mind, however, that if you increase the number of degrees of freedom on the Bezier spline to 7 or more, the toe looks just fine, yet the calculation still gets the CI—what we miss then is the smoothing that neatly takes care of the statistical measurement error. Having said that, I have yet to play with other types of splines, including log-influenced ones, which might have that "perfect" look.
    Splines are commonly used to describe arbitrarily curved shapes, because they can be made to fit almost any point set with a smooth looking curve. In the same fashion you can make them LMS fit a point cloud, and that's what you did with your LOESS approach, or to say it more accurately: that's what the LOESS algorithm did for you. The more degrees of freedom you allow the LOESS algorithm, the closer the result will fit your point cloud, but remember that your point cloud is still noisy data!

    What you really want is the following:
1. Find a model that fits the characteristic curves you expect. Allow this model as few parameters as possible, because the more parameters you allow, the more data points you need to get meaningful and reliable numbers for each parameter. With curves of the log(1+exp(a*x)) or log(1+exp(a*x + b*x^2)) type you can describe most characteristic curves that a reasonable developer will give you. The first type describes straight slopes, while the second type also allows for upswept (b>0) and downswept (b<0) curves. Use shifting to place the toe where you need it, and use scaling to match the size of the toe area.
2. Use an LMS fit to fit that model to your noisy data set. Since you have very few parameters with very distinct effects, numerical stability should be fine, so no more NaN results.
    3. Extract CI or whatever you were looking for from the best fit model. Since all data points have their impact on the final result, you should get very reliable results even with very noisy data. And yes, densitometer readings can be quite noisy.
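The three steps above can be sketched in a few lines. This is an illustrative Python translation (the thread's actual tools are in R), with made-up film data and scipy's curve_fit standing in for the LMS fit; all parameter values are assumptions for the sketch:

```python
# Sketch of the three-step recipe above, under assumed parameter values.
# The model is the first of the two suggested: D(x) = a1 * ln(1 + exp(a2*x + a3)).
import numpy as np
from scipy.optimize import curve_fit

def model(x, a1, a2, a3):
    # log1p(exp(.)) is a numerically stable "softplus"; it gives a flat toe
    # that rolls smoothly into a straight-line region of slope a1*a2.
    return a1 * np.log1p(np.exp(a2 * x + a3))

# Steps 1+2: simulate noisy densitometer readings, then LMS-fit the 3 parameters.
rng = np.random.default_rng(1)
log_e = np.linspace(0.0, 3.0, 21)
readings = model(log_e, 0.9, 2.0, -2.5) + rng.normal(0.0, 0.02, log_e.size)
params, _ = curve_fit(model, log_e, readings, p0=(1.0, 1.0, 0.0))

# Step 3: extract an average-gradient figure from the fitted (noise-free) model.
gradient = (model(3.0, *params) - model(1.0, *params)) / 2.0
```

With only three parameters the fit converges even from a rough starting guess, which is exactly the point about NaN-free, stable results.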
    Trying to be the best of whatever I am, even if what I am is no good.

  3. #23
    RalphLambrecht's Avatar
    Join Date
    Sep 2003
    Location
    the villages .centralflorida,USA and Germany
    Shooter
    Multi Format
    Posts
    6,334
    Images
    1
That's rather complicated. That's why I use a simple average gradient, as Phil Davis shows in his book 'Beyond the Zone System'. That's all you need, in my opinion.
    Regards

    Ralph W. Lambrecht
www.darkroomagic.com
rorrlambrec@ymail.com
www.waybeyondmonochrome.com

  4. #24
    Rafal Lukawiecki's Avatar
    Join Date
    Feb 2006
    Location
    Co. Wicklow, Ireland
    Shooter
    Multi Format
    Posts
    730
    Quote Originally Posted by ic-racer View Post
    Does the program you wrote for R let you solve the resulting spline equation easily for various x values; like x = 0.1 and give the resulting y value? This has been the stumbling block with the software I had been using, thus my conversion to point-to-point interpolation.
    Dale, indeed you can do both ways. If you know x (say log rel E) and want to get y (say density), you just use predict() which takes the curve model as its first parameter, and a vector of x values for which you would like to find y—it can be just a single-valued vector, as in:

    predict(my.curve, data.frame(He=1.4))

which will compute density at logE of 1.4. Since I use this a few times, I define a function D that takes an x and the curve and returns the y (see around line 130 in the code). However, to do it the other way, that is to find the x for which y has a certain value, say to find the logE at which density is 0.1 over fb+f, you use uniroot(), which takes a function, including non-linear ones that may result from using splines, and looks for an x at which y is 0. So if you want to find x for y=0.1, you just subtract 0.1 inside the function. The call would be something like:

    uniroot(function(x) D(x, my.curve) - 0.1, c(0, 3))

    assuming D was the function mentioned earlier. Let me know if you would like me to add code for this into the script. I might extend the code, at a later point, to look for ISO triangles, and possibly the other gradient calculations.
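For anyone following along outside R, the same predict()/uniroot() pair can be mirrored with scipy. This is an illustrative sketch only, with made-up step-wedge readings; the thread's actual script is R:

```python
# Forward and inverse lookups on a fitted curve, mirroring predict() and uniroot().
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import brentq

log_e   = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])         # log relative exposure
density = np.array([0.08, 0.12, 0.35, 0.72, 1.10, 1.45, 1.70])  # hypothetical readings

curve = CubicSpline(log_e, density)  # stands in for the fitted curve model

# Forward, like predict(my.curve, data.frame(He=1.4)): density at log E = 1.4.
d_at_1_4 = float(curve(1.4))

# Inverse, like uniroot(): the log E at which density is 0.1 over fb+f.
fb_f = density[0]
speed_point = brentq(lambda x: curve(x) - (fb_f + 0.1), 0.0, 3.0)
```

brentq, like uniroot, needs a bracketing interval whose endpoints give opposite signs, which is why the search runs over the full 0 to 3 logE range.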
    Rafal Lukawiecki
    See rafal.net | Read rafal.net/articles

  5. #25
    Rafal Lukawiecki's Avatar
    Join Date
    Feb 2006
    Location
    Co. Wicklow, Ireland
    Shooter
    Multi Format
    Posts
    730
    Quote Originally Posted by RalphLambrecht View Post
That's rather complicated. That's why I use a simple average gradient, as Phil Davis shows in his book 'Beyond the Zone System'. That's all you need, in my opinion.
    I would not disagree with you, Ralph—after all, it was your great book that got me onto the path of material testing, many thanks for your very useful Excel spreadsheets.

    However, I have found that I wanted to understand the various ways of parametrising the curves, and their gradients, as I was getting into it all. I suppose it is personal. Also, at this stage, I am finding that, for my work, CI matches my needs a little better than other gradients. Since there were no easily available tools for computing it, I just wrote one yesterday, so it is quite easy—for me at least—to have a CI at a push of a button.
    Rafal Lukawiecki
    See rafal.net | Read rafal.net/articles

  6. #26

    Join Date
    Feb 2009
    Shooter
    Med. Format RF
    Posts
    46
    Rafal,
    I'm glad to hear that my suggestions were helpful and that you were able to implement them in R. You might want to try fitting your data to the function I suggested in the earlier thread:

    D(x) = a1*ln(1+exp(a2*x + a3))
    where D is density, x is exposure (on a logarithmic scale) and a1,a2, a3 are constants. ln is the natural logarithm and exp is the number e raised to the power in the parentheses.

I didn't realize it when I first suggested this function, but I later discovered that it had been introduced for sensitometry in the 1920s by Robert Luther, a professor at the University of Dresden. Over the years, it was found to work remarkably well for many film-developer combinations, except for the shoulder region, where it is not applicable at all. There is also a bit of theoretical basis to justify its use.

    David

  7. #27
    Rafal Lukawiecki's Avatar
    Join Date
    Feb 2006
    Location
    Co. Wicklow, Ireland
    Shooter
    Multi Format
    Posts
    730
    Quote Originally Posted by Rudeofus View Post
    Splines are commonly used to describe arbitrarily curved shapes, because they can be made to fit almost any point set with a smooth looking curve. In the same fashion you can make them LMS fit a point cloud, and that's what you did with your LOESS approach, or to say it more accurately: that's what the LOESS algorithm did for you. The more degrees of freedom you allow the LOESS algorithm, the closer the result will fit your point cloud, but remember that your point cloud is still noisy data!
Rudi, you make a very good point, and I have no doubt that spending more time writing code to fit a more correct curve, following the log relationship which you, and David, have mentioned, would be a better way to gain accuracy. However, my original goal was to calculate the CI in a more automated way than what I did previously using graph paper, having picked up the technique from Bill (thanks!). When doing it by hand, I already had to create a "spline" by rotating the French curve (Bezier) two or three times, as I was drawing a smooth line through my points plotted off the densitometer. Then I was using the CI-ruler to find, approximately, the 3 points that met the condition. I have a feeling that the Bezier spline approximation, calculated in my code, does as good a job as, or perhaps even a better one than, what I did by hand. Perhaps you have a set of logE/D points for which you know the CI, that we could use to further test my code? I wonder how much more (useful) precision I would gain by finding a better, more "film-like" log-based smoother.

    Quote Originally Posted by dpgoldenberg View Post
    Rafal,
    I'm glad to hear that my suggestions were helpful and that you were able to implement them in R. You might want to try fitting your data to the function I suggested in the earlier thread:

    D(x) = a1*ln(1+exp(a2*x + a3))
    where D is density, x is exposure (on a logarithmic scale) and a1,a2, a3 are constants. ln is the natural logarithm and exp is the number e raised to the power in the parentheses.
    Many thanks, David and Rudi, for your suggestions—I wonder what would be able to cope with a touch of shouldering, should it be present. I will use them the next time I have a chance to spend a day coding film-testing utilities.
    Rafal Lukawiecki
    See rafal.net | Read rafal.net/articles

  8. #28
    Rudeofus's Avatar
    Join Date
    Aug 2009
    Shooter
    Medium Format
    Posts
    1,531
    Images
    10
    Quote Originally Posted by RalphLambrecht View Post
That's rather complicated. That's why I use a simple average gradient, as Phil Davis shows in his book 'Beyond the Zone System'. That's all you need, in my opinion.
    The question is how much information you can extract from a limited and noisy set of data points. If advanced methods and modern computing give you results that are much more statistically sound and are therefore much more accurate and reliable, why wouldn't you want to use them? Not everybody is scared to death by math ...
    Trying to be the best of whatever I am, even if what I am is no good.

  9. #29
    Rudeofus's Avatar
    Join Date
    Aug 2009
    Shooter
    Medium Format
    Posts
    1,531
    Images
    10
    Quote Originally Posted by Rafal Lukawiecki View Post
    I wonder what would be able to cope with a touch of shouldering, should it be present. I will use them the next time I have a chance to spend a day coding film-testing utilities.
    If you look at my two model functions, the second one should be able to cope very well with shouldering with negative values for b.
    Trying to be the best of whatever I am, even if what I am is no good.

  10. #30

    Join Date
    Feb 2009
    Shooter
    Med. Format RF
    Posts
    46
    Quote Originally Posted by Rafal Lukawiecki View Post
    Many thanks, David and Rudi, for your suggestions—I wonder what would be able to cope with a touch of shouldering, should it be present. I will use them the next time I have a chance to spend a day coding film-testing utilities.
    Rafal,
    Here is another function that does include a shoulder:

    f(x) = a1*log(1+exp(a2*x+a3))/(a4+log(1+exp(a2*x+a3)))

    This is based on my original function for the toe and linear regions, but then modifies the result of that function by a rectangular hyperbolic function to form the shoulder at the larger values. I suggested this in the mega-thread on curve fitting. Here is a link to an image showing a fit of this function to a set of data for enlarging paper that Ralph Lambrecht provided.

    http://www.apug.org/forums/attachmen...perfit.png.att
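As a quick sanity check of the shouldered function (in Python rather than R, with illustrative parameter values that are not fitted to any real paper data), the hyperbolic factor does make the curve level off toward an a1 asymptote:

```python
# David's shouldered function with made-up parameters; checks the shoulder behaviour.
import numpy as np

def shouldered(x, a1, a2, a3, a4):
    core = np.log1p(np.exp(a2 * x + a3))   # toe + straight-line part (Luther-type)
    return a1 * core / (a4 + core)         # hyperbolic roll-off forms the shoulder

x = np.linspace(0.0, 4.0, 401)
d = shouldered(x, 2.2, 3.0, -3.0, 1.5)

# The gradient falls off toward the top end: the curve shoulders instead of
# continuing on the straight line, and density stays below the a1 = 2.2 asymptote.
mid_slope = (shouldered(1.2, 2.2, 3.0, -3.0, 1.5) - shouldered(0.8, 2.2, 3.0, -3.0, 1.5)) / 0.4
top_slope = (shouldered(4.0, 2.2, 3.0, -3.0, 1.5) - shouldered(3.6, 2.2, 3.0, -3.0, 1.5)) / 0.4
```

The curve stays monotonic throughout, since the hyperbolic factor only compresses the top end rather than turning the curve back down.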

    David
