Quote Originally Posted by Loose Gravel
Wow, what a thread.

When I'm out making a photograph shortly after sunset and the light is decaying, I often think that there must be a moment at which, if one started an exposure, it would take all the light left for the rest of the day to complete it. That is, because the light is decaying while I am exposing the film, my exposure needs to increase, and by the time I've reached my predicted exposure, the light has faded further. Do you all think you can add this effect to your graphs?
I don't think this could be easily modelled, as there are too many variables, including things you couldn't know, like sky conditions over the horizon after sunset. Heavy clouds over the horizon could kill the light faster than expected.

Also, the sun sets at a different angle depending on latitude and time of year, so the rate at which it drops below the horizon varies, which changes the length of twilight with season and location.

There are good working models of twilight, so you could do something with that, dependent on latitude and time of year, but I think combining that with unknown sky conditions over the horizon would be too complex and would involve variables that the photographer couldn't observe in the field.
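
Just to illustrate the shape of the problem, here's a minimal sketch that assumes sky illuminance decays roughly exponentially after sunset, E(t) = E0 * exp(-t / TAU). The constants E0, TAU, and the target exposure are illustrative numbers, not measurements, and real skies won't follow the curve exactly. Under that assumption the light remaining after any instant is finite, so there really is a cut-off time past which a metered exposure can never accumulate enough light.

[code]
import math

# Minimal sketch: assume sky illuminance after sunset decays exponentially,
# E(t) = E0 * exp(-t / TAU).  E0 and TAU are illustrative, not measured values.
E0 = 400.0    # lux at sunset (illustrative)
TAU = 300.0   # decay time constant in seconds (~5 min; varies with latitude, season, sky)

def illuminance(t):
    """Sky illuminance t seconds after sunset under the exponential model."""
    return E0 * math.exp(-t / TAU)

def metered_time(t, target_H):
    """Shutter time a meter would suggest at time t (reciprocity failure ignored)."""
    return target_H / illuminance(t)

def light_remaining(t):
    """Integral of E from t to 'end of light': E0 * TAU * exp(-t / TAU)."""
    return TAU * illuminance(t)

def exposure_can_finish(t_start, target_H):
    """True if an exposure started at t_start can ever accumulate target_H."""
    return light_remaining(t_start) >= target_H

# Example scene: needs the equivalent of a 60 s exposure at sunset light levels.
target_H = E0 * 60.0   # lux-seconds

for minutes in range(0, 31, 5):
    t = minutes * 60.0
    print(f"{minutes:2d} min after sunset: meter says {metered_time(t, target_H):8.0f} s, "
          f"can finish: {exposure_can_finish(t, target_H)}")
[/code]

With a pure exponential decay, the light remaining from time t onward works out to E(t) * TAU, so the cut-off arrives once the metered exposure time exceeds the decay constant. A real evening sky won't behave that neatly, which is exactly the point about conditions over the horizon that you can't observe from where you're standing.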

Lee