Does exposed film deteriorate faster than unexposed film?
So, I found two rolls of Elitechrome in a random box of my photo stuff the other day. Both had been sitting around at room temperature (~75°F) for a year or two and are a few years past their expiration date. One had been exposed and the other had not.
For kicks, I shot the unexposed roll and sent both out for processing. When I got them back, the roll that had sat exposed for a year was heavily magenta-fogged, while the other roll was (more or less) perfect.
It is conceivable (but unlikely) that these rolls were stored under different conditions before I received them -- a friend gave them to me around the time they expired. Since then, they've been stored together, both in their canisters.
Is this a real thing or a fluke? If it's real, what causes it?
Silver chloride, which breaks down into silver and chlorine on exposure to light, recombines back into silver chloride relatively rapidly when left in the dark; this is the basis of silver chloride photochromic (transition) lenses.
I suppose silver bromide may do the same at a lower rate? I'm not sure about that, though, as it doesn't sound quite right. If it does, and the reversal is slower than the rate at which fog accumulates, the net effect would be fogging plus a slow erasure of the latent image over time.
A couple of years ago, I was given three rolls of Kodachrome dated 1986. I used two rolls and noticed that the third roll was wound into the cartridge - probably exposed in the 1980s or early 1990s.
I sent all three off for processing. My two rolls came back fine, but the third roll had very faint images. I suspect they were exposed properly at the time and that the latent image had faded somewhat.