I'm guessing that on a theoretical basis there ARE changes in the underlying fog, but for practical purposes, it doesn't matter; the only important thing is the resulting total density.
Let me give a hypothetical example. Say that 1) we have a system where development byproducts restrain further development, and 2) the unexposed areas have a slight tendency toward unwanted development, raising the base+fog level. My guess is that areas with greater exposure will release enough byproducts to restrain the "natural fog" in their immediate vicinity. So in this case, the "fog level" does not stay constant across the film.
The real question, to me, is: does this change anything about how I see or measure the density? I think the answer is no. Anything that I can measure, or that the printing paper can "see," already includes every effect and interaction, and I don't have any simple way to distinguish between them. Since neither I, nor my densitometer, nor my printing paper can tell these effects apart, I just treat them all as effects of exposure. For my purposes, they effectively are. (Someone studying the mechanics of exposure and development would probably want to treat things differently, though.)
ps: I think the real reason one can subtract off the base+fog level is that it essentially functions the same way as a neutral density filter sandwiched with the film: since density is logarithmic, a uniform fog just adds a constant to every reading.
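To make the ND-filter analogy concrete, here's a little sketch of the arithmetic (the 0.25 base+fog value and the image densities are made-up numbers, just for illustration). Light passing through two sandwiched layers has its transmittances multiplied, and because density is -log10 of transmittance, the densities simply add. That's why subtracting a constant base+fog is legitimate:

```python
import math

def density(transmittance):
    # Optical density is the negative log10 of transmittance.
    return -math.log10(transmittance)

def transmittance(density_value):
    # Inverse: fraction of light passed for a given density.
    return 10 ** -density_value

# Hypothetical values: base+fog of 0.25 "sandwiched" with image
# densities of 0.30 and 1.00.
base_fog = 0.25
for image_d in (0.30, 1.00):
    # Transmittances multiply through the sandwich...
    t_total = transmittance(base_fog) * transmittance(image_d)
    # ...so the measured density is just base_fog + image_d,
    d_total = density(t_total)
    # and subtracting the constant base_fog recovers the image density.
    print(round(d_total - base_fog, 6))
```

The densitometer only ever sees d_total, of course, which is the point of the post: whatever mix of fog, exposure, and byproduct effects produced it, the subtraction works the same either way.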
(ps: the prior two posts came up while I was still writing)