This is why we've been discussing Kriss and references to a way of incorporating edge effects into a more unified model of image definition. It appears in both the definitions document and in the summary of Henry's measurements I posted. I've repeated this a number of times.

What you are referring to as the "old theory" is edge sharpness, i.e., the average density gradient across the transition. Edge effects are something else: "microcontrast" effects that can, under some conditions, enhance the subjective impression of sharpness.
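To make the distinction concrete, here is a small sketch (my own illustration with hypothetical numbers, not from Kriss's or Henry's data): given a microdensitometer trace across an edge, edge sharpness is the average gradient across the transition, while an edge effect shows up as density overshooting its asymptotic plateau.

```python
# Toy microdensitometer trace across a dark-to-light edge (hypothetical
# values): density samples at 10-micron spacing. The plateaus are 0.30
# and 1.20; the overshoot to 1.35 near the shoulder is an "edge effect".
trace = [0.30, 0.30, 0.32, 0.55, 0.90, 1.15, 1.35, 1.25, 1.20, 1.20]
spacing_um = 10.0

def average_gradient(samples, spacing):
    """Edge sharpness: mean density gradient across the transition,
    taken between the first and last samples of the trace."""
    rise = samples[-1] - samples[0]
    run = (len(samples) - 1) * spacing
    return rise / run

def overshoot(samples, plateau_level):
    """Edge effect: how far density climbs past its asymptotic plateau."""
    return max(samples) - plateau_level

print(average_gradient(trace, spacing_um))  # ~0.01 density units per micron
print(overshoot(trace, 1.20))               # ~0.15 density-unit overshoot
```

The point of the sketch: two developers could produce the same average gradient (identical "old theory" sharpness) while differing in overshoot, and it is the overshoot that drives the microcontrast impression.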

What I'm proposing, based on both experimental results and theory, is that the traditional concept of solvent "etching" reducing edge sharpness is at best a misrepresentation of the difference between solvent and non-solvent developers. Further, we cannot easily generalize, because different films react in different ways.

It would appear that traditional acutance is primarily driven by exposure and film characteristics. The primary effects of developer choice are on graininess and, in some cases, edge effects. A high-pH, low-sulfite developer such as Rodinal or Beutler tends to increase graininess significantly, and may give stronger edge effects than a mildly diluted solvent developer. So I would say the subjective characterization of these developers as "sharp" has much more to do with pronounced grain and edge effects than anything else, even if resolution declines. That seems to be what all the evidence shows.