A question about balancing depth of field and resolution. Bear with me; this might get confusing if I don't write it out clearly. Assume 35mm negatives to be enlarged to around 8x10".

In my photographs I want everything sharp (read: no selective depth of field). At the same time, whenever possible I prefer not to use apertures smaller than f11, as f16 seems to be where diffraction effects become readily noticeable. So there is always a balance to be struck. I've started using my brother's nifty golf rangefinder to measure the distances in a scene before focusing. So I'll measure the distances to the farthest objects, the nearest objects, and/or the most important objects.

Now, most depth of field tables, including the scales on most lenses (which are not necessarily even accurate), assume a CoC of 0.03 mm or 0.032 mm as far as I can tell. But for film that needs to be enlarged, a CoC this large is really pushing the limits of what I would consider acceptable sharpness. So I usually try to use a smaller CoC, arbitrarily something like 0.02 mm or lower if possible (I use the DoFMaster website to calculate depth of field for different CoC sizes using my distance measurements).
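For anyone who wants to check how much the CoC choice moves the limits, here is a small sketch of the standard thin-lens hyperfocal formulas (the same ones DoFMaster appears to use; I'm assuming the usual approximation, with focal length, CoC and focus distance all in millimetres):

```python
def hyperfocal(f, N, coc):
    """Hyperfocal distance in mm: H = f^2 / (N * c) + f."""
    return f * f / (N * coc) + f

def dof_limits(f, N, coc, s):
    """Near and far limits of acceptable sharpness for focus distance s (mm)."""
    H = hyperfocal(f, N, coc)
    near = s * (H - f) / (H + s - 2 * f)
    far = float('inf') if s >= H else s * (H - f) / (H - s)
    return near, far

# Example: 50mm lens at f/11 focused at 5 m, with the two CoC choices.
for coc in (0.03, 0.02):
    near, far = dof_limits(50, 11, coc, 5000)
    print(f"CoC {coc} mm: {near/1000:.2f} m to {far/1000:.2f} m")
```

Tightening the CoC from 0.03 mm to 0.02 mm shrinks the zone of "acceptable" sharpness noticeably, which is exactly the trade-off in question.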

What I'm never sure about is whether to favour a smaller CoC with less depth of field or a larger CoC with more depth of field. Which will result in the higher perceived sharpness (all other factors remaining equal)?

Here's an example. Suppose for a scene I've determined that f11 gives sufficient depth of field for a CoC of 0.025. Would it be better to stop down to f16 (ie more diffraction) and use the extra depth of field to meet a stricter CoC of 0.015?
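One way to put numbers on the diffraction side of that trade-off is the common Airy-disk rule of thumb (an approximation, assuming green light at 0.00055 mm; real lenses and film will differ):

```python
def airy_diameter(N, wavelength=0.00055):
    """Approximate diffraction blur spot diameter in mm: 2.44 * lambda * N."""
    return 2.44 * wavelength * N

for N, target_coc in [(11, 0.025), (16, 0.015)]:
    d = airy_diameter(N)
    print(f"f/{N}: target CoC {target_coc} mm, diffraction spot ~{d:.4f} mm")
```

By this estimate the diffraction spot at f16 is already larger than the 0.015 mm defocus target, so diffraction, not defocus, would become the limiting blur there, while at f11 it sits comfortably below the 0.025 mm target.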

And how much smaller does a CoC have to get from the standard 0.03 for there to be a visible improvement in resolution?

Is 0.02 significantly better than 0.03?

Is 0.01 significantly better than 0.03?

What is the lower cutoff point beyond which we can't really see an improvement (assuming a constant optimum aperture)?

Obviously the enlargement factor plays a major role...
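For what it's worth, a back-of-envelope way to derive a CoC from the enlargement factor (assuming the usual criterion that a viewer resolves about 0.2 mm on a print viewed at 25 cm, and a 24 x 36 mm frame):

```python
def coc_for_print(print_short_edge_mm, frame_short_edge_mm=24.0,
                  print_blur_mm=0.2):
    """CoC on the negative that maps to print_blur_mm on the final print."""
    enlargement = print_short_edge_mm / frame_short_edge_mm
    return print_blur_mm / enlargement

# 8x10" print: short edge ~203 mm, so roughly 8.5x enlargement from 35mm.
print(f"{coc_for_print(203):.4f} mm")
```

For an 8x10" print this lands near 0.024 mm, which suggests the standard 0.03 mm is indeed a bit loose for that enlargement, and 0.02 mm is a reasonable tightening, while going much below ~0.015 mm may be invisible at normal viewing distance.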