What is "sharpness"? What a wonderful question.
Sharpness is made up of three interrelated components, and it should probably be referred to as "perceived sharpness" because it is both objective and subjective.
One of the components of "sharpness" is resolution - the capacity to render fine detail. It can be objectively measured, and people tend to obsess over it, but relatively speaking it contributes the least to perceived sharpness.
Generally speaking, with respect to films, the finer the grain, the higher the resolution. With respect to developers, see below.
The next component of "sharpness" is contrast. By contrast I mean both the overall range from light to dark - dynamic range - and so-called micro-contrast - the contrast between adjacent small details of differing tone.
The perception of sharpness is strongly affected both by the overall dynamic range of the image and by how clearly contrasts between adjacent tones reveal themselves. For this reason, high-key images don't usually strike the viewer as very "sharp" even when they contain a tremendous amount of detail, whereas images with both deep shadows and sparkling highlights tend to look sharp even when they contain minimal detail.
High-contrast films (such as copy films) tend to give results that appear sharper, in large part because their high inherent contrast exaggerates the tonal differences between adjacent details.
Some developers enhance a film's contrast (think lithographer's materials).
The most important component of perceived sharpness is acutance - how crisply the film renders the edges of details. We perceive edges extremely well, and their rendering has the greatest effect on our impression of sharpness. Techniques such as unsharp masking work by emphasizing edges, and therefore enhance perceived sharpness.
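For anyone who likes numbers, acutance can be made a little more concrete with a toy sketch. This is not a real acutance metric - the values and function name are invented for illustration - but it shows the basic idea: the same edge rendered crisply versus softly, compared by the steepest brightness change between neighbouring samples.

```python
# Toy illustration (not a real acutance measurement): two hypothetical
# one-dimensional brightness profiles across the same edge, sampled at
# equal spacing on a 0-255 scale. All values here are made up.

def max_edge_slope(profile):
    """Largest brightness change between neighbouring samples -
    a crude proxy for how 'crisp' the edge transition looks."""
    return max(abs(b - a) for a, b in zip(profile, profile[1:]))

# A crisply rendered edge: the tone jumps across a single sample.
crisp_edge = [10, 10, 10, 250, 250, 250]

# The same edge rendered softly: the tone ramps over several samples.
soft_edge = [10, 60, 110, 170, 220, 250]

print(max_edge_slope(crisp_edge))  # 240 - steep transition, high acutance
print(max_edge_slope(soft_edge))   # 60 - gentle ramp, low acutance
```

Both profiles start and end at the same tones - the "detail" is identical - yet the first reads as much sharper, which is the whole point about acutance.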
Where edges are rendered by discrete film grains (more accurately: discrete clumps of grains), the smaller those grains are, the better they can represent a fine edge. So one would think that finer-grained film would inevitably appear sharper. There is, however, another factor that comes into play. Our ability to perceive edges is so advanced that the rendering of the individual grain clumps themselves influences us: if the edges of the grains appear sharp, the image as a whole appears sharper to us. In practice this means that a film exhibiting sharp-edged large grain will often appear sharper than a film exhibiting soft-edged small grain.
This is where the choice of developer becomes so important. Developer has a large effect on the appearance of grain. A high-acutance, normal-grain developer used with a larger-grained film (think Rodinal and Tri-X) appears sharper than a small-grained film in a mid-acutance developer (think XTOL and TMY-2), even though the resolving power of the latter combination is much higher.
So-called "fine grain" developers do reduce the size of the grain somewhat, but more importantly they reduce the appearance of the grain by softening the edges of the grain clumps. The graininess of the image is reduced at the expense of acutance - and thus of apparent sharpness.
When choosing films and developers, be careful about emphasizing "sharpness" or "graininess" over other concerns.
As an aside that is only slightly relevant to APUG, the "sharpen" controls in photo-editing software all work by artificially enhancing edges. Both analogue and digital capture disrupt the rendering of edges, because they break detail into discrete entities (film grains/dye clouds or pixels); a tool that attempts to "repair" that disruption is therefore needed, but any such tool can at best approximate reality.
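For the curious, here is a minimal sketch of the idea behind those sharpen controls - classic unsharp masking applied to a one-dimensional brightness signal. The function names and parameters are my own illustration, not any particular editor's implementation: the sharpened result is the original plus some amount of the difference between the original and a blurred copy, so only the edges (where that difference is large) get changed.

```python
# A minimal 1-D sketch of unsharp masking. Names, parameter values and
# the simple box blur are illustrative assumptions, not a real editor's code.

def box_blur(signal, radius=1):
    """Simple moving-average blur; samples near the ends just use a
    shorter window."""
    blurred = []
    for i in range(len(signal)):
        window = signal[max(0, i - radius): i + radius + 1]
        blurred.append(sum(window) / len(window))
    return blurred

def unsharp_mask(signal, amount=1.0, radius=1):
    """sharpened = original + amount * (original - blurred).
    The difference term is near zero in flat areas and large at edges,
    so flat areas pass through unchanged while edges get exaggerated."""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0, 0, 0, 100, 100, 100]  # a step edge in a flat field
result = unsharp_mask(edge)
# The flat regions stay at 0 and 100, but the samples flanking the step
# are pushed apart (below 0 and above 100) - the familiar overshoot
# "halo" that makes the edge, and hence the image, look sharper.
print(result)
```

Note that the overshoot is exactly the kind of local contrast exaggeration discussed above: the tool does not recover real detail, it just makes the edge transition steeper than it was.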