The "graininess" we observe is a matter of visual perception and has little to do with the microscopic grains themselves, which are not individually visible, nor even indirectly perceivable, by the human eye.

A pixel is perceivable as such and can be counted, since pixels are arranged in a regular grid and do not overlap. The "grains", on the other hand, do overlap, and what we see is a kind of "cloud of crystal edges", which we call grain because we perceive the graininess but not the individual crystals.

Although I appreciate the implied argument that there is more resolution in film than in a common digital sensor (which I can confirm from practising both technologies: a 135 frame scanned on a film scanner at 4000 ppi contains far more detail than my 11 MP digital camera delivers), I think that calling crystals "grains", as if each of them were perceivable by the human eye the way pixels are, is a bit forced. But I would not like to descend into a nominalistic quarrel here.
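To back that comparison with a quick calculation: a nominal 135 full frame is 36 x 24 mm, and the 4000 ppi figure is the scanner resolution mentioned above. The frame dimensions and the rest of the arithmetic are standard; only the specific scanner and camera are from my own experience.

```python
# Back-of-envelope: pixel count of a 135 (36 x 24 mm) frame scanned at 4000 ppi.

MM_PER_INCH = 25.4
frame_w_mm, frame_h_mm = 36.0, 24.0  # nominal 135 full-frame dimensions
ppi = 4000                            # scanner resolution from the post

width_px = round(frame_w_mm / MM_PER_INCH * ppi)   # ~5669 px
height_px = round(frame_h_mm / MM_PER_INCH * ppi)  # ~3780 px
megapixels = width_px * height_px / 1e6

print(width_px, height_px, round(megapixels, 1))   # roughly 21.4 MP
```

So a 4000 ppi scan yields on the order of 21 MP, about twice my 11 MP sensor, even before considering how much of that detail is real image information rather than scanned grain structure.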

Those crystals whose edges overlap form areas of higher or lower density, but one cannot "count" the crystals, or the grains, the way one counts pixels to arrive at a resolution figure, and a resolution figure is what the original question was after.


PS: I see Ray has said it better and more concisely, in fact.