Often ignored, but diffraction increases every time you stop a lens down, no matter what image scale it is set to.
So much so that for roughly every two stops you close a lens down, the maximum achievable resolution is halved (there are other limiting factors too, like motion blur or lens aberrations).
That's quite something.
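The "halved every two stops" claim falls straight out of the Rayleigh criterion: the diffraction spot scales linearly with f-number, and two full stops double the f-number. A quick sketch (my own illustration, not from the thread; assumes green light at 550 nm and the standard 1.22·λ·N approximation):

```python
# Illustration of diffraction-limited resolution vs. f-number.
# Assumes green light (550 nm); formulas are the standard Rayleigh-
# criterion approximations, not anything specific to one lens.

WAVELENGTH_MM = 0.00055  # 550 nm expressed in millimetres

def airy_disk_diameter_mm(f_number):
    """Diameter of the Airy disk (to the first dark ring)."""
    return 2.44 * WAVELENGTH_MM * f_number

def diffraction_limit_lp_mm(f_number):
    """Approximate diffraction-limited resolution in line pairs per mm."""
    return 1.0 / (1.22 * WAVELENGTH_MM * f_number)

# Each step in this list is two full stops: the f-number doubles,
# the Airy disk doubles, and the resolution limit halves.
for n in (4, 8, 16, 32):
    print(f"f/{n}: ~{diffraction_limit_lp_mm(n):.0f} lp/mm, "
          f"Airy disk ~{airy_disk_diameter_mm(n) * 1000:.1f} um")
```

Running this shows the limit dropping from several hundred lp/mm at f/4 to well under 100 lp/mm by f/16, which is why the effect matters even before other factors come into play.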
I would qualify that as starting from the aperture where the lens is designed to be sharpest. Most real-world sharpness limitations I have encountered with quality optics are set by technique, circumstances, or equipment, not diffraction. I am not, of course, denying the principle, or saying it shouldn't be considered; I just find it unlikely in this case. In reviewing the OP, I realized he is using a TLR, and I now suspect calibration of the focusing lens, or focus issues as a side effect of parallax (the taking lens sitting closer than the focusing lens), as my primary suspects.