It depends on what you put in front of it, which axis you're rotating around, and how you're sampling the signal from the photodiode.

If you have a slit and no other optics, the diode sees a whole slice of the scene summed together. If you then scan the slit across the scene, that slice changes with position. Say you take a vertical slit and scan it horizontally: the image recorded by the diode would be the same as if you'd made a pinhole image and averaged everything vertically, i.e. you get an Nx1 pixel image. There is only one diode, so there can be only one pixel per reading, multiplied by however many samples (in time) you took horizontally.
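As a minimal sketch of that idea (the scene, slit width, and brightness values here are all made up for illustration): at each horizontal slit position, the single diode reads the sum of the entire vertical slice, so the recorded result is one pixel tall and N samples wide.

```python
def slit_scan(scene):
    """Simulate a vertical slit scanned horizontally across a scene.

    scene: 2D list (rows x cols) of brightness values.
    Returns a 1 x N "image": one diode reading per slit position,
    each reading being the sum of a whole vertical slice.
    """
    n_cols = len(scene[0])
    return [sum(row[x] for row in scene) for x in range(n_cols)]

# Toy 3x4 scene with a bright vertical stripe at x = 2.
scene = [
    [0, 0, 9, 0],
    [0, 0, 9, 0],
    [0, 0, 9, 0],
]
print(slit_scan(scene))  # -> [0, 0, 27, 0]
```

The stripe survives as a single bright sample, but all vertical detail within each slice is lost, which is exactly the averaging described above.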

There are IR imaging sensors that work a little like this, though they have lenses for good focus and a single (cooled) extremely sensitive photosensor looking at one very small point in the scene at a time, which is then scanned in a circular pattern that slowly slides to one side. It's very slow and requires a static subject, but it removes all inter-pixel calibration issues; assuming you have a well-calibrated sensor, the result is of extremely high precision.
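A rough illustration of that scan geometry (the radius, sample rate, and drift values are arbitrary assumptions, not taken from any real sensor): the sensor's look-point traces circles while the circle's centre drifts sideways, so successive loops sweep out a band of the scene one sample at a time.

```python
import math

def scan_path(n_samples, radius=1.0, samples_per_loop=100, drift_per_loop=0.1):
    """Generate (x, y) look-point positions for a circular scan whose
    centre slides to one side by drift_per_loop each full revolution."""
    pts = []
    for i in range(n_samples):
        angle = 2 * math.pi * i / samples_per_loop
        cx = drift_per_loop * i / samples_per_loop  # centre drifts slowly sideways
        pts.append((cx + radius * math.cos(angle),
                    radius * math.sin(angle)))
    return pts

path = scan_path(300)  # three full loops, centre drifting 0.1 per loop
```

Since every sample comes from the same single sensor, any fixed-pattern differences between "pixels" simply don't exist; the trade-off is that covering the scene takes as many time steps as there are samples.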