("photography always lies")
But that aside, surely everyone realizes (if they think about it) that analog processes are also full of stages in which information is lost or distorted, and that the feeling that a photo is somehow an accurate representation of "what was really there" or "what you would have seen" is an illusion that skips over a whole lot of mental modelling that we do unconsciously. There's nothing wrong with that unconscious elision, but it's easy to confuse "I don't notice this class of inaccuracies" with "This class of inaccuracies is not important" (or even "...does not exist").
There is, pretty obviously, no optical system that doesn't lose *some* information, including your eye---even before anything takes place that could be described as a capture, the in-camera projected image is already "degraded" from the pool of available photons that arrived at the lens. Practically speaking, nobody really thinks the degree of loss in a reasonably modern camera is important---we accept photographs as legal evidence of fact without having courtroom arguments over the number of air-to-glass surfaces in the lens used---but at some point, people start saying "I dunno, it just doesn't *feel* *real* *enough*", and shockingly enough that point falls in a different place for different people in different contexts. I'm not sure why the first digital processing stage is such a popular place to draw that line, but it sure is one.