An interesting idea, but not even remotely possible. A camera lens and the human eye/brain visual system are quite different animals. The camera takes in the entire scene at once, with a fixed focus, a fixed aperture, and a fixed "white point".
The human eye scans the scene, varies focus as required, and varies aperture more or less continuously, again as required. Resolution varies markedly across the visual field, from amazingly sharp at the fovea to a dull blur at the periphery -- which is why the eye scans the scene continuously -- unlike a camera it can't take in the whole scene at once but instead has to build a "map" by tracing out the interesting bits.
The effect is that much of the scene isn't looked at closely with the human visual system -- just the high points (whatever the brain decides them to be and that's highly subjective from person to person). The stuff deemed of less import is often just a blur.
When it comes to color processing, again the human visual system has tricks that film and digital capture lack. For example, the human visual system has a rapidly movable white balance point. When film looks at snow in a shadow under an open sky, it sees a bluish tint. The human visual system shifts its white balance for the shadows and sees the snow as white, or perhaps a light gray.
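For the curious, that "movable white point" can be mimicked numerically with a von Kries-style per-channel scaling -- the same trick digital cameras use for white balance, just applied globally rather than per-region like the eye does. A minimal sketch (the pixel values here are made up for illustration, not measured from film):

```python
def white_balance(pixel, reference):
    """Scale R, G, B so the reference patch comes out neutral gray.

    pixel, reference: (r, g, b) tuples of linear intensities.
    """
    r_ref, g_ref, b_ref = reference
    # Per-channel gains that equalize the reference, anchored on green.
    gains = (g_ref / r_ref, 1.0, g_ref / b_ref)
    return tuple(c * k for c, k in zip(pixel, gains))

# Bluish "snow in shadow": blue channel runs hot relative to red.
shadow_snow = (0.70, 0.80, 0.95)

# Adapting to the shadow light renders the snow neutral...
print(white_balance(shadow_snow, reference=shadow_snow))

# ...but the camera, locked to a sunlit white point, leaves it blue.
print(white_balance(shadow_snow, reference=(1.0, 1.0, 1.0)))
```

The point being: the correction is trivial once you *choose* a reference, and the eye/brain chooses a new one for every patch of the scene it attends to, while a single exposure gets exactly one.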
I could go on (do we even need to discuss stereo vision?), but I think I've made my point. Cameras and human eye/brain systems see the world very differently. No photograph can look "exactly as the eye sees it".