
Why are eyes compared to cameras?

In fact, the eye and the camera work on quite different principles, and the eye's structure is far more complicated than a camera's. Still, a camera can roughly illustrate some of those principles, so let's look at the similarities first.

Both have lenses to focus the light, and photosensitive elements to capture the light, but they work in very different ways.

The camera focuses on an object by moving its lens; the eye focuses by changing the lens's shape.

Most camera lenses are corrected to be "achromatic," meaning they bring red and blue light to focus at (nearly) the same point.

Your eyes are not. When the red light from an object is in focus on the retina, its blue light is slightly out of focus.
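The reason a simple lens can't focus red and blue at the same point can be sketched numerically. The snippet below uses the thin-lens lensmaker's equation with Cauchy's approximation for how refractive index varies with wavelength; the glass coefficients and lens radii are illustrative values, not measurements of any real lens or eye.

```python
import math

# Cauchy approximation for a crown-glass lens (illustrative coefficients):
#   n(lambda) = A + B / lambda^2, with lambda in micrometres
A, B = 1.5046, 0.00420

def refractive_index(wavelength_um):
    return A + B / wavelength_um ** 2

def focal_length_mm(wavelength_um, r1_mm=50.0, r2_mm=-50.0):
    # Thin-lens lensmaker's equation: 1/f = (n - 1) * (1/R1 - 1/R2)
    n = refractive_index(wavelength_um)
    return 1.0 / ((n - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm))

f_red = focal_length_mm(0.656)    # red light (~656 nm)
f_blue = focal_length_mm(0.486)   # blue light (~486 nm)
print(f"red focuses at  {f_red:.2f} mm")
print(f"blue focuses at {f_blue:.2f} mm")
# Blue always comes to focus slightly closer than red, so both
# cannot be sharp at once; an achromatic doublet pairs two glasses
# to cancel most of this difference.
```

The few tenths of a millimetre between the two focal points is exactly the chromatic blur that an achromatic camera lens corrects and the eye does not.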

Then why don’t we always feel that parts of things are out of focus when we look at them?

To answer this question, we first need to look at how eyes and cameras capture light: photoreceptors.

The camera's sensor has only one type of photosensitive element, spread evenly across the focal plane. An array of red, green, and blue filters sits over these elements so that each one responds to long-, medium-, or short-wavelength light respectively.
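The filter array described above is typically laid out as a repeating mosaic. Here is a minimal sketch of one common arrangement, the RGGB Bayer pattern (the specific tiling is an assumption; sensors vary):

```python
# Minimal sketch of a Bayer colour-filter array (RGGB tiling assumed):
# each photosite sits behind exactly one red, green, or blue filter,
# so a single kind of sensor element can sample all three bands.

def bayer_filter(row, col):
    """Return the filter colour over the photosite at (row, col)."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Print a 4x4 patch of the mosaic.
for r in range(4):
    print(" ".join(bayer_filter(r, c) for c in range(4)))
# R G R G
# G B G B
# R G R G
# G B G B
```

Note that green is sampled twice as often as red or blue, a design choice that loosely mirrors our own greater sensitivity to medium wavelengths.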

The retina instead has several kinds of photoreceptors: three types of cones that work in normal light, and rods that take over in dim light. The rods come in only one type, which is why we see only in shades of gray in the dark.

Under normal light, we don't need color filters the way cameras do, because each cone type already responds to a different range of wavelengths.

Another difference from a camera is that your photoreceptors are not evenly distributed. The very center of the retina has no dim-light receptors (rods) at all, which is why a faint star seems to disappear when you look directly at it.

The center also has very few receptors that detect blue light, so although you have never noticed the blurry blue part of the image, you can still tell that something blue is there, because your brain fills it in from the surroundings.

And receptors for light of every wavelength grow sparser toward the edge of the retina, so both visual acuity and color discrimination drop sharply from the center of gaze outward.

There is also a region of the eye called the blind spot, where there are no photoreceptors of any kind. Yet we don't notice the gap, because our brain once again fills it in for us.

In fact, we see with our brains, not our eyes. And because the brain, the retina included, is so deeply involved in the process, our vision is easy to fool with illusions.

Here is an example of an illusion caused by the eyes themselves.

The center of the picture seems to be flashing.

That's because your eyeballs are constantly making tiny movements. Without them, your vision would eventually fade, because the neurons in your retina stop responding to a perfectly still image of constant intensity.
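This fading can be caricatured with a toy model: the response to an unchanging stimulus decays over time, and any sudden change, such as a tiny eye movement, restores it. The time constant below is a made-up value chosen only to make the effect visible, not a physiological measurement.

```python
import math

# Toy model of retinal adaptation (illustrative time constant):
# a neuron's response to a constant stimulus decays exponentially,
# and a sudden change in the image restores it.
TAU = 0.5  # adaptation time constant in seconds (assumed value)

def response(stimulus, seconds_unchanged):
    """Response fades the longer the stimulus stays identical."""
    return stimulus * math.exp(-seconds_unchanged / TAU)

steady = response(1.0, 3.0)      # image held perfectly still for 3 s
refreshed = response(1.0, 0.0)   # immediately after a micro-movement
print(f"still image: {steady:.3f}, after movement: {refreshed:.3f}")
```

After a few time constants the simulated response is nearly zero, which is the sense in which "vision stops" for a truly motionless image, while each micro-movement of the eye resets the response to full strength.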

Also unlike a camera, your vision is briefly suppressed whenever your eyes make a large movement. That's why, when you look in a mirror, you can never catch your own eyes in mid-jump from one spot to another.

Cameras can capture details that our eyes miss, magnify distant objects, and accurately record what is seen.

But our eyes are the product of hundreds of millions of years of efficient evolution alongside the brain.

So what if we don't see exactly what the world looks like?

You can still find some pleasure in an illusion that makes still leaves seem to sway in the wind.

Some of these quirks may even carry evolutionary advantages, but let's save that lesson for another day.