The camera does lie – a paradox of sorts

The greatest misconception about photography is that the camera is “all seeing”. But as we previously explored, the camera does lie. Most photographs are, in a sense, lies, because they do not fully correspond to reality. First and foremost, photographs are 2D representations of 3D scenes, so they do not capture the world as it truly is. Black-and-white photographs are monochromatic representations of a coloured reality, and “frozen” stills represent moving subjects. Yet every photograph is a true rendition of a subject or scene at one particular moment in time. This is something of a paradox – everything visible in the camera’s field of view is authentic, but it lacks the intricate qualities of the real scene. You can take a picture of a sunrise on a beach, but the photograph will be missing the things that make the scene memorable: the wind blowing (granted, video can capture this), the smell of the sea, the warmth of the first rays of sun, the feel of the sand underfoot. The camera then produces a lie, insofar as it tells only a portion of a story, or distorts it in some manner. A difference exists between a photograph and the subject or scene it depicts. A photograph is a snapshot in time, nothing more.

Conversely, the camera allows us to capture things the human eye cannot perceive. It offers different angles of view – a fisheye lens can see as much as 180°, whereas each human eye covers roughly 120°, the binocular overlap is only about 120°, and within that the central field of sharp vision spans only about 40-60°. Our peripheral vision is good only for sensing motion and large objects. Cameras are also capable of stopping motion – human eyes cannot, having no ability to slow down or “freeze” what they see. The camera’s ability to lie can therefore be beneficial, producing images that are more striking than the actual experience.

Examples include far-away scenes that the human eye is incapable of resolving, yet a telephoto lens can show quite distinctly. Another is high-speed photography of an egg dropped onto a hard surface, where each frame represents mere milliseconds, yet clearly depicts each stage of the impact with a clarity the human eye cannot match. Or an image where blur and unsharpness (bokeh) have been used to great effect to isolate a particular subject (human eyes do not actively perceive the unsharp regions of our vision). In all these cases the subject is shown differently from how the eye would perceive it, and in many cases the photograph contains information that is lost to the human eye. Of course, a photograph can also hide information. A photograph of a small village in a valley may veil the fact that a large expressway lies just behind the photographer – the viewer of the photograph sees only a secluded village.

For good or bad, cameras do lie.

How many colours can humans see?

The human eye is a marvellous thing. It has three types of cone cells, each of which can distinguish roughly 100 different shades. That puts the number of distinguishable colours at around 100³ = 1,000,000, although colour perception is highly subjective. Colour-blind people (dichromats) have only two cone types and see around 100² = 10,000 colours, while tetrachromats have four cone types and may see up to 100⁴ = 100 million colours. There is at least one documented case of a person with tetrachromatic vision.
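The arithmetic behind these estimates is simply the ~100 distinguishable shades per cone type raised to the power of the number of cone types. A minimal sketch (the 100-shades-per-cone figure is the rough estimate used above, not a precise measurement):

```python
# Rough colour-count estimate: ~100 distinguishable shades per cone type,
# so the total is 100 ** (number of cone types).
SHADES_PER_CONE = 100  # approximate figure, as used in the text above

def colour_estimate(cone_types: int) -> int:
    """Approximate number of distinguishable colours for a given cone count."""
    return SHADES_PER_CONE ** cone_types

for label, cones in [("monochromat", 1), ("dichromat", 2), ("trichromat", 3),
                     ("tetrachromat", 4), ("pentachromat", 5)]:
    print(f"{label}: ~{colour_estimate(cones):,} colours")
```

Running this reproduces the figures quoted here: 100 for a monochromat, 10,000 for a dichromat, 1 million for a trichromat, 100 million for a tetrachromat and 10 billion for a pentachromat.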

Of course, the true number of colours visible to the human eye is unknown, and some people may have better perception than others. In 1931 the CIE (Commission internationale de l’éclairage) established the “CIE 1931 XYZ color space”, a horseshoe-shaped colour plot covering wavelengths from 380-700 nm, with saturation running from 0% at the centre point to 100% on the periphery. The CIE’s work suggests humans can see approximately 2.4 million colours.

CIE 1931 XYZ color space

Others postulate that humans can discriminate about 150 hue bands between 380 and 700 nm. By also varying saturation and brightness, it is possible to distinguish many more colours – maybe 7 million [1].

Visible colour spectrum

This puts the human visual system in the mid-range of colour perception. Marine mammals, adapted to the low-light environment they live in, are monochromats, perceiving only about 100 shades. At the other end of the spectrum, pentachromats – some butterflies, for example – may see 10 billion colours.

Now, in computer vision “true colour” is considered to be 24-bit RGB, or 16,777,216 colour variations – more than most people can actually perceive. The lower-bit-depth alternatives are limited: 8-bit colour provides 256 colours, and 16-bit colour is an odd combination of R (5 bits), G (6 bits) and B (5 bits), giving 65,536 colours. Can we perceive the difference? Here is a full 24-bit RGB photograph:
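To make the bit-depth arithmetic concrete, here is a sketch of the 16-bit R5-G6-B5 packing described above, showing how the low bits of each 8-bit channel are discarded (the function names are mine, for illustration):

```python
def pack_rgb565(r: int, g: int, b: int) -> int:
    """Pack 8-bit-per-channel RGB into a 16-bit RGB565 value (R:5, G:6, B:5 bits)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_rgb565(v: int) -> tuple:
    """Expand an RGB565 value back to 8 bits per channel (low bits are lost)."""
    r = (v >> 11) & 0x1F  # 5 bits of red
    g = (v >> 5) & 0x3F   # 6 bits of green
    b = v & 0x1F          # 5 bits of blue
    return (r << 3, g << 2, b << 3)

# The colour counts quoted above follow from the bit depths:
print(2 ** 24)  # 24-bit "true colour": 16,777,216 colours
print(2 ** 16)  # 16-bit RGB565: 65,536 colours
print(2 ** 8)   # 8-bit colour: 256 colours
```

A round trip such as `unpack_rgb565(pack_rgb565(200, 150, 100))` returns `(200, 148, 96)` rather than the original values – the quantisation error that, at these bit depths, the eye may or may not notice.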

Colour image with 24-bit RGB

Here’s the equivalent 8-bit colour photograph:

Colour image with 8-bit colour

Can you tell the difference? (Apart from the apparently uniform white region above the red and yellow buildings.)

[1] Goldstein, E.B., Sensation and Perception, 3rd ed. (1989)