…on the James Webb Space Telescope. Image courtesy of NASA.
The eye has long been thought the jewel of human anatomy. In Mesopotamia, fount of civilization and astronomy, Sumerians worshipped small gods of clay and marble, featureless but for the stare of large eyes. The ancient Egyptians, famous for economy of expression, had seven different hieroglyphs for the eye. In his Metaphysics, Aristotle called seeing the noblest faculty of man. Not even modernity has scrubbed the eye of its metaphysical sheen. James Russell Lowell dubbed it the notebook of the poet, and among the religious, its mere existence is said to refute Darwin.
Yet, for all this tribute, the human eye remains a limited technology, seeing only the rainbow of visible light, a thin slice of the electromagnetic spectrum. And, even in this realm, it falls short of other mammals, for whom nightfall is no curfew — to have squinted into the dark and seen the glimmer of raccoon eyes is to have felt the chill of this truth. Human vision fails to encompass the horse’s panoramic field of view or the brilliant ultraviolet shades seen by birds of prey. Even the insect, lower still on the totem pole of consciousness, absorbs a gushing, flood-like cinema — some 250 frames per second.
Hence, the human toolmaker has had to compensate, and vigorously. First by pouring new light into the world with fire and electricity, then by dreaming up technologies to complement the eye. Like early stone tools, these began crudely: chips of crystal unearthed and shaped into small magnifiers. In Rome, Nero was said to have peered through an emerald at gladiators fighting in the distance. It would take until 1608 for the telescope to be invented, and another year still until one was pointed at the night sky. Light from the moons of Jupiter fell down that telescope and into the mind of Galileo, who deduced from it that not all heavenly bodies circled the Earth — the first in a series of fresh cosmologies wrought by the telescope. In the late 18th century, William Herschel would use a large wooden telescope to find and catalogue thousands of “nebulae,” single stars then thought to be surrounded by clouds of luminous fluid. More than a century later, at the Mt. Wilson Observatory, Edwin Hubble took a closer look at Herschel’s nebulae. Hubble discovered that Herschel had been right in thinking that nebulae contained stars, but that he was seriously mistaken about the number. We now have a new word for nebulae: galaxies.
The 400 years since Galileo have marked a revolution in seeing unlike anything since the Cambrian explosion, when light sensitivity first rippled through the food chain, remaking it wholesale. In that time, the telescope has divided and grown, mutating from a single modest tube to a multitude of enormous, landscape-dominating forms. It has assumed the ways of an ascetic, leaving civilization for more solitary, contemplative environments: deserts, the shoulders of volcanoes, exotic islands, even space itself. The camera and the computer have given the telescope a memory, freeing it from constant attachment to the human eye. Most importantly, it has become a refined aesthete, keen to the entire electromagnetic palette, including a species of energy especially prized by astronomers: infrared light. A telescope sensitive to infrared light can see into thick clouds, where new stars and planets lurk. And in the chill of deep space, freed from the distorting shimmer of the atmosphere, an infrared telescope can see nearly all of time.