Science is in bad shape « Why Evolution Is True

Much ado lately about a pair of articles in The Economist that gather the growing evidence that science is getting an awful lot wrong these days, and may be less reliable than we think (and certainly far less reliable than we wish).

Perhaps the best summary of this I’ve read is from Jerry Coyne. Here’s the nut:

One piece is called “How science goes wrong”; the other is “Trouble at the lab.” Both are free online, and both, as is the custom with The Economist, are written anonymously.

As I read these pieces, I did so captiously, really wanting to find some flaws with their conclusions. I don’t like to think that there are so many problems with my profession. But the authors have done their homework and present a pretty convincing case that science, especially given the fierce competition to get jobs and succeed in them, is not doing a bang-up job. That doesn’t mean it is completely flawed, for if that were true we’d make no advances at all, and we do know that many discoveries in recent years (dinosaurs evolving into birds, the Higgs boson, dark matter, DNA sequences, and so on) seem solid.

I see five ways that a reported scientific result may be wrong:

  1. The work could be shoddy and the results therefore untrustworthy.
  2. There could be duplicity, either deliberate fraud or a “tweaking” of results in one’s favor, which might even be unconscious.
  3. The statistical analysis could be wrong in several ways. For example, under standard criteria you will reject a correct “null” hypothesis, and accept an alternative but incorrect hypothesis, 5% of the time, which means that something like 1 in 20 “positive” results (rejections of the null hypothesis) could be wrong. Alternatively, you could accept a false null hypothesis if you don’t have sufficient statistical power to discriminate between it and an alternative true hypothesis. (A small simulation after this list illustrates both errors.) Further, as the Economist notes, many scientists simply aren’t using the right statistics, particularly when analyzing large datasets.
  4. There could be a peculiarity in one’s material, so that the conclusions apply only to a particular animal, group of animals, species, or ecosystem. I often think this might be the case in evolutionary biology and ecology, in which studies are conducted in particular places at particular times, and are often not replicated in different locations or years. Is a study of bird behavior in, say, California going to give the same results as a similar study of the same species in Utah? Nature is complicated, with many factors differing among locations and times (food abundance, parasites, predators, weather, etc.), and these could lead to results that can’t be generalized across an entire species. I myself have failed to replicate at least three published results by other people in my field. (Happily, I’m not aware that anyone has failed to replicate any of my published results.)
  5. There could be “craft skills”: technical proficiency, gained by experience, that isn’t or can’t be reported in a paper’s “materials and methods,” and that makes a given result irreproducible by other investigators.
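
To make the two statistical errors in point 3 concrete, here is a minimal simulation sketch in Python (not from Coyne or the Economist; the sample size of 30 per group and the 0.3-standard-deviation effect are illustrative assumptions). The first loop runs tests where the null hypothesis is true; the second, where a real but modest effect exists and the study is underpowered.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
ALPHA = 0.05          # conventional significance threshold
N_EXPERIMENTS = 10_000
N_PER_GROUP = 30      # illustrative sample size (an assumption)

# Case 1: the null hypothesis is true (both groups drawn from the
# same distribution). Roughly ALPHA of the tests still "reject" it;
# these are the false positives described in point 3.
false_positives = 0
for _ in range(N_EXPERIMENTS):
    a = rng.normal(0.0, 1.0, N_PER_GROUP)
    b = rng.normal(0.0, 1.0, N_PER_GROUP)
    _, p = stats.ttest_ind(a, b)
    if p < ALPHA:
        false_positives += 1
print(f"False-positive rate under a true null: "
      f"{false_positives / N_EXPERIMENTS:.3f} (expected ~{ALPHA})")

# Case 2: the null is false (a real 0.3-SD effect), but the sample
# is small, so the test is underpowered and often fails to reject;
# this is the second error in point 3.
detections = 0
for _ in range(N_EXPERIMENTS):
    a = rng.normal(0.0, 1.0, N_PER_GROUP)
    b = rng.normal(0.3, 1.0, N_PER_GROUP)  # assumed true effect
    _, p = stats.ttest_ind(a, b)
    if p < ALPHA:
        detections += 1
print(f"Power to detect a 0.3-SD effect with n={N_PER_GROUP}: "
      f"{detections / N_EXPERIMENTS:.3f}")
```

With α = 0.05, the first case “discovers” an effect that isn’t there about 5% of the time, while the second detects a genuine effect only around 20% of the time at this sample size, so most real effects of that magnitude would go unreported.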

If you read the Economist pieces, all of these are mentioned save #4 (peculiarity of one’s material). And the findings are disturbing.

Coyne follows with a particularly lucid commentary on these articles and their implications. Highly recommended. 

Find it at Science is in bad shape « Why Evolution Is True.

Ross Andersen on the James Webb Space Telescope

lareviewofbooks:

ROSS ANDERSEN on the James Webb Space Telescope.

Image courtesy of NASA

The eye has long been thought the jewel of human anatomy. In Mesopotamia, fount of civilization and astronomy, Sumerians worshipped small gods of clay and marble, featureless but for the stare of large eyes. The ancient Egyptians, famous for economy of expression, had seven different hieroglyphs for the eye. In his Metaphysics, Aristotle called seeing the noblest faculty of man. Not even modernity has scrubbed the eye of its metaphysical sheen. James Russell Lowell dubbed it the notebook of the poet, and among the religious, its mere existence is said to refute Darwin.

Yet, for all this tribute, the human eye remains a limited technology, seeing only the rainbow of visible light, a thin slice of the electromagnetic spectrum. And, even in this realm, it falls short of other mammals, for whom nightfall is no curfew — to have squinted into the dark and seen the glimmer of raccoon eyes is to have felt the chill of this truth. Human vision fails to encompass the horse’s panoramic field of view or the brilliant ultraviolet shades seen by birds of prey. Even the insect, lower still on the totem pole of consciousness, absorbs a gushing, flood-like cinema — some 250 frames per second.

Hence, the human toolmaker has had to compensate, and vigorously. First by pouring new light into the world with fire and electricity, then by dreaming up technologies to complement the eye. Like early stone tools, these began crudely: chips of crystal unearthed and shaped into small magnifiers. In Rome, Nero was said to have peered through an emerald at gladiators fighting in the distance. It would take until 1608 for the telescope to be invented, and another year still until one was pointed at the night sky. Light from the moons of Jupiter fell down that telescope and into the mind of Galileo, who deduced from it that not all heavenly bodies circled the Earth — the first in a series of fresh cosmologies wrought by the telescope. In the 19th century, William Herschel would use a large wooden telescope to find and catalogue thousands of “nebulae,” single stars then thought to be surrounded by clouds of luminous fluid. A century later, at the Mt. Wilson Observatory, Edwin Hubble took a closer look at Herschel’s nebulae. Hubble discovered that Herschel had been right in thinking that nebulae contained stars, but that he was seriously mistaken about the number. We now have a new word for nebulae: galaxies.

The 400 years since Galileo have marked a revolution in seeing unlike anything since the Cambrian explosion, when light sensitivity first rippled through the food chain, remaking it wholesale. In that time, the telescope has divided and grown, mutating from a single modest tube to a multitude of enormous, landscape-dominating forms. It has assumed the ways of an ascetic, leaving civilization for more solitary, contemplative environments: deserts, the shoulders of volcanoes, exotic islands, even space itself. The camera and the computer have given the telescope a memory, freeing it from constant attachment to the human eye. Most importantly, it has become a refined aesthete, keen to the entire electromagnetic palette, including a species of energy especially prized by astronomers: infrared light. A telescope sensitive to infrared light can see into thick clouds, where new stars and planets lurk. And in the chill of deep space, freed from the distorting shimmer of the atmosphere, an infrared telescope can see nearly all of time.
