Pseudoscience and Junk Science
Categories: Popular culture and society; science and technology
Many ideas that are presented in popular media as science are actually false or very poorly supported by evidence. These pseudoscientific ideas can be used both as general disinformation and for specific practical purposes, such as winning a court case or influencing legislation.
anecdotal evidence: isolated instances illegitimately used to establish a logical principle or general rule
conspiratorial outlook: a belief that others' behavior is caused by bias or ulterior motives
junk science: faulty science that attacks established scientific findings or advances poorly supported theories for a practical goal
pseudoscience: scientific-sounding claims that are either scientifically false or so poorly supported by evidence that they should not be taken seriously
Many people assume that any idea claiming to be science actually is science; they find the very existence of pseudoscience surprising. Yet pseudoscience is more than merely wrong or unsupported science: being wrong or speculative does not by itself make an idea pseudoscientific. Speculation is a legitimate part of science, and many ideas that initially look promising turn out to be wrong (and vice versa). When ideas in legitimate science turn out to be wrong, their supporters eventually admit it and abandon them.
One example of such self-correction is cold fusion, the idea that nuclear fusion could be achieved using simple apparatus. After an initial flurry of interest in the 1980s, the observations that seemed to support cold fusion turned out to be faulty. Most scientists who had supported cold fusion abandoned the idea. A few scientists continue to believe there was some basis for cold fusion, but they also realize that they must supply adequate proof in order to convince other scientists.
In contrast, pseudoscience is not merely wrong, but its supporters also refuse to accept contrary evidence, often invent complex alternative explanations to support their ideas, and openly defy the consensus of the scientific community, often militantly. Frequently, they accuse the scientific community of bias or ulterior motives for refusing to accept unorthodox ideas.
There is no sharp boundary between science and pseudoscience. One useful simple classification is that of physicist James Trefil, who grouped scientific ideas into a center, where ideas are firmly supported and unlikely to be disproven, a frontier, where ideas are debatable, and a fringe, where ideas are very unlikely to be valid. Generally, frontier ideas move either into the center or out toward the fringe fairly quickly. For example, the idea that a meteor impact triggered the extinction of the dinosaurs was a frontier idea when it was first proposed but has moved toward the center. Cold fusion was also a frontier idea when it was first proposed but moved fairly rapidly to the fringe.
Rarely, even larger shifts occur. For example, continental drift was considered a fringe idea for decades before new evidence was discovered that caused it to move rapidly to the center in a modified form called plate tectonics. Not all fringe ideas necessarily deserve the label “pseudoscience.” For example, although medical studies have not confirmed the idea that vitamin C in large doses helps prevent colds, the theory itself is harmless and does not violate any well-established scientific findings. Bigfoot and UFOs are fringe ideas that do not necessarily violate known scientific findings, but they count as pseudoscience because the evidence for these ideas is very poor. Finally, some ideas, such as the Earth being only a few thousand years old, are pseudoscience because they contradict well-demonstrated scientific findings.
Although some advocates of pseudoscience engage in deliberate deception, most sincerely believe the ideas they publish. They are often motivated by a desire for fame or to have their beliefs accepted as science, or for psychological reasons such as a need for mental or spiritual satisfaction. Usually, their ideas are supported by a combination of wishful thinking, faulty data, and logical fallacies.
For scientists well trained in their fields, the distinction between science and pseudoscience is generally obvious. For nonscientists without specialized training, the distinction can be difficult. However, there are a number of clues that often are revealing and do not require much specialized training. If a new theory directly conflicts with established science, explicitly claims that existing science is wrong, or makes sweeping claims that most scientists reject, it is less likely to be legitimate science. If it merely reports a new discovery that seems improbable but does not contradict core scientific principles, it is more likely to follow accepted scientific practice, although it may still be incorrect.
The more strongly a theory opposes accepted science, the more likely it is to be pseudoscience—although almost all revolutionary scientific breakthroughs would also appear to be pseudoscience from the point of view of the prerevolutionary scientist. If a theory does radically oppose established science, one question to consider is whether its author makes obvious mistakes about simple matters. If so, the author is very unlikely to be correct in more complex matters that require more technical knowledge.
There are certain kinds of faulty data that crop up repeatedly in pseudoscience. Anecdotal data, for example, is data gathered from isolated cases and used to support a general conclusion. In order to be useful, anecdotal data must be both true and representative. If the anecdote is false, of course, then it cannot be used to prove anything, but even if it is true, it must still be representative and not just an isolated case. For example, it may be true that someone became ill after a vaccination, but if the event is just a rare case, it does not necessarily prove the vaccination is unsafe. To count as scientific evidence, the anecdote must be reproducible: that is, scientists must be able to reconstruct the situation described and obtain the same result.
Pseudoscience often also makes use of obsolete data. Often, the alleged facts were reported in newspapers or popular media or were early scientific findings that later either were found to be erroneous or were superseded by more accurate results. If a theory is based heavily on old data, especially from popular sources that are hard to check, it should be considered suspect.
Pseudoscientists often cite other authorities in support of their ideas. Important questions to ask about alleged supporting authorities include how well respected they are in their own fields, how well informed they are about the subject they are supposedly supporting, and whether they really do support the theory in question or whether they are being quoted out of context and without their consent.
One of the most widespread and conspicuous features of pseudoscience is a conspiratorial outlook. Pseudoscientists commonly claim that their theories are not being taken seriously by the scientific establishment because of ulterior motives, such as protecting research funding, refusal to admit that an outsider might have made an important discovery, pride, or ideology. Even if these claims are true, they have nothing at all to do with whether the scientific claims of the author are true.
A common tactic in pseudoscience, often termed the “Galileo fallacy,” is to point to examples of scientists who were persecuted in the past only to be vindicated later on. Often, as in the case of Galileo, the examples are based on oversimplified stereotypes that are not historically valid. Even when it is true that an idea met opposition from other scientists only to be found true later, that does not mean that opposition makes an idea true. The vast majority of ideas that encounter opposition turn out to be false.
Disproving an idea does not prove its opposite. For example, in 2008 the method of lead analysis that had been used to examine the bullets that killed President John F. Kennedy was found to be faulty. The method could not be used to prove that all the bullets came from the same gun, as had been claimed. However, that does not prove that the bullets came from more than one gun; it only proves that the method was inconclusive. It is very common in pseudoscience for an author to attack some finding in science and then assert that his own idea must be correct. Even if the scientific concept really is incorrect, that does not prove that some contrary idea is true. Unanswered questions are merely unanswered; they are never proof for an idea. For example, although there are many unanswered questions about evolution, those unanswered questions do not disprove evolution.
The term “junk science” came into widespread use in the 1980s. Perhaps the first significant example was the attempt by the tobacco industry to discredit the scientific evidence linking tobacco and lung cancer. Scientists and others working for the industry attacked minor and generally irrelevant flaws in the medical research, offered alternative explanations for the data, and questioned the validity of applying animal studies to humans. The intent of this strategy was to instill doubt in the minds of customers, legislators, judges, and juries. In fact, a common strategy in both pseudoscience and junk science is to exaggerate the degree of uncertainty about some scientific idea and then assert that, since “the experts do not agree,” people should decide for themselves whether or not an idea is valid.
Pseudoscience is a significant concept for many reasons. First, many people are unaware of its existence, and merely knowing that pseudoscience exists can help them understand that some claims that sound revolutionary are not worth serious consideration. Pseudoscience has often been used in support of extremist movements, and a historical understanding of its function will help to contextualize future pseudoscientific claims. Widespread popular acceptance of pseudoscience indicates that public understanding of science and logic is poor. Considering that many public policy issues involve science and that scientific evidence is so widely used in court, poor public understanding of science and of the proper use of evidence is a matter of grave concern. Finally, pseudoscience can illuminate the workings of science by contrasting real science with counterfeit science.
Both sides in the climate change controversy have accused the opposition of practicing pseudoscience and junk science. The most reliable way to determine which side is more correct is to become well informed about climatology. However, clues such as allegations of conspiracy or ulterior motive, strength and relevance of professional credentials, use of logical fallacies and questionable data, attempts to exaggerate doubt, and the weight of scientific opinion are all revealing.
Steven I. Dutch
Feder, Kenneth L. Frauds, Myths, and Mysteries: Science and Pseudoscience in Archaeology. Boston: McGraw-Hill, 2008. Every legitimate scholarly field has a pseudoscience counterpart. This book discusses topics such as ancient astronauts, Atlantis, Noah's Ark, and the Shroud of Turin.
Gardner, Martin. Did Adam and Eve Have Navels? Debunking Pseudoscience. New York: W. W. Norton, 2001. Gardner has been observing and writing about pseudoscience for over half a century, and his collected works cover almost every pseudoscientific theory of modern times. This book deals with issues as varied as opposition to evolutionary theory and alternative medicine fads.
Powell, James Lawrence. The Inquisition of Climate Science. New York: Columbia University Press, 2012. Powell describes the organized attack on climate science in this highly unsympathetic book.
Randi, James, and Arthur C. Clarke. An Encyclopedia of Claims, Frauds, and Hoaxes of the Occult and Supernatural. New York: St. Martin's Griffin, 1997. A former professional magician whose expertise in magic enabled him to spot the tricks used by alleged psychics, Randi has become one of the most prolific authors on pseudoscience. He joins forces with celebrated science-fiction author Arthur C. Clarke in a comprehensive survey of pseudoscience.
Schadewald, Robert J. Worlds of Their Own. Philadelphia, Pa.: Xlibris, 2008. Collects major essays previously published in numerous periodicals. Uses enduring themes in pseudoscience to explore the workings of science and shows that pseudoscience movements all have features in common.
Shermer, Michael, and Stephen Jay Gould. Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time. New York: Holt, 2002. Shermer, who writes the “Skeptic” column in Scientific American, and Gould, a biologist and popularizer of science, explore numerous popular delusions. They argue that a desire for mental and spiritual comfort plus a tendency to see nonexistent patterns account for much of the widespread belief in false ideas.