In antiquity, the followers of the Greek philosopher Pythagoras settled any doctrinal dispute by saying “Ipse dixit”––“He himself has said it.” Rather than demonstrating the strength of an argument, a Pythagorean would simply invoke the great master to end the debate. Today we call this logical fallacy the “appeal to authority,” yet we continue to indulge it. Only now, our master is science or, more often, what appears to be science.

The rise of modern science in the seventeenth century was driven by testing and rejecting such appeals to authority. Whether scripture, tradition, or Aristotle, authority could not be allowed to substitute for logic and evidence. The famous example, of course, is Galileo, who trusted the truths of mathematics and personal observation even when they led to conclusions that contradicted the doctrines of the church or the authority of the ancients. Over the centuries, the success of the scientific method––grounded in skepticism of received wisdom, and in demands for coherent argument and convincing evidence no matter whose authority was challenged––led to a remarkable understanding of nature and to technologies that have transformed our world beyond the fantasies of our ancestors.

Yet the success of science has ironically given it an authority we too often accept without question. The provisional conclusions of research are frequently announced as definitive before the scientific community has adequately vetted them, and the prestige of science and its scholarly institutions often obscures just how tentative such claims are. When professional advancement, political advantage, or ideological gratification is bound up in the acceptance of new ideas or alleged truths, the temptation to suspend one’s skepticism becomes powerful and sometimes dangerous.

The anti-vaccination movement is an example of the dangers caused by bad or fraudulent scientific research. Since their development in the late eighteenth century, vaccines have saved millions of lives, eradicated smallpox, and nearly eliminated diseases like polio. Over two centuries of experience and observation have established that vaccination works and that its risks are minimal. Yet in 1998, British gastroenterologist Andrew Wakefield and his co-authors published a paper in the prestigious medical journal The Lancet claiming that the MMR (measles, mumps, and rubella) vaccine given to children could cause autism and bowel disease. Researchers across the world debunked the claim within a few years, but by that time an anti-vaccination movement had sprung up among parents who, on the strength of shoddy and, as some have charged, fraudulent research, refused to have their children vaccinated.

Wakefield’s claims gained traction because he and his colleagues were credentialed scientists and their work appeared in one of the world’s most authoritative medical journals. Our reflexive faith in science, particularly on matters the layman knows little about, compelled assent, especially from the parents of autistic children who understandably wanted to know the reasons for their child’s condition and to exact accountability from those responsible. Even though The Lancet retracted the original research, which its editor called “fatally flawed,” the anti-vaccination movement has continued, albeit much diminished—though it did flare up again recently when Calvin College threatened to bar non-vaccinated students from campus after a mumps outbreak. The dangers of this movement are not just to the unvaccinated children themselves, but also to other children who lose “herd immunity,” the protection against pathogens given to a whole community when significant numbers of its members are vaccinated.

Even greater dangers have arisen from the extension of the scientific method from the natural to the human world. The spectacular success of natural science suggested that the behavior and motivations of people, and their social and cultural practices and institutions, could also be known and understood in terms of natural laws as reliable and predictable as Newtonian physics. This faith in turn spurred a fervent belief in progress, the inevitable improvement of human life that would eventually correct the malign forces of ignorance, superstition, tradition, and poverty. The technological advances that have improved our material existence have made science our world’s “ipse dixit,” the ultimate authority to which we turn for solutions to our problems.

But our most important problems concern human behavior and motivation, making this faith a dangerous category error. Humans are radically different from animals or other natural phenomena. They alone, arguably, have minds, consciousness, self-awareness, and most importantly, free will, the ability to act spontaneously and unpredictably. None of these attributes has as yet been explained solely through science, and their existence still keeps humans and their behaviors a mystery. As such, they cannot be known and explained with the certitude and predictability required of science: “For,” as philosopher Isaiah Berlin writes, “the particles are too minute, too heterogeneous, succeed each other too rapidly, occur in combinations of too great a complexity, are too much part and parcel of what we are and do, to be capable of submitting to the required degree of abstraction, that minimum of generalization and formalization––idealization––which any science must exact.”

In many cases, then, the quantitative methods and technical vocabulary of science applied to human behaviors and experiences lie beyond what mathematician John Allen Paulos has called the “complexity horizon,” that “limit or edge beyond which social laws, events, or regularities are so complex as to be unfathomable, seemingly random.” While social scientists have discovered certain patterns of behavior to hold true under certain circumstances, there will always be exceptions that defy the norm.

The belief, however, that scientism, the extension of science’s methods and authority to questions beyond its competence, is genuine science, and that its claims should command the same assent, has been disastrous. For example, Marxism presented itself not as a philosophy of history but as the science of history, comprising predictable, objective laws of economic and political development equivalent to the laws of biology and physics. As we now know, Marxism functions more as a pseudo-religion, which explains why many today still cling to some of its tenets despite the overwhelming evidence of its bloody failure: the 100 million people killed in its name. The scientistic camouflage merely made its murderous irrationalism more acceptable to those who scorned traditional religion but never lost the human need to believe. As Arthur Koestler described his own communist faith: “there is now an answer to every question”; “doubt and conflict are a matter of the tortured past”; and “nothing henceforth can disturb the convert’s inner peace and serenity––except the occasional fear of losing faith again, losing thereby what alone makes life worth living.”

The U.S. was spared the worst of Marxism’s destruction, but we experienced our own malign social policies promulgated under the guise of objective science. The eugenics movement of the early twentieth century promoted social and economic policies justified by “race science.” This division of humanity into superior and inferior races was influenced by Darwinian theories of natural selection based on fitness for survival. Armed with the authority of Darwin, eugenicists deemed whole categories of people “unfit” for survival based on superficial and often arbitrary qualities. If allowed to reproduce or to intermarry with the superior races, they warned, the inferior races would swamp the more civilized and advanced white ones. Irrational bigotry was thus transformed into objective science.

For the first three decades of the twentieth century, eugenics was accepted as “settled science,” and adherence to its theories was a sign of intellectual sophistication and superiority. Professors and esteemed scholars from the nation’s most prestigious universities published eugenics research and started academic programs teaching this new “science.” Mainstream magazines popularized this research for a wider readership. States passed forced sterilization laws, as New Jersey did under Governor Woodrow Wilson in 1911. The federal government passed the 1921 and 1924 immigration restriction acts in response to fears of racial and ethnic pollution from Chinese, Slavs, Poles, and Southern Italians. Apocalyptic books warning of race suicide, like Lothrop Stoddard’s The Rising Tide of Color Against White World-Supremacy (1920) and Madison Grant’s The Passing of the Great Race (1916), were national bestsellers. High school biology textbooks taught eugenics and race “science.” Indeed, the high school textbook used to teach evolution in the famous Scopes Trial of 1925, Civic Biology, had a whole unit on eugenics. In one passage, the author described “parasitic” families as “spreading disease, immorality, and crime,” and mused that if they were lower animals, “we would probably kill them off to prevent them from spreading.”

Eugenics and race science formed the scientific consensus that shaped decades of federal and state government policy, yet little genuine scientific evidence underlay either. They led to the illiberal and cruel policies of forced sterilization, racial and ethnic exclusion, and institutionalization of those deemed “unfit.” It took the horrors of the Holocaust, which followed these theories to their logical conclusion, to discredit eugenics and relegate it to the long catalogue of other pseudo-sciences like phrenology, mesmerism, and alchemy.

Modern science has immensely improved human life, but human life involves much more than science can know or improve. Giving our assent to claims based on mere authority or assertions of “settled science” leaves us vulnerable to the scientism that has been used to justify some of the worst horrors in human history. A healthy skepticism, the hallmark of genuine science, should be our guide––especially when sweeping claims are made about the quirky, unique, complex mystery of human beings.