American Council of Learned Societies
Occasional Paper No. 31



Beyond the Academy:
A Scholar’s Obligations

George R. Garrison
Arnita A. Jones
Robert Pollack
Edward W. Said


The Social Responsibility of the Academy and Its Academicians
George R. Garrison

Reflections on the History Wars
Arnita A. Jones

The Dangers of Willful Ignorance
Robert Pollack

On Defiance and Taking Positions
Edward W. Said


The Dangers of Willful Ignorance

Robert Pollack
Columbia University

I have been asked to address the topic, “What is the scholar’s obligation to the larger public?” This is an easy question. A scholar’s obligation is always the same: to speak truth to power. The alternatives — to lie to power, or to tell the truth but only to one’s colleagues — always end up contributing to a mess at best, or a disaster at worst. The idea that scientists, in particular, can avoid dealing with the political consequences of their work has never sat well with me. As I study the roots of my own field — human genetics — I am struck by the magnitude of the problems that have stemmed from my profession’s capacity for willful innocence.

In this season, it makes sense to begin with a look backward, over our shoulder. On the 18th of June, 1940, Winston Churchill spoke to the House of Commons on the disastrous course of the war of England and France against Germany. He ended with the famous peroration:

Let us therefore brace ourselves to our duties and so bear ourselves that, if the British Empire and its Commonwealth last for a thousand years, men will say, “This was their finest hour.”

Many will recognize this sentence today, but few may recall the earlier phrase it refers to, the reason for the “therefore”:

But if we fail, then the whole world, including the United States, including all that we have known and cared for, will sink into the abyss of a new Dark Age made more sinister, and perhaps more protracted, by the lights of perverted science.

What could Churchill have meant in 1940 by “the lights of perverted science”? While he could have been guessing at the coming use of rockets, jet planes, napalm or nuclear bombs, he did not have to imagine any future weapons to call upon a full decade of enthusiastic participation by life scientists in the pre-war agenda of Hitler’s government. This collaboration had already led to the orderly, scientifically planned and executed euthanasia of hundreds of thousands of Germans, and by 1940 these operations had been extended to the East, in occupied Poland.

By the time Churchill spoke, the major organizations of German genetics, biology, anthropology, and medicine, and many of the best scientists and physicians in the Reich, had joined for more than five years in the murder of what the German government and its scientists had agreed to call “Ballastexistenzen,” lives not worth life.

Fifty years later those “lights of perverted science” still have the power to cast a shadow over the laboratories and hospitals in which biology and medicine come together. In the last third of the century, a new biology — built on the discovery that DNA is the genetic material — has provided medicine with a research agenda and a set of tools and techniques drawn from basic research on the human genome.

These have recently given us such notable successes as the isolation and characterization of the genes responsible for cystic fibrosis, Huntington’s Disease, and hundreds of other inherited diseases; the restoration of function by insertion of absent genes in tissues of patients with inherited diseases; and technologies of early warning for late-onset diseases such as Alzheimer’s, cancer, and heart disease.

While these successes and others like them are welcome, they have come to us entwined with less desirable consequences. For every late-onset disease that can be diagnosed in advance of symptoms by a telltale difference in DNA, considerable numbers of healthy people find themselves paying large sums only to confront news that leaves them little to do but wait for the inevitable.

The widening gap between diagnosis and treatment has had a second consequence, one that touches one of the most sensitive issues facing us today. Prenatal DNA diagnosis coupled with termination of pregnancy provides a rational way to avoid bearing a child with a life-threatening inherited disease; more and more diagnoses of variant versions of a gene can be made in a first-trimester fetus, providing a woman with a new and ever-growing set of reasons for early termination of her pregnancy.

But before molecular diagnostic techniques can be properly used on the DNA of either adults or fetuses, all interested parties must agree which versions of any gene are to be considered normal, and which may be taken as markers of childhood or adult disease.

In the near future these techniques will allow pregnant women to decide whether or not they want to bear a child whose physical and mental states today fall well inside the boundaries of “normal.” With time, computer technology may well allow the simultaneous analysis of DNA data on dozens or hundreds of different genes. At that moment, a knowledgeable woman will be able to get the information she needs to decide whether or not to carry to term a child that would be, for instance, a boy, or a girl, or short, or deaf, or gay, or straight.

Taken together, these and many other two-edged developments at the boundary of medicine and basic science have defined a new sort of privacy, one upon which all other definitions of privacy depend: the right to control the information contained in one’s own genome. Both law and politics move slowly; the technology is moving much faster than either.

As a result, issues of genetic privacy, left to grow in the dark of legal and political neglect, have developed the capacity to present us with unexpected and nasty surprises. If the “perverted science” that the Allies and the people of occupied Europe rightly feared and hated is still nowhere on the horizon, the tools and capacity for its reappearance are, unfortunately, nevertheless in our hands today.

The public knows this. Advances in human genetic analysis have been met by a widespread fear that aspects of molecular medicine somehow are — or will soon become — a shadow on every person’s future. The negative reaction to the contributions of genomic science to medicine manifests itself in many ways, from Congressional hostility to further increases in basic research budgets at the NIH, to legal skirmishes based on the supposition that techniques to elucidate a person’s genetic status will be used by government for non-therapeutic purposes.

How should the academic community itself respond to these matters? If we confuse what is possible with what is so, we slow the progress of the biomedical sciences, and reduce everyone’s chance of benefiting from such progress — including our own. If we ignore the past, and claim the risk is too small to worry about, we will lose control of our own futures and share responsibility for a future burdened with avoidable consequences.

The profession has only one good answer: to approach the problem as scientists. And the profession is right. We should ask perceptive questions about the technologies we have developed, gather data carefully, test our hypotheses, draw our conclusions, and publish our results so that our colleagues and others may know what we have found.

Our obligation is sharpened by the fact that the most powerful technologies for violating genetic privacy come from the best — not the worst — of our nation’s laboratories. It would be wrong to simply cull out the risks and attribute them to “bad” science; these problems are ours, precisely because they derive from excellent science.

There is a second answer, equally important though less central to the profession: to teach science well, to teach it so it becomes a living part of our culture. Here is the great physicist, Richard Feynman of Cal Tech, on this second task:

It is our responsibility as scientists, knowing the great progress which comes from a satisfactory philosophy of ignorance, the great progress which is the fruit of freedom of thought, to proclaim the value of this freedom; to teach how doubt is not to be feared but welcomed and discussed; and to demand this freedom, as our duty to all coming generations.1

Today, few people see science this way. According to too many news magazines, TV documentaries, soap operas and movies, scientists pursue magic powers as white-coated practitioners of a pagan religion. They obey only their own arcane rules, and then only to be first to uncover some mystery of the universe that would be better kept hidden. But since these discoveries can lead to new products of great, if morally ambivalent, value, the public cannot totally disregard their efforts. Public interest in science from this perspective is reduced to a somewhat risky set of deals with somewhat shady entrepreneurs.

But scientists active in their laboratories cannot be asked to create new knowledge, and share their creations with the lay public, out of good will alone. We need the leaders of our colleges and universities to articulate a vision of the University that includes at its center a commitment to the study of the political implications of science, and to back that vision with reasonable resources. Without such a jump-start, the other interests that lie at the heart of a science department are simply too strong and too utilitarian to be budged. The problem is, all too often we are managed, rather than led. Here is how a great academic leader, the late A. Bartlett Giamatti of Yale, distinguished between the two:

Management is the capacity to handle multiple problems, neutralize various constituencies, motivate personnel. . . . Leadership on the other hand is an essentially moral act, not — as in most management — an essentially protective act. It is the assertion of a vision, not simply the exercise of a style: the moral courage to assert a vision of the institution in the future and the intellectual energy to persuade the community or the culture of the wisdom and validity of the vision. It is to make the vision practicable, and compelling.2

Finally, it is not enough to be well led, it is not enough to tell the truth; it is also necessary to live in the world, to engage the issues of the day in one’s scholarly work. I saw this was possible when I was an undergraduate, and the lesson has stayed with me. Though I was majoring in physics at Columbia in the late 1950s, I tagged along with my friends to literature classes taught by Lionel Trilling. He was a distant, somewhat foggy creature to me, since he dragged constantly on his cigarettes, and I always wound up at the back of one or another very smoke-filled room.

Nevertheless, I knew he was serious about the books he taught, and serious about the world, because an overlap of concerns — the text and the world — marked Trilling’s teaching. Even though he sometimes claimed to be interested solely in the words of the text, the world could not keep from informing his interpretations.

My colleague Edward Said caught this twenty years ago, in this quote from an article about his book Orientalism:

In a recent interview [Edward Said] cites with approval Lionel Trilling’s assertion that “there is a mind of society” and argues that it is this mind that the critic should “address, tutor, doctor, inform, evaluate, criticize, reform.”3

I find this notion of a “mind of society” entirely congenial. But as a scientist, when I look around me I find, with some dismay but no surprise, precious few colleagues willing to “address, tutor, doctor, inform, evaluate, criticize, reform” the scientific part of the societal mind.

There are few small classes in science for a curious undergraduate, no common syllabus; there is no list of exemplary ideas, there are no axes of debate. Instead we offer a lot of different ways to memorize, with a few oddball chances to read and argue thrown in for flavor, like raisins in a bland, doughy pudding.

Why is this? Why does the scholarly world presume that any idea from the humanities or social sciences can be not only understood, but debated, by a seventeen-year-old; but that no idea from the sciences is debatable, unless one first marries the profession, through choice of major and then career?

Some scientists — not all, and not the best — think this is just the way it is. Some humanists — also not all, and also not the best — agree. Both, oddly enough, agree that science is hard stuff. Both see science as a narrative with a special claim to truth, a claim that makes it intrinsically inaccessible. Even as they disagree as to whether the claim is justified, they agree on science’s inaccessibility.

I don’t agree with either of them. I see science as a fully accessible argument between imagination and physical action. The imagination of a scientist creates a vision of one aspect of the natural world, usually of the world outside the mind, but sometimes even of an aspect of the mind itself. But that vision is never enough: physical action — experimentation — weighs in immediately, to test the model.

This back-and-forth of theory and practice — the scientific method — works, because in science, the imagination must either yield to, or encompass, the results of experiment. There is no room in science for empty speculation, nor for its complement, the involutionary, anarchic, cynical despair we find in so much of today’s critical theory.

The resulting narratives of successful science — discoveries, we call them — are bounded by culture no less than any other narrative. But the models they stem from, confirm, and alter, are not simply narratives. These models, the most-recently-adapted, current working hypotheses of science, float above all their previous narrative versions, persisting through time, never final, never culture-bound.

We live by such models, because they mold the patterns of our thought. Shakespeare gave us our way of seeing ourselves as having inner voices and developing through inner dialogue. In no really different way, the sciences continue to give us new ways to see ourselves. These, in time, become as completely taken for granted as the Shakespearean notion of a private monologue. In just this way Freud’s unconscious and Darwin’s natural selection — to name two — have not merely been added to our vocabulary. They have become aspects of the way we understand ourselves.

Now here’s the paradox: new ways of seeing ourselves or our place in nature are precisely what we do not teach today, neither to the undergraduate nor to the specialist. There is a reason for student and professor alike to feel the same urgency about this intellectual shortfall as we are obliged to feel about the various fiscal shortfalls that nibble at our heels.

I’ll sum up with a few words from Dante’s Inferno. I first read the Inferno in 1958, in a Columbia general education course. A while ago I returned to it, reading Pinsky’s new translation with great pleasure. The Inferno is about many things; but to me, it was and still is, above all, an extraordinary example of the power of words — text — to transcend death. Dante meets the damned souls of hell, and has the audacity to promise they will have eternal life on earth if they will allow him to write their stories. They tell him their stories, he writes them out brilliantly and, after seven hundred years, we still read those stories. No one can say Dante promised more than he could deliver; and it is clear that we live today in a world of science easily recognized in his Inferno.

In Canto 31, we meet ourselves, face-on. At the bottom of the last circle Dante sees, in the distance, a Stonehenge of monstrous, missile-like towers. Thinking these to be the Giants of Genesis surrounding the very pit of Hell, he says to us in a parenthetical aside:

. . . (Nature indeed,
When she abandoned making these animals,
Did well to keep such instruments from man;
Though she does not repent of making whales

Or elephants, a person who subtly inquires
Into her ways will find her both discreet,
And just, in her decision: if one confers

The power of the mind, along with that
Of immense strength, upon an evil will
Then people will have no defense from it.)

Have no doubt: there will be more moments when misused science will indeed leave people with no defense from an evil will. Our obligation as scholars is to do what we can to keep science from being misused. To do this, we must begin to open collaborations between scientist and nonscientist, to create a real home in the academy for the changing but always powerful models, and not just the painfully-memorized, data-filled narratives, of science.

These scientific models not only articulate, but also shadow, our lives. Some threaten older notions of free will, human equality, even of fate itself. This is not a reason to inhibit the work of science; but it is a reason to be sure these models do not go from the laboratory to the general culture, unchallenged by examination in our colleges and universities.

Notes

1. Feynman, Richard, What Do You Care What Other People Think? (New York: Bantam Books, 1989) 248.

2. Giamatti, A. Bartlett, in National Research Council, Fulfilling the Promise: Biology Education in the Nation’s Schools (Washington, DC: National Academy Press, 1990) 102.

3. Katz, James C., “Trilling Award to Said,” Columbia College Today 4.9 (Winter 1977) 14.
