Chapter 2: Applied Science and Human Welfare
A chemist is working for a company which pours waste products into a stream, polluting it in a way that, for a reasonable cost, could be avoided. In recommending policy at such a point, is this man’s first duty to his employer or to the public? Eighteen top German physicists, including four Nobel prize winners, stated in 1957 that they would not work on projects having any connection with nuclear weapons. Were they right in assuming that the scientist has some responsibility for the uses to which his discoveries are put? This chapter deals with the moral decisions involved in expressing through applied science the Christian vocation to serve human need.
A. Science: Creator and Destroyer
In the last hundred years, science has had an impact on almost every aspect of life in the West. Men have been released from backbreaking labor, living standards have risen, and leisure has increased. New drugs, cures of formerly fatal diseases, and improvement of health standards have more than doubled the average life span in the last century. New products, processes, and machines surround us on every hand, from our electrified homes to our industrialized cities. A trip from New York to San Francisco, which required four months in 1860, takes four hours by jet plane.
With the release of men from deadening drudgery and toil, new possibilities of cultural growth and the use of man’s varied capacities have emerged. Education and an enriched intellectual life have been more widespread, encouraged by the demands of an industrial civilization for skilled workers. For the first time in history, abundance is possible for every nation — not, as in the past, at the expense of others, but through a nation’s own scientific development. The ancient dream of a society free from famine, disease, poverty, and despair is beginning to be fulfilled by the applications of science.
The need for scientific progress is illustrated by the situation faced today in underdeveloped countries. About half of the babies born in most parts of Africa and Asia die in childhood. Those who survive face a life of squalor and misery. In India, for example, the average life expectancy is 30½ years, compared to 68½ years in the United States; the average annual income is less than $40, compared to $1,469 in the U.S. Energy utilized annually per capita, which is a rough index of living standard, is in some countries equivalent to 0.02 tons of coal, compared to 8 tons, or 400 times as much, in the U.S. Two thirds of the world usually goes to bed hungry at night. A correspondent reports:
In Persia I talked with a peasant who had seen a can of our dog food. He said that if he could get such a can once a week for his family he would be happy. He might be willing to die to realize his ambition to lead the life of an American dog.1
Clearly one of the justifications of science is its contribution to human welfare. Science can be an instrument of good will, extending the reach of the good Samaritan’s hand. Central in Christian ethics is love of neighbor, which means concrete action to meet his needs. Biblical religion is not primarily otherworldly, nor is it interested only in “pie in the sky by and by.” It finds crucial meaning in this world, for example in how one treats other people. Jesus was concerned about men’s bodies as well as their minds. “I was sick and you visited me” (Matthew 25:36). “If your enemy is hungry, feed him” (Romans 12:20). Thus Kirtley Mather, the Harvard geologist, can say:
Science is obviously in the service of religion. One of the keynotes of every great religion is expressed in the desire that the sick should be made well, blind eyes opened, unfortunate economic situations set aright, that persons in positions which give them no opportunity to display their own real worth should be given that opportunity. . . . Is it not obvious that religion has profited greatly by the knowledge which science gives along such practical lines as these?2
The chief justification for any work is that it meets genuine human needs. Some commentators, however, feel that this should not be the main Christian motive in science because it would “reduce Christianity to an ethical system.” William Pollard, executive director of the Oak Ridge Institute for Nuclear Studies, argues that ethics is not the main message of Christianity, and that humanitarian goals for science are held also by agnostics having secular objectives.3 However, concern for others is an essential part of the Christian faith, even though religion is more than ethics alone. And Western culture has been so influenced by the biblical tradition that “humanitarianism” in any form may owe a greater debt to Christianity than it recognizes. Surely scientists may legitimately see their work as a Christian response to human needs.
But if science can be justified by its constructive results, what can be said of its destructive consequences? The possibilities for both good and evil uses of new discoveries have been repeatedly illustrated in the history of science. Leonardo da Vinci suppressed his plans for a submarine: “This I do not divulge on account of the evil nature of men who would practice assassinations at the bottom of the sea by breaking ships in their lowest parts and sinking them.” Napier refused to publish the ingredients of an explosive. Alfred Nobel thought that in inventing dynamite he had made a weapon so powerful that war would be impossible; disillusioned, he gave part of the fortune he made from it to establish the Nobel Peace Prize. Thus the problem of the destructive power unlocked by science is no new one, though it has achieved a new magnitude in our time.
The scientists working on the atomic bomb recognized at an early stage some of the implications of their work. The Franck report was written at the Chicago project in June, 1945, two months before Hiroshima:
If the United States were to be the first to release this new means of indiscriminate destruction upon mankind, she would sacrifice public support throughout the world, precipitate the race for armaments, and prejudice the possibility of reaching an international agreement on the future control of such weapons. Much more favorable conditions for the eventual achievement of such an agreement could be created if nuclear bombs were first revealed to the world by a demonstration in an appropriately selected uninhabited area.4
In contrast to the Franck report, the decision to drop the bomb on civilian populations was governed mainly by immediate military considerations. “Unconditional surrender” had become an end in itself, at the expense of clarity about the political goals for which we were fighting. The U.S. did not even try to exploit by diplomacy the intercepted cables which were decoded during July, indicating that Japan was trying through Russian intermediaries to negotiate a surrender on all our terms except the retention of the Emperor — which, after using the bomb, we were to allow anyway. Louis Morton, deputy chief historian of the Army, summarizes the situation in July:
Thus the Japanese Government had by then accepted defeat and was seeking desperately for a way out; but it was not willing even at this late date to surrender unconditionally, and would accept no terms that did not include the preservation of the imperial system.5
Michael Amrine’s recent reappraisal, The Great Decision, states:
Grew, Stimson, McCloy, and many men in uniform who were familiar with Asia, such as Zacharias [the man in charge of psychological-warfare broadcasts to Japan], came to feel that a grave error of our surrender policy was in not negotiating about the Emperor and conditional surrender sooner than we did.6
It is in terms of such policy questions that one must evaluate the fact that mankind’s first use of atomic energy took from two cities 120,000 lives.
The continuing sense of social concern among the atomic scientists was a striking feature of the postwar years. J. R. Oppenheimer, who directed the Los Alamos project, wrote afterward:
The experience of the war has left us with a legacy of concern. Nowhere is this troubled sense of responsibility more acute . . . than among those who participated in the development of atomic energy for military purposes. In some sort of crude sense, which no vulgarity, no humor, no overstatement can quite extinguish, the physicists have known sin; and this is a knowledge that they cannot lose.7
The Bulletin of the Atomic Scientists was launched, and has continued to deal responsibly and influentially with social and political implications of science. Scientists campaigned successfully for civilian rather than military administration of atomic energy. They continued to warn the public of the destructive power of atomic weapons and the urgent need for international controls. More recently, the American Association for the Advancement of Science (AAAS) established a Committee on the Social Aspects of Science, whose 1956 report spoke of “the pressing need that scientists concern themselves with social action” and urged scientific organizations to abandon their traditional isolation from public problems.
The effects of science on society present a mixed record of good and evil. The automobile brings mobility to the average man, but U.S. traffic deaths in the last decade exceeded our fatalities in World War II. TV and radio provide new channels for the communication of ideas, but make it easier for dictators to control nations, or for trivial entertainment to mold the mass mind. Technical advances produce new products but make possible new centralizations of political power in the police state, and new concentrations of economic power in industry. These destructive powers, especially in warfare, have led some authors, e.g., Aldous Huxley, to call for a moratorium on science. Yet without the continuance of the work of the scientist millions of people would suffer or die of hunger and disease in a few years. Further, our consciousness of knowledge already gained cannot be erased, nor could man’s desire for knowledge be thwarted except at the price of his freedom. Any instrument of good can be misused. A total pessimist about human nature might avoid scientific activity; a total optimist might have no cause for concern about its consequences. But a person who sees in man potentialities for both good and evil will combine scientific work with the attempt to exert his influence toward its beneficial use.
B. The Social Responsibility of the Scientist
To what extent is the scientist responsible for the uses to which his work is put? The problem arises not only in the dramatic case of atomic energy but in thousands of lesser cases in which an application has significant consequences for society. P. W. Bridgman, Harvard’s Nobel-prize physicist, has argued that in most occupations individual blame does not extend to all results of an act: the miner of iron ore is not expected to think about all the uses of iron.
For if I personally had to see to it that only beneficent uses were made of my discoveries, I should have to spend my life oscillating between some kind of a forecasting bureau, to find what might be the uses made of my discoveries, and lobbying in Washington to procure the passage of special legislation to control the uses. In neither of these activities do I have any competence.8
He sees the concern of the atomic scientists as misguided, “a youthful philosophy, enthusiastic, idealistic, and colored by eagerness for self-sacrifice.” “If anybody should feel guilty, it’s God, who put the facts there.” He also feels that it is unfair for the public to “impose” responsibility on scientists, which is to “exact disproportionate service from one group because of their special ability.” Other writers have said that the scientist must be free to search for the truth without having to consider the consequences. The outcomes of research, they argue, are unpredictable; moreover, it is the job of society, not of the individual, to make such decisions. I. I. Rabi adds: “Scientists who dabble in politics usually make fools out of themselves.”
This commonly held position seems to the author to be inadequate in the case of applied science, where effects can usually be foreseen to some extent. Even in pure research a person often has a general idea of the sort of results that are likely to follow, and the motives of the project’s sponsor indicate the type of outcome for which he hopes. Furthermore, those who say they disavow all accountability except to the pursuit of truth would probably draw the line at some point. Who would condone Nazi experimentation on human guinea pigs, even though the data added to knowledge? Or again, would a biochemist, asked by a crime syndicate to develop a poison which would be undetectable at autopsy, be able to ignore the consequences of such a discovery? To most scientists, research on germ warfare is abhorrent. The question, then, is not whether to draw a line, but at what point to draw it.
The attempt to delegate all responsibility to society must also be scrutinized. The Nuremberg trials did not exonerate German scientists of individual answerability for their work, even though they were acting on orders. Even in a democracy, it is a dubious interpretation which says that the individual must always conform to the opinion of the majority. The essence of democracy is not rule by majority but rather government by discussion, including the right of the minority to be heard. To be a free man means to take responsibility for one’s own actions. For the Christian in particular, absolute allegiance to nation or group can never come before obedience to God as he understands it. In an age of conformity the significance of individual conscience must not be lost, nor the integrity of the man who says with Luther: “Here I stand; I can do no other.”
A further objection may be raised concerning Bridgman’s view of the nature of responsibility. Moral responsibility is not primarily a burden “imposed” on the individual by society, but an opportunity for constructive action, voluntarily acknowledged. The word responsibility comes from the verb “to respond” and represents a person’s response to his total situation. The main question is not whether society blames me for what I do, but whether I can find ways of making a maximum positive contribution, or do anything to prevent destructive results. Eugene Rabinowitch comments on Bridgman’s article: “Does he expect to find satisfaction, as he contemplates the radioactive ruins of Harvard Yard, in the thought that he, at least, had resisted all attempts to saddle him with responsibility that was not his?” The opportunity of the scientist may be greater than that of the average citizen with respect to some issues because his technical knowledge gives him greater understanding and appreciation of them, and because he has considerable influence in contemporary life. And if there are some scientific geniuses who can give the greatest service to society by giving attention only to their work, they should do even this as an expression of, not an escape from, responsibility. Actually Bridgman himself has been actively involved as a citizen in a number of public issues; these criticisms are directed not at his own practice, but at the common viewpoint, of which he has been a forceful spokesman, disavowing concern for the uses of a scientist’s work.
Many scientists are simply indifferent to the social implications of their job. For every one who consciously disclaims accountability for his work, there are dozens who do so by default because of:
1. Absorption in technical work. Concern for human welfare will play little or no role for the man whose only motivations are curiosity and technical interest. Research is a time-consuming process and can easily absorb a man’s entire attention. Some men were attracted to the H-bomb project only because it was exciting and they were eager to take part in a brilliant scientific achievement. The British major in The Bridge on the River Kwai9 was motivated by professional pride in “doing a good job” without asking about the purposes it served. He built a superb bridge and kept the morale of his men on a high level, but realized too late that his technical success had aided his country’s enemy.
2. Bureaucratic organization. Laboratories in both industry and government are components of power-hierarchies in which most individuals have little control over policy. Such a situation discourages responsible participation in decision-making. Specialization often isolates one part of a project from other parts of the same operation. When men work in teams, no individual feels responsible for decisions; even the project director is carrying out directives. A scientist easily becomes what Whyte calls “the Organization Man,” accommodating to the expectations of others concerning his work role. It is easier to “fit in” and “go along,” in unquestioning loyalty to the group, than to raise questions about what the organization is doing.
3. Caution about value-judgments. Some scientists try to avoid social or moral questions. They feel that their training has encouraged them to be detached and impersonal, wary of individual preferences. But the avoidance of personal commitment may only encourage the exploitation of one’s talents by someone else — politician, industrialist, or the state. And decision can not be so easily avoided: silence may mean consent to the status quo, and not to act is often itself a decision. More often the scientist, because of specialized training and a busy schedule, is simply ignorant of the wider issues on which his work touches.
4. Faith in automatic progress. The spectacular development of early science gave rise to the eighteenth-century optimism that increase of knowledge would lead inevitably to happiness and virtue. In the nineteenth century, evolution was often interpreted as a guarantee of universal advance. Some scientists continue the assumption of the rationality and goodness of man; with such confidence in progress one does not need to be concerned about the uses that will be made of one’s work, since it is assumed that the end result of all discovery is beneficial. But every new advance brings its new problems and temptations, as well as benefits. Technical development is cumulative, but moral progress is more precarious, as the efficiency of the Buchenwald extermination camps reminds us.
In contrast to those who either consciously disavow or unconsciously neglect the implications of their work, some authors insist that every individual is answerable for the consequences of his research. Norbert Wiener, Massachusetts Institute of Technology mathematician, withheld some of his results and said: “I do not expect to publish any future work of mine which may do damage in the hands of irresponsible militarists.”10 The Society for Social Responsibility in Science (SSRS) has as its purpose “to foster throughout the world a functioning cooperative tradition of personal moral responsibility for the consequences for humanity of professional activity, with emphasis on constructive alternatives to militarism.” Among its members have been Nobel laureates Einstein, Pauli, Born, and Yukawa. Several recent books, e.g., Jungk’s Brighter Than a Thousand Suns, take this viewpoint in blaming physicists for co-operating in work on the atomic bomb.
Even with respect to applied science, this position perhaps tends to overemphasize individual action, which the position discussed earlier underemphasizes. Since any knowledge may be misused, all technical work involves the acceptance of an element of risk. This risk must be taken if our concern is the maximum probable human benefit rather than the certainty of keeping our own consciences spotless. Perhaps it should be said that the scientist is partially responsible for the probable uses of his work, though not for all its conceivable ramifications. The SSRS position also seems to put disproportionate stress on individual witness, whereas Bridgman’s viewpoint went too far in the opposite direction of relegating decisions to others. Since many of the crucial decisions today are made by groups, attention must be given to the scientist’s participation in the broader processes of public discussion. To such discussion he can contribute both his technical knowledge and his moral convictions, without claiming unwarranted authority in fields in which he is not an expert. This middle position has been taken by the Bulletin of the Atomic Scientists and the Federation of Atomic Scientists, who have operated effectively through channels of communication, education, and political action. Bulletin editorials have criticized the tendency of scientists to become “morally irresponsible stooges in a science factory”; and yet they have recognized that in pure research and even in some applied fields it is impossible to predict all uses of new discoveries, much less what their wider effects will be.
If the scientist thus has some responsibility for the outcomes of his work, he will inevitably be involved in moral decisions. What light does religious faith shed on such choices? Christian ethics stresses several criteria: (a) The centrality of love. By parable, teaching, and example, the New Testament speaks repeatedly of forgiveness and compassion, which require sensitivity, concern, and willingness to act to meet human needs. In terms of the situation between persons, love means reconciliation and the restoration of community. (b) The value of the individual. Because each person is of value in the sight of God, human personality is sacred, an end and never a means. Response to the individual in need is an expression of worship; it is service to God as well as to man. The goal of action is to give each person the best possible chance to live as God means him to live. (c) Justice as the social expression of love. Justice is not opposed to love, but is precisely the form which love must take toward groups. Since it is impossible to express love personally toward large numbers of people, it must be embodied in institutional structures which make possible the fulfillment of persons.
We need to go a step further to consider the basis and motivation of such attitudes for the Christian. Concern for others is part of a person’s response of gratitude to God. “We love, because he first loved us” (1 John 4:19). The nature of God is thus the basis of ethics: “God is love, and he who abides in love abides in God” (vs. 16). The experience of God’s acceptance can free us from the anxieties and insecurities which make us self-defensive; it can enable us to forget about ourselves for a while. This is an ethic of liberty rather than law. What Christ brought to man was not a new code-book of detailed regulations, but a new orientation and attitude from which new modes of action flow.
According to this understanding, the scientist should not expect to find in Christianity a detailed code telling him what to do. He must decide for himself, in the light of all the information he can obtain about the scientific aspects of a concrete situation, plus his understanding of the nature of God and man, and of love and justice. He decides as a whole man, and not first as a technician and then as a Christian. Thus “responsibility” means “response” — to the total situation, which includes God, man, and technical data.
One additional factor will influence this response, namely, his attitude toward involvement in the evil of the world. The ethical perfectionist will try to avoid any compromise, for he understands Christian ethics to consist of a set of absolute injunctions. He may resign from his laboratory rather than take part in work which might have harmful consequences. The perspective we have been outlining, however, holds that even though man must strive against corporate evil, he is inescapably implicated in it — for example, in injustices and exploitations by his nation, industry, or group. Practical choices are often ethically ambiguous rather than black and white. A person could try to keep “pure and unspotted” only by withdrawing to a hermitage — though to some people withdrawal to the laboratory gives an illusion of non-involvement. We must aim, then, for whatever social gains can be achieved — even at the price of limited compromise, and always with the risk that some evil may result along with the expected good. The goal is for science as an instrument of love and justice to make the maximum contribution to human welfare.
C. Moral Decisions on the Job
The vocation to serve human need thus requires moral decisions in scientific work. The approach outlined above does not permit the prescription of any simple “Christian answer” to ethical dilemmas. But examples can be given of concrete situations which scientists have faced, and of some of the factors which might be significant from a Christian perspective. The first major choice is the decision among the jobs for which one is qualified. Even though purely accidental factors, such as special preparation or chance openings, may limit the possibilities, there is always some choice of employer and type of work when it comes to selecting a position. Once on a job, many men feel “trapped” in that situation, but there is actually considerable job mobility in science.
Recently a chemist working for a food company turned down an offer from a liquor corporation at twice his current salary because he said that he wanted “to be able at retirement to look back on a useful life.” An electrical engineer accepted a job working on a rural electrification project in India, rather than a secure position in an American company. Compare the words of a young physics Ph.D. who had just accepted a job at Los Alamos, the H-bomb center: “I don’t believe that what the world needs most is bigger and better bombs. But the job pays well and there is a fine new housing development.” Was he not in effect selling his life to the highest bidder? Another man went to Los Alamos because of technical scientific interest in what he thought of as exciting pure research with excellent equipment. Since all work there was at the time “classified,” was he not closing his eyes to the main purpose of the project? Note that we are not criticizing those who participated in this project with sincere conviction of its social value, but only those who violated their own personal integrity or ignored the implications of their research. (The problem of national policy and nuclear warfare is discussed later.)
Choice of employment, particularly rejection of work on purely military projects, has been the chief concern of the Society for Social Responsibility in Science. Its dominantly pacifist position forces on it the character of a protest movement, encouraging individual action in withdrawing from “destructive work”; but it has given some attention to constructive alternatives, such as research in agriculture and small industries or technical assistance openings in underdeveloped countries. A number of outstanding scientists, including the Russian physicist Kapitsa who was for years kept under house arrest by Stalin, have refused to work on anything connected with atomic weapons. A source of current concern is research on germ warfare, which has been the subject of a recent public-relations campaign by the Army. The International Microbiological Congress has passed an unequivocal resolution condemning preparations for bacteriological warfare as unethical. In the light of man’s unrelenting attack on disease, some biologists believe that development of deadly germs is a betrayal of the human race as well as of the ideals of science; one-hundredth of an ounce of botulism toxin could kill a million people, and its production seems to further neither scientific knowledge nor any peacetime applications. Whether one approves or disapproves of such research, its implications for mankind cannot be ignored. In choosing employment the Christian must examine the purposes toward which his labors contribute.
Once a job has been selected, other areas of decision must be faced in the course of work. In applied science it is frequently necessary to consider human as well as technical factors — for example, in locating a bridge or highway, or in recommending agricultural methods. Another common problem is the conflict between obligations to employer and to society. Some scientists work for industries which exploit and waste natural resources with little concern for the public. Another situation of tension between the welfare of the consumer and the profits of the employer is the restraint of improvements in a product. There have been many cases in the courts (e.g., improvements in telephones, tires, fluorescent lamps, flashlight bulbs) in which changes that would have greatly lengthened the life or quality of an article were withheld to promote replacement sales.11 Often patents have been taken out on superior inventions but not used, so that the improvements were completely suppressed and an outmoded product continued. In many European countries, by contrast, a company that does not itself make use of a patent for three years must negotiate license agreements with other companies wishing to use it. Scientists increasingly have a role in policy decisions concerning such questions in both industry and government.
The scientist today is offered various inducements to bestow his benediction on all sorts of products and enterprises, from new toothpastes to new ways of finding peace of mind. His participation in false or dubious claims contributes to the exploitation of the public. A geologist told the author that he was once instructed “to produce evidence,” where there was none, to support a lawsuit concerning the location of a railroad line. In another instance a man was told to “prove” the superiority of the company’s material for window frames, though another material was clearly better.
Peculiar temptations are also present in government contracts with industry, since the public which may be defrauded seems so remotely affected. One man working in an industrial laboratory, with half his salary paid by the government, was given his instructions beforehand: “Any discoveries you make which have any scientific or commercial value, and any work you do which is at all profitable, is to appear on the books as having been done on ‘company time’; the rest is the government’s half of your time. All reports to the government are to be vague and non-committal so as to disclose as little as possible.” Desire for prestige and credit may also jeopardize the public interest; one U.S. missile group kept secret from rival groups the transmitter frequency of a satellite about to be orbited.
Another variety of conflict of interests was illustrated in the “Astin affair,” in which the director of the National Bureau of Standards was fired because of reactions to an NBS report which found the AD-X2 battery additive “without merit.” Scientists all over America protested the dismissal and the pressures exerted against the laboratory’s objectivity. The Jeffries Committee evaluated all the evidence from various laboratories and vindicated the Bureau’s findings, and Astin was finally reinstated. Clifford Grobstein comments: “A bill of goods, based on too few facts too carelessly evaluated, was sold to a cabinet officer and a congressional committee, and between them they almost wrecked a major scientific laboratory.”
The scientist today often becomes the servant of business interests rather than of the pursuit of truth or the welfare of society. A man in industrial research summarized the pressure he felt thus: “It’s made clear very soon that neither science nor a good product is the goal; making bucks for the company is the big thing.” The Christian perspective on such situations is surely neither unquestioning co-operation nor perfectionistic condemnation. Legitimate economic interests and the realities of power structures can be acknowledged. Yet the person attempting to relate his activities to human welfare must also keep in mind the interests of the consumer and the public: in his own decisions, in his participation in policy decisions, and, if necessary, in protest against directives he feels to be harmful.
Because ethical choices affecting other people do arise frequently in science, a number of authors have called for an extension of the Hippocratic Oath which for centuries has been associated with medicine. The scientist, like the doctor, has power over the life and well-being of man, and so, it is argued, the public is particularly dependent on his code of ethics. There has been recent discussion of the value to science itself, as well as to society, of having a more clearly defined code of professional ethics. The ethics of authorship, the obligation to cite prior work, and problems of multiple authorship have been analyzed in an article in Science.13 Others have been concerned about the humane treatment of animals, which are crucial in experimental biology and pharmacology. A biologist reports seeing an extremely painful technique, which most groups use only with full anesthesia, applied by other groups to domestic animals in full consciousness, with no justification except laziness. He commends the British practice of issuing licenses to experimenters subject to specified conditions.14
The new field of “operations research,” in which scientists have been prominent, appears to escape moral decisions. Its objective is the scientific determination of the most efficient method of achieving a particular goal, e.g., increased output of a factory, maximum military damage in an attack, optimum man-power distribution for a turnpike tollgate staff. But value-judgments have already entered in selecting the goals, in the assessment of results, and in choosing the assumptions and criteria in terms of which the “best” procedure is calculated. In the absence of conscious decision these values are taken from cultural presuppositions and assumed to be “obviously desirable.”
Another channel of action for the scientist is participation in public decisions relating to his work. It must be granted that in many cases the results of research are unpredictable, and hence it is only through corporate processes that they can be controlled. In Chapter 5 some opportunities of working for a better society are considered, as well as the dangers of naïveté, oversimplification, and unwarranted extension of his authoritative role when a person speaks outside his field of technical competence. But the scientist does have a duty to inform the public and its leaders about his results and their implications, and to warn of their dangers. Because of the gap between the expert and the layman, the interpretation of discoveries to the public is essential for intelligent democratic decisions. This educational task might take place through interviews with the press, letters to the editor, petitions, talks to local groups, or semipopular writing.
The ethical evaluation of a decision can be made only within a context which includes social as well as technical aspects. The moral element is not an extraneous factor but consists in the relation of the scientific data to human welfare. An interesting example of the way a man’s role “as a person” includes his role “as a scientist” is provided by the famed Oppenheimer case. Withdrawal of his security clearance in 1954 involved two main accusations. The first charge was association with Communists in the thirties and subsequent indiscretions such as his attempt to protect his friend Chevalier. This charge was prominent in the Atomic Energy Commission’s statements and decision, but was given little weight by the Personnel Security Board (the Gray Board), which after extensive hearings found “no indication of disloyalty . . . and eloquent and convincing testimony to his deep devotion to his country” and “a high degree of discretion reflecting an unusual ability to keep to himself vital secrets.”15 The second charge, which was the major one in the Gray Board’s report, was that in 1949 he had “failed to display the requisite enthusiasm about building the hydrogen bomb.” After the decision was made by the President, he co-operated completely in its execution; but it was held against him that before the decision was made he opposed the development of the H-bomb on both technical and moral grounds.
The report did not distinguish clearly at this point between disloyalty to the nation and honest dissent from prevailing opinion as to the best course of action. Oppenheimer felt in 1949, along with many scientists — including all but one of the AEC General Advisory Committee — that it was a dubious gamble to divert major resources into investigation of thermonuclear reactions which might never work. In the hearings Oppenheimer was criticized for “more conservatism than the Air Force would have liked” in being concerned about defensive weapons, and for “interest in the internationalizing of atomic energy” (although the President had appointed him as scientific adviser to Baruch in the United Nations Commission on international controls). It is not surprising that there were a number of people who wished to silence a dissenting voice, and who wanted to claim that scientific and military questions can be kept separate from political and moral ones.
A government agency of course has the right to select its advisers. Oppenheimer’s appointment to the AEC Advisory Committee was due to expire the following month and could simply have been allowed to lapse. Instead, disagreement about a particular policy became the main ground for pronouncing him a “security risk” in the Gray Board report. Commissioner Murray filed a separate statement in the AEC decision, which reads in part:
Even though Dr. Oppenheimer is not an expert on morality, he was quite right in advancing moral reasons for his attitude to the hydrogen bomb program. The scientist is a man before he is a technician. Like every man, he ought to be alert to the moral issues that arise in the course of his work. This alertness is part of his general human civic responsibilities which go beyond his responsibilities as a scientist. When he has moral doubts, he has a right to voice them. Furthermore, it must be firmly maintained as a principle both of justice and of religious freedom that opposition to governmental policies, based on sincerely held moral opinions, need not make a man a security risk.17
The case, though admittedly a very complex one, caused widespread concern among scientists. Theodore White summarizes their reaction:
The issue, despite the attendant legalism, was whether within the councils of national debate a scientist should be allowed to express an opinion beyond the technique of invention and gadgetry. It was not, as they see it, whether Oppenheimer was right or wrong; but whether in the search for policy a scientist could permit himself the indispensable luxury of offering advice and opinion without exposure to retaliation and charge of crime if the decision went otherwise.18
The Oppenheimer case and the earlier example of the decision to use the A-bomb illustrate the intertwining of scientific, political, and ethical questions.
An individual makes moral decisions as a total person, taking all relevant factors into account. Although in general the applied scientist’s work clearly contributes to human welfare, there are many concrete choices which he faces in the course of his job. Obviously a non-Christian may have a sense of social responsibility; and for the Christian there is no “easy answer” which can be prescribed for such decisions. But religious faith can increase a person’s sensitivity to the ethical dimensions of alternative actions and the implications of his work. It can keep before him every man’s vocation to serve human need creatively through his job. And the scientist who sees his life in relation to God and man may perhaps have the courage and the integrity to act in accordance with his convictions.
1. D. L. Cohn, “Great Turning Point,” Saturday Review, May 16, 1953, p. 10. Used by permission.
2. K. F. Mather, “The Natural Sciences and the Christian Faith,” The Christian Scholar, June, 1953, p. 123. Used by permission.
3. W. G. Pollard, “The Place of Science in Religion,” ibid., p. 110.
4. “The Franck Report,” published in Minutes to Midnight (Bulletin of the Atomic Scientists, 1950). Used by permission.
5. L. Morton, “The Decision to Use the Atomic Bomb,” Foreign Affairs, January, 1957, p. 344. Used by permission.
6. M. Amrine, The Great Decision (Putnam, 1959), p. 241. Copyright 1959 by Michael Amrine, and used by permission.
7. J. R. Oppenheimer, “Physics in the Contemporary World,” The Technology Review, February, 1948, edited at the Massachusetts Institute of Technology, pp. 202-203. Used by permission.
8. P. W. Bridgman, “Scientists and Social Responsibility,” Scientific Monthly, 1947, p. 150. Used by permission.
9. E. Milner, “Why Social Irresponsibility,” SSRS Newsletter, October, 1958, p. 3.
10. N. Wiener, “A Scientist Rebels,” Atlantic Monthly, January, 1947, p. 46.
11. F. L. Vaughan, The U.S. Patent System (Univ. of Oklahoma Press, 1956), chap. 8; E. H. Sutherland, White Collar Crime (Dryden Press, 1949).
12. C. Grobstein, “Washington Listening Post,” Bull. At. Sci., January, 1954, p. 27. See also issue of May, 1953.
13. W. Pigman and E. Carmichael, “An Ethical Code for Scientists,” Science, June 16, 1950, p. 643.
14. C. W. Hume, “Ethics of Experiments on Animals,” Nature, February 10, 1951, p. 213.
15. The Gray Board Report, Bull. At. Sci., June, 1954, pp. 248-249.
16. The Oppenheimer Transcript (U.S. Government Printing Office, 1954). See also C. P. Curtis, The Oppenheimer Case (Simon & Schuster, 1955), and Bull. At. Sci., May, June, and September, 1954.
17. “Concurring Opinion of Thomas E. Murray,” Bull. At. Sci., September, 1954, p. 277.
18. T. White, “U.S. Science: the Troubled Quest,” The Reporter, September 23, 1954, p. 26. Used by permission.