Making Choices About the Final Exit

A book about how to commit suicide has vaulted to the top of the best-seller lists. New York Times columnist Anna Quindlen admits that she picked up Final Exit out of curiosity, but kept it for another reason. The day may come when she will want to know how to die with a minimum of pain and anguish. And if that day does come, "whose business is it, really, but my own and that of those I love?" Derek Humphry’s little volume went unnoticed until it was highlighted in the Wall Street Journal. Then media coverage was immediate and widespread, pushing the book to the top of the Times best-seller list.

Most commentators make the usual demurrals, reminding us that the choice to die should be made in discussions with loved ones and professional counselors. And they point out that teenagers and adults despondent over temporary—or even permanent—burdens are not the book’s intended audience. Only the terminally ill who face prolonged and painful suffering should be encouraged to prepare for the time when, as Quindlen says, "I may feel so bereft of strength, purpose, stamina and the will to live that I may want to know what constitutes a lethal dose of Seconal."

The issue here is clearly one of controlling how and when one dies—the understandable longing of the human spirit to name the time and place for a final exit. In our secular culture this seems an entirely reasonable desire, one which deserves fulfillment. But the desire to take one’s own life is the epitome of modern individualism. If one thinks ultimate reality is located no higher than human personality, what one does with one’s life is one’s own affair. Betty Rollin, who wrote an introduction to Final Exit, is a television journalist who assisted in the suicide of her mother, who was terminally ill from ovarian cancer. Rollin argues that "some people want to eke out every second of life—no matter how grim—and that is their right." But others, she insists, do not, and "that should be their right."

But is it? When Quindlen maintains that her death is her business and that of "those I love," she does not consider the significance of suicide on the wider circles of life that surround her. John Donne’s reminder that none of us is an island speaks to the point: the death of each individual has a ripple effect in the present and into the future.

If, as modernity dictates, the individual is supreme, then our responsibility is only to ourselves, since there is no God who gave us life or who awaits us in death. But if we believe that life derives from a loving Creator, then suicide must be considered within a larger context. In a nonreligious culture, Final Exit assures people that, in the face of death, individual choice is all that matters. Only someone who accepts individualism as the highest good would be so confident that there is an obvious qualitative difference between the "freely chosen" decision to die made by a person facing a terminal illness and a decision made by a physically healthy but mentally tormented individual.

In considering the "right to die," it is important to distinguish between the comatose patient being kept alive by mechanical means and the person still capable of making decisions. When consciousness disappears permanently, a decision to die becomes the responsibility of others, who may reach the judgment that for all practical purposes life for an individual has concluded and that therefore artificial supports need not be maintained.

Richard Lamm, the former governor of Colorado who has campaigned against excessive medical costs, recently cited the case of a patient in a Washington, D.C., hospital who has been in a comatose state since Lamm was a high school student. The patient has survived entirely through artificial means in a condition which benefits neither that person nor society. In this case, the larger community has not acted in the best interest of either the individual or the community. Fear of political and legal retribution from "right to life" activists has forced the medical community to preserve the person’s life. That decision reflects a narrow definition of "life" held by a small but politically strong group of activists.

An individual does have the "right to die" when individual choice has disappeared and the decision on life or death has fallen to the community (primarily the family). That is why it is so important to instruct one’s family in advance not to employ excessive means to sustain life when there is no prospect of recovering consciousness.

But what about a conscious decision to commit suicide? Though an individual may rationalize that his or her death would be to everyone’s advantage, suicide leaves a void in a network of close relationships. Its impact does not stop with "those I love." Friends, former teachers, colleagues, distant family relations, even casual acquaintances are all affected by suicide. The web of life, as Joseph Sittler so aptly put it, is like a spider web: touch any part, and the entire web shimmers.

Despite Humphry’s caveats and warnings, his book is irresponsible. There is, admittedly, a difference between the elderly terminal patient in horrible pain who wants all pain to cease and the despondent teenager whose pain is one of low self-esteem. But the difference is finally one of degree. The terminally ill person, out of personal suffering and a concern for the impact a lingering illness has on family and the immediate human circle, may turn to suicide. But the emotionally distraught teenager or adult may reach the same conclusion: my pain is too great, and my presence is detrimental to those around me. We are free to make that decision before life involuntarily leaves us, but it is ultimately a selfish choice. It is not surprising that our culture, which regards individual choice as inviolable, would find so much merit in a book like Final Exit.

Speaking of Religion

Biblical scholars have long debated the historical reliability of Matthew, Mark, Luke and John. Though there was not much new to report on the topic, the three major newsmagazines decided anyway to feature Jesus on their Easter week covers and report on recent arguments about whether Jesus actually said what the New Testament says he said. Not surprisingly, the peg for the stories was the Jesus Seminar, which has been attracting media attention for some time. With their cover stories, Time, Newsweek and U.S. News & World Report were able simultaneously to acknowledge the belief of millions of Christians around the world while providing the "news" that the basis of Christian belief is something even Christian scholars disagree about.

No matter what millions of worshipers may celebrate on Easter Sunday, the resurrection of Jesus is still, in media parlance, an "alleged" event. The conflict between religion and scientific rationality is a phenomenon tailor-made for the news media, and it’s the kind of conflict they are used to covering.

Media coverage of religion is not biased against religious faith; it is biased in favor of Enlightenment rationality. Our culture’s embrace of scientific rationality as the ultimate measure of all reality has pushed religious faith over into a corner of irrelevancy. Even religion’s most informed advocates are reluctant to speak of their faith in public settings for fear of rejection by their intellectual peers.

On a special Easter Sunday edition of Meet the Press, several prominent politicians were asked how they could justify being religious and political at the same time. Even former New York Governor Mario Cuomo, one of the most articulate Christian political leaders, seemed uncomfortable as he fielded questions from moderator Tim Russert that hinted there is something nefarious about religious groups receiving government funds for programs that serve the public. Russert repeatedly pressed his concern that public funds in church hands might expose recipients to the danger of conversion. Horrors. Russert should watch Guys and Dolls. A little preaching and a little soup rarely hurts, and it sometimes helps.

Before we leave the Sunday morning talk shows, we might ask why the networks present the programs geared for the more thoughtful segment of the audience during the traditional hours for Sunday worship. Do they assume the intellectual community is staying at home on Sunday morning? The ads for the Sunday morning programs make it clear the corporate sponsors believe that their image-building campaigns are reaching the "thoughtful" community which ponders serious matters on Sunday morning rather than spending time on less important matters, like worship.

Some years ago when President Jimmy Carter was traveling in South Korea he held a lengthy conversation with the South Korean president on the subject of religion. Carter spoke of his Baptist faith, and the South Korean president, nominally a Buddhist, listened with interest. When the news of this discussion leaked out, the New York Times, that staunch defender of secularity, chastised Carter for attempting to "proselytize" the South Korean. The Times editorial implied that there is something wicked about holding a conversation on faith, since it might lead to conversion.

The national opinion-shapers don’t dislike religion. Rather, they’re programmed by a common cultural wisdom that for two centuries has celebrated the intellect over the heart. That conventional wisdom respects religion, in its place, but it does not trust religious commitment as the basis for national thought or as a perspective underlying public discourse. Indeed, to gain intellectual respectability, it is best to avoid discussing religion, especially if that discussion involves what we Methodists refer to as a "witness." One can see the same mind-set at any gathering of the American Academy of Religion, the group of academics who teach religion in public and private colleges and universities. The greatest fear you sense in the corridors, apart from the fear of not landing a job, is that a professor might be suspected of harboring a genuine religious commitment in the midst of all that intellectual conversation.

Or consider the comments Tim Robbins made to an interviewer at the Berlin Film Festival about a film he directed, Dead Man Walking: "I believe in . . . er . . . that there are . . . er . . . that there are people who are on earth who live highly enlightened lives and who achieve a certain level of spirituality, in connection with a force of goodness. And because these people have walked the earth, I believe that these people have created God." Though his film is an eloquent testimony to the power of a nun’s religious faith, he himself is hesitant to speak about religion as anything more than an offshoot of secular morality. I find more truth in his film than in his testimony to a secularized spirituality.

Or consider the report by Caryn James in the New York Times on the recent Sundance Film Festival, in which she describes the film Care of the Spitfire Grill as "a manipulatively heartwarming story about a young woman just out of prison who finds spiritual redemption." James records that the movie "won the feature film audience award and was sold to Castle Rock Entertainment for $10 million." She goes on: "No one seemed to notice that it was financed by a conservative Mississippi company affiliated with the Roman Catholic Church and founded, as its ‘mission statement’ puts it, to ‘present the values of the Judeo Christian tradition.’ The new company, called Gregory Productions, put up the $6 million for the film, set in a town called Gilead. Gregory is an offshoot of the nonprofit Sacred Heart League, which publishes inspirational literature."

James comments that the film "resembles an ‘after school special’ about forgiveness. But watching it with the Sacred Heart League in mind makes all the biblical imagery seem slightly sinister. When Marcia Gay Harden takes the heroine to meditate in a deserted church, it’s hard to forget where the movie’s money comes from. The director, Lee David Zlotoff, is Jewish and, he says, extremely religious. But the movie’s multidenominational roots—Catholic backers, Protestant characters and a Jewish director—don’t diminish the eerie sense that viewers are being proselytized without their knowledge."

Proselytized? Of the more than 600 movies released in this country each year, a substantial number of them are "guilty" of proselytizing on behalf of a worldview that celebrates greed, trivializes violence and winks at sexual activity among people of all ages at all times. Yet a movie that impressed a secular audience is found guilty of proselytizing because it has a clear religious perspective and origin. Such is the bias among our cultural leaders against religious faith as a basis for rational discourse.

A Soulful Afternoon in the Library

I was browsing through the recent-arrivals shelf at the public library when one of our librarians stopped to tell me how much she liked a film she had just seen on video. A film critic always hates to admit that he hasn’t seen a film, so I was relieved when the movie turned out to be Cinema Paradiso. Yes, I had seen it and yes, it was a fine work of art. It is much more than that, she said. "Anyone who doesn’t like that picture doesn’t have a soul."

A busy library is not the place to begin a discussion on the nature of the soul and how and whether one can lose one’s soul. But as I resumed my browsing I pondered the matter and decided she was right. Viewers who find Cinema Paradiso just another subtitled yarn about village life in wartime Italy are dangerously close to functioning without a soul. If we define the soul as that dimension of our existence which is connected to the ultimate in the universe, then I am prepared to agree that one needs an active, alert soul to resonate with a movie like Cinema Paradiso. This picture invites us to connect with the depths of reality, to mesh with the ultimate.

To engage in an intellectually responsible discussion on this topic in our time we must first define the soul, prove its existence and then explain how it can disappear. This battery of questions cows us into polite submission, for we have been educated to believe that truth must be measurable. We must assemble data according to scientific methods and offer a conclusion that will stand until someone else comes along with newer, more valid data. We are victims and practitioners of the modern mind-set that relies on what Italian philosopher Gianni Vattimo describes as "the reduction of everything to exchange-value" (The End of Modernity).

One way to counter this model of reality is to speak anecdotally—an approach scholars tend to find objectionable. How often we are told: "The evidence you are giving is purely anecdotal, a collection of random stories that can’t stand up to the harsh demands for rational validation." What are the central questions that matter, according to this view? Not how should I live, but will it succeed? Will it work? What is the bottom line? The more data we accumulate to prove that something works, the more validity it possesses. Anecdotal evidence is not sufficient for serious conversation.

Consider, as just one example, the man whose eyesight was restored by Jesus. When questioned by his friends as to how such a miracle could have happened, the man replies, anecdotally, "All I know is that I once was blind, and now I can see." Nice, but not enough to verify the healer as a reliable practitioner.

French philosopher Jean-François Lyotard maintains that one of the deadening contributions of modernity is to limit knowledge to "quantities of information" that are "translatable into computer language." In a section on Lyotard in An Introductory Guide to Post-Structuralism and Postmodernism, Madan Sarup points out how Lyotard explicitly contrasts scientific language, the language of verification and falsification, with narrative or story, "which certifies itself without having recourse to argumentation and proof." Narratives are "fables, myths, legends," which scientists regard as part of a "different mentality: savage, primitive, underdeveloped."

This embrace of measured reality as the only valid reality is at the opposite end of the spectrum from an equally irresponsible dependence on emotion alone. We would not survive as an organized society if we all relied only on John Wesley’s dictum: "If your heart is right, give me your hand." A more balanced approach is essential. (Wesley balanced it with stern disciplines of thought and action.) But the soul shrivels when it is not open to experiential connections with the ultimate. Our obsession with measured reality ill prepares us for God’s strange, undeserved and unexpected gifts of grace.

Describing a period in his career when his health was bad and his spirits low, filmmaker Ingmar Bergman confessed to a friend, "I’m about to lose my joy. I can feel it physically. It’s running out. I’m just drying up, inside." Bergman recalled how Johann Sebastian Bach discovered that his wife and two of their children had died while he was away on a trip. In his diary Bach wrote, "Dear Lord, may my joy not leave me." In his autobiography Bergman wrote: "All through my conscious life, I . . . lived with what Bach calls his joy. It . . . carried me through crisis and misery and functioned as faithfully as my heart, sometimes overwhelming and difficult to handle, but never antagonistic or destructive. Bach called this state his joy, a joy in God."

To lose one’s joy is to lose one’s soul. Our existence is too crowded with burdensome tasks and unexpected setbacks for us to assume that alone we can overcome what confronts us. Bach’s experience of joy continues with us through his music. Bergman’s constant struggle between the experience of grace and despair is permanently available in his films. Both Bach and Bergman testify that holding on to joy is no easy assignment. But without joy, the ability to connect with the ultimate, we are left with only the hollow certainty of measured reality.

Reverence and the Freedom to Revise

In an effort to make sense of the presidential campaign, I turned to journalist James Reston’s recent autobiography, Deadline, thinking that his insights into earlier campaigns might put the 1992 race into perspective. What I found was not horse-race data about the past but a definition of wisdom taken from Alfred North Whitehead. "It is the first step in wisdom," Whitehead wrote in Symbolism, "to recognize that the major advances in civilization are processes which all but wreck the society in which they occur. . . . The art of free society consists, first, in the maintenance of the symbolic code, and secondly, in fearlessness of revision. Those societies which cannot combine reverence for their symbols with freedom of revision must ultimately decay from anarchy or from slow atrophy."

A presidential campaign is, ideally, a time for reassessing where we are as a people—a quadrennial exercise in pulse-taking and planning. Our best leaders are those who challenge us to recall "the symbolic code" that brought us to this moment and that enables us to confront the "major advances" that threaten to undermine us.

Abraham Lincoln responded to perhaps our most traumatic social shift by eloquently calling upon our symbolic code. He did not trivialize the code for his own partisan purposes, but rather reminded people of their common bond. Lincoln possessed a religious sensibility appropriate to a pluralistic culture. He spoke a biblical and democratic language that could rally the nation to look beyond the anger and suffering brought on by civil war. In the midst of a war that had divided the nation, he called upon Americans to remember the moral center that formed their union. He appealed to what was best in their tradition.

In calling for a national day of prayer during the war, Lincoln wrote: "We have been the recipients of the choicest bounties of heaven. We have been preserved, these many years, in peace and prosperity. We have grown in numbers, wealth and power, as no other nation has grown. But we have forgotten God. We have forgotten the gracious hand which preserved us in peace, and multiplied and enriched and strengthened us; and we have vainly imagined . . . that all these blessings were produced by some superior wisdom and virtue of our own. Intoxicated with unbroken success, we have become too self-sufficient to feel the necessity of redeeming and preserving grace, too proud to pray to the God that made us! It behooves us, then, to humble ourselves before the offended Power, to confess our national sins, and to pray for clemency and forgiveness."

Today, more than a century after Lincoln articulated a moral center on behalf of national reconciliation, secularity has so infiltrated our leadership and our elite institutions that presidential candidates are reluctant to employ moral language and are uncertain about any symbolic code. When they do use such language, they tend to trivialize it. In the 1988 campaign President Bush invoked the American flag in a way that trivialized that symbol, and Michael Dukakis became an object of ridicule by donning a tank-driver’s helmet—an ineffective use of an empty symbol of patriotism. In both instances the use of the symbols was demeaning both to the user and to the audience, since they were so obviously calculated political gestures.

When Martin Luther King Jr. helped lead this nation through a nonviolent civil war, he drew upon the language of the black church and on biblical images of hope and reconciliation. This language has been part of our national identity since John Winthrop challenged the Puritans to create a new commonwealth, the creation of which demanded a responsibility to others and to God. Winthrop’s sermon was a favorite of Ronald Reagan’s, but in praising this nation as the "city upon a hill" Reagan conveniently ignored Winthrop’s reminder that with the gift of the new land came the burden of discipline and service and accountability to God.

Martin Luther King spoke in the Winthrop tradition when he called for an end to segregation in the 1960s. He denounced racial separation in language that appealed to our moral center, and thereby he sustained people even as he challenged them. King said, in effect, that we must change our behavior as a people, but that we can do so with the assurance that the change is consistent with God’s will and with our deepest commitments as a people.

In recent elections it has become obvious that presidential candidates are welcome to use religious language only in certain communities. Mainline churches, for the most part, have accepted the priority of secular language. They are anxious to keep religious matters separated from political issues.

Only in the black churches and in some white evangelical churches is it acceptable to link God’s purposes and demands to social issues as King did. (The Democratic candidates find their way to black churches, the Republicans to conservative white ones.) The "symbolic code" that Whitehead insisted was essential in coping with change is expressed openly only in those religious communities outside the mainstream that is dominated by the secular elite.

The 1990s will be a time of enormous shifts, processes which may "all but wreck the society in which they occur." The person chosen to lead this country into the next decade will have to be sensitive both to the demands of change and to the need to meet that change in the light of the rich symbolic code that has sustained the nation from its beginnings.

Remembering the ’50s

Pass it on. Dan Wakefield is one of us. His autobiography Returning: A Spiritual Journey gave it away, but secular critics, always nervous with anything that smacks of the "spiritual," prefer to think of Wakefield as either an evocative novelist with a special talent for describing the struggles of a midwestern kid who made it big in New York City and Hollywood or as a talented essayist who understands the ambiguities of modern culture. But Wakefield belongs in the same category as Bill Moyers, Garrison Keillor, Robert Coles, Garry Wills and novelists like Sue Miller, Reynolds Price and John Updike: they are authors who have gained an audience in the secular world without sacrificing their religious sensibilities.

Wakefield’s best-known novel is Starting Over, a fictionalized autobiography which was made into a successful motion picture with Burt Reynolds playing Wakefield. His other novels, which include Selling Out and Home Free, feature candid but gentle portrayals of young men caught between the harsh restrictions of conscience and the joy/pain of sex. Wakefield writes of this conflict with an honesty that earned him some initial criticism in his native Indianapolis. In his fiction Wakefield captures with pathos and humor the dark side of his search for meaning. More recently, in Returning, Wakefield confessed that after a decades-long struggle against the bland piety of his youth, he found that he was unable to find peace until he gave up all the artificial props—alcohol, drugs, psychiatry, aimless sex—and returned to what he had been running away from: a connection with God.

In his New York in the Fifties, a loving remembrance of a circle of political and literary radicals in rebellion against the blandness of the Eisenhower era, Wakefield is still writing his spiritual autobiography, as he confided recently. He seems to believe that the way to spiritual growth is through personal honesty. Thus students who take his increasingly popular seminar on spiritual renewal begin by writing about their own embarrassing moments.

In his earlier work, Wakefield was the quintessential Protestant writer, determined to confront his own experiences. He was handicapped, he thought, by the fact that he could never escape that hound of heaven that traveled with him on an overnight train from Indianapolis to New York City’s Grand Central Station back in 1952. In New York Wakefield absorbed the intellectual excitement of Columbia University by day and plunged into boozy collegiate discussions over numerous pitchers of beer by night. After graduation the nightly sessions continued in a Greenwich Village community that included James Baldwin, Joan Didion, John Gregory Dunne (Didion’s future husband), Jack Kerouac, Murray Kempton, Nat Hentoff, Gay Talese, Norman Podhoretz and Norman Mailer. Wakefield celebrates this community in New York in the Fifties, which began as an essay on Baldwin, at whose Village apartment Wakefield met jazz musicians, authors and intellectuals. He expanded the book to include interviews with other friends. When Wakefield returned to the Village to do research for the book, he went back to the White Horse Tavern, where Dylan Thomas had his last drink before he died at age 39 in nearby St. Vincent’s Hospital.

"There’s a plaque on the wall now indicating the table where the poet had his last drink, but in the old days it was one of an insider’s privileges to know, and reveal the sacred spot to newcomers," Wakefield writes. Cut off from any traditional sense of the sacred, the White Horse regulars turned to psychiatry for their faith and the tavern for their church. The analyst "became our priest, garbed in his vestments of three-piece dark flannel suit, and his orthodoxy became our religion. Whether one partook of it or not, this communion on the couch was part of a dialogue and texture of our time and place," Wakefield recalls, but adds: "I lay down on the couch of Freudian psychoanalysis in the fifties and rose up six years later in anger and disillusionment."

In his spiritual quest Wakefield encountered several mentors, people who infiltrated his life and whose presence was still with him when he finally decided it was time to give up drinking and "return" to himself. Wakefield identifies Dorothy Day, Norman Eddy and Mark Van Doren as three of those mentors. "What drew me to [Day’s] Catholic Worker movement, first as a journalist and then as a friend and sympathizer," he notes, was "a real mystique that called to young people of the fifties and drew them from all across the country, offering in the midst of the grim poverty of the Bowery something that all the glittering affluence around us lacked—a spirit, a purpose, a way of transcending self through service that those who came still vividly remember."

That same blend of spirituality and practicality was evident in Eddy, a minister "I was prepared not to like." Looking for stories to sell as a free-lancer, Wakefield had heard of the East Harlem Parish on East 100th Street, but he was uneasy about meeting the pastor, who he feared would be "some kind of long-faced missionary who’d warn me darkly of the wages of sin" and worse yet "would try to recruit me for Jesus." He found instead an "open, vital man with an easy laugh and a sense of the ridiculous as well as the divine, and I had to admire him in spite of my prejudice against preachers, especially Protestants, because he wasn’t preaching his message so much as living it."

The practical commitment of Day and Eddy helped Wakefield bridge the gap between his negative feeling toward his Indiana piety and his longing for spiritual meaning. But it was Van Doren who restored to Wakefield a critical dimension in his spiritual search. In a college course on narrative art, Van Doren introduced his students (or reintroduced in Wakefield’s case) to the New Testament. Wakefield recalls Van Doren’s startling insistence that Jesus "was the most ruthless of men," ruthless in "following his conception of truth and iron in his will." In the context of the popular religion of the ’50s, with its "feel good, be successful" motif, Van Doren’s version of Jesus provided the intellectually hungry Wakefield with a spiritual leader he could respect.

Armed with this new understanding, Wakefield wrote "Slick Paper Christianity" for the Nation, a devastating attack on popular Protestantism as exemplified by the Methodist family magazine, Together. He described the now-defunct journal, which had a circulation of 1 million, as something of a Rotarian periodical for a Christian club. (Together, which happened to be edited by a former editor of the Rotarian magazine, annually selected an all-Methodist all-star football team.)

Van Doren was also a mentor for Columbia graduate Thomas Merton, who speaks appreciatively of Van Doren in his celebrated autobiography The Seven Storey Mountain. In describing his conversion to the Catholic Church, Merton writes of "how easily and sweetly it had all been done with all the external graces that had been arranged along my path by the kind providence of God." External graces also seem to have guided young Dan Wakefield on his path from Indianapolis to New York City, to Mark Van Doren’s classroom, and to a remarkably creative community in New York in the ’50s. Pass it on.

The Religious Music Without the Words

If it is true, as G. K. Chesterton said, that the U.S. is a nation with the soul of a church, then why is it so difficult to use explicit religious references in public discourse? That question was at the center of a recent conference at which more than 200 people assembled under the auspices of the Center for the Study of Religion and American Culture to discuss "public religious discourse and America’s pluralistic society." It would appear, as conference speakers suggested, that this nation’s spiritual nature has been forced to hide under a secular shield, its traditional religious rhetoric muted to protect sensibilities in a pluralistic society. But history is difficult to ignore. Randall Balmer of Columbia University suggested that the most effective oratorical style in contemporary politics is strongly influenced by the evangelical Protestant tradition. The style and cadences of 18th-, 19th- and early 20th-century preaching, with its call for change and its insistence that the world is divided into good and evil, provide the "music" whenever effective political rhetoric is employed.

It is the evangelical preaching tradition, not the high-church tradition emphasizing sacrament and liturgy, that has shaped American communicative style, Balmer argued. This has left us with a political rhetoric that is simplistic, dualistic, populist, and charged with calls to repentance. But it is the style, not the content, that has survived from the nation’s initial Protestant worldview.

William F. Buckley once observed that anyone who mentions God more than once at a New York dinner party won’t receive another invitation. Specific words from the Christian or Jewish traditions are considered inappropriate in public forums, most participants at this conference acknowledged. Yet such religious "music" is at the core of our society; it is part of who we are as a people. With the arrival of pluralism and the need for tolerance, that music survives only in a denuded form of moral discourse that has little connection with its original source.

In one sense, the music lives in our civil religion—in our celebration of sacred days, sacred places and revered leaders. But the words to the music have lost their rootedness in the ultimate. Drew history professor Leigh Schmidt observed how Christmas celebrations have been secularized. Examining personal diaries, newspaper reports and advertisements from the late 19th and early 20th centuries, Schmidt noted the way Christmas was once celebrated in department stores like Wanamaker’s in Philadelphia, which featured hymn-singing and elaborate religious symbolism that turned the store’s cathedral-like central area into a commercialized version of a church. Today the emphasis is on non-Christian Christmas symbols. The baby Jesus has given way to Frosty the Snowman and Rudolph the Red-Nosed Reindeer.

Pluralism required this shift. Until early in this century Protestants controlled public discourse, leaving minority faiths and people of no faith without a voice. In recognition of these minorities and as an expression of tolerance and openness, the Protestant control has given way to a bland secular voice that offends no one but also fails to provide a religious worldview to help shape public discourse.

Still, as Balmer argues, political rhetoric and cultural style cannot escape their parentage. Consider the style of some of our presidential candidates, who have employed what Balmer describes as the major components of the evangelical preaching tradition: the appearance of spontaneity, a cadence that appears to be leading toward an altar call, the reduction of complex problems to a simple delineation of good versus evil. Ronald Reagan had the style. George Bush does not, though he can approximate it in 30-second sound bites (and did so with devastating effect against Michael Dukakis, one political figure who does not share the evangelical style). Bill Clinton and Jerry Brown both have the style, though in different ways. Clinton can be a preacher who connects with his audience (especially in black churches) through biblical passages. Brown can be the angry prophet denouncing the privileged, a frontier evangelist pointing the finger of condemnation at the sinful establishment.

This Protestant heritage has bequeathed to us, in short, a dualistic and populist style. But now the words are politically correct only if they reject all signs of the Protestant past. At its worst, that past reflected imperialism, patriarchy and exclusivity. At its best it formed the nation’s soul and moral core.

Later in its series of conferences the center will solicit papers on the role of the media in shaping the expression of religion in public discourse. But the media issue found its way into these discussions in a paper by Janet Fishburn. Another professor from Drew, Fishburn was a member of the task force that wrote the study on sexuality for the Presbyterian Church (U.S.A.), which was widely debated in the media and at the church’s General Assembly. Fishburn said that the task force had tried to use "expressive" language in dealing with sexuality—not sex, as it was invariably termed by the media—but that this language was converted by the secular media into a dualistic, simplistic and moralistic language that treated sexual relationships as if they were being entered on a police docket. As a result the committee’s attempt to be pastoral was perceived as an abandonment of traditional moral standards. The committee wanted to deal "expressively" with evolving forms of sexuality, but it did not find the language to do so. It also found little support for its effort within the church, which was not prepared to distinguish between normative moral statements and statements of pastoral care, especially when that nuance was ignored by secular reporters in search of a good story.

Fishburn acknowledges that the committee should have anticipated the outrage the report prompted not only in the church but also in the communications industry, which has assumed the role of guarding national mores, even as it profits from publicizing moral infractions. Indeed, judging by the reporting on the presidential campaign, it seems that in the absence of any substantive public debate on morality among religious leaders, media representatives have emerged as the new priesthood in our culture: they demand confessions of misconduct from public figures and then determine the seriousness of the sin and the degree of penance required for the sin to be forgiven.

Claremont philosophy professor John Roth suggested that one way of restoring moral and ethical dialogue to public discourse would be to speak in a poetic or lyrical mode—close to what Fishburn described as an expressive mode. But as Fishburn and her committee discovered, the secular media make it virtually impossible to use that mode of expression.

Still, the task must be undertaken. As Robert Sollard wrote recently in the Chronicle of Higher Education (as cited by Roth), "Much has been written about the loss of ethics, a sense of decency, moderation and fair play in American society. I would submit that much of the loss is a result of increasing ignorance, in circles of presumably educated people, of religious and spiritual worldviews. It is difficult to imagine, for example, how ethical issues can be intelligently appreciated and discussed or how wise and thoughtful decisions can be reached without either knowledge or reference to these religious or spiritual principles that underlie our legal system and moral codes."

An ethical system requires a living tradition for constant revision and sustenance; cut off from that source, the system loses its force. Which is why we must find a way to engage in public discourse that will reflect the "religious or spiritual principles that underlie our legal system and moral codes."

We are left with an increasingly frustrating dilemma: a nation with the soul of a church has lost its way, but its traditional manner of speaking of ethics and values is considered politically incorrect. We must search for a language with which to address this predicament, without giving undue preference to any segment of our pluralistic culture. It is a problem with no easy solution.

The Pictures Inside Our Heads

Richard D. Heffner recently examined the explosive response to Oliver Stone’s movie JFK and concluded that what really disturbs Stone’s critics is that he represents a new and powerful kind of historian, one who is "fully determined to have his own way with the pictures inside our heads." Stone’s "mind-boggling special effects, his rapid cuts and purposeful edits, his musical up-beats and down-beats, his endless flashbacks and flash-forwards, all play with our heads, mold our perceptions so much more effectively than the more linear media ever did."

What’s interesting about JFK, Heffner quotes Stone as saying, is that "it’s one of the fastest movies. . . . It’s like splinters to the brain. We have 2,500 cuts in there, I would imagine. We had 2,000 camera setups. We’re assaulting the senses . . . in a sort of new-wave technique. We admire MTV editing technique and we make no bones about using it. We want to . . . get to the subconscious . . . and certainly seduce the viewer into a new perception of . . . what occurred in Texas that day."

Heffner, himself a historian, finds JFK filled with "not-quite-provable speculations about the end of Camelot," but he is intrigued by the angry response to the film on the part of print media and television journalists. Heffner, who is also chairman of the motion picture industry’s film rating system, suggests that the anger of the "lords of print" derives from their awareness that Stone and his "fellow celluloid/video Pied Pipers will become our nation’s leading storytellers." The age of Gutenberg is experiencing its last gasp. We may dislike their vision, but these new storytellers "will set our national agenda, interpret our national future, just as the scribblers themselves had done until these last sputtering days of the 20th century."

The Gutenberg era has had an important connection with Protestantism. Under the slogan "sola Scriptura," Protestants translated the Bible into the vernacular, and with the help of the press distributed Bibles to an unprecedented number of people. With Gutenberg’s printing press the story of creation, fall and redemption was put into linear form, and Protestants have always been proud of their control over the biblical narrative of redemption. This was a notable advance, but also a major loss because it shifted the church’s focus from visual images to the printed page. Critics’ hostility toward Stone’s imagistic form of history has a parallel in Protestants’ hostility to visual art, and their corresponding insistence that print is superior to any other form of communication. Along with this preference goes a modern bias against artistic work, which is deemed irrelevant to the business of fact-gathering.

Heffner’s remarks about the last gasp of the Gutenberg age relate directly to the mainline churches’ dependence on print media to convey their message. The community that formalized the Bible expressed itself in signs and symbols. Its members pointed to a story with a beginning and an end, but this narrative is not limited by time or space. It is not just a linear narrative. It is a work of art, forged by a community of believers. It is a document that continually surprises and provokes. Its power lies not in the sequence of events it describes but in its power to transform our vision. It is a narrative that asks for commitment, not agreement.

The Bible is crammed with images that assault the senses, and their intention is certainly to "seduce the viewer into a new perception" of reality. That seductive process began in lonely encounters between God and Moses on a mountain, God and Daniel in the lions’ den, and God and Jesus in a garden. A faithful community sanctified the stories of these encounters and declared them valid expressions of God’s reality.

This notion of a transcendent source to the narrative is especially difficult to grasp in our era. We prefer to live by Protagoras’s dictum that "man is the measure of all things." And in order for Protagoras’s "man" to maintain this fiction of self-sufficiency, he tries to "measure" all things, especially through stories that move in a linear sequence.

But the Christian knows that stories are not bound by their linear shape, that the meaning of stories goes beyond the facts they portray. Art and faith converge in a protest against the elevation of linearity as the final word about reality. The Christian knows that the dichotomy between "truth" as a linear narrative and "truth" as shaped by images and the "pictures inside our heads" must be bridged—and that it is bridged in the faith that God creates and redeems reality and that God is the source of all that we are and will be.

Lutheran theologian Joseph Sittler respected the power of linear narrative, but he was constantly reminding us that to receive the full majesty of the biblical story we must accept it as an uncontrollable and unpredictable work of art. In one essay he pointed out how often the Gospels speak of unexpected "bestowals of grace," as in such phrases as "and suddenly . . . and on the way he met . . . now it happened that there stood before him a man." It is "in the midst of the many-threaded, wild unsystematic of the actual," said Sittler, that "the not-expected was crossed and blessed by the not-possible."

The Endless Quest for the Perfect Novel

Summertime, and the living may not be easy, but it is a good time to embark once again on the quest to find the perfect novel. Granted, perfection is unattainable, but like the holy grail it must be sought, if for no other reason than that the quest itself is a goad, pushing the earnest reader to the library or the bookstore.

One summer a few years back the quest was sparked by a brief mention in Variety that the filmmaking team of Ismail Merchant and James Ivory had selected their next project. I had just seen their Howards End, based on the novel by E. M. Forster, a writer whose work Merchant and Ivory had earlier mined successfully with A Room with a View and, less successfully, Maurice. The team is at its best with late 19th- and early 20th-century British tales of manners and repressed sexuality, elegantly presented with scenes of green English countryside and properly managed houses. Their latest venture follows that pattern by taking up the work of contemporary author Kazuo Ishiguro.

Anthony Hopkins, who plays Henry Wilcox in Howards End, had been signed, Variety reported, for a film based on Ishiguro’s 1989 novel The Remains of the Day. Not every proposed film makes it to the screen, but just the possibility of Hopkins being reunited with Merchant-Ivory sent me to the library to read Ishiguro. What I found was as close as I expect to get to the experience of reading the perfect novel, a state enhanced considerably by the thought of Hopkins in the role of Stevens, the aging, dignified butler in one of England’s grand but fading houses.

Ishiguro, who after three novels has already been hailed as "one of the leading figures in the new generation of British novelists," might seem an unlikely successor to Forster in the Merchant-Ivory corpus. He was born in Nagasaki, Japan, in 1954 and moved as a child to England. Ishiguro’s parents initially assumed their move to England would be temporary (his father worked in oil exploration in the North Sea) but they remained, and gave their son what he describes as a "very typical . . . southern English upbringing," although only Japanese was spoken at home.

Majid Tehranian, director of a peace institute in Honolulu, who was born in Iran but is now an American citizen, spoke to a church communications conference recently in New York and made a point that is pertinent to the quest for the perfect novel. All of us live with three kinds of lies, Tehranian said: the lies we tell others, the lies we tell ourselves, and the lies we don’t even know we are telling, or more accurately, living.

We know the lies we tell to others; the lies we tell to ourselves are a bit harder to discern; but the only way really to grasp the lies we don’t even know we are living is to get outside our own cultural setting. Tehranian cites the old line of the baby fish to the mother fish: "When am I going to see this water you talk about so much?"

A bicultural, bilingual author, Ishiguro displays in The Remains of the Day strong powers of observation, coupled with a remarkable grasp of the language that would be spoken or written by his narrator, Stevens, a butler for more than 35 years at Darlington Hall.

Ishiguro takes the reader with Stevens on a six-day ride in July 1956 through the south of England, describing the early morning mists, the pleasant, green countryside, and the modest inns and homes in which he stays. During this period Stevens recalls his career in the service of Lord Darlington.

The opening lines of Stevens’s narration immediately draw the reader into the story Ishiguro intends to tell, introducing the voice of a very proper, disciplined butler who knows his place and who lives only to perform his duties. "It seems increasingly likely that I really will undertake the expedition that has been preoccupying my imagination now for some days. An expedition, I should say, which I will undertake alone, in the comfort of Mr. Farraday’s Ford; an expedition which, as I foresee it, will take me through much of the finest countryside of England to the West Country, and may keep me away from Darlington Hall for as much as five or six days."

Stevens is motivated to take this journey in part because his new master, an American named Farraday, proposes that he take some time off, and partly because he has just received a letter from Miss Kenton, a former housekeeper who departed the house more than 20 years ago. It is possible, Stevens surmises, that Miss Kenton might be interested in returning to work at Darlington Hall, since her letter suggests that her marriage has ended. With this hope of adding a veteran housekeeper to the staff—nothing of a more personal nature, of course—Stevens sets off "through the pleasant countryside," with a visit with Miss Kenton as the culmination of his journey.

It is easy to imagine Hopkins in this role, especially after his performance as Henry Wilcox, the rigid symbol of England’s upper class, "living comfortably on a lie," as one reviewer puts it. For The Remains of the Day is the diary of a man totally unaware of his inner life. Stevens faithfully records his exchanges with Miss Kenton, Lord Darlington, and others who come to the house. We know of his grief over the death of his father only because he recalls someone commenting that he is crying as he continues to perform his duties. Stevens has reached the end of his active life, and he now faces retirement—what a man he meets describes as "the remains of the day." Stevens faces this prospect with no sense that he can relate to people other than as a working butler.

What makes this novel approach perfection—and two comments on the book jacket actually employ the word—is the way Ishiguro leads the reader into Stevens’s life through his own words, enabling us to feel his pride in being a "great" butler and at the same time experience the pain of personal loss which he is utterly unable to acknowledge. Ishiguro has said, "What I’m interested in is not the actual fact that my characters have done things they later regret. I’m interested in how they come to terms with it."

Stevens’s relationship with Miss Kenton, which she understands far better than he ever could, is the centerpiece of the book. Not that he ever acknowledges any regret over the pattern the relationship took; rather, he needs to "come to terms with it." Stevens’s role as a participant in British history—Germany’s ambassador to Great Britain was a guest at Darlington Hall for some critical prewar meetings—was, like his relationship with Miss Kenton, something to observe, not to feel.

At the end of his narrative Stevens has to confront the reality of his own "remains of the day." As he looks back, he considers his life through the mists of his self-deception, of which he seems hardly aware. Speaking of Lord Darlington, he says that his employer "wasn’t a bad man at all. And at least he had the privilege of being able to say at the end of his life that he made his own mistakes . . . He chose a certain path in life, it proved to be a misguided one, but there, he chose it, he can say that at least. As for myself, I cannot even claim that. You see, I trusted. I trusted in his lordship’s wisdom. All those years I served him, I trusted that I was doing something worthwhile. I can’t even say I made my own mistakes. Really—one has to ask oneself—what dignity is there in that?"

What dignity, indeed, in not choosing one’s own life, but living always outside of it, fixed only on duty and performance. The journey with Stevens is more than a ride through the pleasant countryside of southern England; it is a journey toward one’s own "remains of the day."

Control as Original Sin

"It isn’t until several days after the accident that Lottie lets herself—makes herself—think about it. Think about how it was for all of them, for Cameron and Elizabeth, and for Jessica." The opening paragraph of Sue Miller’s novel For Love hints at the answer to the novel’s central question: "Where are the 20th-century love stories?" Her answer: "They’re not allowed." And why are they not allowed? Miller never explicitly answers this question, but her story testifies to the shallowness of modern love and of couples afraid to make the leap of faith into commitment and trust. Unable to give up control, says Miller, modern lovers settle for sex, passing infatuations, boring marriages.

Miller’s novel, for me her best and most mature, explores the interlocking lives of a small group of friends. The accident that Lottie wills herself to put out of mind and then recall is the novel’s central event, the moment when everyone’s personal control is temporarily erased. Jessica, a young woman who works as an au pair for Elizabeth, is accidentally killed by Lottie’s brother, Cameron. He is driving at night in the rain to confront Elizabeth, the woman he thinks he loves. Distracted by his worry over Elizabeth’s phone call informing him that she will return to her husband, Cameron does not see Jessica until she steps directly into the path of his car. It is an accident—but nevertheless Cameron was distracted, and he was driving the car. He must share responsibility for the accident.

Elizabeth also shares some of the blame; she initiated the affair with Cameron, and she sent Jessica out to intercept Cameron. Lottie knows that she, too, bears some of the blame because she made the suggestion that led to the fateful summer affair between Elizabeth and Cameron. In Miller’s universe there are those who know they are responsible for their actions, and there are those so lost that they deny responsibility for anything. These are, in effect, hardened sinners and repentant sinners.

Lottie, the book’s narrator, reports her experience with both types of sinners, and it is upon her own growth from hardened to repentant sinner that the book is built. As the novel begins, Lottie has returned to Boston from a troubled marriage to prepare her mother’s house for sale. Her 20-year-old son Ryan joins her to work on the house, and his presence sets up a familiar Miller situation: a mother struggling simultaneously with her roles as parent and daughter. Miller builds her narrative around physical, sweaty work on the house, a growing intimacy between Lottie and Elizabeth during coffee-break conversations, and Lottie’s discovery that her brother’s obsession is being exploited by Elizabeth to fill her lonely summer months.

Miller, a disciplined novelist, reports only what Lottie sees or hears—including an ingenious use of recorded telephone messages. Control is an absolute necessity for Lottie. It is her way of dealing with a sorrow begun in a loveless childhood (her father went to jail for fraud, adding to her sense of uncertainty). Lottie wills herself to deal with problems only when she can no longer put them aside. She now recalls painfully that as a child she was programmed to fail by her alcoholic mother (Cameron was to be the successful child).

Bruised by her first marriage—to Derek, Ryan’s father—Lottie has an affair with Jack, a romantic fling made possible because Jack’s wife has suffered several debilitating strokes. The wife dies, and Jack and Lottie marry; reality intrudes, and Lottie runs away. These details have the flavor of a soap opera, but through them Miller probes the nature of love (a topic that Lottie, a free-lance author, is researching). With a superb realist style and sensitivity to relations between men and women, Miller explores her insight that intimacy cannot survive in an atmosphere that emphasizes control.

The control/trust dilemma implicit in Miller’s story might be described in Christian terms as original sin—the absence of trust in others and in God. Miller (whose father, Jim Nichols, taught church history at several Protestant seminaries) does not employ such theological categories. But her novels suggest them, and they also indicate—as in Lottie’s final journey—that original sin need not have the final word.

Each of Miller’s three novels focuses on a family in trouble. In The Good Mother a family is disturbed by an ambiguous encounter between a child and a mother’s lover. In Family Pictures a retarded child, who the mother insists must remain at home, exercises control over a family.

Lottie’s research on love leads her to John Donne, an encounter Miller describes with her lean and evocative attention to physical detail: "One night—this was about ten days before the accident—she was reading Donne’s love poetry. Just a few of them before she dropped off to sleep, she told herself. She was propped up in bed, and the circle of light fell on the worn white sheets and across the yellowed pages of the old book. A musty smell rose from it as Lottie turned the pages, a smell that weighted the words with physical meaning for her. ‘For, not in nothing, nor in things/Extreme, and scattering bright,’ she read, ‘can love inhere.’"

Lottie moves finally toward a moment of catharsis in which, after confronting her brother with his destructive, possessive love for Elizabeth, she begins a long trip alone. She has begun to sense, with Donne, that love is not going to inhere in "things," and she has made a decision to start life anew. The journey is undertaken with the added burden of a severe toothache which Lottie tries to ignore, relying for relief on strong pills. This becomes a memorable metaphor, written in Miller’s precise physical style, of a person denying reality.

As in her earlier novels, Miller eschews a conventional happy ending. Her characters descend too far into pain and conflict for easy conclusions. They must settle instead for new beginnings built on broken pieces, which in For Love are as diverse as an old house restored and a painful tooth replaced—and as hopeful as a tentative relationship that promises to begin, for a change, with trust.

Blending Commitment and Politics

The notion that politicians must not permit their religious sensibilities to affect political decision-making has reduced political dialogue to a seminar on pragmatism. Political leaders might benefit from reflecting on a distinction Max Weber made between the morality of saints and the morality of politicians. In his classic essay "Politics as Vocation," Weber did not seek to remove ethics from politics but urged politicians to blend ethical commitment with a pragmatic ethic of responsibility.

In our highly secularized environment, politicians are intimidated from expressing a commitment to ethical standards. At best they fall back instead on safe phrases like "family values." Afraid of being branded as moralists, or even worse, proselytizers, politicians cling to surface arguments that remain in the public’s comfort zone, choosing sides in the familiar debates on school prayer, pornography, media immorality and abortion.

Without an ethic of commitment behind our ethic of pragmatic responsibility there is no guide to being responsible. We have elevated pragmatism to the sole measurement of our political behavior. What moral discourse there is occurs in easily digestible sound bites: Murphy Brown sets a bad example; adoption is better than abortion; and (one of my favorites from Pat Buchanan) school prayer makes students productive.

Václav Havel, an author and the president of Czechoslovakia, argues in Summer Meditations (Knopf) that "all genuine politics" has a moral origin. Ralf Dahrendorf, writing in the New York Times, reflects on Weber’s notion of politics and comments that "Havel’s every page breathes the spirit that made him the authentic spokesman of the Eastern European revolution of 1989, which was in his words about ‘living in truth.’" What is paramount to Havel the writer and what he now seeks to implement as a political leader is the belief that what finally matters is not power but "decency, reason, responsibility, sincerity, civility and tolerance."

Our attention, however, is focused almost entirely on solving short-term problems. Alice Hoffman’s novel Turtle Moon has a character named Lucy, a young divorced mother distressed over the behavior of her teen-age son. Reflecting on the physical complaints she hears from other mothers about their children, she thinks, "There is, after all, strong brown soap for poison ivy, iodine for cuts and bruises, mud for bee stings, honey for sore throats, chalky white casts for broken bones. But where is the cure for meanness of spirit? What remedy is available for unhappiness and thievery? Certainly, if it were anywhere in Florida, Lucy would have found it, since the sharp yellow afternoon sunlight hides nothing. It’s the sort of light that makes it difficult to begin all over again and doesn’t allow for much invention. You are what you see in the mirror above the sink—in Lucy’s case, a pretty woman with slightly green hair whose son hates her."

It is very difficult for our society to acknowledge the reality of "meanness of spirit," for there is no immediate cure for such a fundamental flaw. We do not solve the problems of urban decay by the application of brown soap or iodine. There is something seriously wrong with our society, but we do not begin to identify it. We have allowed the triumph of secularity to lull us into believing that meanness of spirit can be cured by a few Band-Aids, or ballistic missiles, or junk bonds.

To fill the vacuum left by the departure of religion from our public realm, with its diminution of spiritual goals, ideals and priorities, we have adopted a language that is ethically neutral. That neutrality leads us to elevate secularity to supremacy. The question that excites us is not, What is good? or What is just? or What is best for the larger community? but, Where’s mine? The Los Angeles looters were first cousins of the Wall Street pirates who loot our corporations with their buyouts, or the CEOs who demand and receive salaries and bonuses equal to the budgets of some countries.

Having lost a sense of transcendence in our common life, we look for meaning in power, achievement and success. As a nation we have no basis of measurement by which to judge what is of value. A cover story in Newsweek inspired by the Murphy Brown discussion asked, "Whose values?" The question is proper; but the answer from Newsweek was remarkably obtuse. Accustomed as I am to seeing religion blanked out in secular discussions, I was still surprised to find that Newsweek’s various writers on the topic managed to ignore religion. One interview referred to the Baptist background of a woman who discussed how she raised her four sons. The interview itself, however, allowed for no reference to such basics as, say, the Ten Commandments, or sacrificial love, or loving one’s neighbor as oneself.

One headline, "The Original Sin," suggested that here at last the topic might be examined within a religious framework. But alas, the reference was not to Eve, Adam or the fruit of the tree, but to a John F. Kennedy speech calling for deficit spending to jump-start the economy, a step the writer believes started us down the road to economic ruin. The "original sin" of the title referred to a sin against the one god in our culture that really matters.

Religious language is enough a part of our history that the magazine could play with the term "original sin" in the headline. Meanwhile, while media and political leaders carefully avoid religious references, a majority of Americans are expressing their frustration and anger either by not voting or by embracing candidates who promise quick and easy solutions to complex problems. It is time we said to our leaders that while we don’t expect to elect any saints to public office, we have had more than enough of political pragmatism rooted in nothing but the desire to win the next election.