The Evangelical-Jewish Alliance

Yielding to increasing pressure to show the Arab and Islamic worlds (and much of Europe) that he is sensitive to the plight of the Palestinian people, President George W. Bush recently declared his commitment to implement a "road map" to an Israeli-Palestinian peace. Meanwhile, a powerful domestic countermovement capable of undermining the U.S. initiative is well under way. Rising opposition from the conservative Ariel Sharon-led Israeli government and its powerful U.S. lobby, the American Israel Public Affairs Committee (AIPAC), was to be expected. But the most numerically significant opposition is coming from the Christian right, an important constituency for the president if he is to be reelected in 2004.

Republican election advisers undoubtedly are watching the Christian right carefully. An early April rally in Washington, D.C., organized by the International Fellowship of Christians and Jews drew key Christian right leaders like Gary Bauer, president of American Values. Bauer told the crowd that "whoever sits in Washington and suggests to the people of Israel that they have to give up more land in exchange for peace, that is an obscenity." Other key players convening the rally included the Christian Coalition and AIPAC. Major speakers such as the Israeli ambassador to the U.S., Daniel Ayalon, and pro-Israel Congressmen Eric Cantor (R., Va.) and Tom Lantos (D., Calif.) encouraged the predominantly evangelical Christian participants to campaign against any plan that would force Israel to abandon its settlements or relinquish land now under its control. Key provisions of the road map call for Israel to make concessions on both issues.

For the president, pushing for the implementation of the road map will require a careful balancing act. Since the September 11 terrorist attacks, he has solidified political support from three important constituencies: neoconservative intellectuals, American Jews (including members of the influential pro-Israel lobby) and fundamentalist Christians, constituencies that find common ground in their vigorous support for Israel.

A decisive moment in the forging of this alliance occurred in April 2002, while the Israeli army was demolishing several cities and refugee camps in the West Bank following the dreadful Passover terrorist bombings. Under increasing international pressure, Bush repeatedly appealed to Sharon to withdraw from the West Bank city of Jenin. The pro-Israel lobby, in coordination with the Christian right, mobilized over 100,000 e-mail messages, calls and visits urging the president to avoid restraining Israel. The tactic worked. The president uttered not another word of criticism or caution, and Sharon continued the offensive. As Christian televangelist Jerry Falwell commented during an October interview on 60 Minutes: "I think now we can count on President Bush to do the right thing for Israel every time."

Falwell spoke for a large number of Christian Zionists in the U.S., Christians who believe that the modern state of Israel is the fulfillment of biblical prophecy and so deserves unconditional political, financial and religious support. Christian Zionists work closely with religious and secular Jewish Zionist organizations and the Israeli government, particularly during periods when the more conservative Likud Party is in control of the Israeli Knesset (parliament). Though Falwell claims to be speaking for over 100 million Americans, the number is actually closer to 25 million.

Mainstream evangelicals number between 75 and 100 million; fundamentalist and dispensationalist evangelicals, whom Falwell represents, between 20 and 25 million.

Christian Zionism grows out of a particular theological system called premillennial dispensationalism, which originated in early 19th-century England. The preaching and writings of a renegade Irish clergyman, John Nelson Darby, and a Scottish evangelist, Edward Irving, emphasized the literal and future fulfillment of such teachings as the Rapture, the rise of the Antichrist, the Battle of Armageddon, and the central role that a revived state of Israel would play during the end days. Darby and Irving argued that portions of the books of Daniel, Ezekiel, Zechariah and Revelation predict when Jesus will return and how the final battle of history will take place.

Darby brought these doctrines to the U.S. during eight missionary journeys. They captured the hearts and minds of those who attended Bible and prophecy conferences in the years after the Civil War. Darby’s teachings were featured in the sermons of some of the great preachers of the 1880-1920 period: the evangelists Dwight L. Moody and Billy Sunday; major Presbyterian preachers such as James Brooks; Philadelphia radio preacher Harry A. Ironside; and Cyrus I. Scofield. Scofield applied Darby’s eschatology to his version of the scriptures and provided an outline of premillennial dispensationalist notations on the text. The Scofield Bible (1909) gave dispensationalist teachings much of their prominence and popularity. It became the Bible version used by most evangelical and fundamentalist Christians for the next 60 years.

Christian Zionists insist that all of historic Palestine -- including all the land west of the Jordan which was occupied by Israel after the 1967 war -- must be under the control of the Jewish people, for they see that as one of the necessary stages prior to the second coming of Jesus. Among their other basic tenets:

• God’s covenant with Israel is eternal, exclusive and will not be abrogated, according to Genesis 12:1-7; 15:4-7; 17:1-8; Leviticus 26:44-45; Deuteronomy 7:7-8.

• The Bible speaks of two distinct and parallel covenants, one between God and Israel, one between God and the church. The latter covenant is superseded by the covenant with Israel. The church is a "mere parenthesis" in God’s plan and as such it will be removed from history during an event called the Rapture (1 Thess. 4:13-17; 5:1-11). At that point, Israel, the nation, will be restored as the primary instrument of God on earth.

• Genesis 12:3 ("I will bless those who bless you and curse those who curse you") should be interpreted literally -- which leads to maximum political, economic, moral and spiritual support for the modern state of Israel and for all the Jewish people.

• Apocalyptic texts like the Book of Daniel, Zechariah 9-12, Ezekiel 37-8, I Thessalonians 4-5 and the Book of Revelation refer to literal and future events.

• The establishment of the state of Israel, the rebuilding of the Third Temple, the rise of the Antichrist and the buildup of armies poised to attack Israel are among the signs leading to the final eschatological battle and Jesus’ return for his thousand-year reign. The movement looks for the escalating power of satanic forces aligned with the Antichrist that will do battle with Israel and its allies as the end draws near. Judgment will befall nations and individuals according to how they "bless Israel."

Christian Zionism has significant support within Protestant fundamentalism, including much of the Southern Baptist Convention and the charismatic, Pentecostal and independent churches. The movement can also be found in the evangelical wings of the mainline Protestant churches (Presbyterian, United Methodist and Lutheran) and to a lesser degree in Roman Catholicism. Its reach is broad, since premillennialist dispensationalist themes are advanced through Christian television, radio and publishing. The National Religious Broadcasters organization, which controls almost 90 percent of religious radio and television in the U.S., is dominated by a Christian Zionist orientation.

The alliance of Christian Zionists and the pro-Israel lobby solidified during the Reagan administration, although it declined somewhat during the first Bush administration and the Clinton years. Clinton’s Israeli ties were with the secular Labor Party, led by Yitzhak Rabin and Shimon Peres, not with the conservative Likud Party. Through this alliance Clinton embraced the Oslo peace accords, which were opposed by Likud and the Christian Zionists because the accords called for reductions, however modest, in the expansion of Jewish settlements and asked that Israel withdraw from a significant portion of the West Bank, East Jerusalem and the Gaza Strip.

Shortly after Rabin’s assassination, Likud’s Benjamin Netanyahu became prime minister. Long a favorite of Christian Zionists, he convened the Israel Christian Advocacy Council, inviting 17 U.S. Christian fundamentalists to Israel for a tour of the Holy Land and a conference that produced a statement that resembled the Likud platform with biblical footnotes. The declaration included a blanket rejection of any outside pressure on Israel to abandon the settlements in the West Bank, East Jerusalem, the Gaza Strip and the Golan Heights. The Christian group supported a united Jerusalem under Israeli sovereignty rather than a Jerusalem shared by Palestinians and Israelis.

After they returned to the U.S., members of the Israel Christian Advocacy Council launched a campaign, "Christians Call for a United Jerusalem," with full-page advertisements in major newspapers and Christian journals. The advertisement carried several of the familiar Christian Zionist and Likud Zionist themes, including the claim that "Jerusalem has been the spiritual and political capital of only the Jewish people for 3,000 years." Citing Genesis 12:1-7, Leviticus 26:44-45 and Deuteronomy 7:7-8, the ad stated that "Israel’s biblical claim to the land" was "an eternal covenant from God." Among the signers were Pat Robertson of CBN; Ralph Reed, the director of the Christian Coalition; Jerry Falwell; Brandt Gustafson, president of the National Religious Broadcasters; Don Argue, president of the National Association of Evangelicals; and Ed McAteer of the Religious Roundtable, one of the first Christian Zionist organizations in North America.

The ad campaign was a direct response to a campaign by mainline Protestant, Orthodox and Catholic churches, launched in April 1997, for "a shared Jerusalem." The United Jerusalem campaign claimed that the Christian Zionists and fundamentalists spoke for all evangelicals in North America, stating that "the signatories and their organizations reach more than 100 million Christian evangelicals weekly." These inflated numbers were meant to impress members of Congress, the media, and any evangelicals who took a different view.

In the late 1990s donations to Israel and to the Jewish National Fund declined because of the tensions between Orthodox Jews in Israel and Reform and Conservative Jews in the U.S. The loss of funding caused the Likud to turn to Christian Zionists for assistance, an appeal that met with a quick response. Additional support came through a campaign led by the International Fellowship of Christians and Jews, headed by a former Anti-Defamation League employee, the Orthodox rabbi Yechiel Eckstein. In 1997 this campaign claimed that it raised over $5 million from fundamentalist Christians. John Hagee’s Cornerstone Church in San Antonio, Texas, presented Eckstein with more than $1 million -- funds for resettling Jews from the Soviet Union in the West Bank and Jerusalem.

The Christian fundamentalist and Christian Zionist worldview converges with the agenda of neoconservatives like William Kristol, editor of the Weekly Standard; syndicated journalists William Safire and Charles Krauthammer; and the chief advisers in the Bush White House -- Paul Wolfowitz, Richard Perle, Douglas Feith and Elliott Abrams. Many of these figures used to work with pro-Israel think tanks such as AIPAC; MEMRI (Middle East Media Research Institute); JINSA (Jewish Institute for National Security Affairs); and the Washington Institute for Near East Policy. Perle narrowly escaped conviction for trading intelligence secrets with Israel in the late 1970s, and Abrams was convicted (and later pardoned by the first President Bush) in the Iran-Israel-contra weapons and financial scandal.

The neoconservatives’ quest for U.S. domination of the oil fields in the Middle East and of military and economic geopolitics in that region aligns neatly with the views of Harvard scholar Samuel P. Huntington, whose "clash of civilizations" theory divides the world into the West vs. the Rest. In the Huntington scenario, Islam is the force most hostile to U.S. interests -- a point of view that fits well with the "evil empire" rhetoric and the Antichrist scenarios found among the Christian Zionists. The "clash of civilizations" rhetoric often takes on theological overtones, as it did in the president’s 2002 and 2003 State of the Union addresses.

The advisers around Secretary of Defense Donald Rumsfeld and Vice President Dick Cheney are driving their views home. Anatol Lieven, writing recently in the London Review of Books, points to a 1996 policy paper, "A Clean Break: A New Strategy for Securing the Realm," by Perle and Douglas Feith, which advised Netanyahu to abandon the Oslo peace process and return to military repression of the Palestinians. The policy statement was developed in an Israeli think tank, the Institute for Advanced Strategic and Political Studies. The document seems to have played a large part in shaping the Bush administration’s strategy on Iraq, and perhaps for redrawing Middle East borders according to the Likud vision.

Rumsfeld’s support of Israel’s illegal settlements, which he views as Israel’s "right" for having conquered the Palestinian territories, indicates that he agrees with Perle and Feith. Few have mentioned that Rumsfeld’s position violates existing U.S. policy, let alone international law and the international consensus on the issue. Republican Dick Armey, former House majority leader, agrees, and even advocates ethnic cleansing ("transfer") in Palestine.

Israel’s leading voice for "transfer," Tourism Minister Benny Elon, recently met with several members of the House and Senate, including Armey and Lindsey Graham (R., S.C.) to advocate transferring Palestinians to Jordan. While Elon’s views are linked with the radical fringe in Israel, his "transfer" concept is gaining support among Christian Zionist legislators and key spokesmen of the Christian right. Some Israeli analysts speculate that the purpose of Elon’s visit was to urge Israel’s "friends" in the Christian right and on Capitol Hill to tell the president not to pressure Israel to surrender land and settlements to a future Palestinian state. Newsweek’s June 2 edition reported that in mid-May, just prior to Elon’s visit, several Israeli officials contacted Bauer to rally the Christian right in opposition to the "road map."

The dominance of Christian right, Christian Zionist and Likud policies in the Bush administration reflects political realities. In 1987 polls indicated that the Christian right represented 26 percent of the total membership of the Republican Party. By 1999 that number had increased to 33 percent and was rising. The influence of pro-Israel groups and Christian Zionists in such vital swing states as Texas, California and all-important Florida may well have been the deciding factor for Bush in the 2000 election. Bush is very aware that he owes a political debt to this voting bloc.

The May 2002 "Washington Rally for Israel," which drew, according to some accounts, well over 100,000 people to the Washington Mall, illustrates the influence of these forces. An impressive lineup of U.S. politicians was joined by leading voices from the Christian right, Likud and mainstream American Jewish organizations. The list included Netanyahu; Wolfowitz; Holocaust writer Elie Wiesel; New York Governor George Pataki; former New York Mayor Rudolph Giuliani; U.S. Senators Arlen Specter (R., Pa.) and Barbara Mikulski (D., Md.); and leading members of the House such as Armey (R., Tex.) and Richard Gephardt (D., Mo.).

However, the loudest cheers at the rally went not to these political leaders but to a voice relatively unknown to the secular media, Christian radio personality Janet Parshall, host of the nationally syndicated Janet Parshall’s America. Parshall drew a deafening ovation when she proclaimed: "I stand before you today representing the National Religious Broadcasters. . . . We represent millions of Christian broadcasters in this country. We stand with you now and forever. . . . I am here to tell you today, we Christians and Jews together will not labor any less in our support for Israel. We will never limp, we will never wimp, we will never vacillate in our support of Israel."

The cozy partnership contains many contradictions, not the least of which is that within the Christian premillennial dispensationalist scenario, Jews ultimately have two options: either convert to Christianity or be incinerated at Armageddon. Israeli author Gershon Gorenberg (The End of Days) notes that dispensationalism is essentially a four-act play, "where we as Jews disappear in the fourth act, just prior to the return of Jesus." Further, anti-Semitism is often just beneath the surface among Christian Zionists and fundamentalists. Just two years ago Jerry Falwell claimed that "God told him" that the Antichrist is a Jew living in Romania -- a statement for which he later profusely apologized. And the Christian right’s agenda includes the creation of a "Christian America."

Despite these contradictions, not only AIPAC but mainstream Jewish organizations such as the Conference of Presidents of Major American Jewish Organizations and the Anti-Defamation League of B’nai B’rith have allied themselves with Christian Zionist organizations such as the Christian Coalition, the Religious Roundtable and the 700 Club. Surprisingly, many progressive Reform rabbis have expressed public support for the Christian Zionists and the Christian right, knowing full well that the Christian right’s theological and political agendas are contrary to the Reform Jewish community’s longstanding progressive stance on civil liberties and human rights.

I once asked Israel’s director of religious communities if he was aware of the implications of the alliance with fundamentalist Christians, particularly in light of their history of anti-Semitism, their dedication to the Christianizing of America, and the "convert or fry" Armageddon scenarios. His response was: "Of course we know all this, but we will take support wherever we can get it, and their numbers are significant. We do keep them on a short leash, however." At the April rally Ambassador Ayalon told the crowd, "We share the same belief in God and we share the same destiny" -- a statement that appears to be crafted along the lines of the Likud Party platform.

The clash between Likud/Christian Zionist ideology and the promise of the road map will inevitably come to the fore as the 2004 presidential election campaign heats up. Mitri Raheb, the Palestinian pastor of Bethlehem’s Christmas Lutheran Church, fears that if the pro-Israel voices prevail, the "road map" will turn into a "road trap" for Palestinians and for those Israelis committed to a two-state solution.

Muslims and other non-Jewish religious minorities in the U.S. have no standing with the Christian right; indeed, Christian Zionists are openly hostile toward Islam. Though an evangelical-Islamic dialogue has begun, it is too new to begin to counter the voices of outspoken Christian right leaders such as Franklin Graham, Jerry Falwell and Pat Robertson, who have consistently portrayed Islam as an evil force that will align itself with the Antichrist to attack Israel, leading to the Battle of Armageddon.

This doctrine fits well with the Bush "axis of evil" concept, which can readily be applied to any nation in which Muslims are the majority. Now that Afghanistan and Iraq are occupied by the U.S., neoconservatives, Israeli politicians and the Christian right seem to be targeting Iran, Syria and possibly Saudi Arabia. In a revealing remark in London in December 2002, Sharon noted that once the U.S. and its allies dispose of Saddam Hussein and Iraq, Iran will be next on the list. Will such views dominate U.S. foreign policy? Secretary of State Colin Powell’s speech to the AIPAC convention on March 30 included a warning to Iran and Syria -- an indication Sharon’s vision is alive in U.S. policy.

The Christian Zionist distortions of historic evangelical and orthodox theology must be debated and confronted primarily by evangelicals but also by mainline Protestants, whose churches sometimes absorb these doctrines. Christian Zionist and dispensationalist thinking appears to be growing in influence, especially in the Bible Belt and pockets of the West Coast and rural America. As it spreads it will dominate more and more of our culture and thus exert a growing influence on politics. Christian and Jewish theologians need to attend to the deep inroads made by millennial theology and its political alliances.

The biblical hermeneutic of Christian Zionism distorts biblical texts by reading them out of their canonical and historical context, making them seem more like such fictional works as the "Left Behind" series than the whole Word of God. The Christian Zionist worldview elevates Israel to a political entity not accountable for keeping Torah or obeying the norms of international law. In its justification of Israel’s illegal program of land confiscation, demolition of homes, targeted assassinations and continued transfer of Palestinians from their homeland, the Christian right and revisionist Zionist ideology encourage the breaking of the Ten Commandments and the Levitical codes. Christian Zionists have traded the mantle of the biblical prophets for an idolatry of militarism and the nation state.

An additional task for Christians is to make a closer examination of ecclesiology. Christian Zionism is grounded in a reductionist ecclesiology in which the state is elevated above the church. Such a view is inconsistent with the New Testament and traditional Christian theology. Darby’s doctrine that the church is a "mere parenthesis" enables Christians to minimize the role of the global church and to ignore or openly despise Palestinian Christians.

If present trends continue, the Palestinian Christian community, which claims a historic continuity dating back to the first disciples, will disappear from the Holy Land, leaving behind nothing but museums or shells of churches. Palestinian Christians are fleeing their homeland not because of Islamic fundamentalism, as many Israelis and Christian Zionists would have us believe, but because their lives, livelihood, families and future are doomed by the continued Israeli occupation. In providing political and economic support for Israeli militancy against Palestinian Christians and Muslims, Christian Zionists are aiding the collapse of Christianity in the Holy Land.

Evangelicals and Israel: Theological Roots of a Political Alliance

When Israeli Prime Minister Benjamin Netanyahu visited Washington this past January, his initial meeting was not with President Clinton but with Jerry Falwell and more than 1,000 fundamentalist Christians. The crowd saluted the prime minister as "the Ronald Reagan of Israel," and Falwell pledged to contact more than 200,000 evangelical pastors, asking them to "tell President Clinton to refrain from putting pressure on Israel" to comply with the Oslo accords.

The meeting between Netanyahu and Falwell illustrates a remarkable political and theological convergence. The link between Israel's Likud government and the U.S. Religious Right was established by Netanyahu's mentor, Menachem Begin, during the Carter and Reagan administrations. However, the roots of evangelical support for Israel lie in the long tradition of Christian thinking about the millennium.

In Luke's account of the ascension, the disciples ask Jesus, "Lord, is this the time when you will restore the Kingdom to Israel?" The question illustrates the early church's fascination with Israel and its prophetic role at the end of history--a fascination that continues to this day. Reflections on the end times draw on the Book of Daniel, Zechariah 9-14, Ezekiel 38-39 and various apocryphal books, as well as Matthew 24, the early Pauline letters (1 Thess. 4:16-17; 5:1-11) and the Book of Revelation.

An early version of Christian eschatology, called "historic premillennialism," held that Jesus would return and establish his millennial kingdom after the world had been evangelized. However, by the 18th century another model of eschatology emerged in England that emphasized the role of a reconstituted Israel in the end times. This eschatology was rooted in three streams of British Christianity: the piety of English Puritanism; the view that Britain was the "new Israel," a theme that dates back at least to the seventh century and the Venerable Bede; and a hermeneutic that interpreted biblical prophetic texts as having a literal, future fulfillment. Among the forerunners of this movement was Sir Henry Finch, a prominent lawyer and member of Parliament. In 1621, Finch wrote a treatise in which he called upon the British people and its government to support Jewish settlement in Palestine in order to fulfill biblical prophecy.

As the year 1800 approached, several premillennial theologies emerged as a result of the insecurity surrounding the American and French revolutions. Among them were various utopian movements and the Millerites (a group that later became Seventh-day Adventists). During this period John Nelson Darby (1800-82), a renegade Anglican priest from Ireland, popularized and systematized eschatological themes while simultaneously developing a new school of thought which has been called "futurist premillennialism."

During 60 years of unceasing travel and preaching across the European continent and North America, Darby converted a generation of evangelical clergy and laity to his views. Darby held that biblical prophecies and much of scripture must be interpreted according to a literal and predictive hermeneutic. He believed that the true church will be removed from history through an event called the "rapture" (I Thess. 4:16-17; 5:1-11), and the nation Israel will be restored as God's primary instrument in history.

According to Darby, Christians must interpret history in light of seven epochs or "dispensations," each of which reflects a particular manner in which God deals with humanity. For example, we currently live under the dispensation of "Grace," whereby people are judged according to their personal relationship with Jesus Christ. This hermeneutical method is called dispensationalism.

According to the dispensational model, a time of turmoil lies ahead, but believers will be "raptured" away before it begins. This period of tribulation will culminate in the final battle at Armageddon, a valley northwest of Jerusalem. As evangelical historian Timothy Weber points out, for premillennialists "the historical process is a never-ending battle between good and evil, whose course God has already conceded to the Devil. . . . History's only hope lies in its own destruction."

Through Darby's influence, premillennial dispensationalism became a dominant method of biblical interpretation and influenced a generation of evangelical leaders, including Dwight L. Moody. Perhaps the most influential instrument of dispensational thinking was the Scofield Bible (1909), which included a commentary that interpreted prophetic texts according to a premillennial hermeneutic. Another early Darby disciple, William E. Blackstone, brought dispensationalism to millions of Americans through his best seller Jesus Is Coming (1882). Blackstone organized the first Zionist lobbying effort in the U.S. in 1891 when he enlisted J. P. Morgan, John D. Rockefeller, Charles B. Scribner and other financiers to underwrite a massive newspaper campaign requesting President Benjamin Harrison to support the establishment of a Jewish state in Palestine.

Similar efforts were under way in England, led by the social reformer Lord Shaftesbury, who, like Blackstone, was so taken with Darby's eschatology that he translated it into a political agenda. These seeds of the Christian Zionist movement preceded Jewish Zionism by several years. Lord Shaftesbury is also credited with coining an early version of the slogan adopted by Jewish Zionist fathers Max Nordau and Theodor Herzl: "A land of no people for a people with no land." Both Lord Arthur Balfour, author of the famous 1917 Balfour Declaration, and Prime Minister David Lloyd George, the two most powerful men in British foreign policy at the close of World War I, were raised in dispensationalist churches and were publicly committed to the Zionist agenda for "biblical" and colonialist reasons.

The establishment of Israel in 1948 gave dispensationalism new momentum. The restoration of a Jewish nation was taken as a sign that the clock of biblical prophecy was ticking and we were rapidly approaching the final events leading to the return of Jesus. During the cold war, dispensationalists readily interpreted the Soviet Union and its allies as the Antichrist. Passages such as Ezekiel 38-39 were read as predictions of an impending Soviet attack on Israel. A ten-member confederation--often interpreted as the European Union--was expected to join the Soviet Union in this attack.

When Israel captured Jerusalem in the 1967 war, dispensationalists were certain that the end was near. L. Nelson Bell, Billy Graham's father-in-law and editor of Christianity Today, wrote in July 1967: "That for the first time in more than 2,000 years Jerusalem is now completely in the hands of the Jews gives the student of the Bible a thrill and a renewed faith in the accuracy and validity of the Bible."

By the early 1970s numerous books, films and television specials publicized the premillennial dispensationalist perspective. Hal Lindsey made a virtual industry out of his book The Late Great Planet Earth: it sold more than 25 million copies and led to two films, as well as a consulting business with a clientele that has included several members of Congress, the Pentagon, and Ronald Reagan.

In the mid-1970s at least five trends converged that accelerated the rise of Christian Zionism. First, evangelical and charismatic movements became the fastest-growing branch of North American Christianity. Mainline Protestant denominations and the Roman Catholic Church were declining both in budgets and attendance.

Second, the election of Jimmy Carter, a Southern Baptist Sunday school teacher, to the presidency in 1976 increased the visibility and legitimacy of the once-marginalized evangelical movement. Time magazine declared 1976 "the year of the evangelical." Still, the mainstream media seemed confused by the various traditions and polarities within the complex evangelical movement, failing to distinguish between the diverse political and theological voices clamoring to claim the term "evangelical" for their particular viewpoint.

Israel's occupation of Arab lands after 1967 created tension between many Jewish organizations and the mainline Protestant, Eastern Orthodox and Catholic communities. Third, many Jewish organizations, particularly lobbying groups such as the American Israel Public Affairs Committee (AIPAC), turned to the growing evangelical community for support. As Rabbi Marc Tanenbaum of the American Jewish Committee stated, "The evangelical community is the largest and fastest-growing bloc of pro-Jewish sentiment in this country." AIPAC and the Anti-Defamation League (ADL) added staff to focus on relationships with evangelicals and fundamentalists. The Israeli ministry of tourism eyed evangelicals as a major new market for Holy Land tours and thus a source of revenue.

The fourth factor that stimulated the emerging evangelical Christian Zionist movement's political agenda was the election of Menachem Begin as Israel's prime minister in May 1977. Prior to Begin's election, Israeli politics had been dominated by the secular Labor Party. Begin's Likud Party was dominated by hard-line military figures such as Raphael Eitan and Ariel Sharon, and supported by the increasingly powerful settler movement and by small Orthodox religious parties. Likud constituencies used the biblical names "Judea and Samaria" for the West Bank and employed a religious argument to justify Israel's confiscation of Arab land for settlements: since God gave the land exclusively to Jews, they have a divine right to settle anywhere in Eretz Israel. Evangelicals welcomed the Likud leaders and endorsed their political and religious agendas.



The final development that accelerated the alliance between Likud and the Religious Right was Carter's March 1977 statement that he supported Palestinian human rights, including the "right to a homeland." Likud, when it came to power just two months later, immediately reached out to Christian evangelicals. Likud's strategy was simple: split evangelical and fundamentalist Christians from Carter's political base and rally support among conservative Christians for Israel's opposition to the United Nations' proposed Middle East Peace Conference.



Within weeks, full-page advertisements appeared in major U.S. newspapers stating, "The time has come for evangelical Christians to affirm their belief in biblical prophecy and Israel's divine right to the land." Targeting Soviet involvement in the UN conference, the ad went on to say: "We affirm as evangelicals our belief in the promised land to the Jewish people . . . . We would view with grave concern any effort to carve out of the Jewish homeland another nation or political entity."



The ad was financed and coordinated by Jerusalem's Institute for Holy Land Studies, an evangelical organization with a Christian Zionist orientation. Several leading dispensationalists signed the ad, including Kenneth Kantzer of Christianity Today and Trinity Evangelical Divinity School, singer Pat Boone, and dispensationalist theologian and Dallas Theological Seminary president John Walvoord.



The advertising campaign was one of the first public signs of a Likud-evangelical alliance. A former employee of the American Jewish Committee, Jerry Strober, who had coordinated the campaign, made the political connection in a statement to Newsweek: "[The evangelicals] are Carter's constituency and he [had] better listen to them... The real source of strength the Jews have in this country is from the evangelicals."



At times the new alliance was uncomfortable for Jewish leaders. On one such occasion, the president of the Southern Baptist Convention, Bailey Smith, stated that "God does not hear the prayers of the Jews." Within weeks, the AJC took Smith on a trip to Israel and corrected his views. While Christian Zionists and Jewish organizations agree on many points, the Christian Right's enthusiasm for evangelizing Jews remains an unresolved point of tension.



Evangelicals, major Jewish organizations and the pro-Israel lobby supported Ronald Reagan in the 1980 election. Carter's loss of the evangelical vote played a significant role in his defeat. Likud policy was aggressively represented by AIPAC both on Capitol Hill and within the Reagan administration. For example, when Israel decided to invade Lebanon in the spring of 1982, Begin sent Ariel Sharon, his defense minister, to Washington to enlist the Reagan administration's support. By late May, Sharon was reportedly given the green light by Secretary of State Alexander Haig. Within days of the June invasion, full-page ads appeared in leading newspapers requesting evangelical support for the invasion.



Begin developed a unique relationship with Reagan and many fundamentalist leaders, especially Jerry Falwell. Falwell and his Moral Majority had long supported Israel. In 1979, Grace Halsell reports, Israel gave Falwell a Lear jet and in 1981 gave him the prestigious Jabotinsky Award during an elaborate dinner ceremony in New York. When Israel bombed Iraq's nuclear plant in 1981, Begin called Falwell before he called Reagan. He requested that Falwell "explain to the Christian public the reasons for the bombing."



In March 1985, while speaking to the conservative Rabbinical Assembly in Miami, Falwell pledged to "mobilize 70 million conservative Christians for Israel and against anti-Semitism." He also takes credit for converting Senator Jesse Helms (R., N.C.) into one of Israel's staunchest allies. Helms soon became chair of the Senate Foreign Relations Committee.



The Reagan administration regularly conducted briefings and seminars for its Christian Right supporters, briefings in which the pro-Likud lobby (Americans for a Safe Israel and AIPAC) participated. Among the approximately 150 Christian fundamentalist leaders invited to each event were Hal Lindsey, Jimmy Swaggart, Jim and Tammy Bakker, Pat Robertson and Tim and Beverly LaHaye.



Reagan himself was a committed Christian Zionist. His support for Israel derived from both strategic political concerns and a vague dispensationalist perspective. He told Tom Dine, AIPAC's executive director: "I turn back to your ancient prophets in the Old Testament and the signs foretelling Armageddon, and I find myself wondering if we're the generation that is going to see that come about." The remark was published by the Jerusalem Post and widely distributed by the Associated Press.



Netanyahu's 1996 defeat of Shimon Peres brought Likud back to power. During his years as Israel's representative at the UN, Netanyahu spoke regularly on the Christian Right's "Prayer Breakfast for Israel" circuit and similar venues. Within a few months of his election, in conjunction with the Israeli ministry of tourism, he convened the Israel Christian Advocacy Council. Seventeen American evangelical and fundamentalist leaders were flown to Israel for a tour of the Holy Land and a conference at which they pledged support for what was essentially a Likud agenda. Included in the delegation were Don Argue, president of the National Association of Evangelicals; Brandt Gustavson, president of the National Religious Broadcasters (an organization that oversees approximately 90 percent of Christian radio and television broadcasting in North America); and Donald Wildmon, president of the American Family Association. The evangelical leaders signed a pledge expressing the hope that "America never, never desert Israel."



Several members of the Advisory Council backed the pro-Israel advertisement in the April 10, 1997, New York Times. Titled "Christians Call for a United Jerusalem," the ad may have been a direct response to a December 1996 Times ad sponsored by Churches for Middle East Peace, calling for a "Shared Jerusalem."



The Christian Zionist ad claimed that its signatories reach more than 100,000 Christians weekly and called for evangelicals to support the Likud position on Jewish sovereignty over Jerusalem. Using several familiar dispensationalist themes, the ad claimed: "Jerusalem has been the spiritual and political capital of only the Jewish people for 3,000 years." Citing Genesis 12:1-7, Leviticus 26:44-45 and Deuteronomy 7:7-8, it spoke of Israel's biblical claim to the land. The ad was signed by Pat Robertson of the Christian Broadcasting Network; Ralph Reed, then director of the Christian Coalition; Ed McAteer of the Religious Roundtable; and Falwell, among others. Voicing one of Netanyahu's themes, the ad asked that Israel "not be pressured to concede on issues of Jerusalem in the final status negotiations with the Palestinians."



Likud also turned to evangelical and fundamentalist Christians to offset the decline in contributions for Israel from the American Jewish community. In response to the increasing power of the Orthodox parties in Netanyahu's government and the second-class status these parties assigned to non-Orthodox Jews, Reform and Conservative Jewish communities cut back their usual generous contributions to the Jewish National Fund and other agencies in the U.S. that support Israel. But the International Fellowship of Christians and Jews, led by Rabbi Yechiel Eckstein of Chicago, raised more than $5 million for the United Jewish Appeal, almost all of it from evangelicals and fundamentalists.



In a separate initiative, John Hagee, pastor of the Cornerstone Church in San Antonio, Texas, and a signer of the Christians for a United Jerusalem Statement, announced in February of this year that his church was giving more than $1 million to Israel. He claimed that the money would be used to help resettle Jews from the former Soviet Union in the West Bank and Jerusalem. "We feel like the coming of Soviet Jews to Israel is a fulfillment of biblical prophecy," Hagee stated. When asked if he realized that his support of Israel's Likud policies was at cross-purposes with U.S. government policy and possibly illegal, Hagee retorted: "I am a Bible scholar and a theologian and from my perspective, the law of God transcends the law of the United States government and the U.S. State Department."



While the U.S. and European governments in 1997 were pressing Netanyahu to negotiate with the Palestinians, the prime minister's public relations specialists developed another strategy involving the cooperation of Christian Zionist organizations in Jerusalem. The initial phase of this strategy was launched in an October 22, 1997, report on Israeli Radio (Kol Israel) News, a report claiming that the Palestinian National Authority (PA) was persecuting Christians.



Two days later the Jerusalem Post published an article charging that, according to a new Israeli government report, "the few Christians remaining in PA-controlled areas are subjected to brutal and relentless persecution." The report alleged that "'Christian cemeteries have been destroyed, monasteries have had their telephone lines cut, and there have been break-ins to convents.'" Moreover, the Palestinian Authority "has taken control of the churches and is pressuring Christian leaders to serve as mouthpieces for Yasser Arafat and opponents of Israel."



A month later, Congressman J. C. Watts (R., Okla.) reiterated these charges in the Washington Times, blaming Arafat and the PA for the Christian exodus from the Holy Land and calling into question the $307 million in grants the U.S. has given the PA.



Palestinian Christian leaders were quick to respond. Said Bethlehem mayor Hanna Nasser, a Christian: "Our churches have complete freedom, and I've never heard that they've been under pressure." Mitri Raheb, pastor of Bethlehem's Lutheran church, challenged the Israeli report as pure propaganda. He noted that while Bethlehem was under Israeli occupation, his house had been robbed and his car stolen twice; but "there have been no robberies since the Palestinian Authority has taken over. On the contrary, there is a greater sense of security now than there was under occupation."



Last May, Evangelicals for Middle East Understanding and Open Doors International sent a 14-member team to the Holy Land to investigate the allegations of persecution. The delegation interviewed more than 60 spokespersons in Israel and the Palestinian territories, including a number of Christian leaders; Uri Mor, director of the Israeli Ministry of Religious Affairs in the Department of Christian Communities; and several Christian Zionist leaders.



The delegation concluded that though there were isolated incidents of discrimination and increased tension between Christian and Muslim communities in certain areas, there were no cases that could be characterized as persecution in the territories under the Palestinian Authority. Four converts from Islam to Christianity had experienced pressure from their families and communities. One or two who had criminal backgrounds had been pressured by the PA. But in neither case could the context and reasons for the pressure be construed as persecution. Furthermore, though some Christian Palestinians are concerned that if Islamic law (Shari'a) becomes the law of the Palestinian areas, the religious freedom of Christians may be restricted in the future, there is at present no evidence of such a development.



The investigative team found "disturbing indications of political motivations behind [the] recent publicity about Christian persecution." The team learned that a Christian Zionist group, the International Christian Embassy--Jerusalem, had cooperated with the office of David Bar-Ilan, Netanyahu's chief spokesman, in exaggerating accounts of Christian persecution and circulating them to the international press. A staff member of the U.S. consulate in Jerusalem interviewed Mor, the Israeli religious affairs official, who stated that the report was intended to be an internal document, but Bar-Ilan's office leaked it to the Christian and secular media.



Asked why the prime minister's office would do such a thing, Mor noted that Bar-Ilan uses such information as his "bread and butter" in the Israeli propaganda war against the PA. Clearly, there was no attempt by either the Israeli government or the Christian Embassy to note the criminal status of some claiming to be persecuted, or to distinguish between persecution and understandable pressure from families or communities opposing a member's conversion to another faith.



It is true that Palestinian Christians are leaving the Holy Land. But it is not because of Muslim persecution. They are leaving because of the brutality of Israeli occupation and because Israel's resistance to negotiating a just peace with the Palestinians makes them despair about the future.



At this juncture, it appears that the hard-line Likud position has the backing of both houses of Congress, the major Jewish lobbies, and the Christian Right. President Clinton and those who advocate the Israeli Labor Party peace formula, or the Oslo Accords, have little leverage with Likud. Palestinian Christians and their supporters fear that the Christian Right's alliance with Likud may in the end serve as a self-fulfilling prophecy, heightening tensions in the region and leading to a new round of conflict in the Holy Land, which the Christian Zionists will readily interpret as "the final battle."





Live and Let Die: Changing Attitudes

 America's support for "the right to die" increased significantly during the 1980s, according to data in the National Opinion Research Center's annual General Social Survey. Though most Christian denominations oppose the idea, support has grown in all religious segments of society. Even most "fundamentalists" now agree that "doctors should be allowed by law to end the patient's life by some painless means if the patient and his family request it." And it is only among the fundamentalists that a majority does not support the notion that "a person has the right to end his or her own life if the person has an incurable disease."

Surveys measure attitudes; they do not determine moral norms. One does not arrive at ethical values by counting noses. To report these findings is not to endorse them. Moreover, it should be noted that the responses do not necessarily indicate that the respondents would personally follow such courses of action. Nor do the responses guarantee support for specific legislation. They do, however, represent a dramatic social change in a brief period of time.

In the late 1970s three out of five Americans endorsed the right to die and two out of five the right to suicide. By the late 1980s almost three-quarters supported the right to die and almost half the right to suicide. Attitudes toward the right to die vary with age and education, with younger and better-educated respondents more likely to support it. The education factor largely reflects age, however, given that younger people have higher educational attainments than older people. Nor does age account for the change in attitudes, for the increase in support occurs at all age levels. Moreover, it has occurred in every segment of religious life. Jews and liberal Protestants are most likely to support the right to die and Catholics and fundamentalists least likely. But all groups, including moderate Protestants, show an increase in support both for the right to die and the right to suicide. By the late '80s half the fundamentalists supported the right to die and two out of five the right to suicide, and seven out of ten Catholics supported the right to die and almost half the right to suicide.

Only among Jews and only on the right to suicide is there little change: from 80 percent support at the end of the '70s to 82 percent at the end of the '80s. Perhaps that figure is as high as such support can go. It may also represent where the rest of the population will be at the end of this decade, given the rate of change in the past decade.

These changes do not represent the replacement of older people with more conservative ideas by younger people with more liberal ideas. Responses from 1977 and 1978 and from 1988 to 1990 were pooled to provide enough respondents from the various cohorts at two points in time. During the decade each cohort born since the 1920s has increased its support for the right to die. The change then is not only populationwide but among all major age cohorts.

This change is not the result of an increase in support for abortion during the same period. While there is, as one might expect, a correlation between right-to-die attitudes and abortion attitudes, there has been no statistically significant change in responses to NORC's battery of questions about the legality of abortion. Nor has there been a change in support for suicide in cases that do not involve an incurable disease. When the motive for suicide concerns bankruptcy, dishonor or depression, fewer than one of ten Americans approves.

Among liberal Protestant, moderate Protestant, fundamentalist and Catholic Christians, approval of the right to die increased approximately ten percentage points in the decade. On the right to suicide the approval rate increased ten percentage points for liberals and between 15 and 20 percentage points for the other three groups.

This drastic change has taken place in a decade in which there was no decline in belief in God or belief in life after death or the inspiration of the Bible or in church attendance, and a statistically significant increase in the frequency of prayer.

How can one account for such a shift? Survey data suggest that religious practice and religious dogma are less likely to affect attitudes on the right to die than moralistic styles and fundamentalist orientations. In my research I have been interested in the impact of religious imagery, particularly the difference between regarding God as a "spouse" and regarding God as a "master." The image of God as spouse appears more frequently as support for the right to die increases both among those who picture themselves as "very close to God" and those who do not see themselves that way. The notion that morality is a personal matter does not shape attitudes on the right to die among those who picture God as "spouse." But among those who picture God as "master," those opposed to the idea that morality is a personal matter show diminished support for the right to die. Support for the right to die rises 30 percentage points for those who picture God as a "spouse" and have not tried to convert anyone else to Jesus, as opposed to those who picture God as a "master" and have tried to make converts. Religious imagery therefore makes a big difference.

Might it be that those who picture God as a "spouse" are less likely to try to impose moral choices by law, preferring to leave moral judgment to heaven rather than to the courts? If this speculation has any validity, the dramatic change in attitudes toward the right to die may be part of a more general drift since World War II toward greater tolerance, as evidenced by an increase in respect for the civil liberties of "deviants" of both the "left" and the "right." Changes in attitudes toward the right to die may simply be the result of an increased willingness of Americans to let people do their own thing, even if that is hooking oneself to a suicide machine.

There is some confirmation for this explanation in the NORC data. The General Social Survey asks about attitudes toward the civil liberties of communists, atheists, homosexuals, racists and militarists: about their right to speak in a community, to teach in a college or university, and to have their books available in a public library. Support for the rights of these groups has increased through the 19 years of the survey. In the late '70s 55 percent of Americans would allow books by homosexual authors in public libraries, 62 percent supported gays' right to lecture in the community and 49 percent supported their right to teach in a college. While there has been little change in American attitudes on the morality of homosexuality (a little less than two-thirds think it is always wrong), there has been a shift in attitudes toward the civil liberties of gay people: 63 percent now would allow their books in libraries, 75 percent support their speaking in the community and 63 percent support their teaching in a college.

Thus there are reasons for thinking that the shift in attitudes toward the right to die is to some extent attributable not to a direct change in moral views but to a change in people's tolerance for others' moral views or moral situations. It is tolerable, Americans seem to be saying, that others do their own thing even if their own thing is the termination of life in the context of an incurable disease.

 

The Wounded Self: The Religious Meaning of Mental Suffering

We usually speak of mental illness when the dynamic synthesis of diverse and often opposing forces which constitutes the self becomes seriously unbalanced. But such a classification is misleadingly selective, for the commonly recognized states of self-estrangement are certainly not the only possible ones, perhaps not even the most important ones.

Through the very awareness of their emotional instability the mentally ill are often more in touch with their real selves than are many well-adjusted persons. The forgetfulness of the self in routine work drudgingly performed, in conventional values unquestioningly accepted, in patterns of thought inherited but never interiorized poses an equally serious threat to any authentic self-realization. A person may spend a lifetime in such a stable but closed universe without ever approximating genuine selfhood. If despair means lack of possibility, as Kierkegaard thought, then the spiritually obtuse live all the more in despair as they are less aware of it.

The mentally diseased often find themselves at the opposite extreme. Oppressed by the necessity to choose between multiple possibilities, they find it hard to accept any particular determination of their selfhood at all. In his moving Images of Hope (Helicon, 1966) William F. Lynch has shown how a sense of endlessness is the main source of despair in the mentally ill. "Perhaps this experience of the terror of endlessness occurs in heightened form in men of achievement and in the ill" (p. 74). The gifted and the sick (and the privilege of the former is often paid for by the curse of the latter) share a high sensitivity to the self’s infinite potential, but also a greater difficulty in achieving concrete selfhood.

This sensitivity may move some to high spiritual accomplishments, while others may be crushed by it, lapse into total insensitivity or adopt an inauthentic, presumably less demanding self. Both groups perceive how limited decisions slowly seal an unchangeable fate. Like the Greeks who equated destiny with character, persons afraid of their own possibilities tragically realize to what extent personal disposition may hold the seeds of their downfall.

Duality and Schizophrenia

The condition of the mentally ill should be particularly instructive to those who are "healthy," reminding them of the precarious complexity of inner selfhood. Or, in Lynch’s Platonic metaphor, the sick are ourselves, writ out in larger letters. Yet society appears to have little use for this object lesson and prefers to seclude the mental patient as a nonperson with whom society’s other members share no common destiny. Our way of treating the ill reveals our fear of entering into the murky depths of selfhood. To regard the mentally diseased merely as "cases" for psychopathology is a convenient way of escaping their message about the fragile condition of the mind as such. In selfhood the primary distinction is not the one that divides the sick from the well, but that which separates the developed from the undeveloped. The more complex the self, the more refined its awareness of itself, the more imminent the threat of mental disorder.

Existential psychiatry has revealed the universal significance of states of mind that we have all too lightly brushed aside as "sick." Schizophrenia is inherent in the nature of a self that is both interior to itself and exterior to others. In its essential duality the self is constantly torn between the mode in which it knows itself and the one in which others envision it. The schizoid person is uncommonly aware of this ontopathic condition. The inner self feels threatened by the outsider’s look. To safeguard that self, one hides it and tries to conform entirely to the way one imagines the outer self appears to others. One avoids being in order to escape self-destruction. In The Divided Self, R. D. Laing discloses the ontological significance of the schizoid attitude:

Thus, to forego one’s autonomy becomes the means of secretly safeguarding it; to play possum, to feign death, becomes a means of preserving one’s aliveness. To turn oneself into a stone becomes a way of not being turned into a stone by someone else. . . . The individual’s actions are not felt as expressions of his self. His actions . . . which I have proposed to call his false-self system, become dissociated and partly autonomous. The self is not felt to participate in the doings of the false self or selves, and all its or their actions are felt to be increasingly false and futile. The self, on the other hand, shut up with itself, regards itself as the "true" self and the persona as false [The Divided Self (Penguin, 1965), pp. 51, 74].

In severing appearance from reality the self creates an outward "persona" (in the original sense of "mask") which no longer reflects the inner self and which may eventually replace it. The cure of this illness, if there be any, cannot consist in a return to the "average" (which drove the self into hiding in the first place) but in the achievement of a new synthesis of selfhood. To the extent that they remain aware of their predicament, mental patients may be in a better position to gain authentic selfhood than "normal" persons, especially in a society that equates normality with hiding the inner self.

Self and Society

As a therapist Laing concentrates mostly on the social conditions that unquestionably define the concrete form of the conflict in our present culture. But his interpretation is clearly ontological; the whole process occurs as a battle between the powers of being and those of nonbeing. The alienated condition reveals a tension in the self as such, rather than in a particular state of society. We first need to understand the nature of selfhood if we are to determine which social and cultural factors obstruct its development. We then find that any society that denies the individual the possibility of spiritual growth and freedom estranges one from oneself, regardless of its material conditions and educational maturity.

Since sensitive persons are most likely to suffer from spiritual privation, a society’s state of mental health is also an index of its spiritual well-being. In a memorable essay Georg Simmel has made this point with all desirable clarity:

The ripening and the proving of man’s spiritual powers may be accomplished through individual tasks and interests; yet somehow, beneath or above, there stands the demand that through all of these tasks and interests a transcendent promise should be fulfilled, that all individual expressions should appear only as a multitude of ways by which the spiritual life comes to itself. This demand expresses a metaphysical condition of our partial and emotional existence, however remote it may seem from our real life in the world. It symbolizes a unity which is not simply a formal bond that circumscribes the unfolding of individual powers in an always equal manner, but rather a process of unified development which all individuals go through together ["On the Concept and the Tragedy of Culture," by Georg Simmel, in The Conflict in Modern Culture, translated by Peter Etzkorn (Teachers College Press, 1968), p. 28].

Most social-economic theories of alienation one-sidedly ascribe the individual’s state of mind to external conditions, while the positive or negative quality of those conditions depends upon the mind’s own need to create a spiritual environment and must, in the final analysis, be judged by their success or failure to fulfill this need.

Religion’s Need for Alienation

All faiths share a close interest in the various forms of self-alienation. They keep the believer constantly aware of the alarming facility with which the self may disturb or destroy its own fragile synthesis. Their interest is not fortuitous, for the salvation which religion promises presupposes an unsatisfactory state of being that must be remedied. Indeed, religion stands so badly in need of that preliminary feeling of alienation that one may well wonder whether it does not itself create the very condition which it wants to remedy. In any event, while most men and women feel only intermittently, if ever, estranged from themselves, faith thrives on feelings of alienation and takes as its first task to boost them where it finds such feelings insufficiently present. William E. Hocking perceptively observed:

Religion is often described as the healing of an alienation which has opened between man and his world: this is true; but we may not forget that it is religion which has brought about that alienation. Religion is the healing of a breach which religion itself has made, and if we could reach the original sources we must find them in man’s awareness of an Other than himself [The Meaning of God in Human Experience (1912), by William E. Hocking (Yale University Press, 1962), p. 238].

Everywhere and at all times religion has taken healing to be one of its principal objectives. Though today no major faith would regard itself as a substitute for medicine, all of them continue to stress faith’s medicinal quality and most have spawned healing sects. Thus Christian Science may be alone in emphasizing the healing quality of Christianity beyond all other aspects, but this quality is closer to the core of the Christian faith than most Christians care to admit.

The confrontation with mental sickness has brought out the best and the worst in religion. To eschatological faiths such as Judaism and Christianity, the triumph of good over evil is an ultimate ideal. Yet impatient for this ideal, faith may attempt to hasten its coming by witch-hunting, heretic-burning and religious persecution. Evil then is elevated to an obsessional object of negative worship, an antigod to be exorcised at any cost. To this mode of thinking nothing more clearly manifests the presence of the Evil One than mental sickness and emotional instability. Naturally inclined to regard mental illness as the work of strange spirits, humanity has been all too ready to interpret it as a symptom of satanic possession. The victims often willingly accepted the part forced upon them by their persecutors in order to secure for themselves a negative identity when society denied them a positive one.

The main quality inherent in such a treatment of the insane is fear -- the fear of a mysterious evil which is ultimately a fear for one’s identity. By exorcising or imprisoning those who are different, people try to protect themselves against the threat to their own identity. But all that is salvaged by such an attitude is the shell that closes off the possibility of authentic selfhood.

Nor must we assume that our enlightened times have replaced the behavior of the "dark ages" with open-mindedness. In no previous epoch have the mentally sick been more carefully secluded from society. Whereas nonaggressive patients once remained in the family or moved into another normal community specially equipped for adopting a certain number of the mentally ill, today we lock them up except when practical obstacles (mainly financial) discourage us from doing so.

Healing the Human Condition

Despite outbursts of fanaticism, the Christian faith has throughout its history maintained a deep humanitarianism sorely missing in today’s secular views. I cannot but recall my early experiences as a high school student in the Flemish town of Geel. A pilgrimage place for those suffering nervous disorders and an open colony for the insane since the 13th century, the town has traditionally accommodated the patients in local families and allowed them to partake of almost all aspects of its daily life. Everywhere one encounters them working in the fields, doing errands, worshiping in church, attending children or, if they are children themselves, playing or going to school.

I cannot imagine a more humane attempt to alleviate an unparalleled variety of mental misery. The religious origins (preserved in the legend of a chaste Irish princess who was beheaded at this location by her demented father) are by no means incidental to the nature of the arrangement. For it is inspired by a vision of the sick as men and women who in their suffering were chosen to represent the universal alienation of all humankind. Mental degradation symbolizes the human condition in its fundamental need of redemption.

This attitude, I believe, has its roots in the profound religious concept of human alienation. The existence of religious health resorts since antiquity, the expulsion of "evil spirits" in the Gospel, sacramental prayer over the sick -- those practices signify far more than a surpassed stage in the history of medicine. They symbolize that salvation itself is healing and that all actual healing is part of the redemptive process.

This connection is particularly valid for mental sickness. When confronted with the mentally ill, Jesus does not elicit faith before curing them, as he is wont to do for other diseases, but first "liberates" them. For the mental patient is a captive, closed up in an unreal self and, like the deaf-mutes of the Gospel, unable to listen as well as to speak. He is in despair if despair means, as Kierkegaard thought, shut-upness. Unable to relate, vertically and horizontally, he cannot reach out and be in touch with his own transcendence. He needs to be cured before anything else, and the primary religious assistance consists in healing, whether the healer be a minister, a physician or a counselor.

The Solitude of Suffering

In regarding mental illness as symptomatic of the diseased general condition of humankind, faith relativizes the simplistically absolute distinction between the sick and the well, and calls attention to a deeper, less obvious level of selfhood. In the eyes of faith we are all diseased, and salvation must be to all what is so evidently needed by some -- a healing of the self. Yet religious healing never simply coincides with an ordinary cure, even for those whose physical or mental health is fully restored.

If the true disease lies beyond the obvious illness, the cure also must reach beyond the symptoms. Mental suffering itself then is assigned its unique place in the all-comprehensive plan of salvation. It creates the solitude in which alone persons may find genuine transcendence. Only in suffering -- and all suffering, whatever its origin, in the end is mental -- am I most totally alone. My suffering is exclusively mine: it bears my name as no other experience does, because it isolates me from others. Only this painful isolation gives access to the bottom depth of selfhood -- the locus of transcendence -- which remains unsuspectedly hidden to the untroubled mind.

This is the meaning of Kierkegaard’s brutal saying that most people do not feel sick enough to accept faith. His own mental disarray, culminating in the breach of his engagement, illustrates the existential significance of suffering as presupposed by the religious doctrine of redemption. In a short entry in his diary Kierkegaard describes how he experienced the complex dialectic of suffering in his relation to Regine Olsen:

My greatest pleasure would have been to marry the girl to whom I was engaged; God knows how much I wanted to: but here again is my wretchedness. And so I remained unmarried, and so I had the opportunity of thinking over what Christianity really meant by praising the unmarried state [Søren Kierkegaard’s Papirer, edited by P. A. Heiberg, V. Kuhr and E. Torsting (Copenhagen, 1909-48), X2, A 61. Alexander Dru, The Journals (New York, 1938), #970].

Only suffering can convey the feeling of insufficiency, without which the human person experiences no need of salvation. The condition of the mentally ill keeps alive the awareness both of the self’s hidden depth and of its insufficiency. It is an object lesson that aids us in finding meaning for our most private and most painful experiences. The present trend is to ignore that lesson and to remove its inconvenient reminders as far as possible from the daily traffic of life.

In the same spirit modern thinking tends to regard a person’s religion as the sum total of his or her mental inadequacies. This view may contain more truth than its contemptuous holders suspect. For without an awareness of basic inadequacy no genuine awareness of transcendence can exist. Yet it loses sight of the fact that religion is not a desperate attempt to cope with matters that cannot be handled otherwise, but a positive vision in which personal suffering is needed for spiritual growth.

Here it appears clearly how the alienation of which faith speaks so often (in its doctrines of sinfulness, of the fall, of redemption) is more than a mere interpretation of experiences that allow a different one. Alienation is first and foremost the awareness of a fact without which religion and its interpretations could not exist. Far from inventing or coloring a particular experience, religion gives expression to a primary experience. It still may well be the most comprehensive expression, because faith alone has sounded the full depth of the experience.

Though we are all acquainted with suffering and feelings of self-estrangement, our ordinary objective ways of articulating them make it easier to suppress than to express the experience. We talk in order to classify in a universal, objective scheme, as a controllable "problem," that which is neither universal nor controllable. Suffering remains strictly each person’s own experience which he or she is unable to share with others. Putting a common label on mental anguish is necessary for medical treatment, but is misleading if we expect the label to reveal the true "nature" of the experience. For suffering possesses none but a private nature, which means no "nature" at all. It speaks out of the most intimate privacy of my selfhood.

To be sure, religious terms are abused as commonly as medical ones. Such phrases as "sinfulness," "insufficiency" or "the need of salvation" may be just as impersonal as psychological ones, and a good deal vaguer. It cannot be our intention here to pit one against the other, but rather to draw attention to the fundamental experience expressed, however imperfectly, in religious doctrines. What those doctrines mean is more than a universal description of the actual state of the world: they describe the most private and most fundamental awareness of insufficiency. As such they reach to the heart of the self.

Seeking Christian Interiority: An Interview with Louis Dupré

You have said that it’s difficult to be Christian in our age. But hasn’t it always been difficult to be a Christian? Why specifically is being a Christian difficult in our time?

Culture as a whole has become secular in a way that it has never been before. One may plausibly argue that the 18th century was the first non-Christian century. Most leading thinkers and artists, even if they were not opposed to Christianity, ceased to take their inspiration from it; secularization became dominant. Still, even at that time Western culture was so penetrated by Christian values and ideas that one might mistake entire passages of Voltaire or Diderot as having been written by believing Christians. Eighteenth-century culture was still steeped in a tradition that had been Christian since its beginning, and it was extremely difficult for these thinkers to free themselves from a language saturated with religion. The 19th century was different. It was an epoch marked by a virulent antitheistic campaign to clean the cultural slate of all Christian traces. Yet these attacks were the work of an elite; culture at large retained distinct remnants of its Christian roots.

Even today ties still exist between Christianity and culture in Europe and more so in the U.S. But on a more fundamental level the West appears to have said its definitive farewell to a Christian culture. Little of the old hostility remains. Our secular colleagues are happy to recognize the debt our civilization owes to the Christian faith to the extent that the faith, having been absorbed by culture itself, has become simply another cultural artifact. Christianity has become an historical factor subservient to a secular culture rather than functioning as the creative power it once was. The new attitude of benign atheism was, I think, prepared in the late 19th and the early 20th centuries by the three most prominent secularizers of the time, Marx, Freud and Nietzsche.

Why single them out? How did they differ from the earlier atheists you mention?

For Marx, Freud and Nietzsche, the idea of forcibly eradicating religion had become unnecessary. Religion for them was a passing symptom that was rapidly vanishing by itself. Already Marx had moved beyond the idea of atheism as a mere assertion of the unreality of God. For Marx, concentrating on atheism distracts us from the positive task of liberating humanity from social oppression. Lenin’s active atheism, in which he used the state to try to destroy religion, is actually a fallback to earlier attitudes about religion. Freud admitted that no one can be forced not to believe. But as rational thought shows nothing in favor of religion and everything against it, to persist in a faith because no argument can decisively refute it is for Freud the sign of a lazy mind. Nietzsche preached a spiritual gospel, a new religion without God, beyond Christianity and atheism, that could still learn much from the old faiths.

Moving further in that direction, contemporary secular culture, especially in its communications media, shows a surprising openness toward religion. But little suggests that this interest surpasses the purely horizontal cultural level. Culture itself has become the real religion of our time, and it has absorbed all other religion as a subordinate part of itself. It even offers some of the emotional benefits of religion, without exacting the high price faith demands. We have all become atheists, not in the hostile, antireligious sense of an earlier age, but in the sense that God no longer matters absolutely in our closed world, if God matters at all.

Why should the secularism of our time pose a more serious challenge to Christianity than the determined antitheism of the past?

Because religion in the 20th century has ceased to integrate public life altogether. By its very nature faith must integrate all other elements of life if it is to survive. Faith cannot simply remain one discrete part of life. My own writing about religion grew out of the fundamental question raised by the new situation: Is religion something that may or may not be very important to humans, or must it in some way integrate all other aspects of existence? I came to the conclusion that if it isn’t somehow everything, it’s nothing.

All societies, even the religious ones of the high Middle Ages or of Calvin’s Geneva or of the Puritan pilgrims, distinguish between sacred and profane. But religion must in some way integrate the profane with the sacred. Obviously, Christianity no longer plays an integrating role in the life of modern societies. Certainly for most people in the West, especially in Western Europe, it has lost its creative, formative power. Christianity has become simply one element of civilization among many others, and by no means the most important. So here lies Christianity’s present predicament. In the past religious integration was handed down by a tradition. But that tradition itself has lost its authority in the eyes of our contemporaries, including most believers. What then ought the Christian to do to survive as a genuine religious believer? I see no alternative but that he or she must now personally integrate what tradition did in the past. Nothing in culture today compels our contemporaries to embrace a religious faith. If they do, they alone are responsible for allowing their faith to incorporate all aspects of their existence. Hence the vital importance of a spiritual life.

The word spiritual is used liberally these days. What do you mean by spiritual life?

A religious life built upon an attitude of personal response to the call of the divine. Such an attitude originates within the self; it is not derived from the force of inherited habits nor from people’s tendency to yield to social pressure. To attain the religious life the believer must be alert to the inner voice. How essential such an attitude has become is evident in light of the massive apostasies that occurred in the 1970s and ‘80s (and continue in Western Europe) when the social pressure in support of religion suddenly seemed to have lifted. Because their faith lacked roots in either society or the self, people simply abandoned it.

Of course, the term "spiritual" is not exclusively religious. We speak of the spiritual in art and literature. In 1911 the painter Kandinsky wrote an influential work, Concerning the Spiritual in Art. We tend to think of El Greco and Rothko as "spiritual painters," without necessarily considering them religious. But in all these meanings the term refers to a more intense inner awareness of what surpasses ordinary life. The alertness is what counts.

In the frequently quoted conclusion of his book After Virtue, Alasdair MacIntyre claims we must choose between Nietzsche and St. Benedict—between the world of the autonomous individual and the world of the individual in voluntary submission to a community of religious practice. You seem to be describing a third alternative.

Only in this respect: before Benedict came Augustine, the master of interior life. Without one who taught how to reintegrate the fragments of a broken civilization from within, the retreat to the monastery might easily have turned into mere flight. But in part because of Augustine the Benedictine monastery became the heart of medieval culture, the spiritual force that motivated the building of a new civilization.

So in the first place—prior in some sense to submission to a community of religious practice—the new integration you look for requires an inner life?

Exactly. I am reminded of Karl Rahner’s remark that Christianity in the future will be mystical or it will not be at all. That expression may seem too strong—but only if we think the mystical is the exceptional rather than a spiritual experience that belongs to the essence of religion and remains accessible to all believers.

In your opinion, then, the religious problem of our time involves changing individual hearts so that individuals are able to hear and to see religiously.

Yes, and that is not easy, because it implies confronting each person with his or her unique responsibility to decide on a personal attitude toward existence instead of having it conveyed by society or inherited from ancestors. Each person must find his or her own way in the world. This becoming a Christian "from within" is a daunting task. But I can think of no other that would contribute more to the integration of our culture at the present period. We experience our culture as fragmented; we live on bits of meaning and lack the overall vision that holds them together in a whole.

Some postmodern writers pride themselves on the liberating absence of a defining unity. But most of us feel lost in a disconnected universe. We may feel attracted to noncommittal open-endedness. But to survive as human beings we need some coherent meaning in our lives. This may be one of the reasons why the integrated Christian culture of the past has suddenly become so attractive to many of our contemporaries. They feel that the fragments of meaning present to us must somehow be united in a manner that modern culture fails to accomplish. Hence they turn to models from the past. Some join ultraconservative religious or political movements, or they lose themselves in the mystics of earlier times as if no cultural distance separated us from the past. Such complete reversals that attempt to abolish modern life are, I think, inauthentic ways of trying to achieve the integration our time needs.

Many Christians (I am one of them) may feel nostalgic for a culture that is more God-oriented than ours, but this religious nostalgia must not be allowed to fly us on a magic carpet to a mystical fata morgana. Nor must the need for integration seduce us to reinvent a Christian "tradition" (mostly intended for "the masses") for social or political purposes, as some social theorists do today in America. They consider religion essential to cultural integration, but their primary concern is not with the truth of faith but with the order of society.

To return to Augustine, what can we learn from him? In contrast with us Augustine lived in a world rich with religious meaning.

Augustine’s world was more religious than ours, certainly. But he shared our predicament of living in a world in which traditional values have collapsed. It’s hard for us to imagine what the end of the Roman Empire must have meant for its citizens. The central questions Augustine addressed in The City of God, questions of absolute moment to him and to his contemporaries, were: What or who is responsible for the end of civilization? What can we do about it? In response to these questions Augustine developed a remarkable form of what I would cautiously call a Christian humanism.

Over the course of his writing life, Augustine combined a number of elements from his fragmented culture—Neoplatonic philosophy, Roman civic morality, the heritage of the great Roman poets, Manichaeism—with his dominant but open-ended Christian faith, into a new synthesis. At first these various components merely formed his own interior life; later they became the seeds of a new culture. But his first concern was to respond to an extremely personal demand. The story of his life in The Confessions is not one of self-actualization—it’s not the record of what we now would call a "life project." It presents his extremely active life as a constant response to what he sensed to be a divine initiative. The Confessions record a dialogue with transcendence in which the first word comes always from the other side. There may be a lesson in this for our own reform plans which all too often—also in religion!—assume the form of a "project."

But Christianity was a new religious force in Augustine’s day. Today, as you say, its power to integrate culture has all but disappeared. Does Christianity still have the capacity to renew?

On a personal level, yes, and through a personal renewal it may spread to small communities which in turn could affect the entire culture. But the time of the res publica christiana—of what some would call Christendom—is past. Both the secularization of the West and the revolution in communications have converted our society into an intrinsically pluralist one. I expect it to remain so for any foreseeable time. But Christianity has always started with a personal conversion of the heart. The case of Augustine reminds us of this primary fact. At a time when culture has become shattered, we, like Augustine, are forced to rebuild it from within. In an odd sort of way, our culture may even be said to foster a move toward such personal renewal by its overwhelming sense of emptiness and its desperate search for a soul. This emptiness itself favors a new openness to transcendence. Our contemporaries experience an intense need for self-integration. (Hence the enormous success of psychoanalysis and methods of self-actualization for other than purely therapeutic purposes.) Whether this need will eventually result in a reintegration on a cultural level I do not know, and I am not overly optimistic about that possibility. But in the end, that is not the Christian’s primary concern.

I do not want this last point to be misunderstood. I am not advocating an interiority that isolates the individual or the Christian community from contemporary culture. I am defending an integral and all-integrating Christian humanism, but one that derives its inspiration from within. Christianity has no right to seclude itself from society. Even the contemplative is responsible for the civilization in which he or she lives. By its very nature spiritual life is transformative of all life. The spiritual Christian belongs to a community. That community is, quite naturally, in the first place one of like-minded and of potentially like-minded persons. It includes one’s church but also that hidden yet intimate communion of spiritual men and women of other faiths. Christians living in an inevitably pluralist society have an obligation to acquaint themselves seriously, without giving in to syncretist tendencies—another symptom of the fragmentation of our time—with the presence of the Spirit in other faiths. They must move to a union beyond tolerance and even, as one spiritual thinker has put it, "beyond dialogue."

In addition, Christians are also responsible for the culture in which they live, however unlike-minded it may be. A genuine Christian interiority must provide the inspiration for a humanism capable of living a vigorous, free and open life within one’s culture, whatever its condition may be. I see no conflict between an interior life and an integral humanism that embraces, from whatever source it may come, "all that is true, all that is noble, all that is just and pure, all that is lovable and gracious," as Philippians says. The spiritual Christian is not involved in constant polemics with the surrounding secular world. Since that person’s force and strength come from within, he or she can grant society and culture their full autonomy.

It may strike some as odd to hear you speak of the creative possibilities of emptiness and need. Those who admit to a fundamental emptiness in life are usually speaking out of despair, not hope. How is emptiness both absence and religious opening?

Indeed, many people never experience any emptiness at all: they keep themselves too busy to experience much absence of any kind. Yet occasionally some unexpected event may strike them with a numbing sense of the meaninglessness of existence. Suddenly they experience the anguish of sinking away into a life devoid of a solid bottom. The author of Psalm 18 expresses this experience when he writes, "The sorrows of death compassed me ... The sorrows of hell compassed me about: the snares of death prevented me." At such moments some may experience a mere absence, with no possible alternative. Others may perceive it as an appeal from the abyss, a call for meaning in the midst of despair. It is worth noting that some contemporary literature has made more of this human experience than theology has. Until recently theology often tended to be a self-supporting system that instead of responding to this existential anguish screened it off by its presumed theoretical sufficiency.

On the other side, it is precisely this sense of emptiness that accounts for the strange attraction mystical literature holds for our contemporaries. The mystics also convey a sense of emptiness not entirely unlike the one we experience about existence. For them the absence of God, or what John of the Cross called "the night of the spirit," was an experience as acute and painful as what the a-theism of modern culture evokes in us.

But for John of the Cross the dark night of the spirit arrives at the end of a period of absolute self-abnegation. Writers such as John consider it an advanced stage in one’s personal relationship with God. Are you saying that, culturally, we are offered a shortcut?

No, there is no more than a remote analogy between the modern experience of emptiness and the intense longing for the God whom the mystic knows to be present though totally absent from experience. The tendency to compare our poverty to mystical riches has been one of the illusions of a time that reduces all differences to our own measure.

I wonder whether the situation that you describe is unique to our time. Do we find parallels in the Bible, for example?

Not exact ones, of course. But biblical writers have been so outspoken in expressing their despair, their abandonment by God, their emptiness, that their words have lent a voice to distressed Jews and Christians of all times. They continue to do so in our own time, however unprecedented our condition may seem.

In fact, the Bible has assumed a unique significance for men and women living in an age of change and confusion. Ideas that previous generations held to be unshakable have turned out to be very shakable indeed, and those still committed to Christian faith are often at a loss as to what precisely it is they ought to believe. In their perplexity they still may turn to the Word. Even when we have no more religious words of our own, the ones on which our faith rests remain with us. No advanced biblical criticism is needed to let these words speak and to give voice to our own feelings of joy or sadness and even of despair. They translate for us what otherwise might remain unexpressed or constricted within the all too narrow limits of private needs and feelings.

The late Henri Nouwen always had the Psalms nearby to bring "before God" the passing moods and attitudes of the day. Another friend, when reading of the horrors occurring in Bosnia-Herzegovina, could only vent his emotion by turning to the Book of Lamentations. Who has not felt at some time the hopelessness voiced in Psalm 3: "How are they increased that trouble me. Many are they that rise up against me. Many there be which say of my soul, there is no help for him in God"? Or when coming out of despair who cannot identify with the words of Psalm 27: "The Lord is my light and my salvation, whom shall I fear?... Though a host should encamp against me, my heart shall not fear... "?

Spiritual life, as Bishop Joseph Butler knew, rests entirely on analogies. The Bible provides the analogies that enable the believer to convey meaning to private experience. But the Word will extend religious comfort only if we allow it to speak in its own name. The first lesson to learn in a time of need is that of listening. Only when we attentively heed the Word can it lift us beyond ourselves and convey divine meaning to private sorrows.

You speak about a kind of listening to the Word that does not require advanced biblical criticism. But many Christians experience serious difficulties with such an immediate encounter with stories and claims—the specific words—of a faith that originated in a remote past in a cultural context that appears so removed from today’s reality. In many ways it seems easier to be religious in a general sort of way rather than believing according to the specifics of a particular historical faith.

That has indeed become a major problem for our contemporaries. I would attribute it in large part to an exclusive and mistaken literalism in our encounter with the sources of revelation. One of the more ominous signs of the spiritual impoverishment of our time is that believers have lost much of the sensitivity needed to perceive the symbolic within the literal. They tend to oppose one to the other: events and words are either symbolic or they are literal. But such a disjunction is fatal. The purely literal reading deprives the paradigmatic events of our faith of their enduring redemptive significance today and reduces an historical religion, such as ours is, to a mere memory. A purely symbolic reading weakens historical events and words to the point where they become simply occasions for creating new symbols for our own age. Many contemporaries caught between the horns of this false dilemma flee their historical faith to take refuge in some kind of abstract deism. But the historical need not be exclusive of the symbolic, and precisely thereby it attains contemporaneity for all times.

An older conception of religious symbols understood them as both concrete, historical realities in their own right and signs referring to an invisible reality. That conception still survives in the Christian theology of the sacraments, according to which ordinary actions become extraordinary in signifying a new conveyance of divine grace. Unfortunately, in their encounter with the equally sacramental words of scripture Christians appear to have largely lost the symbolic meaning. Yet if the words and events of the Gospel narratives are to have more than a historical meaning, subject to the rules of historical criticism, they also must be read as sanctifying symbols that religiously address today’s believer.

The concept of a nonspecific general "religiosity" has proven to be untenable. Religion cannot survive on mere feelings or moral intentions. It needs symbols of transcendence, and symbols are by their very nature specific.

We’ve been speaking about the biblical resources of Christian interiority. But a moment ago you alluded to a likeness in the pursuit of the spiritual life between Christians and members of other faiths. Your view of the spiritual life seems to encourage interfaith encounter.

In our age we have come to understand our faith within the context of the aspirations, desires and needs expressed in so many forms since the beginning of the human race. We have learned to respect these many ways of humankind’s longing for God in the light of our own faith. Some Christians have been inspired to integrate pious attitudes and meditative practices derived from other faiths within their own, without betraying Christianity’s unique identity. In doing so they are following ancient examples. Christians have received far more from the Hebrew mother faith than they are any longer aware of. Also, from the fourth century on, Greek fathers generously borrowed from Neoplatonic speculation to an extent that, via Gregory of Nyssa, Dionysius and Maximus Confessor, late Greek piety has shaped the very nature of Christian mysticism. Why should we then not be allowed, as even the desert fathers were, to borrow meditative exercises that centuries of pre-Christian practice have left us?

In fact, here also the analogy of faith urges us to see the existence of other religions in the light of God’s providence. Buddhist silence may help the Christian in deepening insight into the mystery of the Trinity where the Father is the silent source of the eternal Word. And how could God’s omnipresence in Vedantic Hinduism not remind the Christian of the Spirit, qui replevit orbem terrarum—who fills the entire world? Such analogies cannot be fortuitous to the Christian mind, and we do well to heed them as signs of a divine Providence that, with loving care, rules not only Christians but all humans.

It would be wrong, however, to regard these analogies as justifying a syncretistic relativism that entitles each person to compose his or her own religious collage. This attitude, all too common today, shows a lack of respect not only for one’s own faith but also for those faiths one so casually dismantles for spare parts. It is yet another manifestation of that radical anthropocentrism, the main enemy of sincere religion, that tempts believers to bring the language of transcendence down to the level of purely human wants and choice. Without detracting from the providential nature of other faiths, Christians cannot ignore the fact that this same Providence has led them to a faith that is not a "choice" but, for those chosen to it, an absolute summons. To relativize faith is, I think, to subvert its fundamentally divine character.

You were among the earlier writers who regarded the mystical tradition as a source of religious renewal. Perhaps we should take the opportunity to clarify a basic point. What do you understand by the term mystical?

The word has radically changed its meaning, and the change is related to the modern turn to the self as source of meaning and value. This turn, in many respects, has restricted the scope of religious life. Thus, mystical now tends to refer to the intensely private, mostly exceptional, experience of God’s presence. The original meaning was less restrictive and far less subjective. For Christians of the first three or four centuries it referred to the spiritual meaning "hidden" under the letter of the Bible and, by extension, to the spiritual reality concealed in Christian sacramental symbols. The meaning remained hidden only until the person had been initiated into its deeper sense. Nor was that meaning in any way private. Gradually experience itself (still communal, and not exceptional) assumed greater importance than textual or sacramental meaning. When the individual self came to the fore in the modern age, religious life increasingly came to be interpreted as a private matter. Today we have to live with that somewhat individualist connotation, but there is no need at all for accepting the supposedly exceptional character of the mystical experience; that notion was an 18th-century addition.

I would prefer to consider as mystical all that refers to faith as it directly affects human experience. That includes the common Christian intimation of a divine presence in scripture, religious doctrine, liturgy and nature. To accept this broader meaning would also remove the obstacle that still gives some Protestants—anxious to explore the experience of God—a bad conscience in using a term so connected with a too exclusive spiritual theology. In its more general sense, then, mysticism refers to that disclosure and experience of a higher, purer reality which one gratefully accepts without exercising control over it. All great Christian writers assign considerable significance to such experience, Luther and Calvin as well as Teresa or John of the Cross, and for good reason, for without some experience, however humble, few people would be religious, particularly today now that the social pressure for actively belonging to a religious body has become so much weaker. Even Christians disaffected with church services, confused by doctrine, and unconvinced by some principles of moral guidance often appear somehow to remain acquainted with those delicate disclosures of an invisible, inaudible, mysterious presence.

Such a strong emphasis on personal experience alarms some religious thinkers, yet you seem to hold that such experience means more to the person of our time than it ever did before.

Yes. Precisely because believers are no longer supported in their faith by the community at large, they tend to fall back upon personal experience. Paradoxically, this new prominence of religious experience occurs at a time when that very experience has become both weak and ambiguous in our secular culture—indeed, so ambiguous that many no longer discern it as specifically religious. But religious experience has also become weak. The voice of God is being drowned out by the increasing noise of the ever humming, faxing, ringing world of ubiquitous communications that surrounds us. Even before this century John Henry Newman said in one of his sermons: "He is still here; He still whispers to us, He still makes signs to us. But His voice is too low, and the world’s din is so loud, and His signs are so covert, and the world is so restless, that it is difficult to determine when He addresses us, and what He says." Nonetheless the small, thin voice persists, gently urging many toward a specific faith that, they hope, will yield more abundant and more particular disclosures. The experience that leads to faith and that constantly feeds it is no mere intellectual insight, but neither is it a purely emotional occurrence. Taken in isolation, religious experience would not be faith at all, for faith, certainly the Christian one, implies so much more than experience. Faith can never be satisfied with profound insights and wholesome feelings. It requires an active response, a commitment of the whole person. Nevertheless, experience belongs to the essence of religion, even though it never coincides with it.

How should pastors speak about the Christian faith to their congregations? Where does the mystical path intersect with their responsibilities as ministers of the Word?

To speak of religious experience may lead them into the most dangerous corner of all. The communication of personal feelings, or the appeal to the feelings of the congregation, subjects the divine message to the constantly changing tides of the human heart. The pastor’s task is an objective one: to preach the Word. But what the minister preaches is not merely an objective fact, a moral exhortation or an intellectual doctrine. The Word possesses an inner, radiating beauty that illuminates the objective message and warms the heart of the believer. To bring out this aesthetic quality of the Word requires more than eloquence or a solid acquaintance with theology. It summons the pastor to be a spiritual person, penetrated by the encompassing presence of the Word, by its mysterious force and by its sublime symbols.

The pastor ought to be a person acquainted with that inner silence in which alone the Word can resonate. Pastors should also be capable of detecting the symbolism both of words and of earthly events—the analogies of faith—through which the deeper meaning of the divine mystery discloses itself. This, I suspect, may be far more important than being timely or relevant. Those who attend services come to hear what is different from ordinary life: what is the same they may learn far better from journals at home.

Beyond Neutrality

Over the years, the Supreme Court has had trouble deciding between two competing theories on how the courts should interpret the religion clause of the First Amendment. One strand, which survives in popular discourse and in the courts but is in serious danger of vanishing from the academy, is the view that the state must be neutral, neither choosing among religions nor choosing between religion and nonreligion. Unsurprisingly, this is known in the literature as neutrality. The second strand, thriving in academic discourse but remarkably difficult to express in popular discourse, is that the state must take steps to accommodate religious believers whose practices are burdened by otherwise neutral state laws. This strand, predictably, is known as accommodationism.

I confess that I have somewhat oversimplified a vast and often intricate literature. There are any number of carefully nuanced middle positions. Yet, in the end, nearly all theorists lean toward one of these two positions, simply because legal theory (unlike, say, literary theory) must finally guide the decisions by actual judges of actual cases. Consequently, legal theorists quite understandably feel impelled to tell us in the end how the cases should come out. And, in the end, just about everybody who presents a theory -- no matter what the theory is called -- winds up favoring something that looks an awful lot like either neutrality or accommodationism.

The problem is this: Neutrality is a theory about freedom of religion in a world that does not and cannot actually exist, whereas accommodationism, although a theory about the real world, is not really a theory about freedom of religion. Accommodationism is certainly to be preferred, at least if one takes religion seriously. But one must also be careful. Although a theory of religious freedom is plainly necessary to the proper function of the state (because its courts must decide actual cases), a theory of religious freedom may be unnecessary to, and might even prove dangerous to, religion itself.

Why do I say that neutrality is a theory about a world that does not and cannot exist? Because neutrality supposes that it is possible for the state to act without taking any account of religion generally, or of the specific religious beliefs of constituent groups. No religion is to be favored over any other; nor is religion generally to be favored over nonreligion. The metaphorical wall of separation of church and state (which is only a metaphor, although we sometimes pretend it is part of our constitutional law) seeks to capture this idea: there is the sphere of religion and the sphere of the state, and a mighty wall protecting each from the other.

But none of this is possible.

In the first place, what we are bold to call neutrality means in practice that big religions win and small religions lose. When the Supreme Court decreed in 1988 in Lyng v. Northwest Indian Cemetery Protective Association that three Indian tribes in California could not prevent the Forest Service from allowing road building and logging on their sacred lands, the justices believed, I suppose, that they were acting neutrally: the tribes had no more right than anybody else to protect the lands. But from the point of view of the tribes, whose religious tradition, as the justices admitted, would be "devastated" by the government’s action, there was nothing neutral about the destruction of the forest. With the forest gone, their religion also would be gone: a neutral result without any remotely neutral consequences.

A more powerful religion would not suffer so from neutrality. To take what might seem a painfully obvious example, nobody proposes to build a road through the Cathedral of St. John the Divine in Manhattan. My own Episcopal Church, although our numbers are dwindling, remains sufficiently powerful that nobody would even dream of so absurd a notion. (And, lest you object that the federal government "owns" the national forests, and not the cathedral, let me add that nobody would dare try to take the cathedral through eminent domain either.) To be sure, neutrality governs this result: the cathedral is not safe because it is a religious building; it is safe because it is a building valued by a politically powerful constituent group. But that only illustrates the point. Neutrality is a blueprint for the accidental destruction of religions that lack power.

The reason neutrality fails is that it imagines an impossible world. No true wall of separation is possible. Religion and the state, the two great sources of control all through human history, will never be fully separate from each other. Each will always shade into the other’s sphere. Schoolchildren learn this truth in their science classes: All containers leak. The only interesting question is how fast. In the case of religion and state, the leakage is rapid, and constant. How could matters be otherwise? Religion, by focusing the attention of the believer on the idea of transcendent truth, necessarily changes the person the believer is; which in turn changes the way the believer interacts with the world; which in turn changes political outcomes. Although there have been some clever moves in political philosophy to explain why the religious voice should not be a part of our public debates, such theories wind up describing debates from which deeply religious people are simply absent.

Besides, in a nation in which the great majority of voters describe themselves as religious, religious belief will usually be the background -- even if frequently unstated -- of our policy debates. A widespread religious conviction that we must aid the poor will inevitably find its way into legislation, and so the nation will create welfare programs. A widespread religious conviction that long-term help is no substitute for hard work will inevitably find its way into legislation, and so welfare will evolve into workfare.

Once one recognizes that the wall of separation leaks, and leaks badly, the case for neutrality disappears. It seeks to impose an artificial order on a naturally chaotic relationship. By so doing, it cloaks hard truths about our society. Some of the truths are pleasant -- for example, the happy truth that religion matters deeply to most people, even in politics. And some of the truths are unpleasant -- for example, the tragic fact that other people’s religions often matter less to people than their own. (One recalls Ambrose Bierce’s definition of a Christian: "One who believes that the New Testament is a splendid guide for the life of his neighbor.") This truth, too, works its way into policy, which is why the "neutral" result in Lyng is able to stand. The state is permitted to devastate a religion, as long as it doesn’t do so on purpose.

The bureaucrats who decided to build the road and level the sacred forest were not wicked people but fallible humans, engaged in the fallible human task of governance. Yet none of them, I suspect, would have considered for a moment allowing the destruction of whatever physical symbol their own religions held to be sacred. Neutrality’s ideal world, in which all of us rigorously separate our religious and secular selves, simply is not the world in which most Americans live. Nor would many people want to. So although neutrality is certainly a theory about religion and law, it is not a theory about a world that exists.

The strongest argument in favor of the accommodation of religion is the preservation of genuine diversity -- not simply people who look different, but people who in deep ways are different. A religion that makes no difference in the lives of its adherents is not a religion at all. The accommodationist believes in religion as something that actually changes the way people are; nurturing religion, then, also nurtures a plurality of communities, communities that assign to existence meanings different from those of the dominant culture. The accommodationist, therefore, argues that the state should back off whenever possible -- formally, whenever a compelling state interest is not served by the effort to rein in religions that would otherwise press for meanings of reality that the state abhors.

These are already dangerous words, because they can invite oppression: the state, after all, might reasonably conclude that many different ends are compelling. Indeed, the immediate difficulty for the accommodationist is that it is the state that is doing the accommodating. The problem facing the religionist is deeper than the struggle between the ideal of neutrality and the reality of accommodation. For the accommodationist, at the very beginning of the analysis, is faced with two crucial problems that necessarily limit religious freedom. First, the accommodationist must define religion, which already narrows the universe of what counts. One saw evidence of this, for example, when the Massachusetts Supreme Judicial Court decreed, in the Needham case, that when religious counseling is informed even in part by secular psychology, it ceases to be religious and is entitled to no free-exercise protection. The problem is plain: If the religion seeking an accommodation must first prove to the state (through its courts) that it is a religion, it is already under pressure to meet a test that might have nothing to do with the religion’s teaching.

The courts might try to avoid this difficulty, by adopting, for example, Kent Greenawalt’s proposal that we define religion analogically. Greenawalt suggests that we begin with what we know to be religions, search out their common elements, and then compare other claimants by looking for similar (but perhaps not identical) elements. This approach (which is, I think, more impressionistic than analogical) would have the advantage of not forcing all religions into a single, narrow mold -- a point to which I shall return. But it would have the disadvantage of beginning with the religions with which our culture is most familiar, which at once makes potential outsiders of religions that may look very different. Besides, like all definitional approaches, this one still says to people of faith that they must ensure that they come within the approved definition if they want, in a legal sense, to be free.

Second, the accommodationist must indicate a limit beyond which no claim of religious freedom will be recognized -- to resolve, for example, the problem of religiously mandated murder. Most accommodationists place the limit at "compelling state interest"; but even if compellingness is the standard, and even if the courts handle it correctly, they will in the end be centering their concern on the needs of the state, not the needs of the religionist. This became painfully clear in 1996 when the Supreme Court refused to hear an appeal of the Alaska Supreme Court’s Swanner decision, which held that the state’s interest in preventing discrimination against unmarried heterosexual couples is sufficiently great that it trumps the objections of landlords who believe they are forbidden by God to permit "fornication" on their property. It is not easy to discern what made the interest compelling, except that the state had decreed it to be so.

The simple problem is that the accommodationist must ultimately draw some lines; and every line divides those religions that will be privileged from those that will be demoted. The mere existence of these dividing lines creates pressures on religions to twist themselves into shapes that the state (through its courts) will recognize. No religion picks a fight with the state if it can be avoided, and few religionists are able to resist the lure of state assistance. And so, precisely because the rules of constitutional law create some faint possibility of gaining special consideration, the question the religion must ask itself becomes (to take Needham as our example) not What form of counseling does God require? but What form of counseling will the state allow? Thus does constitutional protection of religion threaten to undo the specialness of religion itself.

Suppose the state says no? Is the follower of a religion R then to decide that a religiously required act B is not required after all -- because the state will punish him if he does it? In principle, this outcome is no more appealing than the notion that the state should decide to allow the follower of R to do B because he is going to do it anyway, even if punishment follows. But no such symmetry exists. We seem to expect that the sensible religionist will take into account state response in making a decision whether to break the law or not; yet we evidently suppose it to be absurd that the sensible state should take into account religious response in deciding whether to enforce a law or not. The religionist who persists in doing B after it is deemed to carry no constitutional protection is a fanatic, but the state that puts him in jail is only doing its job. As a formal matter, this result is possible because the state possesses the legal authority to enforce its decrees and the religionist does not. As a practical matter, the result is possible because the state, not the religionist, has the guns.

In a nation that properly prides itself on religious pluralism, we certainly do not want to put the guns in the hands of the competing religions instead. Yet the same pride should help us to recognize that the asymmetry of religious freedom is mischievous. Although the compellingness test (which only accommodationists believe in anyway) gives us a standard for determining which state interests are so vital that a claim of religious requirement cannot overcome them, we have no test for determining which religious interests are so vital that no state interest -- not even a compelling one -- can overcome them. In other words, the balance is struck in a way that creates only two sets of cases: those that are hard and those in which the state automatically wins. There may be no set of cases in which the religion automatically wins.

This asymmetry is also unstable. The neat separation assumed by the model does not survive. The result is not chaos; the result, rather, is a steady leakage in our separation cylinder, but leakage in only one direction, so that the state wins more and more and the religionist wins less and less -- precisely the trend line of the cases over the past 20 years. This does not mean that there is a point at which accommodation and neutrality converge (although there may be); it means only that as time and cases go by, the state-centeredness of what we are bold to call religious freedom is more and more apparent, and the pressure on religionists to conform to the expectations of others grows and grows, with the inevitable complicity of the judges who supposedly protect religion’s "right" to be free.

But the phenomenon I have been describing, the reader might object, is simply the way the world works. For me to accuse the courts of being state-centered on religion issues, the reader might argue, smacks of sophistry. After all, the judges must decide the cases properly before them. To decide cases they must draw lines. That the lines judges draw are occasionally a little bit arbitrary is a very old critique of judging. Unless we are to adopt a standard under which the religious freedom claimant always wins, then we must pick a point, by some method, at which the claimant loses. Whatever that point turns out to be, it will invariably be the point at which the losing claimant will accuse the judges of statism.

And besides (so the counterargument might continue) -- besides, trade-offs are a necessary part of the life of a society. Of course religions will sometimes feel themselves pressured to change; everybody is pressured to conform; that is a fact of life, not of law. If a religious belief is sufficiently strong, it will survive the pressure. Recall that Bob Jones University refused to yield to state pressure to abandon its religious belief in racial separation after losing in the Supreme Court in 1983. Imagine, however, that the Internal Revenue Service had been after Bob Jones to abandon faith in Jesus Christ as Son of God and Savior, the bedrock on which Christianity rests. In that case, nobody would have been surprised by the school’s refusal to back down.

Neither one of these objections disturbs the theory. Yes, judges must decide cases and draw lines; but it is the fact that the judges must draw lines -- not the place where the lines happen to be drawn -- that creates the risk for religions that go to court to seek protection. And, yes, religionists should ideally be strong enough to resist state pressure; but the state itself has created layer on layer of purportedly neutral pressures that combine to weaken the ability of the religions (or, rather, the religious people) to resist. Among these layers are a variety of tax rules, which encourage religions to reshape themselves so as to be eligible for tax benefits, and the recent legislative efforts, certainly constitutional but perhaps of dubious value to religion, to allow religious groups to share in the rather substantial largesse of the programmatic side of the welfare state.

A principle of accommodation will not eliminate the tension that pressures the religious to change for the sake of society, but it will at least create a space in which the religions can organize their resistance. Moreover, if combined with a healthy respect for the role of the religious voice in politics, the accommodation principle might at least create some pressures for the society to change for the sake of religion. A democracy that believes in religious freedom should be willing to live with the tension; so should a religion that believes in democracy.

The Etiquette of Democracy

In the summer of 1966, my parents moved with their five children to a large house near the corner of 35th and Macomb Streets in Cleveland Park, a neighborhood in the middle of Northwest Washington, D.C., and, in those days, a lily-white enclave. My father, trained as a lawyer, was working for the federal government, and this was an area of the city where many lawyers and government officials lived. There were senators, there were lobbyists, there were undersecretaries of this and that. My first impression was of block upon block of grim, forbidding old homes, each of which seemed to feature a massive dog and spoiled children in the uniforms of various private schools. My two brothers and two sisters and I sat on the front steps, missing our playmates, as the movers carried in our furniture. Cars passed what was now our house, slowing for a look, as did people on foot. We waited for somebody to say hello, to welcome us. Nobody did.

We children had no previous experience of white neighborhoods. But we had heard unpleasant rumors. The gang of boys I used to hang out with in Southwest Washington traded tall tales of places where white people did evil things to us, mostly in the South, where none of us had ever lived, nor wanted to.

I watched the strange new people passing us and wordlessly watching back, and I knew we were not welcome here. I knew we would not be liked here. I knew we would have no friends here. I knew we should not have moved here. I knew...

And all at once, a white woman arriving home from work at the house across the street from ours turned and smiled with obvious delight and waved and called out, "Welcome!" in a booming, confident voice I would come to love. She bustled into her house, only to emerge, minutes later, with a huge tray of cream cheese and jelly sandwiches, which she carried to our porch and offered around with her ready smile, simultaneously feeding and greeting the children of a family she had never met—and a black family at that—with nothing to gain for herself except perhaps the knowledge that she had done the right thing. We were strangers, black strangers, and she went out of her way to make us feel welcome. This woman’s name was Sara Kestenbaum. Sara died much too soon, but she remains, in my experience, one of the great exemplars of all that is best about civility.

Sara Kestenbaum’s special contribution to civility back in 1966 was to create for us a sense of belonging where none had existed before. And she did so even though she had never seen any of us in her life. She managed, in the course of a single day, to turn us from strangers into friends, a remarkable gift that few share. (My wife is one of the few.) But we must never require friendship as the price of civility, and the great majority of us who lack that gift nevertheless hold the same obligation of civility.

This story illustrates what I mean when I say that civility is the set of sacrifices we make for the sake of our fellow passengers. Sara Kestenbaum was generous to us, giving of herself with no benefit to herself, and she demonstrated not merely a welcome that nobody else offered, but a faith in us, a trust that we were people to whom one could and should be generous. And so we have the beginning of a definition of sacrificial civility: Civility has two parts: generosity, even when it is costly, and trust, even when there is risk.

Saying hello to a stranger on the street or driving with a bit more care are acts of generosity. Conceding the basic goodwill of my fellow citizens, even when I disagree with them, is an act of trust. By greeting us as she did, in the midst of a white neighborhood and a racially charged era, Sara was generous when nobody forced her to be, and trusting when there was no reason to be. Of such risks is true civility constructed.

Historians and sociologists of civility might have predicted that our new neighbors would be unfriendly, even had the racial issue not been present. The city, in a peculiar way, holds within its history the collapse of one form of civility, based on norms learned from small, known communities, and the development of another, based on norms learned from larger, anonymous ones. Although this is not the place to do the history in great detail—others have done it, and excellently—it will be useful to hit the high points.

Most of us have heard the old bromide that people who live in cities are not as polite as people in the country. New Yorkers, we think, epitomize rudeness, whereas folks in the South, say, are just as friendly as they can be. The bromide, however, turns out not to be a bromide: more and more experimental evidence confirms it. Something seems to happen to the psyche, to the personality, maybe even to the soul, when people live together in vast numbers. We find ourselves avoiding each other if only to keep from tripping over each other. We demand what has come to be called our "space."

In his classic text The Individual in a Social World, psychologist Stanley Milgram warned against overstating the case for urban incivility: "In some instances it is not simply that, in the city, traditional courtesies are violated; rather, the cities develop new norms of noninvolvement." Thus, when visitors arrive from rural areas with very different rules of conduct and complain that they seem to have landed in a foreign country, they are, in a sense, absolutely right. The city, like any other community, creates its own standards of behavior, along with its own pressures to obey them. The only trouble is, the standards are often morally inferior to the ones they replace. Milgram continues:

These [new norms] are so well defined and so deeply a part of city life that they constitute the norms people are reluctant to violate. Men are actually embarrassed to give up a seat on the subway to an old woman; they mumble "I was getting off anyway," instead of making the gesture in a straightforward and gracious way.

Instinct says that Milgram is right. So do experiments—for example, observing whether passersby will help a disabled man who falls to the ground. For half a century, every scholar who has examined the question has found city dwellers far less likely to be helpful or polite than those who live in rural areas—and the bigger the city, the ruder its citizens. Probably they do not see themselves as rude: they are only conforming their conduct to the expectations of their urban world.

And why do city dwellers develop these norms—what might be called norms of rudeness rather than norms of civility? Because, says Milgram, "everyone realizes that, in situations of high population density, people cannot implicate themselves in each other’s affairs, for to do so would create conditions of continual distraction which would frustrate purposeful action." In other words, if people in cities are nice to each other, they will be too busy with each other’s problems to get anything done.

Yet this explanation is obviously incomplete. In the city of Milgram’s vision, strangers are cold to each other because it is too costly to behave any other way. Social historians, however, have linked the rise of the national concern over matters of etiquette in the 19th century to the rise of the great industrial city precisely because the bold new cities were places where, for the first time, it was possible to live and work entirely among strangers.

Prior to the development of the great cities, most people lived in the same community their entire lives, surrounded by an extended family (perhaps a large one) and a thick network of other relationships that helped to nurture and enforce norms of behavior. The new industrial cities created enormous freedom: people moved to the cities and lived there alone, or with their nuclear families, but without the significant community ties that simultaneously offered moral guidance and limited their choices. People were mobile: they went where the jobs were. But this new freedom, as the historian John Diggins has pointed out, created problems of its own: "The more free the individual felt himself to be, the more isolated and lonely he actually became until he craved to forsake his solitude in order to surrender his self to the new invisible authority of society itself." With the advance of capitalism, then, the nation moved from the idea of local norms of behavior, created and nurtured by the community (often the religious community), to the idea of citywide or even national norms. Now the norms were generated not by people one knew but by a larger, anonymous, untouchable entity. "Society" replaced "community."

One beneficiary of the change was the publishing industry. As the new citizens—now literally citizens, residents of cities—struggled to work out how to relate to the strangers by whom they were surrounded, the demand for authoritative rules exploded. Suddenly, everybody wanted to know how to behave, how to get along with strangers. Responding to the manners craze, publishers coughed up a flood of books on etiquette, most of them quite thin, producing, as one observer has noted, "weak opinion strongly held."

Unlike the Europeans, who held that good manners were the property of the upper classes, Americans nourished the conceit that anybody could learn etiquette. Many of the books were inexpensive. One of the most popular was Irwin Beadle’s 1859 volume, Dime Book of Popular Etiquette, the price of which was stated in the title. There were books about how to act when traveling on the railroads, books about behavior on the street or in the theater, books aimed at women, books aimed at men. Special books were written to guide foreign immigrants to America—and rural immigrants to the cities. And people who never read books on anything else worked on their manners. John F. Kasson says in Rudeness and Civility: Manners in 19th Century Urban America that "most etiquette manuals were sold to people who would never have thought of entering a bookstore," mainly by direct mail. There were books for all ages, for all regions, from all angles, and they shared a common theme: If you want to be somebody in America, you must learn to mind your manners.

One result of this urban fad was that the fledgling public schools (most of them in those days still called common schools) began to include the study of manners in the curriculum. Part of the reason was nativism, a fear that the new immigrants did not know how Americans were expected to behave. But a larger part was probably status anxiety, a fear that if children did not learn proper etiquette, they would not amount to anything. In an eerie foreshadowing of today’s debates over character education, journalists warned that the family could not do it alone: the schools had to help. And so a new subject— "morals and deportment"—was born. Now that society rather than community set the standards, children needed a way to find out what the standards were.

Which leads us to our second story.

When I was a child, attending grade school in Washington, D.C., we took classroom time to study manners. Not only the magic words, "please" and "thank you," but more complicated etiquette questions, such as how to answer the telephone ("Carter residence, Stephen speaking") and how to set the table (we were quizzed on whether knife blades should point in or out). We were taught to address adults with a title ("Sir" or "Ma’am") or by surname ("Mrs. White"). And somehow nobody—no children, no parents—objected to what nowadays would surely be viewed as indoctrination.

Today instruction of this sort is so rare that when a school tries to teach manners to children, it makes the news. When U.S. News & World Report ran a story in 1996 about the decline of civility, it opened with what it must have considered the man-bites-dog vignette—an account of a classroom where young people were taught to be polite. Ironically, this newsworthy curriculum evidently teaches a good deal less about etiquette than we learned back at Margaret M. Amidon Elementary School in the ‘60s, but that is still a good deal more than children learn in most places. Deportment classes are long gone. Now and then the schools teach some norms of conduct, but almost always about sex, and never the most important ones: "Do not engage in harassment" and "Always use a condom" seem to mark the outer limits of their moral capacity. The idea that sex, as a unique human activity, might require a unique morality, different from the general moral rules against physical harm to others and harm to the self, is not one that public schools are prepared to entertain.

Respect for rules of conduct has been lost in the deafening and essentially empty rights-talk of our age. Following a rule of good manners may mean doing something you do not want to do, and the weird rhetoric of our self-indulgent age resists the idea that we have such things as obligations to others. We suffer from what James Q. Wilson has described as the elevation of self-expression over self-control. So when a black student at a Connecticut high school was disciplined in 1996 for wearing pants that drooped (exposing his underwear), not only did he claim a right to wear what he liked, but some community leaders hinted at racism, on the theory that many young African-American males dress this way. (The fact that the style is copied from prison garb, which lacks a belt, evidently makes no impression on these particular defenders of the race.)

When I was a child, had my school sought to discipline me, my parents would have assumed the school had good reason. And they probably would have punished me further at home. Unlike many of today’s parents, they would not have begun by challenging the teacher or principal who thought I had done wrong. To the student of civility, the relevant difference between that era and the present is the collapse of trust, particularly trust in strangers and in institutions. My parents would have trusted the school’s judgment—and thus trusted the school to punish me appropriately—but trust of that kind has largely dissolved. Trust (along with generosity) is at the heart of civility. But cynicism has replaced the healthier emotion of trust. Cynicism is the enemy of civility: it suggests a deep distrust of the motives of our fellow passengers, a distrust that ruins any project that rests, as civility does, on trusting others even when there is risk. And so, because we no longer trust each other, we place our trust in the vague and conversation-stifling language of "rights" instead.

Consider again the boy with the droopy pants. To talk about wearing a particular set of clothes as a "right" is demeaning to the bloody struggles for such basic rights as the vote and an unsegregated education. But the illusion that all desires are rights continues its insidious spread. At about the same time, a fired waitress at a restaurant not far from Yale, where I teach, announced a "right" to pierce her face with as many studs and rings as she wishes. And, not long ago, a television program featured an interview with a woman who insisted on the "right" to be as fat as she likes. Rights that are purchased at relatively low cost stand a fair chance of being abused, simply because there is no history behind them, and thus little pressure to use them responsibly—in short, because nobody knows why the right exists. But even a right that possesses a grimly instructive history—a right like freedom of speech—may fall subject to abuse when we forget where it came from.

This proposition helps explain the facts, if not the outcome, of Cohen v. California, a 1971 decision in which the Supreme Court overturned the conviction of a young man who wore on his jacket the benign legend F___ THE DRAFT. The case arose as the public language grew vulgar. The 19th and early 20th centuries offered a tradition of public insults that were witty, pointed, occasionally cruel, but not obscene or particularly offensive. Politicians and other public figures competed to demonstrate their cleverness in repartee. (One of my favorites is Benjamin Disraeli’s explanation of the difference between a misfortune and a calamity: "If Gladstone fell into the Thames, that would be a misfortune. And if anyone pulled him out, that would be a calamity.") Nowadays the tradition of barbed wit has given way to a witless barbarism, our lazier conversational habit of reaching for the first bit of profanity that comes to mind. The restraint and forethought that are necessary to be clever, even in insult, are what a sacrificial civility demands. When we are lazy about our words, we are telling those at whom our vulgarity is directed that they are so far beneath us that they are not worth the effort of stopping to think how best to insult them; we prefer, animal-like, to make the first sound that comes to mind.

In Cohen v. California the justices were unfortunately correct that what the dissenters on the court called "Cohen’s absurd and immature antic" was protected by the freedom of speech. But it is important to add that when the framers of the Constitution envisioned the rough-and-tumble world of public argument, they almost certainly imagined heated disagreements against a background of broadly shared values; certainly that was the model offered by John Locke. It is unlikely that the framers imagined a world in which I might feel (morally) free to say the first thing that came into my head. I do think Cohen was rightly decided, but the danger deserves emphasis: When offensiveness becomes a constitutional right, it is a right without any tradition behind it, and consequently we have no norms to govern its use.

Consider once more the fired waitress. I do not deny that the piercing of one’s body conveys, in many cultures, information of great significance. But in America, we have no tradition to serve as guide. No elder stands behind our young to say, "Folks have fought and died for your right to pierce your face, so do it right"; no community exists that can model for a young person the responsible use of the "right"; for the right, even if called self-expression, comes from no source other than desire. If we fail to distinguish desire from right, we will not understand that rights are sensible and wise only within particular contexts that give them meaning.

The Constitution protects a variety of rights, but our moral norms provide the discipline in their exercise. Sometimes what the moral norm of civility demands is that we restrain our self-expression for the sake of our community. That is why Isaac Peebles in the 19th century thought it was wrong for people to sing during a train ride; and why it is wrong to race our cars through the streets, stereos cranked high enough to be sure that everyone we pass has the opportunity to enjoy the music we happen to like; and why it was wrong for Cohen to wear his jacket; and why it is wrong for racists to burn crosses (another harmful act of self-expression that the courts have protected under the First Amendment). And it is why a waitress who encounters the dining public each day in her work must consider the interest of that public as she mulls the proper form of self-expression.

Consequently, our celebration of Howard Stern, Don Imus and other heroes of "shock radio" (as it is sometimes called) might be evidence of a certain loss of moral focus. The proposition that all speech must be protected should not be confused with the very different proposition that all speech must be celebrated. When radio station WABC in New York dismissed a popular talk show host, Bob Grant, who refused to stop making racist remarks on the air, some of his colleagues complained that he was being censored. (He was no more being censored than the producer of a television program that is canceled because of low ratings: in both cases, the content is the problem.) Lost in the brouhaha was the simple fact that Grant’s comments and conduct were reprehensible, and that his abuse of our precious freedoms was nothing to be celebrated.

The point is not that we should rule the offensive illegal, which is why the courts are correct to strike down efforts to regulate speech that some people do not like, and even most speech that hurts; the advantages of yielding to the government so much power over what we say have never been shown to outweigh the dangers. Yet we should recognize the terrible damage that free speech can do if people are unwilling to adhere to the basic precept of civility, that we must sometimes rein in our own impulses—including our impulses to speak hurtful words—for the sake of those who are making the democratic journey with us. The Book of Proverbs tells us, "Death and life are in the power of the tongue" (Prov. 18:21). The implication is that the choice of how to use the tongue, for good or for evil, is ours.

Words are magic. We conjure with them. We send messages, we paint images. With words we report the news, profess undying love, and preserve our religious traditions. Words at their best are the tools of morality, of progress, of hope. But words at their worst can wound. And wounds fester. Consequently, the way we use words matters. This explains why many traditional rules of etiquette, from Erasmus’s handbook in the 16th century to the explosion of guides to good manners during the Victorian era, were designed to govern how words, these marvelous, dangerous words, should be used. Even the controversial limits on sexual harassment and "hate speech" that have sprouted in our era, limits that often carry the force of law, are really just more rules of civility, more efforts, in a morally bereft age, to encourage us to discipline our desires.

My point is not to tell us how to speak. My point is to argue that how we speak is simply one point on a continuum of right and wrong ways to treat one another. And how we treat one another is what civility is about.

Revising the Concept of Vocation for the Industrial Age

George MacLeod, the founder of the Iona Community of Scotland, customarily took on the community’s least attractive job, that of cleaning the latrines. He did this, he said, so "I will not be tempted to preach irrelevant sermons on the dignity of all labor" (quoted in Robert McAfee Brown, The Spirit of Protestantism [Oxford University Press, 1961], p. 116). A similar discipline on the part of Protestant thinkers might rescue the concept of vocation from a morass of sentimentality. As it is usually formulated, the doctrine of vocation is unrealistic, especially for people who work in the industrial plants of our nation.

Not that the dignity of work is threatened only in the industrial sector. To some degree, work is problematic everywhere, including the burgeoning service sectors of the economy, an area that needs special consideration. Attention is concentrated here on the industrial sector because it has been largely ignored by Protestant thought—an oversight that needs correcting—and because the problematic nature of work, particularly its repetitive and boring quality, can be most clearly seen in the modern assembly line. All work may have occasional stretches of boredom and dullness, but assembly-line work is pervaded by these qualities. If the meaning of vocation can be articulated in this context, surely it can be done in other contexts. And the assembly line may illustrate for us what work is going to be like for an increasing number of persons. If this is so, Protestant churches need to be aware of it; and theologians and social scientists need to give sustained attention to formulating a new and coherent doctrine of vocation.

This is not to suggest that a mere restatement of the concept of vocation will enable Protestantism to deal constructively with industrialism. But such a restatement would at least indicate that churches have some rudimentary understanding of the situation. Toward this end, I want to examine some classic and modern statements on vocation, compare them to the actual situation of workers on the assembly line, and then suggest how the idea of vocation might be reformulated so as to be more applicable to contemporary life.

The idea of vocation that developed in the Reformation was originally a powerful one, and it remains a meaningful guide to Christian existence for a sizable group in modern society. The Reformers invested secular labor with a new dignity, and thus opened up a whole new world of spiritual significance. Laity who had not thought that what they did in home, field or shop had any religious significance were told that their everyday work was as much a calling of God as was the praying of the monk in a cloister. "What you do in your house," said Luther, "is worth as much as if you did it up in heaven for our Lord God. For what we do in our calling here on earth in accordance with His word and command He counts as if it were done in heaven for Him" (Works [Erlangen edition], vol. 5, p. 102). Luther’s language is, as usual, exceedingly bold. It looks, he says, like a great thing when a monk renounces the world and pursues a life of asceticism, fasting, prayers and vigils in the cloister. "On the other hand," he goes on, "it looks like a small thing when a maid cooks and cleans and does other housework. But because God’s command is there, even such small work must be praised as a service of God far surpassing the holiness and asceticism of all monks and nuns" (p. 100).

Calvin’s stress on station and calling gave a similar significance to work. "Every man’s mode of life, therefore, is a kind of station assigned him by the Lord, that he may not always be driven about at random. . . . Again, in all our cares, toils, annoyances, and other burdens, it will be no small alleviation to know that all these are under the superintendence of God. . . . This, too, will afford admirable consolation, that in following your proper calling, no work will be so mean and sordid as not to have a splendor and value in the eye of God" (Institutes, bk. 3, chap. 10, sec. 6).

A brief look at some modern statements on vocation renews our appreciation of this aspect of our Reformation heritage. But such statements also enable us to see what happens when the conventional Protestant emphasis on work is put in the context of the industrial world.

For example, Dorothy Sayers, the gifted mystery writer and theologian, wrote a spirited essay, "Why Work?" which reflects her own work with words. For Sayers, "work is not, primarily, a thing one does to live, but the thing one lives to do. It is, or should be, the full expression of the worker’s faculties, the thing in which he finds spiritual, mental, and bodily satisfaction, and the medium in which he offers himself to God." In Sayers’s view, "we should no longer think of work as something that we hastened to get through in order to enjoy our leisure; we should look on our leisure as the period of changed rhythm that refreshed us for the delightful purpose of getting on with our work" (Creed or Chaos [Harcourt, Brace, 1949], pp. 54-55). If work is so "delightful," concerns about rates of pay or working hours are somewhat irrelevant. Thus, though these sentiments may reflect the admirable devotion to work of an artist like Sayers, I suspect that they would not be wildly popular among workers on modern assembly lines.

It should be stated with all possible emphasis that these and other affirmations of work are solid and helpful. They fill an important niche, and it would be churlish not to grant this fact. But between such assertions and the world of the assembly-line worker, there is, quite simply, an enormous gap.

Workers do not find in their work anything remotely resembling the affirmations that Sayers describes or the significance ascribed to it by Luther and Calvin. The detestation felt by workers for the conditions under which they labor is fierce and widespread; if their complaints are listened to with sympathy, it is possible to hear in them a cry for elemental rights.

One of the major privations of modern industrial work is the absence of any sense of craftsmanship. The finished product is something that workers rarely see. Technological efficiency dictates that a worker perform a single operation and do it over and over again, knowing that tomorrow he or she will again perform the same simple operation countless times. It is an old story, by now, to contrast this situation with that of the craftsman of an earlier time. To a real degree, the craftsman was in control of production; production was not something imposed on him. Work provided the chance to learn, to change, to adapt, and so it did not become monotonous. As a result, work could be illuminated by the traditional concepts of vocation. But that possibility has disappeared under the conditions of modern industrial life.

What alarms many observers of workers’ dissatisfaction today is the depth of workers’ anger. In their anger, they lash out at times, sabotaging the system on which their livelihood depends. This anger can be abundantly documented in such books as Studs Terkel’s Working (Pantheon, 1972) and Robert Schrank’s Ten Thousand Working Days (MIT Press, 1978). While workers’ expressions vary, their message is the same: they detest the monotonous and boring quality of modern work. Robert Linhart, reflecting on his experience on an assembly line in a French auto factory, puts the matter vividly: "Through the gaps in this gray, gliding line I can glimpse a war of attrition, death versus life and life versus death. Death: being caught up in the line, the imperturbable gliding of the cars, the repetition of identical gesture, the work that’s never finished. If one car’s done, the next one isn’t, and it’s already there, unsoldered at the precise spot that’s just been done, rough at the precise spot that’s just been polished" (Robert Linhart, The Assembly Line [University of Massachusetts Press, 1981], p. 16).

It is in the light of such conditions, and such distress, that the concept of vocation must be reconceived. Before sketching what a revised concept would be like, however, two preliminary convictions need to be stated. One is that work, even if dull, boring and debasing, is nevertheless very important in giving persons a sense of meaningful existence. To have no work is to be in a situation where one’s worth, in one’s own eyes as well as the eyes of others, is diminished. Any observer of the unemployed can corroborate this observation. Though the significance of work may need to be rethought, work will always occupy an important role in human existence. A second conviction is that technological developments cannot be undone, and we cannot return to an age of handicraft. The assembly line, whatever its problems, is a fact of modern existence.

How can vocation be reinterpreted to address the condition of the modern industrial worker?

An essential first step is to recover the core meaning of the doctrine. Vocation, as Reformation thought conceived it, was the calling of a life—the whole being of the person offered in the service of God. Daily work, formerly ignored, was lifted up and emphasized; but it was regarded as only one arena of obedient service, not the only important one. In our time, especially in middle-class Protestantism, work has been exalted to such an extent that vocation is confined almost entirely to the sphere of work. This interpretation needs to be repudiated. The work ethic, if it is not balanced by other emphases, can become demonic.

A more balanced view of the place of daily work would be wholesome and liberating. For one thing, it would permit the recognition that work is far less important now than it used to be, simply in terms of the number of hours given to it. And all indications are that work is going to be less and less a major activity in the future. Herman Kahn and Anthony J. Wiener in The Year 2000 (Macmillan, 1967) project that by that time people will work 7.5 hours a day, four days a week, 39 weeks per year, with 13 weeks left for vacation. Even if we allow for some margin of error in this prediction, we should expect a radical change in the social significance of work. To stress the centrality of daily work in that future society is bound to be unrealistic.

Realizing the limited scope of daily work also makes it possible to accord a new legitimacy to leisure and play. I am not necessarily referring only to leisure devoted to the laudable enterprises of self-improvement or social betterment, but to play as a God-approved activity.

At the same time, a doctrine of vocation that claims all of existence will summon workers to attend to the concerns of the common life. This means, specifically, strengthening home and family life, taking a responsible role in politics and participating in the union movement as it strives to recover its role as an ethical force in modern society.

A second step in stating a contemporary doctrine of vocation is to shift our focus from the product of work to the quality of the workplace. The crying need of workers is to have some say in the conditions of their work. They are demanding autonomy on the job. Speaking at the 1985 meeting of the American Association for the Advancement of Science, sociologist Jeylan T. Mortimer said that the most important determinant of job satisfaction—more important than wages—is the degree to which workers feel they can make their own decisions and have an influence on what happens on the job.

An example of what is being sought by workers is the New Technology Bill of Rights issued by the International Association of Machinists, representing some 650,000 workers in the machine-tool, metal-working, aerospace and airline industries. The document calls for workers to participate in all decisions "that lead to the introduction of new technology or the changing of the workplace systems design, work processes, and procedures of doing work, including the shutdown or transfer of work, capital, and equipment." Technology should "improve the condition of work" and training should be provided for workers displaced by technological developments.

Such a Bill of Rights, even if adopted as part of labor law, would not solve all the problems created by the impact of technology, but it would give workers the sense that they have a role in determining the conditions under which they work and would moderate much of the tension now experienced by industrial workers. Greater worker autonomy would enhance the dignity of work, a crucial factor in any Christian statement on vocation.

Management and labor alike are beginning to recognize that innovations are needed in the workplace. A number of American firms—including AT&T, General Foods, Polaroid, Procter & Gamble, Chrysler and General Motors—are experimenting with giving workers a greater voice in determining working conditions, and they are rotating jobs to reduce monotony. Other U.S. companies are following Japan’s example in involving workers in management and production decisions. In the Volvo plant in Gothenburg, Sweden, a special noise-proof retreat is provided for persons who work where the noise level is exceedingly high. Several European firms have reduced work hours, shifted from piece work to hourly or monthly rates, and improved bonuses.

These developments simply indicate that the conditions of work are fluid, and creative possibilities are before us. They further underscore the need for Protestant thought to articulate a vision of vocation that is both realistic about the nature of the workplace and sensitive to the needs of workers. It was Albert Camus who reminded us that life goes rotten without work. He also said that life stifles and dies when work is soulless.

The Grandeur of Politics

American political liberalism is not in a robust state of health. Not only out of power, it is also out of a program, and sometimes seems without a purpose. Unless something is done to change its present condition, it is unlikely that liberalism will soon return to power.

Such judgments are hardly original; they reflect a view widely held by observers of American politics. It is not surprising that this awareness of something seriously wrong with liberalism has produced many analyses of the condition and numerous suggestions about what needs to be done. New programs, going beyond those of Franklin D. Roosevelt and liberalism’s golden age, need to be formulated; and new strategies, bold and realistic, must be developed. But there is something else required, something less exciting than the task of developing new programs and strategies, but essential to any renewal of liberalism’s promise. The trouble with liberalism is that too many liberals are disenchanted with politics. We have had a strange procession of candidates who dislike the actual tasks of politics.

It is an indication of liberalism’s plight that my text for this homily comes from a conservative. “The grandeur of politics” -- a phrase that probably draws either amazed incredulity or harsh guffaws -- is from George F. Will’s Statecraft as Soulcraft, first presented as his Godkin lectures at Harvard. “My thesis,” he writes, “is that the most important task confronting Americans as a polity is, in part, a philosopher’s task. The task is to reclaim for politics a properly great and stately jurisdiction.” That task is both important and difficult. “To understand a reassertion of the grandeur of politics, you must risk a crick in your neck” (Simon & Schuster, 1983, pp. 21-26). Will’s “crick in the neck” refers to his call to go all the way back to Plato, but for my purpose we need go back only to the 1960s to try to see what took place in that baffling decade to produce the current liberal difficulty.

The contemporary disenchantment with politics requires no extensive documentation. It is a distressing reality which touches people of all political persuasions. A single illustration of it is found in the declining percentages of participation in voting. In the past six elections, the trend has been downward. Of those eligible to vote, 64 per cent did so in 1960; 61.7 per cent in 1964; 60.6 per cent in 1968; 55.6 per cent in 1972; 54.4 per cent in 1976; 53.95 per cent in 1980. A contrast is provided by Canada, where 75.5 per cent voted in 1980. These figures tell a graphic story and point to a serious problem.

Although this disenchantment with politics is, of course, unfortunate for all citizens, it is especially so for liberals, for liberalism’s cherished goals can be achieved only through the practice of politics.

Conventional wisdom finds the beginnings of the cynical dismissal of politics in Watergate; however, the disillusionment caused by the Watergate incident did not create this anti-political stance. Skepticism about politics, and particularly about politicians, rooted far back in American history, has never been totally absent. Watergate is better seen simply as a phenomenon that permitted citizens to express their already present cynicism and to feel morally justified in doing so. Watergate made being anti-political respectable. Ironically, the very event that drove Richard Nixon from office may yield him his final victory over the liberals he so disliked, if they use it as an excuse for abandoning politics.

Jimmy Carter’s experience illustrates what happens when the idea of the grandeur of politics is forsaken, even in the name of noble intentions. Carter’s basic convictions were certainly decent and liberal. But his successful campaign had in it a strange element: he seemed to scorn the very profession he sought to practice. Writing in Harper’s in December 1977, after Carter had been in office for almost a year, Henry Fairlie said:

For what Jimmy Carter seems to be offering the American people is yet another unpolitical President. A man who in many respects is superbly equipped as a politician seems to be earnestly striving to be something else. If the Presidency is in trouble as an institution, it is not so much because it has become imperial as that it has been made increasingly unpolitical.

A president who is above politics will find himself ineffective in governing and soon out of office.

Our present situation is dangerous. Our liberal politicians are openly contemptuous of politics, and our population has gone on record in numerous polls as believing that little or nothing can be hoped for through politics. How did all of us, and especially liberals, get to such a point?

No answer to that question is possible without a careful look at what went on in the 1960s. By no means do I want to join the conventional chorus which sings the sad refrain that the ‘60s were a misguided aberration in American life and that their impact was wholly negative and unfortunate. What is needed is a balanced assessment of the period, acknowledging and rejoicing in its achievements, and admitting and learning from its follies.

Its follies go far toward explaining why the grandeur of politics has faded, to be replaced by a weary cynicism that often masquerades as sophistication. If a respect for politics is to be restored to liberalism, it will be necessary to recover a realism that was initially learned from theologian Reinhold Niebuhr, and was left behind during the idealism of the ‘60s. What replaced that realism needs to be examined with some care. Most of us are still puzzled by the ‘60s. When so much seemed possible, why was so little accomplished? What flaws were present to frustrate what in retrospect remain noble dreams?

To make sense of that decade, we need an interpretive framework, such as that provided by Samuel P. Huntington’s study American Politics: The Promise of Disharmony (Harvard University Press, 1981). Building upon Gunnar Myrdal’s statement of the American creed as embracing a commitment to democracy, liberty, individual rights and the limitation of powers, Huntington sees this creed as a basic aspect of American politics. To be sure, the creed does not always -- not even very often -- get implemented; but it remains our theoretical framework and is subscribed to by most Americans. As Huntington points out, it is precisely because of the creed that we have disharmony and disturbance, for while Americans believe deeply in it, they often find themselves unable to live up to it. “Americans cannot be themselves unless they believe in their Creed, and if they believe in their Creed they must be against themselves” (p. 63). As a result of this dilemma, American politics is characterized by an alternation between periods of creedal passion and periods of creedal passivity. Both are necessary and both bring particular problems.

Although the values of the American creed are always espoused in politics, they are usually not brought out and pursued with singleness of purpose. Periods of creedal passivity are far more numerous in our history than are those of creedal passion. In looking back over the course of American history, Huntington sees only four periods of creedal passion: the Revolutionary era, the Jacksonian years, the Progressive era, and the 1960s and ‘70s, with their protest and reform movements. Central to times of creedal passion is the perception of an ominous gap between ideals and institutions. This perception is then translated into determined action to lessen the gap and often into a growing confidence that the gap can be eliminated.

In times of creedal passion, the perception of the gap between ideal and institution produces an intense moral indignation. As the American creed comes alive for people, they become vividly conscious of the many ways in which its promise has been denied to certain groups. As a result, passions become enflamed and authorities are called into question; institutions are put under searching scrutiny, and any, however venerable, which does not demonstrate a willingness to change becomes a target of hot impatience. Gradualism becomes an epithet and revolution a watchword. No delay is to be tolerated, for too many have already waited too long for justice.

It is this fervor of creedal passion which produced the troubles of the 1960s. A significant feature of the period was the shelving of Niebuhrian insights into the social struggle, insights that seemed cautious to an era impatient for results. George Santayana’s comment that we Americans do not refute our predecessors, we pleasantly bid them goodbye, is perfectly illustrated in the changing fortunes of Niebuhr’s thought.

The development of creedal passion, making the achievement of the American creed’s values a matter of intense concern, and leading to the refusal to acknowledge any limits on such efforts, set the scene for the assertion of unrealizable political hopes. The failure of such hopes brought a harvest of political cynicism. It is a result which realism, if it had been taken seriously, could have prevented.

The moralism of the ‘60s political scene deprived politics of the flexibility that is essential to its effective functioning in a democracy. Like other periods of political ineffectiveness, the ‘60s saw an eruption of single-issue politics. Believing passionately in the importance of a particular issue -- there were a great many of them -- ardent partisans would brook no suggestion that they might have to give a bit to make common cause with people who were not quite “pure” on the issues vital to them. The kind of fastidious insistence on purity that characterized the politics of the period goes far toward explaining the ultimate meagerness of its achievements. The election of 1968 provides a perfect example. Hubert Humphrey, as part of the Johnson administration, was not “pure” on the Vietnam war issue. Hence a great many disenchanted liberals either sat the election out or did little on Humphrey’s behalf. The popular vote was 31,710,470 for Richard Nixon to 30,898,055 for Hubert Humphrey. Liberals maintained their “purity” and the nation got Nixon for its president -- a high price indeed to pay.

Overlooked in this sort of strategy was a central fact of democratic politics. Periods of creedal passion, if they are to bring about significant changes and advances, must be kept within certain bounds. To be sure, those bounds are difficult to determine, and it is this difficulty which frustrates otherwise honorable ventures. Although there is no magical guarantee of strategy, an awareness of the dangers and a willingness to learn from past mistakes would help us to recapture a sense of the grandeur of politics.

No one needs to be told that it is extraordinarily difficult to maintain both passion and perspective at the same time, but to do so is essential if we are to have a democratic politics that can move liberals toward the goals they cherish. The great fault of moralistic politics is that it makes it so easy to assume the purity of one’s own group and the demonic nature of all who oppose one. Then civility is lost, and epithets become substitutes for reasoned discourse. The liberals of the ‘60s ended the decade with a heady sense of their own purity, but with precious little in the way of actual accomplishments. It is scarcely surprising if they began to blame “the system” -- American political practices and institutions -- and excuse their own employment of inept strategies.

One of Reinhold Niebuhr’s lasting contributions to American political practice was his blending of Christian insights with pragmatic strategies. His sense of humanity’s sinfulness kept him from the dangerous assumption that any group -- whether of the oppressors or the oppressed -- had a monopoly on either virtue or wickedness. Never did his insight paralyze effort: it simply guarded against presumption and kept a measure of flexibility in policies. It is unlikely that Niebuhrian thought will ever be outmoded, but it is clear that it can frequently be ignored.

Political maturity recognizes that democratic politics requires clear goals joined to appropriate strategies. In an essay of great perceptiveness, Isaiah Berlin wrote of FDR:

He did not condone the abandonment of ultimate principles before the claims of expediency or of anything else; but political monasticism -- a search for some private cave of Adullam to avoid being disappointed or tarnished, the taking up of consciously Utopian or politically impossible positions, in order to remain true to some inner voice, or some unbreakable principle too pure for the wicked public world -- that seemed to him a mixture of weakness and self-conceit, foolish and despicable. He did not disguise his lack of respect for purists of this type. He did not always treat them fairly and his point of view is one which has, of course, been opposed, and indeed detested, by men of the greatest courage and integrity; but I should be less than candid if I did not confess that it is a point of view that seems to me superior to its opposite [Personal Impressions (Penguin, 1982), p. 31].

That is the sort of responsible politics marked by grandeur.

If in the 1960s liberals practiced the politics of purity, they also erred in treating politics as redemptive. What is at best an art that makes modest temporal gains came to be regarded as an instrument of redemption. Everything had to be achieved -- peace and justice established, poverty eliminated, oppression overcome. And all this had to be done quickly -- indeed, at once. It was a frenetic politics of here or nowhere, and now or never. It was often a politics of fanatics (brilliantly defined by Santayana as those who redouble their effort after they have forgotten their aim).

It is now apparent that this effort was, at least in part, the attempt of a culture whose religious dimensions were fading to find some other area in which the religious passions could be exercised. Huntington’s use of the term “creedal” passion is appropriate, for a substitute religion was being sought. As in Luke’s parable (11:24 ff.), when the house was swept clean, demons rushed into it, and the last state of the man was worse than the first.

In his analysis of the ‘60s, significantly titled The Unraveling of America, Allen J. Matusow claims that a fanciful idea of what was possible lay at the base of liberalism’s debacle. When you have set your sights too high, reality may become hard to bear; the resulting disenchantment easily becomes cynicism.

Christian realism, had it been heeded in the euphoric and frenzied climate of the ‘60s, might well have reminded liberals that there are limits to what can be achieved in the political sphere. This realism would not have meant the abandonment of dreams, but only the relinquishment of fatuous hopes. Reinhold Niebuhr stressed the limits faced by those who sought to be active in social concerns. The dream of perpetual peace and brotherhood “is one which will never be fully realized. It is prompted by the conscience and insight of individual man, but incapable of fulfillment by collective man” (Moral Man and Immoral Society [Scribner’s, 1932], pp. 21-22).

Although liberal politics in America is at a low point, the sweep of our history is so filled with recoveries that no one has any right to despair. But if the liberal promise is to regain its faded luster, it once again will have to find politics, in the phrase of John Buchan, “the greatest and most honorable adventure.”

Seventy Years of the Century

It does not diminish a man or a magazine to believe that 1908 was, in the providence of God, precisely the right time for Charles Clayton Morrison to become an editor and for The Christian Century to become an "undenominational" magazine devoted to church and public affairs. Many Protestants were tiring of provincialism; their churches were breaking out of sectarian isolation. Some scholars were daring to speak the truth concerning the history and composition of the Bible, thus liberating some churches from literalism. Runaway industrialism had become so oppressive that the nation’s conscience was hurting. Exploited labor was stirring, hoping to find in organization a way to shorter hours, decent wages and improved conditions. Suffragists were marching to gain voting rights for women. Pioneer sociologists were uncovering the shame of city slums and the disgrace of child labor in field and factory. Proposals to prohibit the manufacture and sale of intoxicants were widely supported. The political arena was alive with humanitarian issues.

Seventy years ago not many people would have predicted that C. C. Morrison, a 34-year-old minister of the Christian Church (Disciples of Christ), was destined to exert a notable ecumenical and intellectual influence on American Christianity. He had served churches in Iowa during attendance at and after graduation from Drake University. Then he moved to Springfield, Illinois, where he served a church and found a wife. By 1908 he was a pastor in Chicago and a doctoral candidate at the University of Chicago. It was at this time that he began to write for The Christian Century.

I

This journal had been launched in Des Moines in 1884 as the Christian Oracle, a denominational weekly. By 1900 it had moved to Chicago. In that year its name was changed to The Christian Century. For the next eight years it limped along as a denominational publication under a succession of editors. Then, on October 10, 1908, "The New Christian Century" announced that Charles Clayton Morrison and William A. Kennedy had purchased the magazine from C. A. Osborn, a publications broker who had bought it "at auction last August on account of the foreclosure of a mortgage." New papers of incorporation were taken out. Morrison was to be editor. Others were associated with him, but he later claimed, with good reason, that he had "refounded" The Christian Century on that date.

Struggling against poverty, inexperience and the low estate of religious journalism generally, Morrison had within 15 years lifted an obscure publication to a position of influence in church and state. One of his earliest reports covered the establishment of the Federal Council of Churches, which took place a couple of months after the Century was refounded. Thus the life of the magazine henceforth paralleled the development of the conciliar movement in American Christianity. While there was never any official tie, the Century became a firm advocate of cooperation with the Federal Council and with its successor, the National Council of the Churches of Christ in the U.S.A.

The Century’s championship of the social gospel was related to its support of the Federal Council, whose famous "Social Creed of the Churches" was released at its first meeting as part of a report on the church and modern industry. Elements of the social creed became planks in the Century’s platform and, in time, in that of many churches. They included abolition of child labor, protection of workers against industrial hazards, shorter hours -- the 12-hour day was then common -- a living wage and arbitration of industrial disputes.

After Morrison became editor, his first campaign was a spirited defense of Herbert L. Willett of the University of Chicago, then a leading popularizer of the modern critical study of the Bible. Willett was under attack by some of his fellow Disciples as an "infidel." The Century defended him, published his articles and boldly exploited his identification with modern scholarship. For a time, Willett’s was the most prominent name associated with the Century. He helped establish the intellectual standards of the journal and contributed financial support. One of the founders of the Federal Council of Churches, Willett moved both the Century and the council in the direction they were to follow when he wrote: "I believe that the reunion of Christendom is the logical climax of all the reformations which have preceded it and the most pressing duty of the hour."

Soon after he became editor of the Century, Morrison joined the American delegation which attended the International Missionary Conference in Edinburgh. He was therefore present when the modern ecumenical movement was born. This development, which led eventually to the formation of the World Council of Churches, emerged as a result of missionary statesmen’s discovery that sectarian divisions among Western Christians constituted the most formidable obstacle to the advancement of the Christian gospel abroad. The point was underscored on the domestic front in 1920 when the Interchurch World Movement collapsed, mainly because its participating American churches financed their own programs of expansion and left their ecumenical commitment to the tender mercies of nonexistent "friends of the churches."

II

During the first 15 years of his editorship, Morrison steadily widened the Century’s constituency and deepened its intellectual substance. It was not long before he discovered that his readers among Disciples were outnumbered by subscribers from other denominations. With this encouragement, he set out to stabilize financially an enterprise whose shoestring had at times showed signs of fraying. He found three laymen, all Disciples, who collectively provided a sustaining fund which undergirded the Century for the remaining years of Morrison’s editorship. Thus fortified, he began the search for a full-time managing editor. One day in 1923 the man he sought walked through the front door of the Century’s offices.

He was Paul Hutchinson, a young journalist who was already the author of three books. Hutchinson was then associated with the Epworth Herald, published in Evanston, a Chicago suburb. He was a graduate of Lafayette College (Phi Beta Kappa) and Garrett Biblical Institute (now Garrett-Evangelical Theological Seminary). He had served for five years in China, editing the China Christian Advocate, chairing the China Literature Council and acting as secretary of the China Centenary Movement of the Methodist Church. In 1916 he had been a leader in organizing young people to re-elect Woodrow Wilson to the presidency of the United States.

Morrison printed the manuscript which Hutchinson had come to propose for publication and offered him the position of managing editor. Hutchinson accepted. Then began a collaboration of a quarter of a century which was fateful for both men, for the Century and for the Christian cause. Hutchinson was managing editor from 1923 to 1947 and succeeded Morrison in the editorship. He retired at 65, at the end of 1955.

Within three weeks after hiring Hutchinson, Morrison left for two months in Europe, leaving the magazine in the hands of his new managing editor. His confidence was abundantly justified. Hutchinson brought a professional scope and quality to the Century. His broad knowledge of foreign affairs, and particularly of American relations with Asia, earned respect for the journal from the secular as well as the religious press. He developed a corps of correspondents who supplied exclusive news coverage from cities at home and abroad.

Meanwhile, Hutchinson continued to write books, turning out a dozen or more volumes on such themes as the history of Methodism, world revolution and religion, the ordeal of Western religion, the leaders who made the churches and the modernization of China. His final volume, The New Leviathan, was a searching analysis of the trend toward totalitarianism which he saw operating in the U.S. and other governments. Following Hutchinson’s death in 1956, Morrison wrote:

There was a mild but wholesome skepticism in his mentality which made him look below the surface for hidden motivations. His perceptive mind was quick to detect both unconscious and deliberate intentions behind unctuous phrases. He hated cant. His composition was evidence of a tidy and honest mind. His manuscript came from his typewriter so clean that he rarely needed to change a single word. It was an exact mirror of his mind.

III

A measure of the influence the Century had achieved by the middle of the 1920s was the success which attended its support for the outlawing of war. Senator William E. Borah had tried and failed to incorporate a provision outlawing war into agreements establishing the World Court. In 1926 Dr. Morrison joined with Salmon O. Levinson, a Chicago lawyer, in campaigning for the renunciation of war and for branding its use as an instrument of national policy a crime. In 1927 Aristide Briand, foreign secretary of France, proposed that the United States and France agree to renounce war on those terms. U.S. Secretary of State Frank B. Kellogg suggested that such an agreement be broadened into a general antiwar pact. The idea found favor, and on August 27, 1928, high officials of 15 nations, meeting in Paris, condemned "recourse to war for the solution of international controversies" and agreed that solutions of all disputes "shall never be sought except by pacific means." Morrison was an honored guest at the signing of what came to be known as "The Pact of Paris" or the "Kellogg-Briand Treaty." Many other nations later added their signatures to those of the original 15.

Since the pact to outlaw war lacked measures of enforcement, attempts to invoke it when the Japanese invaded Manchuria came to nothing. It was cited by Justice Robert Jackson to justify the Nuremberg trials after World War II. But the agreement was vitiated when Kellogg said that the pact did not impair any nation’s right of self-defense. War departments promptly became departments of defense. Hindsight notes that the pact should have been linked with the minimal structure of world government which the League of Nations provided, and that it would have been more impressive had it been combined with measures for disarmament and with efforts to remedy the historic injustices which led to World War II. But Dr. Morrison always insisted that in principle the outlawing of war was right. Our threatened world could do worse than to renew that commitment made 50 years ago.

As the second major conflict approached, the Century held firmly to a policy of nonintervention until the attack on Pearl Harbor created a new situation. Morrison then bowed to what he called the "unnecessary necessity" of war. But the Century was not caught unprepared. Hutchinson had made the clergy wary of propaganda by reprinting chapters of the book Preachers Present Arms, by Ray Abrams, on the sorry record of the clergy in World War I. Morrison had attended the 1937 Oxford World Conference on Church, Community and State; its theme, "Let the Church Be the Church," was the topic of many a wartime Century editorial. In 1942 American churches held a conference at Delaware, Ohio, on "a just and durable peace." The Century publicized its findings, and produced and distributed widely an 80-page handbook on what should be the shape of the peace at war’s end.

The Century was the first national publication to denounce the violation of the civil rights of Japanese-Americans by their arrest and relocation at the outset of World War II. It also deplored the dropping of atomic bombs on Hiroshima and Nagasaki. As the struggle approached its conclusion, Hutchinson wrote the book From Victory to Peace. Morrison was present and approving in San Francisco when the United Nations was born. Following the war Hutchinson made a journey around the world, surveying for the Century and for Life magazine the wreckage and suffering of a world in ruins.

IV

Throughout his 39-year editorship, Morrison wrestled with theological and ecclesiastical questions. At the University of Chicago he had encountered the instrumentalist version of pragmatism, which had been taught there by John Dewey. But before many years he moved away from a position which he believed led to humanism, though he never ceased to apply the empirical test to any system of thought.

Morrison was always sensitive to the direction of the theological winds; he was aware of the attractions of Barthian and Niebuhrian views but never embraced them. Hutchinson was more responsive to the positions of Reinhold Niebuhr, partly because of Niebuhr’s involvement in social and political affairs and partly because of his deep sense of the tragic role of sin and corruption in history. The Century published a great many articles by Reinhold Niebuhr over a period of some 15 years, until Christianity and Crisis was launched by Niebuhr and associates at about the time America became involved in World War II. The break, which was never complete, was prompted by disagreement over several issues, of which American involvement in the struggle was one.

Toward the end of 1938, Morrison responded editorially to attacks being made by neo-orthodox theologians on liberalism. He noted that liberalism had entered the religious scene after it was established in science, history and politics. "It began its operations," he wrote, "by questioning the truth of certain conceptions held by Christian people -- particularly the literal inerrancy of the Bible, the obscurantist dogmas concerning the origin of the Christian revelation and a cosmological view of the origins of the universe and of man." He observed that the critics of liberalism did not attack it in the name of the literal Bible, or the cosmology of Genesis, or an obscurantist dogma of Christian revelation. Rather, in the name of realism, they held that "human thought is dynamic, not static; that it is a movement, not a position; that Christian theology grows with the growth of life and changes when life presents it with new and unanticipated positions." What was that but liberalism, which had said it first?

The spirit of liberal thought is one of the most precious gifts to modern man which have come out of the intellectual struggle of past centuries. To hold it in disdain, to set it over against realism, to stigmatize it as incompatible with true Christianity, to proclaim its bankruptcy, is hardly less than a wanton act.

Morrison left no doubt that he must still be considered a liberal evangelical Christian.

V

The Century under Morrison had as its deepest and most lasting concern the health and unity of the church. His two most important books -- What Is Christianity?, a revision of his Beecher lectures given at Yale, published in 1940, and The Unfinished Reformation, the Hoover lectures at the University of Chicago, published in 1953 -- dealt mainly with the church. While the earlier volume was cast in a theological context, its main purpose is indicated in the following passage:

The fellowship of the body of Christ is incomparably the most precious thing in Christianity, as it is also its absolute and substantive reality. It is Christianity. To divide the body of Christ is sin, because it divides Him and makes his headship of the body a scandal in the eyes of the world [p. 275].

The unfinished Reformation, as Morrison saw it, was the ecumenical movement’s undertaking to finish the Reformation’s task of uniting the church. He detailed many signs of encouragement that this reformation was in process, among them the historical understanding of the Bible and the growing ecumenical character of Christian thought.

Morrison had no patience with a rootless ecumenism. He lived and died a minister of the Christian Church, as his father had before him. But his loyalty was discriminating. He moved away from the congregationalism of his Disciples heritage toward a more integrated form of church structure which he sometimes referred to as "connectionalism." In the course of time he became ready to accept the historic episcopate. He was always eager to welcome and examine plans of union. The foundation of the United Church of Canada, bringing together Methodists, Congregationalists and Presbyterians, delighted him because it crossed confessional lines. E. Stanley Jones’s proposed "federal union" of churches got a cooler reception. Under that scheme, each denomination was to keep its name and structure, but an entity was to be set over all denominations to which they would adhere. This superstructure was to be called the "United Church." The idea of leaving denominations intact did not appeal. Morrison was a draftsman of the later Greenwich Plan and was disappointed at its failure. He welcomed the Consultation on Church Union and was impressed with its deliberate and thorough theological development.

VI

Morrison’s commitment to the social gospel was personal as well as institutional. During World War I he visited the federal prison at Leavenworth, Kansas, three times in the interest of conscientious objectors. When Harold Gray, the conscientious-objector son of Philip Gray, an attorney for the Ford Motor Company, was transferred from Leavenworth to Alcatraz in San Francisco Bay, Morrison got Judge Henry of Cleveland to intercede with Newton Baker, formerly of Cleveland but then secretary of war, in the young man’s behalf. Baker immediately ordered Gray’s release. In gratitude to Judge Henry, a trustee of Hiram College near Cleveland, the elder Gray provided funds to build Gray Hall on the college campus. The son later wrote Character: Bad, a book on his experiences. Its title was derived from the inscription on his discharge paper. Although Morrison was not a pacifist, he supported the rights of conscientious objectors.

Dr. Morrison was always wary of mysticism, but with Dr. Willett he produced The Daily Altar, a widely used devotional book, and the useful hymnal Hymns of the United Church. In a 1933 book, The Social Gospel and the Christian Cultus, he sought to link social and cultural concepts in worship. He wrote:

We must construct new models, new pageantry, new hymns, new forms of prayer, new anthems of praise, new dramatizations, in which, for example, the labor movement may be caught up in the embrace of religion, the peace movement, the civic conscience, the community spirit, the family and every great aspiration of our time.

That winged vision, rising on the updraft of an earlier time, has valiantly beat its course through fair and stormy skies until now, 70 years after The Christian Century was refounded, it still summons us to move forward and finish the Reformation.