Counting Diamonds (Mark 9:30-37)

Occasionally in the news one hears about an infant that has been abandoned by its parents -- left at a church door, perhaps, or found on a side street somewhere, or even in a garbage can. If the parent or parents are found (usually in our North American context it is the mother), she or they are prosecuted. And we shudder and think, "What sort of heartless person could do a thing like that?"

In the ancient world, however, abandonment of infants was a normal practice, a postnatal method of birth control, and no particular stigma was attached to it. Oedipus is perhaps the most famous example -- the heir to the throne of Thebes was exposed to the elements as a newborn because of the terrible prophecy that he would kill his father and marry his mother. Infants might be abandoned for a number of reasons, including illegitimacy, but usually they were simply the offspring of parents who lacked the resources to feed them.

As John Boswell has shown in a magisterial study, these parents were not monsters; they knew what Tennessee Williams’s character Blanche DuBois knows -- that one can usually rely on "the kindness of strangers" (the title of Boswell’s book). Most abandoned children probably survived, because they were picked up and incorporated into someone’s household. Sometimes the rescuers were infertile couples who had always desired children; sometimes they were people who simply needed an extra hand to help out with the work.

There does not seem to have been any formal ceremony by which abandoned infants became a part of the family. Probably the presumption was that, merely by picking such a child up and taking it home, a person assumed the role of its legal guardian. This informal arrangement mirrored the ritual in which a father would pick up his own child immediately after it was born, thereby acknowledging it as his own and pledging to raise it. If, however, because of some scruple, he refused to lift the child, it would be abandoned.

This Roman custom of raising the newborn infant probably underlies Jesus’ symbolic action in our Gospel text for today. In response to his disciples’ secret argument about which of them is the greatest, Jesus says that the one who wishes to become first must become last of all and the servant of all. As an illustration of the sort of service he is talking about, he places a child in the midst of the circle of disciples -- thus indicating that children, including abandoned children, are to be brought into the Christian community. He then embraces the child, which probably involves picking it up. This would remind most observers of the raising ritual just discussed. Jesus’ actions, then, are symbolic of adoption; abandoned children are to be brought into the church and raised by Christian parents, not in order to exploit their labor potential but because the biblical God is one with a special concern for the poor, the homeless, the weak and the abandoned. Indeed, the early Christians were known throughout the ancient world for their charity, including their treatment of destitute children.

In our passage, Jesus goes beyond simply providing a model of such charity and links acceptance of such abandoned children with acceptance of himself: "Whoever welcomes one such child in my name welcomes me, and whoever welcomes me welcomes not me but the one who sent me." The wailing child in the garbage can is an image of Christ, and the way in which one responds to such a helpless creature is a gauge of one’s response to Jesus. Christ is mysteriously found in the abandoned baby or in the vagabond knocking at the door, as in the folk song, "Tramp on the Street."

Ancient literature, like modern fairy tales, is full of narratives in which gods and other supernatural beings disguise themselves as human beings, sometimes as the lowest of the low, and roam throughout the world to see how people will treat them. As the Epistle to the Hebrews says, "Be not forgetful to entertain strangers: for thereby some have entertained angels unawares" (Heb. 13:2, KJV). In our passage Jesus uses this common folk tale motif to drive home the same point that is expressed in his most powerful parable: "Inasmuch as you have done it to one of the least of these, you have done it to me" (Matt. 25:40).

A student came into my office at a time when I was busy writing. I reluctantly agreed to talk to him, trying not to let my impatience show. My fidgetiness increased when I noticed how long it was taking him to get to the point. Suddenly, however, something about the student got through to me. I realized that he bore an uncanny resemblance in appearance, manner and voice to one of the great leaders of our age. And it came to me in a flash -- this guy could turn out to be the next______! And here’s the next ______sitting in my office, and I can’t even concentrate on what he is saying!

Well, I don’t know if that student will really turn out to be an incarnation of this person -- but does it matter? "Inasmuch as you have done it to one of the least of these, you have done it to me."

Menachem Schneerson, the famous Lubavitcher rabbi from Brooklyn, used to stand every week for hours as thousands of people filed by to receive his blessing or his advice about matters great and small. Once someone asked him how he, who was in his 80s, could stand for so long without seeming to get tired. The rabbi replied, "When you’re counting diamonds you don’t get tired."

The abandoned baby on the street, the stranger at the door, even our own husband or wife or child, is a diamond, and in receiving and treasuring these diamonds we are receiving the "pearl of great price" that was once hidden on earth as a destitute child of uncertain parentage.

The Millstone (Mark 9:38-50)

Eternal punishment. Like it or not, it is a biblical concept, albeit a late-blooming one. In the Old Testament, the afterlife is rarely spoken of, and when it is, it is usually pictured as a shadowy, wraithlike existence.

For the dominant line of thought in the Old Testament, "Sheol [the realm of the dead] cannot thank thee, death cannot praise thee; those who go down to the pit cannot hope for thy faithfulness" (Isa. 38:18). The dead are miserable, insubstantial shades, and it is better to be a living dog than a dead lion (Eccles. 9:4).

Only in the later books of the Old Testament, such as Daniel, do we encounter the idea of a resurrection of the dead at the end of time, "some to everlasting life, and some to shame and everlasting contempt" (Dan. 12:2).

This changed conception probably reflects terrible historical experiences, especially the persecutions and murders of Jews under the Syrian king Antiochus Epiphanes near the beginning of the second century B.C. If pious Jewish men, women and children were being brutally tortured and martyred for their faith, if the wicked were seeming to triumph in this world, where was the balance and justice in the universe? Somehow God had to make everything even, to reward the righteous and punish the wicked. And if he didn’t do so in this life, he would have to do so in the next.

The persecutions of Antiochus thus raised the same question for pious Jews of the second century B.C. that the Holocaust and similar horrors raise for Jews and Christians today. One answer to those questions, and a profound one, is expressed in a prayer offered by Auschwitz survivor Elie Wiesel at the 50th anniversary of the camp’s liberation:

Those who are here remember the nightly marches [into the gas chambers] of children, and more children, and more children. Frightened, quiet. So quiet and so beautiful. If we could see just one of them our heart would break. But did it break the hearts of the murderers? O God, O merciful God, do not have pity on those who did not have mercy on Jewish children.

This powerful prayer against forgiveness is related to the passage from Mark, not only by the general theme of divine judgment, but also by the specific subject of abuse and murder of children: "And whoever offends against one of these little ones … it would be better for him to have a millstone hung around his neck and for him to be cast into the sea."

Are such sentiments substandard, unchristian? It would be difficult to say so when it is Jesus himself who gives voice to them. And shouldn’t we let those who have suffered such terrible abuse be the ones to judge whether or not Wiesel’s words, or Jesus’, are wrongheaded? Who would want to deprive the children of Auschwitz, their relatives, or other survivors of human cruelty, of the powerful, cleansing emotion that goes with the conviction that the destroyers of the innocent will one day be punished?

And yet, although dreams of retribution are sweet and even at times empowering, are they the end of wisdom? Can we be certain that we are not among those who destroy the earth and our fellow human beings, and who will be judged for it? Once, on a bus tour of Egypt, we were led into a "school." It turned out to be a carpet factory where children sat hour after hour before huge looms, weaving lovely rugs to grace the living rooms of Western tourists like ourselves. They were beautiful children who flashed us shy smiles, and their hands flew so rapidly over the looms that we could scarcely see them.

I remember a young woman from the tour, a college student, hugging one of the little girls and weeping -- weeping that this child should have to forfeit her childhood, and her hope for the education that might lift her out of poverty, for the sake of the few dollars she was earning for her family by making rugs for tourists. Somehow, just by visiting, we all felt complicit in the exploitation and destruction of spirit that was going on in that so-called school. And even on the individual level, is it not the sad truth that most child abusers were abused as children themselves? In the justice of God, then, how will such people be judged -- with the punishment befitting abusers or with the compassion befitting the abused?

The calculus of revenge seems too complicated! There must be some other equation, or no hope will remain for any of us. And, indeed, the New Testament seems to hint at another equation when Paul says that God has imprisoned all human beings in disobedience in order that he might have mercy upon all (Rom. 11:32). Does this level the playing field totally, so that all humans become equally guilty of sin, thereby washing away the force of Jesus’ threats against those who abuse and murder children? No. Those threats must still represent a truth, a "word before the last word," as Bonhoeffer puts it -- a word that is terrible for the abusers and terrible for us to the extent that we participate in the "little murders" that punctuate daily life. But they cannot represent the final word, because the same Jesus who in Mark 9 says that it would be better if child abusers had never been born, in Mark 10 points to his own abused body as a sign of hope for all.

The Skewing of America: Disparities in Wealth and Income

The Federal Reserve Board reported some staggering news last summer: one-half of 1 per cent of American families (just 419,590 out of a total of about 87 million households) possess 35 per cent of this country’s privately held wealth. As it turns out, the board had to rescind its study because a major error had been made. The new results are not yet available; they will not be as drastic, though they will show that the gap between the rich and the rest of us -- stable until recently -- has widened.

Private wealth is the value of an individual’s or household’s possessions. It includes real estate, homes, cars, stocks, trusts, savings and retirement accounts. Wealth is not income. Wealth is capital, stored-up economic power; it tends to be stable, and to grow. It is the cushion against the shocks of a mercurial economy. Income, on the other hand, is constituted primarily by wages and salaries (it also includes interest and dividends), and offers only as much economic stability as the job market does -- which is very little.

The wealth concentrated in the hands of the very rich includes assets that confer economic power. In 1972, the top 5 per cent of the population held 66.7 per cent of the corporate stock and 93.6 per cent of state and local bonds. This disparity, which has since widened, suggests that the very rich can control the decisions related to corporate and municipal assets.

The net worth of most Americans does not amount to much. Most families’ forms of wealth are limited to homes, automobiles, furnishings, appliances and checking and savings accounts. For most, the family residence is the principal form of wealth. Except for money that may be raised by second mortgages, residences are not an available asset for most families. Furthermore, the value of a residence is subject to the capriciousness of the housing market. The fragile economic situation of most Americans has been starkly evident in communities where plants have closed or moved away. Many homeowners have been left jobless, with houses that are unsalable. As for savings, the average American has at most $5,000 -- hardly enough to support a family after unemployment benefits run out.

Within the past decade, the median family income has leveled off and begun to decline. In 1979, half of two-parent families had an income of $26,299 or more. Today half earn $24,556 or less. (There is a similar decrease for single-parent families.) The distribution of income among families has also become more unequal in recent years. The share of total income going to the bottom 60 per cent of American families has declined, and the share going to the next 20 per cent has increased negligibly, while the share going to the top 20 per cent has increased by 2.5 per cent.

The major reason for the decline in family income is the decline in real wages: most incomes are from wages, and average earnings have declined since peaking in 1973. Weekly wages have declined by 14.5 per cent, hourly wages by 10.1 per cent. Behind this decline is the loss of middle-income jobs as a result of the rise of the service economy and the failure of the minimum wage to keep pace with inflation.

According to statistics supplied by the Economic Policy Institute, a 25-year-old male worker in 1953 or 1963 could expect to more than double his income over a period of ten years. But in 1973, a 25-year-old male could expect a mere 16 per cent increase in income by 1983. The average male who was 40 in 1973 actually saw a 14 per cent decline in his income over ten years.

Practically all the households with annual incomes of more than $25,000 in 1983 were those of working couples. The traditional middle-class family with one wage-earner is no longer as economically feasible as it once was. Even in two-earner families incomes can stagnate, for women’s wages still remain low relative to men’s (so a woman’s salary does not necessarily double the family income).

All of these figures indicate why there has been a decline in the proportional size of the middle class. Between 1973 and 1985, the proportion of American families with incomes between $20,000 and $50,000 dropped 5 per cent. The sector of families earning less than $20,000 grew 3.4 per cent. The number with incomes greater than $50,000 grew, but by only 1.8 per cent. Meanwhile, poverty has been increasing among low-income persons. In 1978 about 24.5 million Americans lived below the poverty line, but by 1984 the number had climbed to 33.7 million. Moreover, the number of persons who work but are still unable to escape poverty has grown dramatically in recent years. The number of those aged 22 to 64 who work but are still poor has increased by more than 60 per cent since 1978, according to the Center on Budget and Policy Priorities.

Are there any specific forces responsible for this situation? For instance, what is the role of the current administration; is it to blame? The Wall Street Journal put it this way: "The Reagan administration’s policies haven’t narrowed the gap between rich and poor. The tax cuts in the first five years of Mr. Reagan’s presidency favored the wealthy: reducing capital gains taxes benefited those with capital. Government programs aimed exclusively at the poor were trimmed, but others that help a broader group -- Social Security and veterans benefits, for instance -- weren’t" (September 22, 1986).

Though the skewing of America began before President Reagan took office, his policies have markedly accelerated the process. The Congressional Budget Office has indicated that budget and tax changes enacted from 1981 to 1983 have taken $20 billion from those with incomes below $20,000 a year and brought about a $35 billion increase for households with incomes of $80,000 or more.

Why should these facts concern Christians? First of all, because concentrations of wealth have important economic consequences. As was pointed out above, wealth translates into economic power and control through corporate stock and nonincorporated businesses, real estate and state and local bonds. Those who control these assets control how they are used and developed. The investment decisions of private companies, for example -- decisions that affect how and where people work and what and how much they produce -- are made in offices far removed from those workers or the general public. Those with wealth, who are closely tied to the banking community, govern capital accumulation and, with it, decisions about employment technology, income distribution, work organization, consumption patterns, and even our relations with other nations.

Consider the development of Harbor Place, a renovated waterfront area in Washington, D.C. All the decisions about how it should be built, what kind of establishment it should be, and to whom the site should cater were made almost entirely by the developers. Local residents worried about the parking problems the new 1,000-seat restaurant will create were not consulted. In too many cases, decisions about which plants should be developed, which should be allowed to deteriorate, and which should be moved in pursuit of lower wages are made by corporate stockholders and their managers. Whole cities have been held hostage to large businesses, which threaten to withdraw if their demands are not met.

The concentration of wealth puts control of the not-really-free market in the hands of a relatively few families. Decisions about the quality of the food we eat, the goods we buy, the air we breathe, the prices we pay, the way work is divided and jobs defined, the kinds of transportation, recreation and entertainment available, the opinions and values that are supposed to be important, the kind of treatment we get in schools, clinics and hospitals -- these are made by the people who control the economic resources.

The second reason for Christians to be concerned is that economic control also means political control. Thomas Jefferson feared that "an aristocracy of wealth [is] of more harm and danger than benefit to society." Those who control economic resources have the potential to control legislative decisions. Government is beholden to those who control the purse strings, and not only because of the dollars needed to run political campaigns. One of the functions of government is to ensure the stable operation of the economy, and to do so it must often defer to the wishes of those who control the means of production. A government that limited or disrupted economic growth, risking thereby the displeasure of the economic overlords, would also be putting the coherence of its polity at risk.

Finally, the concentration of wealth should be a concern because it is a cause of poverty. There is a tendency for Americans who think about the problem at all to consider poverty a problem of individuals. Individuals are poor because they lack certain social skills, or they lack intelligence or virtue, or they are inadequately educated, or have incongruous cultural values or have too many children. From this perspective, the solution to poverty is simply to let poor people feel the spur of their poverty, which should motivate them to alter their behavior. Usually those who urge the motivational value of privation are not themselves poor. When they get hungry, they go to the nearest restaurant. Nothing is more ludicrous than such criticism of the ill fed, ill housed and ill clothed by the well fed, well housed and well clothed.

Poverty ought to be viewed instead as an aspect of inequality, a facet of the maldistribution of wealth and income. It is the inevitable social byproduct of the concentration of wealth in the hands of a few. From this perspective, one does not pay primary attention to individuals, or propose programs aimed at correcting the deficiencies of individuals. For if inequality is the cause of poverty, the proper question is not "What is it about individuals that makes them poor?" but "What is it about the nature of the economy that creates poverty?"

The case can be made that the current economic situation derives from our particular brand of American individualism. Individualism is healthy if it means the freedom to choose moral, political and cultural alternatives within a community of mutual concern. However, individualism has come to mean a solipsistic, acquisitive consumerism. Americans try to get what they can for themselves, by themselves, without being too troubled by the needs and problems of others. One works for oneself and leaves the rest of society to the workings of the "invisible hand." Indeed, according to this ideology it is by working for one’s own advancement, without regard for other individuals or for the polity, that one does the greatest good for society.

What is lacking here is a true sense of the common good. Were programs and policies shaped by the obligation to contribute actively to the common good, the situation would be altered. Social responsibility, in this context, means being aware of and concerned about the impact of one’s actions on others. If people saw themselves obligated to ensure that everyone has the essentials of life’s opportunities, then the economic and political institutions of society would be structured differently.

Gross inequality is a direct contradiction of the will of God, the creator and sovereign over all nations and peoples. Creation is God’s gift to all. We are collaborators in creation, our task being to shape the social and economic order according to God’s intentions. To allow the appropriation of the world’s resources by a small minority of persons betrays the gift of creation and the giver of the gift.

The prophets railed against the accumulation of land, and Israel’s law provided for the Year of Jubilee, when land was returned to its original owners. The covenant established a community of mutual responsibility and care. Jesus, in keeping with this tradition, had a special concern for the poor, and opposed the accumulation of possessions. He disdained the type of economic inequality which we are now witnessing. "A man’s life does not consist in the abundance of his possessions," he declared (Luke 12:15), and he called the rich man who builds larger and larger barns in which to store his grain and other goods a fool, cautioning his listeners that "he who lays up treasure for himself . . . is not rich toward God" (12:16-21). As biblical scholar Richard J. Cassidy points out in Jesus, Politics and Society (Orbis, 1978), that parable focuses on those who already have enough for their needs. The landowner does not seem to have acquired his wealth unfairly or dishonestly; nevertheless, because he held on to more possessions than he needed, because he possessed wealth without regard to the covenant community, he was declared a fool in the sight of God.

In the Bible, Walter Brueggemann says, justice means "to sort out what belongs to whom, and return it to them" ("Voices of the Night -- Against Justice," To Act Justly, Love Tenderly, Walk Humbly [Paulist, 1986], p. 5). There is a right distribution of goods and of access to the source of life. People have entitlements which must be respected. All this is implied in the great biblical vision of covenant and shalom, in which every family, clan, tribe and person has a place in which their God-given dignity is respected. This vision is not a romantic ideal; it calls for the transformation of unjust social systems. It is the obligation of churches to bring this biblical vision to bear on our complex economic and social world. They attempt this at some risk, of course, for it is not an easy message, and too many of the churches and those in the churches are dependent on the holders of wealth for their own economic well-being. But if the problem is not addressed by the ambassadors of the gospel, who will do so?

Going Home to Israel

This essay marks for me the end of Galut, of exile, and the realization of my half-century dream of returning to the land of my ancestors. It is an essay, too, about new beginnings. For although I have visited Israel almost every year for the past 20 years and have very little to learn about it, knowing Israeli life in one’s head is not the same as internalizing it on a daily basis. That kind of knowing comes only when one has moved in permanently, unpacked, and has no return ticket.

It’s not for nothing that one of the first phrases they teach in Hebrew language class is kol chatchalah kasha -- all beginnings are difficult. New beginnings, no matter how much desired, dreamed of, worked toward, are difficult. I’ve found, for example, that despite the lip-service paid to Israel’s need for Western aliyah -- the immigration of Jews from the West -- many native-born Israelis resent these newcomers almost as much as they do Palestinians. In the U.S. I always thought of myself as a refugee; now I’m an "American" or, ludicrous as it may seem, an "Anglo-Saxon."

More serious, and more painful, is my discovery that the values of progressive Judaism and Zionism to which I am committed have been deeply eroded in the past decade by forces over which we seem to have little control.

The influx of cheap Arab labor from the occupied territories has undermined one of the most basic concepts of mainstream Zionism, avodah ivrit -- the sacredness of Jewish labor, the idea that Jews must stop being middlemen, as they were so often forced to be in the Diaspora, and do their own dirty work. While it is undoubtedly true that the wages of Palestinians from the territories who work in construction and low-level hotel jobs raise the standard of living in their communities -- a fact frequently touted by those Israelis who support the status quo -- nevertheless Israel was a healthier place, a more Zionist and progressive nation, when we drained the swamps, built the roads and the houses and waited on our own tables.

Israel’s ever-increasing dependency on the U.S. has also produced unpleasant, even dangerous, side effects. The impact of American consumer culture on the developing world may be nowhere more evident than here. Israelis who don’t necessarily want to live in the land where everyone (it is assumed) has a beautiful home with the latest electronic gadgets, a swimming pool and fast car do their best to clone that way of life. The young would like at least to visit the U.S., not because they are impressed by its democracy and First Amendment, but because they are tantalized by designer jeans and discos.

This situation leads to a third major Israeli problem: the growth of fundamentalism. Like the spread of American culture, this phenomenon is scarcely unique to Israel. Fundamentalism is what many people turn to when they are frightened of challenges to existing values and cherished beliefs, and when alternative leadership falters. In Israel there is a growing drug and alcohol problem, and many young people have embraced a hedonistic lifestyle. A nation in which the youth are regarded not only as the future but as one symbol for a million Jewish children lost in the Holocaust, a nation traditionally proud of the idealism, scholarship and selflessness of the young, fears that those youth are being seduced by alien values. The teen-age suicide rate is up, stabbings occur in discos, and it is no longer safe for young women to hitchhike -- quite apart from the threat of terrorism. It is not surprising, then, that the power of the Orthodox and ultra-Orthodox religious minority is growing.

Israel’s convoluted electoral system has always given the small religious parties great leverage. And the secular Jews who founded modern Israel granted the Orthodox the right to control matters of personal life (marriage and divorce, for example) not only for pragmatic reasons (they wanted the political support of this group) but also because they realized that many, if not most, victims of the Holocaust came from its ranks. Nor did even the most militant secularists fail to recognize that without the religious fervor that looked forward to celebrating Passover "next year in Jerusalem," modern political Zionism might never have been born. On this question, what Shlomo Avinery, the Hebrew University scholar and former director general of Israel’s foreign ministry, said of himself is unfortunately true for many Israelis: "The synagogue I don’t attend is Orthodox." Progressive Judaism -- as represented in the U.S. by Reform, Conservative or Reconstructionist streams -- has never really established itself here, although Reform and Conservative groups are now making a greater effort to do so.

The indifference of those who don’t care much for religion and the passion of those who care desperately, fanatically, have together created a climate that would make our national mothers and fathers -- the Ben Gurions no less than the first Rav Kook -- weep. This summer, after dozens of bus shelters were burned by ultra-Orthodox yeshiva students to protest advertisements featuring women in suggestive dress and posture, a synagogue near Tel Aviv was vandalized in what appeared to be an act of retaliation by ultra-secular Jewish youth. Whole neighborhoods are torn apart over the issue of blockbusting. Orthodox families, who have or will have many more children than secular families, buy up all available apartments and then pressure their less Orthodox neighbors into conformity or flight.

This past Yom Kippur, to which I had looked forward for 17 years -- so that I could worship with my daughter, who has been living here -- turned out to be a bitter day. It presented me with a miserable choice. In the neighborhood I now share with my daughter and her family, the only worship service available was Orthodox -- though I tried to find enough people who wished to pray together in an egalitarian, progressive service. So my daughter, unwilling to sit in a separate section for women, refused to go. In principle, I agreed with her decision. But how could I be in Jerusalem and not say Kaddish, the prayer for the departed, for my father and my Nazi-decimated family? After struggling between my desire to coax my daughter to accompany me and my shame at even considering asking her to go against all that I have taught her about the right of women to be equal in every area of life, I went by myself. If I have ever romanticized Orthodoxy -- and I have -- all romantic notions died forever as I sat not behind the traditional mehitza (dividing curtain) or balcony of my childhood, but in a separate room, forbidden to touch the Torah, excluded from anything but being a distant spectator.

On the other hand, I could have found myself in the Jerusalem Reform congregation where a local Orthodox rabbi and some of his followers broke into the service, tried to grab the Torah scrolls being carried by men and women, and suggested that the women were "undressed whores" (some wore sleeveless dresses) and the synagogue was "a house of prostitution."

For me it boils down to wanting, at the simplest gut level, to spend the years left to me in a land where I’m not in the minority, a land where I don’t have to listen to Christmas carols from Thanksgiving to Christmas.

One person who showed deep insight into my move, herself the American-born daughter of Jews who fled Germany, is a staunch believer in American capitalism who does not share my socialist vision or my belief in the coming of the Messiah and in the possibility that the arrival of this person (or the messianic time) depends on Israel or its message to the world. Nevertheless, she recently demonstrated her understanding when she wrote to me, "Happy birthday -- your first in your third country (but first homeland)." She knows that I love what America stands for, but that for all the years I lived in it I never felt of it. She knows that among the first words I heard in English were, "You dirty Jew, why don’t you go back where you came from," and that throughout my life in the U.S. I kept running into one version or another of that sentiment.

I will continue to work for peace and justice here, as I did in the U.S., but it will be work among Jews and Arabs, and among Jews who are divided by theological or ethnic origins, or by attitudes toward women. And unlike the various movements in which I worked over the years in America, my sisters and brothers, when they have enough of me or forget themselves, at least won’t turn on me by calling me "dirty Jew." Here, yesterday’s comrade can’t tell me to go back where I came from -- to Waldheim’s Vienna, soaked in Jewish blood -- because I am back, back to the beginnings of my people, my land. It is a land that I, along with members of Israel’s large peace movement, believe must be shared. But within its borders -- which many of us are willing to see shrink for the sake of peace -- it is ours in perpetuity.

The headlines here are full of unpleasant news. Is it true that Jacques Chirac and Helmut Kohl both believe that the plot to blow up an El Al airliner (on which the now-convicted Arab named Hindawi sent a small, deadly bomb along with a woman carrying his unborn child) was really masterminded by Israel? Can it be that the Israelis arrested in the U.S. for selling arms to Iran did so at the behest of the U.S. government? Has it turned out that a young man termed unstable by some Western newsmen who spoke with him in Australia, before he sold the London Times the pictures and story of Israeli nuclear capability -- and who is now on trial -- did what he did because of his radical politics, his conversion to Christianity, or his desire for money? How would the U.S. treat one of its citizens who did something similar?

Nations and revolutionary movements that start out with the highest moral purposes, and are guided by the aspiration to build a society based on justice and equality, have only a short time before their age of innocence ends. However magnificent its dream, a nation must be judged, like an individual, on the means it uses, not just the ends it desires. Israel’s age of innocence is over. There are those who do not hesitate to exploit our deepest fears and our deepest loyalties. Some Israelis wonder if post-1967 Israel has truly followed the only path to survival. History books are filled with examples of nations and revolutionary movements which fell from grace in the struggle to survive. This then is our dilemma: Can we achieve both physical and ethical survival? Can we, in rejecting 2,000 years of powerlessness, remain (or become) that prophesied nation of priests, that light unto the other nations, a Jewish nation in the best and not the narrowest, tribalistic sense? Whatever the outcome, I am now part of the search for the answers to those questions.

Hospice: Caring at Life’s Edge

When I was just beginning to do volunteer work at Cabrini Hospice in Manhattan, I visited an emaciated, dispirited man who was dying of cancer of the larynx. It was all but impossible for him to speak. The few words he did manage emerged in a whisper that seemed to come from the elaborately bandaged crater by his throat. A nurse stopped by and noticed his sluggishness. "Do you want us to give you less methadone, Mr. DeVoe? Or should we continue with the dosage that we have been giving you?" Turning away from her, Mr. DeVoe remained silent for a while. But the nurse hung on to the question until she had his answer.

The primary aim of hospice is to help patients die with dignity. And one of the ways that is achieved is by allowing patients, whenever possible, to make choices about their treatment. More methadone or less methadone? Radiation treatment or no radiation treatment? The patients at Cabrini are not led to their deaths blindfolded.

Like most of America’s approximately 1,000 hospice programs, Cabrini’s is based on home care, with a 15-bed in-patient unit to care for crisis cases. Cabrini is one of New York’s largest hospices, capable of handling up to 65 patients at a time. It is run by laypeople under the auspices of the Missionary Sisters of the Sacred Heart. To be admitted to the program one must be diagnosed as having six months or less to live. Medicare covers most patients, private insurance the rest. At Cabrini, life-prolonging devices such as respirators and naso-gastric tubes are dispensed with. Cancer patients predominate, along with a growing number of AIDS victims.

Hospice has succeeded in expanding and energizing the often narrow, airless world of the crisis-stricken family. Cabrini provides a social worker, nurses, a doctor and volunteers. ("Hospice may be one of the last places left where a doctor still makes house calls," maintains Barbara Rice, Cabrini’s director of volunteers.) There are also weekly support meetings for all primary caregivers.

Sister Loretta Palamara, Cabrini’s director of pastoral care, who has been with the hospice since its founding in 1980, tells endless stories of her relationships with patients, their families and their friends. Once, at three in the morning, a dying man summoned her to his bedside and told her to have his two sisters and his daughter come to him. The women came, stood around, said nothing, then withdrew to the lounge. "You see that," the patient remarked to Sister Loretta, "no one spoke." "But Jack," she said, "it’s four in the morning. What do you expect?" Jack would not be placated. "You don’t understand. They never speak. You must go and tell them they have to forgive each other for what happened 17 years ago." The sister did as she was told. The women broke into sobs and embraced one another. Jack died the next day. Sister Loretta never learned the nature of the wound she had helped to heal.

Most volunteers’ involvements with families are not that dramatic. Volunteers bring supplies and listen to patients’ complaints, fears and family legends. One volunteer simply played Scrabble with his patient week after week.

The families that choose hospice are unusual in a culture that banishes the dying from the world of familiar faces, furniture and kitchen smells, and entrusts them instead to hospitals and nursing homes, to the wilderness of pills and medical gadgets. The families that decide to care for their terminally ill loved ones at home are motivated by an uneasy mixture of love and guilt. Courageously, they learn to change diapers, give injections and endure the chastening smells and sights of the dying. They have resurrected in this country the mislaid ritual of communion at the rim of extinction.

But hospice is not only for patients who have family and friends to care for them. It also helps patients to die alone in their own homes. All that is needed is a volunteer willing to provide care.

I have cared for two patients who died alone. One was an old woman named Ruth Levy. To get to Ruth, one had to pass through two doors -- the door of her home and the door of her solitude. Of the two, the latter was often the harder to penetrate. There were days when she could barely bring herself to acknowledge my arrival. Death, which looked out at me from the crags and caverns of her face, went forever unmentioned by her, as if it could be made to take offense and go away.

In contrast, Fred Smith was able to face death squarely. He once said, "Every night I pray, ‘God, I thank you for this day, but please see to it that I don’t have to live through another one like it.’" Unlike Ruth Levy’s lung cancer, which caused her no physical pain, Mr. Smith’s tumor pressed against his spine and kept him in agony much of the time. While Ruth could still court denial, Fred had seen all his illusions burned away by pain.

Failure is woven into every success at hospice -- that is the terrible paradox of it. The friendships that for a time bridge the void are consumed by the void. Hospice teaches detachment.

In We Die Before We Live, his book about St. Rose’s Home for terminally ill cancer patients, Daniel Berrigan writes: "I am beginning to sense it; you have to be in good form spiritually to work here." At Cabrini, as at St. Rose’s, not everyone is in good spiritual form, but it helps if you are. All around you is the question of suffering, the questions of life and death and God and afterlife. And nowhere are there any answers.

"Volunteers," says Barbara Rice, "find it a good atmosphere in which to practice self-reflection."

Choosing the Impossible: Seminary Students Speak Out

The seminaries that train future clergy, and the churches that employ them, dominate most discussions about theological education. Too often missing from the analysis of postgraduate preparation for ministry are the students themselves. What do they think about their experiences and their future?

To find some answers, The Christian Century invited students from ten Chicago-area seminaries to our offices for an intensive three-hour roundtable discussion. We asked them to engage in informal conversation, prodded by a few questions, to provide our readers with insight on how today’s students view theological education. The students, nine men and eight women, suggested by the deans of their respective schools, represented not only denominational diversity but a variety of backgrounds and work experiences -- reflecting the general increase in the number of older and second-career students currently attending seminary. Most of these students had returned to school after some significant work experience -- in such diverse fields as teaching, politics, law, securities, real estate and graphic design.

Several students confided that money was the main question for them in deciding whether or not they could indeed attend seminary. In some cases, the student’s home church has helped foot the tuition bill; in others, the seminaries have offered financial aid. But almost all the students agreed that they still had serious difficulties making ends meet, and that they were amassing sizable debts. One student pointed out that a minister’s debts become, in effect, a burden on churches, which must try to pay pastors a salary that will allow them eventually to retire those debts. Another woman remarked that as a female she could expect no financial support from her district, whose hierarchy does not support female ordination (though this certainly would not be the case in all of her denomination’s districts, nor was it true for all females present).

The Catholic representatives noted that their church has its own version of this problem. For members of religious orders, seminary training is entirely paid for by the order. The church does not, however, provide similar support for laypeople, especially women. This "two-track" system discourages the participation of many talented individuals. Not only must laypeople finance their own education, but their service to the church is often questioned and challenged, despite the dramatic decline in numbers of men entering the Catholic priesthood.

Other students placed the issue of expenses in a larger framework. According to one, the cost of seminary generally discourages the emergence of religious leaders from the lower economic levels of society, thus enforcing a kind of caste system in church leadership. "I don’t find people emerging from relatively impoverished backgrounds to pursue seminary training," he noted.

One student who reported that on graduating he will be $10,000 in debt suggested that the responsibility for educating pastors should rest with the churches, not with individual students. He regarded his own training as the property of the church, he said, and therefore thought it appropriate that the church finance that education.

Some denominations are attempting to find ways to convince local churches to support theological education. However, several students pointed out that the prospect of churches playing a greater role in finances is problematic: it would mean that churches would also be able to define more specifically the nature of seminary education. The freedom for intellectual and spiritual exploration -- and the makeup of seminary communities -- might well be limited if the seminaries were more closely controlled by the churches. One student remarked that if it had been up to her local congregation to choose and support seminary candidates, they probably "wouldn’t have sent me."

A student from one of the more conservative evangelical schools objected to distinguishing so firmly between the person and the church, saying that one attends seminary in order to serve the church. A mainline student added that he thought it entirely appropriate that he be held responsible to the church for his education, since in a way his career would be a gift to the church.

These points brought the conversation around to that perennial concern of theological education mentioned earlier: the gap between church and seminary and how to bridge it. "One of my biggest fears," admitted one seminarian, is that of "getting lost" in the parish; i.e., not knowing how to apply the theoretical knowledge he has accumulated in school. Another observed that amid all the requirements of the curriculum, there is only a minimal opportunity to take "practical" courses. "Our education is so segmented," commented a third. "Where is the integration" necessary to a career in the parish?

Though most of the students agreed that integration is a problem, one advanced student mentioned that she was discovering it through field education work; "I just wish I could have more of it," she lamented. Reflecting on his experience of attending seminary after first gaining considerable experience in the parish, one older participant wondered if maybe "we’re doing it backwards"; in other words, perhaps schools ought somehow to require practical experience before -- or at the beginning of -- formal education (such an arrangement would, of course, run counter to essentially all currently respected educational theories). For himself, he said, the practical application of what was being taught in seminary was plain in light of his experience of parish ministry.

However, another participant pointed out, there isn’t time "in any program" to learn all the things that need to be learned; many skills can -- indeed, must -- be learned later. Others agreed that the goals of a seminary education were "impossible." In the face of this challenge, remarked a member of the group, it is actually "consoling to realize that we can’t know everything." One student called attention to the fact that in the Soviet Union seminary students attend school for eight years; but even then it is doubtful whether there is enough time truly to "complete" a course of study. Most educational programs are organized on the principle that schooling is an introduction to a topic; graduate education is an opportunity to delve more deeply into a specialty, but one is expected to continue learning after that as well. (Commentators such as theologian and culture critic Joseph A. Sittler have complained that pastors often cease their education with the completion of seminary.)

Seminarians -- and seminary graduates -- must "be aware of how much we don’t know," emphasized one discussant. For example, he said, it is absurd for a young celibate Catholic priest to conduct marriage counseling for a middle-aged couple with children. The priest at least needs to know when he has to turn to other professionals for help.

The demand for seminaries to integrate theological education with practical skills for parish ministry is to some extent misconceived, demurred one speaker. To begin with, he said, seminaries generally prepare people for a broad range of ministries in the public and private sectors, not just for parish ministry. Furthermore, he argued, the point of theological education is not necessarily to acquire practical skills. He noted that his own training in law school had not been designed to teach him how to file a brief or deal with a client, but rather to teach him to think like a lawyer. Similarly, the point of theological education should be to learn how to think theologically. Having learned that, one is prepared for a variety of ministries.

In agreement, another participant suggested that "we put too high an expectation on the seminary if we expect it to provide the integration" between theological study and parish ministry. That integration comes from one’s own effort as the end result of years of study and practice. However, this view did not negate the importance of outside support and guidance for seminarians attempting to find the synthesis. For example, one woman reported that she had been aided in integrating academic study with ministry through the support of a sponsor in her church who regularly monitors her progress. Similarly, another of those present related that he has been pressed -- to his benefit -- by his seminary community and the parish he works in to account for the relations among the various activities in which he is engaged.

Responding further to the theme of integration, a participant acknowledged that he was surprised by the "lack of passion for the gospel in seminary." If people have that passion, he suggested, then they will be stimulated to achieve the integration of knowledge and action that seems so elusive. While virtually all present agreed with this general point, they applied it to a variety of specific issues. For example, complained one mainline seminarian, his school does not require its graduates to learn Greek and Hebrew. "That shows a lack of passion for the gospel from an academic standpoint."

This claim generated a lively discussion, with a number of disagreements being expressed. A representative of a particularly socially active denomination postulated that Spanish is probably a more important language to know today than Greek; how can one propose to do ministry in the U.S. currently without a knowledge of Spanish? The practical demands of ministry require such changes in the required curriculum, he maintained. In opposition, speaking as a student of Greek and Hebrew, a participant insisted that these languages are a foundational study for all theological education, and seminary is, for her, an opportunity to gain a solid grounding in God’s Word.

Several students, though acknowledging that there should be a core seminary curriculum, argued that the content of that core should change as society changes. For example, feminist theology is an important part of theological education now, though it wasn’t 15 years ago and perhaps will not need to be 15 years from now.

In general, discussants put all of the above issues into the context of one overall question: What is the purpose of a seminary education? Some students emphasized again that they are attending seminary not primarily to gain skills for parish ministry, but for personal enrichment (and some do not necessarily plan careers in religious ministry).

There seems today to be an increasing diversity of motives bringing people to seminary, which is one reason it is difficult -- perhaps more so than ever -- to define the goal of a seminary degree. The diversity of individual pursuits also calls into question the issue that opened the conversation: What is the churches’ financial responsibility to provide seminary training for their leaders? Churches could hardly be expected, one participant maintained, to fund those students who are in seminary more as part of a personal search than in training to become future leaders.

The multipurpose nature of theological education, and the pluralism of the seminary community itself, clearly make the task of such training a difficult one, indeed an "impossible" one, as suggested earlier. But the liveliness of the seminary students’ discussion, and the urgency which characterized their grappling with the dilemmas of diversity -- theological, practical and personal -- indicate that seminaries are nevertheless training people who are, despite all obstacles, eager to achieve the impossible.

Seizing the Moment for Teaching Pastoral Care

Pastoral theology (like all seminary courses) cannot be taught as a purely academic discipline. Being more a skill than a system of facts and ideas, pastoral care cannot be mastered merely through reading books and hearing lectures. To learn how to minister effectively, prospective pastors must first experience pastoral care -- and it can be experienced in the seminary.

Seminary professors can teach pastoral care not only in the pastoral theology classroom, but in other seminary situations as well. The opportunities arise spontaneously, often at surprising times. If the teacher takes advantage of these occasions when and as they arise, students can learn by experience how to help people cope with life’s inevitable traumas.

During my years as a seminary instructor I have often observed that the theological classroom is bedeviled with rivalries, put-downs, hatreds, suspicions, anxieties and identity crises, not to mention hard-line, inflexible prejudices. Teachers shouldn’t allow students to conceal that reality behind a façade of highly graded essays, dissertations and examinations. Even fieldwork can seem to be competently performed and yet disguise many negative attitudes or habits. In assignments, students can write acceptable arguments that belie their basic behavior and attitudes. I soon realized that I needed to discover why a student was angry or depressed, why she was fearful and aggressive or why he went into a tailspin when crossed or interrupted in a discussion.

As a visiting professor at another seminary, I once taught a pastoral-care course using contemporary novels as the texts. Through novels and plays, human situations can be imported into the classroom by means of the narrative and the character presentation. One student chose to present to the class Peter De Vries’s novel The Blood of the Lamb. As he described the sickness and death of one of the characters, a 12-year-old child, and her father’s profound grief, the student broke down. We did not attempt to pass over this. Other students disclosed the grief they had felt at the deaths of family members or friends or the breakup of partnerships. Some were describing feelings they had never before confessed. The students were giving and receiving grief ministry.

I let the conversation flow naturally. Some of it was addressed to me and some was expressed in small groups. After a time, I moved around among the class and talked with individuals or with twos or threes. This development could not have been structured or preplanned; I had to let it roll on until it completed itself. Later the class quickly agreed to keep the conversations confidential. That evening I called at two or three apartments where I realized that further discussion was needed. These were actually pastoral calls. Looking back I realize that that class of about 20 students became the most intentionally caring class I have ever led.

Another day, while working through Angus Wilson’s Late Call, an excellently constructed novel about three generations living under one roof, the presenting student vented the anger she felt toward her own father. Again, this triggered a chain reaction from half a dozen other students. One exclaimed, "I never thought I would be allowed to talk about a thing like that!" I suspect that one of the students was acknowledging her victimization in a long-suppressed case of incest. She was obviously dealing with something she had not been able to handle before. She later sought counseling from an appropriate source.

Not all the class sessions reached that dramatic level. But I was learning in a new way how near to the surface lie guilt and anxiety. I believe our discussions taught the students how simple it is to provoke these reactions -- in preaching, counseling or other situations.

To be effective pastors, seminarians also need to learn that they need not be invulnerable pillars of theological certainty. Some students, influenced perhaps by their home churches’ expectations, perceive seminary as a kind of finishing school. The students are supposed to have a definite commitment, a firm faith and an identifiable personal development. To this they simply need to add biblical and theological knowledge along with professional and pastoral skills.

Seminaries should be prepared to help students grapple with searching, doubting and moral dilemmas. The new ideas they encounter at seminary may have brought their earlier faith understanding to an impasse.

In another course, where we were discussing ministry to the dying and the bereaved, I had asked each student to prepare a brief position paper on what theology she considered appropriate to such ministry. One student presented a blank sheet.

"I have nothing to say," he said. "At this point I am not sure that I believe in God. I could only succeed in ‘being present,’ as you would put it, and help the person to die peacefully or to face grief calmly -- or whatever."

I replied, "You would do that better than anyone I know! If you did survive in the ministry -- which in your present state of belief is unlikely -- I would welcome you as my minister at the time of my own death."

He looked very surprised at my response. I told him he appeared to be the kind of person who knew how to be present.

He stopped by my office a couple of days later and said that my remark in class and my affirmation of him had made him want to grow theologically. "It is like you opened a door and invited me to walk through it."

I have helped develop pastoral groups in seminaries. These are known by many names, but I am not impressed by most of them. No matter how sophisticated their curriculum is, there are special moments in the pastoral learning process for which teachers cannot plan. These moments are unpredictable. The whole theological education process must attempt to catch those moments as they come. Seminarians still need an academic environment of lectures, tutorials and seminars in all subjects, including pastoral care. But in no classroom should any teacher be exempt from the responsibility of pastoral formation.

Many schools require their students to do fieldwork in a church. This provides valuable practical training. But the classroom need not be impractical. The seminary is also a field which develops and hones the student’s skills of sharing, caring, spiritual nurture and personal evaluation. This kind of development cannot be scheduled directly into a curriculum. Instead, all seminary professors, of any subject, should sharpen their knowledge of pastoral education in order to be prepared for opportunities as they arise. Some of the seminary’s most valuable lessons are "caught" rather than "taught."

Rethinking Hunger in America: Adapting the Sullivan Principles

A few years ago I visited the pastor of an Episcopal church in an economically mixed neighborhood. To reach his office, I had to pass "emergency shelter" signs, piles of donated clothes, a soup kitchen and a few small mountains of donated food. Once inside his office, I congratulated him for all the activity, and suggested he rename his church the "Emergency Episcopal Church."

"Don’t you be fooled," came his response. "If I wanted to be sure that all the good people did nothing about social justice, I’d just put a thousand hungry people on their doorstep. Those good churchpeople would immediately start to organize people to donate the food, to collect and stack the food, to transport the food, and, later, to cook and serve the food. They’d even organize the people to clean up afterwards.

"They’d be so busy moving all that food from one end of town to the other, they wouldn’t have any time or energy left to do anything about why all those people are hungry in the first place."

He was right. So much energy is consumed in moving the cans, and so little energy in doing anything about why so many people are hungry and poor, that hunger has become a fixture in one of the wealthiest countries on earth. And that is not acceptable. It’s time to stop asking for more soup kitchens and pantries, and to call ourselves and our neighbors to account for actions that cause -- and tolerate -- so much avoidable hunger.

Such accountability will require three things: a better understanding of the nature of poverty in our society; a way to address the anomaly of poverty among working people and their families; and a reminder from recent history concerning how to do something quickly about hunger in America.

In the late 1980s, poverty is taking a new form: it is younger, poorer and more likely to be employed.

Younger. Poverty among the elderly has been cut by two-thirds in the past quarter-century. Today children account for 40 per cent of America’s poor. We are the only industrialized nation for which children are the largest group in poverty.

Poorer. A family of three falls below the official poverty line if its income is less than $8,600 per year. Roughly four in ten of America’s poor have incomes below half the official poverty line. And the poverty "gap" -- the amount needed to reach the poverty line -- is growing.

More likely to be employed. Full-time, full-year work at the minimum wage yields an annual income of just $6,968 per year -- which is below the poverty line for any but a single-person household. Two million full-time workers -- many of them with families -- can be found in poverty; 60 per cent of poor families include workers. One of the growing poverty groups is the two-parent family with young children, in which one or both of the parents works full time.

It is little wonder, therefore, that shelters for the homeless are serving growing numbers of working people, including families with young children; that soup kitchens and food pantries fill up at the end of every month with working people whose wages just don’t last all month; and that three-fourths of the 35 to 40 million Americans without health coverage can be found in the household of a worker.

The Sullivan Principles, the code of conduct that the Rev. Leon Sullivan drew up in 1977 for American companies doing business in South Africa, offered for more than a decade a litmus test of basic decency and responsible employer behavior. Anyone could check on whether or not a company was adhering to the Sullivan Principles. Many groups conditioned their decisions about divestiture and other forms of economic pressure on whether businesses had adopted these principles.

The time has come to develop a variation of the Sullivan Principles that would hold American businesses accountable for their treatment of American workers.

Though conditions in this country are not the same as in South Africa, poverty and hunger do violence to the human spirit, and that violence -- particularly when it is a consequence of public policies and publicly acknowledged actions -- should be opposed with the same ferocious indignation that prompted the shantytowns and divestiture battles directed toward South Africa and the American businesses located there. Just as we hold people thousands of miles away accountable for their actions, now it is time to hold ourselves accountable for actions that cause so much of the poverty and hunger in our own communities. Something is fundamentally wrong when so many working people in a country as rich as ours can’t provide food for their children, health care when they’re sick, or a roof over their heads.

Yet efforts to raise basic wages meet stiff opposition from the business community. Equally strong resistance greets alternative structures that would make life on low wages more reasonable: subsidized childcare and housing, expanded and improved food stamp programs or tax-supported health insurance for low-earning working people.

An American version of the Sullivan Principles would address this resistance by outlining four or five basic tenets that congregations and individuals could apply to local businesses to determine whether they protect their workers against hunger, homelessness and poor health, or contribute to those problems. We need to hold ourselves accountable, and, if necessary, press for public and private policies to bring an end to poverty and hunger.

Just as the Sullivan Principles were not limited to what is required by local law, but rather to what would be needed to meet the standards of common decency and responsible community behavior, we might ask whether employers

  • pay wages at least equal to the poverty line;

  • provide childcare subsidies for working parents;

  • provide health coverage for all employees (those working less than full-time could get pro-rated benefits, 20 or 30 hours’ worth of benefits for 20 or 30 hours of work).

Since 1977 the Sullivan Principles have been "amplified" to ask whether a business used its influence for good in the community by, for example, working for "provisions for adequate housing." Businesses in our communities might be asked the same question. At a minimum, paying less than poverty-level wages and denying basic health and pension coverage should be grounds for community opprobrium.

Better wages or the supports to make low wages livable will not make hunger disappear, any more than the Sullivan Principles could make apartheid disappear. And when Sullivan announced recently that he would abandon his code, it was because he felt that sterner measures are now needed in South Africa. But the principles played an important role, and they are widely acknowledged as an effective device for educating and arousing a complacent national conscience. A version of the Sullivan Principles for this country could do the same with regard to hunger. In the process, the lives of working people would be changed significantly. And emergency food aid would no longer be expected -- as is true now -- to compensate for inadequate wages.

Ending hunger now means changing the way we think about our responsibilities -- not just as church members, employers and individuals, but as citizens in a democracy where every voice counts, and where doing nothing is a political act. Church social action committees must go beyond sharing information and offering direct service to take on responsibility for advocacy -- political and personal -- for poor Americans. Hunger could be reduced significantly and quickly by heeding the lessons of recent history, and restoring the role of government food assistance programs.

In 1967, when a team of pediatricians reported to Congress that they had seen children starving in the rural South, their testimony set off shock waves. At the time, little was available to help hungry Americans. The Food Stamp Program had just been revived as a "pilot demonstration project" in 1961, but it had only spread to a few hundred counties. The system for distributing surplus food commodities (cheese, dry milk, bulgur wheat, peanut butter) was proving to be cumbersome, unreliable and likely to leave its recipients hungry and malnourished.

The situation was not even good for especially vulnerable groups: children, pregnant women, the elderly. Though established in 1946, the school lunch program was chiefly available to schools that could afford it, leaving many of the schools with the largest concentrations of poor children out of the program entirely. School breakfast programs had been authorized in 1965, but no money had ever been appropriated. There was no special supplemental food program for women, infants and children (the program now popularly known as WIC), no congregate or home-delivered meals for the elderly, and no subsidized meals for children in daycare, residential institutions or summer camps.

Nor was much private charity available. When Americans thought about hunger, they thought about starving children with bloated bellies in Africa or Asia -- not the sickly children next door. There were a few exceptions (skid row soup kitchens in major cities operated by the Salvation Army or the Catholic Worker, chiefly for elderly, alcoholic men), but, for the most part, soup kitchens and food pantries were only memories from the past.

Not much was actually known about hunger among America’s poor in the 1960s. There were few government or academic studies, and little in the way of local surveys or reports. Food scientists and medical professionals knew something about the damaging effects of serious long-term hunger, particularly for pregnant women and young infants, but even that evidence was mainly from overseas. Before those pediatricians reported what they had found among America’s rural poor in 1967, hunger was not an American issue.

"We don’t want to quibble over words," the doctors told Congress, "but malnutrition is not quite what we found. The boys and girls we saw were sick, in pain, weak. They are suffering from hunger and disease and directly or indirectly they are dying from them, which is exactly what starvation means" (U.S. Senate, July 1967).

At the time, both Congress and the president were preoccupied by the war in Vietnam. They had no interest in costly new social issues that might require public funds and attention. Moreover, they were facing elections in 1968, and neither Lyndon Johnson nor the Congress wanted to go into that election having raised taxes -- particularly not to pay for social programs for poor people.

Far from moving quickly to respond to the new evidence of hunger in America, Congress moved slowly and reluctantly, when at all. Members had to be pushed and prodded by a public that found the news of hungry children unacceptable. But, bit by bit, as the evidence accumulated that the problem existed in areas other than just the rural South, that more than children were involved, that existing measures were not adequate to the task, that unmet hunger carried serious consequences for those affected -- the members of Congress did respond.

They ordered the U.S. Public Health Service to conduct a national nutrition survey. They held field hearings across the country, listening to testimony from state and local health officials, poor people and their advocates. And they invited testimony from schoolteachers and pastors, doctors and nurses, parents and volunteers.

As academic and clinically based studies were carried out, as researchers turned their spotlights onto America’s malnourished poor, and as the public continued to press for action, Congress took action. It created WIC for pregnant and nursing mothers and young children; it replaced surplus foods with food stamps; it expanded school meals; it established programs for senior citizens; and it made food aid generally more available to hungry Americans.

Ten years later, a larger team of health professionals reviewed the hunger problem and reported that "our first and overwhelming impression is that there are far fewer grossly malnourished people in this country than there were just ten years ago" (U.S. Senate, 1977). They ascribed the change to food stamps, the nutrition component of Head Start, school meals and WIC, and said that the food stamp dollar was the best health dollar spent by the federal government.

By then, other studies were available to corroborate the doctors’ observations, along with evaluations of the programs that the government had put in place. A small but entirely positive revolution had taken place: in just one decade the nation had gone from discovering a major social problem -- hunger -- to documenting the need, to putting a response in place, to finding evidence that that response had worked. Hunger was one problem we proved we could solve.

Moreover, the same public that had earlier demanded that its government become involved now moved quickly to fill the most obvious remaining gap: there was still very little private charitable food aid available to meet the kinds of emergencies beyond the reach of government aid. Filling that gap became the response of caring individuals to a renewed awareness of hunger in the early 1980s.

For the first time since the Depression, towns and cities of every size once again had soup kitchens, food pantries, food banks and food lines. Reporters covering the phenomenon could scarcely contain their amazement as they wrote of the people standing in long lines just to get some cheese or a few days’ supply of food. Unions organized food drives for their unemployed members; churches collected canned goods for their hungry parishioners; children solicited pledges as they danced or walked or ran for the hungry.

And, very quickly, new evidence of hunger was collected by pediatricians, obstetricians and other health professionals; by mayors’ and governors’ task forces; by citizen groups and church committees. When Hunger in the Eighties: A Primer was compiled in 1983, it could include only one- or two-paragraph summaries of the various reports in what quickly became an 87-page chapter. Taken alone, any one of the studies or reports might not have been persuasive. But taken together, the evidence left no doubt that hunger was, once again, a national problem.

But while poverty was high and rising, the very programs that had been so effective in reducing hunger and malnutrition earlier were cut by an estimated $9 billion -- a sum that no church or private charities, canned-food drives or Christmas baskets could replace. Even those who receive government help now receive benefits that leave them hungry. (Many of the elderly and disabled on food stamps, for example, receive just $10 per month -- roughly 10 cents a meal.) The combination of a steep recession and deep budget cuts in government food assistance has resulted in long waiting lines for help, and in hunger.

Now, month by month, there are new reports, studies, congressional hearings and hunger evidence of every kind. All point to continuing evidence of need; much of it documents dramatic increases just since 1983. None of it is a closely guarded secret.

The recession isn’t over for poor people. While some in the economy are better off, economists who follow national poverty trends estimate that, even with sustained economic growth of 3 per cent or more, it will take at least a decade to get national poverty rates as low as they were in 1979. The rate of unemployment and underemployment (working part-time when full-time work is needed; working, but earning too little) remains unacceptably high. The need for food assistance remains great.

But far too little is in place to help poor people feed their families. Unlike the 1960s when news of hunger caused citizens to insist that their government get involved, in the 1980s when poverty and hunger rose, Americans -- acting through their elected officials -- cut back food aid for hungry people. Under the circumstances, no one should be surprised that hunger is a problem once again.

Fortunately, in some ways the present holds more promise for ending hunger. Today, more is known about the problem and the toll that it takes. Government programs capable of meeting long-term food needs have been tested over time, and private programs capable of meeting true emergencies are in place. There are more antihunger organizations, more churches and congregations involved to some degree and -- thanks to USA for AFRICA and Hands Across America -- most Americans know there is a problem. That’s all to the good.

What is missing in the 1980s is the sense of shock and outrage, the anger and commitment necessary to make us do whatever it takes to get the job done. Bringing our understanding up to date with the facts, developing "Sullivan Principles" for America’s working poor, and pressing for the maximum use of government programs capable of ending hunger quickly are three logical ways to get us back on track.

Ministering to the Collective Soul amid the Arms Race

It is easy to think of nuclear weapons as a new phenomenon in history, a "basic change in the circumstances of life," in the words of Jonathan Schell. But recent analyses of the nuclear threat have also examined how it arises from longstanding patterns of human behavior. Increasingly, writing on disarmament modifies familiar terms to describe the peril, speaking of "nuclear madness," "fixation," "deliriums" and "death wishes"; or, in another vein, of "idolatry," "revelation" and "warnings" of "the wrath of God."

As these terms suggest, these kinds of analyses come from psychologists and theologians, or from others writing in the idioms of those disciplines. To say that nuclear weapons have their roots in the soul itself is to invoke theology and depth psychology, our diagnostic sciences of the soul. Jim Garrison, in The Darkness of God: Theology After Hiroshima (SCM, 1982), has even suggested combining these two methods of soul-searching into a new discipline of "psycho-theology" designed especially for the nuclear problem.

Disarmament literature also uses such vocabulary, suggesting either "repentance" or "therapy" as possible answers. But although professionals diagnose the problem in this way, they don’t follow through with appropriate solutions. What would we think of a psychologist who offered a patient the following response:

PATIENT: Doctor, I’m convinced my neighbors are conspiring to kill me. This makes me extremely anxious. Moreover, I have wired my house with explosives and will blow it up if they try to get near me.

DOCTOR: You are suffering from a syndrome called "paranoia." You exaggerate the intentions of others and will only hurt them, and yourself, if you continue on this course. That will be $50, and do have a nice day.

Suppose the same person then sought advice from the parish pastor, and the conversation went like this:

PARISHIONER: Help me, pastor. I live a life of strife and discord with those around me. The only way any of us has yet found comfort is by threatening others’ lives.

PASTOR: What you describe is terrible in God’s eyes. Scripture clearly teaches that we are to love our neighbors as ourselves. Those who live by the sword shall die by the sword. Now, please go and sin no more.

We would not think much of the psychologist or minister who would respond in these ways. Obviously, solving people’s problems requires more than refutation or warning. People need to be shown how to live a new kind of life. Telling someone to change her ways or else can actually make it harder for her to overcome a problem which, quite possibly, she is already as painfully aware of as is anybody.

Yet when faced with a collective problem like nuclear weapons, people seem to throw this obvious wisdom out the window. The dialogues related above are only somewhat caricatured examples of the "advice" psychological and theological writings offer on the issue. The following paragraph, from an address on "The Social-Psychological Dimension of the Arms Race" (reprinted in Search for Sanity [South End Press, 1984], p. 272) by Morris Schwartz, typifies this common approach:

I believe that the persons generating these [official] delusions and deceptions acutely feel the loss of security and stability [brought about by nuclear weapons], and that they feel afraid and powerless. They try to overcome and defend themselves against this feeling of powerlessness through a massive increase in nuclear arms, and they try to conquer their fear and reassure themselves by asserting that we can prevail if we do indeed multiply our nuclear arsenal.

Schwartz hereby states the problem and begins to analyze it psychologically. But look at his conclusion:

Thus as we escalate the arms race we increase the very devices that are responsible in the first place for our loss of security, and are left with a greater sense of vulnerability and insecurity.

Instead of following the analysis through, he merely re-emphasizes the problem. In similar fashion, many other such treatises begin by discussing the weapons, analyze the problem partway, and conclude, "And so we must get rid of the weapons." But they do not describe how to accomplish that goal.

There are three main reasons why psychological analyses veer off at the crucial moment. First, an analyst might have no name for the soul’s sickness and thus no "handle" on it. Second, although the analyst might have a name for it, he or she might be unwilling to ascribe it to people in general. And third, the analyst might admit that the sickness afflicts all people, but still regard it as an unchangeable "fact" of "human nature."

The first problem creeps into Edward Thompson’s analysis in Exterminism and Cold War (Verso, 1982, p. 330):

There remains something, in the inertial thrust and reciprocal logic of the opposed weapons systems -- and the configuration of material, political, ideological and security interests attendant upon them -- which cannot be explained within [our usual political] categories.

Thompson tries calling this something "exterminism," which he defines as a certain inertial thrust within the deep structure of the cold war. But in coining a new name he is really denying that older terms make sense, or that the nuclear peril is part of some pre-existing pattern after all. It is to look upon the bomb as a basic change, a cause rather than an effect.

Discussions of the "madness" or "paranoia" of the arms race often exhibit the second problem. Writers ascribe terms like these broadly, yet just as frequently make excuses for ordinary people. Notice how George Kennan’s psychological language confuses his point: "Can we not at long last cast off our preoccupation with sheer destruction?" he asks in the New York Review of Books ("On Nuclear War," January 21, 1982). "For this entire preoccupation with nuclear war is a form of illness. It is morbid in the extreme." After a bit more such analysis, Kennan concludes, "I decline to believe that this is the condition of the majority of our people."

That’s a relief -- but then, who was the "we" he was speaking to one paragraph earlier? Kennan, like many other writers, has not settled this point for himself. He seems to implicate all people, yet he finds it unthinkable that the majority could be responsible for nuclear war.

Another way that writers excuse the people is to admit their guilt, but to consider it ignorance deliberately cultivated by elites, particularly through media manipulation. Or more subtly, they speak of the mass pathology as something passive, portraying ordinary people as (in Jim Garrison’s words) "victims of a compelling nightmare, hypnotized and magnetized" in a dreamlike state like that of children following the Pied Piper (Darkness of God, p. 3). Interestingly, this view reverses Caldicott’s formulation, in which the people were seen as adults and the leaders were the children. But both writers separate the two groups.

The third problem emerges in some writers’ attempts to close that gap. If human nature gives rise to the problem, then ordinary people are at least as culpable as politicians. Psychologists David P. Barash and Judith Eve Lipton, in The Caveman and the Bomb: Human Nature, Evolution, and Nuclear War (McGraw-Hill, 1985), express this view with their useful observation that "the nuclear arms race goes on because people allow it" (p. 22, their emphasis). Such a view at least makes it less mysterious that the arms race has been led from the start by a great democratic society. It also offers hope of a solution to the problem -- for in this formulation, the weapons are not causes of the arms race but instead are effects, which means that we can hope to act upon their causes.

This brings us back to our starting point: the need to fit the arms race into older, pre-existing behavior patterns. Those patterns -- or, one might say, the "prior" phenomena -- are human attitudes. If we can understand and change the attitudes, we can eliminate nuclear weapons.

Yet this approach seems further from a solution than the others. How can we change human nature? Here the whole psychological approach seems to break down, for it is based on an analogy between the psychology of individuals who can be treated in clinics, and the collective psychology behind the arms race. And a collective citizenry cannot be brought into a clinic.

But Raskin is assuming that human nature is fixed and changeless. He is like Barash and Lipton, who equate human nature with a lingering "Neanderthal mentality" that is out of place in today’s advanced world. In this scheme the prehistorical mentality is also ahistorical -- and thus a permanent condition. The authors argue for using our superior rational faculties to "say No" to our primitive impulses. It is urgent that we do so, they say, so that humanity might reach "the point at which, while unable to be saints but refusing to bow down to universal murder, we resolve to overcome the Neanderthal mentality and thereby transcend, if not overcome, our biology itself" (p. 267). But again, this is preaching to the converted. The reader has no doubt already "said No" for herself, and might readily agree that "we" should resolve to do these things -- but she is then left to wonder how to convince those who aren’t reading the book and aren’t aware of the problem to join in such resolutions. The book offers no answer; it ends with the words just quoted.

We might hope that theologians would have a better grasp of human nature than do psychologists. That humanity is in a bad way is no news to theologians, who know full well that sin appeared on earth well before 1945. Surely they know better than to agree with Schell’s description of the nuclear age as "the second fall of man." The first fall, they would say, was total, and explains all subsequent problems.

Or would they? Dale Aukerman, in Darkening Valley (Winston-Seabury, 1981), sees the splitting of the atom as "a postponed swallowing of the tough core of that original fruit" (p. 161). This all but denies that the first fall really occurred: Adam had the fruit in his mouth but didn’t actually swallow it. Human rebellion from God wasn’t complete until we started messing with uranium.

This is a theological way of saying that the bomb is a cause and not an effect. It denies the free human decision to create the bomb. I imagine that it is driven by the same desperate desire to believe that a secular solution can be found before it is too late.

Similarly, Aukerman’s claim that "really it is only for the devil, for those aligned with him, and for those who might yet turn from him that time is running out" (pp. 168-69) separates leaders from ordinary people. And his warnings about God’s wrath do not offer ministry, only judgment. He dispenses more law than gospel.

In fact, the nuclear peril has thrown the notion of gospel into crisis. It has given humanity the power to short-circuit the Last Judgment, thus calling into question God’s omnipotence. There are several possible responses to this. One is to trust that God still holds the last trumpet and will not allow humanity to end the world on its own. Politically aware theologians (and many ordinary believers) find themselves caught between trusting in this assurance -- and so taking no political action -- or taking action based on what seems to be a loss of faith in God. If their theology sounds peculiar, it reflects the cruel force of their dilemma.

But there is another possible response that points toward real answers. Also inspired by theology, it takes sin and the fall seriously. In this view, humanity has the power to destroy itself -- but it has been doing so little by little throughout history. Part of our sinful nature has always been to seek more efficient methods, and in the bomb we have found one. Let us not make Adam’s mistake and imagine that this particular fruit of the tree of knowledge really does give us godlike power. Nuclear weapons are just one more confused, frail, human invention -- maybe, like so many sins, even a stupidly well-intentioned one.

This view is available even to non-believers since it makes use of Christian beliefs not about God but about humanity and culture. It requires believing only, as C. H. Sisson has put it, that Christianity "has, even for those who are what is called skeptical, undeniable strengths in its far-reaching correspondence with the deeper reaches of human nature," and that our culture inherited its world view from many of our ancestors ("Putting Faith in Writing," Times Literary Supplement, December 21, 1984, p. 1468). Theology can provide us with a powerful framework for self-analysis. We can think of original sin as an archetype for our secular and psychological ideas about evil.

Evil was a point of contention between Pelagius and St. Augustine some 1,600 years ago. Crudely put, the Pelagian view denied original sin; evil was more a case of unrealized human potential than some kind of congenital condition. In disarmament literature this Pelagian view predominates, and the Augustinian view that grace initiates salvation is finally dismissed or evaded. This may seem strange, since the church decided for Augustine’s view. But in our culture the debate goes on. Luther and Erasmus, Hobbes and Rousseau, Burke and Paine -- each dispute has resonated with the same issues. American culture, a society of both Puritans and Jeffersonians, of transcendentalists and business tycoons, resonates with them to perhaps an unusual degree.

So it is easy to hypothesize that the nuclear weapons debate does too. The bomb simultaneously reveals both the power and the weakness of human works -- which supports both the Pelagian and the Augustinian views. Armaments are just machines, yet at the same time they are flaming swords held over our heads by angels. The first atomic bomb was spoken of in reverential, religious terms, a simultaneous affirmation of both Augustinian humility before God’s power, and extraordinary Pelagian pride in humankind’s. Standing at the apex of two central traditions of our culture, this attitude explains the "tremendous internal momentum" which Schell has noted in the arms race, and which drives so many writers on the subject to despair.

How can we find answers within this conceptual framework? First, we must abandon religion’s prophetic condemnations and look instead at what it teaches about cultural images and symbols. There are cultural "repertoires of values," says theologian Bernice Martin, which express themselves through a "hidden vocabulary" of symbols that saturate popular culture and structure its assumptions. We need, therefore, to look at

the inchoate constellations of imagery, sentiment and identification which form archeological strata in our culture, sedimented deposits of unconsidered, implicit meanings lying beneath the surface of reasoned debate. . . . They stand unanalysed behind what counts as a good story, a proper response in a crisis, and so on [Martin, in the anthology Unholy Warfare: The Church and the Bomb (Basil Blackwell, 1983), p. 110].

Martin’s approach brings psychology back into the picture. Psychology is the basis for any study of cultural imagery. Sentiment and identification are psychological phenomena, and "archaeology" is a familiar metaphor for the probing of a troubled mind.

Chernus reminds us that Christian religious culture has long hoped for rather than feared the prospect of the world’s end. Historian Perry Miller, in Errand into the Wilderness (Harvard University Press, 1956), argues that the belief in impending world destruction has been paramount in the Christian West, and that Newtonian physics provoked a serious crisis by challenging that belief. Newton himself researched the Book of Revelation in hopes of restoring that eschatology. It follows that we, as Newton’s heirs in this century, and the builders of a mechanical doomsday system, have received society’s full authority to accomplish just that task.

Chernus also offers a solution, but sadly, it is unworkable if not dangerous. Disarmament activists, he suggests, should attempt to create in the popular mind an equation between nuclear war and hell, and disarmament and heaven. Through such a change in the cultural imagery, we could "choose our own salvation as a political as well as spiritual act" (p. 910).

This raises new problems. What if a guilty, sinful people likes the idea of hell? It distorts Christian theology to claim that we can choose salvation by disarming; even Pelagius didn’t go that far. Besides, who would believe it? In 1944, one year before the A-bomb, war and genocide alone killed hundreds of thousands. Was that heaven? Would the trumpet have sounded if the recent Iceland summit had ended differently? If people carry the ancient debate in their bones, they’re not likely to mistake a Pelagian dream -- disarmament -- for an Augustinian one -- divine redemption.

Chernus’s basic argument that we need to reorient cultural imagery is an excellent insight. It is only necessary that this be undertaken with full understanding of that imagery in all its historical complexity. Chernus is correct to suggest that churches (and, I think, psychologists and anthropologists) ought to be the ones performing this task.

In this way, we can begin to shape the final answer. It will be a type of collective therapy which, like individual therapy, relies on "reframing" the issue. In the clinic, reframing is done through words exchanged by patient and therapist. But the same term has been used in critiques of public discourse, and specifically of mass communication. Media portrayals of issues can bring the cultural subconscious to light, just as an image in a patient’s dream can bring the patient’s subconscious to light. Art can make popular culture more self-aware, as can political activity, to the extent that it helps reframe an issue. The "bounds of possible thought" about nuclear weapons may be set by the prevailing "nukespeak," as Paul Chilton has pointed out (in Nukespeak [Comedia Publishing, 1982]). But through persistent "peacespeak" they can also be expanded, much as the bounds of possible thought about race were expanded by the civil rights movement -- under the image-sensitive leadership of Christian ministers.

Political activity remains the answer, but it should aim to recast media depictions of nuclear war and peace. That may seem an old idea, but nuclear weapons are an old problem. The new element is the undespairing recognition that it is possible to minister to the soul of the people at large, beyond just helplessly reminding each other that "we" need to mend our ways.

Joseph Sittler and the Theater of Human Existence

Lutheran scholar Joseph Sittler, now nearly 80, is widely considered to be one of the era’s most distinguished -- and best-loved -- theological educators. Those who have been his students, themselves now noted teachers, preachers and scholars, pay homage to his singular ability to relate the academy to the parish -- and to the wider culture. There are few figures of his generation who have so purposefully yet so gracefully passed among those three spheres of endeavor. Nor has this activity ceased during the years of his “retirement.” Since he became emeritus at the University of Chicago Divinity School in 1973, after teaching there for 17 years, Sittler has been scholar in residence at the Lutheran School of Theology at Chicago. Not that he is always “resident” -- for, despite his near-total blindness, he continues his travels for speaking engagements. He keeps well informed through tapes and the services of a reader.

This is the Century’s centennial year, and Sittler has been an important figure in the ecumenical tradition and the cultural orientation which have always characterized this journal’s identity. Thus we decided to seek out this Lutheran sage’s views on the current state of theological education.

Joseph Sittler did not have the type of formal graduate education that he has become an expert at providing for others. When he was a college student at Wittenberg University in Springfield, Ohio, he had three main interests: literature, history and biology. The latter was, in fact, his major -- and he intended to become a physician like his older brother. He frequently attended that brother when he delivered babies in the poor sections of nearby Columbus: “We would always take a copy of the New York Times with us because it had more pages in it than any other paper, and because newsprint, due to the heat used in processing it, is fairly sterile. I would simply lay the Times pages all over these hovels . . .” and deliver the babies.

It was through this experience that Sittler began to have doubts about his chosen career -- not because of the depressing poverty he witnessed, but because at that time “if you were going to be a doctor, you couldn’t possibly keep up your interest in anything else: the demands of reading and work as a physician meant that you couldn’t maintain a deep interest in literature and history.”

Having reached this conclusion by the time of his graduation -- but not having defined an alternative career -- Sittler decided to follow in his father’s footsteps and enter seminary. This he did at Hamma Divinity School in Springfield. According to Sittler, there was one remarkable teacher on the faculty -- Leipzig-trained John Evjen (later fired for his liberal views) -- who was so inspiring that he awakened in his young student a whole new set of interests. “When I got into the seminary, I found out that the studies in biblical language, church history, Christian theology and liturgy were fascinating.”

Sittler became a leader of a group of fairly radical young Lutherans -- mostly Evjen’s students -- in Ohio, and he did further theological study at Oberlin under Walter Marshall Horton, and at Case Western Reserve with Max Fisch. From that time on, Sittler educated himself through extensive reading and through dialogue with experts on various topics. “I was ordained in 1930. It was utterly impossible during that decade to afford graduate school. The Depression was absolute, you know.”

His own lack of advanced formal training is one factor in Sittler’s continued great interest in the field -- and he has many ideas concerning the successes and failures of current theological education.

“Let’s start with what’s right,” he begins: “Most institutions devoted to theological education today are introducing their students, either through carefully structured curricula or in field work, to the increasingly urban settings of the actual parishes. There is nothing new about this, but I think the care with which that intention is built into the planning of faculties and curricula is each year getting more expert, so that we don’t just sort of splash the students through the city but get them to know more about specific aspects of contemporary urban culture, at the hands of and through the mouths of people who are the children of that culture. That, I think, is right because it simply makes practical sense. The people to whom the church addresses its invitation are now primarily urban people.

“That approach, however, is but an aspect of the second and more profoundly right thing: what Johannes Metz has called the theology of the subject; that is, the enhancing of the dialectical relationship between the God who is the object of theological reflection and the persons who are its subjects. The joining of these is really the fulfillment of Schleiermacher’s program. It is the deepest internal theological program -- and while it always teeters on the edge of making theology a primarily subjective experience, we have to run the risk of a careless subjectivity in order to keep that dialogue balanced. This line of thought is not new; it is in its second century now, but such a book as Sallie McFague’s Metaphorical Theology illustrates how rich -- when combined with objective, disinterested, sound biblical scholarship -- this approach may be.”

Such a focus will lead eventually, thinks Sittler, “to both an honoring of and a renovation of the whole theological vocabulary. However, if the renovation takes place without the honoring, it will become trivial. . . . It’s a hundred years now since the search for the historical Jesus began, and it has indeed proven, as Norman Perrin said, to be a futile search in purely historical results, or hopes. It has nevertheless disclosed the strength, the uniqueness, the incomparable challenge of the reports about Jesus, so that the figure of the New Testament Jesus emerges mysterious but powerful for the contemporary mind.

“Now, that means that the way this mystery and this power were related backward to the whole of history fashioned a kind of theological vocabulary of messianic kingship, sonship and kingdomship. We must now reexamine these concepts in order to reinvest them with substance which more clearly intersects the nature of evil and the loss of meaning in contemporary life. If you sweep away everything with which our fathers invested their search, you will trivialize. If you don’t sweep it away, you will make the language less and less alluring.”

In terms of the actual classroom situation, Sittler thinks that both teachers and students are now more open to new ways of thinking than they have been in recent years. “Both students and teachers have had their fingers burned by the banality of new theologies seeking to operate without a historical foundation.

“I think in many cases the movement to evoke and unwrap all theological truths from the quest for personal spirituality is narrow -- first in that what is meant by spirituality bears the marks of the anxieties of recent years. Further, it ignores the vast spiritual literature of our history. I think the movement is blowing up because it’s not going anywhere, because spirituality is rifling around in pure inwardness, which has not been enriched by the inherited classical works of the centuries.”

When asked what is making students today more interested in those classics, Sittler replies characteristically. “I’m going to go ’way out on a limb, and if I’m wrong, I’m blazingly wrong; but one must take chances. I think the political and social disenchantments of the ’70s have chastened the minds of this generation, so that they know they must not be fixated by the moment. They’re not going to hitch their expectations to the febrility of revolutionary changes which students hoped for 15 years ago -- changes that haven’t come along.”

In this context, the surrounding secular culture has played an important role. “I think social and political sophistication, the modifying of our expectations according to experience, is also shaping minds that are willing to take a new look at the old dogmas, doctrines and liturgies which were given a vehement heave-ho in the excitement of the ’60s.”

These same years have brought to the fore much of what Sittler thinks is wrong with theological education today. He locates one basic problem not with the schools but with the churches that send their ministry candidates to those schools. “There are certain expectations of contemporary theological education which the schools have got to fight against. One of these, generated by the church, is that the pastor is to be a combination of master of ceremonies and soothing friend. The schools, despite the integrity of their own intentions, the appropriate preparation that they demand of their faculties, and the desire to do the hard and critical work, are expected to turn out students who ‘meet the approval of the church accrediting committee.’”

Sittler does not place much faith in the churches to alter their expectations independently; rather, he thinks, the schools must resist these trends -- and might, in fact, attempt to engage in some lay education to bring these expectations under criticism.

In another problematic area of theological education, however, Sittler places blame directly on the schools: the proliferation of academically dubious professional degrees. For example, he labels many current D. Min. programs “banal at best, absurd and fraudulent at worst.” He continues, “It’s a program for which you get an academic degree for the reported exercise of common horse sense. ‘The history of St. Paul’s Lutheran Church in the creation of a new educational system . . . or new educational building.’ They give you the dimensions of all the rooms, who the architect was, and so on. . . .

“I think we ought to have a good M.A. program for people who want to do serious advanced work in theological studies. They could take one day a week out of their parish duties and come and study: preaching, church history, no matter what. There’s a place for continuing education, but it should be academically demanding. . . . The master’s degree is really a degree for the continuing renovation of competence -- though it’s not a research degree.”

And the D. Min. degree, he thinks, exists simply so that the recipients can redo their church letterheads. He thoroughly disapproves of such overcredentialization and castigates schools that go along with this trend -- some of them solely to bring in more tuition dollars, or to create new cadres of loyal degree-bearers who might later aid the institution financially or in some other way.

Sittler thinks, further, that schools should determine exactly what their specific purposes and strengths are -- and not stray from them. For example, he argues that the schools with primarily a scholarship/research orientation (he cites his own former employer, the University of Chicago Divinity School) “ought not to touch the parish. They should have nothing to do with the training of pastors and preachers. That’s not their best contribution to knowledge, nor, in the long run, to the church.”

On the other hand, he continues, “That does not mean that a school which is interested in the preparation of persons seeking ordination is to invite or practice bad scholarship -- by no means. There is no reason why the preparation for parish ministry should not be taught by people who have been trained in and who exhibit the soundest scholarship.”

The most important point, says Sittler, is to keep the two goals -- scholarship and preparation for the ministry -- distinct, so that each school can accomplish its defined task as consistently as possible.

In addition to “overcredentialization,” “fraudulent degrees” and misapplied emphasis, another fault that Sittler finds in current educational practice concerns the overspecialization of students, although he grants the often grim necessity of the latter in seeking employment.

Of the Ph.D. degree he notes that students, while pursuing a doctorate, are often compelled to leave by the wayside the broader implications of their studies. Later they may say, “‘I’ve got the damn diploma, now let’s get educated.’ That’s an awful thing to say about the nature of graduate study, but I’m afraid it’s true. You set aside the humane end of education in order to acquire enough skill to get a job. Then, having got the job (if you can) you say, ‘Now let me learn something about what’s humanly important.’” In many cases, however, that transition never takes place, says Sittler. “Many specialists remain only that.”

This situation can be remedied in part by introducing courses from other disciplines into the religion curriculum. Sittler has long been an advocate of interdisciplinary endeavors and has an abiding interest, especially, in the interrelation of religion and the arts. His particular regard for poetry and architecture consistently finds its way into his speeches and sermons. Currently he is re-examining the Greek tragedies.

Sittler’s other preferences in methods of theological education reflect this broad-based cultural orientation. For example, in discussing how to teach preaching (if that is indeed possible), he states, “If you’re going to teach people to preach in a way that serves the truth and the faith of the church, several things are crucial. Students must [do] basic work in language and in the critical-historical examination of the Holy Scriptures and of church doctrine. In addition, one must maintain wide exposure to one’s surrounding culture. These requirements are absolute.”

This theme of “getting back to basics” is one of Sittler’s favorites. In fact, when asked what one word of advice he would give to current theological education students, he replies, “Master the classical texts. They will generate energy and demands for relationship,” thereby leading to further development of one’s knowledge and skills. And he doesn’t mean just the classical theological texts, but those in other fields including literature, history and the natural sciences.

Currently he is especially concerned with the latter. “It may well be that the most illuminating focal point for the coming generation of theological students will be some precise knowledge of the methods and projections with which the natural sciences operate. If you want to talk about God and the world, you had better have some clear understanding of what kind of cosmos constitutes the theater of human existence. All my theological notions are shaken by the work of the cosmologists. Our solar system is only a speck in the cosmos, and, as the scientists tell us, will most certainly not last forever. What about when our earth is gone? We define God absolutely in accord with our anthropological view, but what will concepts of sin and redemption mean without our own presence?

“To do one’s daily work with care, and at the same time to be aware of the haunting truth of the inevitable annihilation of our earth, is an operation both disturbing and deepening. To keep up on the new work in the Gospel of Mark and take seriously scientist Robert Jastrow’s Until the Sun Dies, for example, requires a balancing adroitness which is both demanding and expanding.”

Those who know Sittler point to him as one of the best exemplars of his own advice. They know that in addition to the fields just mentioned, he is fascinated by music, by machinery and the manual arts, by the epical nature of baseball, by ships and the ocean. (He has expressed the wish to be, above anything else, “the captain of a fine sailing vessel.”) All of these areas he sees as interrelated and as aspects of an organic whole. There is no compartmentalization in his thought. “You know,” he says, “I simply cannot draw a line around the subject matter of theology. The subject matter is discourse about God: theos logos. What isn’t? That’s what Euripides was writing about: whatever is, permanent, fundamental, at the center, absolute. All of it is discourse about God. I can’t think of anything that is not. There’s no time when I say, ‘Now I’m going to think theologically.’ I don’t think any other way.”