Chapter 9: Humanity at War with Itself

Among the biblical myths of human origin, two are particularly relevant to the problems raised by globalization. One tells the story of Cain, the first child of Adam (humankind), who killed his brother Abel; when asked to explain what had happened to Abel, Cain made the well-known reply, ‘I do not know. Am I my brother’s keeper?’1 This story symbolically describes the extremes to which the anti-social tendencies in the human condition can go, and indicates that humanity has been at war with itself from the very beginning. The other myth describes a time when all human beings lived in one harmonious society; ‘the whole earth had one language and the same words,’ says the story of the Tower of Babel.2 The myth then explains why, in historical time, there have always been many languages, many cultures and many societies. They are the result of divine punishment for the blatant hubris of humankind.

Whether there ever was a time when humans were so few in number that they constituted only one society and spoke only one language we cannot say. It remains possible that all members of the species Homo sapiens are descended from the same two parents, such as the Adam and Eve of the biblical myth of Eden. In that case, their descendants would have formed the first extended human family or tribe, united genetically and linguistically. All that we can say with confidence, however, is that our earliest knowledge of humankind takes us back only to the point where humans were already scattered into groups, living a tribal existence, each with its own language and culture.

The oldest and most basic form of the human social group is the family -- not the nuclear family (which is largely a modern phenomenon) but the extended family. The family is bound genetically by blood ties and by a closely knit culture arising out of its common life. The family unit evolved in the far distant past into the dominant type of human society, the tribe. For a long time, most likely, the extended family and the tribe were indistinguishable, though gradually (it may be conjectured) the tribe grew to become a society of families.

The tribe is held together largely by a commonality of blood and culture. The bonds of the tribe started with blood ties and are instinctive in origin (as they are with all non-human gregarious mammals). This genetic base of tribalism remains strong even today, for ‘blood is thicker than water’, as we say. In the case of humans, however, the genetic ties have been supplemented by the bonding power of an ever-evolving culture. Each culture (or civilization) contains a common language, a shared view of reality, a shared set of values and goals, and common patterns of behavior, both moral and ritualistic.

The strength of tribalism lies in the personal bonds of mutual loyalty which hold the tribe together, give it an identity, endow it with strength and courage to overcome threats, and enable it to survive from generation to generation. So strong are the bonds of the tribe that its members vigorously defend its vitality and may even be prepared to die to ensure its preservation. There is much that is commendable in tribalism and it has been essential to human survival, at least until the present. The negative side of tribalism is that, manifesting the mark of Cain, it fosters distrust and antagonism to those outside the tribe, who are seen as a threat.

To this day the tribe remains the social base of the later forms of society which began to evolve less than 10,000 years ago, when the agricultural ‘revolution’ brought a more settled existence. This cultural change led to the establishment of walled cities, the cultivation of the various arts which city life makes possible, and hence what was called civilization. From that time on, the city, the nation and even a whole civilization have manifested themselves as particular forms of tribalism, and each has to some degree retained the loyalties and sense of social identity that constitute tribalism.

From the far distant past right up until the twentieth century, humankind showed a tendency to divide and diversify into ever more ethnic groups, all of them retaining the tribal type of social life. As we saw in the last chapter, it was this tendency to divide that Teilhard de Chardin labeled divergence. However, after the rise of the first civilizations some 5,000 years ago, powerful conquerors have forced a number of different social groups to enter into a compulsory form of unity. Because these empires were dependent upon the use of force, they had no great permanence and, in time, the original tribal or ethnic entities usually regained their independence. That process is still occurring; the twentieth century saw the British and other European empires gradually disbanding, and ended with the break-up of communist Yugoslavia after the death of Tito.

From the Axial Period onwards, a different and more permanent form of inter-ethnic social cohesion evolved. The strong commitment to ethnic or blood-related social groups came to be superseded by or subordinated to the formation of a multi-ethnic society of a religious kind. As noted in the previous chapter, Christianity and Islam each hoped to eliminate the negative aspects of ethnic tribalism by incorporating all of humankind into a new trans-ethnic community.

Out of the original Christian hope for the imminent coming of the Kingdom of God evolved the institution of the church (in which there was to be ‘neither Jew nor Greek, neither slave nor free, neither male nor female’);3 this eventually developed into Christendom. The Muslims called it the brotherhood or Umma Muslima (Muslim people), and Islam dates its calendar from the establishment of the first Islamic state or society -- that at Medina. At their height, Christendom and the Islamic world were both impressive in overcoming tribal and ethnic animosity and establishing a form of international society. However, even these religiously based super-societies have been subject to divergence, tending either to fragment (as in denominational divisions) or to give way to the resurgence of ethnic tribalism.

Ethnic divergence, or the dispersion of humanity into ever more tribes, cultures, nations and languages, was possible because there was, until recently, always more land to which they could go. The Pilgrim Fathers set sail for what they thought was the relatively empty ‘new world’. The Quakers found freedom in Pennsylvania. The nations of Europe exported to their new colonies the surplus population that followed the industrial revolution. But this avenue for dealing with increased population no longer exists.

Because humans have not only spread to the limits of the earth but have also started to multiply at an alarming rate, the mythical story of the Tower of Babel is now being reversed. The whole earth has been reduced to the vicinity of the Tower; there is no more land to which we may scatter. Further divergence into more tribes and races is no longer possible, and is being replaced by what Teilhard de Chardin called convergence. Human societies of all kinds now find themselves being pushed more closely together. Minor languages are dying fast and major languages are in competition to become the one global language.

We are beginning to feel the birth pangs of what could be a new form of human society -- global society. But will it emerge successfully? Will we humans now form ourselves into one global society, or will we simply slide into social chaos and end up destroying one another? The truth in the myth of Cain and Abel still remains: the mark of Cain is upon us. Will our anti-social tendencies prevent us from becoming one global society? That is the question globalization is forcing us now to answer.

We humans are becoming a danger to ourselves, simply because of our natural capacity to multiply indefinitely on a finite planet. Globalization has been partly promoted by the population explosion. The basic instinct to procreate has always been essential for the survival of humankind, as it is for any species. That is why, in the biblical tradition, humankind was commanded to ‘be fruitful, multiply and fill the earth’. We are now filling the earth abundantly, even to overflowing, but we find it difficult to stop multiplying. The procreative instinct has become not only an asset for humankind but also a liability. It is tragically ironic that the very capacity required for survival is now that which threatens the well-being (even the future) of the species.

At the beginning of the Christian era (it has been estimated) the human population of the earth was approximately 300 million. Population growth remained relatively slow, so that by 1750 it had reached only about 800 million. Disease, epidemics, famine and high mortality among children took their toll. (Only about half of newborn babies survived to the age of five years.) Disease and early mortality, which were understandably judged to be evil, nonetheless kept in check the natural increase in human population. All that has been drastically changed by such otherwise beneficial developments as medical science, education in personal hygiene, better sanitation, and the improved economic conditions brought about by the industrial revolution. These factors reduced infant mortality, cured diseases, and prevented some plagues; the average length of life was gradually extended, and the population began to increase.

Population growth began to accelerate from 1750 onwards. As early as 1790 a Venetian monk, Giammaria Ortes, declared that the human population could not continue to grow indefinitely. In 1798 Thomas Malthus (1766-1834), an Anglican clergyman and economist, published his seminal Essay on the Principle of Population, in which he argued that the human population has a natural tendency to multiply faster than the increase in food supply. Natural causes had long kept the population growth within sustainable limits; if these were removed, special efforts would be required to reduce the birthrate, either by self-restraint or by compulsory birth control.

By 1800 the world population had reached one billion, and it had taken some two million years to do so. Yet by 1930, a mere 130 years later, it had doubled to two billion. Then, partly as a result of the comparative peace and prosperity following World War II, a third billion was added by 1960, a fourth by 1974, the fifth before 1990 and the sixth by 1998. It has been projected that, on present rates of fertility, mortality and migration, the global population will have reached eight billion people by 2025. Since the size of the planet remains the same, the increase in human population will force people to live more closely together, as well as be more dependent on one another.
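The ever-shortening doubling intervals above can be illustrated with a little compound-growth arithmetic. The sketch below is illustrative only: the function name is my own, and a constant annual growth rate is a simplifying assumption that the historical figures only roughly follow.

```python
import math

def doubling_time(annual_rate):
    """Years for a population to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

# At roughly 2 per cent a year -- near the peak growth rate of the
# mid-twentieth century -- a population doubles in about 35 years.
print(round(doubling_time(0.02)))    # 35

# At 0.5 per cent a year, closer to pre-industrial rates, doubling
# takes nearly a century and a half.
print(round(doubling_time(0.005)))   # 139
```

The contrast between the two results shows why the curve looks flat for millennia and then turns sharply upwards in the modern era.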

Until the 1950s the debate about human numbers remained largely academic. When artificial forms of contraception were coming into common use in the first half of the twentieth century, they were vigorously opposed by some on religious grounds. The debate was pursued purely on the basis of personal morality. Now that global population is reaching the limits of sustainability on the earth, contraception has become a social concern as well as a personal one. The traditional morality surrounding procreation and sexual relationships in the so-called free world is sadly out of touch with present reality: witness the Roman Catholic rejection of all artificial forms of contraception and the still widespread moral condemnation of clinical abortion.

However, drastic attempts to curb the population explosion, such as very strict forms of birth control and clinical abortion, do not offer any simple solution, for they have alarming social consequences. Any sudden decrease in the birth rate brings serious dislocation to the age composition of human society, so that (for example) too few working people have to provide for the material needs of too many old people. Programs of enforced birth control also come into conflict with long-standing cultural customs and religious convictions, cause considerable anguish and are often strenuously opposed. Harsh and rigid methods of birth control, such as those instituted in China, can seriously upset the gender balance.

The population explosion has also changed patterns of land occupation and, in recent years, brought a dramatic increase in urban density. It was not until about 10,000 years ago that humans began to live in permanent settlements. Even up to 5,000 years ago such settlements were small, consisting of semi-permanent villages of peasant farmers. Only during the great empires of Mesopotamia, Egypt, Greece and Rome did cities have more than 100,000 inhabitants. Even then, and for a long time thereafter, human existence was still predominantly rural and village-like. In 1800, less than 3 per cent of the world’s population were living in cities of 20,000 or more. This percentage has rapidly increased, in line with the population explosion; it reached 25 per cent by the mid-1960s and 40 per cent by 1980. In the year 2000 about half the world’s population will be living an urban existence. What has been called the ‘global village’ is turning out to be one global city.

How will life in the global city be different from earlier human experience? First of all, as Harvey Cox pointed out in The Secular City, in the modern city human existence is becoming more rootless and mobile on the one hand and more anonymous on the other. Our neighbors may remain strangers and our closest friends may be found in various scattered networks or sub-societies that we have chosen. Cities spring up like mushrooms, flourish, then show signs of decay. A flourishing suburb may, a generation later, turn into a slum.

Secondly, human existence will no longer be lived exclusively within one culture with its own identity and language. The embryonic global culture is already relativizing both the trans-ethnic cultures (such as Christendom and the Islamic world) and the many ethnic cultures. The global mix will not necessarily destroy the values and goals found in these cultures, but it deprives them of their absolute status and renders them more akin to personal options.

The global city could provide exciting new possibilities for humankind but it also presents challenges far exceeding any that the human species has yet faced. Not only is there the pressure of the population explosion and its increased needs, but there is also the fact that the various nations (with their burgeoning populations and their diverse cultures) are being pushed together rather like the continental plates on the earth’s surface. Just as the clash of these geological plates causes earthquakes, so we can expect massive cultural earthquakes as the great cultures are forced into closer contact. The recent troubles in the Balkans, the Holy Land and East Timor, to name but a few, may be small compared with what is yet to come.

In 1996 Samuel Huntington, director of the Harvard Institute of Strategic Studies, published The Clash of Civilizations and the Remaking of World Order. He argued that the phenomenon of globalization is bringing new pressures of such magnitude that they could easily result in disastrous human conflicts, and that a global war of civilizations can be avoided only if world leaders accept the multi-civilizational character of global politics, and learn to co-operate. Among the factors contributing to global instability he cited the following:

• For the first time in history, politics have become global, being both multi-polar and multi-civilizational.

• The influence of the West is declining.

• The patronizing superiority of the West is bringing it into conflict with other civilizations, particularly Islam and China.

• Asian civilizations are expanding, economically and politically. Islam is exploding demographically and destabilizing Muslim countries and their neighbors.

• Modernization is not currently producing a universal civilization.

Huntington sees major fault lines appearing between what we may call the earth’s ‘civilization plates’ -- between the Muslim and Asian societies on the one hand and the West on the other. These fault lines are being exacerbated by ‘Western arrogance, Islamic intolerance and Chinese assertiveness’.4 As Huntington writes:

The West . . . believe[s] that the non-Western peoples should commit themselves to the Western values of democracy, free markets, limited government, human rights, individualism, the rule of law and embody these values in their institutions . . . What is universalism to the West is imperialism to the rest . . . The West is, for instance, attempting to integrate the economies of the non-Western societies into a global economic system which it dominates.5

This impending civilizational conflict indicates that tribalism is so deeply entrenched in human behavioral patterns that it not only refuses to wither away but is likely to intensify in times of mounting tension and threatening world crises.

Tribalism was an asset for human societies before globalization, but its persistence now constitutes a threat to the future of the race, if it prevents the evolution of one global society. We have reached that point in the evolution of the human species where traditional tribalism must be superseded by the acceptance of the essential unity of all human society. The distinctive strengths of tribalism need to be transferred to the whole human race; tribalism needs to be transformed into globalism.

Yet, in spite of encouraging signs in international and global activity (as described in Chapter 8), we are also witnessing an alarming resurgence of tribalism. In Europe during the last four centuries the disintegration of Christendom has led to the rise of nationalism, a modern form of tribalism. On the global scale, however, tribalism can operate in whole civilizations. As Huntington has stated: ‘Civilizations are the ultimate human tribes and the clash of civilizations is tribal conflict on a global scale.’6 And he predicts: ‘Cold peace, cold war, trade war, quasi war, uneasy peace, troubled relations, intense rivalry, competitive co-existence, arms races: these phrases are the most probable descriptions of relations between entities from different civilizations. Trust and friendship will be rare.’7

It is much the same, if not worse, when we turn to the conservative defenders of the various religious traditions. An original goal of those traditions was, among other things, to overcome tribalism. But the old tribalism dies hard. Not only did ethnic tribalism spring to life again with the fragmentation of Christendom, but Christianity and Islam have each tended to develop further forms of tribalism -- religious tribalism. Christian tribalism took the form of denominationalism. During the twentieth century, as denominationalism has lost its vitality, religious tribalism has taken the form of animosity and unco-operativeness among the conservative, liberal and radical wings of the ecclesiastical organizations, with the conservative wing being particularly militant.

In 1993 there appeared two books with a similar theme, Out of Control: Global Turmoil on the Eve of the Twenty-first Century by Zbigniew Brzezinski and Pandaemonium: Ethnicity in International Politics by Daniel Moynihan, who had had a distinguished career as a Harvard professor, US ambassador to the UN, president of the Security Council and then senior US senator. They both argue that the world is heading for chaos as a result of the breakup of nation states; the intensification of tribal, ethnic and religious loyalties; the spread of international terrorism; and the proliferation of weapons of mass destruction. Senator Moynihan rightly concedes that one’s ethnicity or nationality can be a legitimate source of pride, but warns, using the words of the Christian ethicist Reinhold Niebuhr, that it can also be a form of collective egotism, potentially very destructive.8

Brzezinski, who was formerly national security adviser to US President Jimmy Carter, published a book in 1989 called The Grand Failure, in which he prophesied the collapse of communism. In Out of Control he suggests that the world today is ‘like a plane on automatic pilot, with its speed continuously accelerating but with no defined destination’.9 The idea that humankind is in control of the various forces promoting change is an illusion:

Man does not control or even determine the basic directions of his ever-expanding physical powers. The plunge into space, the acquisition of new weapons, the breakthroughs in medical and other sciences are shaped largely by their own internal dynamics . . . The human being, while being the inventor, is simultaneously the prisoner of the process of invention.10

In the same year, 1993, Alvin and Heidi Toffler published War and Anti-War: Survival at the Dawn of the 21st Century, in which they said:

For the past three centuries the basic unit of the world system has been the nation-state. But this building block of the global system is itself changing . . . Many of today’s states are going to splinter or transform, and the resultant units may not be integrated nations at all, in the modern sense, but a variety of other entities from tribal federations to Third Wave city-states.11

Many have noted that while globalization has been drawing people together into worldwide unions and federations, opposite forces have been focusing attention on individualism and on small, tightly knit groups and movements. There is therefore a strange ambivalence present in globalization. On the one hand, it is drawing all nations and cultures into one global conglomeration which has the capacity to formulate some common moral standards that might enable us to eliminate the wars of the past and establish a stable global society. On the other hand, it has stimulated a resurgence of both ethnic and religious tribalism, which is causing new pressures, and their accompanying tensions and hostilities, to emerge.

Religious tribalism is exemplified in the twentieth century by the rise of fundamentalism. Although the term fundamentalist originated in Christian circles (see Chapter 4), it is now used to refer to any person who tries to hold at bay the impact of cultural change on their traditional beliefs and practices. So today there are Muslim fundamentalists, Jewish fundamentalists, Hindu fundamentalists and ethnic fundamentalists, each trying to preserve and revive their traditional rites, practices and beliefs. All of these reject some or all of the various values which we have earlier sketched as modernity. Modernity is for them a blind road; and while they are not necessarily wholly opposed to entering a global society, they are adamant that they alone hold the key that will enable that society to eventuate.

Gilles Kepel, after studying the rise of fundamentalism in Islam, Christianity and Judaism, particularly from the 1970s onward, warned of its dangers in his book The Revenge of God. All fundamentalists have much in common, according to Kepel. They all reject secularism, which they attribute to the influence of the western Enlightenment. This secular rejection of the traditional form of religious faith they see as the reason for the rise of both Nazism and communism. They reject what they see as the immoral mode of life in the modern secular city. But, by the same token, they also diverge sharply from one another. Each looks for the reorganization of society in accordance with its own specific set of holy scriptures. That is, they aim to re-Islamize, or re-Judaize, or re-Christianize society, and so their ideals clash. Thus fundamentalists are not only at odds with the more liberal sections of their own religious or ethnic community but they disagree also with fundamentalists of other persuasions, although they share a similar mindset.

Fundamentalism, in its various forms, is a force that can no longer be ignored, for it constitutes one of the most serious obstacles to the evolution of a global society. Global society calls for flexibility of thought and practice, for empathy with those who differ, for compromise in a spirit of goodwill; it requires mutual co-operation for the common good. Fundamentalism, by contrast, is socially divisive, calling for absolute (and even blind) loyalty to a holy book or a set of fixed principles. Fundamentalism leads readily to fanaticism, for fundamentalists are so sure of the truth that they are not open to dialogue or other human reasoning. Fundamentalists insist on remaining loyal to the fundamentals, even if this leads to their own death or the death of others. Indeed, Muslim fundamentalists sometimes see martyrdom as the fast road to eternal bliss. Such fanaticism soon leads to terrorism and suicide bombings, as in the Muslim Hizbollah and the Jewish Gush Emunim. Fundamentalism is an intense form of religious tribalism which can lead to social chaos in today’s world.

The alternative to the road to social chaos is the evolution of a global society. Globalization offers the opportunity for the rise of a new kind of human society, an open society on a global scale. The global economy, which is already emerging, will form a natural part of this. The dominant part played by the global economy in globalization is, however, no guarantee of a secure future. By virtue of its very size and complexity, the global economy is also unstable and vulnerable. Most of us know from personal experience that the more complex the technological gadget, the greater the consequences when it breaks down. So it is with the global economy. A sudden change or disaster in one place sends shockwaves around the world. A rapid economic success story in one country may put thousands of people out of work on the other side of the globe. The stock markets around the world respond within moments of hearing a particular item of news.

Another reason for regarding the global economy as unstable is the expanding gulf between rich and poor nations. Immediately after World War II, the acknowledged disparity in the wealth of nations led to the establishment of the World Bank and the International Monetary Fund; it was then widely assumed that the so-called developing countries could be brought up to some sort of parity with the developed countries by lending money and promoting economic growth. Indeed, economic growth has become the chief goal of most national economies; governments use it as a barometer for their political policies. Yet even Adam Smith, whose book The Wealth of Nations (1776) has become the bible of today’s protagonists of free trade, conceded that his theory of economic growth broke down at the point where human expansion reached the limits of the Earth’s resources. Writing more than 200 years ago, he believed this point to be only hypothetical; we know that we have now reached it.

In 1972 the Club of Rome, an international assembly of business leaders, published The Limits to Growth. Just as Thomas Malthus had shown how population had the capacity to increase faster than the food supply, so this computer-based report concluded that world order would collapse if population growth, industrial expansion, increased pollution and the depletion of natural resources were to continue at current rates. The Club of Rome called for ‘a Copernican revolution of the mind’, which abandoned the commitment to endless economic growth and set instead as its goals zero population growth, a leveling-off of industrial production, increased pollution control, and a shift from consumerism to a more service-based economy. The recommendations of the Club of Rome were heavily criticized by business interests who had most to lose, but their claims served only to illustrate how much political and economic ideology is driven by self-interest.

The promotion of economic growth has not reduced the economic gap between the developing and the developed countries, as the World Bank and the International Monetary Fund assumed it would. In 1960 the wealthiest 20 per cent of the world’s population had a per capita income 30 times that of the poorest 20 per cent. By 1989 the disparity had doubled to 60 times. Third World countries now find themselves so heavily in debt that they have become economically enslaved to the developed nations. The old colonialism has gone, only to be replaced by a new kind: economic colonialism.

What we should be aiming for today is not economic growth but greater social cohesion. Between about 1870 and 1970 many countries of the western world did focus their attention on economic policies that promoted social cohesion at the national level. They did this by means of social welfare schemes, and by making education and health resources freely available. These welfare state economies are now mostly in disarray or are under great strain, particularly since the collapse of communism. The move to a global economy is partly responsible; globalization has intensified the interdependence of economics, health, education, culture and religion. Increased pressure in one area soon causes tension in the others, both nationally and globally. Any attempt to plan the future of one area (such as health or education) in isolation from the others, soon encounters insuperable difficulties.

What we need -- and what we lack -- is the vision of an ultimate goal, such as the promotion of social cohesion on a global scale and the realization of a global society. Globalization is challenging us to exercise a greatly increased sense of responsibility to our fellow humans. Today’s answer to the rhetorical question ‘Am I my brother’s keeper?’ has to be a resounding ‘Yes’. We humans must learn how to live together in harmony, goodwill and mutual responsibility.

The vitality of human society has always rested on bonds of loyalty that were both genetic and cultural; now both need to widen their framework. The genetic bonds of globalism are to be found in our common humanity, irrespective of tribe, nationality, culture and religion. For we humans are all genetically linked; we share nearly all of the DNA formula peculiar to our species, and we are already bonded as blood brothers and sisters. Our cultural bonds must now become global rather than regional. Our common humanity has been acknowledged, for example, in a declaration of human rights and a growing recognition of racism as an immoral attitude. What separates us and makes us all different is our cultural conditioning, which still strongly reflects the diversities and tribalism of the past. We have yet to evolve a common human culture, and this will be essential if there is to be a global society.

Yet some faint outlines of a global human culture are already appearing. The recognition of our common humanity is leading us, if haltingly, to a set of common, humanly based values, which arise out of the human condition we all share. We increasingly feel a moral obligation to treat all fellow humans as equally as possible.

One of these common values is personal freedom. For the first time in human history, people are being encouraged to think for themselves, and to challenge any beliefs and way of life imposed upon them. This freedom to think for oneself, to make personal judgements and decisions, good though it may be at the local and individual level, can unfortunately prove a liability at the global level. Individual autonomy places on human shoulders much greater responsibilities than the majority of people have had to bear before. It may well be asked whether, as a species, we are ready to shoulder that responsibility.

Thus what has emerged as a great human gain at the individual level may well prove to be a further threat at the collective level. It is entirely possible that, at a time when very important decisions have to be made and acted upon for the good of humankind and the planet as a whole, far too many people will focus their attention on their own immediate vicinity and insist on claiming their individual right to act within it as they wish. They may end up going in different and conflicting directions, thus producing anarchy and social chaos. Huntington, for example, contends that ‘far more significant than the global issues of economics and demography are problems of moral decline’, an ‘increase in antisocial behavior’, decay of family structures, weakening of the ‘work ethic’, and decreasing commitment to intellectual activity.12 Similarly Brzezinski refers to a current global crisis of spirit which has to be overcome if the human race is to regain some control over its destiny.

As we enter the twenty-first century, this dilemma is far more important than economic theories of how to achieve growth. Either we learn how to live in harmony with one another and in harmony with the earth, or else the human species goes the way of the dinosaurs. The twenty-first century will be a severe test of the human species. Instead of finding our enemies in other ethnic groups or in spiritual principalities and powers such as the Devil, we humans are just beginning to realize that we are becoming our own worst enemies. We are at war with ourselves.

After a long period of dispersion over the whole earth the human race is now being pushed together, whether we are ready for it or not. We find that, with millennia of civilization behind us, there is much primitive tribalism still beneath that veneer. We live in a global world and we have a common destiny on this planet, but our decisions are hampered by narrow-mindedness and short-sightedness. The majority of us are so immersed in our personal affairs in our own small part of the world that we are unaware of the sword of Damocles now suspended over our heads -- a sword that hovers because of the continuing tribalism which keeps us in a state of war with other humans, and because the very earth on which we live is now issuing its own set of warnings about the limitations on a human future.

 

Notes:

1. Genesis 4:1-16.

2. Genesis 11:1-9.

3. Galatians 3:28.

4. Samuel P. Huntington, The Clash of Civilizations and the Remaking of World Order, p. 183.

5. Ibid., p. 184.

6. Ibid., p. 207.

7. Ibid.

8. Daniel P. Moynihan, Pandaemonium: Ethnicity in International Politics, pp. 46, 73.

9. Zbigniew Brzezinski, Out of Control, p. xiv.

10. Ibid., pp. 204-5.

11. Alvin and Heidi Toffler, War and Anti-War, p. 242.

12. Huntington, op. cit., p. 304.

Chapter 8: Globalization

Globalization is a word that has only recently entered our dictionaries. First used to refer to the new global economy, it now encompasses the great new phenomenon of our time -- the process by which all scientific, cultural, religious and economic human activity is being integrated into one worldwide network. Humans of all ethnic groups, all nations, all cultures and all religious traditions are being drawn together into one global community. Without exercising much individual choice, we are becoming part of a global interchange of news, knowledge and ideas; we are increasingly dependent on one global economy, and influenced by a developing global culture. This is in spite of our diversity, our frequent mutual animosity and our all too common fear and distrust of all things foreign.

But if the term globalization is new, the phenomenon which it names has been around for quite some time. With hindsight we can readily discern several causes, all present in the emergence of western modernity during the last 500 years. Familiar as these trends and developments have been, we have not seen where they were leading us.

Technology is perhaps the most obvious cause of globalization, particularly the technology that so rapidly advanced travel and communication across geographical and ethnic barriers. For many thousands of years, humans lived in relative isolation from one another on the different parts of the earth’s surface, areas to which they had slowly dispersed during a long period of human biological and cultural evolution. Then, in the fifteenth century, Europeans invented the ocean-going vessels which gave them access to what they named the ‘new world’. (Such terms as ‘voyages of discovery’, ‘the new world’ and ‘the far east’ express a specifically western perspective.) Only after these voyages were humans able to draw a reasonably reliable map of the planet’s surface, showing how the various continents and islands each form part of one global world. Gradually it became clear (though this was all too slowly appreciated) that the surface of the earth is finite; eventually we reached the end of new ‘worlds’ to explore, at least on this planet.

From the sixteenth century onwards, and particularly in the nineteenth century, ocean travel led to the European colonization of the Americas, Africa and Oceania. This enabled the European nations to export their surplus population and thus begin the global intermingling of races that has continued ever since. At first, globalization seemed perilously close to being Europeanization, but in the latter half of the twentieth century European supremacy was modified, partly by the spread of Indian, Chinese and Japanese people, and partly because some of the colonized peoples took the opportunity of their imperial citizenship to settle in Europe.

In the nineteenth century land travel was speeded up by the invention of steam locomotion and the development of railways. Coupled with the industrialization of production in factories, this brought a radical shift in people’s lives, from a predominantly rural existence to a predominantly urban one, first in Europe and later elsewhere. Whereas families had often stayed in their own village for centuries, people now began to move about in search of work or advancement. With the twentieth-century invention of the motor car, then the aeroplane, families and individuals were even less likely to be anchored to one place for life. When Europeans first began to migrate to Australia and New Zealand in the early 1800s, for example, they faced a three-month sea journey, and many never returned to the land of their birth. Now the journey takes 24 hours. Long-distance travel has become an everyday affair, and is for many young people simply a part of their general education. Even those who cannot afford travel are familiar with the experiences it offers, and often benefit indirectly. Long-distance travel has, most of all, hastened the meeting of races and cultures, and nurtured the incipient global culture.

New technology made another huge contribution to globalization by intensifying the communication of news, the spread of ideas and the transfer of information. It started with Gutenberg’s invention of the printing press around 1440. The Protestant Reformation would not have spread so successfully had it not been for the ability to print propaganda pamphlets and multiply copies of the Bible in the vernacular. Previously the Bible had to be painstakingly copied by hand and, being in Latin (the Vulgate), was accessible only to scholars. Making the Bible available in all Protestant parish churches in the language of the people increased the desire of ordinary people to become literate so that they could read it for themselves. The idea of universal education took root at that time.

In the seventeenth century the first regular newspapers were published, thus spreading further afield the news and current opinions which were otherwise only locally known. The collection of world news and its dissemination through newsprint flourishes to this day in spite of new technological rivals. The distribution of information was greatly assisted by the invention of the telegraph by Samuel Morse in 1837. Then came the invention of the telephone in 1876: now the human voice could be transmitted across ever longer distances, and communication itself became much more personal.

The discovery of radio in the late nineteenth century led to the establishment of radio broadcasting in the 1920s. Owning a radio started as a privilege but was soon seen as a necessity, and is now almost universal, even in Third World countries. Broadcasting was extended to television by the middle of the century, and has become an enormous industry. Television exerts a powerful cultural influence, not only by sending almost instant visual news all over the globe but also by fostering cultural change. Currently it is one of the chief instruments by which globalization is being advanced.

One of the twentieth century inventions with an impact reaching far into the future is the electronic computer, which has the capacity to speed up many forms of written communication. As we are still at the beginning of the computer revolution, its consequences are hard to foretell, even within the next decade. We do know, however, that the speed of change in communications technology is such that each innovation is rendered obsolete within a few years.

The last two decades have witnessed the introduction of the internet, offering a new way of sending information almost instantaneously around the world. Electronic mail via the internet provides fast and cheap personal intercommunication on a global scale. It is bringing together different networks of people from all around the world, many of whom may never have any other contact. The computer has also made a huge difference to the collection and storage of information. All the great libraries of the world are now linked together; immense databanks for all sorts of subjects are accumulating. Computer systems and the internet now provide access to an unbelievable amount of information, both good and bad. The industrial age, only 300 years old, has now been superseded by the information age.

This has sometimes been called a knowledge explosion. However, we need to distinguish between information and knowledge. Having access to information is not the same as being knowledgeable, just as the possession of knowledge does not necessarily produce wisdom. To be knowledgeable we need to absorb and master the information. But the time has long passed since any one person could absorb more than the tiniest fragment of the total body of available, reliable information. One can now be a specialist only in a very confined area.

All this has meant that geographical distance is no longer a barrier separating individuals or groups. We may now have more regular and personal contact with someone on the other side of the world than we do with the person living next door. On the one hand, the boundary lines which separate ethnic and national groups (which were mostly geographical in origin) are now becoming blurred, indistinct and sometimes irrelevant. On the other hand, global pressures often serve to intensify ethnic identity and become the cause of conflict. Both types of response illustrate how the world is fast becoming one global city.

Speed of travel, the intensification of communication and the rise in the average level of education have also meant that the various aspects of western modernity have spread quickly around the globe. We are beginning to be aware that, no matter where we were born and whatever our culture, we share a common story -- the story of human origins within the more complex story of the evolution of life on the planet. As the once separate cultures meet and cross-fertilize one another, humankind is beginning to share more and more values -- such as the concern for human rights and personal freedom.

Today many cultures acknowledge (at least superficially) the supreme value of personal freedom and of human rights, but this dates only from the Declaration of the Rights of Man during the French Revolution of the late eighteenth century. ‘Liberty, Equality, Fraternity’ are commonplace ideas today. Yet the same words struck fear into most thinking Europeans of the day, including religious leaders; they appeared then to threaten the very fabric of society.

In the following 200 years, the revolutionary ideas of freedom, equality, brotherhood and human rights for all have spread further and further, inspiring a series of emancipations. First came the emancipation from absolute monarchy and its replacement by democracy. Then came the emancipation from slavery; humans had the right to personal freedom. Then came the emancipation of women from male domination; women initially claimed the right to hold property and to vote, and more recently their right to social equality and career opportunity. During the twentieth century indigenous peoples have sought emancipation from foreign imperialism, and colored races emancipation from white domination. The emancipation of homosexuals from heterosexual domination is being vigorously debated today. In all cases, the struggle for emancipation has met fierce resistance, which still continues in many quarters.

Globalization has also meant that we are moving from what may be called the closed society -- that is, one surrounded by a clear boundary -- to the open society. In a closed society, one’s freedom is significantly restricted, for one is deeply involved in the duties (or morality) of that society. Anyone outside the closed society is an outcast, an excommunicate or a foreigner: that which is outside is unknown and potentially dangerous. The closed society has a strong sense of its own identity, from which members draw their sense of security; it is held together as a social organism largely by authority, exercised from above but also supplemented by peer pressure. And it depends on such qualities in its members as loyalty, trust and an absolute respect for authority. Until the rise of the modern world, all societies, whether tribal, ethnic, national or religious, were closed societies, to a greater or lesser degree.

The open society has been emerging in the modern world along with globalization. It is a society in which boundary lines are less distinct, so that people can leave or join with relative ease. The open society permits and fosters the growth of individualism; people enjoy greater personal freedom and, by the same token, more responsibility. Whereas membership of a closed society helped to provide personal identity (‘I’m a Scotsman’, for example, or ‘I’m a Presbyterian’), individuals in the open society are both freer and have more responsibility to establish their own identity. Closed societies, by the authority they exert, effectively cushion their members from the exercise of responsibility; decisions are largely made for people, not by them. In the open society, where external authority is greatly reduced, much more responsibility rests with each individual to promote the welfare of the society. Where that responsibility is lacking, the society either disintegrates or is forced to return to the rigid authoritarianism of the closed society.

Many of our current social problems arise from the fact that, as our culture shifts from the closed to the open society, people often struggle with their new social responsibilities. And this is occurring on a grand scale, as a large number of former closed societies find themselves part of one vast, complex and (as yet) embryonic open society. The open society also has some inherent difficulties. Any vigorous human society draws its identity from a shared tradition of common beliefs, values and practices. The growth of individualism and personal freedom can damage such traditions, and this in turn endangers social cohesion. Thus globalization and the advent of the open society, while bringing great benefit to the individual, can have serious consequences for human society. These consequences are the more serious if we remember that our very humanity, as individuals, relies upon human society and what we receive from it.

Language, so essential to our culture and humanity, remains the basis of a human society. Without it there can be no social cohesion. Nonetheless, language has been (and still is) one of the main causes of humankind’s division into separate closed societies. Globalization is now throwing the spotlight on this phenomenon. If there is to be one global community, clearly it would be easier for all humankind to have one language (though that would be no guarantee of peace and unity). There are still over 6,000 living languages, but many have disappeared already, and it has been estimated that another 80 percent will die out during the twenty-first century.

But language is not solely a form of communication; it also provides an essential means of identifying our particular kind of humanity. Diversity of language among humans has provided a richness in human culture not to be undervalued. If all languages but one were to die, the rich cultural heritage of the past would be lost to all but a small scholarly elite. Already the prospect of losing their mother tongue represents a profound loss of identity to many cultures and many peoples. The practical need for a common language in a global society has already assisted the spread of the most widely used languages, such as English and Spanish. Yet David Crystal, an acknowledged authority in linguistics, has warned that the survival of English as the only language left in 500 years’ time would be a great intellectual disaster.

In various places and at different times in the past, a particular language has emerged as a lingua franca alongside a local language. The ideal future would be one in which everybody could converse in at least two languages -- a mother tongue sustaining their own cultural heritage, and a lingua franca providing access to other people and cultures around the globe. The creation of Esperanto was an artificial attempt to achieve this but it is unlikely to succeed. Yet something like a universal language is being provided by empirical science (for example, in mathematics), this being one of the important areas of commonality in the new global culture.

Globalization is further exemplified by the great increase in international commerce and trade. Less and less do countries live solely off the products they themselves produce; more and more, they are engaged together in a complex global economy. Production of goods, marketing, financial backing and promotion are all increasingly planned at a transnational level, with the result that national governments have less economic power (especially smaller states, or those in the so-called developing world). Globalization strategies by the major transnational organizations now dominate the corporate scene, leading to a narrow specialization in key sectors. It is widely claimed that the globalization of production helps to cut costs, and that (as long as gains are not outweighed by transport costs) everybody benefits; the truth of such claims is also strenuously challenged, and there is strong evidence that the real beneficiaries are powerful, wealthy, western countries, and the transnational companies they support. Organizations such as the International Monetary Fund and the World Bank have been formed by the powerful nation states to foster globalization. Initially started by the United States and Britain during World War II to assist postwar international financial and economic co-operation, these have been joined by others such as the World Trade Organization, and act as brokers for the free market. We are moving by stages to a single global economy, operating on free market principles. (The negative and possibly destructive consequences of a free market global economy will be discussed in the next chapter.)

In drawing the nations of the world into one arena, globalization can also lead to serious conflict, as the twentieth century has already demonstrated. Paradoxically, though they threatened to tear humanity apart, the two world wars led to the foundation of international organizations that may be seen as the first steps towards some global form of world government. The League of Nations was established by the victorious Allied powers at the end of World War I for the purpose of preventing another destructive global conflict. It promoted the principle of collective security, arbitration of international disputes, and reduction of armaments; it also set up a Permanent Court of International Justice. But while the League of Nations did assist with minor international disputes, it had little real power for dealing with issues such as the Japanese invasion of Asia, Italian expansion in Africa and German aggression in Europe. In 1946 it was replaced by the United Nations, which inherited many of the League’s goals. According to its charter, the United Nations aims to save succeeding generations from the scourge of war, to reaffirm faith in fundamental human rights, to promote worldwide co-operation in the solving of international economic, social, cultural and humanitarian problems, and to maintain international peace and security. Beginning with 51 nations, membership had grown to 80 by 1956 and to more than 180 by 1996.

Parallel to these international organizations in government and commerce are many international societies of religious, cultural, humanitarian and sporting interests. No sooner does some new enterprise begin than, within a short time, it spreads around the world and takes root in other areas. All major sports are now linked on a global level, and world cup tournaments are the order of the day. Most notable are the Olympic Games, rejuvenated in 1896. Originally a product of Greek culture, the Olympics ran for over 1,000 years as a series of contests including music and poetry along with sport, before being abolished in 393 CE. Today their success is emulated in localized events such as the Commonwealth Games.

The phenomenon of globalization has already begun to generate what may be called global consciousness. This new and developing form of human consciousness is still far from universal, but the world that we each create in our heads1 (our mental picture of reality) is now being constructed rather differently, by absorbing innumerable bits of knowledge and information from all over the world, not just from our own small locality. In particular, as Thomas Berry has observed, the discovery of the evolutionary process is making a profound impact on human consciousness.2

In global consciousness we know that, if we go far enough back in time, we share a common origin not only with people from very different cultural and religious backgrounds, but also with all forms of life on the planet. Moreover, we now face a common planetary future with all other humans and all other creatures. Our forebears, whose lives were contained by cultural and geographical horizons, had only the haziest knowledge of people who lived in other countries, and could afford to ignore them completely -- as we can no longer do. An emerging global awareness has (as we saw in Chapter 6) forced us to see all cultures, including our own, in relative terms.

We understand too that our earthly home is a tiny planet spinning in space, and that it is strictly finite and limited. We know that if we go far enough around the world in one direction we come back to where we started. For the first time in human history we have, in the twentieth century, been able to construct a fairly clear picture of the whole globe in our mind. Space exploration and satellite photography allows us to see even what the earth looks like from outer space. Into our mental picture we slot the various bits of news we see each night on television.

Global consciousness is causing us to discover and acknowledge both cultural diversity and cultural relativity. It is making us more aware of our own cultural identity (or lack of it). Only a few years ago people never used the word ‘culture’ in the way we do today, and never thought of themselves as belonging to a particular culture. Today we continually hear discussions about cultural identity, cultural autonomy and cultural diversity. People of different cultures are meeting personally, reading about each other’s lives, or seeing one another on television. Few of us live any longer within the boundaries of only one culture, and new terms, such as bi-cultural and multi-cultural, are emerging.

All this is evidence of a massive and far-reaching change reshaping the sort of creatures we humans are. We are now conscious of the ways in which our social, cultural, economic and even our mental life is being interwoven with that of others. We are becoming more interdependent, and are having to learn how to become one global society whether we wish to or not.

Although globalization is a modern phenomenon, the idea of a global consciousness and society goes back a very long way, and originated as a religious vision. The prophets and psalmists of ancient Israel envisaged a time when the people of all languages and nations would come from the four corners of the earth and be gathered together into a re-united family. This vision was intensified in the Christian and Islamic traditions to which Judaism unwittingly gave birth. Each of these monotheistic religious traditions had, at its best, the capacity to unite humanity. Each set out with a vision of binding humankind into one community of nations. Christianity looked forward to the time when ‘all nations shall come and worship’ the one deity; Christians spoke of the Kingdom of God, which would supersede the kingdoms of this world. Islam affirmed that people of all nations are brothers one to the other. Muslims saw their community of faith as the culmination of human brotherhood, a concept symbolized magnificently in the ceremony of the annual Hajj to Mecca. Although these visions led to two great multiethnic civilizations -- Christendom and the Islamic world -- neither of these has succeeded in becoming the new global society, partly because each encountered within itself a continual resurgence of pre-Axial ethnic tribalism, and partly because each developed an intolerant exclusiveness which turned it into another closed society, this time of a religious kind. It has been difficult to resolve the uneasy tension between loyalty to the local community, whether ethnic or religious, and that to the larger human community, which globalization requires.

Perhaps the first to grasp the full significance of globalization, and to experience global consciousness intensively, was the Jesuit priest-scientist Pierre Teilhard de Chardin (1881-1955), whose seminal book The Phenomenon of Man was written before 1940 but not published until after his death. Writing at a time when the signs of globalization were not nearly as obvious as they are today, he foresaw a process he called ‘planetization’, by which ‘peoples and civilizations reach such a degree either of frontier contact or economic interdependence or psychic communion that they can no longer develop save by the interpenetration of one another’.3 Teilhard de Chardin wholly identified with the traditions of the Christian west, yet his visionary mind was able to lift the Christian themes and symbols out of their traditional usage and re-interpret them. In his breathtaking conceptualization of planetary life, he saw the emergence of the human species and the subsequent globalization of the planet not in terms of the conquest of the globe by Christianity but as the logical consequence of an ongoing evolutionary process.

He identified one aspect of evolution as divergence. Divergence can be seen not only in the multiplication of species over a long period of time, but also in the multiplication and diversification of languages, cultures and nations within the human species. This phenomenon of divergence functions rather like the lines of longitude on the surface of the planet, growing further apart as they move away from their point of origin at the earth’s pole. At the equator, however, the lines of longitude start to come closer together. This other aspect of evolution Teilhard de Chardin called convergence; he saw it, like divergence, as an integral part of the evolving process as it moved from the point of origin he called Alpha to the end point he called Omega. From the time the species Homo sapiens consolidated itself to become the only survivor of the various hominoid species that had evolved as a result of divergence, humankind slowly spread over the earth, moving into previously uninhabited areas.

But what was to happen when there were no more places into which to spread on this finite planet? That very point, said Teilhard de Chardin, would mark the transition in humankind from divergence to convergence. The human species would fold back upon itself, merging all ethnic groups and cultures into one unified species, one global culture. This would then produce a new, higher form of human consciousness. He foresaw the rise of global consciousness, and dared to call it super-consciousness. Teilhard de Chardin believed that the twentieth century marked the coming of that crucial time in the planet’s history.

What we have described as globalization is remarkably close to Teilhard de Chardin’s planetization, in which ‘[m]ankind, born on this planet and spread over its entire surface, come[s] gradually to form round its earthly matrix, a single, major, organic unity, enclosed upon itself’.4 Thus the globalization of humankind could lead to the formation of a new kind of living entity -- a social organism -- on the same cosmic principle as that by which atoms join to form molecules, molecules join to form mega-molecules, mega-molecules unite to form living cells, and innumerable cells constitute an organism. Within this social organism there would arise ‘a spiritual center, a supreme pole of consciousness, upon which all the separate consciousnesses of the world may converge and within which they may love one another’.5 This super-consciousness would evolve in the same way that personal consciousness does within the complex physiology of the human organism.

Of course Teilhard de Chardin’s vision of the evolutionary process need not be taken too literally. It was severely criticized on publication -- by many scientists for not being truly scientific and by many theologians for not being truly Christian. Some of its details are now dated. Yet, as a total vision of the story of the planet and of the future of humanity, it is still inspiring, and shows remarkable insight.

But is such a scenario even possible, let alone probable? A tenuous global culture is now emerging, though it is still embryonic. It is relativizing all the earlier cultures, just as the trans-ethnic cultures created by Christianity and Islam each relativized the ethnic cultures of the countries into which they spread. Motivating this global culture is the growing global consciousness which we have just described. Is this anything like Teilhard de Chardin’s super-consciousness? It is far too early to say. Possibly the human species, by successfully responding to current challenges, could become so united in love and goodwill that there would be some kind of spiritual center linking all humans together in a commonality of consciousness. Martin Buber in his spiritual classic I and Thou maintained that where people are drawn together by deep personal relationships to form a true community, there emerges in their midst a spiritual center, which both reflects and continues to foster the unique quality of their relationships. Such a center could constitute a super-consciousness.

It must be conceded, however, that Teilhard de Chardin was working within an exclusively Christian framework. He was thinking very much in terms of the church as the supra-national society with Christ as the spiritual center. While globalization has been largely made possible by the Christian west, with its rapidly expanding technology and imperialistic ambitions, this does not make globalization necessarily welcome in non-Christian cultures. They welcome the new technology but often regard the associated cultural challenges not as aspects of globalization but as westernization -- and this they wish to reject. The resulting conflicts and difficulties now surfacing in the international arena were not part of Teilhard de Chardin’s planetization.

In addition, it needs to be remembered that the globalizing process, advanced by western science and technology, also gave rise to today’s secular culture. This is a product of the west, yet it was not planned by those proclaiming and defending Christianity; indeed, traditional Christian preachers often treat secularism as one of their chief enemies. Yet secularity became so much a part of western culture that the mission of bringing Christianity to the rest of the world has spread secularity as well. Today’s embryonic global culture is secular in nature, however much it may draw its values from the Christian and other cultures which have preceded it.

Globalization means that all kinds of allegiances -- personal, family, religious and national -- are increasingly subject to global concerns. We can still value, and feel some loyalty to, our own personal circle and cultural background, but these loyalties are now becoming subject to the imperatives laid upon us by globalization. The citizenship of each particular nationality must take its place alongside global citizenship, our current cultures and our religious allegiance alongside an emerging global culture. But what will be the character of this culture and will it come at all? I would venture that any coming global culture will need to be humanistic (rather than traditionally religious), naturalistic (rather than super-naturalistic) and ecological (designed to promote the health of all planetary life). But as we shall see in the next two chapters, there are strong forces -- economic, political and religious -- which will seek to prevent the coming of such a culture. Have we the wisdom, courage and motivation to become global citizens and to welcome a global culture?

An important aspect of globalization is that no human problem of any size exists in isolation. It therefore does not lend itself to any simple solution. What happens in one geographical area and/or in one aspect of life is quickly reflected elsewhere. Human life on this planet has the capacity to become a complex social organism with the kind of super-consciousness foreseen by Teilhard de Chardin. But there are also divisive and destructive forces present in the humanization of the globe.

Globalization is currently rolling along without any one person, organization or nation controlling it. Like the march of time, it is going on its way relentlessly. There is very little possibility of holding it back or of directing it. As we arrive at the year 2000 CE, the process of globalization has reached a turning point. It could lead to a form of human existence more wonderful and exciting than we can possibly imagine -- a veritable heaven on earth. Or some of the trends that have been encouraging globalization may have disastrous consequences far beyond human control.

Whichever way the future unfolds -- and this we shall tentatively explore -- the year 2000 CE in the Christian calendar may well prove a milestone in human history after all. Calendars are built around such milestones, for, as we have seen, Dionysius Exiguus changed the year 753 of the ancient Roman calendar into the year 1 CE. The year 2000 CE marks not merely the end of the second Christian millennium; it may well mark the end of Christianity as a clearly definable tradition. Globalization is bringing us into such a profoundly different era that some future generation may well be moved to discard the Christian calendar entirely, and rename the year 2000 CE as 1 GE, the first year of the Global Era.

 

Notes:

1. See the author’s Tomorrow’s God, Part I.

2. Thomas Berry, The Dream of the Earth, p.117.

3. Pierre Teilhard de Chardin, The Phenomenon of Man, p. 277.

4. Pierre Teilhard de Chardin, The Future of Man, p. 120.

5. Ibid., p. 124.

Chapter 7: A Post-Christian Future

Whatever our future in the western world, it has already been partly shaped by the Christian tradition. Indeed, the post-Christian age we are now entering owes its very existence to the Christian civilization of the last two thousand years. The structures of the Christian church may have little or no part to play in the years ahead, but the world we live in will remain deeply influenced by Christianity, its beliefs, customs, and culture. This means, first, that the post-Christian age is to be clearly distinguished from the pre-Christian age (which Christians often referred to as pagan). Secondly, it means that the post-Christian era is not necessarily anti-Christian (as many Christians are inclined to judge it), and we can legitimately speak of a Christian ‘stream of influence’ continuing within it.1 But how does this influence manifest itself? First we shall discuss the future of conventional Christianity in the post-Christian world.

Organized Christianity in the form of an ecclesiastical institution has already been greatly fragmented. Church structures will continue to multiply in number and to become smaller. There is no longer any place for a national church or an international ecclesiastical organization which is monolithic and authoritarian. The coming decades may well see the sudden disintegration of the Roman Catholic Church which, because of the central power wielded by the Vatican, has been described as the last great absolutist empire. The pronouncements of the Pope no longer receive from all Catholics the unquestioning and obedient response they traditionally did.

Yet the mainline churches, including Roman Catholicism, will be active for quite some time, carried along by the momentum of past centuries. These churches will increasingly depend on their inherited capital and real estate, until these resources are exhausted. More seminaries will close their doors, and it will be hard to meet any residual demand for a properly trained, professional clergy.

Ultimately there will be no need for a priesthood or ordained ministry. In the post-Christian era divine revelation is no longer seen as a source of knowledge, and the traditional organs of religious authority have become obsolete. The Word of God in the Bible, the voice of the Pope or the decisions of ecclesiastical assemblies -- all will fall more and more on deaf ears. The authority of religious leaders, like that of civil and political leaders, will depend on the emotional or intellectual appeal of what they say and not on any special gifts supposedly conferred on them by ordination. The once clear line between priesthood and laity is already blurred and will soon count for little. People no longer seek professional spiritual advice from a supposed authoritative source as they once did.

This all comes from the growth of human autonomy -- the freedom of people to think for themselves and to make their own decisions. It is not only the traditional religious institutions that are affected; there has been a rapid multiplication of social groups one may choose to join. Anecdotal evidence suggests, further, that people today are more reluctant to commit themselves permanently to any form of association -- be it a club, society, political party, church or marriage partner. Taking life-long vows was once regarded as highly virtuous. Now it may be seen as precarious and even unethical: the person one is at the present moment may not have the moral right to bind the person one has yet to become. In this age of rapid change, and with our modern understanding of the human condition, we can see how much alters in a person’s lifetime; we must remain open to what may come, and free to respond to new circumstances.

Yet, because humans are social creatures, we shall continue to value opportunities for fellowship and interpersonal activities, whether in sport, culture or spirituality. The institutions best suited for spiritual needs are those that are fluid, informal, inclusive and open to change. They must provide the fullest opportunities for people to be themselves, to participate actively and to share in decision-making. As sociologist Robert Bellah said: ‘Each individual must work out his own ultimate solutions and the most the church can do is to provide a favorable environment for doing so, without imposing on him a prefabricated set of answers.’2 But because such groups have no firm structure, they will always be more vulnerable to passing moods and fashions and will have an uncertain duration.

So what place does conventional Christianity have, in the post-Christian era we are now entering? It is no longer a community-held faith which shapes and motivates society. Instead, in its multiple forms, it is becoming one set of personal options among numerous others, including New Age religions and secular ideologies. Together they form a vast religious supermarket to which people may go when they are looking for a philosophy or way of life, and in which they are free to choose one tailored to their needs.

The more traditional practicing Christians will form part of the fundamentalist reactionary movement to be discussed further in Chapter 9. They will even grow in numbers, for their strong convictions are infectious and appear to offer some security in an otherwise frightening world. But, like the remnants of the great churches, they too will become marginalized from society and its chief decision-makers.

For most of the post-Christian world, the Bible will no longer be regarded as the Word of God, but it will continue to be of value as an historical testimony to Judeo-Christian origins and as an essential resource for the understanding of past western culture. It will take its place alongside other great religious classics from the various cultures of the past. Jesus will no longer be hailed as the savior of the world, or as a divine figure. He will stand among the great pioneering figures of the past, and his sayings and parables will continue to inspire those who take the trouble to search them out.

God will no longer be conceived widely as an objective spiritual being -- one who personally hears and answers prayers, and who guides human history from behind the scenes. God language, if used at all, will be treated as symbolic. Spiritual practices may take the form of meditation but will not be understood as conversation with an external personal being. Life in this world will be acknowledged as the only form of human existence. The expectation of conscious personal existence beyond death will gradually be abandoned.

If there is ultimately to be no authoritative ecclesiastical institution, no definitive set of doctrines and no clearly definable personal figure to hold Christianity together and promote it, it may at first appear that Christianity will simply disappear. This might well alarm even nominal Christians -- that large group who regard themselves as Christian though no longer active in the church. And this concern would be justified, for when a religious tradition ebbs away from the culture it has inspired, a spiritual vacuum is likely to emerge, leaving the society vulnerable to forces that threaten its survival.

But Christianity will not disappear without a trace. When it is understood as a ‘stream of cultural influence’, Christianity can be seen as something that already stretches far beyond its ecclesiastical institutions, and is likely to last longer than any of them. The Christian stream of influence may not be clearly identifiable as Christianity and certainly not as conventional Christianity. The Christ figure may continue as a symbol embodying various important values, such as compassion, love, and caring for one’s neighbor. The symbols, concepts, images, stories and myths of Christian origin, which remain deeply embedded in the fabric of western culture, will continue to offer the raw material from which people form their understanding of life, develop their capacity for spirituality and experience satisfaction at the deepest levels. From time to time, individuals and groups will receive fresh inspiration, and experience great delight, as they rediscover in the Bible and elsewhere the cultural treasures of the Christian past.

The decline of the old religious institutions, however, will not just open the door to a life of joyful freedom in some secular Paradise, as some are inclined to think. The great traditions of Buddhism, Christianity and Islam were, at their best, long-term civilizing forces. They were able to curb personal violence and anti-social behavior by providing value systems and goals which were accepted widely enough to bring stability and cohesion to the societies they permeated. When these traditions recede -- as they are almost everywhere in the face of globalization -- we shall see the re-emergence of the more brutal capacities of the human condition, which have long survived beneath the veneer of civilization.

Thus social unrest and anti-social behavior will increase in the coming decades. There will be heightened calls for a return to the religious or ethnic cultures of the past and for the re-introduction of stricter controls backed up by force from a higher authority. Considerable criticism will be directed towards all forms of liberalism. The Enlightenment, which opened the way to secularization, will be blamed for our current predicament. There is some truth in pointing to the Enlightenment as the door to the modern world and its freedoms, but, as with the opening of Pandora’s box, there can be no return to the pre-Enlightenment conditions, except by harsh and repressive measures.

As conventional Christianity is ending, I have chosen to look first at its place in our post-Christian future and to differentiate it from the continuing stream of Christian influence. But what happens to Christianity is no longer the primary question. Far more crucial is the future of the world itself. (Perhaps the church should never have become concerned with its own future or that of Christianity, for it was the imminent future of the world that concerned the first Christians.) No nation, religion or culture can contemplate its future in isolation: as the west leaves Christendom behind, the whole world is entering a new age. All of humankind is being forced to think and act globally. The most important issues before us now (or one could refer to them as the chief religious questions) are these: what is to be the future of humankind in the post-Christian world? What is to be the future of planet earth?

The shape of the coming post-Christian world is therefore important not only to Christians. It is too often assumed (even by many who no longer have any explicit allegiance to the Christian tradition) that the values and social institutions of our cultural past will continue into the future in much the same form. This is very unlikely. As the west emerges out of Christendom, countries across the world are moving into a more unstable and unpredictable situation. The new cultural forces which have emanated from the west and which are causing the decay of traditional Christianity also threaten the future of the other great post-Axial traditions.

We are now aware, as never before, that human history has no predetermined plan. The beliefs that the earth was created especially for humankind, and that human history is providentially controlled by a divine planner, are obsolete (even though they are stoutly defended in some quarters). For all practical purposes, we humans are alone in the universe, and face an unknown and uncertain future. Further, while each of us has some small degree of choice within the tiny micro-world of our personal life, the changes going on in the human mega-world around us are not being planned or controlled even by humans. The ongoing process of planetary life and of cultural change is rushing on like a driverless juggernaut.

All this is clearly reflected in the contemporary school of thought known as post-modernism, which stands at the opposite end of the spectrum from the fundamentalists and anti-modernists. While the term ‘post-modernity’ is sometimes used as a non-judgmental description of today’s intellectual climate, post-modernism refers broadly to the ideological rejection of modernism as the way forward. Some forms of post-modernism are destructive of all unified world views; they deconstruct, or eliminate altogether, such basic terms as God, self, purpose, meaning, reality. Other kinds of post-modernism attempt to be more constructive. But all varieties of post-modernism share a critical distrust of modernity, and a conviction that the deficiencies in modernity cannot be rectified by reviving the pre-modern age. They see modernism as the legacy of the Enlightenment, with its absolute faith in human reason and its supreme confidence that human endeavor can steadily make progress towards an ultimate goal which promises final knowledge and complete human fulfillment. Post-modernists have become disillusioned with modernism and believe it now to be necessary to go beyond the individualism, scientism, mechanization, consumerism and militarism which have been the fruits of modernity.

Post-modernism rejects the modernist goal of building a new world on the basis of science, reason and human endeavor. In this, it is the anti-ideology of our time, announcing the end of ideology and even the end of history. There is no one true and absolute human history; there is no one definitive Universe Story which can unite all humankind into one global society. Literary critic Terry Eagleton wrote in 1987: ‘Post-modernism signals the death of "meta-narratives". Science and philosophy must jettison their grandiose metaphysical claims and view themselves more modestly as just another set of narratives.’3

Post-modernism signals the triumph of the subjective over the objective. Modernism, having celebrated the end of the eternal cosmic order, now finds itself unstable and impermanent as a working philosophy. Some see the emergence of post-modernism as a shift in human consciousness just as radical as the shift from mediaevalism to modernism. It is reflected in the way we use language and the new use we give to older words -- talking, for example, about spirituality instead of religion. Instead of being self-confident explorers of a mysterious external world, we first set out to find ourselves: that is, to find out who we are. Post-modernism is indifferent to consistency and continuity. It questions whether we can have strong beliefs in anything at all, for nothing lasts for ever. All our social structures, like so many of our artifacts, are here today and gone tomorrow.

Nowhere is the transition from modernism to post-modernism more visible than in the evolution of physics during the twentieth century. Modernism developed on the basis of the Newtonian universe, conceived as a complex inanimate machine, operating in absolute space and absolute time according to its own internal laws, which were also believed to be eternal and absolute.4 Understanding this ‘natural world’ was the key to everything; physicists set about uncovering the laws by which the physical world operates; Adam Smith looked for the natural laws by which the economy operates; Darwin thought he had discovered, in the law of natural selection, the origin of species. Later scientists such as Stephen Hawking are still hoping to arrive at what they call a unified ‘Theory of Everything’.

The end of the Newtonian view of reality may be said to date from 1900, when Max Planck laid the foundations of quantum physics, the concepts that introduced us to indeterminism, uncertainty and human subjectivity. No one had any idea at the time of the far-reaching significance of this, and its role in the end of modernity. The discovery of quantum physics led in turn to Einstein’s theories, sub-atomic physics and a new way of understanding reality. For example, quantum physicist David Bohm speculated in his 1951 textbook that there may be some relationship between quantum processes and thought processes.5 He later became convinced of what he called the ‘unbroken wholeness’ of reality, asserting: ‘The primary emphasis is now on undivided wholeness, in which the observing instrument is not separable from what is observed.’6 He believed that our way of seeing reality as fragmented bits with their own independent existence is an illusion, and he coined the term ‘implicate order’ to refer to this undivided wholeness. Claiming that we falsify reality if we divide it into mind and matter, into living and non-living, he said: ‘consciousness (which we take to include thought, feeling, desire, will etc.) is to be comprehended in terms of the implicate order, along with reality as a whole’.7 Thus the randomness of sub-atomic elements may be linked with the creative freedom exercised by human consciousness.

Today we are being forced to rethink the nature and origin of the universe, the nature of the human condition and the nature of scientific enterprise. Only by excluding ourselves and our consciousness from the universe can we think of it as a lifeless thing. Once we acknowledge that we are not just in the universe but a part of the universe, then the universe itself must be conceived of as alive -- as we are. And because we are part of the universe and think, the universe has the capacity for thought.

One hundred years ago there was such confidence in science that it seemed set to uncover the last secrets of the universe. Today the mood of the scientific community is much more modest. Indeed, each addition to reliable knowledge has tended to uncover more mystery and complexity. The influential philosopher of science, Karl Popper (1902-1994), convincingly argued that science does not prove to us what is true: its real strength is to show us what is false. Even Stephen Hawking has conceded that if we did work out a Theory of Everything there would be no way of proving that it was true.

Thus science is no longer the objective activity we took it to be, uncovering the hidden eternal truth of the universe (an idea which goes back to Plato). Science is a human enterprise, which is never free from human limitation. Physics is increasingly dependent on such skills as mathematics, itself a human creation. What we assumed to be the laws of nature, which we humans cleverly discovered, turn out to be human judgements based on observation, experiment and measurement. They are continually open to revision and must be regarded as probabilities rather than as certainties. Thanks to Max Planck, Albert Einstein, Ernest Rutherford and many others, we now find that physical matter is not the static, inanimate and stable stuff we had assumed it to be. The stuff of which the real world is composed is dynamic in the extreme. Every atom is a fuzzy little cloud of incredible energy and movement. And just as matter is not solid and indestructible, so our knowledge of physical reality is not certain and infallible. Even about quantum physics itself there is no finality. Already there is some theoretical difficulty in reconciling quantum physics with chaos theory, which began to develop rapidly in the 1980s.

Of one thing we can be sure: the absolute laws of nature on which modernism was based are no more. We continue to observe the regularities of cause and effect, which the humanly constructed laws of nature were intended to encapsulate; but we are also much more aware of how many natural events seem to occur through sheer chance. The universe in which we find ourselves appears to be a mystifying mixture of both chance and necessity, to use the words of Jacques Monod.8

The way we now apprehend the world (and understand our uncertainty in it), though based on western thought, has an impact far beyond the geographic boundaries of western civilization. This is why the post-Christian and post-modern future is relevant to the world at large and not just to the west. Over the last 400 years the western world has influenced the rest of humankind (either to its advantage or disadvantage) more than any other civilization. As Samuel Huntington has said: ‘The West inaugurated the processes of modernization and industrialization that have become worldwide, and as a result societies in all other civilizations have been attempting to catch up with the West in wealth and modernity.’9 Today the old western assumption that Christendom would eventually envelop the world is no more likely than the idea of global Islam or Buddhism. However, in its spread around the world, the west did much to bring the modern global world into being. Thus the demise of Christendom, followed by the dissipation of the Christian tradition, will have a direct effect on the whole globe.

Trying to see what the future holds has become such a feature of twentieth-century thought that it has earned a title of its own. The earliest form of futurology was science fiction, which usually anticipated the future with pleasurable amazement. This literature has been more than just a new form of entertainment. Like the apocalyptic writings of New Testament times, it has stretched the imagination and inspired great confidence in the future. But, whereas in ancient times the apocalyptic writers expected God to usher in the new world, the first science fiction authors described a future created by human invention. Readers were encouraged to believe in the power of human ingenuity, a power that repeatedly overcame all future problems by scientific discovery and technological expertise. Some of the best examples, such as H.G. Wells’s The Time Machine (1895) and The War of the Worlds (1898), became classics. The genre has continued in cinema and video with such popular epics as Star Wars. Not all science fiction, however, is optimistic. Some writers, such as Aldous Huxley in Brave New World (1932) and George Orwell in Nineteen Eighty-Four (1949), struck an early cautionary note.

Futurology has now grown into a widespread industry for more practical purposes: looking into the future has become an essential part of social and economic planning. Some notable books have been written in this field: The Coming of Post-Industrial Society (1973) by Daniel Bell, The Third Wave (1980) by Alvin Toffler, The Fate of the Earth (1982) by Jonathan Schell, and The Green Machines (1986) by Nigel Calder. But the attempt to forecast the shape of the future and the post-Christian world is fraught with difficulty. Chaos theory now helps us to understand how small and unknown factors can be instrumental in causing significant change in natural phenomena. Weather forecasting can be seriously astray even 24 hours ahead. Economic forecasting is even more problematic, for it is dependent on the uncertain factors of human choice and idiosyncrasy. How much harder it is, therefore, to look into the general human future when events and trends reflect countless billions of personal choices.

Let no one think that the coming century will present a utopia unfolding smoothly before us. Just as the great religious traditions emerged from their ethnic origins only through confusion, conflict, controversy and even violence, so it will be in the century to come. This has the potential to be the most creative and glorious century yet in human culture. But it also has the potential to be the most violent and destructive period in the long history of humankind.

The human species currently consists of some six billion individuals spread around the globe in millions of groups large and small, all focused on their own affairs without much concept of any ultimate goal, or the value of their contribution. No wonder human expectations about the shape of the future have been considerably shaken during the twentieth century.

The best we can do, in attempting to imagine the twenty-first century, is to assess the current trends in the fast-changing human cultures. As John Naisbitt has observed in Megatrends (1982), most trends are found to be interconnected and part of a worldwide process; most also occur over a period and without fanfare. If they do produce an obvious crisis (for example, the French Revolution), we usually find that the forces at work have been present for some time. Cultural change normally creeps up on us relatively unnoticed. Only when we look back can we identify patterns and turning points. That is the case with the end-times discussed in the early part of this book, and it is why so many are still unaware of what has been happening. So what matters in the coming century is not what occurs (or fails to occur) on 1 January 2000 but how the years unfold thereafter.

The most dominant trend today, and the one we must look at first, is globalization. This will not be experienced everywhere as an unqualified blessing, and it may well lead to a testing period for the human species.

 

Notes:

1. See Chapter 5.

2. Robert Bellah, Beyond Belief pp. 43-44.

3. Quoted in David Harvey, The Condition of Post-modernity p. 9.

4. See Chapter 6.

5. David Bohm, Quantum Theory.

6. David Bohm, Wholeness and the Implicate Order, p. 134.

7. Ibid., p. 196.

8. See Jacques Monod, Chance and Necessity.

9. Samuel P. Huntington, The Clash of Civilizations and the Remaking of World Order, p. 302.

Chapter 6: The Discovery of Relativity

There is a book in the Bible not much loved by Christian preachers, even though it is called ‘The Preacher’, or by its Greek title, ‘Ecclesiastes’. It starts off: ‘"Vanity of vanities," says the preacher, "Everything is vanity."’ The word translated as vanity literally means ‘thin air’, and it was used to describe whatever is ‘vapor-like’, ‘insubstantial’, ‘having no solidity or permanence’. The unknown author of this book was writing at a time when his Jewish heritage was encountering challenges from Greek critical thought, and he expressed here his sense of uncertainty. In today’s fast-changing world, many might share his sense of ‘vanity’, of impermanence, of the ground shifting beneath their feet. But our equivalent word might be ‘relativity’ -- and how often do we hear the phrase ‘everything is relative’?

The phenomenon of relativity has been one of the epoch-making discoveries of this last century of the second millennium. We mostly associate the term with Albert Einstein (1879-1955) and his two famous cosmological theories of relativity. But these simply brought to a surprising climax a thread of thought which began much earlier, and which applies to much more than our understanding of the physical universe. Indeed, the word ‘relativity’ had already been used in 1890 to refer to the reciprocal interdependence of the individual and society.

Briefly, the concept of relativity means that everything exists in relation to something else and is in some way dependent on something else for its being. No thing in the universe can be fully understood in isolation; everything we previously took to be absolute and final is now relativized. To understand this, we will start with Einstein’s cosmic relativity.

Perhaps the first glimpse of cosmic relativity came when Copernicus (1473-1543) proposed that the sun and not the earth is the center of the universe. This was disturbing at the time because the suggestion that the earth was moving around in space threatened the dependable certainty of the ground beneath our feet (then called terra firma). The mental picture of the earth revolving around the sun suggested that, at any time, we might drop into free fall in outer space.

Isaac Newton (1642-1727) was able to bring some reassurance: his theory of gravity meant that we were firmly attached to the earth. But Newton retained the idea of absolutes. The sun, rather than the earth, became the center of the universe, and the solar system operated within the two basic absolutes of space and time. Indeed, Newton invented these terms, saying, ‘Absolute space, in its own nature, without regard to anything external, remains always similar and immovable.’ This seemed as self-evident to him as it still does to us; three-dimensional space appears to be just there; objects like planets and falling apples move within space, but space itself does not move.

Similarly, Newton believed that objects can move in space because of the existence of another absolute -- time. So Newton said, ‘Absolute, true and mathematical time, of itself and by its own nature, flows uniformly, without regard to anything external.’ Again, not only does this appear to be common sense, but our clocks also appear to prove it. It was to be more than 200 years before these two absolutes were questioned by Albert Einstein.

In 1905 Einstein published his Special Theory of Relativity, in which he questioned the very notion of absolute space, showing that nothing is ever absolutely at rest or absolutely in motion. The ideas of rest and motion are valid concepts only when used in relation to something else. In the second century Ptolemy had accurately described the paths of the planets relative to the earth, and very curious paths they were. But when Johannes Kepler and Newton measured the movements of the planets relative to the sun, their orbits were seen to be ellipses. These were beautifully simple orbits compared with Ptolemy’s complex ones. In Einstein’s view, however, Ptolemy was not wholly wrong, and neither were Galileo, Kepler and Newton wholly right: there is no fixed or central point in space to which everything else must be related. That is the first important consequence of Einstein’s theory of relativity. All motion that we observe is relative to us. All rest, too, is relative. Of course, for practical purposes we regard the position from which we make an observation as a fixed point, but this is an arbitrary choice on our part.

Now let us turn to time. To measure the orbit of a planet we must record not only the position of the planet relative to us but also the exact time we observe the planet. In other words we are recording a series of events. We can accurately measure cosmic events -- say, the ‘distance’ or ‘interval’ between any two sightings of a planet -- only by means of a combination of space and time, and not by either of these separately. This led Einstein to speak of the universe as a space-time continuum. The word ‘continuum’ means a continuous thing, all of whose elements flow into one another. The universe is a continuum, said Einstein, in which the three dimensions of space and the one dimension of time flow into one another to form an indivisible whole.
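The ‘interval’ between two events can in fact be written down precisely. In the standard formulation of special relativity (stated here only as an illustrative sketch, not as a quotation from Einstein), the interval s between two events separated by a time Δt and by spatial distances Δx, Δy and Δz is:

```latex
% The invariant space-time interval between two events
% (standard special relativity; an illustrative sketch).
% Observers in relative motion disagree about \Delta t and the
% spatial separations taken separately, yet every observer
% computes the same value of s^2.
s^2 = c^2\,\Delta t^2 - \left(\Delta x^2 + \Delta y^2 + \Delta z^2\right)
```

Only this combined quantity -- neither the spatial distance nor the elapsed time on its own -- is the same for all observers, and that is why space and time must be treated as a single continuum.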

Our traditional units of time illustrate how all measurement of time is relative to something else. Our hours and days derive from the time it takes the earth to revolve on its axis; our months from the time it takes the moon to revolve around the earth; and the year from the time of the revolution of the earth around the sun. Thus, just as it is necessary to surrender the notion of absolute space because there is no fixed point in space, so we must now surrender the notion of absolute time.

Every measurement of time is also relative to the place where the observation is made. We used to assume that, when we were observing a planet or star, the time on the star was the same as ours. That is not so. There is no absolute present moment which can be experienced or observed simultaneously throughout the universe. There is no absolute point in time, any more than there is an absolute point in space. Just as the observed position of a heavenly body is relative to us, so the time when we make the observation is also relative to us. It is our time. This is not at first easy to grasp, and it comes as a certain shock when we do so. Time and space have been around us for so long that we have taken them for granted. Even around the surface of the planet we can still take them for granted -- there is only a momentary delay when we telephone someone on the other side of the earth. But when we move out into celestial space the problem is magnified and the time we measure is clearly our time, not universal time or absolute time.

Or to put it another way, when we look up at the starry sky we are not only looking out into space, we are also looking back into time. The heavenly bodies we see are not there at our present moment but were out there at some time in the past, depending on the time it has taken the light to travel from them to us. This varies tremendously all around the sky. One star we observe may be taking us back a thousand years. But in the same area of sky we may be looking at a star that is taking us back a million years. For the distant nebulae, as seen through the telescope, we are looking back through hundreds of millions of years.

In 1915 Einstein developed the General Theory of Relativity, to explain apparent conflicts between his Special Theory of Relativity and Newton’s law of gravity. Why do objects fall to the ground? Newton explained this in terms of the force he called gravity, which causes any two objects to be attracted to each other. The immense mass of the earth attracts the relatively minute mass of our bodies, and so we stay on the surface of the planet even though it is whirling us around its center at about 1,600 kilometers per hour. Objects feel heavy in our hands because we have to counter the force of gravity pulling them to the earth. But why, when we are going up in a fast elevator, does the parcel we are carrying suddenly feel heavier as we set off? We say that this is because of the acceleration or change of speed of the elevator. Why does acceleration give us the same feeling as gravity gives us? To Einstein, acceleration and gravity are essentially the same force. In the four-dimensional space-time continuum (of Einstein’s universe), every object is subject to acceleration along what he called its world line; this curves when it is in the vicinity of any other object with mass, such as the earth or the sun. Thus the path followed by light coming from a distant star curves in the vicinity of the sun. This led Einstein to speak of the curvature of space.

Einstein’s theories of relativity opened the way for a new and quite different understanding of the universe. It is, in the first place, billions of times bigger and more complex than people had previously thought. Our sun is one medium-sized star among the hundred billion or more which make up our galaxy, and our galaxy in turn is one of more than a billion such galaxies. The planet earth is but the tiniest speck within a vast cosmic sea of nebulae. This planet is of supreme importance to us, but to the rest of the universe it is largely irrelevant.

As a result of Einstein’s theories, the static model of the universe was abandoned, and replaced with the dynamic model of an expanding universe. Even Einstein found this difficult to accept and, to avoid this surprising conclusion, he proposed a cosmological constant. But when the astronomer Hubble produced strong evidence that the universe is indeed expanding and that the further away a nebula is, the faster it is receding from us in space, Einstein confessed that his cosmological constant was the biggest blunder of his life.

Einstein’s theories of relativity may be wonderful and puzzling, but what is their relevance in this book? R.B. Haldane pointed out in The Philosophy of Humanism (1922) that Einstein’s theory of relativity is a scientific and exact illustration of a much wider principle: all our knowledge is relative to the human mind that produced it. Or to put it another way, we humans have evolved in a symbiotic relationship with the culture created by the countless generations before us; we are dependent on the culture into which we have been born, not only for what we think and believe we know, but also for our very humanity. In other words, we humans are subject to cultural relativity. The phenomenon of relativity not only denies the absoluteness of time and space; it also undermines the certainty of our knowledge and the absoluteness of the values and purposes by which we live.

Consistent with the new model of an expanding universe in constant flux is that of an ever-changing planet. In its earliest geological history the earth’s surface was bubbling with activity -- with exploding volcanoes, boiling lakes, massive earthquakes, great gulfs opening and closing, whole continents appearing and disappearing. Even now the continental plates are always moving, albeit slowly, and mountain ranges are rising and wearing away. Nothing is permanent, not even the mountains, which the ancient psalmists used to regard as symbols of enduring stability. On planet earth nothing stays the same for ever.

On the surface of the earth, on the boundary where the atmosphere meets the hydrosphere, evolved the thin film of life we call the biosphere, in which change has been particularly fast and dramatic. Life of some kind on this planet stretches back through more than three billion years. It has manifested itself in a bewildering richness of species. Planetary life started with the simplest living cells and amoeba-like creatures, yet out of them, through increasing complexity, our own species eventually evolved. Until only last century, humans thought that all species, including our own, had been here from the beginning. Now we know that, on the time scale of the earth, we emerged on this planet very late indeed.

Just as scientists from Galileo to Einstein ‘relativized’ the planet earth from its once central position to a tiny and impermanent fragment of a much vaster space-time continuum, so Darwin and his successors have relativized the human species, displacing it from its central position to become simply one species in a continuum of planetary life in which all species, past and present, are genetically related. The new story of all life on this planet has undermined the permanence of any species, including our own. This is the first consequence of relativity in relation to the planet, to which we have to become adjusted.

It is not surprising that Darwin caused a stir commensurate with that raised by Galileo. People had previously held what is called an anthropocentric view of the universe. In most respects we still do. Our forebears not only saw themselves as a race quite apart from all other animal species, but thought also that the universe was especially made for their benefit. Many people, on first encountering Darwin’s theory of biological evolution, feel deeply affronted and refuse to accept it. That is understandable. Our dignity is hurt when we find ourselves described as animals. We have come to think of ourselves as rational creatures, not only intellectual but also spiritual in character, and made in the image of God.

We belong to the total stream of life on this planet, and all other creatures are our genetic relatives. Physiologically humans differ only in degree but not in kind from other earthly creatures. Our human DNA is said to be 98 percent the same as that of the gorilla. We cannot escape our animal form -- to which everything we do and think remains connected. We humans have appeared right at the tail end of earth’s history -- relative, of course, to the present moment. Many other types of creatures have been here before us, including the dinosaurs, which roamed the planet for nearly 200 million years. And just as there is no immortality for any member of a species, so there is no guarantee of permanence for any species itself, even though it may last through countless generations. The time will probably come when humans are extinct on this changing planet, like so many species before them.

There is, however, a great gulf separating humans from all other living species. It is not a physiological but a cultural one. We share with all the other higher animals the same vital functions of breathing, eating and reproducing. But that is only half the truth. We are sociocultural animals. What is most distinctive about us as an animal species is that all of our vital functions have been qualified and transformed by patterns of behavior we have learned from the culture into which we were born.

Within the continuum of planetary life there have evolved many interconnected systems, each of them in symbiotic relationship with its environment. In the case of the human species, we have evolved not only in a symbiotic relationship with the physical environment of the earth but with another kind of environment, known as human culture. By this is meant everything we humans have constructed with our hands, performed by our actions and thought with our minds. The basis of all human culture is language. As Don Cupitt has said: ‘Language is the medium in which we live and move and have our being. In it we act, we structure the world and order every aspect of our social life. Only Language stands between us and the Void. It shapes everything.’1

Language enabled our human forebears to reach a heightened form of consciousness; they came to depend less and less on biological drives and animal instincts, and organized their lives with an increasing awareness of their emotional, intellectual and spiritual needs. Consciousness began to evolve into the critical self-consciousness we are capable of today. So as our species gradually became human, we ceased to live an exclusively animal existence and developed, in addition, a cultural existence. It is by means of language that we have developed human culture and it is by being immersed in culture that each new generation becomes human.

Whether there was ever a time in the past with only one human culture, however primitive, we do not know. But we do know that over time a bewildering plurality of human cultures has evolved, as the Tower of Babel myth symbolically describes. And so we can ask: if it is by being nurtured by a culture that we become human, does this mean that there are many different ways of being human? Yes, it does! There has been a Maori way of being human, a European way of being human, a Chinese way of being human. Cultural differences do turn us into different types of human beings.

Until recently each culture assumed itself to be greatly superior to others, and to constitute the norm or truest type of humanity. It was common in the ancient world, for example, to distinguish between the barbarians and those who were civilized. Even as recently as last century, Europeans tended to regard tribal peoples as savages. For Christians, being a Christian was the ideal way of being a human; for Muslims, being a Muslim was. Today such judgements are seen as cultural chauvinism: we are becoming aware of cultural relativity. Just as there is no center to the universe and no earthly species that is biologically superior to all others, so no human culture provides the norm to which all cultures should conform. All human cultures are relative to time, place and experience.

The evolution of human culture has taken place in a much shorter time than biological evolution. Human culture also changes much faster. No culture stands completely still, even though some change more slowly than others. Each culture is a living, changing phenomenon, and it changes as a result of human thought and decision-making. Each new generation inherits the cultural deposit of the past and adds something of its own. Today human culture is changing much faster than at any previous time. This is all the more reason for us to understand the shifting and relative character of every culture.

What we learn from cultural relativity is firstly this: our cultural convictions and practices are always relative, relative to the time in which we live, the position we choose to take, and the cultural inheritance which has shaped us. Nothing about them is absolute or unchangeable. This book, indeed, is simply one person’s thoughts, reflecting a standpoint in western culture and trying to take into account what appear to be dominant global trends. Just as we humans are earth-bound and time-bound, we are also culture-bound. We can no more escape from cultural relativity than we can defy gravity.

Secondly, no culture stays the same. Every attempt to preserve a culture by human effort is doomed. The very fact that people set out to try to preserve it is a sure sign that a culture is already changing fast and perhaps dying. That is true of great religious cultures like Christendom and Islam, even though their respective fundamentalists think otherwise. It is also true of indigenous ethnic cultures, in spite of the best efforts to preserve them. No culture can be made absolute or permanent.

The chief substance or identity of any culture is to be found in its morality and its religion. By morality (literally, the customs or mores) is meant the patterns of behavior which are deemed by a society to be ideal or at least permissible. The definition of religion is much more difficult and hence debatable. ‘To be religious,’ said theologian Paul Tillich, ‘is to be grasped by an ultimate concern, a concern which qualifies all other concerns as preliminary and which itself contains the answer to the question of the meaning of life.’2 Carlo Della Casa said, ‘Religion is a total mode of the interpreting and living of life.’3 These descriptions allow for the many and diverse forms of religion.4

Indeed, it is the relativity of religion that makes its definition difficult. Just as there is no one absolute culture and no one absolute morality, so there is no one absolute religion, despite the fact that some religions, particularly the monotheistic ones such as Christianity and Islam, have claimed that they alone possess the absolute truth and that all other religions are false or inferior. There are many forms of devotion, many modes of interpreting life and living; each of them is central for the people who practice it, but that is a relative and subjective judgement.

Every human culture has evolved on the basis of a particular way of interpreting human existence. Tillich suggested that religion is that which gives culture its depth and its strength. Without the religious dimension a culture has no staying power and no clear identity. Whatever provides a culture with its goals, values, motivation and creative energy is its religion (irrespective of whether that term or its equivalent is used).

Religion and culture are so closely interwoven that, though they are not one and the same, neither can exist without the other. In a healthy homogeneous society, culture and religion are deeply blended. That is why the pre-Axial cultures were not even aware they had a religion and hence had no name for it. We still refer to their religions by the ethnic group to which they belong. We speak, for example, of the religion of the Babylonians, of the Greeks, or of pre-European Maori.

Because religion arises out of the quest for meaning, and is a mode of interpreting life, it is dependent upon language. In fact, most great religions have never become completely divorced from the particular language in which they came into being. Judaism is closely tied to the Hebrew language. Hinduism goes back to Sanskrit for the study of its founding scriptures. Buddhism, though a little more universal, still sets great store by the Pali texts. Until the last two centuries Christianity was largely tied to Greek and Latin, the languages within which it first evolved. Islam so honors Arabic that it does not officially permit the Qur’an to be translated into any other language. Each of the great religions has also developed its own symbolic language as its interpretation of human existence within the world. Religions have even been defined as symbol-systems. Each symbolic language is a kind of super-language which has to be learned and understood by those who embrace that religion as their way of life. These symbolic terms all cohere and relate together, depending upon one another for their full meaning and often being defined in terms of one another. Together they form what the philosopher Wittgenstein called a ‘language-game’. Just as a game consists of a set of rules that we have to learn and honor if we are to play it, so a religion consists of a system of symbols that we have to understand as a whole. Each symbol-system forms a religious language for interpreting the meaning of life (or, more correctly, for creating a meaning for life).

Modern scholarship has revealed not only how much our capacity to be human depends on language and culture but also the extent to which all language (and particularly religious language) is symbolic. To many it has been an unwelcome shock to hear basic religious terms such as God, Christ and resurrection referred to as symbols when these terms have previously been used as literal descriptions of unseen reality. Even worse is the idea that these symbols, along with the whole cultural tradition which uses them, have been humanly created.

Yet that is exactly what has been slowly coming to light as the advent of modernity has unfolded to us a new story of human origins. We have seen that all human cultures are human creations, each of them being the collective creation of an ongoing ethnic group. Each of these cultures has produced a human-created morality and each has created its own set of symbols for the interpreting and living of life. The word ‘God’ is a symbolic term which is no less a human creation than the class of beings called ‘gods’, which ‘God’ came to replace at the Axial Period.

Until the modern period humans remained unaware of just how creative they really were. It seemed natural to assume that every new thought or vision came to them from outside. We still reflect this tendency in such simple expressions as ‘I was struck by a brilliant idea’, as if it came from somewhere else. Psychology has helped us during the twentieth century to understand the creativity of the human psyche, so that we now have a quite natural explanation for, say, the voices heard by Joan of Arc and the vision of John in the Book of Revelation.

Even more important, we now have an explanation for the types of religious experience that led people to attribute their thoughts to divine revelation. What has been claimed as revelation from a divine source of knowledge is in fact the product of human creativity, stretching back over a very long time and involving countless people. For example, the Islamic world accepted from the outset that the words of the Qur’an, expressed in beautiful Arabic poetry, could not have been composed by Muhammad, but must have originated with Allah and been transmitted to Muhammad by way of divine revelation. Muhammad’s own account of this was of course perfectly sincere, for he would have been unaware that his own creative psyche was the source of that remarkable outpouring that became the Qur’an.

No traditional Muslim would accept this natural explanation, any more than an orthodox Christian would accept that the long-held revelations of the Christian tradition are the product of human psychic creativity. Each religious tradition has exempted itself from natural explanations, while applying them to all the other traditions. In today’s global world, this will no longer do. We land ourselves in this inconsistency by not acknowledging relativity. If Christians apply natural explanations to the rise of other traditions -- tracing, for example, the foundation of Mormonism to the visions of Joseph Smith -- those explanations must be applied to the Judeo-Christian tradition as well.

The new understanding of how the human mind works in creating human culture has shown more clearly the relative nature of all religious traditions. Those who have set much store by the belief in divine revelation feel, at first, a great sense of loss, comparable to that felt when the earth ceased to be the center of the universe. The loss of divine revelation has left each religious tradition bereft of its supposedly firm foundation.

Yet it has not been all loss. Belief in divine revelation has had its negative side. People have been inspired by revelation, and had great confidence in their beliefs, but they have also been motivated to impose these on others, for the latter’s own ‘good’. When we attribute to a divine source what are really our own thoughts, visions and aspirations, the consequences are serious. People who adamantly declare that they know the will of God for society at large are unconsciously projecting onto an objective deity their own ideals and aspirations, including their prejudices. In this way we become enslaved to our own thinking. It is even more damaging when we treat the words of holy scripture as divine revelation, for then we are enslaved to the thinking of ancient humans, whose ideas may be long outmoded. Any religious tradition claiming to be the absolute truth in a universe so marked by relativity leads not to the salvation of humankind but to its enslavement.

In the last 200 years we have become increasingly aware of the relativity of culture, morality and now of religion. It means that all religious traditions are of human origin -- none is exempt.

It means also that all religious ideas, concepts, symbols and traditions are human in origin, however valuable they may remain. Just as there is no one culture which is the norm for all other cultures, no one morality which is the norm for all other moralities, so there is no one religion which is the norm for all others. None of them is absolute and final, and those which claim to be must surrender those claims if they are to continue to be a viable means of the interpreting and living of life.

As theologian Tom Driver put it: ‘Christianity has been compelled to see itself as a religion relative to other religions and relative to the history of the world. Christianity does this reluctantly . . . The gap between Christianity and modern theories of relativity is widening so much that the church’s teaching about Christ is in danger of losing both its intellectual and its moral credibility.’5 Driver then showed that our understanding of relativity has made necessary a radical redrafting of the whole of Christian thought:

To think of Christ as the center, model and norm of humanity made a certain sense in the Ptolemaic universe, which had the Earth as its center. It continued to make some sense, however strained, in the Copernican universe, which had the sun as its center. Today, christocentrism cannot make sense in the Einsteinian universe, which has no center and in which every structure is a dynamic relationality of moving parts. The ethical theological task of the churches today is to find a Christology which can be liberating in a world of relativity.6

Let us return to the words of Ecclesiastes, for this ancient preacher caught something of the spirit of relativity -- a word that probably translates the original Hebrew word better than ‘vanity’. Vanity implies futility but relativity does not. And Ecclesiastes remained positive in the face of impermanence. He was able to say, ‘Go ahead and enjoy life with your partner. Eat your meals and drink your wine with a merry heart. And whatever your hands find to do for your daily toil, do it with zeal’.7

Relativity tells us that there is a mysterious elasticity about time and space, that all physical reality is in a state of flux, and that the cosmos was not made for any obvious purpose. But it is just because nothing lasts for ever and there is continual change that life has been able to evolve and that humanity has developed as it has. There was nothing necessary in this. Each of us exists as the result of an almost infinite number of accidents or chance events. We find ourselves living in an otherwise meaningless universe where there are no absolutes and nothing is certain. Within the changing conditions and evolving life on this planet, and out of the various developing cultures that have shaped us, we humans can and do create meaning for ourselves. Even though our efforts remain subject to relativity, they need not be futile and vain. Since all sense of purpose and human fulfillment resulted from human creation in the past, we can continue to create a purpose for living in the future. It is with that kind of faith and hope that we can enter the new millennium as we come to the end of the Christian era.

 

Notes:

1. Don Cupitt, Creation out of Nothing, pp. 4-5.

2. Paul Tillich, Christianity and the Encounter of World Religions, p. 3.

3. C. Jouko Bleeker and Geo Widengren (eds.), Historia Religionum, Vol. 2, p. 355.

4. There is a fuller discussion in the author’s Tomorrow’s God, Chapter 7.

5. Tom Driver, Christ in a Changing World, pp. 69, 66.

6. Ibid., p. 56.

7. Ecclesiastes 9:7, 10 (author’s translation).

Chapter 5: The Christian Stream of Influence

If Christian civilization is no more, if Christian orthodoxy is disintegrating and if Christian Modernism has failed to rescue it, where does this leave Christianity? Is it also facing its demise? This depends on what we understand by Christianity -- a question which, rather problematically, cannot be answered in the same way for all who call themselves Christian.

Many assume Christianity to be identified with what became the classical Christian doctrines (orthodoxy), yet it was several centuries before these were explicitly enunciated in the creeds by the ecumenical councils. Christianity is older than the orthodoxy it later produced. (In any case, since the Reformation, there have emerged several ‘orthodoxies’, each claiming to be the true one). Something which might be called Christianity clearly existed from the time the first followers of Jesus proclaimed him to be the Christ and found themselves referred to as Christians (Acts 11:26). Yet the study of Christian history shows there has never been a time when all Christians have agreed on what it is to be a Christian. The first sharp difference of opinion is documented in the New Testament -- it was the difference between the original (or Jewish) form of Christian allegiance to Jesus Christ and the Pauline (or Gentile) form. That rift was never healed. The ecumenical councils later established what they held to be the only true form of Christian teaching, and did so by declaring heretical all who failed to accept their definitions. But those ‘heretical’ movements also claimed to be Christian. Thus, over two millennia, there have been innumerable different ways of understanding what it means to be a Christian, and during the last 500 years they have been multiplying.

Today, some think of Christianity as a matter of holding certain beliefs, while others think of it as a particular lifestyle. Some regard Christianity as a set of values to be honored as a guide to living; others experience it as a conversion in which one accepts Jesus Christ as one’s personal Lord and Savior. Some see themselves as incorporated into the church as the body of Christ; others believe everything in the Bible and call themselves bible-believing Christians. The Christian path of faith has been walked in many different ways by innumerable people through the centuries. What links them together is their common respect for the Bible (though they interpret it in different ways) and their desire to give their allegiance to Jesus as the Christ (albeit in many different forms). To avoid adopting a sectarian viewpoint, it is necessary to include in the broad stream of Christianity not only what has been at the center but also what has been on the margins -- and that includes what some have judged to be heresies. We should remember also that what has been heretical to one age has sometimes been approved by another, and vice-versa.

A precise answer to the question ‘what is Christianity?’ thus remains elusive. However, W. Cantwell Smith in his book The Meaning and End of Religion (1964) opened up other ways of apprehending the Christian experience. Objective names such as Christianity, Hinduism, and Buddhism have only come into use in recent centuries; this phenomenon, implying that religions are ‘things’, is described as ‘reification’. Instead of using the term Christianity, Cantwell Smith suggested that we would do better to focus on two quite different components present throughout Christian history -- the first he calls the Christian cumulative tradition and the second is the personal faith and commitment of people who think of themselves as Christians.

By the Christian cumulative tradition is meant the sum of all the objective data that have marked the complex path of Christian faith through the centuries. These include, for example, the Bible, creeds, confessions, theological systems, deviant heresies, moral codes, myths, buildings, social institutions -- everything that has been left as an extant deposit within the developing Christian culture, and which can be studied by the historian. It is not the historian’s place to prefer one set of Christian data to another, or to side with the orthodox over the heretics, but only to decide whether a datum is definitely linked with the cumulative Christian tradition as a whole.

Faith is something quite different. It is the attitude of trust and hope with which humans can face the future and all the challenges life brings. Faith is an attitudinal response of the whole person, involving the emotions and the will as well as the mind. It is not therefore to be identified just with beliefs, for these are solely cognitive. Being personal and subjective, faith is not open to historical and objective study as the cumulative tradition is, yet without such faith there would have been no such tradition.

Faith is not the sole prerogative of any one cultural tradition, though Christians have often shown a tendency to think faith was exclusively a Christian phenomenon. Faith is a potentiality universal to the human species, and is to be found in people of every cultural tradition. That is why it has become common to speak of the various traditions as ‘paths of faith’. Each particular culture fosters and shapes the faith of those within it by the way it provides a world-view and helps them to understand life. The religious dimension of a culture promotes particular qualities and aspirations which give that culture its identity and even a name. There is no one path of faith which is ideal or exclusively true. Moreover, in the life of an individual or a community the experience of faith may be found to ebb and flow according to changing circumstances.

Long before the modern term ‘Christianity’ came into use, people used to speak rather of ‘the Christian faith’. This term acknowledges (at least tacitly) that there are many different ways of experiencing faith (or trust). The qualifying epithet ‘Christian’ was used to denote the particular qualities this path of faith was believed to possess, namely, that it drew inspiration and strength from the one known as Jesus Christ.

While faith is so personal and subjective that it is not open to objective study, the religious observance which stems from faith is observable. Instead of asking about the fate of Christianity in the modern world, we can more usefully ask what is happening to Christian observance.

Widespread Christian observance within Christendom not only survived the fragmentation of the church at the Reformation but even seemed to show a new burst of vitality. This was because the focus shifted from participation in Christendom (by virtue of birth) to personal experience and belief (by active choice). Instead of being baptized into ‘the one and holy catholic church’ as a matter of universal practice, people were being challenged to make a personal choice between the Catholic and the various other Protestant forms of Christian allegiance. This had the effect of intensifying devotion and commitment. Unfortunately it often led to bitter animosity between Protestant and Catholic and even between various forms of Protestantism. What, in theory, should have been allegiance to a common Lord Jesus Christ, often turned out in practice to be sectarian allegiance to a particular confession or denomination. After the rather grudging truce between Catholic and Protestant was entered into at the 1648 Peace of Westphalia, the resurgence of active ‘Christian’ commitment initiated by the Reformation began to ebb.

Yet there was still no question of abandoning Christian affiliation. This even survived the corrosive effect of the Enlightenment. In doing so, however, Christian allegiance became even more personal, inward and subjective. Protestantism, in particular, survived the rationalism of the Enlightenment through a shift of emphasis from doctrines (products of the mind) to inner experience (feelings of the heart). The Pietistic movements, initiated by such people as Philipp Spener and Count von Zinzendorf, and spread by the Moravian Brethren, did much to revitalize Protestant church life. These movements, followed by Methodism and the evangelical revival, focused on inner subjective experience, just as the charismatic movements have been doing in the late twentieth century.

Personal religious experience and inner feeling, therefore, began to take precedence over religious thought and dogma at the very time when traditional Christian doctrines were becoming increasingly out of kilter with the new ideas and advancing human knowledge of the last two centuries. Even so, the number of people with serious doubts about the basic Christian concepts and doctrines was still extremely small in the eighteenth century. This did not increase until the nineteenth century, by which time the leading edge of western thought was moving beyond the limits of doctrinal orthodoxy. A great gulf began to open up between what intelligent people were thinking and saying on the one hand, and what the church continued to teach on the other.

At first all these changes were quite gradual. Even the decline of Christendom was hardly noticed until after the end of the nineteenth century. In 1900 it would have been absurd to suggest that Christian allegiance was in any decline, for the opposite appeared to be true. It has been only in the second half of the twentieth century that people of the Christian west have abandoned affiliation to Christianity in some numbers, openly confessing they are no longer Christian. Even when church attendance was becoming more irregular, between 1850 and 1950, people did not think of themselves as abandoning Christianity but only (what they called) ‘churchianity’. Since the end of World War II, however, there have been alarming signs that it is not just Christendom that is vanishing and not just Christian orthodoxy which is disintegrating. Christian allegiance is itself suffering from a deep malaise. The proclamation of the age-old Christian message is no longer bringing forth a firm response of Christian commitment. Since the beginning, the Christian message has been boldly presented as the Gospel -- ‘good news’. Today it is no longer widely heard as any sort of news at all, good or bad.

This first became noticeable in Protestant areas, but predominantly Catholic countries now appear to be affected too. In 1982 the World Christian Encyclopaedia noted the number of white westerners practicing Christianity was dropping at a rate of 7,600 per day; in 1986 the Roman Catholic theologian Hans Küng observed that, of the some five billion inhabitants of the earth, only 950 million were nominally Christian and only a fraction of those took any active part in the church. Although this decline in Christian allegiance, occurring mainly in one century, is quite sudden relative to the length of the Christian era, it has been sufficiently slow and unspectacular relative to a person’s lifetime that most churches have, until recently, been hardly aware of it. Many church leaders have flatly refused to acknowledge any decline at all in Christian allegiance.

People born in recent decades have no first-hand experience of what active Christian allegiance was like at the beginning of the twentieth century, when practically everybody in the western world other than Jews claimed to be Christian. Churches were full; Christian festivals dominated the calendar; there was strict Sabbath observance; and the various patterns of Christian morality were enforced by peer pressure, even more than they were by law or from the pulpit. Christendom may not have existed at the beginning of this century, but Christian practice was still very much alive.

At the end of this century things are very different. In Europe, and in countries to which European culture has been transplanted, there is evidence everywhere of decline in outward Christian observance. Many churches now have very small congregations; some churches have closed altogether. Congregations are commonly made up of people aged 50 and over; young people rarely participate. Seminaries for the training of clergy and priests have been closing down. Roman Catholic monastic orders have very few novices and often consist of a few elderly nuns or monks. The great cathedrals have become historic monuments to a past age, chiefly of interest to tourists.

Church-going remains more common in the United States and in some of the African countries; the charismatic and fundamentalist groups are the most active of all the churches. Fundamentalist Christians regard themselves as the last bastions of orthodoxy because of their commitment to the literal text of the Bible, and this meets the needs of people looking for certainty in a time of rapid change. The attraction of the charismatic churches is their emphasis on inner feeling and their ability to foster a sense of emotional fulfillment; there is little critical examination of what the Christian doctrines really mean in a world very different from that in which they were first formulated.

To many, of course, this evidence of decline in Christian allegiance is only too obvious. But there has been a strange reluctance within the churches to acknowledge it. Some insist on interpreting the twentieth century as a period of unfortunate but temporary setback in Christian advance, comparable to those which occurred prior to the Reformation and to the Evangelical Revival. They confidently predict that this decline too will be followed by a renewal. Some claim that this is already happening in the rapid spread of the charismatic movement, while others, like Keith Ward in The Turn of the Tide, express optimistic hopes for the future of Christianity. Yet others are sure that radical measures could be taken to reverse the current decline, if only the church were of a mind to adopt them.1

The belief that the classical form of Christianity will come through every crisis in the long run is, of course, an essential component of the Christian faith. Over the centuries Christians have said of their church, founded by God, ‘not even the gates of Hell shall prevail against it’. To contemplate the possible demise of Christianity, therefore, we have to suspend Christian faith and step outside it, at least temporarily. When we do this the traditional expectations of Christianity’s future look very much like wishful thinking.

Already by the end of the nineteenth century, theology was losing credibility as an academic discipline, and in the twentieth century it often found no place in the new secular universities. It has sometimes now been replaced by the historical study of all religion as a human phenomenon. The Christian churches have been reluctant to follow the lead of even their own liberal scholars. John Cobb has gone so far as to say: ‘The church has lost the ability to think. Unless it recognizes that its healthy survival depends on the recovery and exercise of that ability and acts on that recognition, talk of renewal or transformation is idle.’2

Modern historical, philosophical and scientific thought has come into conflict at so many points with traditional Christian teaching that the latter has been losing its power to convince ordinary people (to say nothing of the intelligentsia). While most people still affirm what they call ‘Christian values’, an increasing number at all educational levels find themselves quite unable to embrace traditional Christian beliefs. The Christian views of history, of the nature of the universe and of the human condition are no longer consistent with the understanding that most people have through experience and general education. Many who have tried to remain faithful to the church feel guilty that they are unable to reconcile their personal views or convictions with Christian teaching; they live a kind of schizophrenic religious existence. Others have resolved the tension by distancing themselves and openly saying they are not Christian. During the twentieth century the mainline churches have become the oldline churches and now find themselves to be the sidelined churches (to use John Cobb’s words).

We have seen that Christian faith can be described only in very general terms. There is no such thing as the Christian faith, but there have been countless people through the ages who have found that their capacity for faith has been nourished and strengthened by drawing on various elements of the now extensive Christian cumulative tradition. In the course of 2,000 years this has not only spread around the world geographically but, like a river fanning out into a delta with streams and tributaries, it has diversified its forms and expression. Its organizational manifestation is to be found in a great variety of churches, denominations, sects, associations, movements and house groups. It has gradually penetrated into different cultures, so shaping and coloring them, that even when the ecclesiastical organizations begin to decay, its influence leaves behind a more permanent deposit. This may not be recognizable as any form of conventional Christianity; yet it is there because of the influence of the Christian cumulative tradition and remains part of that tradition.

To illustrate this, let us look at another religious tradition, that of Zoroastrianism. Zoroaster’s teaching also developed into a civilization. It had two main periods of flowering, one about 540-330 BCE in the time of the Achaemenian rulers, and the second about 225-650 CE in the time of the Sassanian rulers. Zoroastrian civilization has now long since disappeared, yet Zoroastrianism still lives, in two quite different ways. It is preserved and practiced in one form by the descendants of the earlier Zoroastrians, the Parsis, who now number only about 100,000. More remarkably, however, Zoroastrianism continues today in the ideas, values and mythical themes transmitted to the Jews and, through them, to Christianity and Islam. Some of the Zoroastrian influences were described briefly in Chapter 1; they include, as noted, our current concern with ‘the millennium’.

The components of Zoroastrianism which survive in the three monotheistic traditions of the Middle East are, of course, no longer known as Zoroastrian, nor are they usually acknowledged to have a pre-Jewish source. But today we are much more aware that no religious tradition evolves in complete isolation. Most, on examination, reveal more influence from other traditions than they are usually ready to acknowledge. Some gems of wisdom travel from one culture to another, yet each regards them as its own. This interplay between cultures and between religious traditions means that few, if any, of the great cultures ever wholly disappear; they leave deposits of their most compelling ideas and themes. In our clocks and watches we still observe the long-term influence of the culture of ancient Babylonia, for it was the ancient Babylonians who began to use the number base of 60 for counting time and for measuring angles.

The modern secular world cannot be properly understood without acknowledging all it owes to the many human cultures which have preceded it -- in particular, the culture of western Christendom. Indeed western Christianity, however unintentionally, was chiefly instrumental in bringing the modern world into existence. Thus, just as parts of Zoroastrianism may be said to have survived in Christianity, so much of the Christian cumulative tradition lives on in the secularized modern world, and will continue to do so. It would be almost impossible to stamp out that influence.

What survives of the classical form of Christianity appears thin when compared with the substantial body of teaching in its heyday. Yet it has gained enormously in breadth. The cumulative Christian tradition is now spreading out so widely, both geographically and in the diversity of its forms, that it is coming to include forms which are inconsistent with one another. One cannot today define ‘a Christian’ without cutting out people who, quite legitimately, wish to count themselves as Christians, or including some who wish to deny any allegiance to the Christian tradition.

The question of whether we are facing the demise of Christianity does not, therefore, admit of any straightforward answer. We are certainly coming to the end of orthodox or conventional Christianity -- that is, the Christianity which is Bible-based, and which affirms God as a divine personal being and Jesus Christ as the only Savior of the world. But the cumulative tradition still goes on. Just as the ancients used the terms ‘wind’ and ‘breath’ metaphorically to refer to the invisible ‘spiritual’ forces that operate in human societies and motivate their cultures, so we may need to draw upon such vague and indefinite terms in order to understand what is happening in this tradition. Viktor von Strauss, the first to notice the ancient cultural change that was later named the Axial Period, described what he observed as ‘a strange movement of the spirit [which] passed through all civilized peoples’.3 Such ‘movements of the spirit’ may be the key to our understanding of the next phase.

Instead of thinking of Christianity as something which has an unchangeable essence, we should view it as a continuing, yet changing, stream of cultural influence. The history and culture of ancient Israel formed the chief source from which this stream issued, but there were many other tributaries, such as Persian Zoroastrianism and Hellenistic philosophy. Through the centuries were added the thoughts, feelings and personal experiences of countless generations of people who were shaped by the stream and who, in turn, contributed to it. The development of the mediaeval church, the Renaissance, the Reformation, the Enlightenment, the Evangelical Revival and the advent of modernity have all been significant features of the stream itself, sometimes strengthening it, sometimes modifying it, and always changing it.

As change has taken place, some have accepted it readily, while others have resisted change as inconsistent with some immutable essence. Today such people commonly speak of the danger of ‘throwing out the baby with the bath water’. The metaphor is misleading. There is no ‘baby’, no eternal essence of Christianity. Christianity is the stream itself. The stream is continuous in its flow but ever changing, with new elements entering and others falling out of sight. As Heraclitus noted, one cannot step twice into the same stream.

The stance on human rights is an excellent example of the way radical change can take place in this cultural stream. Many in Christian circles now see it as their duty to give strong support to human rights, yet for nearly 2,000 years the concept of human rights was never acknowledged as a Christian value. There is no explicit mention of such rights in the Bible, nor do they figure in traditional theology and Christian ethics. Even in Emil Brunner’s weighty volume on Christian ethics, The Divine Imperative (1937), there is no discussion of human rights as such. In pre-modern times the emphasis was always on the duties and responsibilities that lie with us humans -- duties to God, duties to the monarch, duties to our fellows. Conventional Christianity asserted that, as sinful creatures in a fallen world ruled by an Almighty God, humans had no rights at all but were at the mercy of a gracious God.

And so the papal encyclicals did not speak about human rights until 1963. In 1864 Pope Pius IX declared that it was insane to teach that citizens had rights to all kinds of liberty,4 but in 1963 Pope John XXIII, in his encyclical Pacem in Terris, said:

‘Every man has the right to life, to bodily integrity and to the means which are necessary and suitable for the proper development of life. These are primarily food, clothing, shelter, rest, medical care and, finally, the necessary social services. Therefore a human being also has the right to security in cases of sickness, inability to work, widowhood, old age, unemployment, or in any case in which he is deprived of the means of subsistence through no fault of his own . . . right to respect, right to freedom in searching for truth, right to share in the benefits of culture . . . the right to choose the state of life which they prefer.’5

Within 100 years the Roman Catholic Church had completely reversed its position. If such a radical shift can occur in the most conservative bastion of Christian orthodoxy, how much more change is likely in the more indefinable stream of influence referred to in the past as Christianity? The Catholic Modernist Loisy argued against Harnack, as we have seen, that Christianity has no permanent and absolute essence. It is free to evolve where the spirit leads it. Thus, if it is true, as has been claimed, that the idea of Christendom and the doctrines of Christian orthodoxy were not at all what the historical Jesus had in mind when he spoke of the Kingdom of God, we should not be surprised if the continuing stream of cultural influence which he was so instrumental in re-directing should in the future manifest itself in ways very different from the conventional Christianity it later became for a period.

The modern world is definitely not Christian in any traditional sense, but neither is it anti-Christian, as many traditional Christians assert. What was once the ‘Christian west’ may be legitimately described as post-Christian, a term which acknowledges its continuity with its Christian past. This ongoing ‘Christian’ stream of cultural influence6 is once again in a fluid state, has widened considerably, and is changing quite radically. It is now becoming part of a larger stream, as all the cultural streams of the past begin to mingle in a global sea.

The Christian presence in the emerging global culture may not always be readily identifiable, but the new global sea of faith cannot help but be continuous with the Christian past. Just how the Christian stream is to relate to the other streams flowing into the global sea may become clearer when, in the next chapter, we acknowledge the phenomenon of relativity.

 

Notes:

1. See John Shelby Spong in such books as Rescuing the Bible from Fundamentalism and Why Christianity Must Change or Die; John Cobb, Reclaiming the Church.

2. Cobb, op. cit., p. 56.

3. Karl Jaspers, The Origin and Goal of History, p. 8.

4. See Chapter 4 for quote from Pius IX.

5. Anne Freemantle (ed.), The Papal Encyclicals, pp. 393 ff. (italics added).

6. This loose but useful term will be taken up again in the last chapter.

Chapter 4: The Failure of Christian Modernism

Why is Christian orthodoxy disintegrating? Why does it no longer have the power to bring conviction and win allegiance in the way it used to? We have already seen that orthodoxy first faltered in its encounters with the inquiring spirit of the Enlightenment. Today, those who valiantly try to defend Christian orthodoxy often blame modernism for its failure. But what is this ‘modernism’?

The term modern is often used simply as a synonym for ‘contemporary’, but there is more to it than that. Coming into English usage about 1500 from the late Latin word modernus, the word ‘modern’ was used to describe obviously new things (as Shakespeare did frequently). Later, the period from 1500 CE onwards became known as the Modern World, following the Mediaeval World or Middle Ages (500 -- 1500 CE), and the Ancient World (500 BCE -- 500 CE). This division of history into three ages can be properly applied, of course, only to the Christian west, which also, for better or worse, produced modernity.

The dividing lines between the ancient, mediaeval and modern ages cannot be located with any precision because cultural history is always evolving and does not fall neatly into periods. So the mediaeval age grew out of the ancient world by a complex succession of steps or minor events, just as the modern age, in turn, emerged from the mediaeval age. It is somewhat easier to pinpoint the central or high point of each of these ages. The Graeco-Roman culture of ancient times, for example, had already reached its highest point by the beginning of the Christian era. Similarly, the thirteenth century is somewhere near the apex of the mediaeval age.

What makes the modern age significantly different? Some trace its beginning to the influence of William of Ockham (1285-1349), whose nominalist teaching at Oxford was called the via moderna to contrast it with the traditional teaching, the via antiqua. Nominalism drew inspiration from the rediscovered teaching of Aristotle that all reality consisted of individual things; it opposed the mediaeval scholastic philosophy which followed Plato’s view that reality consisted ultimately of universal archetypal ideas.

These universals, said Ockham, were only names (nomina) which humans have created. This simple but radical insight was to have far-reaching philosophical, cultural and scientific consequences. From this seed-thought grew the modern recognition that language, culture, religion and even such basic terms as ‘God’ originated in the creative human imagination. Along with the via moderna came the devotio moderna, a form of spirituality promoted by the Brethren of the Common Life, described in Chapter 2. Its most well-known text was The Imitation of Christ of Thomas à Kempis (c.1380 -- 1471).

Modernity became a little more evident, however, in the Renaissance (whose leading thinkers were even then called humanists); this in turn gave rise to the Protestant Reformation, led by Martin Luther (1483-1546), himself a nominalist. But since the Renaissance humanists and the Protestant Reformers were each still trying to revive the past, many see the real beginnings of modernity with people like Francis Bacon (1561-1626). By separating the study of nature from theology and by laying the foundations of empirical science, as he did in The Advancement of Learning (1605), Bacon encouraged his fellow humans to increase their knowledge of the natural world in order to gain mastery over it. It was this that led to the modern idea of human progress, and so later to industrialization and the use of technology, both drawing heavily on empirical science. This early modern age, however, retained much of the supernatural superstructure of the mediaeval age, whereas the later modern age has become increasingly secular (or this-worldly) and non-theistic by comparison.

In pre-modern times people saw themselves as living in a fixed and eternal cosmic order, which the structures of society were expected to reflect (for example, ‘Thy will be done on earth, as it is in heaven’). Truth consisted of eternal and absolute verities waiting to be revealed or discovered. The modern age, by comparison, slowly began to question the permanence of the cosmic order. Whereas all cultural change was once contemplated with trepidation, as a further removal from the golden age in the past, people from the Renaissance onwards began to view cultural change positively, seeing it as the harbinger of welcome improvement in both social well-being and, later, standards of living.

Out of this reversal of mood came the belief in progress that has been such a hallmark of the later modern age. R.G. Collingwood pointed out in 1946 that by the late nineteenth century the idea of progress was becoming an article of faith. He quoted the words of historian Robert Mackenzie, writing in 1880: ‘Human history is a record of progress -- a record of accumulating knowledge and increasing wisdom, of continual advancement from a lower to a higher platform of intelligence and well-being . . . The nineteenth century has witnessed progress rapid beyond all precedent, for it has witnessed the overthrow of the barriers which prevented progress.’1

Progress was possible and seemed to be assured because modernity took a much more positive view of the human condition. In the pre-modern ages human consciousness was dominated by a feeling of helplessness in the face of all natural and supernatural forces, causing people to acknowledge their absolute dependence on divine help. The modern age, by contrast, has been marked by a high degree of human self-confidence and the belief that humans can at last master the forces of nature, justifying an optimistic hope for the human earthly future.

Belief in human progress was continually generated by the success of the emerging sciences, along with the new technology which scientific discoveries made possible. As the twentieth century progressed, modernity almost came to be identified with science itself. Science was commonly thought to hold the key to the human future, so that there was no problem or obstacle which it could not eventually overcome. Also associated with modernity, and perhaps even essential to it, has been the rise of democracy as the fairest, though not necessarily the most efficient, form of government and social order. Allied to democracy has been a new awareness of the value of personal freedom, individual human rights, and gender and sexual equality.

In 1900, therefore, the beginning of the new century was being welcomed with enthusiasm and expectation. The majority of people, at least in the western world, rejoiced in modernity and were reasonably happy with where the world was heading. Everything new and modern was praised and assumed to be superior to the old. Most were firmly confident that conditions could only get better. Just as implicit faith in science has been called scientism, so this trust and confidence which people put in modernity may be called modernism, a term found as early as the eighteenth century.

Modernity came into being in the west and is a product of Christian culture, however much conservative Christians today want to disown it. As the Christian west saw modernity at first, it seemed that a new and better social order was emerging, thus enabling the Kingdom of God at last to be built on earth. The early pioneers of modernity, such as William of Ockham, John Wycliffe, Erasmus, Martin Luther, Francis Bacon, Galileo and John Locke, were all Christian by conviction. All through the nineteenth century, leading Christian thinkers, while not condoning everything new, enthusiastically welcomed and embraced modernity. Among them was Friedrich Schleiermacher (1768 -- 1834), often referred to as the first modern theologian. He was the father of what became known as Protestant Liberalism, which can be seen as the expression of Christian thought in a form more appropriate to the modern world. Thus, as the coming of modernity gathered speed, there were Christian thinkers and biblical scholars who were not only keeping pace with it but, in some areas, promoting it.

It is often forgotten today that, at the time of the furore over Darwin’s epoch-making book The Origin of Species in 1859, there were theologians who quickly accepted his theory of biological evolution. In 1860 the famous Cambridge New Testament scholar, F.J.A. Hort (1828-1892) wrote to a friend: ‘Have you read Darwin . . . In spite of difficulties, I am inclined to think it unanswerable.’ The more liberal Christian thinkers were still confident they would be able to reconcile the Word of God in the book of nature with the Word of God in the Bible (as some of them put it). They believed that, even when the Bible was studied like any other book, it would still be found that there was no other book like it.

A group of Anglican scholars from Oxford gave their support to modernity in their Essays and Reviews in 1861, a book that caused an even greater storm than Darwin’s. Professor Baden-Powell (father of the founder of the Boy Scout movement) wrote: ‘Mr. Darwin’s masterly volume . . . must soon bring about an entire revolution of opinion in favor of the self-evolving powers of nature.’2 By 1890 J. R. Illingworth, an influential Anglican theologian, was able to write: ‘The last few years have witnessed the gradual acceptance by Christian thinkers of the great scientific generalization of our time, the Theory of Evolution.’3 Many books were written by theologians on the problem of how to reconcile Christian thought with evolution. The notion of evolution was itself applied to the origin of culture and of religion, as by the Scottish theologian Edward Caird in The Evolution of Religion (1890).

The rise of Protestant Liberalism may be said to have reached its climax in the thought of Adolf Harnack (1851-1930). A leading historian of the Christian church in the late nineteenth and early twentieth centuries, he was also deeply involved in the advancement of science, as a member of the Academy of Sciences in Berlin and president of what later came to be known as the Max Planck Society for the Advancement of Science. Harnack set out to show from his penetrating studies of early Christianity that the relevance of Christianity to the modern world lay not in theological dogmatism but in the understanding of Christianity as an historical, changing, evolving process. He argued that, within this process, there existed an unchangeable essence of Christianity which, in the course of history, had gone through one metamorphosis after another. He sought to separate this essence from the subsequent accretions of dogma.

The original Gospel of Jesus, in Harnack’s view, had little in common with the ecclesiastical statutes and doctrines of orthodoxy. He was convinced that if the Gospel were to retain power in the modern world, it must be freed from its connection with the dogmas of God and Christ with which it had been clothed in order to survive in the ancient Hellenistic world. In 1900 he delivered his findings in a series of public lectures, later published as What is Christianity? There he reduced the essence of Christianity to the Fatherhood of God, the Brotherhood of Man, the infinite value of the human soul and the coming of the Kingdom of God -- themes that were already becoming dominant in late nineteenth-century hymns.

The development of Protestant Liberalism contrasted strongly, at first, with the response of Catholicism to modernity, partly because the Roman Catholic Church was a much more authoritarian structure, and partly because it remained more firmly committed to the mediaeval age after the Protestant Reformation. It saw no reason to depart from the teaching of the great mediaeval theologian Thomas Aquinas.

In the nineteenth century, therefore, the Roman Catholic Church was firmly resisting the influence of modernism on religious thought while Protestantism was adjusting to it. Pope Pius VI had strongly condemned the manifesto of the French Revolution, which was one of the more violent signs of the coming of the modern age. The church continued to resist all social and cultural change throughout the nineteenth and early twentieth centuries. In 1832 Pope Gregory XVI (followed by Pope Pius IX) declared that it was insane to teach that ‘the liberty of conscience and of worship is the peculiar right of every man . . . and that citizens have the right to all kinds of liberty . . . by which they may be enabled to manifest openly and publicly their ideas, by word of mouth, through the press or by any other means’.4 In 1864 Pope Pius IX proceeded to draw up a list of the principal errors of the age which were to be condemned. There were 80 of these, of which the last read: ‘It is an error to claim that the Roman Pontiff can, and ought to, reconcile himself and come to terms with progress, liberalism and modern civilization’.5 Thus modernism was not to be permitted to penetrate Catholic theological doctrine, and the idea of evolution was strongly condemned. The First Vatican Council (1869-1870) was held in part to strengthen the church against the onslaught of modern thinking, and did so by promulgating the Dogma of Papal Infallibility.

Nonetheless, modern thought took root in Catholicism. When Leo XIII came to the papal chair, he announced his intention of reconciling the church with modern civilization. His most famous encyclical, Rerum Novarum (1891), was directed to ‘The Condition of the Working Classes’, and it has been hailed as one of the most important modern pronouncements on social justice. This gave encouragement to a group of Catholics who soon became known as the Modernists and who reached the height of their influence in the opening years of the twentieth century. The Catholic Modernists believed that Catholic teaching should be brought into harmony with the modern outlook in philosophy, history and science. They contended that the biblical writers were conditioned by the times in which they lived, and that biblical religion, like all religion, was subject to historical development.

Alfred Loisy (1857-1940) was a French priest and a very able biblical scholar, who published The Gospel and the Church (1902) for the express purpose of defending Catholicism against the influence of Protestant Liberalism, particularly as expounded in Harnack’s What is Christianity? (whose German title, Das Wesen des Christentums, means ‘The Essence of Christianity’). Loisy denied that Christianity possessed any permanent and absolute essence; rather he saw it as a living and ever-changing process. He contended that it was quite legitimate for Christianity to evolve, as it had done, into the fully fledged form of Catholicism, and believed Harnack to be mistaken in thinking that, by stripping away what had developed over many centuries, he would find a solid and primitive kernel of essential Christianity. As Loisy saw it, the Gospel was not a message set in unchangeable words which were equally applicable to people of all centuries. Christianity, he claimed, was a living faith which, though always linked to the historical circumstances of its birth, had to be perpetually reshaped and given fresh verbal expression in order to remain a genuine path of faith in later ages.

Loisy was paving the way for an essential reform -- in the interpretation of the Bible, in the whole of theology and even in Catholicism itself. His book was welcomed by other liberal-minded Catholics, but the author soon found himself facing the full wrath of the Catholic hierarchy. He was charged with denying the inspiration of Scripture, denying that Jesus was the revealer of infallible truths, denying the bodily resurrection of Jesus by regarding it as myth, and undermining the authority of the papacy. Loisy and other liberal Catholic thinkers had been tolerated and even encouraged during the reign of Pope Leo XIII (1878-1903), who had real respect for academic scholarship. But his successor, Pius X (1903-1914), distrusted this liberal movement from the beginning.

In 1907 Modernism as led by Loisy was condemned by Pope Pius X as ‘the synthesis of all heresies’. In a decree (Lamentabili) and an encyclical (Pascendi) he set out the 65 errors of Modernism, one of which was that ‘Scientific progress demands that the concepts of Christian doctrine concerning God, creation, revelation, the Person of the Incarnate Word and Redemption be readjusted’.6 Loisy was excommunicated in 1908, but in 1909 was appointed to the chair of the History of Religions at the Collège de France, from which position he continued to write about Christian origins for the next 20 years.

The leading Catholic Modernist in England was George Tyrrell (1861-1909). Reared as an evangelical Protestant in Dublin, Tyrrell was attracted to High Church Anglicanism. By 1879 he had become a Roman Catholic and in 1880 he entered the Jesuit novitiate. Remaining strongly attracted to the devotional aspects of Catholicism, he became increasingly hostile to orthodox scholasticism, and began to publish his views with some vigor, contrasting living faith with dead theology. He was dismissed from the Jesuit order in 1907 for refusing to repudiate his more provocative statements. When the Pope issued his encyclical condemning Modernism, Tyrrell wrote letters to the London Times accusing the Pope of heresy. He was immediately excommunicated. He died in 1909 and was refused Catholic burial.

Tyrrell’s views were set forth in Christianity at the Cross-roads, published posthumously in 1910. There he defined a Modernist as ‘a churchman who believes in the possibility of a synthesis between the essential truth of his religion and the essential truth of modernity’.7 Like Loisy, he was critical of the Protestant Liberals, making the much-quoted remark that the Christ that Harnack saw, looking back through nineteen centuries of Catholic darkness, was only the reflection of a Liberal Protestant face seen at the bottom of a deep well. He believed that, whereas Protestant Liberals were putting the emphasis on historical records and on the moral teaching of Jesus, Catholic Modernism was calling for changes of such a radical nature that it might be necessary for Catholicism to die, in order that it might rise again in a grander form, more appropriate to the age.

Pope Pius X was determined to root out all elements of Modernism from Catholicism. In 1910 he required all priests to swear an anti-modernist oath in which they were to offer complete submission to his earlier condemnations of Modernism. Only 40 priests refused. All ordinands were thereafter required to make a vow renouncing all Modernist tendencies. At that point the Modernist movement was almost completely crushed by papal authority.

In 1898, just as Catholic Modernism was raising its profile in both England and France, an Anglican society was founded, entitled the Churchmen’s Union, later to be called the Modern Churchmen’s Union. Its aim was to reformulate Christian thought in ways that would make it more consistent with the modern age. Anglican Modernism had much sympathy with both Protestant Liberalism and Catholic Modernism but, at the same time, remained critical of them both.

The leader and chief organizer of Anglican Modernism was Henry D.A. Major, who was reared, educated and ordained in New Zealand before he returned to his native Britain. In 1911 he founded a monthly journal, The Modern Churchman, which he edited until 1956. In 1919 Major was appointed principal of Ripon Hall after it was transferred to Oxford, where it became the center for Anglican Modernism.

Major defined Modernism as the claim of the modern mind to determine what is true in the light of its own experience, even though its conclusions might contradict those of tradition. He believed this to be a mode of human consciousness that would dominate in the future. The dogmas of the past were to be valued and studied historically, but were not to be taken as infallible and binding. All this he set forth in English Modernism (1927), first delivered as lectures in Harvard in 1925-1926.

Major denied that religion was dying. He claimed, rather, that it was being rationalized, moralized and spiritualized. He was convinced that, unless modernized, the church would be a declining influence in shaping the world of the future. He believed that Anglican Modernism would not suffer the same fate as Roman Catholic Modernism since Anglicanism, unlike Catholicism, was a tradition comprehensive enough to have contained many differing schools of thought. However, by the early 1960s, when Major died, Anglican Modernism was already losing ground. By this time Protestant Liberalism was also less vigorous and was being successfully countered by a strong reactionary movement which came to be known as fundamentalism.

In 1909, just one year after the Pope had crushed the rise of Modernism in the Catholic Church, a series of 12 booklets entitled The Fundamentals began to appear.8 Between 1909 and 1915 they were distributed free of charge to every Protestant minister in the English-speaking world. Their intention was to counter the spread of liberal religious thought commonly known as Christian Modernism. They identified it with secular humanism, and condemned it as the cause of all current cultural ills, including the decline in Christian allegiance. They believed the only solution was to return to the fundamental certainties and the supernaturalist thought forms of premodern times. The booklets reaffirmed belief in a personal God, the infallibility of the Bible, the deity of Christ, the Virgin Birth, miracles, the bodily resurrection of Jesus, and the substitutionary view of the Atonement. They attacked not only the new biblical criticism and Darwinism, but also Roman Catholicism and the new sects of Mormonism, Jehovah’s Witnesses and Christian Science.

Although the publication of the series failed initially to check the spread of Modernism, it led to fierce theological battles between fundamentalists and liberals in seminaries and churches. The theological battle received great publicity during the famous Scopes Trial of 1925, when school teacher John Scopes was tried and convicted for teaching biological evolution in a Tennessee school. Fundamentalists still found themselves in a minority; for example, Presbyterian fundamentalists chose to withdraw from Princeton Theological Seminary and form their own (conservative) Westminster Seminary.

In 1925 Kirsopp Lake, a New Testament scholar of international repute and an Anglican Modernist, wrote a book called The Religion of Yesterday and Tomorrow in which he asserted that the denominational divisions of the church had already become obsolete. The real divisions, which cut right across the denominations, divided church people into what he called the Fundamentalists (conservatives), the Institutionalists (liberal traditionalists) and the Experimentalists (radicals). He said it had become ‘necessary to distinguish the future of the churches from the future of religion’.9 The future of Christianity he believed to be with the experimentalists but, with regard to the churches, he made this striking prophecy: ‘The Fundamentalists will eventually triumph. They will drive the Experimentalists out of the churches and then reabsorb the Institutionalists who, under pressure, will become more orthodox. The Church will shrink from left to right.’10 This prophecy has largely been fulfilled in the Protestant churches. Fundamentalist or traditionalist Christians tend to dominate the ecclesiastical institutions throughout the world today. The liberals, particularly after the failure of Modernism, have largely ceased to be active in the mainline churches, leaving these to become increasingly conservative.

In Roman Catholicism, liberalism began to resurface for a time from the 1940s onwards, particularly in Catholic biblical scholarship. It came to a head when the Vatican II Council (1962-1965), with its theme of aggiornamento, was called by Pope John XXIII as a means of bringing the Roman Catholic Church into the modern world. Although the term Modernist was strictly avoided, the Vatican II Council did initiate a number of moves to which the earlier Modernists would have given hearty approval. For a decade or two the face of Catholicism began to change much more rapidly than that of Protestantism. Then the impetus faltered in the final years of Pope Paul VI, and traditional conservatism returned under Pope John Paul II.

Protestant Liberalism, Anglican Modernism and Roman Catholic Modernism all responded to the advent of the modern age in a positive and constructive way. They set out to show that Christian faith and practice had nothing to fear from modernity. They firmly believed that, though some changes in the expression of Christian doctrine were needed, the essential truth of Christianity would stand firm and would be expressed again in new and more appropriate forms. But they have not succeeded in taking the main body of the churches with them. A few instances of even more radical thought have surfaced within the mainline churches (such as John Robinson’s Honest to God in 1963, Don Cupitt’s Taking Leave of God and later books), but these have largely been rejected by Christian officialdom, and the churches have become more attached than ever to one or other of the orthodox forms of the past. The gulf between the church and the world outside it grows ever wider.

Is this because Christianity is unable to be modernized, or does it point to some basic flaws in modernity itself? There is some truth in each of these views. As we look back, we cannot fail to compare the widespread optimism with which the western world was greeting modernism a century ago with today’s more ambivalent experience. At the beginning of the century science was being hailed as the new and infallible source of truth. ‘Science teaches that . . .’ was rapidly replacing ‘The church teaches that . . .’, and science and religion came to be popularly viewed as polar opposites and mutual enemies. Just as the body of divinely revealed knowledge contained in the Bible and guarded by the church was the basis of the Christian era, so the body of knowledge being accumulated by science was seen to be the foundation of modernity, fueling human confidence in an ever better future. This is no longer so.

Many of the events of the twentieth century have eroded the human self-confidence and belief in progress that fueled modernity. And modernity itself is now held responsible by some for the current ills in society, and for the uncertain and fragile future which we now face. H. Richard Niebuhr wrote just before his death in 1962:

We see the possibility that human history will come to its end neither in a brotherhood of man nor in universal death under the blows of natural or man-made catastrophe, but in the gangrenous corruption of a social life in which every promise, contract, treaty and ‘word of honor’ is given and accepted in deception and distrust. If men no longer have faith in each other, can they exist as men?11

At the end of the twentieth century science and technology still enjoy approval and inspire confidence in human endeavor, but they no longer go unquestioned. Modernism, as a name for putting one’s faith in all things modern, is no longer universally espoused. Indeed, the scientific enterprise is itself entering a more fluid state. The world we find ourselves living in seems not wholly to be determined by the laws of nature which modernism set out to uncover. Rather, the universe appears to be a mystifying mixture of both necessity and chance. Perhaps we have come to the end of modernism, whether Christian or secular.

This ambivalence towards modernity is reflected in two extreme attitudes. Some, such as the fundamentalists, see modernity as the cause of all our ills; they wish wholeheartedly to reject much of it and to return to the supposed security of pre-modern times. At the other extreme there are those who call themselves post-modernists (to be discussed in Chapter 7); they also are strongly critical of much that has characterized modernity but, knowing there can be no turning back, they advocate various ways of moving into a less structured future. Most people could perhaps still be described as lukewarm modernists, in the sense that we are grateful for the comforts and pleasures modernity offers and are prepared to accept as inevitable the problems and disadvantages that come in its train.

And where does this leave Christianity? Is it to be even further marginalized? Is it to become a museum piece? Or is there something about Christianity which we have not yet fully understood?

 

Notes:

1. R.G. Collingwood, The Idea of History, pp. 44-46.

2. Essays and Reviews, sixth edition, p. 139.

3. Charles Gore (ed.), Lux Mundi, p.132.

4. Anne Freemantle (ed.), The Papal Encyclicals, p. 137.

5. Ibid., pp. 43-52.

6. Ibid., p. 207.

7. G. Tyrrell, Christianity at the Cross-roads. p. 26.

8. Issued by the Testimony Publishing Company, Chicago, and distributed with the ‘compliments of two Christian laymen’.

9. Kirsopp Lake, The Religion of Yesterday and To-morrow, p. 159.

10. Ibid., p. 163.

11. H. Richard Niebuhr, Faith on Earth, p. 1.

Chapter 3: The Disintegration of Orthodoxy

Since Christianity existed for at least three centuries before the formation of Christendom, there is no reason why it should end at the same time. Indeed, during the gradual demise of Christendom, Christianity has increased in vitality and spread its influence much further afield -- rather like a living entity released from the protective shell which it had produced for itself. Thus, as the framework of Christendom began to crumble under its own weight from the sixteenth century onwards, Christian belief and allegiance experienced a surge of new life, first in Protestantism and then in Catholicism. As recently as the middle of the twentieth century, church historian K.S. Latourette described the period 1815-1914 as ‘the greatest century which Christianity has thus far known’ in its 2,000-year history.1 Christendom might be dying but Christianity was very much alive. Some Christian leaders have even rejoiced in the dissolution of a Christendom that allowed, or perhaps encouraged, an excessive degree of nominal Christian allegiance: the impact of modern secular society has challenged people to make a conscious choice about whether they are for or against Christianity. Such leaders prefer to speak of the present as the post-Constantinian age rather than a post-Christian one, and some claim that Christianity is stronger than ever today.

But what sort of Christianity are we talking about? The term Christianity is, as we have seen, relatively modern in its current usage. And it has come to mean different things to different people. From the Protestant Reformation onwards, Roman Catholics saw themselves as the guardians of the only genuine form of Christianity, and they judged Protestants to be heretics and apostates. The Protestants were equally adamant that they alone were faithful to the original and only true form of Christianity, and they condemned Catholics as idolaters. So were there now two (and perhaps even more) forms of Christianity?

For many centuries before the Reformation, there was substantially only one form of Christian teaching and it was clearly set out in a set of doctrines now often referred to as Christian orthodoxy (literally meaning ‘right belief’). What became Christian orthodoxy was largely hammered out by debate in the ecumenical councils of the first few centuries, sometimes drawing on concepts of Greek thought employed by non-Christian philosophers. In the early centuries, Christian faith was sufficiently flexible to incorporate valid criticisms of its various verbal expressions. Only later did it assume the steadfast rigidity that it then displayed until modern times.

To deviate from established orthodoxy in one’s beliefs was to be guilty of the heinous sin of heresy. The Inquisition was set up in the thirteenth century to search out, condemn and put to death all heretics. This assumed its most violent form in the Spanish Inquisition which lasted from 1479 to 1820. Defending orthodoxy by the severe punishment of deviants was not only a Catholic practice. The Protestant Reformer John Calvin (1509-1564) was instrumental in having Michael Servetus burnt at the stake in 1553 for denying the doctrine of the Holy Trinity and the divinity of Christ. At first, the critics of orthodoxy were often lone voices that could be quickly silenced. But from the time of the Protestant Reformation, and more particularly over the last 200 years, criticism of Christian orthodoxy has grown.

More recently a succession of voices from within the Christian tradition itself has warned that Christian orthodoxy is coming to an end. In 1963 Bishop John Robinson’s Honest to God, advertised with the slogan ‘Our Image of God Must Go’, became a runaway best-seller. Some months before, in 1962, a course of lectures began in the Catholic University of Nijmegen, Holland, entitled ‘The End of Conventional Christianity’. The lecturer, W.H. van de Pol, later published a book with the same title in which he sought ‘an answer to the question of why it is that conventional Christianity has become so undermined that we are experiencing its collapse’.2

By the term ‘conventional Christianity’ van de Pol did not mean the Christianity of the first three or four centuries, but rather Christianity as it was believed and has been practiced since the Christianization of Europe; that is, since the formation of Christendom. This Christianity is expressed in the creeds, confessions, hymns and liturgies, and is substantially what may be called Christian orthodoxy.

To understand the demise of Christian orthodoxy we must turn to four particular areas that are vulnerable to what have been called the corrosive acids of modernity -- that is, the church, the Bible, the person of Jesus Christ and the reality of God.

The Church

The church had long seen itself as a divine institution, different from all natural institutions such as the family and all humanly created institutions such as the monarchy. The church was believed to have been founded by Jesus Christ, who remained its king and head, exercising his rule through his vicar the Pope. Thus the church mediated a unique and divine authority, wielding a power that could not be matched by kings and princes. The church claimed to be able to speak with finality on all matters of essential truth. The remnant of this is still to be found in the Roman Catholic Dogma of Papal Infallibility. Belief in the divine institution of the church became an article of faith, as in the words of the Creed, ‘I believe in one, holy, catholic and apostolic church’.

Various aspects of this doctrine of the church were challenged at the Protestant Reformation. Yet the Reformers, critical though they were of the Pope and of the mediaeval church, were anxious not to throw the baby out with the bathwater. First they identified the church with the Christian people, rather than with the ecclesiastical institution which now ordered their lives. Secondly, they conceded that church councils were not infallible but were prone to error. Thus, although the church consisted of people who professed the Christian faith, its institutional form was human and fallible. Yet for a long time after the Reformation much of the traditional holiness was believed still to adhere to the church and its officers, the holy ministry.

The fact that the Reformers rejected the Pope entirely (even referring to him as the anti-Christ) meant that the Protestant churches lacked an authoritative personal voice. They were thrown back more and more on the words of the Bible, the very instrument they had used to bring criticism to bear upon the church. This proved something of a two-edged sword, leading to two extremes. Either the Bible was absolutely infallible, as it is for the fundamentalists; or it was subject to the same type of rational criticism that the Protestants had already brought to bear against the papacy. Thus Christian orthodoxy lost first a divine and infallible church and later, a divine and infallible Bible.

The Bible

Prior to the Enlightenment, Christian thought and practice appeared to be built on the firmest of foundations -- the Bible. Both Catholic and Protestant accepted the Bible as the divine revelation of infallible truth. As the Westminster Divines of 1643 declared: ‘The authority of the Holy Scripture depends not on the testimony of any man or church, but wholly on God, the author thereof; and it is to be received because it is the word of God.’ The Bible was therefore believed to reveal without error the origin of the world, the meaning of history, the moral laws by which all should live, and the only path to salvation. People, whether educated or not, generally accepted at face value everything written in the Bible. Until the beginning of the nineteenth century there seemed little reason to doubt its stories of creation, the Great Flood, its history of humankind, and the story of Israel culminating in the birth, death and resurrection of Jesus Christ.

But in the nineteenth century this widespread confidence in the Bible was badly shaken, as biblical scholars began to study it with the modern tools of literary and historical criticism. These pioneers often found themselves rejected by their churches, and even dismissed from their university posts as a result of their publications. Only slowly did their work come to be known by the general public. The process by which people lost their faith in the Bible as an infallible source of knowledge is thus a complex one, stretching over some two centuries. From time to time fierce theological debates took place, such as that which followed the publication of Charles Darwin’s (1809-1882) theory of biological evolution in 1859. If Darwin was right, then the opening chapters of the Bible were false and misleading. If the Bible was found wanting in its account of Creation, how could one be sure of it anywhere? Although liberal Christians quickly found ways of accommodating the idea of evolution, more conservative Christians reject Darwinism to this day.

Fuel was added to the fire by the work of the seminal biblical scholar Julius Wellhausen (1844-1918) about 1880. This led scholars to reject the tradition which regarded Moses as the author of the first five books of the Bible. Behind these controversies lay a growing awareness of the human origin of the Bible. For scholars were discovering that the Bible -- far from being the ‘Word of God’, dictated by God -- was written by humans. Its various books reflected many aspects of the cultural environment in which they were composed, including even the prejudices and limited knowledge of their authors.

This new understanding of the Bible has by no means dampened the interest, indeed the enthusiasm, of scholars and the Bible has been more studied in the last 150 years than in the previous two millennia. This has enabled us to gain a more reliable picture of the ancient world reflected in the Bible. The Bible’s value remains high, but it is value of a quite different order. The Bible remains our chief collection of extant records describing the origin and early development of the Judeo-Christian path of faith, but it no longer prescribes, as it was once thought to do, what devout people of all later ages should believe and do. The churches have found it difficult to come to terms with this fact, and often refuse to acknowledge that they have lost for ever what they took to be an authoritative source of religious truth. This revolution in our understanding of the Bible has had serious consequences for two other key concepts in Christianity: the person of Jesus Christ, and the reality of God.

Jesus Christ

What Christian orthodoxy meant by the term Jesus Christ is best understood by quoting from the Nicene Creed:

(I believe) in one Lord Jesus Christ, the only begotten Son of God, Begotten of His father before all worlds, God of God, Light of Light, Very God of Very God, Begotten, not made, Being of one substance with the Father, By whom all things were made: Who for us men and for our salvation, came down from heaven, And was incarnate by the Holy Ghost of the Virgin Mary, And was made man, and was crucified for us under Pontius Pilate. He suffered and was buried, and the third day rose again according to the scriptures, and ascended into heaven, and sitteth on the right hand of the Father. And He shall come again with glory to judge both the quick and the dead: Whose kingdom shall have no end.

This Jesus Christ stands at the center of Christian tradition and is the foundation of Christian orthodoxy. But what sort of person or being is the Creed referring to as the Lord Jesus Christ? Up until 200 years ago the term Jesus Christ implied all of the following things at one and the same time, for they were implicit, if not explicit, in the language common to all Christians in their devotions and their theology:

1. the Divine Son of God, who existed from the beginning of time, having been begotten before the Creation of the world, and who became the maker of all things;

2. the second ‘person’ of the Holy Trinity, who became incarnate in the man Jesus to become the Christ and Savior of the world;

3. the historical figure of Jesus, who lived in Palestine some 2,000 years ago, who became a travelling teacher and healer before being crucified and who did and said all the things the four Gospels ascribed to him;

4. the Christ who rose from the dead, ascended into heaven and, while sitting at the right hand of God, is also now eternally present everywhere, sharing the timelessness of God;

5. the church, since it is called the ‘body of Christ’ and since Christ resides spiritually in all Christians and all Christians are said to be ‘in Christ’;

6. the Eternal Judge, who now hears the prayers of his followers and who will come again in judgement at the end of the world.

These ways of thinking of Jesus Christ were all accepted as simply different facets of the one spiritual reality. As a result of the new understanding of the Bible, from the study of the last 200 years, the once seamless robe into which all these strands of thought were woven has been torn apart, just as surely as the curtain of the Jewish temple was said to have been rent in two on the first Good Friday after the death of Jesus. The Jesus Christ who is the foundation of Christian orthodoxy has disintegrated into a collage of history, myth and devout imagination. Based initially on personal memories of the historical figure of Jesus, the Jesus Christ worshipped in the Christian tradition has been shaped by the collective imagination and devotion of the Christian community.

The traditional mental picture of Jesus Christ was not dismantled intentionally by scholars hostile to Christianity. On the contrary, the long and complex process which has forced us to distinguish between the historical figure of Jesus (who is open to historical research) and the religious figure of the Christ (who can be affirmed only by Christians and who is subjectively ‘known’ in Christian devotion) has been undertaken by Christian scholars bringing the best of contemporary analysis to their study of the Bible.

First came the pioneering work of the Enlightenment scholar Hermann Reimarus (1694-1768), who made a critical study of the New Testament running to 4,000 pages of manuscript, entitled The Defense of a Rational Worshipper of God. He so surprised himself by his conclusions that he dared not publish this work during his lifetime but entrusted it to his friend G.E. Lessing, a dramatist and philosopher. Reimarus showed that it was impossible to reconcile the stories of Jesus as told by the four different Gospels (and particularly their accounts of Jesus’ Resurrection), so the Gospels could not be accepted at face value as genuine records. He concluded that the disciples, distraught by the unexpected end to the ministry of Jesus, stole his crucified body, concocted the story of his Resurrection and turned the message of Jesus into the message about Jesus. After the death of Reimarus, Lessing published seven excerpts from the manuscript under the title The Intention of Jesus and his Disciples. It caused such an outcry that the King of Prussia forbade any more to be published.

The second step occurred when David Strauss (1808-74) published a two-volume work, The Life of Jesus Critically Examined, in 1835.3 This is a most remarkable book to have been written by a young scholar of only 28 years. Strauss was the first to introduce into the study of the Gospels the categories of history, legend and myth. He defined legend as a story which has expanded and embellished the memory of an original historical event. A myth he defined as a story wholly created by devout imagination on the basis of an original idea. Strauss showed that the portraits of Jesus in the Gospels were already a mixture of history, legend and myth. In creating their stories, Strauss wrote, the early Christians drew largely upon Old Testament motifs and themes; they used them as models with which to describe how they saw the role of Jesus. In this way Strauss was able to see where Reimarus had gone wrong and why his hoax theory was false. The Gospel writers were not presenting eye-witness accounts, but simply collecting the stories already circulating about Jesus in the expanding oral tradition.

Strauss’s book, translated into English by novelist George Eliot, was widely read. It aroused such opposition that various attempts were made to have the book suppressed and Strauss lost all chance of a career in either the church or university. Many recognized that if his interpretation of the Gospels were true, then Christian orthodoxy had no future. Strauss overstated his thesis, but he opened up such a problem for Christianity thereafter that Bishop Stephen Neill, a moderate scholar, wrote in 1964 that ‘this book marked, as few others have done, a turning point in the history of the Christian faith’.4

Ever since Strauss’s first book, and his later book The Christ of Faith and Jesus of History (1865), it has been necessary to distinguish between the historical figure, now known as Jesus of Nazareth, and the symbolic object of Christian worship, called Christ. The original Jesus became Christ by being clothed and partially hidden in the stories of early Christian devotion. The Gospels can no longer be read as an accurate account of the historical Jesus of Galilee. Jesus became Christ, not in human and cosmic history, but in the experience and thinking of the first generations of Christians.5 This distinction between the historical Jesus and the Christ of faith meant that the traditional picture of Jesus Christ, as portrayed in the Creeds, was being torn apart. The only words in the Creed (as quoted earlier) that are historical are ‘was crucified … under Pontius Pilate. He suffered and was buried’; the rest is the language of myth.

Attention was then fastened on the historical Jesus as the founder of Christianity, and this led, through the rest of the nineteenth century, to an intense historical search for the genuine and original Jesus. The search was brought to a climax with the third milestone, the publication by Albert Schweitzer (1875-1965) of his book The Quest of the Historical Jesus (1906). In this book Schweitzer surveyed the whole of the critical research into the life of Jesus that had been undertaken in Germany in the previous century. He showed that the attempt to penetrate the Gospel portraits and recover the original historical Jesus of Nazareth failed, because each of the written histories of Jesus unconsciously reflected the subjective hopes and ideals of the author. He put it this way:

The Jesus of Nazareth who came forward publicly as the Messiah, who preached the ethic of the Kingdom of God, who founded the Kingdom of God upon earth, and died to give his work its final consecration, never had any existence. He is a figure designed by rationalism, endowed with life by liberalism, and clothed by modern theology in an historical garb.6

The failure to recover the historical Jesus did not unduly worry Schweitzer. He said: ‘The truth is, it is not Jesus as historically known, but Jesus as spiritually arisen within men, who is significant for our time . . . Not the historical Jesus, but the spirit which goes forth from him . . . is that which overcomes the world.’7 This was also the conclusion of a slightly earlier book which never received the publicity of Schweitzer’s work. In The So-called Historical Jesus and the Historic Biblical Christ (1896), Martin Kähler described the search for the historical Jesus as a ‘blind alley’. Not only, he said, does it fail to recover the historical Jesus but it actually ‘conceals the living Christ’. Thus we do not have the necessary sources for writing ‘a life of Jesus’, for the Gospels are proclamations (or extended sermons), reflecting the testimonies of the first believers in Christ. As Kähler said, ‘The risen Lord is not the historical Jesus behind the Gospels but the Christ of the apostolic preaching, of the whole New Testament.’8 The real Christ, therefore, is the one who was proclaimed by the Apostles and who continues to be preached on the basis of the biblical proclamations.

For both Kähler and Schweitzer, Christ is the name of the spiritual influence which has flowed from the original Jesus, now lost in the mists of history. This is why Kähler referred to the biblical Christ as ‘historic’, in contrast to the ‘historical’ Jesus. ‘The truly historic element in any great figure,’ he said, ‘is the discernible personal influence which he exercises upon later generations.’9 This means of course that the living Christ is not open to historical enquiry and ‘Christian language about Christ must always take the form of a confession’.10 The only real Christ is the Christ who is preached, and the Christ who is preached is precisely the Christ of faith.

From Strauss onwards, when the distinction was first being made between Jesus and Christ, attention switched from the Christ of dogma to the Jesus of history. At the beginning of the twentieth century, from Kähler and Schweitzer onwards, attention switched from the Jesus of history to the Christ being preached. It was on this activity of preaching that both Karl Barth (1886-1968) and Rudolf Bultmann (1884-1976) fastened -- not unlike St Paul, who not only showed little interest in the historical Jesus but also spoke in mystical terms of the in-dwelling Christ and of Christian believers being ‘in Christ’. So long as the Christ who was preached continued to influence people spiritually, as was the case for some decades into the twentieth century, Christianity remained very much alive, even if some aspects of Christian orthodoxy were being quietly ignored. But by the middle of the century it was becoming apparent that the Gospel was falling on deaf ears. Rudolf Bultmann, arguably the greatest New Testament scholar of the twentieth century, and in many ways the logical successor of Kähler, blamed this failure on the outmoded mythological language of the New Testament. In a celebrated essay published during World War II, he acknowledged that the classical form of Christian proclamation (kerygma) in which the living Christ was communicated was couched in terminology drawn from the now obsolete cosmology of the ancient world.11 Since this had become quite unbelievable to modern humankind, he called for a radical program of re-interpretation which he called ‘demythologizing the Gospel message’.

After World War II Bultmann’s plea for demythologizing the Gospel led to widespread theological debate. Conservatives rejected his approach entirely. Many others agreed that he made a valid point but they could not accept his existentialist re-interpretation. Some of Bultmann’s own pupils began what has been called ‘A New Quest for the Historical Jesus’.12 These scholars, recognizing the pitfalls of the first quest, were more modest in their aims. They were primarily concerned to investigate the overlap between the genuine memories of Jesus embedded in tradition and the church’s proclamation of him as Christ. More recently a group of New Testament scholars from the USA and Germany, calling themselves the Jesus Seminar,13 have initiated a third quest for the historical Jesus. The results of their work are to be found especially in Robert Funk’s The Five Gospels: What Did Jesus Really Say? and The Acts of Jesus: What Did Jesus Really Do?14

We now know that the most we can really say about Jesus of Nazareth as an historical figure is that he was a first-century Jew who developed a reputation as a teacher and healer. He antagonized the authorities of his day, both Jewish and Roman, and he was executed by the Romans. What he actually taught has become so integrated with what his followers taught about him that it is difficult to recover his own words. The Jesus Seminar has concluded that only about 20 per cent of the words attributed to Jesus originated with him. He almost certainly reflected the beliefs of his day. He may have been a teacher of wisdom rather than a prophet. He did not claim to be the expected Messiah, the Savior of the world or the divine Son of God. The stories of his birth, transfiguration, resurrection and ascension are not historical but belong to the categories of myth or legend.

Whether these radical findings of New Testament scholars mean the end of Christianity we have yet to discuss. They certainly entail the demise of Christian orthodoxy, which is wedded to the divinity of Jesus Christ. And this brings us to the fourth and ultimate foundation of Christian orthodoxy -- the being of God.

God

The understanding of God as the supreme personal being has been basic to Christian orthodoxy from the beginning. Christians inherited this from the Jewish religion, out of which they emerged and of which they were originally a sect. The concept of one supreme being was readily adopted, as it seemed greatly superior to the plethora of gods that were worshipped in the ancient world. Christians drew upon this understanding of God in order to interpret the significance and role of Jesus, whom they recognized as the one anointed by God to be Messiah.

The first rift in Christianity occurred when the first Christians, being Jewish, continued to affirm the full humanity of Jesus, while the Gentile Christians led by Paul increasingly affirmed the divinity of Jesus. As the Christians moved away from Judaism into Hellenistic culture, their understanding of God and of Jesus Christ was influenced by the Greek concept of God (theos), particularly as it was defined by both Plato and the Stoics. This influence can be documented clearly during the first five centuries when the orthodox Christian doctrine of God was debated and expressed in the creeds of the ecumenical councils.

The reality of God as the spiritual Creator of the physical universe seemed to be self-evident. In the ancient world, it was not atheism against which Christianity had to defend itself but polytheism, the belief in too many gods. Christians even found themselves being called atheists because they dismissed the gods that people had traditionally worshipped. The various attempts of Christian philosophers through the centuries to prove the existence of God were never much more than academic exercises, for the reality of a heavenly designer and sustainer fitted the pre-modern view of the universe so convincingly. John Calvin was able to declare in 1555 without fear of contradiction: ‘There is no nation so barbarous, no race so brutish as not to be imbued with the conviction that there is a God.’15

That was soon to change, but those who dared to question openly the reality of God faced the punishment of death. Giordano Bruno (1548-1600), an admirer of Copernicus, was burnt at the stake for contending, among other things, that God was not to be understood as a personal being distinct from the world but was to be encountered as immanent in nature. The divine life, he said, permeates everything including ourselves.

This kind of pantheism, shared by the Jewish philosopher Baruch Spinoza (1632-77), and to some extent by the earlier mystics, was the first alternative to traditional theism to be expressed. It was not until the eighteenth century that real doubt began to be raised about whether the concept of God referred to any kind of objective reality. The universal acceptance of the God reality was then beginning to weaken. Friedrich Nietzsche (1844-1900) gave dramatic expression to this in his parable of the madman who declared that ‘God is dead’. He was describing the fact that the traditional understanding of God (theism) was becoming dead for the modern human mind, because the modern view of the universe was vastly changed from that in which monotheism had arisen. Nietzsche’s announcement surfaced more widely in the 1960s, when even Christian theologians began to accept the significance of what he had said.

At the beginning of the twentieth century there were still only a few who dared to call themselves atheists. In the western world they remain a minority, but they are still growing in number. Traditional theism is declining even more rapidly, and is being replaced by agnosticism or by a use of the word ‘God’ that is both vague and variable from person to person. The God concept no longer has any agreed or universal meaning. There have, however, been some valiant attempts to defend the continuing use of the term God. Paul Tillich has spoken of God as ‘being itself’ or as the symbol which points to whatever is of ultimate concern for us.16 He spoke of the ‘God above God’.17 Don Cupitt has expounded what he chooses to call a non-realist view of God, saying, ‘God is the mythical embodiment of all that one is concerned with in the spiritual life.’18 In this non-objective view, God is a symbolic term referring to our highest values and aspirations. Similarly, Gordon Kaufman has written: ‘The symbol "God" presents a focus for orientation which claims to bring true fulfillment and meaning to human life. It sums up, unifies, and represents in a personification what are taken to be the highest and most indispensable human ideals and values.’19

No matter how the concept of God is to be understood, the fact remains that this central religious symbol on which Christian orthodoxy has always depended is today severely eroded. As Catholic theologian Johann-Baptist Metz and Lutheran theologian Jürgen Moltmann have said in their book Faith and the Future, there is ‘a permanent constitutional crisis for theology’ because of ‘a withering of the imagination and a radical renunciation of symbolism and mythology’.20

When we turn to the concept of relativity in Chapter 6 we shall find further reasons why such concepts as Jesus Christ and God have, during the twentieth century, lost their significance as absolutes. The Bible, the church, Jesus Christ and God have all lost their absoluteness in modern times, and the attempt of the guardians of Christian orthodoxy to restore any of them to the pillars from which they have fallen becomes only a new form of idolatry.

Of course there is much more to Christian orthodoxy than these four pillars, but they do support a system of thought which, within the cultural context of its time, was both impressive and convincing. Today, these pillars no longer offer a firm and absolute foundation, and, as a consequence, the system of thought built upon them comes tumbling down like a house of cards. Traditional Christians refuse to accept that orthodoxy is in any kind of crisis. In vindication they point to the large numbers of professing Christians who remain. Is this because Christianity is broader and more flexible than orthodoxy?

Notes:

1. K.S. Latourette, A History of Christianity, p. 1063.

2. W.H. van de Pol, The End of Conventional Christianity, p. 12.

3. See the author’s Faith’s New Age, Chapter 6.

4. Stephen Neill, The Interpretation of the New Testament 1861-1961, p. 12.

5. Peter de Rosa, Jesus who Became the Christ.

6. Albert Schweitzer, The Quest of the Historical Jesus, p. 396.

7. Ibid., p. 399.

8. Martin Kähler, The So-called Historical Jesus and the Historic Biblical Christ, p. 65. For a fuller account, see pp. 42-71.

9. Ibid., p. 63 (italics added).

10. Ibid., p. 68.

11. Hans Werner Bartsch (ed.), trans. Reginald Fuller, Kerygma and Myth.

12. James M. Robinson, A New Quest of the Historical Jesus.

13. Marcus J. Borg (ed.), Jesus at 2000.

14. Robert Funk, Honest to Jesus, The Five Gospels: The Search for the Authentic Words of Jesus, and The Acts of Jesus: The Search for the Authentic Deeds of Jesus.

15. John Calvin, Institutes of the Christian Religion, Book I, Chapter 3, para. 1.

16. Paul Tillich, Systematic Theology, Vol. 1, pp. 17, 181, and other writings.

17. Paul Tillich, The Courage to Be, p. 176.

18. Don Cupitt, Taking Leave of God, p. 166.

19. Gordon Kaufman, In Face of Mystery, p. 311.

20. Johann-Baptist Metz and Jürgen Moltmann, Faith and the Future, p. 31.

Chapter 2: The Decline of Christian Civilization

In 1940 Winston Churchill declared: ‘The Battle of Britain is about to begin. Upon this battle depends the survival of Christian civilization.’ Scottish theologian John Baillie responded in the question posed by his Riddell Lectures of 1945: What is Christian Civilization? He claimed that the essential element of Christian civilization is that the population as a whole believes what the church teaches.1

But how long is it since that could be said of the so-called Christian countries? In 1946 Christopher Dawson, a notable Roman Catholic historian of western culture, wrote:

Today Christendom no longer exists and we are moving towards a world in which the Christian peoples or the peoples that have formerly been Christian will be a minority. . . We no longer have any solid grounds for believing that the post-Christian era is likely to realize any of the humanitarian utopias in which the idealists of the nineteenth century put their faith.2

A similar judgement made in 1922 by German philosopher and historian Oswald Spengler had shocked readers. In his book The Decline of the West, Spengler argued that all cultures pass through a life cycle, and that western civilization was already in unavoidable decline. The time had passed when spiritual forces and values were determining the character of the western world; a new era had begun in which the scholar, the artist, the seer and the saint were being replaced by the soldier, the engineer and the politician, resulting in a technical civilization which was no longer Christendom.

Churchill’s solemn warning at a critical point in European history only drew widespread attention to what was already happening. The survival of Christian civilization did not depend on who won the Battle of Britain; it had already ceased to exist. World War II was itself a sign that Christian civilization was in an advanced form of disintegration. Dissolution has not come about as a direct result of enemy military action, Nazi or otherwise, but as a result of forces of quite a different order. And to borrow a phrase from the nursery rhyme, neither all the king’s horses nor all the king’s men can possibly put Christian civilization back together again.

We need to examine what we mean by Christian civilization, or ‘Christendom’, a word often used as a synonym. The term Christendom was coined to name the domain or realm where Christ was believed to rule. It may be defined as that society, with its own geographical area, which was subject to the rule of Christ, and whose culture and way of life had become so permeated and shaped by Christian beliefs and values as to form a cohesive whole. Christopher Dawson, in his book The Formation of Christendom, offers this description:

A culture and its language together form an autonomous world of meaning and existence which is indeed the only world of which the individual is conscious. It is man-made in the sense that it is the product of man’s creativity and his power of symbolic communication. But the individual is not aware of this, since both culture and language are unconscious processes in which men are immersed from their earliest infancy and on which this earliest social and intellectual activity is based.3

In the case of Christendom, the ‘autonomous world of meaning and existence’ was supplied by the complex of myths, goals and values of the religious tradition we now call Christianity. Both terms are relatively modern, first used (synonymously) in the seventeenth century. The reason why both these terms came into use comparatively recently is that when one lives within a culture like Christendom, there is no call to give it a name; as Dawson observed, it is ‘the only world of which the individual is conscious’. The practice of giving names to religious traditions is likewise a modern phenomenon, as Wilfred Cantwell Smith has pointed out,4 and it derives from the growing awareness of other cultures and civilizations. As soon as one feels the need to name one’s own culture or religion, one is no longer living wholly within its horizon; rather, one has taken, at least in imagination, the first tentative step outside it. This process of looking at one’s world and culture more objectively and analytically has made it necessary for us, first to create such terms as Christianity and Christendom, and, more recently, to distinguish between the two. Douglas Hall makes this distinction clearly in the title of his book, The End of Christendom and the Future of Christianity. It may be true that Christendom no longer exists but that Christianity, in a variety of forms, is still very much alive. In this chapter we shall examine the decline and fall of Christendom, leaving to later chapters the definition and destiny of Christianity.

The two terms Christianity and Christendom have been linked for so long that it was commonly assumed that the two must stand or fall together. Yet Christendom is at least some three to five centuries younger than Christianity and in some ways is a cultural product of Christianity. Christianity is a cultural tradition of religious belief and practice which, by its own reckoning, is 2,000 years old in the year 2000. Beginning as a Jewish sect, formed by the followers of Jesus of Nazareth after his death, the embryonic Christian church eventually broke away from its Jewish beginnings and became a new religion for Gentiles. It has long been independent from, and even antagonistic to, the Judaism which gave it birth and with which it still has so much in common. Within three to four centuries Christianity had outstripped all of its rivals, such as Mithraism and Manichaeism. But, as the modern study of Christian origins has made clear, the idea of establishing a Christian civilization was entirely foreign both to Jesus of Nazareth and to the early Christian movement. It was not until the Roman emperor Constantine decided to adopt Christianity as the new state religion of the empire, shortly after the beginning of the fourth Christian century, that the vital step was taken towards the formation of Christendom.

Even the fall of Rome, although disastrous at first, eventually nurtured the growth of western Christendom. The barbarian invaders were in time converted to Christianity; the bishop of Rome adopted the Roman emperor’s title of Pontifex Maximus; and the church, by stages, inherited the mantle of power which had previously been that of the Roman emperors in the west. The eastern church, by contrast, always remained more subservient to the emperor at Constantinople and later to the czar in Russia. Only in the west did the church largely fill the power vacuum left by the fall of Rome.

But the formation of Christendom depended on more than imperial authority. In the first few centuries of the Christian era, a synthesis of thought took place between the declining Graeco-Roman culture and the still-evolving system of Christian thought which had burst out of Judaism. This amalgam of Israelite prophetic zeal and the more abstract concepts of Greek philosophy constituted the belief system that provided Christendom with its ‘autonomous world of meaning and existence’. By the end of the first millennium, the newly emerged Christendom had extended its power and cultural influence even further than the boundaries of the former Roman Empire.

In this process Christendom had to defend itself from both external and internal threats. After vanquishing all its earlier rivals, it had to withstand both the intellectual and the military impact of Islam. Islam expanded much faster than Christianity and had established an impressive civilization in only half the time it took Christianity to reach the High Middle Ages. Eastern Christendom bore the brunt of the military and cultural advance of Islam and suffered quite heavy losses. Western Christendom, under Charlemagne’s leadership, just managed to stem the spread of Islam, and later counter-attacked in the Crusades.

Threats just as serious to the vitality of Christendom came from within, from a succession of schismatic and heretical movements. Its unity and catholicity were sorely tested, but Christendom was able to contain these threats, absorb new ideas and knowledge (such as Aristotelianism in the twelfth century) and cater for the whole range of human emotions and intellectual levels. By the High Middle Ages (the twelfth and thirteenth centuries) western Christendom manifested an impressive depth of intellectual culture, and an internal unity. Its confidence is reflected in the building of the great European cathedrals. It is easy to look back nostalgically and ascribe to the Middle Ages a perfection they did not possess. To be hypercritical is just as tempting, for by present moral standards life in the Middle Ages left a great deal to be desired. Nonetheless, it can be claimed that the Christian world of the High Middle Ages attained such a homogeneity of culture, one so permeated by Christian values and beliefs (as then understood), that it can be quite properly referred to as Christendom: that is, a domain or realm where Christ was believed to rule.

This Christendom was such a living, complex unity that it could be likened to an organism, in the way any healthy homogeneous society can be called a social organism. But just as all living organisms have a beginning and an end, going through a life cycle between conception and death, so it is with social organisms. This is why civilizations come and go, as Arnold Toynbee demonstrated in his study of world history. So it has been with Christendom.

Christendom rose out of the death of the Graeco-Roman civilization and advanced to maturity during the Middle Ages. But that phase of maturity is now long past and Christendom (or Christian civilization) is now facing its demise. And as the dying process of an organism can stretch over some time, so the demise of a social organism which has enjoyed a life span of some 1,500 years may be a lengthy process. At what point are we in that process? Is it premature or a gross exaggeration to assume the demise of Christendom?

In the wake of the colonizing expansion of the European nations, Christianity traveled faster and wider during the nineteenth century than in any previous period. As it spread around the globe, particularly in the American and African continents, Christianity was intent on incorporating the newly colonized areas into Christendom. There still existed in the Christian west a mood of triumphant conquest. By the beginning of the twentieth century all the churches of Europe and North America were heavily involved in what they called Foreign Missions, by which they meant their God-given task of Christianizing the rest of the world. An American Methodist called John R. Mott published a book in 1900 entitled The Evangelization of the World in this Generation, and this became a widely used slogan. It was fully expected that during the course of the twentieth century the whole race of humankind would be incorporated into Christendom. This was to be The Christian Century, as indicated by the journal which took that name.

Yet the twentieth century has witnessed severe and quite unexpected setbacks to the viability of Christendom itself. Firstly, the two greatest wars ever waged by humankind were initiated within Christendom and were largely fought by the so-called Christian nations. Globally, Christianity’s claims to be a harbinger of peace are far from justified. Secondly, the most horrifying act of mass murder or genocide ever perpetrated by humans occurred within a leading Christian nation; and this grew out of an anti-Semitism long present within Christendom. Thirdly, the Christian nations, made economically strong by both their political imperialism and their advanced state of technology, have not only constructed the weapons for nuclear war but have also been most to blame for the selfish exploitation of the non-renewable resources of the earth, for the accumulating mass pollution, and for the gross interference with the delicate ecology of the planet. All of these together endanger the future not only of humankind but of all earthly life.

The mood in the Christian west at the end of this century is entirely different from the confidence expressed at its beginning. It is hard to believe that, after two millennia, such drastic changes could have occurred in only one century. In relative terms, the collapse of Christendom is happening as fast as the collapse of the communist world following the fall of the Berlin Wall. Christian expansion is still occurring in such places as Africa and South America. But no longer is Christianity triumphantly conquering the world, as it still appeared to be doing at the beginning of this century. The so-called Old World, which set out to incorporate the New World into Christendom, has itself entered what is now called a post-Christian era.

Scottish theologian Ronald Gregor Smith said in 1966: ‘The tide of secularism has swept over the whole of the western world, the world that was once called Christendom, and beyond that it has reached into every land. . . It has flooded over every island and the remotest parts of the world’.5 No longer can it be said that Christian beliefs, values and aspirations are shaping our public life. The so-called Christian countries of the West have become increasingly secularized and no longer see themselves as subject to the rule of Christ. During the course of this century the observance of Sunday and the annual festivals of Whitsunday and Easter have almost disappeared as public Christian festivals. These once holy days have become merely holidays. They emphasize the fact that Christendom within the countries of the west is a mere shadow of its former self. Christendom has virtually ceased to exist.

How has this come about and why? The decline of Christendom in fact began just after the period of its greatest flowering. There is general agreement today that the Renaissance marks the starting point of modern times. (As we shall see in Chapter 4, this needs to be qualified by going back to the influence of the Franciscan philosopher William of Ockham, c.1285-1347). The Renaissance was much more than the rediscovery of the classical texts of ancient Greece and Rome. The study of these texts, written as they were by pagan, pre-Christian authors, led to such a new appreciation of the creative capabilities of humankind in its unredeemed state that it has been called a revolution of consciousness. The leading thinkers of the Renaissance began to look with new eyes on the human condition. Pico della Mirandola (1463-1494) wrote (when only 24) Oration on the Dignity of Man, in which he imagined the Creator saying to Adam something like this: ‘I created you to be neither heavenly nor earthly that you might be free to shape and master yourself. You may descend to be lower than the beasts or you may rise to be as gods. Your growth and development depend on your own free will. You have in you the germs of a universal life.’6

Attention was no longer so focused on other-worldliness; as a result, the natural and material world came to be revalued. Humans came to be seen less as fallen creatures living in a fallen world and more as autonomous, rational beings, capable of choice, of doing good of their own free will, and of creativity. The men of the Renaissance began to turn away from the eternal and the absolute (as commonly conceived) and to concern themselves with the world of nature and of internal human experience. The Greek and Latin classics they studied became known as the humanities, for in Cicero’s day humanitas meant the education of humankind. So the leading figures of the Renaissance commended the study of the classics as the means of nurturing our human potential. Because they placed such emphasis on the value and dignity of the human condition, their philosophy came to be known as humanism. The humanists eagerly sought the rebirth of the free and creative human spirit which they believed to have flowered in the ancient world and to have been lost in the Middle Ages. Thus the Renaissance brought to birth our modern awareness of historical change, our passion for freedom, our respect for human reason, and our eagerness to investigate the natural world and to extend our knowledge.

The Renaissance humanists were not atheistic or anti-Christian, as some modern humanists are. They were aware of the need for spiritual renewal within the church, but they sought to promote reform from within. They were critical of the papacy, which had been passing through its most dismal period, but they saw no reason to doubt Christian institutions as such. In 1381 an association was founded, open to both clerics and laymen, called the Brethren of the Common Life.7 It set out to counter decadence in the church, to promote spirituality and to provide general education. One of their students was Nicholas of Cusa (c.1400-1464), who has often been regarded as the model of a ‘Renaissance man’; he became a cardinal, yet he was also a mathematician, a diagnostic physician, and an experimental scientist. Erasmus (c.1466-1536) spent the first 30 years of his life in the schools of the Brethren of the Common Life before becoming the greatest of the humanist scholars. He wrote merciless satires against the church, and for this he lived a somewhat uneasy existence. Yet when the Protestant Reformation took place, he did not join it, partly because he was a man of moderate and tolerant temperament who abhorred violence, and partly because he was repelled by the anti-humanist element in so much of the Protestantism of his time.

Nevertheless, the Protestant Reformation was the logical outcome of the Renaissance, both because of the new sense of human freedom it created and because the resurgent interest in the study of biblical texts gave the Reformers the authority and courage to challenge the church hierarchy. It was said that Erasmus laid the egg which Martin Luther hatched. In seeing the Protestant Reformation as an assault on Christendom from within, the Catholics were partly right: the Protestant Reformation was destined to sound the death-knell of Christendom.

Protestants, of course, have long seen the Protestant Reformation as the resurrection of Christianity to new life, after its burial beneath the mass of superstition accumulated during the late Middle Ages. They too were partly right; even Catholic scholars now agree that the mediaeval church was in dire need of reformation. However, the Protestant Reformation was only partially successful, and there were heavy costs -- the fragmentation of Christendom and, ultimately, its dissolution. From this time onwards, Christendom increasingly lost both its unity and its all-embracing catholicity.

The splintering of the western church into denominations was much more serious than the schism between the eastern and western church that had taken place in 1054. The east-west schism left Christendom intact but in two sections, geographically separate from each other. The Protestant-Catholic division, however, split communities into warring Christian factions, creating conflict akin to that which smolders to this day in Northern Ireland. Christians came to spend much of their activity in countering one another.

The Protestant movement never achieved any organizational unity. The attempt to unite the Lutheran and Calvinist traditions at the Colloquy of Marburg in 1529 foundered on the doctrine of the Eucharist. Failure also met the attempt to unite the churches of the British Isles at the Westminster Assembly of 1643. Subsequently the Protestant denominations splintered even further, so that catholicity came to be replaced by sectarianism. The fragmentation of the church meant that western Christendom no longer had a unified organizational structure. What survived of the former Christendom was the fact that all (except the Jews) stoutly professed their commitment to the Christian tradition and their faith in Jesus Christ as Lord and Savior, but they differed strongly about what this entailed. There was no longer consensus on the doctrines of the church, the priesthood, the sacraments and the mode of Christian salvation. And because they no longer had a common personal head, Christians came to depend more and more upon the civil authority of the emerging nation states. Although these states saw themselves quite clearly as Christian, this process of state formation was the first step towards the emergence of the modern secular (or religiously neutral) state.

The fragmentation of Christendom which took place from the Reformation onwards did not lead directly to any questioning of the basic Christian beliefs; on the contrary, sectarian strife fostered so much bitterness and even fanaticism that the humanistic philosophy of the Renaissance was temporarily submerged. Not until the Enlightenment of the eighteenth century did it surface again.

It was partly because of sectarian strife that the humanist concern with the importance of human reason reappeared. Catholics were content to submit themselves to the authority of the Pope. Protestants turned to the Bible as to a ‘paper Pope’ but could not always agree on what it required them to do and believe. Christendom now lacked one central authority to which appeal could be made. For an honest and inquiring person, this posed an enigma. Since Catholic and Protestant could not both be right, by what criterion was one to make a judgement about their competing claims? And further, since one of them was clearly wrong, how could one be sure they were not both wrong? The first to give expression to this enigma was the French humanist Montaigne (1533-92), though he himself chose to remain a practicing Catholic.

Thus, on the margins between the surviving fragments of Christendom, more searching questions were being asked. As the Reformers had earlier appealed to the Bible in order to challenge the authority of the Pope, so human reason emerged as the only criterion by which one could challenge the different seats of authority acknowledged by Catholic and Protestant. The Age of Reason began to supersede the Ages of Faith. The leading thinkers of the Enlightenment (as the age came to be called) began to submit the basic tenets of Christianity to rational examination and produced what was called ‘natural religion’. They did not see themselves as atheistic or anti-Christian, but were intent on taking the Protestant Reformation to what they believed was its logical conclusion by removing the supernatural additions. The titles of their books indicate their intent: The Reasonableness of Christianity as Delivered in the Scriptures (John Locke, 1695), Christianity Not Mysterious (John Toland, 1696) and Christianity as Old as the Creation (Matthew Tindal, 1730).

In the course of this examination of Christianity, there emerged the first signs of modern unbelief, exemplified by the philosopher David Hume (1711-1776), the father of modern skepticism. As the Roman Catholic scholar Johann-Baptist Metz observed in 1995, the processes of the eighteenth-century Enlightenment resulted in the secularization of religion, the growing freedom of people to think and speak for themselves, the demythologizing of the powerful religious images and the end of theology’s cognitive innocence.8 This led to a profound crisis for Christian thought and teachings. Christendom had been a cohesive whole in which Christian beliefs and values shaped both public and private life. In such a world, it did not matter much whether people talked about Christendom, Christian civilization or simply Christianity. But when the cohesive whole began to disintegrate, it became necessary to distinguish Christendom from Christianity. From the Enlightenment onwards, public life became increasingly emancipated from the fairly rigid social structures that Christianity had shaped over the centuries. Christianity continued to shape people’s private lives, however, and would do so for at least another 200 years. The decline of Christendom meant that public life was becoming increasingly secularized and Christianity was being privatized.

This has not been all bad. In so far as people had private lives within the former Christendom, they were largely contained by uniform and prescribed patterns; the opportunities to develop one’s unique individuality were strictly limited. In Christendom there were standard ways not only of being a Christian but of being truly human. These differed somewhat for those in a monastic order, for the clergy, and for the lay people; Christian duties also differed for the aristocrat and the peasant. But the patterns were firmly set out, and peer pressure, as much as the authority of the church, ensured that this remained so. By contrast, post-Christian secular society provides much more freedom for individuals to be themselves. We now regard this as a value that we would be reluctant to surrender. In post-Christian society there are not only many ways of being Christian but also many more ways of being human. And the tolerance we have inherited from the Enlightenment enables us to accept this diversity and even to celebrate it.

To appreciate how the Enlightenment marked the end for Christendom, we must look at the new kind of culture that then began to emerge. In the progression from the mediaeval world to the modern world, there have been three main steps -- the Renaissance, the Protestant Reformation and the Enlightenment. If the modern world was conceived in the Renaissance, it came to birth at the Reformation and entered adolescence at the Enlightenment. As Dietrich Bonhoeffer said, we are now living through humankind’s ‘coming of age’.9

Many of the values and interests which we take for granted in the modern world have been widespread only since the Enlightenment. Take, for example, our passion for freedom of speech and expression. In pre-Enlightenment Christendom, one was expected to think in ways that were consistent with what the church taught; it was heresy to think or express thoughts at variance with orthodoxy or ‘right opinion’. This is why the word ‘freethinker’ (coined during the Enlightenment) gathered the sinister overtones which it has to this day, even though it meant at first what it literally said. But with freedom to think came freedom to explore new ideas and new knowledge. Already at the Renaissance, scholars had begun to pore over the ancient writings which included the original Greek and Hebrew texts of the Bible. From the Enlightenment scholars began to ask questions about the history of those texts and to develop a greater historical awareness. A static view of reality began to give way to the acknowledgement of change, and then to an evolving view of reality. The whole mental picture of the world we live in began to alter.

Thus the ‘Christian west’ is today very different from the ‘autonomous world of meaning and existence’, which it was when known as Christendom. Christendom is no more, and the so-called ‘Christian west’ is only the shell of its former self. The shell remains clearly visible in many structures, both physical and social, but they are no longer parts of a living whole. With the demise of Christendom or Christian civilization in the western world, what kind of civilization is left? Is it only the ghostly remnant of Christian civilization? Does it have any substance of its own which will enable it to survive, or is it living on past capital? Alvin Toffler, author of Future Shock, recently said: ‘We are witnessing the sudden eruption of a new civilization on the planet’.10 This is certainly not the Christian civilization which Christians expected and hoped for at the beginning of the twentieth century. Yet it manifests many of the values and ideals of western Christendom (partly because this new civilization was fostered by the spread of European culture). Is Christianity thus continuing, rather like a leaven in the new global civilization, as many Christians would like to believe? Or is Christianity destined for the same fate as Christendom? To this question we now turn.

Notes:

1. John Baillie, What is Christian Civilization?, p.44.

2. Christopher Dawson, The Making of Europe, pp. vii-viii.

3. Christopher Dawson, The Formation of Christendom, p. 35.

4. Wilfred Cantwell Smith, The Meaning and End of Religion.

5. Ronald Gregor Smith, Secular Christianity, p. 138.

6. Jacob Burckhardt, The Civilization of the Renaissance in Italy, p. 185.

7. See also Chapter 4.

8. Johann-Baptist Metz and Jürgen Moltmann, Faith and the Future, p. 31.

9. Dietrich Bonhoeffer, Letters & Papers from Prison, pp. 325-27.

10. Alvin and Heidi Toffler, War and Anti-war: Survival at the Dawn of the 21st Century, p. 242.

Chapter 1: The End of the Millennium

The year we call 2000 is a human convention created by western culture, projected upon the planet as a convenient way of measuring historical time. In the natural world of celestial bodies, the year 2000 has no actual existence, let alone significance. Not one of the astronomical cycles within our solar system (giving us days, months and years) is a simple multiple of the others, nor are any of them naturally divisible by thousands.

The factors leading to the convention now called ‘the millennium’ are various. In the first place, our ancient ancestors created a base-10 (decimal) numbering system only because we happen to have 10 fingers. There are plenty of other possible systems, some of them now considered better than our decimal system. Even to speak of a millennium, or 1,000 years, is to impose one particular numbering system on the measurement of time -- and a humanly invented one at that.

Secondly, the calendar which numbers the years from the supposed birth year of Jesus of Nazareth was established by Christians, and is peculiar to the Christian tradition. It has never been shared by other cultures. The Jews, for example, number their years from the first day of the creation of the world, as calculated from the Books of Moses. This they determined to be 3,760 years before the beginning of the Christian era. The year 2000 AD for Christians will be the year 5760-5761 for Jews. (The Jewish year does not begin on 1 January but on Rosh Hashanah, which varies slightly from year to year, according to the cycles of the moon.)

The Muslim calendar differs further from the Christian calendar. The Islamic year consists of 12 exact lunar months (or cycles of the moon around the earth) and hence is shorter, by about 11 days, than our solar year (the time of a cycle of the earth around the sun). The Islamic calendar starts neither from the birth of Jesus nor from the birth of Muhammad but from the Hijrah (‘flight’) of Muhammad from Mecca to Medina, which resulted in the establishment of the first Islamic state.
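The 11-day shortfall follows from simple lunar arithmetic. The mean length of a lunar (synodic) month, about 29.53 days, is a figure assumed here for illustration, not one given in the text:

```latex
% Approximate year lengths, using a mean synodic month of 29.53 days
\begin{align*}
\text{Islamic lunar year} &\approx 12 \times 29.53 = 354.4\ \text{days}\\
\text{Solar year}         &\approx 365.2\ \text{days}\\
\text{Difference}         &\approx 365.2 - 354.4 \approx 11\ \text{days}
\end{align*}
```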

Thus the Christian calendar is far from being observed worldwide. Only about one fifth of humankind is even nominally Christian. It is chauvinistic of Christians to assume that the other four-fifths of the human race should have any special interest in the year 2000 and the transition it marks from the second to the third millennium.

The year 2000 has, moreover, become suspect even for Christians. Although the calendar starts from the supposed birth of Jesus Christ, no one really knows in what year Jesus was born; a date around 4 BC is now seen as more likely. (That is why a group of leading New Testament scholars, known as the Jesus Seminar, named their 1996 conference Jesus at 2000.)1

So when we refer to the end of the second millennium and the beginning of a third millennium we are talking only about a human construct. And if we pay too much attention to the year 2000 we are in danger of being deceived by our own cultural creation, like a spider entangled in its own web. People who make plans to travel the world in order to view the dawning of the new millennium are caught in a similar sort of web. For they are observing yet another convention -- the now universal practice of imposing on the globe a meridian passing through Greenwich, a line arbitrarily chosen by the sea-faring people of Britain. Sunrise in the Chatham Islands off New Zealand’s east coast on 1 January 2000 has no meaning in the natural world. By endowing such ‘millennial’ events with some kind of absolute significance, we humans merely dupe ourselves with our own creations.

Any meaning associated with the year 2000 rests solely on certain religious beliefs held exclusively by Christians. For Christians the year of the birth of Jesus Christ was a year of cosmic significance: one by which, as they believed, the divine Creator of the universe and Lord of history had inaugurated an entirely new era for the world. It marked the moment when God had chosen to cut human history in two. Everything which came before that event was to be measured backwards, occurring, as they later said, so many years Before Christ (or BC). Everything which came afterwards was in a particular year of the Lord, Anno Domini (or AD).

Even in this more secular age, people in the western world tacitly, even if unintentionally, acknowledge Christian faith when they refer to a particular year by number, since the words Anno Domini, strictly speaking, are always implied. Some secular protest in the west against the continuing use of the traditional calendar might perhaps have been expected. Indeed, when Auguste Comte (1798-1857) promoted his new Religion of Humanity in the mid nineteenth century, he introduced a calendar of 13 months, named after such people as Moses, Aristotle, Caesar, Shakespeare and Descartes. The year of the French Revolution was Year One of his Positivist Calendar.

The year we call 2000 CE is not only religiously based, and hence culturally relative, but it is also based on doubtful calculations. It was a Scythian Christian monk called Dionysius Exiguus (Dennis the Short), living in Rome in the sixth century, who first suggested that our years should be dated from the birth of Jesus of Nazareth. In 525 AD, at the request of Pope St John I, Dionysius prepared a new schedule for the calculation of Easter; almost as an addendum, he suggested a reform which was little noticed at the time. He discarded the current Alexandrian practice of dating the years from the beginning of the rule of Diocletian (284 AD) on the grounds that it perpetuated the name of the Great Persecutor, and proposed that the years be numbered from the ‘Incarnation of the Lord Jesus Christ’. (Coptic Christians retain the Alexandrian calendar to this day on the grounds that it honors the martyrs put to death by Diocletian.)

Dionysius argued that, just as the Romans had come to regard the foundation of Rome as the beginning of the civilization of ancient Rome, so the coming of Jesus Christ into the world marked the beginning of a new era in the history of the world -- the Christian era. He placed that event in the year 753 of the Roman calendar. By then Christians were already celebrating 25 March (nine calendar months before Christmas Day) as the Feast Day of the Annunciation to the Virgin Mary. So Dionysius put the New Year’s Day of the first year in his new calendar on that date, believing it to mark the time of Jesus’ conception in the womb of his mother Mary. (Only in 1582 was New Year’s Day restored by Pope Gregory XIII to January, where Julius Caesar had earlier placed it.)

The acceptance of the ‘Christian era’ of Dionysius spread because of the use made of his new Easter tables. It was promoted by the influential ‘Doctor of the Church’, Isidore of Seville (c.560-636). In England the Christian era was adopted at the Synod of Whitby in 664, but it did not become general in Europe until the eleventh century, and in the Greek world not until the fifteenth.

The Christian era was perfectly consistent with the way Christians had come to understand the world. For them, the birth of Jesus Christ was a turning point in world history, a cosmic event just as basic to the universe as the creation of the earth, sun and moon, and the creation of humankind. Did not the biblical story of the birth of Jesus report that a new star appeared in the sky to mark the event, and that this enabled the magi of the east to find their way to Bethlehem? Was not Jesus Christ referred to in the New Testament as the new Adam? Such things made it clear to them that the coming of Jesus Christ was on a par with the creation of humankind.

This conviction about the central place of Christ in human history seemed convincing at that time and came to be universally accepted throughout the Christian world until less than 200 years ago. As Christians approached the beginning of the second Christian millennium (which, according to some authorities, they more correctly calculated to be the first day of 1001 AD), a huge crowd gathered in Rome, expecting the end of the world, and others flocked to Palestine to witness the Advent of the Savior as the Last Trump sounded. There was not even a hint in those days that the Christian picture of the universe would not stand up to careful scrutiny.

As we come to the end of the second millennium, however, we are aware of the subjective foundations of the Christian calendar, and of the legendary character of the data used by Dionysius Exiguus. The date of 25 December for celebrating the birthday of Jesus cannot be traced back with any certainty beyond 336 AD, and has no historical connection with Jesus. It probably resulted from the Christianization of the Mithraic festival, held at the winter solstice, which celebrated the rebirth of the unconquered sun, Sol Invictus. (Mithraism was the chief religious rival to Christianity in the pre-Constantinian days of Rome.) The Christian festival of the Annunciation on 25 March was arrived at simply by going back nine months from the supposed birthday of Jesus.

In fact we know neither the day on which Jesus was born, nor the year. Just how Dionysius calculated the year of Jesus’ birth we cannot be sure, but it is not consistent with what we find in the New Testament. According to the Gospel of Matthew, Jesus was born ‘in the days of Herod the King’. Since we know that Herod the Great died in what would now be reckoned the year 4 BC, Jesus was perhaps born no later than that year. Luke also assigns the birth of Jesus to the days of Herod but goes on to associate it with an enrollment ordered by Caesar Augustus when Quirinius was governor of Syria. But the rule of Quirinius is now reckoned to be about 6-9 AD, more than a decade after the death of Herod. Thus Luke’s dates, and other clues in the New Testament, are not at all reliable, partly because they were written nearly a century after the event and partly because they were determined by religious interests rather than by a concern for historical accuracy. Modern attempts to date the birth of Jesus by astronomical calculations concerning the brightness of a star are not reliable either, since the visit of Wise Men from the east is almost certainly legendary. Our traditional dating of the years thus rests upon quite late and unhistorical traditions, and in no sense does it mark a supposed significant event with historical accuracy.

Today, even the significance of the birth of Jesus is open to question. Jesus may still be regarded as a wise and innovative teacher who has exerted a great deal of influence during the course of the last 20 centuries, but he is now coming to be seen as one great teacher among others, rather than the incarnation of the one and only God, and the absolute Savior of all humankind. The date of his birth, if we could know it, would still hold some significance for Christians, but it has lost its universal meaning as a turning point in history.

Within the Christian tradition, however, the millennium is also regarded as significant for reasons quite other than the supposed birth-date of Jesus Christ. Some of this has to do with a numbering system based on the numeral 10. We all use the figures 10, 100, 1,000, and 10,000 as handy approximations for particular periods of time and these all occur frequently in the Bible, as they do in other religious traditions. The millennium, or a period of 1,000 years, can thus be seen as a convenient way of delineating a substantial piece of time. But the millennium has come to mean much more than that, largely because the Judeo-Christian tradition believed that God not only created the world but was unfolding its history step by step according to a divine plan.

The twentieth-century philosopher of history, R.G. Collingwood, judged the early Christians to be the first to conceive of writing a universal history of the world going back to the origin of humankind, and instanced Eusebius of Caesarea (c.260-340). But, long before Eusebius, the ancient Israelite scholars had laid sketch plans for such a universal history in their Hebrew Scriptures (the Christian Old Testament), and this was carried further by the Jewish scholar Josephus (37-100 AD) in his book, Antiquities of the Jews.

These early Israelite traditions (compiled in their present form during the period of the Babylonian Exile) were set out in eras divided by strategic turning points, such as the Great Flood (which led to a new beginning for humankind), the Tower of Babel (which led to the dispersion of peoples), the Abrahamic migration (by which the offspring of Abraham became the ‘chosen people’), the Exodus from Egypt (which led to the possession of the Promised Land) and, finally, the establishment of the Kingdom of David. It was in such ways that figures and events became milestones on the path of a universal history which revealed the supposed plan of divine salvation. History was viewed through theological spectacles, effectively imposing a supposed divine plan on the sequence of historical events.

Belief in such a plan motivated not only the calculations of Dionysius in 525 but also those of Bishop Ussher (1581-1656) a thousand years later. Ussher, noting the time of Herod’s death, decided that Dionysius must have been in error and placed the birth of Jesus in the year 4 BC. Already, the Venerable Bede (c.673-735) had carefully calculated from the Hebrew Bible that the Creation occurred 3,952 years before the birth of Christ. Ussher, influenced by the fascination with millennia which had become an integral part of the Christian schema, had little difficulty in making a few minor corrections to Bede’s calculations. He concluded that the Creation occurred exactly 4,000 years before the birth of Christ, in 4004 BC -- at noon on 23 October. (His calculations were still being printed in the text of the King James Bible well into the twentieth century.) Since, according to the Bible, the world had been created in six days, and 1,000 years are but as a day for God, Ussher believed that the world would last 6,000 years. (This conclusion had already been reached by the ancient Christian scholar Lactantius, c.240-320.) The building of Solomon’s temple was finished at the halfway mark, 3,000 years after Creation. Christ came 4,000 years after Creation; this left a further 2,000 years to run. By his calculation the world should have ended in 1996.
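Ussher’s schema can be set out as simple arithmetic, using only the figures given above (and, as such reckonings commonly did, ignoring the absence of a year zero):

```latex
% Ussher's chronology: all events dated from Creation in 4004 BC
\begin{align*}
\text{Creation}           &: 4004\ \text{BC}\\
\text{Temple completed}   &: 4004 - 3000 = 1004\ \text{BC}\\
\text{Birth of Christ}    &: 4004 - 4000 = 4\ \text{BC}\\
\text{End of the world}   &: 6000 - 4004 = \text{AD}\ 1996
\end{align*}
```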

This brings us to an even more important reason for the current interest in the new millennium, one which has very ancient roots. The special significance attached to a period of 1,000 years eventually became so widespread that the term millennium now appears in the Oxford English Dictionary as ‘a period of happiness and benign government’. Such a meaning can be traced back directly to a few verses in the biblical book of Revelation (Chapter 20:1-7). The writer, purporting to reveal the future of the world, speaks of a period of 1,000 years in which Satan will be bound and cast into a bottomless pit. During this period Christ, along with all those martyred for their faith in him, will reign, unhindered by the evil designs of Satan. But after the 1,000 years Satan will be released from his prison, and the nations from the four corners of the earth will engage in a gigantic conflict until fire comes down from heaven. Satan, Death, Hades and all people whose names are not found written in the Book of Life will be thrown into a lake of fire.

The writer of this strange apocalyptic book regarded the millennium as an interlude before the final cosmic battle -- an interlude in which peace and joy reign supreme and all evil is completely, if temporarily, suppressed. In his 1957 book, The Pursuit of the Millennium, Norman Cohn examined the influence of this belief in European history. He found that, between the end of the eleventh century and the first half of the sixteenth century, there was a succession of highly emotional mass movements, motivated by the desire of various groups to see the material conditions of their lives greatly improved. These movements varied in tone from violent aggression to mild pacifism, but their motivation could all be traced back to Jewish and Christian prophecies in the ancient world, of which the few verses about the millennium in Revelation are the best example. They are therefore often called chiliastic movements (chilias being the Greek word for ‘thousand’).

Cohn was interested in the possible similarities between these chiliastic movements and the twentieth-century totalitarian movements of communism and German national socialism. These latter, in a far more secular age, lacked the supernatural elements of the earlier more apocalyptic movements (which reflected the mediaeval beliefs of the time), but they did have strong religious overtones. Each was led into ruthless and irrational behavior by its eschatological ideals: communism looked to the imminent coming of the classless society, and Nazism was committed to the achievement of the pure Aryan race. Hitler’s Third Reich was intended to last for 1,000 years.

But where did the idea of a glorious future age come from? What led to the expectation that there would ‘shortly be a marvelous consummation, when good will be finally victorious over evil and for ever reduce it to nullity; that the human agents of evil will be either physically annihilated or otherwise disposed of; that the elect will thereafter live as a collectivity, unanimous and without conflict, on a transformed and purified earth’?2

More than three decades later, Cohn published his findings in Cosmos, Chaos and the World to Come: The Ancient Roots of Apocalyptic Faith (1993). Until some time around 1500-1000 BC, peoples as diverse as the Egyptians, Sumerians, Babylonians, Indo-Iranians, Canaanites and pre-exilic Israelites all had myths implying that, in the beginning, the world had been organized and set in order by immutable divine decrees, but that this order was continually under threat from evil and destructive forces. The conflict between the intended orderly cosmos and the ever-threatening chaos was given symbolic expression in a combat myth, in which a young hero god, or divine warrior, was charged by the gods with the task of keeping the forces of chaos at bay, and in return was awarded kingship over the world.3

Cohn contends that the Iranian prophet Zoroaster made a break with that static, yet anxious and unstable world view. (Cohn dates Zoroaster about 1200 BCE, while other scholars date him later, as far down as 588 BCE.) Zoroaster, according to Cohn, introduced a radical re-interpretation of the Iranian version of the combat myth. In Zoroaster’s view the world was not static, but was already moving, through incessant conflict, towards a consummation which would result in a perfect and conflict-free state. This perfect time would come when, in a prodigious final battle, the supreme god and his supernatural allies would defeat the forces of chaos and their human allies, and eliminate them once and for all. From then on, the divinely appointed order would be established for all time. Physical distress and want would become unknown. No enemy would threaten. Within the community of the saved there would be absolute unanimity. In other words, the world would be for ever untroubled and secure.

The impact of Zoroastrianism on Judaism (and hence on Christianity and Islam) is now widely, though not unanimously, accepted. As D.S. Russell noted, ‘The influence of Zoroastrianism, and indeed of the whole Perso-Babylonian culture, is amply illustrated in the writings of the Jewish apocalyptists’.4 There are a number of elements which the Judeo-Christian tradition and Zoroastrianism have in common, and which did not appear in Judaism until after it came into contact with Zoroastrianism from 540 BCE onwards. They include the naming of angels (Michael, Raphael and so on); a personal Devil (which Satan later became) with accompanying demons; a Book of Life which records the deeds of people during their lifetime; a coming cosmic conflict in which the forces of evil will be finally overthrown; the separation of the soul from the body at death; a general resurrection and a universal judgement; and an afterlife with rewards and punishments. Of particular interest is the division of time into successive significant periods (usually a millennium in length); in Zoroastrian teaching the world would last for 12 millennia, consisting of four eras of 3,000 years each.

R.C. Zaehner claimed that ‘from the moment that the Jews made contact with the Iranians they took over the typical Zoroastrian doctrine of an individual afterlife in which rewards are to be enjoyed and punishments endured … the idea of a bodily resurrection at the end of time was probably original to Zoroastrianism’.5 Cohn also concluded that the similarities between Zoroastrianism and the ideas found in the Jewish Apocalypses were too remarkable to be explained by coincidence.6

Only some Jewish parties -- such as the apocalyptic writers of Daniel, I Enoch and Jubilees, the Qumran Sect and the Pharisees -- embraced some or all of these Zoroastrian notions before the beginning of the Christian era. The Pharisees, who practiced a puritan and separatist form of spirituality, accepted the belief in a general resurrection (the more conservative Sadducees rejected it) and left a deep imprint on subsequent rabbinical Judaism. Other Jewish groups stood more on the margins of Jewish life; these later included the Christians. As Cohn points out, Zoroaster’s view that ‘the time would come when, in a prodigious final battle, the supreme god and his supernatural allies would defeat the forces of chaos and their human allies and eliminate them once and for all’ deeply influenced certain Jewish groups which, in turn, ‘influenced the Jesus sect, with incalculable consequences’.7

Since the 1890s New Testament scholars have been rediscovering the importance of apocalyptic literature among Jews and Christians in the ancient world, represented in the books referred to as Apocalypses, which offer visions, revelations of the future, and other divine mysteries. The term ‘apocalyptic’ refers to a religious outlook which contrasts the present, temporary and perishable age with a new age which is to be imperishable and eternal; it contains the belief that this new age is of a transcendent order which, in the imminent future, will suddenly break in from beyond by divine intervention. It is now acknowledged that much of the New Testament was written within a context of apocalyptic or eschatological thought, in which the early Christian movement looked towards the imminent end (eschaton) of the present age and the breaking in of the new age (the Kingdom of God).

There is much evidence of these views in the Gospels, where they are placed by the Evangelists in the mouth of Jesus, notably in Mark 13, Matthew 24 and Luke 21. While these are the longest and most explicit examples of apocalyptic thinking, there are many shorter examples. Scholars still debate the extent to which Jesus shared these apocalyptic beliefs and intended his teaching about the coming Kingdom of God to be interpreted in their light. Certainly apocalyptic beliefs were dominant among the early Christians and pervade the New Testament. They are clearly present in 1 Thessalonians 4 and 5, the earliest of Paul’s letters.

The Apocalypse of John is not so out of keeping with the rest of the New Testament as much later readers were inclined to think. Indeed, by the second century it had become the most frequently quoted New Testament book. Although this was the only Christian apocalypse to gain inclusion in the New Testament canon, there were other early favorites such as the Apocalypse of Peter and the Shepherd of Hermas. Further, the Jewish Apocalypses remained popular among Christians for several centuries -- and it was Christians and not Jews who were responsible for their survival.

Just as the Apocalypse of John made specific reference to the miraculous character of the millennium, so Papias (c.60-130 CE), bishop of Hierapolis in Asia Minor, expressed apocalyptic and millenarian convictions. He held that, in the 1,000 years after the general resurrection soon to come, Christ would establish his Kingdom on earth in material form; he believed that Jesus had described the coming millennium thus: ‘The days will come in which vines shall appear, having each 10,000 shoots and on every shoot 10,000 twigs, and on each twig 10,000 stems, and on every stem 10,000 bunches, and in every bunch 10,000 grapes, and every grape will give 25 metretes of wine’.8

By virtue of being canonized in Holy Scripture, the many pieces of apocalyptic writing in the New Testament, including the Apocalypse of John, have become permanently associated with the cardinal Christian doctrine of the expectation of the Second Coming of Christ, as expressed in the Creed: ‘He shall come again with glory to judge both the living and the dead and his kingdom shall have no end.’

It has been mainly at times of cultural change and social crisis, however, that apocalyptic beliefs and millennialism have been revived in Christian thought and practice. At other times this kind of thinking has been marginalized and strongly discouraged. The reason is simple. The apocalyptic has a special appeal to the downtrodden, the persecuted, the dispossessed and the wretched, for whom the idea of a sudden period of bliss brings new hope. The Jewish Apocalypses were written to bring hope and comfort to the Jews suffering domination by the Greeks and Romans. The Christian Apocalypses flourished in a time of the Roman persecution of the Christians. But in more settled times, and particularly after the Constantinian adoption of Christianity as the state religion, the church looked with disfavor on any movement which threatened the status quo.

There was an outbreak of apocalyptic thought and activity in the late second century, led by the prophet Montanus and his women supporters Prisca and Maximilla. Even the great Tertullian (c.160 -- c.220), second only to Augustine among the early theologians of the western church, embraced Montanism. Over the course of history the years 500, 1000, 1260, 1420, 1533 (the supposed fifteenth centennial of Christ’s death), 1843, 1844, 1845, 1847, 1851 and 1914 have in turn been awaited expectantly as the beginning of the millennium.

Amid the unsettled times of the Protestant Reformation millennial expectations were championed by the radical Protestant reformer Thomas Münzer (c.1490-1525), who became the leader of the Peasants’ Revolt. There were further apocalyptic outbreaks in the nineteenth century and these gave rise to new sects, such as the Millerites, the Seventh Day Adventists and Jehovah’s Witnesses. William Miller, the founder of the Millerites, had a following of more than 100,000 and announced that Christ would return and engulf the world in a conflagration in 1843. The millennialist sects have fallen into two groups, differing in their conviction as to whether Christ will return before the millennium (the pre-millennialists) or after the millennium (the post-millennialists). It is only to be expected that the approach of the year 2000 would once again revitalize millenarian expectations. In the 1970s the evangelical preacher Hal Lindsey wrote The Late Great Planet Earth, which sold 25 million copies around the world. In this and later books he prophesied the imminence of the battle of Armageddon, which would begin with a war between Israel and the Arab peoples and end with the destruction of all the major cities before the Advent of Christ and the coming of a new world. The same message was being preached by the American tele-evangelists and eagerly accepted by millions of viewers, including, it has been claimed, Ronald Reagan and some of his supporters.

In the expansion of Christianity around the world in the wake of European colonialism, new converts from the various indigenous peoples have not infrequently fastened on the apocalyptic component and blended it with their own cultural beliefs to create fresh millennial movements which offer their people hope of deliverance from imperialistic conquest and the arrival of a new age of bliss. Mainline Christian denominations have distanced themselves from millenarian and apocalyptic thought, finding its presence in the New Testament something of an embarrassment. But, as we shall see in later chapters, the mainline denominations have been declining. The conservative and more fundamentalist denominations are growing, and these usually set much store by millennialism and the Second Coming of Christ. Ironically, although they commonly pride themselves on being the guardians of a pure and unadulterated Christianity, they unknowingly reflect the long-term influence of the Iranian prophet Zoroaster.

This serves to illustrate the first theme of this chapter: that cultural influences are often at work in ways that are both unexpected and undetected. This, we have seen, is true of the Christian calendar and the millennium: they are simply cultural conventions, dependent on the Christian civilization that produced them.

But what is the future of Christian civilization? There are many signs that all is not well. To this topic we now turn, for, just as the ancient Roman calendar disappeared with the decay of Roman civilization, the decline of Christian civilization may ultimately lead to the adoption of a new and non-Christian calendar for universal use around the globe. If this is so, then we could be living through the end of the millennium in more ways than one. The year 2000 AD could be marking the end of the last Christian millennium, and the end to all Christian millennia.


Notes:

1. Marcus J. Borg (ed.), Jesus at 2000.

2. Norman Cohn, Cosmos, Chaos and the World to Come: The Ancient Roots of Apocalyptic Faith, Foreword.

3. Ibid., p. 227.

4. D.S. Russell, Between the Testaments, p. 22.

5. R.C. Zaehner, The Dawn and Twilight of Zoroastrianism, p. 58.

6. Cohn, op. cit., p. 222.

7. Ibid., pp. 227-28.

8. Ibid., p. 198.

Introduction

This book had its beginnings in the English summer of 1997, during a three-month spell in Oxford. Since the idea of the year 2000 CE was already creating some excitement, I began to ponder the questions posed by the century’s end. And so, in the Bodleian Library, I read around the theme of the millennium.

What, after all, does it commemorate? If we really wished to celebrate the two-thousandth anniversary of the birth of Jesus Christ, the year 1996 would be nearer the mark (though this is only an intelligent guess). Even by traditional reckoning, the start of the new millennium should be January 2001, as the Royal Greenwich Observatory has declared and the title of the popular film, 2001: A Space Odyssey, suggests.

The Christian calendar, long believed to be commemorating a divine event, is not only the product of the creative human imagination but also reflects the errors so often made by fallible human beings. So the year 2000 is but a human convention which rests both on a miscalculation and on convictions which have now become outmoded. Yet Mikhail Gorbachev, himself instrumental in altering the course of events in the former Soviet Union, was probably right when he said that humankind stands today at a watershed in its history. Thus, for reasons quite different from those associated with the origin and meaning of the western calendar, the year 2000 appears to mark a significant turning point in human history.

Some look forward to the new millennium with keen anticipation; others approach it with foreboding. The optimists see it as a golden age of unprecedented prosperity sustained by expanding knowledge and technology. The pessimists fear that the third millennium, perhaps even the first century of it, may bring crises of colossal proportions for the human species.

This striking ambivalence is sufficient to make us pause and take stock of what lies ahead. It is not only the second millennium which is coming to an end, but also much of what we have taken for granted in the past. We are living in a whole series of ‘end-times’. People who have already distanced themselves from traditional Christian beliefs may well question the need for a book about the end of conventional Christianity, when this seems self-evident. However, the end of the Christian era is likely to have far-reaching consequences, and we need to understand the reasons for its demise. Many who have abandoned all commitment to the tradition overlook the fact that many aspects of Christian civilization are also under threat in the post-Christian age. The end-times of the late twentieth century are on a global scale, and it is not only Christians who are at risk.

Most people who are still practicing Christians, in one form or another, will not want to contemplate the possibility of the Christian era ending. Yet, in turning to the Bible to defend the traditional Christian teaching, they will find that concern with the end-times is a dominant theme of the Bible itself. The anticipation of the end of the age, along with the hope for a new world to come, has had a long history in Judeo-Christian tradition. The Israelite prophets used words like these:

It shall come to pass in the last days that the mountain on which God dwells will be established as the highest of all mountains. All the nations will stream towards it to learn from God how to walk in the paths designed by him. He will establish justice among the nations, so that they beat their swords into ploughshares and their spears into pruning hooks and abandon war for ever. Thereafter they will sit by their vines and under their fig trees fearful of no one.1

Yet the prophets also warned that the God who had created the world could end it by destruction. About 620 BCE the prophet Jeremiah, observing the signs that the Kingdom of David was about to be swallowed up by invasion from the east, spoke of the end not just of the people of Israel but of the whole earth in these alarming words:

I have seen what the earth is coming to,
and lo, it is as formless and empty as when it began.
And I have seen the heavens;
their light has gone, only darkness is left.
I have seen the mountains and they are quaking,
and the hills are shaking to and fro.
I looked and there is not a human to be found,
and all the birds of the sky have fled.
I looked and the garden-land has become a desert
and all its cities are in ruins.2

Christianity emerged during a resurgence of this Jewish concern about the likely end of the age. It was a period of widespread cultural change and turmoil not unlike that which we have now entered. The Greco-Roman civilization had just passed its zenith. The Jewish people, after a dramatic political recovery in the time of Herod the Great, were experiencing their second expulsion from their ancestral land. The first Christians were Jews who shared this expectation of an imminent end with many of their fellow Jews. But they also believed that the Messiah had already come -- a Messiah who would usher in a new age, and thereby bring the Jewish tradition to its logical end or fulfillment. They went out proclaiming the advent of an entirely different world. Thus, for a variety of reasons, the first Christians saw themselves living in the eschata (end-times or last days) and this belief permeates the writings of the New Testament.

Today there is some doubt among scholars as to whether the idea of an end-time was actually taught and shared by Jesus, but it certainly became dominant in the rise of Christianity after his death. The imminent end-time is described in the New Testament in such striking passages as:

In those days the sun will be darkened and the moon will not give its light and the stars will fall from heaven, and the powers of the heavens will be shaken . . . this generation will not pass away till all these things take place. Heaven and earth will pass away, but these words will not pass away. Of that day and hour no one knows . . . but the Father only. As were the days of Noah, so will be the coming of the Son of man.

In the last days there will come times of stress. For men will be lovers of self, lovers of money, disobedient to their parents, inhuman, slanderers, haters of good.3

I saw a new sky and a new earth, for the first sky and the first earth had passed away and the sea was no more.4

The earliest of the New Testament writings also refer to it; in his letters, Paul wrote with passion and complete confidence about the approaching time when

the Lord himself will descend from heaven with a cry of command, with the archangel’s call, and with the sound of the trumpet of God. And the dead will rise first; and then we who are alive, who are left, shall be caught up together with them in the clouds to meet the Lord in the air; and so we shall always be with the Lord.5

Paul clearly expected the end of the present age and the coming of the new world to occur in his own lifetime. His attention was so focused on the immediate future that he showed little interest in the words and deeds of Jesus, as remembered from the latter’s earthly ministry. Paul saw the now glorified Jesus as the first of a new kind of human being, the new Adam.

The Jewish Christians came to see Jesus as a new Moses, who would lead not only the Jews but the whole of humankind out of enslavement to the Devil. The Gentile Christianity shaped by Paul went further in the second century, even debating whether to cut its links with its Jewish past. Thus Christians joyfully hailed the Christian Gospel as a new beginning, saying, ‘The old has passed away, behold, the new has come.’6 They looked expectantly to the forthcoming time when the world of the past would be completely replaced by a new world of peace, order and wholeness.

The early Christians’ expectation of the world to come was not fulfilled, at least not in the way they imagined it. As someone has succinctly put it, the first Christians looked expectantly for the coming of the Kingdom of God, but what they got was the church! By the end of the first century, and certainly during the second century, a good deal of readjustment in Christian thinking had to be done. What eventually emerged from the chrysalis of early Christianity was Christendom, ruled by an ecclesiastical institution which inherited the structures of imperial Rome. The church affirmed an increasingly detailed body of authoritative Christian doctrine in which hope for the world to come had been subtly transferred to a distant future, to be reached only after death and resurrection. Right up until recent times the Christian west was being openly shaped and motivated by what Christianity had ultimately become.

But we have now come to the end of Christendom. We are nearing the end of the global supremacy of the Christian west. We are even seeing the collapse of conventional Christianity. We are suffering the loss of what we long took to be verities and certainties. We are already caught up in a process of cultural change more rapid, more deeply rooted and more widespread than ever before in human history. We are now entering a post-Christian and uncertain global future. The sandwich-board message of the proverbial Hyde Park preacher, ‘The End is Nigh!’, has unexpectedly become relevant to everybody, Christian or not.

Conventional Christianity is, somewhat ironically, declining in the context of great cultural change not unlike that in which Christianity emerged. There are some clear parallels between now and the time of Christian origins, as there are also between the disintegration of the Christian west and the decline of the Roman Empire. Much of what the west has long taken for granted is now disappearing: the security provided by Christendom; the Christian way of interpreting reality; the confidence that the Christian path leads to eternal salvation; and the belief that Christian doctrine embodies the essential and unchangeable truths by which to live. All these are passing away. We face a future without them, a post-Christian future which will be very different from the past.

There are, however, some significant differences between New Testament times and ours. Christian fundamentalists, who read the New Testament at face value, and who look at today’s events through the lens of the New Testament, are inclined to treat the passing of the last 2,000 years as if nothing has substantially changed. They still look expectantly to the bodily return of Jesus descending from the heavens, the battle of Armageddon in the Middle East and the replacement of this present world with a ‘new heaven and a new earth’. But the end-times of the first Christians, on which Bible-believing Christians of today fasten their attention, are very different from our current end-times, described in Part 1 of this book. For the latter reflect the history of the Christian era and all that has happened in the centuries since Christianity came into being.

Today’s fundamentalists who proclaim that the end is nigh are apt to anticipate the destruction of the world with enthusiasm. Recently, for example, planeloads of American fundamentalists have been travelling to Israel to view the site, Megiddo, where they believe the great clash among the nations will break out, and the battle of Armageddon will bring to an end the world as we know it.7 As this event is believed to herald the return of Jesus Christ, they have no fear for their own future, understanding from the words of Paul quoted above, that they will be ‘raptured’ (lifted up into the sky and preserved from destruction) and that only non-believers will perish in the death of the old world.

Most mainline churches have little sympathy with this talk about end-times, whether it comes from the fundamentalists or from those who warn of the end of conventional Christianity. Mainstream Christians tend to look to the past rather than to the future, and to search for ways to restore Christianity to its former glory, as manifested in the flowering of Christendom. Only a minority of Christians show any awareness of the crisis now facing Christianity.

There are a few who are looking more seriously into the future of Christianity, and writing about it. They include: Stanley Romaine Hopper, The Crisis of Faith (1947); W.H. van de Pol, The End of Conventional Christianity (1968); David Edwards, The Futures of Christianity (1987); Ewert Cousins, Christ of the 21st Century (1992); Donald English, Into the 21st Century (1995); Douglas John Hall, The End of Christendom and the Future of Christianity (1997); Don Cupitt, After God (1997); and John Shelby Spong, Why Christianity Must Change or Die (1998). By and large, however, there has been great reluctance on the part of the churches to acknowledge that we are entering a post-Christian age.

An examination of today’s end-times is, however, highly pertinent to Christian and non-Christian alike. And what better time to attempt it than as we pass into the third millennium? What does the end of the Christian era mean, not only for the post-Christian west, but for the world as a whole? What will follow this series of end-times? What sort of post-Christian future can we expect in the twenty-first century? What does the world face beyond 2000 CE?

In Part 2, this book attempts, tentatively, to take stock of just where we humans are in the evolution of human culture on this planet, to explore the significance of entering a new era that is both global and post-Christian, and to look into the future.

Humans have long been used to contemplating their own personal future. Unlike other animals which (as far as we can tell) live in a timeless present, we have a sense of the passing of time. We humans are aware of change in personal development, as described in Shakespeare’s ‘seven ages of man’; so we are used to planning for the next day, the next year or even for a lifetime as when, in early adulthood, we choose a career or a spouse. But in the past this contemplation of a personal future normally took place with the idea that the cultural and physical environment had some permanence. The physical world in particular seemed to be changeless. Its mountains, rocks and rivers had such apparent reliability that God’s eternal presence was likened to a rock.

With the advent of the modern world view, humans have become increasingly aware of the phenomenon of change as something which permeates the whole of reality. From geology we have come to understand that, in geological time, the earth is continually changing its surface and the physical environment within which life is to be lived. Hard on the heels of the idea of a slowly changing earth came the notion of biological evolution. The planet’s innumerable living species are not fixed but are subject to slow evolutionary change, leading sometimes to the emergence of new species and sometimes to their extinction.

The ideas of evolution and historical development in the distant past had scarcely been accepted before the current process of cultural and religious change gained momentum. The widespread consciousness of this in the latter half of the twentieth century caused Alvin Toffler’s Future Shock to become a runaway best-seller. It struck a chord with what people were beginning to feel.

Thus it seems that nothing in the world stays the same. The Buddhist idea of universal impermanence has largely been confirmed. Moreover, the process of change is now accelerating, both in human culture and in our physical environment. So radical is human change that many national and minority cultures are threatened with extinction and are beginning to take desperate measures to try to preserve themselves. We can therefore no longer take the future of the world for granted as our forebears used to. But neither can we afford simply to ignore the future and stoically await what comes. If there are still different possible futures for planetary life in general, and for human existence in particular, those futures have come to depend increasingly on decisions made by the human species. We now have to plan not only for our personal future but also for the future of the earth itself.

Part 2, in assessing the trends now dominant in the global cultural change we are facing, will sketch the process of globalization, humanity’s threat to itself, the human threat to the planet that sustains its life, and the possible scenarios that may result from these threats. It will conclude by exploring the possible emergence of a new kind of society -- a global society, whose cohesion and harmonious life rest upon the rise of a global culture.

Every culture in the past has had a religious dimension, which motivates the culture and supplies it with its values and goals. As the word ‘religion’ has, in popular usage, become associated with an outdated supernatural world, we need to return to the original meaning of the word if there is to be any profitable discussion about the religion of the future. The Latin word religio meant devotion or commitment, ‘a conscientious concern for what really matters’; the English word ‘religion’, while often implying a sense of the sacred, originally referred to the human attitude of devotion. To be religious in any culture is to be devoted to whatever is believed to matter most in life. As we shall see, religion has been usefully defined as ‘a total mode of the interpreting and living of life’.8 That is the sense in which the word will be used here.9

The religious dimension of global culture, if it comes at all, will be naturalistic and humanistic in form. Yet it will evolve out of the many cultures which have preceded it -- and in particular its Christian heritage, simply because the civilization of the Christian west indirectly caused the modern world to come into being. But what could the Christian tradition (long wedded to a supernaturalist format) possibly contribute to the growth of a humanistic global culture? Curiously, the religious dimension of the new global society may draw on some of the long-neglected elements in the biblical tradition itself. The books known as the Wisdom Literature, also referred to as the documents of Hebrew humanism (Proverbs, Job, Ecclesiastes, Wisdom of Solomon, Ecclesiasticus), describe a religious way of thinking drawn from the lessons of daily experience. It was humanistic and universalistic, and it almost completely ignored the historical and theological themes which dominate the rest of the Old Testament. It drew freely from non-Israelite sources. And recent New Testament scholarship suggests that Jesus of Nazareth had much more affinity with this stream of thought than previously realized. ‘Jesus may well have been a wisdom teacher -- a sage,’ concludes Robert Funk of the Jesus Seminar.10

There are other (often forgotten) voices from the Judeo-Christian past which are now being heard more clearly -- people whose work or writing has contributed to the rise of the contemporary naturalist worldview. Modern ecologists (usually non-religious) are singing the praises of the mediaeval Francis of Assisi, a man who abandoned material riches and the life of sensual pleasure to adopt the simplest and most frugal of lifestyles. In particular, he turned back to the natural world, long neglected by earlier Christianity, and acknowledged his kinship with all living things. The order of friars that Francis founded was forbidden, at first, even to own property; they had to live by the work of their hands. The Franciscans also produced some remarkable thinkers -- Roger Bacon, Bonaventure, Duns Scotus and William of Ockham (who indirectly influenced Luther and Feuerbach) -- men who pioneered some of the ideas that led to the modern world several centuries later.

Exploring the idea of Christ in the twenty-first century in 1992, Ewert Cousins wrote: ‘with a penetrating spiritual insight, Francis saw an organic relationship between nature, the human and God . . . For him nature was sacred, an expression of God himself; it was a divine gift which bore God’s imprint’.11 He suggested that Francis’ integral humanism is even more relevant to the global context of the twentieth century than it was to the thirteenth.

The seventeenth-century Jewish philosopher Spinoza (1632-1677) also affirmed the relationship between divinity and nature. He began to speak of ‘God or Nature’, as if these were alternative concepts. Convinced of the unity of all reality, Spinoza effectively eliminated the great gulf previously thought to exist between the Creator and the Creation, between the spiritual and the physical. He was far ahead of his time, however, and was completely rejected by both Jews and Christians. Yet his ideas later flourished in the writings of Hegel, Schelling, Schleiermacher and Feuerbach in the nineteenth century, and of Buber, Teilhard de Chardin and Tillich in the twentieth.

Ludwig Feuerbach shocked the western Christian world in 1841 with his book The Essence of Christianity, which offered what might be called, in current terminology, a ‘deconstruction’ of Christianity.12 While he set out to expound Christianity positively, he did so on a completely natural basis and without any reference to supernatural forces. Feuerbach already sensed the radical cultural change beginning to take place. He spoke of the coming of a new era in human history and the need for a new religion. In his lectures to students in Heidelberg he concluded:

We must replace the love of God by the love of man as the only true religion . . . the belief in God by the belief in man . . . My wish is to transform friends of God into friends of man, believers into thinkers . . . candidates for the hereafter into students of this world, Christians, who, by their own profession are half-animal, half-angel, into men, into whole men.13

Feuerbach further offended the people of his day by suggesting that the ancient nature religions remained superior to Christianity since they were sensuously in touch with the earth and with nature, whereas Christianity had become separated from nature, and had made of God a separate, sexless, spiritual being. Some of his words could have been written today.

The Christian tradition has contained many different elements in the past, some of which were in their day rejected and condemned. Ironically, it is some of those dissident elements of the old tradition that may well contain seeds of the religious dimension of the future global society. The global era will, in some respects, be very different from the Christian era which preceded it, but there will also be continuity. The transition from one to the other forms the subject of this book.


Notes:

1. Isaiah 2:1-4.

2. Jeremiah 4:23-26.

3. Condensed from Matthew 24:29-39 and 2 Timothy 3:1-3.

4. Revelation 21:1.

5. 1 Thessalonians 4:16-17.

6. 2 Corinthians 5:17.

7. Grace Halsell, Prophecy and Politics: Militant Evangelists on the Road to Nuclear War.

8. See Chapter 6.

9. There is a fuller discussion in the author’s Tomorrow’s God, Chapter 7.

10. Robert W. Funk, Honest to Jesus, p. 70.

11. Ewert Cousins, Christ of the 21st Century, p. 135.

12. For a fuller account of the work of Ludwig Feuerbach see the author’s Faith’s New Age, Ch. 7.

13. Ludwig Feuerbach, Lectures on the Essence of Religion, p. 172.