One of the primary questions that I have been struggling with for the past couple of years has been this: what does it mean to be a Christian when the popular thing to do is to be Christian?
Let me explain. Christianity is based on the teachings, life, and resurrection of Jesus. This man named Jesus lived roughly 2,000 years ago and was a social outcast to the government and the way of life of his time. He broke many social, political, and religious codes, and was eventually crucified for it. He claimed to be the Son of God and the Messiah. Many people decided to follow him over the years, and from that sprang Christianity.
Christianity was founded in the Middle East (making Jesus a Middle Eastern Jew). It was seen in a negative light because it broke with the religious traditions of the region, whether allegiance to the government or to Judaism. Christianity was even viewed as a form of atheism because its new followers were renouncing the other gods. Many early Christians were killed because of this and became martyrs. Others had different reactions, such as monasticism, a fleeing away from the world.
Eventually, Christianity was legalized by the government thanks to Constantine. From there we have a complicated entangling of church and state that can be seen in the Catholic Church, the Holy Roman Empire, the Protestant Reformation, and the Church of England, to name a few instances. What is of primary importance in this case is to notice that the heart of Christianity lay in Western Europe, which was the center of the world at that time. Eventually it moved to the Americas and a few other countries with the spread of imperialism.
Now there are many sects, or denominations, of Christianity. The primary ones today can be broadly classified as Orthodox, Catholic, and Protestant. Christianity is the world's most popular religion. It is growing rapidly, and is especially predominant in countries that are developing or not as well off as America and Western Europe.
What got me thinking about this today was the reading for my Sociology of Religion class. In this class we are looking at religions around the world, how they change over time, and how they develop. I am just a few class periods in and I am fascinated!
The question I posed above is rooted in the fact that in many places in Western Europe and the United States, Christianity is thought to be the primary religion of the land. However, when Christianity was founded, as shown above, it was not the cool thing to be a Christian. This religion was founded upon being the outcast of society. It was founded on being set apart and not like the world. The idea of a "Christian nation" is in many ways at odds with the Gospel itself.
What struck me about my reading (Stephen Prothero's God is Not One, 2010) was the statistics that were presented (I won't get into the nitty-gritty of them here, but they are in that book if you are curious). What is striking is that while Christianity is growing in third-world countries, which according to Prothero is due mainly to the Pentecostal church, it is diminishing in Western countries. Many more people in Western countries are denying any faith in God whatsoever and are affiliating themselves with atheism or agnosticism. These are some of the same countries that have identified themselves by their Christian roots.
This information is common sense to many people. What strikes me about it is the way that Christianity in these so-called "Christian nations" works. For hundreds of years there has been a we-are-better-than-them mentality among white people in Western parts of the world. We can see this throughout history in the discovery and colonization of the Americas, European imperialism in Africa, and the containment policies of the Truman Doctrine. Throughout history, the focus has been on trying to get others to be like us.
I would argue that this mentality is very much still alive in America, and even more so in the American church. This is evident in the modern way of missions. People in Western, "Christian" countries are generally disconnected from missions. It's not that they don't see the need for, or advantage of, missions, but they are often less directly involved. More often than not, and I am guilty of this, we just give people our money and expect someone else to do something with it to benefit the hurting and the oppressed. Now don't get me wrong, this is good and money is needed; in fact, Jesus calls us to give up all our possessions to follow him. BUT missions is more than just donating money. It is working hands-on with people and showing them the love of Christ directly through our lives.
The longing to make people like us continues when we give people money to buy things and to make their churches spring up more like ours. Really, one of the most constructive forms of missions is helping people learn how to do these things on their own with the resources they already have available to them.
So much focus goes into American Christianity helping people in countries in Africa and Latin America. But what is fascinating to me is that we are trying to "fix" these places to have a more American ideology of the world. Really, Christianity is sprouting there and declining here. Maybe we are looking at this backwards. Maybe we need to look at what they are doing and see how that can impact us.
Just some thoughts.