Monday, August 11, 2014

Is America a Christian Nation?

Tags: religious right, political Christians, Christian nation, founding fathers

In what sense can we describe a nation as "Christian"? By its leaders, its citizens, or perhaps its laws? By all of these measures, America would not be considered a Christian nation. And yet I frequently hear Christians speak earnestly about how America is a Christian nation and how we need to get back to our Christian roots, almost as though the United States has a special place in God's heart. For many evangelicals in America, their Christian faith is directly tied to their American identity.

In fact, I have many memories of saying the Pledge of Allegiance at my Christian elementary school and singing "The Star-Spangled Banner" in church. But the truth is, America is not a Christian nation, and I'm not sure it ever really has been.

It's always difficult for me to hear certain Christians talk about the founding fathers as though they were proud, whole-hearted Christians. It's often said that they stood on biblical or Christian principles; I have even heard some say that they built this nation on the teachings of Jesus and the laws found in the Pentateuch. This is simply not true, and sadly misinformed. The vast majority of the founding fathers were not Christians. Thomas Jefferson, who wrote the Declaration of Independence and served as our third president, did not believe that Jesus was God, performed miracles, or rose from the dead. Does that sound like someone evangelicals would consider a Christian? In fact, Jefferson produced his own edition of the Bible, cutting out every reference to the miraculous and boiling the text down to Jesus' moral teaching. George Washington was an Anglican but had serious doubts about the traditional tenets of the Christian faith; whenever he referred to God, it was in deistic terms such as "Supreme Being" or "Heaven." Similarly, Benjamin Franklin recognized a deity but believed that Jesus was simply a moral teacher, not God. (Of all the founding fathers, the two we know were committed to orthodox Christian faith were John Witherspoon and John Jay.) The truth is that the founding fathers were far more influenced by the Enlightenment, particularly thinkers like John Locke and William Blackstone, than by any Christian distinctives.

Consider America's laws. Jesus said that to look at a person lustfully is to commit adultery in the heart, yet there are no laws prohibiting lust in this country. What about the first of the Ten Commandments, which demands that all worship be given to Yahweh? Nothing in US law requires Americans to worship Yahweh. Or what about Paul's teaching to avoid gossip, serve one another in love, and act in humility? These are all important practices of the Christian faith, and yet the American government does not mandate them. A church, on the other hand, will hold its leaders and members accountable for lust, the worship of other gods, and so on, and will take disciplinary action when biblical principles are violated. Not so the American government: an American citizen can worship whatever god he or she pleases, does not have to act in humility, and can lust after anyone or anything. The founding fathers were not drawing on Christian principles as they formed this nation's laws. Rather, they built this country on ideas derived from Enlightenment philosophy, with its high emphasis on the individual.

Finally, what about the people of America? Haven't Americans mostly been Christians? Perhaps partly. Christians did rise up and play a strong role in the struggles surrounding slavery and the Civil War (on both sides), and church revivals in America ignited genuine Christian fervor. But that is not to say that most Americans were committed followers of Jesus, and it certainly has not been the case in the past sixty years or so.

So what happened? Why are so many Christians today fixated on reclaiming America as a Christian nation? I think Daryl Cornett explains it best: "The nineteenth century displayed a significant Christianization of the American people up through the Civil War, evidenced by revival and social reform. After the Civil War, steady decline in religious adherence was the impetus for evangelicals to mythologize American history and pine for a return to a golden age of Christian faith and virtue at its founding that never existed."

All of this to say, I think it is high time evangelicals in America gave up the quest to reclaim America for something it never was: a Christian nation. Christianity was never meant to be spread through political power, and whenever it has been, the result has not reflected the self-sacrificial love of a dying savior. Christianity (or any religion) that is mandated, enforced, or politicized has always been, and always will be, a disaster. American Christians need to stop worrying about whether the Ten Commandments are displayed in courthouses and instead recall the words of Jesus on his way to the cross: "My kingdom is not of this world."


5 comments:

  1. “The believer's cross is no longer any and every kind of suffering, sickness, or tension, the bearing of which is demanded. The believer's cross must be, like his Lord's, the price of his social nonconformity. It is not, like sickness or catastrophe, an inexplicable, unpredictable suffering; it is the end of the path freely chosen after counting the cost...it is the social reality of representing in an unwilling world the Order to come.” - John Howard Yoder, The Politics of Jesus

  2. Admirable notions. I agree wholeheartedly with your assertions and applaud your challenge of those who have misguided understandings of American religious principles. Still, I think you could take it a step further. Religion, as a facet of human social interaction, was a device created long before Christ. Church is neither Christian nor something that should be revered as an institution of Christ. The gatherings we frequently refer to as "church" are not remotely like what Jesus of Nazareth would have been a part of. The system of bishops and priests, today evolved into the novel concept of a pastor, was a pagan one. What we refer to as the religion of Christ is, in fact, not. I do not know your opinions on Pauline scripture, but there are great doubts about its authorship and authenticity, and it is the whole basis on which the Catholic (Universal) Church was created. I think a paring down is in order to understand who and what Christ really was. This European rendering is clearly not the Christ of the Gospels. Jesus was not white, and he certainly did not devolve into the modern Christianity of today. Church, whether Catholic, Protestant, or any other form, is descended from European religious practices, not from Christ.

  3. Hi friend, thanks for your comment. I certainly affirm with you that there have been strong European and Western influences on Christianity and on the practices of the Christian church, some of which have caused deviation from the teachings of Christ in the New Testament. And you're absolutely right in saying that Jesus was not white. However, I am a Christian, and so I consider the words of the New Testament (from Jesus and Paul) to be authentic. Thus, I cannot arrive at your conclusion that the Christian church is merely a human religious system that was not instituted by Christ.

  4. Enjoyed reading this post. The last line is especially poignant.

  5. This comment has been removed by a blog administrator.
