
The End of Christian America


I read today the phrase "As Christian America blessedly ends". Is our country Christian? Was it ever? Is it ending? What does it matter?

Whether or not our country is "Christian" has been debated in evangelical circles for the last generation. I've been among conservatives who are convinced that God supernaturally planted the US to be practically a theocracy, a view spread loudly by David Barton (to whom I will not link). The vast majority of historians, though, teach that our country was founded by men who were generally Christians (see John Fea's Was America Founded As a Christian Nation? - great book that you really should read). You can also argue that we are a Christian nation because many people on surveys will answer that they identify with the Christian religion. Guess where I stand in the argument!

Our country may have peaked in terms of public "Christianity" in the 1930s to 1950s. Christian fundamentalism gradually mellowed into evangelicalism. Conservatism became the order of the day. GIs came home, got married, had 2.5 kids, went to church, and generally settled down. Well, at least that was the idea.

Unfortunately, underneath that idea were racial segregation, patriarchalism, and sexual intolerance, all of which were supported by public Christianity, particularly evangelicalism, which was growing. But rebellion started in the '60s and didn't let up. Society embraced civil rights, feminism, and LGBT rights (to varying degrees and at varying rates), but evangelicals did not.

In the 1970s and 80s, evangelicals began to get involved in politics to make society more moral. They had the idea that not only should they be working from the bottom up, encouraging people through churches, but that they should also change laws to make society more Christian, which has not been terribly successful. (Surprise! You don't go from being a good person to being a Christian; you go from being a Christian to being like Jesus, who was supremely good.)

What got them into politics was that the IRS was going to remove tax-exempt status from schools, like Bob Jones University, that discriminated on the basis of race. Their second goal was to overturn Roe v. Wade. Neither effort was successful. Generally, when Republicans have been in power in government, evangelicals have had more say with the administration. Yet, as conservative Christians have gained government influence, they seem to have had less sway over the culture. Our American culture has become a moral morass.

And the last twenty years have not been pretty for the Christian church. We are seeing many Christian denominations hemorrhage members, including the historically strong Southern Baptists. The mega-churches of the late 20th and early 21st centuries don't seem to be growing the way they used to. And evangelicals are losing their hold on American society, as evidenced by the widespread approval of gay marriage (strongly opposed by evangelical leadership) and the general lack of interest in moving back to "traditional" sexual mores.

And, in the last presidential election, when given a choice between a pro-choice Democratic woman who had been married for 42 years and a thrice-married, adulterous, "pro-life" Republican man, the evangelical vote went 81% to Donald Trump. In addition, there is a group of evangelical advisers to Trump, called by John Fea the "Court Evangelicals", who have encouraged Trump in his choice of a Supreme Court nominee as well as in such actions as barring transgender people from serving in the military. Sadly, these "Court Evangelicals" are the same men (and one woman) whom I used to look up to as godly individuals, but who now continue to vocally support a man (Trump) who is clearly the most immoral man to hold the office of President.

What does this have to do with the decline of "Christian America"? Plenty. When our "Christian" leaders can no longer recognize right from wrong, and when they make up excuses to support immorality in the higher echelons of government (Cyrus? Please.), we can no longer trust those leaders to lead us as Christians. Not only are members of our society less and less often identifying themselves as Christians (especially as evangelicals), but one wonders why they should. We have not given them much of an example to work with.

Clearly, the Church (particularly evangelicals - that's the branch I can speak to with some legitimacy) is not making much of a difference in our culture. Christian America is going away. A large part of the reason is that Christians don't look much different from anyone else. We are materialistic and give just as little of our money to charity as anyone else (less, according to some surveys). Our churches are segregated, and we don't seem to be too bothered by it. Racial justice is not on the radar of most white Christians. And our sexual ethics are pretty much the same as everyone else's. I'm not talking about LGBT issues, but about promiscuity and the lack of commitment. (I'm pretty sure God is OK with LGBT relationships.)

What now, though? "As Christian America blessedly ends", what do we do? We start to live more like Christians (which we should have been doing all along). The Church has never done well as a majority religion. Go back and read some medieval history if you want some serious examples. The history of Christendom is littered with corruption and graft, in the Church as well as in the various European states.

Tertullian supposedly said, "The blood of the martyrs is the seed of the Church." The early Church grew exponentially because they had the recent example of Jesus to guide them. Jesus lived simply and loved greatly.

As a minority religion, Christianity will have to return to its roots of simply loving and caring for people. The New Testament shows us how to do this. Mark 10:43-45: "Instead, whoever wants to become great among you must be your servant, and whoever wants to be first must be slave of all. For even the Son of Man did not come to be served, but to serve, and to give his life as a ransom for many." See also the entire book of Acts, 1 John, and most of Paul's epistles.

We don't need Evangelical leaders with the ear of the President. In fact, they seem to be doing more harm than good. They are telling us that wrong is right and supporting public policies that hurt the poor and downtrodden among us. Political power is not where it's at. The end of "Christian America" will be a blessed thing, indeed.

Don't worry about the upcoming changes. Don't fret that people are being honest about the fact that they don't buy into the Christian thing. That's OK. God's got this. He's got work for us to do. We need to re-evaluate our own lives. Are we living loving and simple lives for Jesus? Are we telling people about the Christian life? Don't worry about whether or not you are Evangelical or Mainline (Presbyterian, Episcopal, etc.) or Catholic. What matters is that you believe in Jesus and are living the way he wants you to.

We don't need "Christian America". We need Jesus.

Catherine
