Do We Live in a Post-Christian Culture?

Image by Aaron Burden via Unsplash.

The term “post-Christian” is used often, and just as often contested. It suggests that our culture, the culture of Western civilization, is undergoing a fundamental transformation from something explicitly Christian to something explicitly secular. “Post-Christian” also normally calls to mind a connection with the thought of the Ancients, especially as mediated through Christian thinkers. Regardless, the question remains: do we live in a post-Christian culture, and if so, what does that mean?

Thinkers like Rod Dreher are convinced that we live in a culture that has mostly forsaken its Christian heritage. In Dreher’s eyes, the last few decades have been filled with political decisions that directly contradict the Christian faith. From the legalization of abortion to the recent decision on gay marriage, we have left behind what it means to be “Christian.” These political decisions were influenced by an increasingly secular culture, but the fiats themselves have now furthered the process of secularization, completing a vicious cycle and ushering in a post-Christian age.

By contrast, there are others who place an emphasis on the inherently non-Christian nature of the American polity. The argument goes that while some of the Framers were devout Christians, others were deists, influenced by the very Enlightenment principles which would eventually eat away at the foundations of the so-called Christian culture. In this view, America in particular has never been particularly Christian. And while you might say that Europe has a Christian heritage, that heritage has manifested itself in (and fought with) secular reforms.

A third camp builds on this last point by arguing that secular tendencies are really just the secular manifestations of Christian ideas. For example, Jürgen Habermas, a deeply secular thinker, has even commented that “Christianity, and nothing else, is the ultimate foundation of liberty, conscience, human rights, and democracy, the benchmarks of Western civilization. To this day, we have no other options. We continue to nourish ourselves from this source. Everything else is postmodern chatter.” He argues that we live in a “post-secular” society, in which the merits of the modern project may be debated, and has gone on record saying that “to exclude religious voices from the public square is highly illiberal.” All this from a thinker at whom most Christians would scoff.

The truth, I think, is a combination of all three of these positions. Our society is “post-Christian” insofar as we are no longer medieval people whose everyday lives are wholly dominated by a Christian consciousness, but that hasn’t been the case in many places for hundreds of years. Religion maintained a veneer of importance for centuries, but the overarching political and cultural authority of the Church began breaking down after the Reformation, and even before.

It is also true that America is not a truly “Christian” nation. Many of the liberal ideas in the Constitution were long seen as harmful by the clergy. It is more effective, then, to think of our current culture as an outgrowth of the old one, perhaps with a less positive accent than the one apparent in Habermas’s commentary. What we can do, however, is examine how and why our culture became “post-Christian.” Only then do we have a chance to distinguish fact from fiction, reclaim what is worth reclaiming, and separate the wheat from the chaff.
