A Christian Nation?

  • Despite the First Amendment's guarantee of free religious exercise and its bar on an established church (the principle commonly known as the separation of church and state), many in the United States think of their country as a Christian nation. Indeed, a 2007 survey reports that 55 percent of respondents believe the U.S. is in fact a Christian nation, a result that would have surprised the Founders. A revealing article by Kevin M. Kruse (The New York Times, 14-Mar-2015) shows why this is so.


    After the Great Crash of 1929 and the ensuing Great Depression, American business was assaulted by the public, labor unions, and F.D.R.’s New Deal. Business leaders pushed back with a campaign to regain their prestige, but nothing worked particularly well until they began an inspired public relations offensive that cast capitalism as the handmaiden of Christianity, writes Kruse. Accordingly, throughout the 1930s and ’40s, corporate leaders marketed a new ideology that combined elements of Christianity with an anti-federal libertarianism. To see how they did it, read “A Christian Nation? Since When?”

JDN | 16-Mar-2015

The Federalists of the 1780s had a glimpse of what America was to become — a scrambling business society dominated by the pecuniary interests of ordinary people — and they did not like what they saw. This premonition of America’s future lay behind their sense of crisis and their horrified hyperbolic rhetoric. The wholesale pursuits of private interest and private luxury were, they thought, undermining America’s capacity for republican government. They designed the Constitution in order to save American republicanism from the deadly effects of the private pursuits of happiness.

Gordon S. Wood
The Idea of America: Reflections on the Birth of the United States (2011)