- Despite the First Amendment's protection of religious liberty and its prohibition on an established church (the basis of the separation of church and state), many in the United States tend to think of their country as Christian. Indeed, a 2007 survey reports that 55 percent of respondents believe the U.S. is in fact a Christian nation, a belief that would have surprised the Founders. A revealing article by Kevin M. Kruse (The New York Times, 14-Mar-2015) shows why this is so.
After the Great Crash and the ensuing Great Depression of the 1930s, American business was assaulted by the public, labor unions, and F.D.R.’s New Deal. Business leaders pushed back with a campaign to regain their prestige.
"But nothing worked particularly well until they began an inspired public relations offensive that cast capitalism as the handmaiden of Christianity," writes Kruse.
"Accordingly, throughout the 1930s and ’40s, corporate leaders marketed a new ideology that combined elements of Christianity with an anti‑federal libertarianism." To see how they did it, read "A Christian Nation? Since When?"
A Christian Nation?
Virtually all modern accounts of the Revolution begin in 1763 with the Peace of Paris, the great treaty that concluded the Seven Years’ War. Opening the story there, however, makes the imperial events and conflicts that followed the war — the controversy over the Sugar Act and the Stamp Act crisis — into precursors of the Revolution. No matter how strenuous their other disagreements, most modern historians have looked at the years after 1763 not as contemporary Americans and Britons saw them — as a postwar era vexed by the unanticipated problems in relations between the colonies and metropolis — but as what we in retrospect know those years to have been, a pre-Revolutionary period. By sneaking glances, in effect, at what was coming next, historians robbed their accounts of contingency and suggested, less by design than by inadvertence, that the independence and nationhood of the United States were somehow inevitable.