- Despite the First Amendment's protection of free religious practice and its bar on an established church (the basis of what is commonly called the separation of church and state), many in the United States tend to think of their country as Christian. Indeed, a 2007 survey reported that 55 percent of respondents believe the U.S. is in fact a Christian nation, which would have surprised the Founders. A revealing article by Kevin M. Kruse (The New York Times, 14-Mar-2015) shows why this is so.
After the Great Crash and the ensuing Great Depression of the 1930s, American business came under assault from the public, labor unions, and F.D.R.’s New Deal. Business leaders pushed back with a campaign to regain their prestige.
"But nothing worked particularly well until they began an inspired public relations offensive that cast capitalism as the handmaiden of Christianity," writes Kruse.
"Accordingly, throughout the 1930s and ’40s, corporate leaders marketed a new ideology that combined elements of Christianity with an anti‑federal libertarianism." To see how they did it, read "A Christian Nation? Since When?"
The unattractive truth was that the arrival of the provisional treaty ending the war in April 1783 made the Continental Army superfluous, and the sooner it disappeared, the better. Congress eventually voted to provide full pay for five years for officers in lieu of half pay for life, but doing so was a purely rhetorical exercise, since there was no money in the federal coffers to pay anyone. Even that meaningless commitment generated widespread criticism, especially in New England, where returning officers were greeted with newspaper editorials describing them as blood-beaked vultures feeding at the public trough. At least in retrospect, the dissolution of the Continental Army in the spring of 1783 was one of the most poignant scenes in American history, as the men who had stayed the course and won the war were ushered off without pay, with paper pensions and only grudging recognition of their service.