Honestly, I find it a touch redundant to even ask whether America is a god-centric country. Of course it is; a good fraction of American identity is rooted in religion and religion alone. Both Great Awakenings and the Cold War only strengthened this, the Cold War especially in modern times: America vs. the Soviets, religion vs. irreligion. As mentioned, the phrase "In God We Trust" was plastered on everything despite its obvious challenges to secularism. And of course, let's not forget that the right wing has turned just about every social argument into an argument about religion, even though in the eyes of the US government the Bible and every other religious text is nonexistent in its realm.

Thankfully, since the Cold War ended, America has been going the way of Europe and steadily, *truly*, secularizing. My generation is the least religious generation ever to exist on US soil, and honestly, I only see the apathy and doubt increasing as time passes.