America Was Founded as a Christian Nation
When our Founding Fathers referred to this nation as a “Christian Nation,” as did John Jay, one of the authors of The Federalist Papers, they did not intend that this be understood in the sense that an official church had been established.