Originally Posted by snowbird
Christians founded America
No... they didn't. Our founders were a mixture of Christians, Deists, and non-believers. And America was founded to have NO, as in ZERO, religious affiliation whatsoever. It was founded this way so that all people would be free to follow whatever faith, or lack thereof, they chose. There is no official religion of America. And this was exactly the desire of our founding fathers.