The times are changing. Do you think the USA is still a Christian nation? Should citizens of the USA say that their nation is a Christian one?

That would violate the First Amendment, thereby negating the Constitution. If that were the case, there would no longer be a USA.

No, I don't think it is.

No, it isn't, and it never was. The majority of Americans are Christians, but to say that America was founded as a "Christian nation" is completely false.

It's not.
We weren't even founded as a Christian nation; that is a huge myth. Most of our founders were deists, not Christians by any means.

Technically, I don't think it really was to begin with.

The US was never meant to be a Christian nation. It does read "freedom of religion"; however, many people seem to think this means "freedom of Christianity." Evangelism is gaining a strong foothold here and it scares me to death, a classic case where the minority has the loudest voice. I think it's high time for the rest of us to stand up and scream.

No.

I do not believe the USA is a Christian nation at this point. Although our founding documents and legal system are based on the Judeo-Christian God, and many of us say we are Christians, examination of our worldview suggests that many of us are Christians only in what we say, not in what we think or do. The group below has done much research to support this.

In the context of its history, sure, as in a Judeo-Christian foundation. As far as the current political system reflecting Christianity, no, it is not a Christian nation. It never, ever was.

Nope. I hope it doesn't become one.

Yes, because it is still the predominant religion.

Christianity was never the official religion, but most citizens are Christian.

No, it never was. It was formed to be a secular nation by people, many of whom happened to be Christian. By design it doesn't subscribe to any particular faith, and for good reason.

America is no longer a Christian nation.

The USA is (largely) a nation of Christians, not a Christian nation. That was true at the founding and it is true now.

Nope. Even if the majority of people are, or once were, Christians, saying it's a Christian nation ignores all the other people in the nation who are just as much Americans.

Of course. It was founded that way, and it is up to us to continue it. Look around: the number one issue is Christianity, which is against abortion and homosexuality. It will return to being a nation after God's own heart.