Monday, January 9, 2012

Why do Christians think the USA is a Christian nation?

This is an excellent question. It is what people choose to believe because it tickles their fancy to do so. If nothing else, the United States of America is a secular nation. When our money says "In God We Trust," "God" carries whatever meaning people choose to give it, thanks to the 1st Amendment. Americans have freedom of religion. If they want to worship money in the place of God, there is no constitutional provision to prevent it. At any rate, many pastors join evangelical churches for the sake of making names for themselves and becoming rulers of their own little kingdoms. True Christian pastors who encourage genuine biblical understanding are a rare breed.
