Is America really a Christian nation? Were the Founding Fathers really Christian, God-fearing people? Is there a hidden agenda behind the future of our government? We as Christians need to really go with what the Spirit is saying to the Church; the trouble is that a lot of our churches are not teaching all of the truth about what is going on. The Constitution never stated that America was a Christian nation, and unfortunately a lot of right-wing brothers and sisters believe that it is true.