I actually think that the further our culture creeps into post-Christianity, the more opportunity there will be for the gospel to spread. The feeling that we were a Christian nation (as if there were such a thing apart from Christ's Kingdom) was a superficial and nominal assurance to begin with. There are more and more people who are actually opening up to the gospel.
Also, if we look throughout history, the gospel has always flourished under oppression. If some of you folks realized that, you'd drop the silly and futile political activism, give up on keeping gay marriage and abortion illegal, and start actually working to spread the gospel. Washington can do whatever it wants, and it won't hinder God's work.
Only if you're pre-trib, which was never a predominant view until recently.
"Christianity," our state religion?
Discussion in 'News & Current Events' started by billwald, Dec 5, 2011.