I heard Todd Starnes report on a Florida school system that uses a new history textbook that pushes Islam and Muhammad in a 36-page chapter while saying nothing about Christianity. We have also seen over the past 20 years or so how atheists use the ACLU to remove Christianity from public view in many ways. We see revisionist historians remove Christ and His followers from the history books. The rise of Islam in America is starting to mirror what Europe has seen over the past 20 years, and this worries me. But should it? Are we a Christian nation, and were we ever? Is that important? After all, Christianity rose in a very hostile part of the world 2,000 years ago. If we are to separate ourselves from secular society, should we leave it to non-Christians? Does my being salt and light have anything to do with what is taught in our schools or what is displayed at Christmas time? What are your thoughts?