By Linda Paine, Restoring Patriotism
It has been said, not only by President Obama but by Progressives in the media, in the justice system, and in our educational system, that America is no longer a Christian nation…or that we never were one. We have sat silent for years, allowing those who do not understand faith or care about the truth of our history to re-create our nation’s history and erase the importance that faith played in its founding. We have allowed them to indoctrinate our youth and our culture with the deception that God and faith do not belong in the public arena. The consequences have been disastrous.