Through most of America's history, Presidents and other leaders in government and business regularly made public statements that not only affirmed their personal faith in God, the Bible, and Jesus Christ, but also declared without reservation that the United States was indeed a Christian nation.