Based on the religious beliefs of its founding fathers, America is essentially a Christian nation, established on Christian principles.
Yet as a consequence of the lies surrounding 9/11 and the illegal and unwarranted invasions of foreign countries that followed, invasions that killed, maimed, or displaced millions of Muslims and others worldwide, America has degenerated into anything but a Christian nation.