For whatever reason, there seems to be some ambiguity among American Christians about what the basis of their religion actually is and what they may and may not believe. The reasons can always be debated, but it appears to stem largely from the secular upbringing and education prevalent in their country and in most of the world.