The next subject I want to focus on is how the Native Americans were treated. When the Pilgrims came to America, we know that the Native Americans helped them. Yet for some reason it's rarely taught that they were then pushed off their own land. They were oppressed, used, and abused. Tell me, is that the Christian way? How can you tell me that this nation was founded on Christianity when so much of its history contradicts it? The Bible says we're supposed to give tribute to whom tribute is due. In my opinion, the Native Americans should have been paid homage for already being here. Some people may argue that they used to attack the settlers from Europe, but wouldn't you attack someone who came onto your land and tried to take it away? I'm pretty sure there are some points I'm missing, but I'm sleepy, so my mind is scatterbrained. We'll address what I missed in the comments section.