Bias in School Textbooks
I was going through my history textbook and soon found all kinds of biased views in it. It talks about how white people basically invaded villages in Africa and dragged people out of their homes. In some cases, yes, this happened, but in most cases slaves were sold by other Africans; sometimes Africans even captured whites and used them as slaves. No one mentions that, however. The book also talks about all the wrongs Americans did to the Indians. Wrongs were committed on both sides, and I won't try to justify what Americans did, but Indians also raided villages and towns and killed men, women, and children. The bias in these textbooks is sickening, yet we as Americans happily go along with it.