I was shocked to learn about the medical discrimination that occurred and is still occurring to this day. It is rarely brought to light or talked about. I, for one, do not agree with anyone being tested, poked, and prodded for the sake of science. African Americans have the same skin, organs, and body parts as anyone else, so why do white men deem them as less than? For goodness' sake, this is the 21st century; how have we made little to no progress in how we treat people who aren't "white"? This country values the word of a white man more than anyone else's, as if they are the end-all, be-all of this country. Only when we as a country truly open our eyes and treat ALL people as people will we have true progress and equality. The injustice and inequality in this system are sickening and upsetting. I fear that a civil war will arise because of the great social unrest in this country.