I believe the historical events shown in the videos absolutely continue to impact our society today. I like to think of myself as an educated professional social worker, but after watching these videos, I see just how much I really don't know about People of Color and the horrors they have endured. It is easy to see how insidious some of these forms of racism have been and how they continue to perpetuate racism throughout our country. White Americans have hidden behind the guises of morality, medicine, and justice to perpetrate systemic racism, and many continue to do so. I feel manipulated and lied to by my public education. What I keep asking myself is, why aren't these injustices, events, and blatant abuses taught in American History classes across the United States? Why do children learn only about the 'wonderful' accomplishments of white historical figures, and why does that version of history erase People of Color and their experiences throughout American History?