I'm a social studies teacher and an activist, and I see history as extremely important to understand. It is essential that we all deconstruct the myths we hear every day, including those about the founding fathers, the Constitution, and other "holy" entities in this country. If we think that this country's founding is unquestionably good, we can't fully understand the injustices of today. These myths create a blindness in white people (and some people of color), a belief that the U.S. is usually good, which is incorrect. If we don't understand the conflict, and how people of color and their white allies fought to change things, then we have a misreading of history and fall into the idea that the U.S. is naturally good. Indigenous people, Africans, and other people of color built the wealth of this country, and reparations are important both for deconstructing systemic racism and for beginning a process of reconciliation, something the U.S. has never grappled with.
I completely agree with you on this. How is it that the history taught in our schools does not reflect true history? What can we as educators do to shift the narrative toward the truth, rather than the whitewashed version included in our history books? Other countries have managed to at the very least acknowledge the wrongs that took place and attempt to make amends, but America refuses to admit that these things even happened, let alone attempt to repair the damage. We need to acknowledge the mistakes of our past if we have any hope of a better future.