I definitely think that history has a powerful effect on people's beliefs and behaviors today, but not always in the way it should.
First, the history taught in our schools is often whitewashed, and the United States and white people are portrayed as heroes or saviors. I teach 5th grade, and we cover the Native Americans, exploration, and colonization, all the way up to the Revolutionary War and the Declaration of Independence. The curriculum never fully addresses the way Native Americans were slaughtered and their land was stolen. Instead, explorers are framed as heroes for their home countries who created opportunities for their people to come to America.
During our units on colonization, the curriculum focuses on how hard colonization was for Europeans. Not once does it address how those same colonists enslaved millions of African people. Even in our unit on the American Revolution, there is no mention of how African Americans risked their lives and fought for this country in hopes of being freed from slavery.
It takes teachers who are willing to go beyond the curriculum, and who are educated (or willing to educate themselves) on these things, for our students to see the whole picture. In a sense, I believe the way history is taught in school almost reinforces beliefs of white supremacy and erases the rich culture and history of people of color in our country.
The thing that stood out to me most is how much things HAVEN'T changed. It is so sad to think that a huge, powerful country can make so many advances and still be stuck in an age of racism and bias.