I do not have kids, and it has been several decades since I took a US History course in a formal setting, so my question is directed to any forum members who are US History teachers, professors, or textbook authors/publishers: has US History education (including textbooks and in-class curricula) even begun to report history accurately?

I remember asking in grade school how the settlers could 'just take someone's land' and then later claim it was 'their land that couldn't be taken,' and the question was simply dismissed without any discussion. It seems to me that every person who is awarded a degree in History, or is certified to teach it, should know the actual facts, not just a 'white-written' version of them.

So, my question is: have the History books and History classes even started to tell the truth yet? And if not, isn't this just more of the systemic racism that perpetuates the problem? How will children learn the truth if the truth isn't being taught?