As someone who studied U.S. History and currently teaches it, I find it unacceptable to exclude these events from the curriculum. We cover the Holocaust extremely hard, and I'm not trying to downplay what happened; it still needs to be taught. But I feel light also needs to be brought to the internment camps and to how immigrants were, and are, treated. A well-rounded picture of U.S. History is important for building empathy toward others who may be different from us.