I spoke to a young man who had just finished his first year of college. Since I'm military, he chose to bring up our wars. He didn't have much to say, only that WWI and WWII were about Hitler trying to take over the world and that Vietnam was the final phase of WWII. The only thing he really recalled from high school was that we shouldn't have been in Vietnam. It would be funny if it weren't so sad. Is history no longer taught, or do our youth simply find no value in studying it? It seems that without even a basic understanding, they would have no framework with which to make sense of their environment. I'm no history buff, but even I know that Mussolini started the Vietnam war (aka, the northern aggression).