America has never lost a war, besides a disputed 'loss' in Vietnam. I've noticed that America demonises its enemies through propaganda. That's a valuable wartime technique, but it seems that even years after a war ends, we keep our enemies 'faceless'. So-called in-depth books about WW2 call our enemies 'the Germans' or 'the Japanese' and leave it at that. There are no accounts of German or Japanese soldiers showing courage, only GIs fighting 'heroically'. In Vietnam, we never saw the enemy as men fighting for their country, only as the dirty 'Vietcong'. The guerrillas of mixed nationalities we fight in the Middle East now are simply called 'terrorists', without even considering that we are the ones imposing Western will on them. I'm someone who wants to look at both sides of a conflict, including our enemies'. What do you think?