Tag "United States of America"

Back to homepage

Truth About the Vietnam War

Did the United States win or lose the Vietnam War? We are taught that it was a resounding loss for America, one that proves intervening in the affairs of other nations is usually misguided. The truth is that our military won the war, but our politicians lost it.