
Coming to Terms With American Empire

April 15, 2015

"Empire" is a dirty word. Considering the behavior of many empires, that is not unreasonable. But empire is also simply a description of a condition, many times unplanned and rarely intended. It is a condition that arises from a massive imbalance of power. Indeed, the empires created on purpose, such as Napoleonic France and Nazi Germany, have rarely lasted. Most empires do not plan to become one. They become one and then realize what they are. Sometimes they do not realize what they are for a long time, and that failure to see reality can have massive consequences.

World War II and the Birth of an Empire

The United States became an empire in 1945. It is true that in the Spanish-American War, the United States intentionally took control of the Philippines and Cuba. It is also true that it began thinking of itself as an empire, but it really was not one. Cuba and the Philippines were the fantasy of empire, and this illusion dissolved during World War I, the subsequent period of isolationism and the Great Depression.
