In a couple of semi-recent political conversations, either the person I was talking to or I would start invoking the concept of “the West”. It could be western values, western culture, or just the West itself. Whenever this happened, I would realize that I had become much less certain we were on the same page. It was a weird sensation, because you would think everyone knows what the West is. At school, we have classes in western literature, western music, western architecture, western history, and so on. So why didn’t I know what other people meant by the West in the political circus? Here are a few things that people associate with the West:
- Christendom
- Enlightenment
- Colonies and global empires
- The free world
Without getting bogged down in a history lesson – I would lose to quite a few people at TWC when it comes to nitpicking history – there is something unusual about this list. All of these associations are exclusionary, each defined against a foil: for the Christian West to exist as a polity, it requires a heathen non-West; an enlightened philosopher requires an obscurantist to rail against; an imperialist country requires the existence of natives to be imperialized; and a free world fights for supremacy over the unfree world. Furthermore, not only are these ideas of the West exclusionary, they are also contradictory at times. Enlightenment thinkers fought against the Roman Catholic Church and Christian absolutist monarchies. Colonies were ruled undemocratically, in a way that was decidedly unfree.
This tension can also be seen at the peripheries of what is or isn’t the West. Why isn’t Latin America commonly seen as part of the West? The majority of its people are Roman Catholic; despite a few authoritarian governments, Latin America as a whole is still much more democratic than Africa or Asia; and its countries were part of the Allies in WWII, which is when “the free world” entered the American lexicon. But then there is the fact that these countries were colonies, and that they were a battlefield of the Cold War more than they were active participants in it.
Does this mean that being imperialist and anti-Communist are now the biggest qualifiers for being part of the West? The West itself decided that maintaining its colonies was politically as well as morally untenable. The truly Western thing to do now is to not have colonies and to not exert military control overseas against the locals’ will. And Marxism? Karl Marx was a 19th-century stateless German modernist philosopher whose ideas, as we know them, couldn’t have been conceived anywhere but the West. The Reformation and the Enlightenment were probably seen by their detractors as existential threats to the Christian West in their own days, and in some ways they were, but we do not think of them that way anymore. Now they are just chapters of needed change in a historical progression that is far more convoluted and less unique than we imagine it to be when we call it all “the West”.
The idea of the West was constructed after the fact, and its dichotomies strike me as an oversimplification no matter how you dice it. It’s a false binary. The different definitions of the West are also something of a house divided. Right now, I cannot synthesize the significantly disparate things I associate with the West in a way that is more meaningful than simply recognizing the nuances of the globe and everyone in it. A more cosmopolitan view of humanity strikes me as more useful, but perhaps I’m just missing something in these discussions invoking the West. What do you think?