The 20th century is often called the "Century of the West" or even the "American Century". However, if one looks closely at the kind of power the West was wielding at the end of the 20th century, it is difficult to ignore the rise of the East.
At the beginning of the 20th century, Western empires ruled the world in one form or another. Asia was either subjugated or attempting to Westernise, as in the case of Japan. The West was producing half the world's output while Asia, with half the world's population, produced barely a quarter.
A hundred years later, all the Western empires have fallen, the last of them (the Soviet empire) crumbling in 1991. India, China, Korea, Cambodia, Vietnam, Africa, South America: these names mark the geographical and political events that eroded Western dominance.
And if you're thinking of a cultural victory for the "West", consider that the Western models (whether democratic or communist) were never fully adopted in the East but rather adapted, to the point where they are no longer recognisable. As the 19th-century racial theorists would decry: miscegenation.
On the eve of the 21st century, the world has been re-oriented between a declining West and a rising East.
It is not certain that the multicultural societies of the West will be led to conflict. However, no one can question the fact that the cultural borders lie not somewhere in the Urals but at the heart of every Western city.
Your thoughts?