A Commentary by Jakob Augstein
The word "West" used to have a meaning. It described common goals and values, the dignity of democracy and justice over tyranny and despotism. Now it seems to be a thing of the past. There is no longer a West, and those who would like to use the word -- along with Europe and the United States in the same sentence -- should just hold their breath. By any definition, America is no longer a Western nation.
The US is a country where the system of government has fallen firmly into the hands of the elite. An unruly and aggressive militarism set in motion two costly wars in the past 10 years. Society is not only divided socially and politically; in its ideological blindness, the nation is also moving ever farther from the core of democracy: it is losing its ability to compromise.
America has changed. It has drifted away from the West.