Western world

"Western world" has meant various things at various times. During the Roman Empire it meant Italy and the countries west of there. At other times it has meant Western Europe or Europe or Christendom. During the Cold War it sometimes meant the democratic countries or those allied with the various NATO powers. Today it often means the places where most people speak European languages.