The phrase "the West" is often spoken in reference to the Western world, which includes the European Union (also the EFTA countries), the Americas, Israel, Australia, New Zealand and (in part) South Africa.
The concept of the Western part of the earth has its roots in the Western Roman Empire and Western Christianity. During the Cold War, "the West" was often used to refer to the NATO camp as opposed to the Warsaw Pact and the non-aligned nations. The expression survives, though with an increasingly ambiguous meaning.