west
/wɛst/
Noun
1
The direction toward the point of the horizon where the sun sets; one of the four cardinal directions, opposite to east.
The sun sets in the west.
2
The western part of a region, country, or area.
They live in the west of the country.
3
The western part of the world, especially Europe and the Americas, or the countries of Western civilization (often capitalized: the West).
Many ideas from the West have influenced Asian cultures.
Antonym
east
Adjective
1
Situated in, directed toward, or facing the west.
The west side of the building gets sun in the afternoon.
2
(Especially of a wind) coming or blowing from the west.
A west wind brought cooler air.
Adverb
1
Toward the west; in a westward direction.
The birds fly west in winter.