Manifest Destiny
the belief that the people of the US had the right and the duty to take land in North America from other peoples, because this was God's plan. The phrase was used by journalists and politicians in the 19th century, when US citizens moved west across North America and the US gained Texas, California, Oregon, and Alaska.
