the belief that the people of the United States had the right and the duty to take land in North America from other peoples, because this was God's plan. The phrase was used by journalists and politicians in the 19th century as US citizens moved west across North America and the US gained Texas, California, Oregon, and Alaska.
Definition from the Longman Dictionary of Contemporary English.