the southeastern states of the US, used especially when talking about politics or history. When people in the US talk about the South, they mean the states that were originally part of the Confederacy during the Civil War. The economy of these states was based on slavery, and after slavery was officially abolished in 1865, most Southern states made laws that were unfair to black people or separated them from white people. Today, people think of the South as a place where people are more conservative (=not wanting changes) than in other parts of the US. ➔ deep South
the southern part of England. The South of England, especially the area around London, is generally considered to be richer than the rest of the UK, and a more expensive place to live.
the poorer countries of the southern parts of the world, including most of Africa, parts of Central and South America, and parts of southern Asia