What does American South mean?

Looking for the meaning or definition of the term American South? Here's what it means.

Proper noun
  1. An expansive region encompassing the southeastern and south-central United States, typically defined as including the states of Texas, Oklahoma, Louisiana, Arkansas, Mississippi, Alabama, Tennessee, Kentucky, Florida, Georgia, North Carolina, South Carolina, West Virginia, and Virginia.