What does United States of America mean?

Looking for the meaning or definition of the word United States of America? Here's what it means.

Proper noun
  1. A country in North America, stretching from the Atlantic to the Pacific Ocean, and including Alaska, Hawaii, and several territories.