
What does Germany mean?

Looking for the meaning or definition of the word Germany? Here's what it means.

Proper noun
  1. (geography) The Central European state formed by West Germany's 1990 absorption of East Germany, with its capital in Berlin.
  2. (geography, historical) The Central European state formed by Prussia in 1871 or its successor states, with their capitals in Berlin.
  3. (geography, historical) A nominal medieval kingdom in Central Europe forming a region of the Carolingian and Holy Roman empires, with various capitals; by extension, the Holy Roman Empire itself or the empire of the Austrian Habsburgs.
  4. (geography, chiefly historical) The nation of the German people, regardless of their political unification (see usage note).
  5. (countable, geography, historical) West or East Germany or any other German state (see usage note); (in the plural) both, several, or all of these states, taken together.
Examples
In 1945, overseen by Alfred Hitchcock, a crack team of British film-makers went to Germany to document the horror of the concentration camps.
Occupying Germany had proven too costly, and with its abandonment ended 28 years of Roman campaigning across the North European plains.
The Black Bloc tactics emerged in Germany in the 1980s, and caught the attention of the North American public during the 1999 Battle of Seattle.
One afternoon in 1920, a young pianist sat down in a shuttered room in the capital of defeated Germany and played a Bagatelle by Beethoven.
The Brazilians are eating their hearts out over their defeat by Germany in the World Cup.
After the turn of the century, the United Kingdom's industrial monopoly was challenged by Germany and the United States.
