
What does Germany mean?

Looking for the meaning or definition of the word Germany? Here's what it means.

Proper noun
  1. (geography) The Central European state formed by West Germany's 1990 absorption of East Germany, with its capital in Berlin.
  2. (geography, historical) The Central European state formed by Prussia in 1871 or its successor states, with their capitals in Berlin.
  3. (geography, historical) A nominal medieval kingdom in Central Europe forming a region of the Carolingian and Holy Roman empires, with various capitals; by extension, the Holy Roman Empire itself or the empire of the Austrian Habsburgs.
  4. (geography, chiefly historical) The nation of the German people, regardless of their political unification (see usage note).
  5. (countable, geography, historical) West or East Germany or any other German state (see usage note); (in the plural) both, several, or all of these states, taken together.
Examples
The Black Bloc tactics emerged in Germany in the 1980s, and caught the attention of the North American public during the 1999 Battle of Seattle.
Severing's belief that trade union workers were the most progressive and democratic element in Germany holds up well under investigation.
Gazzamania, that much is clear, took off after the World Cup semi-final between England and Germany in Turin.
Hi everypony! I'm back from Germany and my luck held out! You'd never believe what I found!
In 1945, overseen by Alfred Hitchcock, a crack team of British film-makers went to Germany to document the horror of the concentration camps.
The northern pirates were now swarming on every sea, and the coasts of Britain, Gaul, and Germany were all alike desolated by their harryings.
