What does anatomy mean?

Looking for the meaning or definition of the word anatomy? Here's what it means.

Noun
  1. The art of studying the different parts of any organized body, to discover their situation, structure, and economy; dissection.
  2. The science that deals with the form and structure of organic bodies; anatomical structure or organization.
  3. A treatise or book on anatomy.
  4. The act of dividing anything, corporeal or intellectual, for the purpose of examining its parts; analysis.
  5. (colloquial) The form of an individual, particularly a person, used tongue in cheek, as a medical professional might use the term but in a markedly less formal context, in which a touch of irony becomes apparent.
  6. (archaic) A skeleton or dead body.
  7. The physical or functional organization of an organism, or part of it.
Copyright WordHippo © 2024