Looking for the meaning or definition of the word anatomy? Here's what it means.
Noun
The art of studying the different parts of any organized body, to discover their situation, structure, and economy; dissection.
The science that deals with the form and structure of organic bodies; anatomical structure or organization.
A treatise or book on anatomy.
The act of dividing anything, corporeal or intellectual, for the purpose of examining its parts; analysis.
(colloquial) The form of an individual, particularly a person, used in a tongue-in-cheek manner, as a term a medical professional might use, but in a markedly less formal context in which a touch of irony becomes apparent.