What does New Deal mean?

Looking for the meaning or definition of the word New Deal? Here's what it means.

Proper noun
  1. A series of domestic programs enacted in the United States between 1933 and 1938 in response to the Great Depression, focusing on relief, recovery, and reform.
Examples
Even though Roosevelt never embraced socialism, aspects of the New Deal were clearly socialistic, as Peter Temin has noted.
Some uncles were devoted New Deal Democrats and others true-blue Republicans.
The New Deal has little appetite for trying conclusions with political champions.
Alone among the New Deal agricultural agencies, they provided subsistence and operating credit for farmers.
Chief Madzimawi of the Ngoni people in Chipata has appealed to the New Deal Government to provide clean and safe drinking water in his chiefdom.
American politics has assumed the form of a sweeping social reaction, aimed at overturning the reformist legacy of the New Deal.