1940s
Overview
Although fighting ended halfway through the decade, World War II (1939-1945) was the defining event of the 1940s. Though the battles took place in Europe and the Pacific, the war brought significant and lasting changes to the lives of Americans both during and after it. Not long after World War II ended, the Cold War began, chiefly between the United States and the Soviet Union. This conflict shaped U.S. government policy long after the decade had passed.
Having suffered tragic losses in World War I (1914-1918), the United States tended toward isolationism in the years following what was then known as The Great War. That is, it stayed out of the affairs of other nations and refrained from...