Civil War—American
Overview
The U.S. Civil War was one of the most important events in American history. The war transformed the United States and fundamentally changed the American way of life. Just as importantly, the Civil War tested the nation's ability to remain united and determined whether the still relatively young country would live up to the idea that all people are created equal. At its heart, the Civil War was a battle over the future of slavery in the United States. While the Southern states wanted to keep slavery legal, opposition to slavery was growing in the North, where abolitionists were calling ever more loudly for its end. The Southern states ultimately left the Union, or the United States, to form their own country, the Confederate States of America, and war soon broke out between the two sides.