The history of the lands that became the United States began with the arrival of the first people in the Americas around 15,000 BC. After European colonization of North America began in the late 15th century, wars and epidemics decimated Indigenous societies. By the 1760s, the thirteen British colonies were established. The Southern Colonies built an agricultural system on slave labor, enslaving millions from Africa. After defeating France, the British Parliament imposed a series of taxes; resistance to these taxes, especially the Boston Tea Party in 1773, led Parliament to issue the Intolerable Acts, designed to end colonial self-government.
In 1776, the United States declared its independence. Led by General George Washington, it won the Revolutionary War in 1783. The Constitution was adopted in 1789, and a Bill of Rights was added in 1791 to guarantee inalienable rights. Washington, the first president, and his adviser Alexander Hamilton created a strong central government. The Louisiana Purchase in 1803 doubled the size of the country. Encouraged by available, inexpensive land and the notion of manifest destiny, the country expanded to the Pacific Coast. The resulting expansion of slavery was increasingly controversial. After the election of Abraham Lincoln as president in 1860, the southern states seceded to form the pro-slavery Confederate States of America and started the Civil War. The Confederates' defeat in 1865 led to the abolition of slavery. In the subsequent Reconstruction era, the national government gained an explicit duty to protect individual rights. White southern Democrats regained their political power in the South in 1877, often using paramilitary suppression of voting and Jim Crow laws to maintain white supremacy.
The United States became the world's leading industrial power in the 20th century, largely due to entrepreneurship, industrialization, and the arrival of millions of immigrant workers. A national railroad network was completed, and large-scale mines and factories were established. Dissatisfaction with corruption, inefficiency, and traditional politics stimulated the Progressive movement, leading to reforms including the federal income tax, direct election of Senators, citizenship for many Indigenous people, alcohol prohibition, and women's suffrage. Initially neutral during World War I, the United States declared war on Germany in 1917, joining the successful Allies. After the prosperous Roaring Twenties, the Wall Street crash of 1929 marked the onset of the decade-long worldwide Great Depression. President Franklin D. Roosevelt's New Deal programs, including unemployment relief and social security, defined modern American liberalism.[1] Following the Japanese attack on Pearl Harbor, the United States entered World War II, helping defeat Nazi Germany and Fascist Italy in the European theater. In the Pacific War, America defeated Imperial Japan after using nuclear weapons on Hiroshima and Nagasaki.
The United States and the Soviet Union emerged as rival superpowers during the Cold War; the two countries confronted each other indirectly in the arms race, the Space Race, propaganda campaigns, and proxy wars. In the 1960s, in large part due to the civil rights movement, social reforms enforced the constitutional rights of voting and freedom of movement for African Americans. In the 1980s, Ronald Reagan's presidency realigned American politics towards reductions in taxes and regulations. The Cold War ended when the Soviet Union was dissolved in 1991, leaving the United States as the world's sole superpower. Foreign policy after the Cold War has often focused on conflicts in the Middle East, especially after the September 11 attacks. In the 21st century, the country was negatively affected by the Great Recession and the COVID-19 pandemic. In the 2020s, America withdrew from the war in Afghanistan, provided support to Ukraine during the Russian invasion, and became involved in the crisis in the Middle East.