United States People

INTRODUCTION

United States People, human population of the United States today and the characteristics of that population. These characteristics include the age, ethnicity, immigration rates, birth and death rates, and geographic distribution of the American people. This article discusses these characteristics and how they have changed during the nation’s history. It includes information on the growth of America’s urban and suburban society, the history of religion in the United States, and changes in the American family over time.

According to the 2000 census, the United States was a nation of 282,338,631 people. In 2006 the U.S. Census Bureau estimated that the United States population had reached a milestone: 300 million people. This population count makes the United States the third most populous country in the world, after China and India. Nearly 5 percent of the Earth’s inhabitants live in the United States. Historically, this nation has attracted vast numbers of immigrants from around the globe. Yet the United States remains less densely populated than other large countries or other industrialized nations—in 2009 there were 34 persons per sq km (87 per sq mi).

The population of the United States has grown continuously, from 4 million at the first national census in 1790, to 76 million in 1900, to 282 million in 2000. Its natural growth rate in 2009 was a moderate 0.5 percent, compared with a 1.25 percent growth rate for the world. This U.S. rate reflects the 13.8 births and 8.4 deaths per 1,000 people occurring yearly in the United States. Counting net immigration along with natural increase, it would take the United States about 71 years to double in population at recent growth rates, while the world population would double in about 55 years. These growth rates, both nationally and internationally, are likely to change, however, as birthrates were declining in developed and developing nations at the turn of the 21st century, and death rates were rising in parts of Africa and the former Soviet Union.
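The doubling times quoted above follow from the standard compound-growth relationship: doubling time equals ln 2 divided by ln(1 + r), where r is the annual growth rate. A minimal sketch in Python, assuming (as the 71-year figure implies) a total U.S. growth rate of roughly 1 percent once net immigration is added to natural increase:

```python
# Doubling time implied by a constant annual growth rate (compound growth).
# Rates are the 2009 figures quoted in this article; the ~1 percent total U.S.
# rate is an assumption inferred from the 71-year doubling figure, not a source claim.
import math

def doubling_time(annual_rate: float) -> float:
    """Years for a population to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

print(round(doubling_time(0.005)))   # U.S. natural increase alone (0.5%): ~139 years
print(round(doubling_time(0.0098)))  # total U.S. growth near 1%: ~71 years
print(round(doubling_time(0.0125)))  # world growth (1.25%): ~56 years (the article rounds to 55)
```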

For a large country, the United States is also remarkably uniform linguistically and culturally. Only 6 percent of Americans in the 1990 census reported that they spoke little or no English. This is very different from many other countries. In Canada, 66 percent of the population speaks only English and 21 percent speaks only French. India has 14 major languages and China 7 major dialects. The linguistic uniformity in the United States results from early British dominance and from widespread literacy. Advertising, movies, television, magazines, and newspapers that are distributed across the nation also promote a common language and common experiences.

Cultural differences among parts of the United States—north and south, east and west, island and mainland—are also disappearing. In the second half of the 20th century, Americans were more likely than ever before to travel or move to other parts of the country. The national media and large corporations promote the same fashions in dress, entertainment, and sometimes in behavior throughout the states and regions. Newer suburbs, apartments, offices, shops, factories, highways, hotels, gas stations, and schools tend to look much the same across the nation. The uniformity of the American media and the dominance of the English language not only characterize the United States, but increasingly influence cultures around the globe. E-mail and the Internet are the latest technologies that are spreading American English.

Although America’s culture is becoming more uniform, its society remains a diverse mix of ethnic, racial, and religious groups. The United States is a pluralistic society, meaning it is composed of many nationalities, races, religions, and creeds. Some of the people who immigrated to America embraced the opportunity to leave old cultures behind and to remake themselves unencumbered by past traditions and loyalties. Others found that the liberties promised under the Bill of Rights allowed for distinctiveness rather than uniformity, and they have taken pride in preserving and celebrating their origins. Many Americans find that pluralism adds to the richness and strength of the nation’s culture.

The diversity of the U.S. populace has been a source of friction, as well. Throughout the nation’s history, some segments of American society have sought to exclude people who differ from themselves in income, race, gender, religion, political beliefs, or sexual orientation. Even today, some citizens argue that recent arrivals to the United States are radically different from previous immigrants, can never be assimilated, and therefore should be barred from entry. There are very different understandings of what makes a person an American. The nation’s motto, E pluribus unum (“From many, one”), describes the linguistic and cultural similarities of the American people, but it falls short as a description of the diversities among and within the major groups—Native Americans, those whose families have been Americans for generations, and more recent immigrants. This diversity is one of America’s distinguishing characteristics.

This is one of seven major articles that together provide a comprehensive discussion of the United States of America. For more information on the United States, please see the other six major articles: United States (Overview), United States (Geography), United States (Culture), United States (Economy), United States (Government), and United States (History).

GROWTH OF U.S. POPULATION

The territories of the United States spread across many geographic regions and climates. The land stretches from the tropics to within the Arctic Circle. These varied terrains have attracted, challenged, and supported many different groups of people. America’s relatively low population density and its relatively high standard of living, along with opportunities for free expression, continue to fuel immigration to the United States. The nation remains a magnet for immigrants, despite the fact that substantial disparities exist in wealth and in access to resources between recent immigrants and more established Americans.

Growth Through Immigration

Colonizers and conquerors, wanderers and settlers have long been attracted to America’s abundant resources. Since 1820, when national record keeping began, more than 65 million people have come to the United States; 660,000 immigrants arrived in 1998 alone. The vast majority of Americans trace their ancestry to one or more of these immigrant groups. The various ethnic and racial origins of the residents and immigrants remain important sources of personal identity. Of the 224 million people reporting their ancestry in the 1990 census, only 13 million, or 6 percent, identified themselves as Americans only. The rest chose one or more broad racial or linguistic groupings (such as African American or Hispanic) or national heritages (German, English, Irish, and Italian were most common) to define their origins. Most Americans possess varied national, ethnic, and racial identities that reflect both the origins of their ancestors and their own affiliation to the United States.

Until the late 19th century, immigration to the United States was unrestricted, and immigrants came freely from all parts of the world. However, the areas of the world contributing the largest share of immigrants have shifted during the course of America’s history. In the 1790s the largest numbers of immigrants came from Great Britain, Ireland, western and central Africa, and the Caribbean. A hundred years later, most immigrants came from southern, eastern, and central Europe. In 1996 they were most likely to come from Mexico, the Philippines, India, Vietnam, and China—indicating a recent increase in Asian immigration. Not all immigrants stay in the United States. Although 46 million immigrants arrived in the United States from 1901 to 1999, nearly a third later returned to their homelands. In earlier years, a similar proportion of migrants returned.

The 1990 census indicated that nearly 20 million inhabitants had been born outside the United States, about 8 percent of the total population. Eight million, or 40 percent, of those born overseas became naturalized citizens. Early in the 20th century it took immigrants three generations to switch from their native language to English. At the end of the 20th century, the shift to English was taking only two generations. This is not only because of the daily exposure to English-language movies, television, and newspapers, but because entry-level jobs in service industries require more communication skills than did the factory jobs that immigrants took a century or more ago.

Ancient Immigrants and Early Cultures

The earliest arrivals of humans into the territory that is now the United States are poorly documented, but archaeological work provides an idea of when human settlement began in the Americas. Most anthropologists believe that small groups of hunters and foragers began migrating from northeastern Asia to northwestern North America at least 15,000 years ago. These ancient migrants crossed to North America during the most recent of the ice ages, when glaciers had frozen much of the world’s water on land. At that time, sea levels were lower than they are today, and a natural land bridge, called Beringia, linked present-day Siberia and Alaska. The earliest archaeological sites in North America, dated at more than 11,000 years old, indicate that humans quickly spread south and east across the continent. Separate waves of peoples migrated to the Americas over thousands of years. The last of these occurred around 4,000 years ago when the Inuit and Aleut peoples arrived in what is now Alaska from northeastern Asia. Other migrations include the Hawaiian people, who arrived from the island of Raiatea, near Tahiti in Polynesia, in the 7th century AD. More migrations to Hawaii from the same region occurred through the 13th century. For more information on the peopling of the Americas, see First Americans.

By the 15th century thousands of separate groups lived in North America, speaking hundreds of mutually incomprehensible languages and many more related languages and dialects. The cultures were as varied as the languages, ranging from agricultural, mound-building cultures in the Southeast and in the Mississippi and Ohio river valleys to the cliff dwellers in the Southwest, and from the complex fishing societies in the Northwest to the foragers of the northern Great Lakes (see Mound Builders, Cliff Dweller). These various groups were neither static nor homogeneous populations. They only seemed alike to the later-arriving Europeans, who mistakenly labeled all these groups as “Indians.” In fact, recent histories of native America show that towns and cultures emerged, prospered, and sometimes fell because of changes in climate, technology, or available resources. Warfare, diplomacy, and trade also affected native cultures and settlements. The peoples of America have always exhibited social, political, and economic diversity, and American history did not begin with European settlement.

The arrival of Europeans and Africans starting in the late 16th century brought irreversible changes. As the European population grew, conflicts developed between Europeans and Native Americans over the control of the land. From the early 17th century to the late 19th century, war, disease, and the confiscation of land, resources, and livelihoods took a severe toll on all native populations. In what are now the lower 48 states, a native population that ranged from 1.5 million to 8 million or more prior to European conquest was reduced to 243,000 by 1900. On the Hawaiian Islands, the native Hawaiian people numbered 300,000 when Europeans arrived in 1778 and only 135,000 by 1820. In Alaska, 20,000 Aleutian natives existed before contact with Europeans in the 18th century and only 1,400 by 1848.

Entire peoples and ways of life disappeared, and straggling survivors formed new nations or tribal groups with the remnants of other groups, moved to new territories, and adopted various social, economic, and military strategies for survival. Some migrated west, ahead of the advance of European migration. Some went to Canada, where westward settlement by European Canadians occurred somewhat later and where government relations with native peoples were somewhat less harsh. The overall decline of native populations masks periods of recovery and the continued resistance of native peoples, but the dominant trend was one of a steep decline in numbers. This trend was not reversed until the second half of the 20th century—by the 2000 census, 2.5 million Native Americans, including Inuits and Aleuts, lived in the United States.

European and African Immigration in the Colonies

The Europeans and Africans added new layers of complexity to the territories named the New World. European military technology, commercial wealth, and immunity to diseases such as smallpox, influenza, and tuberculosis generally gave Europeans an advantage over the original inhabitants. Yet the Native Americans knew the land and were skilled negotiators, eloquent orators, and fierce fighters. Wresting control of the land from the indigenous peoples took the newcomers some 300 years to accomplish.

Colonists established a variety of outposts for their European empires (see Colonial America). By the 17th and 18th centuries, the French had settlements around the Great Lakes and the upper Mississippi River, and at New Orleans. The Spanish established settlements in Florida, the Southwest, and California. The British entrenched themselves in New England and the South, while the Russians settled on the West Coast, and the Swedes and the Dutch on the East Coast. This short list fails to capture the ethnic complexity of early European settlement in what is now the United States. The various settlements included Scots, Welsh, Irish, Germans, Finns, Greeks, and Italians, as well as Maya, Aztec, and African slaves.

European settlements, both in the North and the South, depended on the skills and labor of indentured European servants and, particularly after 1700, of enslaved Africans. The majority of the early European immigrants were not free—60 percent in the 17th century and 51 percent in the 18th century arrived as indentured servants or prisoners (see Indentured Servitude). However, these Europeans could hope to achieve freedom at the end of their servitude. Africans were treated differently; neither they nor their children could realistically hope to attain freedom. A few Africans arriving in the New World were free men sailing the Atlantic as part of an economic network connecting Europe, Africa, and the Americas. The vast majority, however, were enslaved, purchased in various parts of Africa to work on European plantations, farms, and homesteads (see Slavery in the United States). Most Africans came from coastal West Africa and the Niger River region. Smaller numbers came from central, southern, and eastern Africa. Twenty-one percent of the population on the eve of the American Revolution (1775-1783) was of African descent, almost all working as slaves.

Ethnically and linguistically the African migration was as diverse as the European; culturally it was more so. Most Africans caught in the slave trade were skilled farmers, weavers, and metallurgists; smaller numbers were herders, hunters, foragers, or city dwellers. Some had been enslaved in their homelands and some were African royalty (see Atlantic Slave Trade). They included Muslims, Christians, and others who worshiped one god, as well as those who worshiped multiple deities, such as animists and ancestor worshipers. These involuntary immigrants faced a hard life in the New World. Their labor and skills were exploited, their specific national origins were forgotten, and their cultural traditions were partially suppressed. Yet Africans in America constructed flexible family networks that allowed their population to grow and expand in spite of enslavement. The family protected its members from some of the harshest features of enslavement and preserved elements of religious belief, vocabulary, poetic tradition, family values, craft and artistic practice, and other aspects of African heritage.

European American populations generally thrived as they expanded their control over the continent. The predominantly British Protestant settlements on the East Coast grew rapidly during the colonial period because of the immigration of women and men, nearly all of whom married and had many children. Colonial American women, free and enslaved, gave birth every two years on average, pushing the natural increase (the surplus of births over deaths) of Britain’s American colonies to twice that of the Old World. In addition, Britain absorbed the smaller Dutch and Swedish colonies on the East Coast before the end of the 17th century. The more isolated French, Russian, and Spanish Roman Catholic settlements to the west remained relatively small, in part because few women resided at these military posts, missionary compounds, and trading stations. Their geographic isolation inhibited immigration, keeping growth rates low and populations small.

Diversity and Assimilation in American Society

The American victory in the Revolutionary War united 13 of the English-speaking settlements into the largest and most powerful political unit in the territory, even though those first 13 states hugging the eastern coast seem small compared with the country’s eventual size. As a result of the Revolution, approximately 71,500 people out of a populace of some 2.5 million fled the new United States. Some were Loyalists—political or economic refugees whose loyalties to Great Britain remained strong; others were blacks seeking refuge from slavery. Immigration and the commercial slave trade after the war quickly restored the population to its former level. The Revolution also opened up the area west of the Appalachian mountains to settlement, as fur traders and farmers were no longer confined by British settlement restrictions. Pioneering citizens, immigrants, and slaves moved west, displacing Native Americans who had hoped to preserve their cultures undisturbed by the expanding United States.

The 17th and 18th centuries saw a growing importation of Africans into North America. After 1808 U.S. law forbade the importation of slaves from abroad, although some smuggling of slaves continued. Few people from Africa chose to come to the United States voluntarily, where free Africans were few in number, considered second-class citizens, and confined largely to the northern states. Large numbers of Europeans migrated to the United States in the early national period, drawn by the promise of freedom, cheap land in the West, and jobs in the first factories of the emerging industrial age. The influx of Europeans, the end of the slave trade, and the ongoing wars that displaced Native Americans meant that some of the racial diversity of the population was diminishing. By the early decades of the 19th century, a greater proportion of Americans were of western European and Protestant heritage than at the time of the Revolution.

Over the course of the 19th century, the United States gradually absorbed the French colonists in the upper Midwest and in New Orleans, Louisiana; the Spanish and Russian colonists in the South, West, and Northwest; and the territories of the Hawaiian people and other indigenous groups. Sometimes these territories were added by diplomacy, sometimes by brute force. European visitors were surprised at the diversity in nationalities and in religious and secular beliefs in early America, as well as the number of intermarriages between people of differing European heritages. There were also cross-racial births, sometimes voluntary and sometimes by force, but rarely within legal marriages. The population continued to grow through migration as well, driven in part by English, Irish, and German settlers who came in large numbers around 1848 to escape political repression and food shortages in Europe.

By 1860, 86 percent of Americans counted in the census were white (72 percent native-born white) and 14 percent black. (Most Native Americans were not included in census figures until the late 19th century.) Although the country had become more uniform, it was not homogeneous enough for some citizens. They sought at various times between the Revolution and the American Civil War (1861-1865) to delay the naturalization of foreign immigrants, to send African Americans to Liberia or elsewhere, or to discriminate against Roman Catholics. But the German and Irish immigrants of the midcentury gradually won acceptance, and free African Americans insisted on an American identity, pushing for an end to slavery and for full citizenship.

The insecure status of even free African Americans in the middle decades of the 19th century caused thousands of blacks to emigrate from the United States to Canada, especially after the Fugitive Slave Law was passed in 1850. This law required that slaves who escaped to free states be returned to their masters. Within a year, 10,000 black Americans fled to safety in Canada. By 1861, on the eve of the Civil War, 50,000 African Americans resided in Canada.

The American Civil War briefly interrupted European immigration. At the end of the war some slaveowners moved to Brazil and other places where slavery was still legal. With slavery abolished in the United States and former slaves’ status as American citizens constitutionally guaranteed, 30,000 African Americans returned from Canada to rejoin family and friends. The constitutional promises of the post-Civil War era were soon discarded. Racism in both the North and the South confined African Americans to second-class citizenship in which political and civil rights were ignored. Discrimination by race was declared constitutional in 1896 (see Plessy v. Ferguson).

The immigrant population changed dramatically after the Civil War. The majority of white immigrants had traditionally come from western Europe, but during the second half of the 19th century, many immigrants came from central, southern, eastern, and northern Europe. This influx brought in larger numbers of Roman Catholics. And for the first time there were substantial communities of Orthodox Christians and Jews. On the West Coast, Chinese and Japanese immigrants, mostly men, arrived to work in agriculture and on the railroads.

From 1880 to 1914, peak years of immigration, more than 22 million people migrated to the United States. As with earlier arrivals, some immigrants returned home after a few years. Some maintained separate ethnic and religious identities in urban neighborhoods as well as in the smaller towns of the West, while others blended into American society through marriage, education, language, employment, politics, and sometimes religion.

Restrictions on Immigration

Late-19th-century immigrants, with their different ways and seemingly strange religions, made American voters anxious enough to enact the first laws restricting immigration. Social Darwinism, with its beliefs that national characteristics or ethnic traditions were inherited, led Americans to view immigrants from nonindustrialized nations as not only economically backward but biologically inferior. It gave more-established, native-born Americans a supposedly scientific excuse for blocking immigration. Convicts and prostitutes were barred in 1875. Then paupers, the so-called mentally defective, and all Chinese immigrants were excluded in 1882. Contract workers, who were often Italian or Chinese, were also banned in the 1880s. Japanese immigration was stopped in 1907.

By 1910 African Americans made up only 11 percent of the population, and Native Americans constituted only 0.3 percent, their smallest proportions ever. For Native Americans, the population decline was due in part to the military defeat of the last of the independent nations and in part to their impoverishment on reservations. Segregation, lynching campaigns, and poverty slowed the growth of the African American population. Even though more than three-quarters of Americans were native-born whites in 1910, many citizens still felt insecure. The settlement house movement, whose most prominent advocate was social reformer Jane Addams, sought to speed the Americanization of foreign-born urban residents through education and social services. This was an insufficient response for some American citizens, and additional restrictions were placed on immigration. After 1917 only literate individuals were admitted.

The Russian Revolution of 1917 convinced many Americans that all foreigners were Bolsheviks, anarchists, or criminals (see Bolshevism, Anarchism). Fearing the importation of radical political ideas, labor unrest, and attempts at subversion, many Americans retreated into isolationism, the idea that America should separate itself from the rest of the world. In 1921 and 1924 Congress mandated a quota system for immigration, which soon became based on European ethnicities present in the United States in 1890, before many eastern Europeans had arrived. This granted 80 percent of the 150,000 annual visas to immigrants from western Europe, leaving only 30,000 visas for immigrants from other countries.

The Great Depression of the 1930s only sharpened feelings against foreigners in America. With anti-immigrant feelings running high and with jobs being scarce, more people emigrated from the United States than arrived during the 1930s, the first period of negative migration since the Revolution. The emigrants included an estimated 500,000 Mexican Americans, many of them U.S. citizens or legal immigrants, who were forced out of the country on the grounds they were taking jobs from supposedly real Americans, that is, those of western European descent. This decade also saw the lowest population growth rate in the history of the United States.

Not only did old-stock Americans fear eastern and southern Europeans, Hispanics, and Asians, but anti-Semitism was also commonplace in the early 20th century. This was especially true after the turn of the century, when immigration produced a substantial eastern European Jewish presence in the cities. After World War I (1914-1918), the children of these immigrants sought admission to high schools and colleges and entered skilled and professional occupations; many Christians responded with fear. Quotas enforced during the 1920s limited immigration from countries with large numbers of Jewish emigrants. Colleges, professional schools, and businesses barred Jews entirely or admitted only a few during this period. Through the first half of the 20th century, towns and individual householders barred Jews from buying real estate by including restrictive covenants in property deeds, a practice known as “gentleman’s clauses.” Although 102,000 Jewish refugees escaping Nazi Germany were admitted into the United States before World War II (1939-1945), many more were refused entrance. As a consequence of this policy, some died in German labor and death camps.

Immigration in 20th- and 21st-Century America

After the war, revelations about the full extent of Nazi racism in Europe led to reevaluations of American immigration policy and to special legislative exemptions to the quota system. More than 93,000 Jews immigrated to the United States from 1946 to 1949. War brides, displaced persons, refugees, orphans, and other people caught in postwar political changes or in the later conflicts of the Cold War were also granted permission to enter the country. At first these were Russians, Czechs, and Belorussians, but later they included Cubans, Vietnamese, Cambodians, Hmong, Iranians, and others. The number of immigrants was relatively small, and Americans thought of themselves as relatively homogeneous in the 1950s and 1960s, a feeling bolstered by the all-white images dominating the nation’s television screens. In 1960, 83 percent of Americans were native-born whites.

The civil rights movement, which peaked from 1955 to 1965, renewed concerns about racism and issued a clear call to fulfill constitutional guarantees of human equality. Racial prejudice, anti-Semitism, anti-Catholic sentiment, and other forms of discrimination became less acceptable, as did the image of the true American as white, northern European, and Protestant. This change in attitude helped bring an end to national quotas for immigrants. In 1965 family members of those already living in the United States were given priority in immigrating without regard to national origin, as were highly skilled individuals, but migration from Asia was placed under a separate quota system that applied only to the Far East. By 1978 this provision was lifted, and all immigrants were treated equally.

Because of changes in U.S. immigration law and in economic and political conditions worldwide, the number of immigrants to America resurged in the last quarter of the 20th century. Immigrants from the Pacific Rim, including Filipinos, Chinese, and Koreans, as well as immigrants from American dependencies in the South Pacific, arrived on the West Coast and settled throughout the United States. Mexicans, Guatemalans, Costa Ricans, Caribbean peoples, and South Americans sought asylum and opportunity in the United States, particularly in the Southeast and Southwest and in large cities. Indians, Pakistanis, Arabs, Iranians, and others sought an outlet for their skills. These new flows of immigrants added Buddhism, Islam, and Hinduism to the mix of religious beliefs in America. Hispanic Americans became the fastest-growing segment of the population by the end of the 20th century. The effect of immigration was not felt uniformly throughout the United States. Immigrants tended to congregate in the more densely populated areas of the United States: California, Florida, Texas, and the Northeast.

Although most immigrants entered the country legally, some did not. According to official estimates, approximately 5 million illegal immigrants resided in the United States in 1996, most from Mexico, El Salvador, Guatemala, and Canada. Concern over immigration, particularly illegal immigration, increased during the 1980s and 1990s. In the last decades of the 20th century, immigration laws were amended to restrict the flow of all immigrants, to deport illegal aliens, and to deny benefits to those already living in the country legally. This wave of antiforeign sentiment was based on fears of tax increases for schooling immigrant children, for social services, and for health care, although illegal immigrants who work (albeit without legal status) pay wage and sales taxes that help support education and social services. Some citizens were also concerned about increased competition for jobs, even though immigrants frequently fill positions that American citizens do not want.

Other Americans, however, welcomed these new additions to American culture. Some employers depended on immigrants to harvest the nation’s crops, sew garments, or wash dishes in restaurants, jobs that many U.S. citizens found unattractive. Doctors and health professionals recruited from overseas were often hired to staff small-town hospitals in places where American professionals felt socially isolated. Businesses and universities welcomed foreign-born engineers and computer programmers because relatively few American students pursued these fields of study. A lottery system for entrance visas was designed to maintain the diversity of the immigrant pool by selecting a limited number of migrants by chance rather than by national origin.

According to the 2000 census, 70.9 percent of Americans were non-Hispanic whites, and the populations of blacks, Hispanics (who may be of any race), Native Americans, and Asian and Pacific Islanders were increasing. The Native American and African American populations grew, reversing 19th-century declines in their share of the total population. Migration from the Caribbean and smaller flows from various parts of Africa created the first substantial influx of free people of African descent in the nation’s history. The Census Bureau released updated estimates for the diversity of Americans in 2006, based on information compiled by regularly sampling thousands of households in 2005. Non-Hispanic whites were about 67 percent of the total population. Minorities made up 33 percent of Americans, with Hispanics the largest group at 14.4 percent and blacks second at 12.8 percent. The foreign-born share of the population increased over the five-year period since the last census to about 12.4 percent. Over half the immigrants came from Latin America, followed in number by groups from Asia.

Racism

These broad categories only hint at the full ethnic and racial diversity of the American population; conversely, the use of separate categories masks the many characteristics Americans share. The United States has been described as a melting pot where ethnic and racial groups shed their specific traits and join with other Americans to create a new identity. The nation has also been described as a salad bowl where people of different backgrounds mingle at work and school, in civic responsibilities, and as consumers, but where cultural traits remain distinct.

In the 18th century American statesman Benjamin Franklin feared that Germans could never be assimilated because of their foreign ways. In the middle of the 19th century many thought that Irish Catholics would subvert the American way of life. At the end of the 19th century the Chinese, Japanese, Jews, Italians, and others were mistrusted. Yet these groups eventually became part of mainstream America. At the end of the 20th century, many people considered newer Asian immigrants, Spanish-speaking peoples, and Muslims to be permanently alien presences. If the past is a guide, these groups too will meld into the general American citizenry.

The main exceptions to full acceptance remain Native Americans and African Americans. Native Americans have a dual status based both on the existence of reservations and vibrant tribal traditions, and on the prejudices of many non-Native Americans. African Americans bear the brunt of the oldest and most deeply rooted of American prejudices.

Initial contacts between Africans and Europeans often began with misunderstanding. Africans at first thought white-skinned people were ghosts looking for people to eat, since white was the color of death in much of Africa. Europeans sometimes assumed black-skinned peoples were followers of the devil and therefore sinful, since black was the traditional color associated with lies, sin, and evil in the Western world. Differences in religion, language, and customs also led to misunderstandings, even while economic similarities favored trade between African kingdoms and European empires.

When European merchants brought the first enslaved Africans to work in their New World, they justified the enslavement on the premise that Africans were not Christian and were supposedly not civilized—in other words, the Africans were considered culturally inferior. By the 18th century, many enslaved African Americans had converted to Protestant Christianity, spoke English, and expressed a desire for freedom. A few people of African descent had, against all the odds, become poets, doctors, almanac publishers, plantation owners, and antislavery activists. It became harder for whites to claim that Africans would always be culturally inferior. Pro-slavery whites then began to justify permanent enslavement by asserting that Africans were somehow biologically inferior to Europeans. Whites claimed that anything accomplished by people with black skin was inferior, that blacks were intellectually and morally incapable of self-government, and that blacks needed to be controlled by whites. This so-called scientific racism based on presumed biological differences was useful in slaveholding areas for protecting the economic interests of slaveholders and useful in non-slaveholding areas for uniting all the different, and potentially conflicting, European ethnicities under the label “white.”

Racial discrimination grew out of the practice of enslavement but outlasted the institution of slavery. European newcomers could find common ground with the majority of Americans by joining in the denigration of African Americans. Poorer whites or socially marginal whites could feel superior by virtue of their skin color, even if they were not economically successful or fully accepted by their peers. Racism helped to create a sense of unity among white Americans by defining who was a full citizen. White racism also united African Americans, whatever their backgrounds, through shared experiences of discrimination and suffering.

Freedom in the wake of the Civil War was a first step in eradicating this prejudice. The civil rights era of the mid-20th century saw even more advancement, but prejudice against black Americans has not been entirely eliminated. At the beginning of the 21st century, a relatively small number of white people still opposed a race-blind America that would deny them a feeling of racial superiority. Some of these people form militia, fascist, and vigilante groups that use violence against African Americans, the federal government, and others who challenge their restrictive views. The majority of Americans, however, while sometimes reluctant to change, believe that all people are created equal.

Americans tend to think in terms of a biracial, separated society, even though whites and blacks have jointly built the United States, and even though the family histories of whites, blacks, and other races are often intermixed. In addition, the two groups share many beliefs (such as freedom, liberty, and civil rights) and customs (from poetry to sports and from work to holidays). Yet the idea of racial difference, of superiority and inferiority, still provides the basis for many social, cultural, political, economic, and religious divisions in the United States.

Growth through Natural Increase: Births

While the influx of immigrants contributed to the growth of the American population and helped build American society, the major factor affecting population growth in the United States has always been the surplus of births over deaths, or the natural increase of the population. American women at the beginning of the 21st century bear an average of two children over the course of their lives. Their great-grandmothers and great-great-grandmothers in 1890 had an average of four children, because in the 19th century fewer women had access to reliable methods for controlling fertility. A century earlier, around 1790, women might expect seven births throughout their lives, if they survived into their late 40s.

Birthrates in Early America

Little is known of the birthrates of Native American societies before the arrival of Europeans. There are hints that the birthrate was relatively low because Native American women often breastfed their infants for three or four years. Since breastfeeding has a contraceptive effect, it appears that women gave birth about every four years. On the other hand, since many Native American women traditionally married soon after the onset of puberty, at around age 15, they might have had six or seven children if they lived to at least age 45. Some researchers have suggested that when European diseases and warfare killed large numbers of native peoples, women increased their childbearing in order to compensate for the excessive deaths in the community. Native Americans may have gone from low birthrates to high birthrates, but any increases in fertility could not make up for the deaths from disease, starvation, and war. The birthrate among Native Americans would not produce population growth until the 20th century.

European colonists had high birthrates compared with the birthrates in Europe at the time. Free, white colonial women typically bore children every two years and had an average of eight children, four of whom might survive to adulthood. This was twice as many children as European families had. Fertility was higher in the colonies because of the need for labor on colonial farms, the availability of land to support the larger numbers of children, and early and nearly universal marriage.

The enslaved African American population in the 17th century had more men than women and more deaths than births. By the 18th century the ratio of black men to black women was more equal and the population was holding its own. By the early 19th century the African American population was growing rapidly, but because of higher death rates and the absence of immigration after 1808, the overall growth of the African American population remained lower than that of the white population. African Americans became an increasingly smaller proportion of the population from the late 18th century to the early 20th century.

Declining Birthrates

The European American population doubled every 20 to 25 years until late in the 18th century, after which birthrates began to decrease and growth rates slowed. This decline in fertility rates early in America’s history is a distinctive characteristic of American society. In the early 19th century white women who lived through their childbearing years were bearing 7 children over the course of their lives; by 1850 it was 5.4 children, by 1950 it was 3.0, and in 2009 it was 2.1. While the longer-established American population experienced a decline in fertility and family size during the 19th century, newer immigrants had higher birthrates. It took two or three generations for these immigrants to conform to the prevailing American fertility standards.

Women’s Education and Birthrates

Decisions to limit family size are based on complex personal, social, and economic factors. The beginning of any fertility decline is most strongly linked to increased education for women. Female academies appeared after the American Revolution, public schooling became common in the early 19th century, and the first women’s colleges and coeducational institutions were created in the mid-19th century. Women read novels, newspapers, magazines, and religious tracts. Women learned about individuality and self-control and about planning for the future, and they applied these concepts to fertility. They established reform groups and literary and religious societies, indicating their interest in the world outside of marriage and childbearing.

Although wives in early America had been most concerned with the production of food and clothing, 19th-century families became child-centered, and motherhood was exalted as a special calling requiring education. Women had fewer children in order to provide each child individualized attention and the best possible upbringing. Declining fertility rates also reflected the increased cost of child-rearing during the industrial age, as advanced education became increasingly necessary; housing, food, and clothing costs rose; land became scarcer and more expensive; and child labor became less acceptable. Instead of being a potential source of income, children became a major expense, as well as more cherished individuals who deserved every opportunity. African American birthrates, which were high under slavery, fell rapidly once freedom was achieved in the wake of the Civil War, when families could hope to provide the best possible education for their children. By the end of the 19th century, most families were investing substantial amounts of time and money in each child’s future. Parents did not want to shortchange their children and so had fewer.

Birth Control

Women attempted to control childbearing in various ways, including prolonged breastfeeding, abstaining from sex, taking herbal remedies, jumping rope, horseback riding, and having abortions. By the early 19th century, condoms, originally intended to prevent the spread of sexually transmitted infections, were being used to prevent pregnancy. The vulcanization of rubber after 1839 and the invention of latex in World War I (1914-1918) made condoms, cervical caps, and diaphragms more widely available. From 19th-century newspaper advertisements, it seems that abortion was a common method of controlling family size. These abortions were usually performed by untrained men and women, some of whom were skilled but many of whom were not. Doctors, who were organizing the first state and national professional organizations during the mid-19th century, saw these abortionists as unprofessional competitors and a public danger. Concern about the safety of abortion led to the first state laws, enacted just before the Civil War, restricting the practice.

By the 1870s religious reformers who were worried about prostitution and the perceived spread of vice and sin began to connect contraception and abortion with immorality. The Comstock Law of 1873 declared birth control and abortion information obscene and banned it from the U.S. mail. Many states passed laws against contraception. One reason people supported bans on birth control was the fear that immigrant groups, who tended to have larger numbers of children than native-born white Americans, would come to dominate society if white, Protestant women did not have more babies. Despite the Comstock Law, birthrates continued to fall.

A small number of reformers spoke out publicly in favor of birth control. The most famous of these advocates was Margaret Sanger, who in 1921 founded the organization that would become Planned Parenthood. Sanger worked to help poorer women obtain what was still illegal information on birth control. Planned Parenthood led the fight to have the Comstock Law overturned.

The Comstock Law was declared unconstitutional in 1938, although state laws against birth control remained. In 1965 the Supreme Court of the United States struck down the last of the state laws against contraception, asserting that married men and women have a right to privacy. That right was extended to unmarried persons in 1972. In 1973 abortion was legalized in the United States. Since then various restrictions have been placed on abortion, and the issue is one of the most divisive in contemporary America.

Birthrates Since World War II

Birthrates decreased steadily until the Great Depression in the 1930s, when they suddenly dropped 24 percent in a decade, reaching unprecedented lows in the mid-1930s. Families felt they could not afford more children during this prolonged economic crisis. There were also relatively few births during the crisis of World War II as couples feared for the future and as husbands and wives were separated because of military service.

Baby Boom

After World War II birthrates shot up, and by the mid-1950s were 30 percent higher than during the depths of the depression. This unprecedented upward movement in fertility levels produced a baby boom that was both a result of postwar prosperity and a reaction against the deprivations of the depression and war years. This boom helped fuel the growth of suburbs in the postwar period. The baby-boom generation had lasting effects on America. Education costs soared as this generation of children reached school age. The youth culture of the 1960s reflected, in part, the dominance of adolescent and young adult baby boomers. And because baby boomers will begin retiring in the early decades of the 21st century, the solvency of the Social Security system has become a major concern. Fertility rates declined again after the mid-1950s, although the 76 million baby boomers born between 1946 and 1964 contributed to a second, smaller baby boom in the 1970s and 1980s as they reached adulthood and started having children of their own.

A number of changes affected fertility rates in the 1950s. Many married women who had taken temporary jobs during the crisis of World War II now sought permanent positions. As these women moved into the workforce, they demanded more effective methods of birth control. By the 1960s new forms of contraception were available, including the birth control pill, intrauterine devices, and surgical techniques for permanently inducing infertility, such as tubal sterilization in women and vasectomy in men. At the end of the 20th century, 64 percent of women between the ages of 15 and 44 reported using birth control. Since 1957, the trend in the total birthrate has been downward.

New Attitudes Toward Sexuality

While these new technologies offered more effective control over fertility, new attitudes toward sexuality in the 1950s stressed impulsiveness, innovation, and experimentation—all of which discouraged the use of birth control devices, especially among young, unmarried couples. One result was that teenage pregnancies and births outside marriage soared in the 1950s. Teenage pregnancies declined in the 1960s and 1970s, surged again in the late 1980s, and then declined sharply in the 1990s. By 2000 the teenage birthrate was down to 49 births per 1,000 women aged 15 to 19. Out-of-wedlock births, once comparatively rare, increased dramatically after World War II, and more than a third of all infants in the United States are now born outside of marriage.

Educational and Racial Differences in Birthrates

Fertility rates declined among all major groups of Americans in the last decades of the 20th century, in keeping with the trend since the late 18th century. One reason for this trend has been the increase in educational opportunities for women. Women’s educational levels affect births. Most college-educated women who have children wait until their 30s to do so, after finishing their education and establishing a career. Other women begin bearing children earlier and continue bearing children later in life.

The education level of parents also affects childbearing. The children of college-educated parents are less likely to be sexually active at age 15 than the children of those who have not completed high school.

Rates of birth outside marriage differ significantly among American subcultures. From the 1930s through the 1970s, the rates for unmarried white women giving birth remained below 10 percent. This rate increased, but was still under 20 percent in the 1980s. It increased in the 1990s, reaching 26.7 percent in 1999. The rate of black children born out of wedlock in 1999 was 68.8 percent; this is high in part because married black women have few children. A desire to enhance the opportunities available to their children and fears about the discrimination their children might face inhibit many married African American couples from having more children. Unmarried couples of all races tend to be more impulsive about sexuality and childbearing. The percentage of births to unmarried Hispanics in 1999 was 42.1 percent.

Better-educated women and men of all groups—black, white, or Hispanic—are more likely to bear children within marriage than individuals with less education. Black women, married and unmarried, have a far higher rate of unintended or unwanted pregnancies than other groups, more than half of all pregnancies. This may indicate less access to suitable birth control technologies. Hispanic women have the largest number of children among major groups—3.1 children on average, compared to 2.2 for blacks, 2.1 for Native Americans, 2.1 for Asians and Pacific Islanders, and 2.1 for whites.

The causes for the recent changes in births and marriage are poorly understood. But because births outside of marriage, early sexual experimentation, and early childbearing are so strongly linked to educational levels, and because educational achievement is itself linked to wealth, the rise in out-of-wedlock births may be a function of the changing U.S. economy. Since the 1970s the industrial base of the United States has been eroding, and with it many good-paying jobs. In 1979 the typical middle-class worker earned $498 a week. In 1995 he or she earned $475 a week (adjusted for inflation). Income for the poorest fifth of Americans fell 0.78 percent a year between 1973 and 1993. Industrial employment has been replaced by service work, which rewards highly educated, computer-savvy workers well but which tends to pay the majority of workers low wages. Rapid economic change, financial stress, and anxiety about the future may undermine the ability of couples to form more stable unions and have children within marriage.

Growth through Natural Increase: Deaths

Fertility rates are not the only factor influencing population growth. The population also grows when people are healthier and therefore live longer. Just as the birthrate has been steadily declining in the United States, so, too, has the death rate.

American babies are healthier than ever before in this country’s history and 99.3 percent will survive to their first birthday. Although the records from a century ago are incomplete, they indicate that only 84 percent of infants survived their first year. And a century before that, about 80 percent of infants may have lived to their first birthday. Most of the improvement in infant health has come in the 20th century and is due to improved childcare, better medical care for mothers and children, better sanitation, and the development of antibiotics.

Children born in 2009 can expect to reach age 75.7 if they are male and age 80.7 if they are female. Around the turn of the 20th century, the average life expectancy for women was 48, and for men it was 46. A century earlier, when childbirth was more dangerous, women had the lower life expectancy, around 35, compared with 37 for men.

Americans are living longer because medical care and public sanitation have improved substantially. However, infant survival and life expectancies are lower in the United States than in other developed countries because of disparities in wealth, education, and access to health care. In Japan in 2009, men could expect to live to age 78.8 and women to 85.6; in Sweden men could expect to live to age 78.6 and women to 83.3. In western Europe, the infant mortality rate is about 5 deaths per 1,000 births; in Japan it is 2.8; in the United States it is 6.3.

In the American population, wealthier people live longer, healthier lives than do poorer people. Great differences between rich and poor can produce poor health for the poorest citizens. From the 1920s to the early 1970s, America experienced an expansion of the middle class. Since then, the rich have nearly doubled their share of the country’s wealth. Hopelessness and rage can lead to substance abuse, violence, and mental depression, which can negatively affect health and longevity. More direct effects of poverty that shorten life spans for the poorest populations include malnutrition, exposure to extremes of heat and cold, and lack of medical attention.

More cohesive communities with a more equitable distribution of income and goods, even if relatively poor, tend to have better overall health than those with great disparities in wealth. For example, in the early 1990s the District of Columbia, where there are great disparities between the wealthy neighborhoods and the majority of poor neighborhoods, had an overall life expectancy of 62 for men and 74 for women. In Kansas, where the median household income was below that of Washington, D.C., but where the social differences are less sharply defined, the life expectancy was 73 for men and 80 for women.

Life expectancies also differ substantially by ethnicity and race. In 1999, whites, who tend to be wealthier, had a life expectancy of 77.3, and blacks, who tend to have less wealth, had a life expectancy of 71.4. This is, however, a smaller gap than once existed.

As noted earlier, women have a longer life expectancy than men. This is because women have a somewhat stronger immune system and suffer less from stress-related illnesses and from alcoholism, drug abuse, and violence. Because of the longer female life span, the U.S. population had more women than men in 2009—156 million women compared to 151.2 million men. Up to age 30, however, men outnumbered women in the United States, for two reasons: slightly more males are born than females, and slightly more young men immigrate into the United States than women.

Disease and Death in Early America

The small groups of people who migrated to the Americas from Asia thousands of years ago brought few germs with them. Although accidents and malnourishment were always possible, few infectious diseases were present in the Americas. When explorers and settlers arrived from densely populated Europe, they introduced diseases such as smallpox, measles, influenza, tuberculosis, whooping cough, scarlet fever, malaria, and gonorrhea. Africans brought smallpox as well, along with yellow fever, dengue fever, and malaria. Most Europeans and Africans had stronger immunities to the common diseases of their homelands, and Africans had discovered how to inoculate themselves against smallpox. Native Americans had no immunity to these imported diseases, and they died in large numbers. One estimate indicates disease was responsible for reducing native populations by 25 to 50 percent (in comparison, warfare reduced native populations by about 10 percent during the 18th and 19th centuries). Some Native American nations became extinct. Starvation and dislocation lasting into the 20th century also contributed to high death rates among Native Americans.

The earliest European settlers in the 17th century experienced high death rates. In Virginia, only about a third of the 104 people who came from England in May 1607 survived eight months after arriving. By 1624, about 7,000 settlers had come ashore, but only about 1,200 remained alive. The emphasis on searching for gold and quick profits meant that these first colonists paid little attention to producing food, building houses, or establishing permanent settlements. Starvation, exposure to the elements, and war with the native peoples caused large numbers of deaths. Half of the first settlers in New England did not survive the first winter, in 1620. However, the death rate decreased sharply in the north as colonists arrived in family groups and quickly created farms and towns to provide economic support. As a consequence of the low death rate, the population in the north grew rapidly without the need for many additional immigrants.

At first more European men than women lived in the south, and the southern population grew more slowly than the northern population. Deaths matched or surpassed births. The hotter climate in the south bred diseases such as malaria and dysentery, and European laborers frequently died of these and other semitropical diseases. Africans, who were imported to labor in the fields, were susceptible to lung diseases, but had some protection against malaria and yellow fever, and against smallpox if they had been inoculated in their homelands. African slaves shared their knowledge of smallpox inoculation during the 18th century, and an English physician developed a vaccine against smallpox at the end of the 18th century. Even so, most diseases remained untreatable because the causes of illness were not understood.

Another source of disease emerged when large cities grew up around northern ports in the 17th and 18th centuries. These early cities were dirty places that grew haphazardly, without provision for clean water or sewage disposal. They served as ports of entry not only for travelers and immigrants but also for the diseases these voyagers brought with them. Epidemics of smallpox, yellow fever, measles, mumps, scarlet fever, and influenza frequently swept through the cities, while the isolated countryside was often spared these devastating illnesses. Among the worst of these was a series of yellow fever epidemics that hit Philadelphia in the 1790s. Ten percent of the population died in 1793, and smaller epidemics occurred in New York, Harrisburg, and other cities.

Improved Sanitation

These outbreaks prompted the first concerted efforts at health reform in the late 18th and early 19th centuries. Major northern cities began constructing central water systems and collecting garbage. Central water systems gave people in the largest cities cleaner water for drinking and more water for washing, and they made obsolete the rain barrels where disease-carrying mosquitoes bred. Cities invested in nuisance abatement, which included measures such as draining swamps and flooded areas, cleaning outhouses untended by landlords, tearing down abandoned housing, killing rats and mice, rounding up stray dogs, supervising cemeteries and burial practices, enforcing sanitation measures and market inspections, removing trash, and cleaning streets. Cities also enforced the quarantine of arriving passengers until all aboard appeared healthy. Merchants often protested when their ships were quarantined, but they were persuaded of the effectiveness of such measures after quarantines helped diminish death rates during the cholera epidemics of the 1840s.

By the middle of the 19th century, these civic reforms made the northern cities healthier than the countryside. Rural areas, however, could not afford the public health measures that improved conditions in the largest and most prosperous cities. Cholera was a major killer on wagon trains heading West. Yellow fever, malaria, hookworm, and other maladies still prevailed in the South, which experienced major yellow fever epidemics in the 1850s and in 1873. These epidemics led to the creation of the National Board of Health and a federal quarantine system.

In the mid-19th century, the development of the germ theory, which stated that microorganisms cause infectious diseases, helped people understand how diseases were transmitted. Antiseptic procedures began to be used, saving many lives in surgery and childbirth. Concerned individuals and private groups carried on much of the early fight against germs and disease. Mothers sought to improve health by attacking the germs that might harm their families. They taught their children to brush their teeth, use a handkerchief when blowing their nose, cover their mouths when coughing, wash with soap, and never spit. This concern for health and sanitation even helped fuel the woman’s suffrage movement, as many women demanded the right to vote in order to push for clean water, clean streets, and the pasteurization of milk. In the second half of the 19th century, the health and longevity of African Americans and their children improved substantially after the end of slavery enabled them to form permanent families. Enslaved children had been undernourished, poorly clothed, and denied education. When plantation owners no longer made the decisions about child care, children were healthier and better educated. And after 1867 the Granger movement, which brought farmers together to solve common problems, helped raise standards of sanitation on farms.

By the turn of the 20th century, the United States was a major center for medical research, and vaccines, antiseptic methods, and preventive measures substantially improved medical care. One estimate is that by 1910 a patient had a 50-50 chance of benefiting from a doctor’s care. As the 20th century began, deaths from communicable diseases were generally declining, although deaths from tuberculosis and influenza remained significant. At the same time, degenerative diseases of old age, such as heart disease, started to become more common causes of death.

Improvements in medicine, sanitation, and health, however, were countered by rapid industrialization of the United States in the late 19th century, which created air and water pollution, overcrowded cities, and substantial pockets of abject poverty in urban and rural areas. The Progressive movement of the late 19th and early 20th centuries addressed the health problems of the urban poor. Its many reforms included meat inspections, the Pure Food and Drug Act, and pasteurization of milk. State and federal governments began to enforce public health measures. The well-being of residents was no longer only a personal or a municipal matter, as state and federal agencies began to bring health reforms to larger numbers of Americans.

The New Deal, the government’s program in the 1930s to counteract the effects of the Great Depression, continued the Progressive agenda of improving health and sanitation. It was particularly effective in improving conditions in the South, which lagged behind the health advances made in the North. This regional disparity was largely because the rural, agricultural South lacked the financial resources of the industrial North. The Civil Works Administration, a New Deal agency that provided work relief in 1933 and 1934, targeted malaria as a severe problem in the South. One aspect of the agency’s activities was building improved housing with screened windows to keep out disease-carrying mosquitoes.

Better Health Care

Access to modern medicine also began to equalize with the New Deal. After 1935 the Social Security Administration began to provide medical aid to children, pregnant women, and the disabled. During this time, private, commercial health insurance began to be developed. In 1929 a group of schoolteachers in Dallas, Texas, contracted with a local hospital to provide health coverage for a fixed fee. Shortly thereafter, the American Hospital Association created Blue Cross and Blue Shield to offer health insurance policies for groups. Health maintenance organizations (HMOs) were developed in the 1940s but did not become widespread until the 1980s.

Higher levels of medical care reached millions as people joined the armed forces during World War II. Community health also improved in many rural areas near military bases, as the government modernized water systems and sewage plants, exterminated mosquitoes and other disease-carrying insects, campaigned against sexually transmitted infections, and provided direct medical attention to civilian workers at the bases.

The federal Department of Health, Education, and Welfare (now the Department of Health and Human Services) was created in 1953. It underwrote the construction of hospitals and clinics and provided funds for medical research. Medicare and Medicaid were added to the Social Security laws in the mid-1960s to offer medical care to the elderly and to the needy. In the 1970s the federal government funded toxic waste cleanups and promoted clean air and water.

Modern antibiotics—including the sulfa drugs introduced in the late 1930s and penicillin, first used widely during World War II—became available to the American public in the postwar years. These drugs provided the first effective weapons against bacterial infections, and their use transformed medicine in the 1950s. Medical researchers in the 1950s also developed new vaccines, including one against polio. The annual death rate in 1940, before the availability of the new antibiotics, was 10.76 deaths per 1,000 people (age-adjusted to discount shifts in the population’s age structure, such as the postwar baby boom); by 1960 it had fallen to 7.6 per 1,000.
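
The age adjustment mentioned above is usually done by direct standardization: age-specific death rates are averaged using the age distribution of a fixed standard population, so that comparisons between years are not distorted by changes in the age mix. The short Python sketch below illustrates only the arithmetic; the rates and population shares are hypothetical, not official statistics.

# A minimal sketch of direct age standardization, the technique behind
# "age-adjusted" death rates. All numbers below are hypothetical; real
# calculations use official age-specific rates and a standard population.

# Age-specific death rates (deaths per 1,000 people in each age group).
rates_per_1000 = {"0-14": 1.0, "15-44": 2.0, "45-64": 9.0, "65+": 50.0}

# Share of a fixed "standard" population in each age group.
standard_shares = {"0-14": 0.25, "15-44": 0.45, "45-64": 0.20, "65+": 0.10}

# The age-adjusted rate is a weighted average: it estimates what the overall
# death rate would be if the population had the standard age structure, so
# that comparisons across years are not distorted by shifts such as the
# postwar baby boom.
adjusted = sum(rates_per_1000[g] * standard_shares[g] for g in rates_per_1000)
print(f"Age-adjusted death rate: {adjusted:.2f} per 1,000")  # 7.95 per 1,000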

Current Trends

Since those days of miracle drugs, however, the rates for cancer have risen, despite considerable improvements in treatment. Cancer and heart disease were the leading causes of death in the United States at the beginning of the 21st century, in part due to the aging of the American population and the successes in curing other diseases. Another reason these diseases became more common is the unhealthy lifestyle of many Americans, who eat high-fat foods and high-calorie snacks and do not exercise enough. In addition, pollution is a suspected cause of cancer.

Additionally, new diseases emerged and old diseases resurfaced in the last quarter of the 20th century. The most serious of the new diseases was acquired immunodeficiency syndrome (AIDS). In 1995 it ranked as the eighth leading cause of death in the United States, but it has since declined significantly. Some diseases—such as tuberculosis, once thought to be nearly wiped out by antibiotics—developed resistance to the drugs most commonly used to treat them. Cases of tuberculosis increased during the 1980s and decreased only after 1991, when the government began taking aggressive steps to halt the increase.

One significant cause of death in the United States in the 20th century was unrelated to disease. Over the course of the century, homicide rose from insignificant levels to become a major cause of death. In 1998 it was the number-three cause of death among children ages 1 to 4, the number-four cause among children ages 5 to 14, and the number-two cause among young adults ages 15 to 24. Only after age 45 did homicide cease to be a major cause of death. While homicide rates in the United States remain higher than in other industrialized nations, in the 1990s the homicide rate began to fall dramatically. In 1991 there were 9.8 homicide victims for every 100,000 people in the United States; by 1999 the rate had decreased to 5.7 victims per 100,000.

AGE OF U.S. POPULATION

America’s population is growing because more people are being born than are dying and because immigrants, most in their late teens or early 20s, are still coming to the United States. This combination means that the American population is younger than the populations of other developed nations. In 2001, 21 percent of the population in the United States was under the age of 15. This compares with 18 percent in Europe and 15 percent in Japan. Because the U.S. population is young, education costs are higher in the United States. Another effect of the increased number of young people is a larger market for goods and services. Furthermore, these young people will eventually contribute to Social Security to help support the elderly. A younger population also means a smaller proportion of older people. In 2001, 13 percent of the U.S. population was over age 65, compared with 18 percent in Japan and 15 percent in Europe.

The average age of the American population is, however, older than it once was, and projections indicate the percentage of the population over 65 will continue to increase through the first quarter of the 21st century. In the first census of the United States, taken in 1790, about half of the white male population was under the age of 16. This extremely youthful society was a result of the high birthrate and the relatively low life expectancy that prevailed in the 18th century. No figures exist on the elderly at that time, but the percentage was undoubtedly quite small. By 1890 the proportion of the population under age 15 had fallen to 35.5 percent, in large part because of the declining birthrate. Only 3.9 percent of the population was over age 65. The median age of the population—that is, the age at which half the people are older and half are younger—had risen to 22. By 2001, the proportion of the population under age 15 had fallen to 21.1 percent, while 12.6 percent of the population was over age 65. The median age in 1990 was 32.8, and according to estimates it had increased to 35.9 by 2001.
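
For readers unfamiliar with the statistic, the short Python sketch below shows how a median age is found: sort the ages and take the middle value (or the average of the two middle values). The ages used here are hypothetical, not census figures.

# A small illustration of the median age defined above: the age at which
# half the people are older and half are younger. The ages are hypothetical.
ages = [3, 9, 15, 22, 28, 35, 41, 47, 55, 63, 71]

def median_age(values):
    ordered = sorted(values)
    mid = len(ordered) // 2
    # With an odd count the middle value is the median; with an even count
    # the median is the average of the two middle values.
    if len(ordered) % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

print(median_age(ages))  # 35 -- five ages are lower, five are higher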

The rapid increase in the median age between 1990 and 2000 was the result of the aging of the baby-boom generation—people who were reaching their 30s, 40s, and 50s. The number of children under age 5 increased by only about 4.5 percent during these years, while the number of people between ages 50 and 54 increased by some 55 percent. The number of those between 65 and 69 years of age actually decreased between 1990 and 2000, a reflection of the decline in the birthrate during the depression of the 1930s.

Age differences also vary by ethnicity and race. The median age in 2000 for the non-Hispanic white population was 37.7, for non-Hispanic blacks 30.2, for Native Americans 28.0, for Asian and Pacific Islanders 27.5, and for Hispanics 24.6. These differences stem in large measure from differences in birthrates.

Economists look carefully at the proportions of the population under age 15 and age 65 and over. They assume people in these age groups do not hold paying jobs and therefore depend for support on those of working age (ages 15 to 64). The proportion of dependents (meaning nonworking people) to working-age people suggests the productive capacity of the economy and the social expenses of providing for the nonworking population. In 1790 the proportion of workers to dependents was roughly 50-50. Supporting so many dependents absorbed substantial proportions of social resources and thus slowed economic growth. By 1890 the proportion in America had shifted in favor of those of working age, and about 40 dependents existed for every 60 workers. In the late 1990s there were 35 dependents for every 65 workers.
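
The arithmetic behind figures such as "40 dependents for every 60 workers" is a simple ratio of age-group counts. The Python sketch below shows the calculation with hypothetical population counts; actual analyses would use census counts for the under-15, 15-to-64, and 65-and-over groups.

# A minimal sketch of the worker-dependent split described above. The
# population counts are hypothetical.
population = {"under_15": 60_000, "working_age_15_64": 185_000, "65_plus": 37_000}

dependents = population["under_15"] + population["65_plus"]
workers = population["working_age_15_64"]
total = dependents + workers

# Express the split as "dependents per 100 people," matching phrasing such
# as "about 40 dependents existed for every 60 workers."
dependent_share = round(100 * dependents / total)
worker_share = 100 - dependent_share
print(f"{dependent_share} dependents for every {worker_share} workers")  # 34 for every 66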

At the beginning of the 21st century, the proportion of elderly people in the population was increasing, meaning that there were fewer workers per dependent over 65. With the oldest members of the baby-boom generation expected to reach retirement age in 2011, the ratio of workers to dependents will drop even further. This aging of the population poses complex questions: how to fund the Social Security system, whether to make medical insurance more widely available, who should pay for long-term care of the elderly, and what retirement will mean. It is unclear how old age will be experienced in the future. The division of social resources between the youngest and the oldest Americans, for example, between schools and retirement communities, has become a matter of considerable debate.

Within the United States, the age structure of the population varies from one region to another and is influenced by people moving into and out of particular regions as well as by the residential choices immigrants make. People tend to move between the ages of 15 and 25 as they attend schools and universities away from home, find apprenticeships and training programs, and seek job opportunities. After age 35 many people have established careers, started families, and made friends and connections, and are less likely to move.

The states that attract newcomers, such as Alaska, Colorado, Georgia, and Texas, tend to have the highest proportion of young people and the smallest proportion of older people. Job opportunity is the most frequent reason for moving, although recreational and environmental considerations are also important. Those who move also consider the available housing stock and the cost of living. Of all the states, Utah had the largest portion of young people at the beginning of the 21st century, largely because of high birthrates among its predominantly Mormon population. The states that experience more people leaving than arriving tend to have fewer young people and more older ones. Such states include Rhode Island, Pennsylvania, West Virginia, and North Dakota. Similarly, many northeastern cities have large elderly populations, while suburbs in the Southeast and Southwest have large populations of younger people. Florida is an exception to these trends, because it attracts many retirees as well as younger Cubans, Haitians, and other immigrants.

GEOGRAPHIC DISTRIBUTION OF U.S. POPULATION

In 2000 almost two-thirds of the U.S. population lived in states along the three major coasts—38 percent along the Atlantic Ocean, 16 percent along the Pacific Ocean, and 12 percent along the Gulf of Mexico. The smallest numbers lived in the area between the Mississippi River and the Rocky Mountains, particularly in the central and northern Great Plains. While the Rocky Mountain and plains states account for about half of the landmass of the United States, only 34 percent of the population resides in these areas.

Americans are highly mobile and move an average of 11 to 13 times in their lives, although in the 1980s and 1990s Americans moved less often than they did in the era immediately following World War II. At the beginning of the 21st century the fastest-growing areas were in the Southeast, especially Georgia, the Carolinas, and Florida; in the Rocky Mountains, including Nevada, Arizona, Colorado, Utah, and Idaho; and along the West Coast. Washington State was the fastest growing of the West Coast states.

Since World War II, people have moved to the Southeast, Southwest, and West Coast for many reasons. The economies of these areas were growing. The South and California, in particular, received a disproportionate share of military and government spending during the Cold War. These expenditures created many jobs. A relatively cheap, nonunion labor force in many parts of the South also attracted industry from other parts of the country. In addition, the increasingly widespread ownership of automobiles made moving to rural areas easier. Air conditioning made the South more attractive, as did low housing costs and improved public health conditions, once malaria, hookworm, and other diseases associated with warm climates were reduced or eliminated.

Starting in the 1950s, the areas around the Great Lakes and in the Northeast, which had been major manufacturing centers, lost jobs as industries moved overseas or to other parts of the country. This trend accelerated in the 1970s. The area around the Great Lakes became known as the Rust Belt because of its closed, deteriorating factories. Some of the region’s major 19th-century industrial towns—Detroit, Michigan; Gary, Indiana; Akron, Ohio; Cleveland, Ohio; Erie, Pennsylvania; and Buffalo, New York—lost significant population. The cities that suffered the greatest declines were the ones most dependent on manufacturing. Other cities in the Northeast and around the Great Lakes—New York, Boston, Philadelphia, Washington, and Chicago—retained their importance as centers of finance, service, government, education, medicine, culture, and conventions, even though population growth slowed or stopped once the industrial base disappeared.

The older cities have a number of problems. Roads built decades ago cannot easily accommodate today’s commuter traffic and commercial trucking. School systems designed to train the next generation for industrial jobs, which are now disappearing, have struggled to meet the educational requirements of new technology-based occupations. Housing, commercial offices, and manufacturing facilities are outmoded, and the cost of land and building is relatively high. In spite of these problems, about one-third of all Americans at the beginning of the 21st century still lived around the Great Lakes and in Northeastern states, and the corridor stretching from Boston to Washington, D.C., remained the most densely settled part of the United States.

During the latter part of the 20th century, the largest streams of migrants within the United States were from New York to Florida, New Jersey, and California; from Texas to California; from California to Washington State, Arizona, Texas, and Oregon; and from New Jersey to Florida and Pennsylvania. These streams were not one-way: About 20 percent of these people later returned to their original states, so that many states are losing some people and gaining others. In the 1990s a third of Americans lived in a different state than the one in which they were born, up from a quarter of the population in the late 19th century. Others moved within states.

Migration and Diversity

Americans’ propensity to move helps break down ethnic affiliations and homogenize American society. Ethnic enclaves, with their own churches, social groups, newspapers, schools, and languages, are difficult to reproduce after a move. Intermarriage increases, mingling formerly distinct cultural traits. Over time, ethnic neighborhoods gradually shrink and are replaced by residential areas that are more mixed ethnically, although at the same time newer immigrants are creating their own ethnic enclaves. Migration generally tends to weaken the strong sense of community inherent in ethnic enclaves—neighbors may not know one another, extended family ties break down, and friendships are more transitory.

Racial differences between African Americans and European Americans, however, are so deeply rooted in the American psyche that they continue to be replicated, even in rapidly growing areas. Local laws no longer mandate segregation as they did before the 1950s, but it persists in residential patterns, in primary and secondary schools, and in religion, although it is disappearing in politics, entertainment, higher education, and in some employment sectors (see Segregation in the United States).

Although migration has caused some cultural differences to disappear as people blended, many ethnic identifiers have remained, spreading across the country as people migrated from one place to another. Today they have become part of America’s cultural heritage. Ethnic influences can be seen in music, food, sports, and holidays. Jazz, the blues, bluegrass, Cajun, and other forms of music have spread beyond their original locales because of migration. As these American forms of music spread, they are influenced by still other musical traditions. Foods that immigrants from around the globe introduced to this country are commonly found in many supermarkets. Such foods include pizza, tacos, salsa, bagels, dim sum, sushi, couscous, and spaghetti. As food traditions blend, they sometimes produce oddities, such as jalapeño bagels and pizza with snow peas. Many American sports, such as hockey, football, and lacrosse, have origins in other cultures and countries. Christmas holiday traditions stem from German and Dutch influences, and Jewish and African American groups maintain alternatives to Christmas celebrations. Even American architectural styles often have foreign origins—chalets from Switzerland, log cabins from northern Europe, and bungalows from India are just a few examples. The richness of American civilization comes from adopting and adapting different traditions.

America also has homegrown traditions, including popular musical styles such as Tin Pan Alley songs, Broadway musicals, and rock and roll, which bear little resemblance to Old World models. Indigenous foods, including turkey, pumpkins, and cranberries, characterize the celebration of Thanksgiving, a day that, along with the Fourth of July (see Independence Day), Memorial Day, and Labor Day, has meaning for Americans of many religions, races, ethnicities, and backgrounds.

Major Migrations of the U.S. Population

The history of America includes three major population movements—one forced and two voluntary. The forced migration involved Native American peoples, most of whom were systematically moved westward and eventually settled onto reservations. The voluntary movements involved millions of people seeking economic advancement and greater freedom. The first of these movements included pioneers who trekked west from the mid-18th century to the turn of the 20th century. The next movement was the great migration of African Americans from the rural South to the cities of the Northeast and Midwest from World War I to the 1960s.

Americans are more restless people than residents of most other countries. They move from house to house, neighborhood to neighborhood, state to state, region to region. Most of the movement has been voluntary as individuals and families sought improved living conditions and economic opportunity. Yet substantial movements have been a result of fear, greed, and racism that denied minority groups the liberties enjoyed by the majority.

Native American Relocation

Before contact with European settlers, Native Americans lived primarily on the East and West coasts and along the Mississippi River and its tributaries. The plains were rather sparsely settled until horses, introduced to North America by the Spanish, spread among native peoples in the late 17th and 18th centuries. Many tribal groups became horse breeders and horse traders, a new economic existence that sparked Native American migration to the plains in the 18th century.

Two centuries after Europeans arrived along the East Coast, Native American groups were forced to migrate west of the Appalachian Mountains, where they began settling among the indigenous people of the Ohio River valley and the Great Lakes region in the 18th century. Eventually, only scattered remnants of eastern nations remained in their original homelands, most often in the far north or south, and native people generally lived in poverty.

During the 18th century, Native Americans were sought as allies in clashes between European colonial powers. The British sought to guarantee their Native American allies the territory between the Appalachian Mountains and the Mississippi River as a refuge and buffer zone. This land was attractive to the settlers in the 13 colonies, however, and access to it was one cause of the American Revolution. Some settlers moved west even before the Revolution ended. The success of the colonies in the Revolution was a major defeat for Native Americans, who lost their British allies. The United States had no reason to seek alliances with native peoples after the Louisiana Purchase in 1803 removed the French and Spanish as threats to the new country.

The new United States quickly moved to allow settlement of the West, despite earlier treaties with tribal groups. By the 1820s and 1830s, the federal government, the states, and pioneer pressure had forcibly removed entire groups to areas west of the Mississippi. The United States established a pattern of removing native peoples to land considered worthless, then forcibly uprooting these groups once again when white settlers sought to expand into the territory (see Westward Movement: The Frontier to the Mississippi).

In the second half of the 19th century, the U.S. Army fought more than a thousand battles in an effort to place remaining Native American tribes on reservations throughout the West (see Indian Wars; Native American Reservations; Native American Policy). These reservations removed native peoples from land that white settlers desired. The federal government also attempted, with various policies during the 19th and 20th centuries, to weaken tribal loyalties and to turn natives into “Americans,” that is, Christian, English-speaking farmers. An 1887 law sought to break up the reservations by allotting farms of up to about 65 hectares (160 acres) to Native American households. This left Native Americans even less living space, as any leftover land was given to white settlers. The process was also rife with corruption. Economic failure, sickness, and despair were too often part of life on the reservations. During the 1930s the Bureau of Indian Affairs sought to improve conditions for Native Americans, but with little overall success.

Beginning in the 1950s, large numbers of Native Americans moved into the cities, partly to find work and partly because government programs encouraged this movement. In 1968 Native Americans in Minneapolis, Minnesota, founded the American Indian Movement (AIM), inaugurating a new period of activism and pride. Today many Native Americans travel back and forth between cities and reservations, and although many tribes remain impoverished, others are enjoying newfound wealth from economic ventures such as casino operations, oil drilling, and artistic and fine craftwork.

Conquest of the West

Native Americans were forced onto reservations to make way for settlers moving west looking for more land. The Westward Movement began even before the Revolution. Colonial British America consisted mainly of settlements stretching along the East Coast from Massachusetts to Georgia. By 1760 colonists wanted more land. They moved north into what would become the states of Maine and Vermont and crossed the Appalachian Mountains to settle in western Pennsylvania, northwestern New York, and Kentucky. The Ohio River valley and the Great Lakes region also attracted thousands of settlers in the early 19th century.

Those who chose to move west were hardy. They coped with accidents, disease, and malnutrition on their journeys. They also faced occasional revenge attacks by Native Americans who had managed to survive the U.S. Army, disease, and hunger. Settlers sold their property and possessions to finance their trips and buy enough seed and food to last at least a year, and they hoped to have enough money left to acquire land. Those without sufficient money squatted on land; that is, they illegally lived on unoccupied land, constantly moving west to keep ahead of the speculators and banks that owned legal title to the land.

Migrating Americans tended to move straight west, rather than traveling northwest or southwest. New Englanders settled in the upper Midwest and northern plains, while New Yorkers and Pennsylvanians moved to the Ohio valley and lower Great Lakes, and then to Iowa, Nebraska, and beyond. People from Virginia and North Carolina headed for Kentucky and Tennessee and later to Arkansas and Missouri. Carolinians and Georgians moved further west in the Deep South, to Alabama, Mississippi, and Louisiana. Texas attracted settlers from both the upper and lower South.

The settlers in the West replicated the settlement and cultural patterns of their original homes along the seaboard. Those from the North developed small farms, growing food for home consumption and to sell at market. They generally relied on the family’s labor or hired seasonal labor, and they practiced craft-based trades. They quickly established towns and cities—centers of commerce, education, and production—and by the early 19th century these towns and cities were becoming industrialized, as were their eastern counterparts.

Southerners who moved west primarily planted single cash crops, at first tobacco and rice, but later cotton became the dominant crop. They preferred large-scale plantations, with a permanent labor force of slaves. Since they exported much of their crop to English textile mills in return for finished goods, they did not need many craftspeople. Towns were smaller and cities fewer, and industrialization lagged well behind the North. The different economies and societies of the westward-expanding North and South set the stage for a divided country, eventually culminating in the American Civil War of the 1860s. Early conflicts took place where North and South met in the West—particularly in Kansas (see Border War) and in California.

Settlers began to move west of the Mississippi River in large numbers after the annexation of Texas in 1845 and the discovery of gold and silver in the western mountains. The California gold rush of 1849 spurred California’s population increase—from 14,000 in 1848 to 100,000 a year later. By 1860 its population was 380,000. After the Civil War thousands of newly freed African Americans moved to Texas and Oklahoma, and Mexican Americans migrated west along the southern border of the United States from Texas and New Mexico into Arizona and southern California. Large numbers of Canadians moved southwest into the western United States while Americans settled in western Canada, as people in both nations sought the most fertile lands.

Thousands of Europeans headed west as well during the 19th century, particularly after transcontinental railroads made travel easier. Germans and Scandinavians migrated to the upper Midwest, then to the plains and the Pacific Northwest. The British tended to settle in Utah and along the West Coast. Smaller groups of Russians, Ukrainians, and Italians also added to the diversity of the West. Not all movement was east to west. Thousands of Chinese and Japanese, lured by economic opportunity, headed east across the Pacific Ocean and settled in the American West, particularly in California. A gold rush in the Klondike that started in 1896 drew thousands to Alaska, some of whom stayed. By the turn of the 20th century the West was populated with people from the eastern United States, Europe, and Asia, while the native population was confined to reservations. Except on the reservations, the population of the West continued to grow through the 20th century, especially after the automobile made travel easier.

Although westward migration continued, growth was not uniform. Some areas of the northern plains, as well as some mining and logging districts, began losing population in the 20th century because of economic changes or resource depletion. Mining towns became ghost towns throughout the West as ore ran out or as overseas competition made mines unprofitable. The largest of these emigrations from former frontier areas occurred during the Dust Bowl period of the 1930s, when overuse of the land combined with drought to cause severe erosion of the Great Plains, and farmers from the plains moved to the West Coast. The loss of population in the plains continued through the end of the 20th century, as larger agribusinesses that required less labor replaced small farms.

Black Migration

After the Civil War the majority of African Americans continued to live in the South, where most farmed as sharecroppers or tenants. A smaller number were ministers, teachers, doctors, and nurses. This way of life eventually broke down for several reasons. African Americans in the South were confronted daily with the many indignities of segregation, which the Supreme Court of the United States had ruled constitutional in 1896 (see Plessy v. Ferguson). Their opportunities for employment were restricted, their schools were second-rate, and their voting rights were trampled by policies such as literacy tests, poll taxes, grandfather clauses, and primary elections that were open to whites only. Then in the late 1890s, the boll weevil began to spread through the South, destroying the cotton crop that sustained most black families. African Americans began to leave the South. Entire communities moved to Northern cities, drawn by the possibility of industrial jobs, better schools, and fewer legal restrictions. The pace of black migration increased substantially during World War I, when employers in Northern cities experienced labor shortages. By 1920, 450,000 blacks had moved north. Three out of four of those could be found in Detroit; Cleveland; Chicago; Philadelphia; New York; Indianapolis, Indiana; Kansas City, Missouri; St. Louis, Missouri; Pittsburgh, Pennsylvania; and Cincinnati, Ohio (see African American History: The Great Migration).

Southern employers opposed this great exodus of black Americans from the South and tried to keep their laborers by persuasion or force. African Americans who moved north also often met hostility. Race riots were triggered by whites who feared competition for jobs and residential integration, particularly in the years of labor unrest following World War I. Through fear and discriminatory rental and hiring policies these violent episodes served to confine blacks to segregated ghettos within cities. Still, the migration continued, aided by effects of the 1930s depression in the South and by the jobs and high wages that came with World War II and the early Cold War.

Young adults typically led the way to the North, and other family members would follow once the first venturer found work and housing. It often happened that small, rural communities would recreate themselves in the urban North, living in the same neighborhoods and worshiping at a single church. The migration ended in the 1960s when the successes of the civil rights movement reduced the differences between Northern and Southern racial attitudes. In the 1990s some Northern blacks left the Rust Belt cities behind and moved back to a revitalized South, with its newer cities and expanded job opportunities.

URBANIZATION OF AMERICA

The early United States was predominantly rural. According to the 1790 census, 95 percent of the population lived in the countryside. The 5 percent of Americans living in urban areas (places with more than 2,500 persons) lived mostly in small villages. Only Philadelphia, New York, and Boston had more than 15,000 inhabitants. The South was almost completely rural. After 1830 the urban areas of the country grew more rapidly than the rural areas. By 1890 industrialization had produced substantial growth in cities, and 35 percent of Americans lived in urban areas, mostly in the northern half of the United States. The South remained rural, except for New Orleans and a few smaller cities. The number of Americans living in cities did not surpass the number in rural areas until 1920. By the 1990s three out of four Americans lived in an urban setting, and since World War II the southern half of the country has become increasingly urbanized, particularly in Texas, Arizona, and the states along the eastern seaboard.

Growth of Cities

Until the middle of the 19th century, the center of the city was the most fashionable place to live. Merchants, lawyers, and manufacturers built substantial townhouses on the main thoroughfares within walking distance of the docks, warehouses, offices, courts, and shops where they worked. Poorer people lived in back alleys and courtyards of the central city. Markets, shops, taverns, and concert halls provided services and entertainment. The middle classes lived a little farther from the center, and other poor people lived in the suburbs, farther from the economic and governmental centers and away from urban amenities such as town watches, water pumps, and garbage collection. Cities were densely populated, as people had to live within walking distance of work and shops. Streets were narrow, just wide enough to accommodate pedestrians and wagons.

The Industrial Revolution of the 19th and 20th centuries transformed urban life and gave people higher expectations for improving their standard of living. The increased number of jobs, along with technological innovations in transportation and housing construction, encouraged migration to cities. Development of railroads, streetcars, and trolleys in the 19th century enabled city boundaries to expand. People no longer had to live within walking distance of their jobs. With more choices about where to live, people tended to seek out neighbors of similar social status, if they could afford to do so. The wealthy no longer had to live in the center of the city, so they formed exclusive enclaves far from warehouses, factories, and docks. Office buildings, retail shops, and light manufacturing characterized the central business districts. Heavier industry clustered along the rivers and rail lines that brought in raw materials and shipped out finished products. Railroads also allowed goods to be brought into downtown commercial districts. By the second half of the 19th century, specialized spaces—retail districts, office blocks, manufacturing districts, and residential areas—characterized urban life.

The wealthy created separate neighborhoods for themselves by building mansions on large plots of land at the edges of the cities or in the countryside. Housing developments of similar-looking single-family or multiple-family dwellings, built by speculators, sprouted on the edges of cities. These often catered to a new middle class of white-collar employees in business and industry. The houses faced broader streets and increasingly had plots of grass in front and sometimes in the rear. New apartments were spacious and often had balconies, porches, or other amenities. By 1900 more than a third of urban dwellers owned their own homes, one of the highest rates in the world at the time.

As the middle classes left the bustle and smoke of cities, poorer people—newcomers from the countryside and immigrants—moved into the old housing stock. Landlords took advantage of the demand for housing by subdividing city houses into apartments and by building tenements, low-rent apartment buildings that were often poorly maintained and unsanitary. Immigrants gravitated to the cheap housing and to the promise of work in or near the center of cities or around factories. Now the rich lived in the suburbs and the poor near the center of cities.

In the 50 years from 1870 to 1920, the number of Americans in cities grew from 10 million to 54 million. Into the 20th century, cities grew in population and expanded geographically by absorbing nearby communities. In 1898 New York City acquired Brooklyn, Queens, and the Bronx as boroughs, political divisions that are like counties. Chicago grew from about 300,000 inhabitants in 1870 to more than a million in 1890. Three-quarters of the city’s residents were born outside the United States, and while some found work and a comfortable existence, many suffered severe poverty. That poverty, however, was largely invisible to the rich living on the outskirts of the city, since the poor were concentrated in distant neighborhoods.

The growth of cities outpaced the ability of local governments to extend clean water, garbage collection, and sewage systems into poorer areas, so conditions in cities deteriorated. Cities in the late 19th century were large, crowded, and impersonal places devoted to making money. Not surprisingly, corruption was rampant in city government and city services, in the construction industry, and among landlords and employers. High rents, low wages, and poor services produced misery in the midst of unprecedented economic growth.

The Progressive movement of the late 19th and early 20th centuries succeeded in reducing some of the corruption and in establishing housing codes, public health measures, and civil service examinations in city governments. Progressive, regulatory approaches to the problems of cities expanded during the New Deal in the 1930s and during the War on Poverty in the 1960s, but cost-cutting political movements in the 1920s, 1950s, and 1980s reduced funding or eliminated many regulatory programs. As a result of local reform movements throughout the 20th century, corrupt officials were periodically voted out of office and replaced with reformers.

Upward mobility, home ownership, educational opportunities, and cheap goods softened many of the disadvantages of 19th-century urban life. Beautification programs, electrification, and the construction of libraries, parks, playgrounds, and swimming pools gradually improved the quality of urban life in the 20th century, although poor areas received fewer benefits. Poverty, particularly among new arrivals, and low wages remained problems in the cities throughout the 19th and 20th centuries. American reform movements, such as the settlement house movement, have typically been more interested in treating the effects of poverty—housing, health, and corruption—than the causes of poverty—unemployment, underemployment, poor skills, and low wages. Labor unions helped raise wages and benefits for many workers, particularly the most skilled, from 1900 to 1950, but since then the replacement of skilled factory work by service employment has reduced both wage levels and the influence of labor unions. The U.S. Department of Labor reports that the average annual wages of American working men fell from $33,244 in 1979 to $31,317 in 1999 (adjusted for inflation). Wages fell further for those without high school diplomas.
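
The phrase "adjusted for inflation" in the wage comparison above means that an earlier year's wage is restated in a later year's dollars using a price index. The Python sketch below shows the idea with illustrative numbers; the wage and index values are hypothetical, not official Department of Labor or Consumer Price Index figures.

# A brief sketch of an inflation adjustment: a wage from an earlier year is
# restated in a later year's dollars using a price index. The values here
# are illustrative, not official figures.
wage_1979_nominal = 12_000       # hypothetical wage in 1979 dollars
index_1979 = 72.6                # hypothetical price-index level for 1979
index_1999 = 166.6               # hypothetical price-index level for 1999

# Scaling by the ratio of index levels expresses the 1979 wage in 1999
# dollars, so it can be compared directly with a wage actually paid in 1999.
wage_1979_in_1999_dollars = wage_1979_nominal * (index_1999 / index_1979)
print(f"{wage_1979_in_1999_dollars:,.0f} in 1999 dollars")  # about 27,537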

Although murder was rare in the nation in the late 19th century, rates rose in cities. Robbery and theft were commonplace, and prostitution flourished more openly than before. Cheap newspapers exaggerated increases in crime with sensational stories. Professional police forces were created in the late 19th century to keep order and to protect property. The Prohibition period, 1920 to 1933, had the unintended effect of increasing organized crime in America, as manufacturing, importing, and selling illegal alcohol provided a financial windfall for gangs of criminals in the cities. The money was used to expand the influence of organized crime into gambling, prostitution, narcotics, and some legitimate businesses. Police and judges were sometimes bribed. Federal prosecutions of criminals in the 1950s and 1960s helped weaken organized crime. The rise in drug use since the 1970s increased the incidence of violent crime, most visibly in cities, although the majority of drug customers are from the suburbs. This has led to increased professionalization of city police forces, including more weapons, increased training, and higher educational requirements for officers. Higher employment rates at the end of the 1990s have helped to reduce crime rates.

Urban areas have continued to expand, but city boundaries have with few exceptions been set since the early 20th century. City populations increased until the 1950s. Then factories began to move to areas where labor was cheaper: to the South, Latin America, and Asia. As jobs in cities disappeared, cities began to shrink. In the second half of the 20th century, the most rapidly growing urban areas were those outside city limits.

Move to Suburbia

Suburbs began to appear in the 18th century when wealthy people built second homes in the country to escape the crowded, sweltering city during the summer. As roads improved in the early part of the 19th century, more people built summer houses. A few began living outside the city full time and commuting by carriage to town. Commuting into the city to work became easier and cheaper in the late 19th century, when commuter railroad lines were built, radiating out from the central city. New suburbs developed that were almost entirely residential and depended on the economic resources of the central city. Because railroad fares were relatively high, most of these suburbs remained the preserve of the wealthiest Americans until after World War II, although a few working-class suburbs sprang up around large manufacturing complexes or ports.

The United States experienced a housing shortage in the late 1940s, as recently married war veterans sought places to live. The GI Bill—which provided unemployment and education allowances and home, farm, and business loans for millions of World War II veterans—enabled a flood of home purchases. Several developers applied the principles of mass production to housing, creating nearly identical houses on moderate-sized lots. These suburban developments were targeted to narrow segments of the broad middle class. Some were home to professionals and executives, some to middle management, some to the lower middle class, some to working-class Americans. Each development was substantially uniform in social status and sometimes in religion and ethnicity.

Suburbanites were similar in other ways. Married couples were generally just starting their families. The baby boom meant that there were large numbers of children in the suburbs. Women were housewives and husbands commuted to jobs in the city. Families valued privacy and were separated from other relatives, who either remained in the city or lived elsewhere. Suburban life was both comfortable and isolating. The family was often on its own, knowing few neighbors, watching television in the evening, driving everywhere in private cars to anonymous shopping centers. Some people living in these new suburbs depended on rail lines to get to work, although more took advantage of the automobile as a form of transport. The federal government contributed to suburbanization by subsidizing mortgages for veterans and building highways that made travel between cities and suburbs easier.

As the suburbs grew, more and more of the middle classes abandoned the cities. The suburbs were attractive for many reasons: They were cleaner, newer, had better-funded schools, were socially homogeneous, and provided a sense of security. They provided what city dwellers had long been seeking—bigger yards and more privacy. The perceived problems of the city—crowding, high taxes, crime, and poverty—could be left behind. And because the suburbs were politically independent of the core city, the layers of bureaucracy and corruption could be replaced by smaller, friendlier, and presumably more honest government.

As millions moved to the suburbs, stores followed so that residents did not have to go into the city to shop. By the mid-1950s the shopping mall had appeared. Some large, enclosed malls in the 1980s and 1990s became centers for both consumption and entertainment. Other, smaller strip malls contained shops that sold basic items, such as food and hardware, or provided services, such as dry cleaning and film processing.

Suburban housing also underwent changes in the 1980s and 1990s. Townhouses and apartment complexes began to characterize the suburbs as much as houses on lots. Retired couples needed smaller places, high divorce rates created single-adult households, and poorer individuals wanted to share some of the benefits of a suburban lifestyle.

Once the population shifted to the suburbs, employers eventually followed, though more slowly than residents. Because employees might live in any suburb surrounding a city, a central business location in the city had always been convenient. Increased traffic congestion in the city centers, and the promise of lower corporate taxes and less crime in the suburbs, eventually pushed corporations out to the suburbs as well. Office complexes and corporate campuses brought white-collar jobs closer to the suburban areas where many workers lived. Warehouses, light industry, and other businesses were increasingly located in the suburbs. These new locations were poorly served by public transportation. Workers had to commute by car. This trend appeared as early as the 1950s and 1960s in the rapidly growing metropolitan areas like Los Angeles and Dallas and later in the older large cities of the Northeast and Midwest.

Traffic congestion is an increasing problem in cities and suburbs, and Americans spend more and more of their time commuting to work, school, shopping, and social events, as well as dealing with traffic jams and accidents. By the late 1990s rush-hour traffic patterns no longer flowed simply into the city in the morning and out of the city in the evening. Traffic became heavy in all directions, both to and from cities as well as between suburban locations. Suburban business locations required huge parking lots because employees had to drive; there were few buses, trains, or trolleys to carry scattered workers to their jobs. The hope of reduced congestion in the suburbs had not been realized; long commutes and traffic jams could be found everywhere.

Suburbanization has not affected all aspects of American life. Some functions have largely remained in the central cities, including government bureaus, courts, universities, research hospitals, professional sports teams, theaters, and arts groups. Trendy shopping, fine restaurants, and nightlife, which expanded in the booming economy at the end of the 20th century, have become popular in many cities, revitalizing a few urban neighborhoods.

In the 20 largest cities and urbanized areas of the United States, 41 percent of the local population, on average, lives in the city, and 59 percent lives in the surrounding suburbs, towns, and associated rural areas. Hoping for more privacy, more space, and better housing, people continued to look to the fringes of urban areas. In the 1990s it became apparent that older suburbs were losing population to newer suburbs and to the so-called exurbs, rural areas bordering cities.

With these new suburbs springing up on the fringes of major urban centers, older suburbs face many of the hardships of cities. As the young and the more affluent seek the newest housing developments, tax bases in the cities and in older suburbs erode. The housing stock deteriorates because of age and perhaps neglect, and housing prices stagnate or fall, causing tax revenues to decline. The elderly—many on limited incomes and in poor health—are more likely to stay in the older suburbs, a trend that not only diminishes tax revenues but increases demand for social services. Schools, no longer supported by the same strong property tax base, suffer in quality, causing even more people to move out. Poorer people then move into the cheaper housing of the older suburbs. As poverty increases in the older areas, so does crime. Older suburbs are often in more desperate financial straits than the central city, because their economic base is less diverse.

The peace and security that suburbanites originally sought became more elusive near the end of the 20th century, and the trend toward gated and walled housing developments was the most visible sign of anxiety about external threats. The next major trend may be a movement out of large cities and suburbs and into small towns and the countryside as Americans avoid commuting and seek more leisure time and a stronger sense of community. New information technologies such as e-mail and computer networking will probably contribute to the dispersal of the population out of the cities, although a sharp and sustained rise in gasoline prices could reverse current trends by making the private automobile and extensive commuting too expensive.

RELIGION IN THE UNITED STATES

The variety of religious beliefs in the United States surpasses the nation’s multitude of ethnicities, nationalities, and races, making religion another source of diversity rather than a unifying force. This is true even though the vast majority of Americans—83 percent—identify themselves as Christian. One-third of these self-identified Christians are unaffiliated with any church. Moreover, practicing Christians belong to a wide variety of churches that differ on theology, organization, programs, and policies. The largest number of Christians in the United States belong to one of the many Protestant denominations—groups that vary widely in their beliefs and practices. Roman Catholics constitute the next largest group of American Christians, followed by the Eastern Orthodox.

Most Christians in America are Protestant, but hundreds of Protestant denominations and independent congregations exist. Many of the major denominations, such as Baptists, Lutherans, and Methodists, are splintered into separate groups that have different ideas about theology or church organization. Some Protestant religious movements, including Fundamentalism and Evangelicalism, cut across many different Protestant organizations.

Roman Catholics, the next largest religious group in the United States, are far more unified than Protestants. This is due in part to Roman Catholicism’s hierarchical structure and willingness to allow a degree of debate within its ranks, even while insisting on certain core beliefs. The Eastern Orthodox Church, the third major group of Christian churches, is divided by national origin, with the Greek Orthodox Church and the Russian Orthodox Church being the largest of the branches in the United States.

Among many Protestant denominations, blacks and whites generally maintain distinct organizations and practices, or at least separate congregations. Even among Roman Catholics the residential segregation in American society produces separate parishes and parish schools.

Judaism is the next largest religion in the United States, with about 2 percent of the population in 2001. It is also divided into branches, with the largest being Orthodox, Reform, and Conservative. Other religions practiced in America include Buddhism, Hinduism, and Islam. Islam is among the fastest-growing religious groups; its members made up about 1 percent of the U.S. population in 2001.

Large numbers of Americans do not have a religious view of the world: some 8 percent are nonreligious, secular, or atheist; that is, they do not believe in a god or gods (see Atheism). Adding these to the nonpracticing Christian population means that slightly more than a quarter of the American population is unaffiliated with any church or denomination. This mixture of multiple religious and secular points of view existed from the beginning of European colonization.

History of Religion in the United States

Native Americans had many religious beliefs, but most groups believed in a world of spirits (see Native American Religions). These spirits inhabited plants and animals, mountains and rivers, and tribes, clans, and individuals. The spirits might require prayer, sacrifices, dances and songs, or thanks. Every major event—killing game, planting corn, or acquiring an adult name—required interaction with the spirit world. There were benevolent spirits and protective spirits, as well as trickster spirits who caused sickness and misfortune. Native Americans did not believe that people were superior to the natural world, but held that people had to protect and maintain the spirits in their environment. Certain men and women were given the task of memorizing the religious heritage of the group. From a European point of view, these religions were merely superstitions and had to be eliminated. By the end of the 19th century, most Native Americans belonged to one of the Christian sects. In the 20th century, tribal groups became concerned with preserving and reinvigorating their spiritual traditions.

Religion in the Colonies

One of the stated purposes of European colonization was to spread Christianity. The charter of the London Company, formed in 1606 to establish colonies in the New World, called for English settlers to convert native peoples to the Anglican faith (see the Church of England). The goal was not only to strengthen the Church of England, but also to counter the influence of Catholic Spain and France. Catholic missionaries were actively trying to convert Native Americans in the southwest and far west of the territory that is now the United States, as well as in neighboring Canada, Mexico, and the Caribbean. In truth, some of the earliest colonists were too busy looking for land or gold or seeking profits from tobacco to pay much attention to making converts. The conflict between the search for wealth and spiritual interests continued beyond the colonial period. The fact that Native Americans were not Christian became a convenient excuse for white settlers to seize their land.

Northern Colonies

During the 17th century, New England became a religious refuge for Protestant followers of John Calvin, whose beliefs differed from those of the Church of England. One such group, the Pilgrims, established the Plymouth Colony in 1620 to escape persecution in England. The Puritans, another Calvinist sect, arrived nine years later in Massachusetts. The Puritans eventually absorbed the Pilgrims and spread into Connecticut, Maine, New Hampshire, upstate New York, and eastern Ohio. The religious freedom these pioneers sought for themselves, however, was not extended to others. They allowed only Puritan churches, which were supported by taxes, and only church members had political rights. Advocates of other beliefs were punished, sometimes harshly.

This emphasis on conformity led some members to break away and move to new colonies. Roger Williams, a Puritan clergyman, founded the colony of Rhode Island after being expelled from Massachusetts in 1635 because he disagreed with the colonial government. There he established the principles of separation of church and state, religious toleration for all, and freedom of religious expression. After 1680 Puritans were forced by changes in English law to tolerate other Christian churches in their midst, but taxes still went to the established church. Massachusetts did not achieve separation of church and state until 1833, the last state in the union to do so.

Southern Colonies

In the southern colonies, brothers Cecilius and Leonard Calvert established a refuge in Maryland in the 1630s for Roman Catholics persecuted in England. The Calvert family declared freedom of religion for all Christians in their colony. In the rest of the colonial south, the Anglican Church was established by law. In general, however, southern colonists provided minimal support for their churches, which were often without ministers. After the middle of the 18th century, Baptist and Methodist ministers converted large numbers of settlers to their denominations, particularly the poorer ones and slaves.

The slaves the southern settlers brought into the colonies were usually non-Christian, although a few had been baptized as Roman Catholic. Colonists felt free to enslave Africans because they were not Christians. For the first century of slavery, from the early 17th century to the early 18th century, most southern colonies made it a crime to baptize slaves, because slaveholders feared they would have to free slaves who became their brothers and sisters in Christ.

In the first half of the 18th century, missionaries from the Society for the Propagation of the Gospel in Foreign Parts were able to baptize some slaves as Anglicans. Many slaves, however, became Baptists or Methodists rather than Anglicans like their owners. Because Baptists and Methodists did not insist on the freeing of slaves, plantation owners were persuaded to change laws forbidding the Christianization of slaves. Special Bibles written for slaves stressed obedience. Slaves created hymns and a theology of freedom, however, to counter the proslavery lessons of white preachers. Over time, separate black religions developed among slaves that combined some elements of African practice with Baptist and Methodist theology.

Middle Colonies

The first Europeans to settle in the middle colonies (Delaware, New York, New Jersey, and Pennsylvania) during the 17th century were Dutch Calvinists and Swedish Lutherans. Quaker William Penn provided for full religious freedom in Pennsylvania after he was granted the colony in 1681, although Catholics and Jews could not vote. Calvinists, Jews, Moravians, German Lutherans, and Roman Catholics quickly followed the Quakers to Pennsylvania because of its religious freedom (see Calvinism, Moravian Church, Society of Friends). New York provided for locally established churches, with each town voting on which church its tax money would support. There was limited religious freedom in New Jersey.

The wider toleration in the middle colonies promoted the free expression of a variety of religious and nonreligious beliefs and practices, a social order thought to be impossible among Europeans who were used to centuries of religious warfare. This toleration encouraged both ethnic and religious diversity. These colonies provided a model for the later religious tradition of the United States—a slow realization that the freedom to express one’s own faith depended on granting that same liberty to others.

Freedom of religion helped produce religious revivals that transformed the American religious landscape. The First Great Awakening began among the Presbyterians in New Jersey and western Massachusetts, and with the newer denominations of Baptists and Methodists in the 1730s. This period of heightened concern with salvation lasted until the eve of the American Revolution in the 1760s. In individual congregations, in colleges, and in mass outdoor meetings, revivalists preached that all could be born again and saved, and that anyone could preach, not just an educated elite. The Great Awakening was instrumental in converting slaves as well as free people.

The Great Awakening set the stage for the American Revolution by undermining faith in traditional authority, particularly the authority of the Church of England and the king, who was head of the church. In the early days of the movement, working men, women, and African Americans took prominent roles as Bible teachers and prayer group leaders. Working men, in particular, acquired leadership experience that propelled them into political roles during and after the American Revolution.

Religion in a Secular State

During the American Revolution, most state constitutions provided for freedom of conscience and the separation of church and state. The absence of those same rights in the Constitution of the United States, drawn up in 1787, caused many to vote against ratifying it. The first Congress of the United States, therefore, called for certain amendments to the Constitution; these amendments became the Bill of Rights. The First Amendment guaranteed separation of church and state at the national level and the free exercise of religious belief. The authors of the Constitution provided for a secular state, one based not on religion but on toleration and liberty of conscience. Influenced by the ideals of the Enlightenment that promoted individualism, liberty, and free inquiry, as well as by the examples set by the middle colonies, the Founding Fathers committed the nation to protecting minority viewpoints and beliefs.

The atmosphere of free inquiry in the United States allowed new religions to develop. In the wake of the Revolution, American Anglicans broke with the Church of England and founded the Episcopal Church. American Roman Catholics also broke from the control of the vicar apostolic in London, and in 1789 Baltimore became the first diocese in the United States. American Unitarians and Universalists also had their origins in the 18th century, but did not develop denominational structures until the 19th century (see Unitarianism, Universalism).

A Second Great Awakening began in New York in the early 1800s and spread north, south, and west before disappearing in the 1840s. Tent meetings that were a part of this revival movement brought together spellbinding preachers and large audiences, who camped for several days to immerse themselves in the heady atmosphere of religion. The movement merged democratic idealism with evangelical Christianity, arguing that America was in need of moral regeneration by dedicated Christians. The men and the large number of women who were attracted to this movement channeled their fervor into a series of reforms designed to eliminate evils in American society, particularly in the industrializing North. These reforms included women’s rights, temperance, educational improvements, humane treatment for the mentally ill, and the abolition of slavery. The growth of an abolitionist movement in the North was one factor leading to the Civil War. Just before the Civil War, many of the denominations in the United States split over the issue of slavery, with Southern congregations supporting slavery and Northern congregations opposing it.

African Americans, finding that segregation and race hatred prevailed among Methodists, formed the African Methodist Episcopal Church in Philadelphia and the African Methodist Episcopal Zion Church in New York City early in the 19th century. Both churches established branches throughout the North. Separate African Episcopal, Lutheran, and Baptist churches soon followed.

The United States has been the birthplace of a number of new Christian sects. The Church of Jesus Christ of Latter-day Saints, organized in 1830 by Mormon religious leader Joseph Smith, has been successful in creating a lasting denominational presence and in influencing the development of the state of Utah. Others, such as the Millerites, who predicted the end of the world in 1844, have not lasted (see William Miller). Some of Miller’s former followers reinterpreted his doctrines and established the Seventh-day Adventist faith in the mid-19th century. In 1879, Mary Baker Eddy founded the Church of Christ, Scientist, and soon had congregations throughout the country. In the early 20th century, the Pentecostal movement developed. It is a localized, stricter fundamentalist faith that grew out of Baptist and Methodist churches, and is often organized around a charismatic preacher. Americans seeking solutions to spiritual problems have created smaller denominations.

Not all new religions were Christian. The major branches of Judaism—Reform, Conservative, and Orthodox—developed in the United States in the late 19th and 20th centuries in response to the social and political conditions that Jews faced in America. Helena Petrovna Blavatsky, known as Madame Blavatsky, helped found a spiritualist group in the 1870s called the Theosophical Society. The Nation of Islam, a black Muslim group, was founded in the 1930s in reaction to perceived lingering prejudices of Christianity, and was led for more than 40 years by Elijah Muhammad. It became a political force in the 1960s, rejecting the passive resistance strategy of Martin Luther King, Jr., and advocating a more aggressive assertion of African American equality that did not rule out violence.

Influence of Religion

The beginning of the 20th century saw the development of Fundamentalism, a conservative Protestant movement that crosses many denominational lines and emphasizes a literal interpretation of the Bible. Not as extreme as the Pentecostal movement, Fundamentalism became widely practiced across a Bible Belt that stretched from the upper South, through the southern plains, and into parts of California.

One result of the Fundamentalist movement was a series of state laws in the 1920s banning the teaching of the theory of evolution. Fundamentalists saw this theory as contrary to a literal reading of the biblical account of creation. These laws led to the highly publicized Scopes trial in 1925, in which the state of Tennessee prosecuted biology teacher John Scopes for teaching evolution. Scopes was convicted and fined $100 (the state supreme court later reversed the ruling). The negative public response to the creationist point of view helped weaken Fundamentalist influence and promoted a more secular, scientific curriculum in many of the nation’s schools.

Perhaps the high point of religious influence on American society and government came with the prohibition, or temperance, movement that gained popularity in the last half of the 19th century. Church meetings that rallied against the evil effects of drunkenness sometimes led parishioners to march to saloons, which they attempted to close through prayer or violence. The movement led to the formation of the Anti-Saloon League of America, which endorsed political candidates and helped pass state laws banning saloons. In 1919 the league, along with the Woman’s Christian Temperance Union, succeeded in winning ratification of the 18th Amendment to the Constitution, which banned the manufacture, sale, or transportation of alcohol, and passage of the Volstead Act, a federal law that took effect in 1920 to enforce the amendment. Americans eventually became disillusioned with the law because federal enforcement tactics sometimes trampled on civil liberties, and because Prohibition fed the growth of organized crime and political corruption. Additionally, consumption of alcohol did not diminish; among some groups, especially women, consumption actually increased. The amendment was repealed in 1933.

The speakeasies, nightclubs, cocktails, and portable flasks of liquor that had become popular during Prohibition promoted a culture that rejected puritanical ideas. This freethinking culture was made even more glamorous in the early 20th century by the emerging motion picture industry. Although conservative religious groups were able to establish censorship standards in film, the movies and the private lives of movie stars promoted the acceptability of sexual freedom, easy divorce, and self-indulgence.

After World War II, religion was influential in American society in a variety of ways. When the Soviet Union became identified with “godless communism” during the Cold War, many Americans saw the United States as a protector of religion. The phrase “under God” was added to the Pledge of Allegiance in the 1950s so that Americans would commit themselves at public events to living in “one nation, under God, indivisible, with liberty and justice for all.” The civil rights movement of the 1950s and 1960s drew its leadership from black religious groups, including the Southern Christian Leadership Conference. The Nation of Islam, a black religious group, promoted a more radical black separatist movement. Liberal, white congregations played supporting roles in the drive for racial equality.

Many churches were active in the movement for peace during the Vietnam War (1959-1975), and religious groups took strong positions on whether abortion should be legal. Also during the 1960s, Roman Catholic activists and liberal Protestant groups worked for integration, workers’ rights, and peace.

During the 1950s the Beat movement sparked an interest in Eastern religions, including Hinduism, Daoism, and Zen Buddhism, that continued into the 1960s (see Beat Generation). A small number of Americans joined ashrams (religious communities) and other alternative religious groups. Meditation and yoga were widely practiced. These relaxation techniques, as well as acupuncture, have become increasingly valuable parts of modern medical practice.

The influence of socialist ideas among college students in the 1960s promoted antireligious viewpoints and lifestyles vastly different from those extolled by religious conservatives. These students promoted women’s rights, gay rights, legalized contraception and abortion, moderate drug use, and alternative living arrangements. They contributed to advances in many of these movements, although their most radical lifestyle experiments did not survive the early 1970s. In response to the dominance of these secular ideals on college campuses, conservatives organized the Campus Crusade for Christ, which became a training ground for conservative politicians who emerged in the 1980s and 1990s.

In the 1980s and early 1990s, televangelists, Fundamentalist ministers who preach on television shows, began to influence American politics. They were generally opposed to abortion (and sometimes contraception), to sexual freedom and easy divorce, to single parenthood, and to high taxes supporting social programs. They were in favor of traditional family structures and a strong anticommunist foreign policy. Their conservative messages and political endorsements helped elect Republican candidates. Despite these efforts, however, the political influence of religious movements had diminished by the end of the 20th century.

Although religion has been influential, the United States remains a secular society rooted in the rational Enlightenment ideals of tolerance, liberty, and individualism. The media and schools generally steer clear of religious issues, and religious toleration and freedom of expression remain widely held values that transcend the multiplicity of beliefs and values.

Religious Discrimination

Although religious toleration is a cornerstone of American society, religious discrimination has also been a part of America’s history. Most Americans, from early colonists to members of the Bureau of Indian Affairs in the 20th century, have viewed Native American spiritual beliefs as superstition. Even the most well-intentioned of American policy makers sought to replace traditional native beliefs with Christianity by breaking up native families, enforcing the use of English, and educating children in boarding schools dedicated to Christianization and Americanization.

European immigrants also sometimes faced religious intolerance. Roman Catholics suffered from popular prejudice, which turned violent in the 1830s and lasted through the 1850s. Americans feared that the hierarchical structure of the Roman Catholic Church was incompatible with democracy. Many felt that separate parochial schools meant that Roman Catholics did not want to become Americans. Irish Catholics were thought to be lazy and prone to heavy drinking. At its peak, the nativist movement—which opposed foreigners in the United States—called for an end to Catholic immigration, opposed citizenship for Catholic residents, and insisted that Catholic students be required to read the Protestant Bible in public schools. The nativist American Party, popularly called the Know-Nothings because of the secrecy of its members, won a number of local elections in the early 1850s, but disbanded as antislavery issues came to dominate Northern politics.

In the early part of the 20th century, the Ku Klux Klan sought a Protestant, all-white America. The Klan was a white supremacist organization first formed in the 1860s. It was reorganized by racists inspired by the popular movie The Birth of a Nation (1915), which romanticized Klansmen as the protectors of pure, white womanhood. The Klan preached an antiblack, anti-Catholic, anti-Semitic message and sometimes used violence to enforce it. Burning crosses, setting fires, and beating, raping, and murdering innocent people were among the tactics used. Many Protestant congregations in the South and in the Midwest supported the Klan. The Klan attracted primarily farmers and residents of small towns who feared the diversity of the nation’s large cities. Anti-Catholic feelings reappeared during the unsuccessful presidential campaign of Alfred E. Smith in 1928 and in the 1960 presidential campaign, in which John F. Kennedy became the first Roman Catholic president.

Jews were subjected to anti-Semitic attacks and discriminatory legislation and practices from the late 19th century into the 1960s. The Ku Klux Klan promoted anti-Semitic beliefs, there was an anti-Semitic strain in the isolationism of the 1920s and 1930s, and the popular radio sermons of Father Charles Coughlin, a Roman Catholic priest, spread paranoid fears of Jewish conspiracies against Christians. President Franklin D. Roosevelt was the target of anti-Semitic attacks, despite the fact that he was not a Jew. Both the fight against fascism during World War II and the civil rights movement of the 1950s and 1960s helped to diminish anti-Semitism in the United States. Court decisions and civil rights legislation removed the last anti-Jewish quotas on college admissions, ended discrimination in corporate hiring, and banned restrictive covenants on real estate purchases. Far right-wing movements at the end of the 20th century have revived irrational fears of Jewish plots and promoted anti-Semitic statements, as have some African American separatist groups. However, right-wing militias and Klan groups have paid less attention to American Jews than to African Americans, homosexuals, and conspiracies allegedly funded by the federal government.

In the 1990s, the demise of the Soviet Union as the “evil empire” (as President Ronald Reagan named it in 1983) left a void in American political life that has been partially filled by a sporadic antagonism towards certain Muslim nations. Foreign policy crises have coincided with an influx of Muslims into the United States and popular revulsion at the antiwhite rhetoric of the American Nation of Islam. An oil crisis in the 1970s, created when Arab oil-producing nations sharply raised prices, triggered anti-Arab and anti-Muslim diatribes in the United States. International crises in the Middle East during the 1980s sustained these sentiments. There were outbursts of anti-Muslim feeling during the Persian Gulf War (1990-1991), and many Muslims felt the war was an attack on Islam rather than a dispute with the government of Iraq (see Arab Americans). This sense that U.S. policy was attacking the Islamic faith was a factor when the World Trade Center in New York City was bombed in 1993 and destroyed in 2001 (see September 11 Attacks).

American ideals of religious toleration and freedom of conscience have not always been endorsed in particular cases and in certain periods of American history, but the goal of inclusiveness and liberty remains an important theme in the development of the United States.

FAMILY LIFE

There has never been a typical or single traditional family form in the United States. In the early 21st century, the ideal family is a vehicle for self-fulfillment and emotional satisfaction. The family in early America, by contrast, had different functions: it was a producer of food, clothing, and shelter. There has always been a gap between the ideal family and the more complicated reality of family relationships. While Americans value their families and resent outside interference, they have also been willing to intervene in the family lives of those who seem outside the American ideal.

Native Americans had a variety of family organizations, including the nuclear family (two adults and their children), extended households with near relatives, clans, and other forms of kinship. Family organizations might be matrilineal, where ancestry is traced through the mother’s line, or patrilineal, where ancestry is traced through the father’s line. In general, Native Americans had a great deal of freedom in sexuality, in choosing marriage partners, and in remaining married. After conversion to Christianity, some of the variety in family forms decreased. In the early 20th century, the United States government broke up many Native American families and sent the children to boarding schools to become Americanized, a policy that was disastrous for those involved and was largely abandoned by the middle of the 20th century.

Colonial Families

During the 17th century and the first half of the 18th century, when Americans from European backgrounds spoke about family, they often referred to what we would call households—the people who happened to be living together. In addition to the husband, wife, and children, this could include servants, apprentices, and sometimes slaves. These earliest families were productive units, not sentimental, affectionate groupings. The family performed a number of functions that larger institutions now provide. The father, as head of the family, educated his sons, servants, and apprentices. Women instructed their daughters in how to run a household. Both husband and wife were responsible for the religious development of their household members. Primary responsibility for the order of society fell to the family, including supervising individuals, punishing minor offenses, and reporting major offenses to local officials. There was no other police force. Men and women provided basic health care, food, clothing, and entertainment. To fill all these roles, families expected obedience to the authority of master, father, mother, church, and state. Individualism was not valued. Everyone was expected to pull his or her weight in order for the family to survive.

Marriages were forged primarily for economic reasons, and only secondarily for companionship. Love, if it appeared at all, came after marriage, not before. Husband and wife labored together to sustain the family, but at quite separate tasks. Husbands worked in the fields, tended livestock, worked at a craft, or were merchants. Women often specialized in producing goods, such as dairy products, beer, or sausage, or they provided services like midwifery. They then traded these products or services with other women for their specialties. In the cities, women worked in shops, kept accounts, and assisted their husbands, who practiced a trade or engaged in commerce. Children assisted their parents from an early age. Everywhere, family, business, and social order were combined. Emotional satisfaction was not a function of the family.

While men and women both contributed to the success of the farm or family business, men had full legal authority over their families. Only men could hold positions in government, in the church, or in higher education. Women had no property or marital rights, except those their husbands granted, and fathers had custody of children in the rare cases of separation. Divorce was extremely rare and was illegal in many colonies. Some children, boys and girls, were sent at about age 12 to work as servants in other people’s houses to learn farming, a craft, commerce, or housework. Boys might also go to boarding schools and then to college or to sea, but most girls were not formally educated. The individuality of children was not recognized, and if one died, a later child was sometimes given the same name. The oldest son usually received more of the family’s property than his younger brothers. Daughters received even less, and generally only when they married. Life was hard, and caring parents made sure that their children were obedient, hardworking, and responsible.

Life for children in the colonial period could be difficult. Whipping and other forms of physical punishment were commonplace and sometimes mandated by law. Such punishment was considered a sign of parental love, as parents sought to wean their children from their natural tendency toward sin and corruption. Virtually all children saw a sibling die and suffered several bouts of serious illness themselves. From one-third to half of all children experienced the death of a parent, and the cruel stepmother or heartless stepfather was more than a fairy tale for many colonial children. Orphans were shipped out to relatives, or sometimes local authorities gave them to the lowest bidder—the person who promised tax officials to raise the child most cheaply. Even as adults, sons and especially daughters were expected to obey their parents. Sons were given considerable freedom in deciding whom to marry, but often daughters could only choose to turn down an offensive suitor selected by their father.

Life was harsh in the country and for the majority in the city. There were few social services to support the family. Although children were expected to honor their parents, there was no guarantee that adult children would support their elderly parents. Many parents wrote wills linking the children’s inheritance to the care the children provided their elderly parents. Servants and apprentices were often subjected to harsh beatings, coarse food, and deprivation. In addition, servants could not marry or leave the premises without their master’s permission. Slaves were treated even more harshly. The family was concerned with the maintenance of hierarchies and social order.

African American Families under Slavery

African family traditions, which varied according to national origin and religion, could not be replicated in the New World after Africans were forced into slavery. The slave trade was responsible for breaking up African families, and husbands, wives, and children were liable to be sold separately because U.S. law did not legally recognize their families.

Enslaved Americans were denied a secure family life. Because enslaved men and women were property and could not legally marry, a permanent family could not be a guaranteed part of an African American slave’s life. They had no right to live or stay together, no right to their own children, and it was common for slave parents and children to live apart. Parents could not protect their children from the will of the master, who could separate them at any time. About one-third of slave families suffered permanent separation caused by the sale of family members to distant regions. This might occur to punish some infraction of plantation rules, to make money, to settle an estate after a death in the owner’s family, or to pay back a debt.

For the majority of slaves, who lived on small plantations with only a few other enslaved people, marriage partners had to be found on other farms. Meetings between a husband and a wife could occur only with the permission of the husband’s owner. Children stayed with their mothers. Schooling was not an option for enslaved children, and in most states it was illegal to teach slaves to read and write. The most common reason for slaves to run away was to see family members, if only briefly, although slave women rarely took to the roads both because it was not safe for women to travel alone and because it was difficult to travel with young children. For enslaved people on large plantations, it might be possible to find a partner owned by the same master, although couples could be assigned to different parcels of land or different areas of the plantation, or even to the vacation or city homes of the owner. The Christmas holiday, the one break from work during the year for slaves, was anticipated with excitement because it allowed separated family members to meet and spend a week together. Despite the fragility of familial bonds under slavery, enslaved men and women considered themselves married, recognized their kin in the names they gave their children, looked after their relatives and friends in cases of separation, and protected each other as much as possible.

Slavery and servitude were virtually abolished between the 1770s and the 1830s in the Northern states. This meant that African Americans could legally establish families in the North. Black churches married couples, baptized their children, and recorded the new surnames that former slaves chose for themselves. Benevolent societies looked out for their members’ welfare. Slaves who escaped from slaveholding areas were sheltered and moved to safer locations. Mothers and fathers both worked so their children could become educated. Although African American families in the North faced discrimination and poverty, and worried about being kidnapped by slave catchers, they had hope of maintaining their family ties.

19th-Century Families

Only in the late 18th and early 19th centuries did ideas of affectionate marriages and loving, sentimental relations with children become dominant in American family life. These attitudes first took hold among the urban, educated wealthy and middle classes, and later spread to rural and poorer Americans. This change was due to the growth and increasing sophistication of the economy, which meant that economic issues became less pressing for families and production moved outside the home to specialized shops and factories. With more leisure time and greater physical comfort, people felt that happiness, rather than simple survival, was possible. English philosopher John Locke’s theory that human beings are born good, with their minds as blank slates, contrasted with traditional Christian beliefs that children were sinful by nature. If the blank-slate theory was correct, then goodness could be instilled in children by showering them with kindness and love and by shielding them from the bad things in this world.

Additionally, the psychological theory of sensibility, another 18th-century idea, argued that positive feelings such as friendship, happiness, sympathy, and empathy should be cultivated for a civil life of reason. By the 19th century, romanticism and sentimentality put even more emphasis on emotional attachment and the cultivation of feeling. New ideas about human equality and liberty undermined older notions of hierarchy and order. Americans applied the political ideal of “Life, Liberty, and the pursuit of Happiness” espoused in the Declaration of Independence to family life. Husbands were to rule, but with affection and with their wives’ interests at heart. Wives obeyed, not out of force, but out of love. Parents sought the affection of their children, not their economic contributions. This was the new ideal, but old habits died slowly. Authority, inequality, and violence declined but never entirely disappeared.

By the end of the 18th century and into the 19th century, marriage was undertaken for affection, not for economic reasons. Courtship became more elaborate and couples had more freedom. They attended dances, church socials, picnics, and concerts, and got to know one another well. After the wedding, couples went on honeymoons to have a romantic interlude before settling down to daily life. Raising children became the most important job a wife performed, and children were to be loved and sheltered. Physical punishment of children did not disappear, but it became more moderate and was combined with encouragement and rewards.

Servants, apprentices, and others gradually dropped out of the definition of family. Servants no longer slept within the same house as the family, and apprentices rented rooms elsewhere. By the 19th century, the nuclear family, consisting of a father and mother and their dependent children, had become the model. The ideal, loving family could be found in magazines, poems, and religious tracts. Novels promoted romantic courtship and warned readers of insincere fortune hunters or seducers when seeking a husband or wife. Love and sincerity were advocated. Still, economic considerations did not entirely disappear. Wealthy women married wealthy men; poorer men married poorer women.

The economic transformations of the Industrial Revolution in the 19th century brought about further changes in men’s and women’s roles. Work was less likely to be done in the home, as fewer and fewer Americans lived on farms, and men left the home to work in offices and factories. Men assumed sole responsibility for the financial support of the family, becoming the breadwinners, a term coined in the early 19th century. Married women were not supposed to work for wages, and were considered too pure and innocent to be out in the working world. Women were supposed to devote themselves to domestic duties, and children were seen as young innocents who needed a mother’s protection. Fathers had less and less to do with raising their children.

Although the 19th-century ideal held that married women were not supposed to work, women did contribute to the family’s well-being. Wealthy women planned formal dinners, balls, and other social gatherings that were crucial to their husbands’ political and business success. Middle-class women sewed for what they called pin money, small amounts that frequently balanced the family budget. Married women in the middle and working classes took in boarders, sold hot lunches or pastries to neighbors, and saved money by doing their own baking, brewing, gardening, and other chores. It was also common in middle- and working-class families for sons to be sent to school, while their teenage sisters supported this schooling by working in a factory, teaching in elementary schools, or taking in sewing. Such work was considered acceptable as long as it was either done in the house or by unmarried young women.

Many 19th-century American families did not fit the nuclear family ideal, in part because it was expensive to maintain. High housing costs meant more people than just the nuclear family often lived under one roof. Extended families, including grandparents and other relatives, were most numerous in the mid-19th century. Immigrants clung to traditional extended-family forms, and poorer families often included grandparents, grandchildren, and sometimes aunts and uncles in order to maximize sources of income and save on rent. Men, women, and children worked long hours for low wages in dirty, cramped surroundings in the sweatshops of major cities. Although the ideal woman was supposed to be pure, innocent, and domestic, most poor women had to work. Taking in boarders, such as young men and women working in local factories, was another way that families earned money, although they gave up family privacy.

Low wages during the early stages of the Industrial Revolution, in the first half of the 19th century, meant that even young children sometimes had to work instead of being sheltered at home. In the poorest families, and particularly among newer immigrants, children younger than 12 often worked in factories or sold newspapers and trinkets on the streets. School was a luxury for some poor families because they needed the children’s income. Because of this, illiteracy rates actually rose during the early stages of the Industrial Revolution, even though public schools were more widely available.

When husbands died or abandoned their families, women had no choice but to work, opening a shop if they had the capital or working in a sweatshop if they did not. Wages for women’s work were low, and prostitution, which offered more money, flourished in towns big and small. It was very difficult for a single mother or father to work and raise children, and children of single parents were often left at orphanages or simply abandoned to the streets. This was before Social Security, workers’ compensation, unemployment insurance, retirement funds, health insurance, and other private and public programs existed to aid families in times of crisis.

American families made a variety of compromises in the face of economic hardship. In many southern and eastern European immigrant families, where it was more important for married women to stay at home, children were withdrawn from school and sent to work so their mother could run the household. Among African Americans living in the North, educating their children was the most important family goal, so wives joined their husbands in the workforce to enable children to stay in school. In some families, men had total control over finances; in others, wives handled the husband’s paycheck. In some families, resources went to the eldest son, so he could make money and later support his parents and siblings. In other families, all boys were treated equally or all boys and girls were equal. Some families valued close ties and insisted that older children settle near their parents, while others sent their sons out West, to the cities, or simply on the road in hopes of a better future.

During the 19th century, the majority of Americans continued to live on farms where everyone in the family worked, even if it was in and around the house. Women on farms still worked as they had during colonial times, although by the 19th century, they were producing butter, eggs, cheese, and other goods to sell in the nearest city rather than to trade to neighbors. Sharecroppers and tenant farmers worked long and hard for only a fraction of their produce. School was out of the question for poor children in these circumstances. In the West, the difficulties of pioneering often meant that all members of the family worked. For most Americans, these alternate family arrangements were less than desirable. Most Americans sought the private, affectionate, comfortable family life with domestic wives, breadwinning husbands, and well-educated children.

The dominance of the family ideal was only one aspect of life in the 19th century. The constant emphasis on family, domesticity, and children could be confining, so men and women developed interests outside of the home. The 19th century was a great age of men-only organizations, and fraternal groups thrived. Taverns and barrooms provided a space for men to make political deals, secure jobs, and be entertained. Men formed literary and scientific societies, labor organizations, reform groups, Bible study groups, and sports leagues.

The 19th century was also a period of change for women. Married women in the 19th century, who had more education and fewer children than their predecessors, founded reform groups, debating societies, and literary associations. They involved themselves in school reform, health issues, women’s rights, temperance, child labor, and other public-policy issues. A few states in the West granted women full political rights. A women’s movement demanding equal rights, including the vote, gained strength after 1848. In the first half of the century, public education extended basic literacy to many poorer Americans, and in the second half of the century women’s high schools and colleges were founded, along with coeducational colleges in the Midwest and West. Women’s occupational choices began to expand into the new fields of nursing, secretarial work, department store clerking, and more, although work was something a young woman did only until she married. Women who wanted a career had to forgo marriage.

By the middle of the 19th century, many states had passed laws allowing women control over their possessions and wages. A few states allowed divorce on the grounds of physical abuse. New stereotypes appeared at the same time. In child custody cases, mothers, rather than fathers, were now given custody of their children because women were considered natural child rearers. This practice would persist until the late 20th century, when shared custody arrangements became common.

The rise of labor unions during the 19th century was instrumental in changing the nature of work and the shape of families in America (see Trade Unions in the United States). By the end of the century unions were demanding higher wages for men, so that they could provide the sole support for their families. The unions argued that women and children should refrain from paid labor rather than become unionized and press for higher wages. Behind these demands was the ideal of the breadwinner husband and the domestic wife. Unions also sought shorter workweeks to allow men to spend more time with their families.

20th-Century Families

The Progressive movement supported changes in social policy that would create more nuclear families. Progressives and trade unionists sought to limit women’s work and to outlaw child labor. They did this by attempting to close unhealthy sweatshops. They also promoted better housing so that families could have comfortable surroundings. The unions and Progressives were generally successful in gaining bans on child labor in Northern states, although many poor parents and businesses opposed these laws. Some of the poor and traditionalists resisted restrictions on child labor because they believed children needed work experience, not an education.

Rising wages for male workers, the absence of union protection for women workers, and mandatory education laws allowed, or forced, more Americans to realize the domestic ideal. These changes came later to the South, which was poorer and less industrialized. Retirement funds, savings banks, and pension plans meant that older Americans were less dependent on their children’s wages. The gradual development of workers’ compensation and unemployment insurance allowed families to survive even with the loss of the breadwinner’s income. Working-class and middle-class families began to look more alike in the early 20th century. Men went to work, while women stayed at home, and children attended school.

The Progressive movement also brought about the modern social work movement. Trained social workers intervened in families experiencing problems that threatened the well-being of family members and affected the community: physical abuse, drug or alcohol addiction, neglect, or abandonment. Social workers were often successful in protecting the family, although social workers were sometimes influenced by the common prejudices of the time. Married women in the early 20th century were discouraged from leaving abusive husbands because the prevailing belief was that a wife’s place was in the home.

Racism and prejudice also played a part in social policy. Single white girls who became pregnant were secretly sent to special homes and required to give up their babies for adoption so that they could return to their “real” lives. Black girls in the same circumstances were considered immoral and examples of the supposed inferiority of African Americans. They were sent home to rear their children by themselves; a few were forcibly sterilized.

The Great Depression and World War II brought a temporary shift in family structure. During the hard times of the 1930s, children once again had to work. Some were abandoned and wandered looking for work. Families doubled up to save on rent, and women took in boarders, worked as servants, ran hairdressing salons, baked goods, or sewed for extra money. Men, too, took to the roads to look for work, hoping their families would join them once they had obtained a steady paycheck. The domestic life was impossible for many, first because of economic hardship and later because of the war. Marriage and children were delayed, and buying a home was out of the question.

During World War II, for the first time, large numbers of married women took jobs. Because of the war effort and the number of men sent overseas, women were hired to perform jobs traditionally done by men. The popular image of Rosie the Riveter captures the novelty of women dressed in work clothes, engaging in skilled, industrial labor. Factories set up day-care centers to attract married women workers. Women drove cabs, moved into positions with more responsibility, and provided support services for all the major branches of the armed forces. Although women earned lower wages, received fewer promotions, and were among the first laid off, the domestic image of women created in the late 18th and early 19th centuries had changed. Married women were out of the house and earning their own money.

The year after World War II ended, both the marriage rate and the divorce rate soared. The marriage rate went from 12.2 per 1,000 people in 1945 to 16.4 in 1946. The divorce rate, which had been slowly increasing during the century, leaped from 3.5 to 4.3 per 1,000 people. One reason for the extraordinarily large number of divorces in 1946 was that couples who had married in haste before they were shipped overseas for the war found that they had little in common after three to five years apart. The divorce rate slowed after 1946, but by the 1950s was steadily increasing. While divorce was not uncommon before the war, most divorces were sought by recently married couples without children or by older couples with grown children. Once children arrived, couples felt obliged to stay together for the sake of the children, no matter how uncomfortable or violent the marriage. Increasingly after World War II, and especially by the 1960s, the presence of children did not hinder divorce. Parents came to believe that it was better to rear children in a less-stressful setting than to maintain the fiction of marital success. Child custody became a divisive issue in divorces, adversely affecting parents and children.

The end of the war also rapidly reduced the number of married women employed outside the home, as returning veterans sought work. Many of these women gradually returned to work, either because they had enjoyed working or because the family wanted the second income to buy a new home in the suburbs, a second automobile, a new television set, or other consumer goods that were now available. Some veterans took advantage of their military benefits to attend college while their wives worked.

More and more young women graduated from high school and went to college, instead of working to help support their families or to subsidize a brother’s education. As young men and women delayed work and substantial responsibility, a youth culture developed during and after World War II. High school students embraced separate fashions from their parents, new forms of music and dance, slang expressions, and sometimes freer attitudes toward sexuality, smoking, or drug use that created a generation gap between parents and children. Yet parents were anxious to provide their children with advantages that had not existed during the depression and war years.

The 1950s and 1960s produced a period of unparalleled prosperity in the United States. Factories were kept busy filling orders from a war-devastated world. White-collar jobs expanded, wages were high, mortgage and tuition money was available thanks to federal support, and goods were relatively cheap. This economic prosperity allowed more Americans to join the middle class. The ideal middle-class family was epitomized in the new medium of television through shows such as Father Knows Best and Ozzie and Harriet, in which fathers arrived home from work ready to solve any minor problem, mothers were always cheerful and loving, and children were socially and academically successful. These shows reflected the fact that a majority of Americans now owned their own home, a car, and a television, and were marrying earlier and having more children than earlier generations.

This idealized middle-class American family began to show cracks during the late 1950s and early 1960s. In response to the demands on men to create and support expensive domestic paradises, a mythical world of adventure and freedom eventually arose in popular culture. Movies about secret agents and Western gunslingers contrasted with the regimented suburban, corporate lifestyle of many men. The demands on women to be all things to all people—a sexy wife, a caring, selfless mother, a budget-minded shopper, a creative cook, and a neighborhood volunteer—and to find satisfaction in a shining kitchen floor often produced anxious feelings of dissatisfaction.

Concern grew over teenage delinquency and high pregnancy rates, as well as the perceived immorality of rock and roll, all of which were blamed on inadequate parenting, not on the difficulties inherent in the current standards of family life. The ideal suburban life could provide comfort and emotional fulfillment for parents and children, but the suburbs could also be places where young adults had too little to do, married women became isolated and self-sacrificing, and men were harried by the pressure of providing the consumer products of the “good life.” Children, who were pressured to succeed and to conform to middle-class ideals, became rebellious and created alternative cultures.

The emergence of Beat culture, the civil rights movement, and the antinuclear movement in the 1950s signaled a more organized and intellectually grounded rebelliousness that would bloom in the 1960s. The 1960s and early 1970s saw the emergence and expansion of movements dealing with black power, students’ rights, women’s rights, gay and lesbian rights, Native American rights, and environmental protection. Men and women also began experimenting with new gender roles that blurred traditional boundaries between masculine and feminine behaviors. In 1963 author Betty Friedan, in her book The Feminine Mystique, articulated women’s frustration with being only wives and mothers. The book helped revive the women’s rights movement in the 1960s and 1970s. Men took more interest in child rearing. Some men cultivated supposedly feminine attributes, such as nonviolence and noncompetitiveness. Women sought work, not just to earn money but to have careers. There were attempts to equalize the roles of husbands and wives, or to eliminate traditional marriage vows in favor of personal and/or sexual freedom. By the 1970s, gay and lesbian individuals publicly asserted their right to engage in same-sex unions. These unions were sometimes based on traditional marriage models, including marriage vows and children, and sometimes on newer models that involved more autonomy. Communal alternatives to traditional marriage, as well as open marriages and same-sex partnerships and families, challenged the ideals of the 1950s by rejecting the materialism of suburban lifestyles and by experimenting with nonnuclear family forms.

Throughout the 1970s the buoyant economic basis of the 1950s middle-class family gradually eroded. The end of the Vietnam War in 1975 reduced the military spending that had kept employment and wage levels high. Women moved into the job market in unprecedented numbers to pursue careers and to maintain the family’s standard of living when the husband’s income failed to keep up with inflation. As more women entered the labor force, they began removing some of the barriers to advancement through court cases and concerted pressure on institutions and businesses. Some women undertook careers in medicine, law, politics, management, and higher education that had been dominated by men. Traditionally female jobs were sometimes reconfigured so that, for example, some secretaries became administrative assistants, and some nurses became nurse practitioners, midwives, or other specialists.

These changes sometimes shifted the balance of power within families. Some husbands felt inadequate because they could no longer maintain the role of sole breadwinner for their families. Some wives felt that they had to be supermoms, continuing to cook, clean, and volunteer for local activities, while holding down a full-time job. These stresses contributed to rising divorce rates and may have discouraged some couples from seeking permanent unions. Non-marital unions (couples living together but not married) and out-of-wedlock births soared, particularly among the most financially pressured Americans, although movie and music stars were the most visible of those rejecting traditional marriages and childrearing arrangements.

The nuclear family felt even more pressure as companies fled older cities, factories shut down or moved overseas, and service work replaced highly paid, unionized, skilled factory jobs. Young Americans of marriageable age could not count on secure, well-paid employment in the future and became reluctant to make permanent plans. Education became more essential, but fewer students could count on the GI Bill to underwrite expenses. Many high school and college students began working after classes, reducing the amount of time spent reading and studying, which contributed to declines in educational achievement. At the same time, religious conservatives began calling for a return to the traditional values of earlier times—families with a strong father figure, a domestic mother, and obedient children. These calls, however, did not change many lives.

Current Trends in Family Life

In 1998 there were 2,256,000 marriages in the United States, a marriage rate of 8.4 per 1,000 people, down from 10.6 per 1,000 in 1980. The same year saw 1,135,000 divorces, a rate of 4.2 per 1,000 people, and one estimate holds that 50.3 percent of marriages will end in divorce. Divorce rates rose through most of the 20th century; when national record keeping began in 1920, the divorce rate was about a third of the 1995 rate. Although the divorce rate has been declining since it peaked in the early 1980s, the United States still has one of the highest divorce rates in the world. The majority of divorced people eventually remarry.
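These per-1,000 rates are simply the annual counts divided by the total population and multiplied by 1,000. As a rough consistency check (the 1998 resident population is not given here; a figure of roughly 270 million is assumed, inferred from the counts and rates above):

$$
\text{rate per 1,000} = \frac{\text{events in a year}}{\text{population}} \times 1000, \qquad
\frac{2{,}256{,}000}{270{,}000{,}000} \times 1000 \approx 8.4, \qquad
\frac{1{,}135{,}000}{270{,}000{,}000} \times 1000 \approx 4.2.
$$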

The economy at the end of the 20th century offered most workers less security and more competition, a situation that does not favor investment in marriage, particularly among the young. People, on average, are delaying marriage. The middle class peaked during the economic prosperity that lasted from 1947 to 1973; afterward, the majority of Americans faced shrinking paychecks. Housing, utilities, and health care consumed 35 percent of the average family’s paycheck in 1984 and 38 percent in 2000, and the share of people under age 35 who owned their own homes fell from 44 percent in 1980 to 41 percent in 2000.

Increased educational requirements and job training, economic insecurity, difficulties finding the “perfect mate,” and the attractions of a carefree life are among the reasons for delaying marriage. In 2000 the average age at which Americans married was 26.8 for men and 25.1 for women, matching the 19th-century marriage age for men and surpassing it for women. Virtually all people eventually marry: by age 65, about 95 percent of men and women have married. Americans delay marriage, seek divorces, and remarry because they expect marriage to be loving, supportive, and equitable. If a marriage is disappointing, they often seek the perfect partner in another relationship.

The strict gender roles that once confined men and women to certain activities are disappearing. Many women work and control their own wages, sources of credit, savings, and investments. Many men are enjoying closer relationships with their children as well as with their wives. The amount of time that men contribute to housework has been increasing for decades, although married women remain more heavily engaged in housework and child care.

Families are having fewer children than ever, but children are often staying home longer. The high cost of college education keeps many older children at home. Census takers at the end of the century noticed what they call a boomerang effect, in which adult children leave home and later return. High rents and low entry-level wages, divorce, single parenthood, and their parents’ higher standard of living are among the factors encouraging grown children to return home. Parents often welcome the companionship and assistance of their grown children.

Americans are responsive to the phrase “family values” because they appreciate the ties of kinship and the continuity of family tradition in a society that is rapidly changing and often isolating. Shared activities and shared memories continue to be important at the beginning of the 21st century. Even the architecture of new housing reflects family cooperation: kitchens are enlarged to accommodate the activities of husbands, wives, and children, and the family room is built as part of, or next to, the kitchen. The decoration of houses and apartments often makes the main room a shrine to family portraits and souvenirs.

Women are having fewer children, yet many children are being born outside of marriage; in 2000 that amounted to 1,345,000 children. The share of children under 18 years of age living with two parents decreased from 88 percent in 1960 to 68 percent in 1997, and child poverty rates have risen; by 2000, some 20 percent of children were living in poverty.

In 1997, 24 percent of all children lived with their mothers only, substantially higher than the 8 percent who did in 1960, reflecting both the increase in single motherhood and the rising divorce rate. Because working women still earn substantially less than their male counterparts, and are less likely to be promoted, a rise in female-headed households means that more children are being raised in poverty. A smaller share of children lived with their fathers only, but that share has also grown: the 1 percent of children who lived with their fathers only in 1960 had quadrupled to 4 percent by 1997. Another 4 percent lived elsewhere, with grandparents or other relatives, and a large number of American children, 815,000, lived with nonrelatives in 1997, mostly in foster care. In 2000, 83 percent of children living with a single parent lived with their mothers and 17 percent with their fathers. While the majority of children still live with two parents, that percentage has been shrinking.

Since the late 18th century, families have become more child centered, a trend that peaked in the 1950s and 1960s. In the last decades of the 20th century, adults reported high levels of satisfaction with their family relationships, but children sometimes received too little attention and too little of a wealthy nation’s resources. There is evidence of anxiety, depression, and anger as some children are shuffled from place to place and from relationship to relationship, fought over in custody battles, and left on their own while their parents work. The problems that some children experience at home are brought to school and affect the quality of education. Social work and psychological counseling are now necessary adjuncts to schools from the preschool level through college. Violence is a problem in the schools as well as on the streets, at a level that sets the United States apart from other industrialized countries.

The safety net for families and community support for parents and children have been rolled back at the beginning of the 21st century. The United States compares unfavorably with other developed nations on educational standards, social welfare programs, infant mortality, marriage rates, births within marriage, public safety, and other measures of family well-being. Crime, violence, drug abuse, and homelessness arise from these conditions and further weaken existing families. Some of the problems with family life come not from a rejection of the family or from stresses on the family, but from the high and idealistic expectations that Americans place on their marriages, sexual relationships, and parent-child relationships. Many Americans hope for a perfect spouse and a perfect family and will experiment until they find satisfying lives for themselves. The cost may be tenuous relationships.

These tenuous family relationships are not entirely new. In the 17th and 18th centuries families were similarly unstable, because of high death rates rather than divorce, and children were raised in as wide a variety of situations then as now. Marriages are more fragile today, but some family relationships have strengthened over time. Mothers have assumed more responsibility for the economic as well as the domestic care of their children. Some fathers have taken primary responsibility for rearing their children. Grown men and women can often count on parental support, and grandparents frequently step in to raise their grandchildren. Surveys show that the majority of adults are happy with the choices they have made and do not regret single parenthood or nonmarital unions. Many children reared by single parents, grandparents, foster parents, or adoptive parents thrive; others suffer from a lack of adult attention and supervision, from the instability of their home lives, and from feelings of rejection.

Although there is concern about these social changes, few would want to return to the days when women were expected to stay in abusive marriages or fathers were routinely denied custody of their children. The majority of Americans accept new attitudes toward sexual expression, birth control, abortion, divorce, and child custody, although many personally view homosexuality as immoral, have mixed feelings about abortion, and want to make divorce more difficult to obtain. Both liberals and conservatives agree that there are hopeful and troubling aspects of the American family at the beginning of the 21st century. The family is not dead, but it exhibits the same plurality of interests, hopes, and troubles that the American people themselves exhibit.

MORE INFORMATION

This is one of seven major articles that together provide a comprehensive discussion of the United States of America. For more information on the United States, please see the other six major articles: United States (Overview), United States (Geography), United States (Culture), United States (Economy), United States (Government), and United States (History).
