When Japan surrendered on 14 August 1945, ending World War II, Americans celebrated wildly but also looked ahead in uneasiness. What would the postwar era bring? Many feared a return to conditions of the Great Depression as the war-induced economic boom ended. In fact, the postwar economy soared, producing a level of material abundance unequaled in American history. Millions of citizens acquired new homes in the suburbs; purchased new cars, appliances, and television sets; and spent freely on leisure-time activities.
The economic boom affected all aspects of the American experience in these years. Politically, the twenty-year reign of the Democratic Party--the party of Franklin Roosevelt and the New Deal--ended in 1952 as voters gave the Republicans control of Congress and elected the war hero Dwight Eisenhower as president. The return of prosperity had a profound social and cultural impact as well. Some spoke of postwar America as a "consumer culture"--a culture in which the production, marketing, and acquisition of the material symbols of the good life became the central reality shaping society and its values. Conservative social values prevailed as a newly affluent but uneasy middle class resisted real or imagined threats to the status quo.
But the security, happiness, and tranquillity that such abundance promised eluded the nation in these years. Not only did large pockets of poverty and social distress persist amid the general abundance, but the prospering middle class itself--particularly women--experienced stress and tensions as it tried to live up to a social ideal emphasizing domesticity and devotion to family. Cultural critics, intellectuals, and alienated youth probed the flaws and fault lines beneath the smooth facade of Eisenhower prosperity; racial segregation in the South increasingly emerged as a compelling social issue; and anxieties induced by the cold war and the nuclear arms race mocked the era's surface placidity. American society between 1945 and 1960 thus presents a complex and paradoxical picture of dramatic advances in material well-being uneasily coexisting with severe but only half-acknowledged social problems and cultural strains.
The war had brought major social changes to the home front. As millions of men (and many women) entered the military, over six million new women workers had poured into the war plants at home. By 1945, 37 percent of all American women were in the labor force. The war also spurred the cityward migration of Americans that had been under way for decades, including the movement of southern rural blacks to the industrial centers of the North. The black population in the South declined by 1.6 million between 1940 and 1950, while that in the urban North increased correspondingly. In addition, over 935,000 blacks donned uniforms in World War II, many for the first time experiencing life outside the communities where they had grown up. These and other wartime social trends affected postwar life in important ways.
The immediate challenge in 1945, however, was the economic transition from war to peace. As military spending plummeted from $75 billion in 1945 to $43 billion in 1946, and as the Truman administration, under intense political pressure, rapidly demobilized the twelve million men and women in uniform, many observers foresaw massive unemployment and a return to the grim conditions of the 1930s. But in fact the burgeoning economy easily absorbed the returning GIs. A short postwar downturn soon ended, and by the end of 1946 the boom was under way. Economic output in the United States, already at an all-time high of $200 billion in 1946, surged to $318 billion by 1950.
War research contributed to the postwar boom, as developments in electronics, pharmaceuticals, chemicals, aeronautics, atomic energy, and other fields not only made postwar America the world's research center but also sent the economy into overdrive. Electronics sales surged to over $11 billion by 1960; plastics and synthetics, little known before the war, proved to have myriad civilian applications.
The industrial know-how that had produced the planes and ships which helped win the war had direct peacetime applications. So, too, did the backlog of savings resulting from the wartime combination of high wages and scarcity of consumer goods. Americans in August 1945 had some $140 billion in bank savings and war bonds, and this accumulated reserve soon surged into the economy, fueling the postwar demand for consumer goods. The devastation abroad that was part of the war's legacy spurred the economy as well, as American factories, unscathed by bombs, produced the goods for export that could not be made elsewhere. For a time in the late 1940s, one-third of all the world's manufactured goods came from the United States.
The Servicemen's Readjustment Act of 1944 (the so-called GI Bill of Rights), providing educational benefits and business loans to veterans, facilitated the demobilization process. Taking advantage of the law's educational provisions, a tide of 2.3 million veterans surged onto the nation's college campuses between 1945 and 1950.
Though employment continued at a high level as factories converted to peacetime production, most women war workers lost their jobs. Some returned voluntarily to domestic life, but many made the transition unwillingly. A 1944 study by the U.S. Women's Bureau showed that 80 percent of the women who had worked throughout the war hoped to remain employed after the war's end. Nevertheless, women workers were fired at a far higher rate than their male counterparts. In the aircraft industry, for example, where women constituted 39 percent of the wartime work force, they made up 89 percent of those laid off with the coming of peace.
Many black veterans chose to remain in the cities of the North, as did many blacks who had come north seeking factory work. Thus the war speeded up the black migration cityward that had been under way for decades, further shaping the social contours of late-twentieth-century American society.
Statistics suggest the magnitude of the postwar boom, though they cannot convey its full social and cultural meaning. The gross national product (GNP) rose from $212 billion in 1945 to $503 billion in 1960. Though the population grew from 140 million to 181 million in this period, the per capita GNP rose even faster. Led by surging automobile sales--fifty-eight million new cars (most of them domestically produced) rolled out of dealer showrooms in the 1950s--production records fell on all sides. Six million refrigerators, fifteen million radios--the list goes on and on. Enticed by a torrent of advertising and by easy credit (the first credit card, Diners' Club, appeared in 1950; by 1960 Sears Roebuck alone had issued ten million plastic cards), Americans snapped up consumer goods at a mind-numbing clip. Despite several short recessions, the worst in 1957-1958, the economy hummed through most of the period. An upsurge in cold war military spending and foreign aid (much of it in the form of credits for acquiring American goods and military hardware) stimulated the economy. From 1945 to 1960 American economic and military aid abroad totaled $78 billion, and by the mid-1950s defense spending represented fully 10 percent of the GNP. The economist John Kenneth Galbraith captured the spirit of the era in the title of his 1958 book, The Affluent Society. While lamenting that much of Americans' new prosperity was going for private consumption rather than for society's common needs, Galbraith readily conceded the reality of that unprecedented prosperity.
Though it was not much noted at the time, low energy costs played a critical role in this economic boom. All forms of power, including oil, were cheap and abundant. Domestic oil production increased by nearly 50 percent between 1945 and 1960, and annual oil imports rose from 74 million to 371 million barrels. The typical Detroit car of the 1950s--the gas-guzzling, chrome-laden behemoth with soaring tailfins ridiculed by the critic John Keats in a book called The Insolent Chariots (1958)--typified the era's heedless attitude toward energy. At the bright midday of 1950s prosperity, fossil-fuel shortages, soaring oil prices, and dwindling energy resources were no more than small clouds on a distant horizon.
So, too, were the massive trade deficits that would afflict the economy in the 1970s and 1980s. The specter of massive economic competition from recently defeated Japan, as well as from other nations, still lay in the future. The United States in these years consistently exported far more than it imported. In 1960, for example, America enjoyed a favorable trade balance of nearly six billion dollars.
New technologies that increased productivity represented another major pillar of the postwar boom. From 1945 to 1960, for example, the worker-hours required to manufacture an automobile fell by half. Development of the computer, another product of wartime research, moved forward rapidly after 1945. The invention of the transistor in 1948 by Bell Laboratories scientists offered an alternative to the bulky and unreliable vacuum tubes of the wartime computers. Though the era of the ubiquitous home computer still lay ahead, by 1960 some four thousand computers were at work in private industry, and hundreds more were in such government agencies as the Census Bureau, the Internal Revenue Service, and, above all, the Department of Defense.
But with increased reliance on technological innovation and automated production, the work force changed. An industrial order once heavily dependent on manual labor now increasingly relied on college-trained researchers, engineers, managers, and service personnel. While the percentage of blue-collar workers in the labor force dropped slightly from 1940 to 1960, the proportion of white-collar workers increased from 31 to 42 percent. This trend would intensify in the years ahead.
As the blue-collar ranks thinned and workers shared in the general prosperity, the union movement of the 1930s lost momentum. In 1945 some 36 percent of the nonagricultural work force was unionized; by 1960 this figure had dropped to 31 percent. Restrictive federal labor legislation enacted by an increasingly conservative Congress, notably the 1947 Labor-Management Relations Act (the so-called Taft-Hartley Law), exacerbated this erosion of union power.
Increased productivity changed agriculture as well. With less labor required to meet the nation's food needs, the farm population (continuing a long trend) dropped from thirty million in 1940 to sixteen million in 1960. Blacks left the land at a particularly rapid rate. In 1940, 35 percent of blacks lived on farms; by 1960 only 8 percent did. In the same period, the size of the average farm increased from just over 200 acres to 330 acres, as the small family farm long idealized in song and story gave way to large-scale, mechanized agribusinesses. A government crop-subsidy program favoring large commercial producers hastened this consolidation process. The increase in farm output also reflected the growing use of chemicals and pesticides. With the publication of Rachel Carson's Silent Spring (1962), documenting the dangerous ecological effects of pesticides like DDT, Americans began to realize some of the long-term costs of surging agricultural productivity. But, as in many other areas, the underside of abundance went largely unrecognized in the 1950s.
Increased productivity and a rising GNP translated into more income for millions of Americans. In constant dollars, the median U.S. family income rose from about $4,300 to nearly $6,000 in the 1950s. The average inflation-adjusted hourly wage for factory workers rose from $1.78 in 1945 to $2.61 in 1965. Overall, the real income of the average worker grew by more than a third from 1945 to 1960. Bigger paychecks meant greater buying power, and this helped keep the boom going.
Statistics like these underlay the middle class's political and social conservatism in these years. Aware of their own improving situation in contrast with the Great Depression of the 1930s and the disrupted war years, millions of Americans seemed determined to uphold the status quo against alien ideologies, domestic critics, or "subversive elements" that in any way challenged a system that was treating them so well.
A Nation on the Move
Responding to the lure of jobs and eager to enjoy the fruits of prosperity, Americans moved in record numbers in the 1950s. Since the early nineteenth century, of course, acute social observers had commented on Americans' penchant for uprooting themselves and moving on, but in the postwar era this fondness for geographic mobility was especially apparent. California's population grew by five million in the 1950s as Americans from the East poured into the fabled land of sunshine and orange groves. The population of Orange County, south of Los Angeles, doubled in size during the 1940s and tripled in the 1950s. By 1963 California had become the most populous state in the union.
A monumental program of interstate highway construction facilitated this mobility. The Interstate Highway Act of 1956, a major legislative achievement of the Eisenhower era, earmarked some $32 billion for interstate highway construction over the next thirteen years. As ribbons of concrete and asphalt unrolled across the land, bisecting old city neighborhoods and chewing up thousands of acres of farmland and rolling prairie, motels and fast-food chains followed in their wake. In the early 1930s Franklin Roosevelt had promoted national unity by summoning all Americans to join in the struggle against the Depression; during World War II foreign menaces supplied the glue of social cohesion. In the postwar era, together with a new external foe--the Soviet Union--and centralized mass media, the complex web of interstate highways helped sustain a cohesive--and standardized--American national identity.
Millions of Americans in the postwar years, particularly young couples starting families, moved to the suburbs. A severe housing shortage just after the war eased in the late 1940s as builders like Abraham Levitt and Sons of Long Island bulldozed farms and fields on the urban periphery and with lightning speed erected thousands of suburban tract houses, many of cookie-cutter similarity. Utilizing mass-production techniques they had learned building housing for wartime naval workers in Norfolk, Virginia (and employing unskilled, nonunion labor), the Levitts mastered the technique of transforming open land into streets, lots, and inexpensive houses in a few weeks' time. Other builders across the nation emulated their approach, throwing up suburban developments with sylvan names like Oak Hill and Pine Glen. Thirteen million new homes sprang up in the 1950s; 1955 alone saw a record 1.3 million housing starts. From crowded city apartments and rural towns, eighteen million citizens poured into the suburbs in this decade. By 1960 as many Americans lived in suburbs as in cities.
Together with the low cost of mass-produced housing (the Levitts' homes sold for as little as $6,900 including refrigerator, range, and washing machine), federal policy encouraged suburban growth--and the urban decline that it precipitated. While billions in public monies were spent on forty-one thousand miles of interstate highways in these years, for example, a mere pittance went to urban mass transit. While the Federal Housing Administration (established in the New Deal era) and the Veterans Administration subsidized low-cost mortgages on new homes, urban renters or renovators enjoyed no such benefits. Similarly, homeowners could deduct mortgage interest payments for federal income tax purposes, while the urban renter had no such tax break. The rise of the suburb and the decline of the city did not just happen; it resulted from a series of specific public policy choices.
Suburbia and Television: Symbols of an Era
The burgeoning suburbs placed an indelible stamp on American life in the 1950s. The move to a spanking new suburb was not merely a geographic transition of a few miles; it had social, cultural, and spiritual implications as well. Many who made this move left behind rural communities or big-city immigrant enclaves with strong ties of family, church, and ethnic group to enter the anonymity of bedroom communities where total strangers found themselves next-door neighbors, sharing recipes, lawn mowers, baby-sitters, and opinions. In such a social setting, the premium was on tolerance, adaptability, "fitting in," and avoiding the idiosyncratic or the controversial--traits that some observers denounced as bland and conformist. In books like John Keats's The Crack in the Picture Window (1957), cultural critics wrote harshly of the intellectual sterility and tepid social uniformity supposedly fostered by suburban living.
The new suburbanites themselves, however, often spoke enthusiastically about the liberating experience of moving from a cramped city apartment or tenement, often shared with parents or in-laws, to one's own house on a (sometimes) tree-shaded suburban lot. A generation later, as the lure of suburbia palled, a new generation of young "urban pioneers" would return to the city to rehabilitate aging apartments and crumbling row houses, and take advantage of the city's rich mosaic of social, cultural, and culinary diversity. But in the 1950s the prevailing outlook differed markedly. For a generation seared by economic depression and war, and beset by the amorphous threats and alarms of the cold war era, the move to suburbia symbolized one's break with the past; one's hopes for the future; and, above all, one's desire for security. The cultural aridity and narrowing of social vision that suburban living entailed, while real enough, were doubtless exaggerated by some of the more extreme critics; in any event such drawbacks represented a trade-off millions were more than willing to make.
In a closely related phenomenon the level of religious activity surged in the postwar years. Church membership rose from 50 percent of the population in 1940 to 63 percent in 1960. Americans newly settled in suburbia particularly valued the church's role in promoting family togetherness and providing a source of social cohesion. As a slogan of the day put it: "The family that prays together stays together."
In former times religion had been seen as potentially divisive. Indeed, in the antebellum era Catholics and Protestants had battled in the streets of America's cities; as recently as the 1920s religious prejudice had torn the nation. Postwar America, by contrast, celebrated religion as a unifying social force. This meant emphasizing the most generalized themes common to all faiths and playing down sectarian differences. The religiosity of the period, therefore, while broad, was in many cases not very deep. Best-sellers like Norman Vincent Peale's The Power of Positive Thinking (1952) touted religion's value for mental health and success in life. Movies like The Robe (1953), The Ten Commandments (1956), and The Greatest Story Ever Told (1965) offered Hollywood stars in Technicolor biblical extravaganzas. Congress joined in the mood of civic piety, adding "under God" to the pledge of allegiance in 1954, and making the phrase "In God We Trust," which had appeared on the nation's coins since 1864, the country's official motto in 1956.
But the more theologically rigorous Evangelicalism of an earlier era was far from dead. The fiery Baptist revivalist Billy Graham, with his traditional message of sin and salvation, attracted thousands to his crusades. On local radio and television programs, and in evangelical churches across the nation, the older faith survived. In the 1970s it would break forth in a surge of growth and political activism. But in the 1950s climate of togetherness and social unity, differences were played down and commonalities were celebrated.
Indeed, what was true of religion was true of society as a whole. In contrast to the passionate and conflict-ridden 1930s, a spirit of moderation and cautious restraint, epitomized by the "Modern Republicanism" of the Eisenhower presidency, characterized the 1950s. Social harmony and cohesion represented high social and political objectives, even at the price of papering over obvious differences and blunting divisive issues.
A series of critics dissected postwar culture and social values. In The Lonely Crowd (1950) the sociologist David Riesman contrasted the era's "other-directed" style of interpersonal relations, by which the individual modifies his or her behavior according to the feedback he or she gets from others, with the "tradition-directed" and "inner-directed" character types of earlier generations. In White Collar (1951) the radical Columbia University sociologist C. Wright Mills harshly attacked the blandness, conformity, and opportunism of the rapidly growing white-collar class. William H. Whyte's The Organization Man (1956), originally serialized in Fortune magazine, took a skeptical look at the lives and values of corporate executives and their wives in an affluent Chicago suburb.
If suburbia offered both admirers and critics one central symbol of postwar America, television, the newest of the mass media, provided another. Though the 1939 New York World's Fair introduced Americans to television, World War II interrupted its commercial development, so that as late as 1946 it remained a novelty confined to a few thousand homes in the major cities. In the fifteen years that followed, however, it spread with lightning speed all over the nation. The first coast-to-coast hookup came in 1951. By 1960, 87 percent of all American households had at least one television set. The time the average American spent watching television grew from about four and a half hours a day in 1950 to more than five hours in 1960. In 1954 the Swanson Company introduced frozen "TV dinners," so that mealtime no longer need interfere with viewers' favorite shows. The most successful new magazine of the postwar years was TV Guide.
Television both reflected and intensified the suburban values and the obsession with consumer goods characteristic of 1950s culture. The programming that proved most popular symbolized the degree to which white, middle-class America in these years turned away from the unpleasant or the threatening to a realm of fantasy. After a brief period when TV shows like Kraft Television Theater (launched in 1947) offered serious dramas dealing with real social issues--an era sometimes called, with considerable hyperbole, television's "golden age"--Gresham's law was confirmed again as the medium increasingly purveyed superficial and escapist fare. Quiz shows offering contestants thousands of dollars or mountains of consumer goods as prizes symbolized the era's materialistic preoccupations. The ever-cheerful middle-class television families of "situation comedies" (sitcoms) like Father Knows Best, Leave It to Beaver, and I Love Lucy showed millions of suburban viewers, and those who looked to suburbia as the social ideal, a world they longed to believe was--or could be--their own: a world of strong, devoted parents and likable, well-scrubbed, wisecracking children inhabiting sparkling new houses and confronting only problems that could be resolved in thirty minutes. Exploiting suburbia's minor crises, they rarely challenged its underlying values or confronted the America that lay beyond its well-manicured lawns.
Above all, television provided a powerful new medium by which corporations could entice consumers with alluring images of the new, improved automobiles, refrigerators, detergents, cereals, cigarettes, and soaps available to them. American television evolved as a mechanism for delivering the largest possible audience to its business sponsors, and it fulfilled that function with phenomenal success. The share of corporate America's advertising budget spent on television rose from 3 percent in 1950 to 17 percent by 1965.
To be sure, television had its critics. Ray Bradbury in Fahrenheit 451 (1953) portrayed a nightmarish future in which all books are burned and the masses are anesthetized by the nonstop entertainment and propaganda fed them by the ruling elite on wall-sized television screens. Vance Packard in The Hidden Persuaders (1957) warned of advertisers' insidious influence in shaping not only American consumption but also American values. Newton Minow, chairman of the Federal Communications Commission, denounced TV as a "vast wasteland" in 1961 and challenged network executives to watch for twenty-four hours the fare they were foisting on the public. The acerbic comedian Fred Allen called television "chewing gum for the eyes."
But while the critics wrung their hands, television steadily became the most influential medium of both news and entertainment for most Americans. From the late 1940s to 1960, many newspapers ceased publication or lost subscribers; once-influential magazines like Life, Collier's, and The Saturday Evening Post vanished or fell on hard times; and movie attendance declined from ninety million to forty-five million. But with each passing year television strengthened its hold on the American mass mind.
Women in the Early Postwar Years
After fifteen years of crisis and upheaval at home and abroad, Americans of the 1945-1960 era, particularly the young middle-class couples moving to the suburbs, placed a high premium on material well-being, social stability, and family cohesion. The postwar emphasis on domesticity was reflected in the declining age at which women married; the average age for women at first marriage dropped by more than a year between 1940 and 1960, from 21.5 to 20.3. After decades of decline, the birthrate (the number of live births per one thousand women of childbearing age) rose from 80 in 1940 to 106 in 1950 to a 1957 peak of 123, when it again began a long slide downward.
The surging birthrate contributed to the economic prosperity of these years. Young couples moved to larger quarters and bought a wide range of products for their growing families; communities built schools and hired teachers to cope with the onslaught of pupils. The postwar "baby boom" generation would continue to affect American society as its members moved through their college years, young adulthood, and the successive stages of their lives.
All this had important implications for middle-class women and the prevailing view of their appropriate role. The culture extolled domesticity and what one women's magazine called "togetherness." "Rosie the Riveter," the female factory worker celebrated in a popular World War II song, gave way to maternal images of "the happy housewife." In Modern Woman: The Lost Sex (1947), two Freudian scholars, Marynia Farnham and Ferdinand Lundberg, blamed most of modern society's problems on women who had left the domestic sphere to compete with men. Dr. Benjamin Spock's best-selling Baby and Child Care, first published in 1946, similarly assumed that women's primary, if not only, role should be as homemaker and mother. Television shows deified the American family, amiably presided over by hardworking dad and apron-clad mom. Advertisers cultivated the image of housewives as the ultimate consumers, endlessly preoccupied with deciding among different brands of detergents or vacuum cleaners. Hollywood, which in the 1930s had portrayed stars like Joan Crawford, Joan Bennett, and Katharine Hepburn as feisty and independent (at least until the end of the movie, when they usually succumbed to stars like Cary Grant and Spencer Tracy), now typically celebrated the domestic virtues with saccharine stars like Debbie Reynolds and Doris Day. Even women's fashion reflected the new cultural norm, emphasizing tight waists; full, sharply defined bosoms; and full skirts featuring layer upon layer of crinoline petticoats.
How did women respond to cultural pressures intent on confining them to a narrow, restricted social role? On the one hand, feminist activism was nearly nonexistent. Old-line organizations such as the League of Women Voters continued their work, but at a low ebb. At the same time, other evidence suggests that many women did not placidly adapt to the niche society built for them. Despite the pressures aimed at confining women to the domestic sphere, the ranks of working women grew steadily after ebbing in the late 1940s. The year 1950 found 31 percent of American women in the labor force (up from 30 percent in 1947); by 1960 the figure stood at 35 percent. Even more significantly, the percentage of married women working outside the home rose from some 17 percent in 1940 to 32 percent in 1960. And the number of working women with children under age seventeen rose from forty million in 1950 to fifty-eight million in 1960. In a revealing poll of 1962, only 10 percent of the women surveyed wanted their daughters to have the same kind of life they had led.
But in the 1950s women's discontents remained largely unarticulated and did not find collective or ideological expression. Most women appear to have worked from economic necessity, or a desire to enhance their family's standard of living, rather than from an ideological commitment to a career. And the workplace remained highly stratified along gender lines, with women largely confined to such occupations as clerk, nurse, schoolteacher, and secretary, while medicine, law, and business management remained firmly male preserves.
Not until the political and cultural climate shifted in the early 1960s would women begin actively to resist the gender stereotyping so characteristic of 1950s social attitudes. In The Feminine Mystique (1963), a work that launched a new wave of feminist activism, Betty Friedan described the frustrations she and other white, middle-class young women had felt in the 1950s, and the narrowness of their lives as young wives and mothers. The pervasive cultural pressures preventing women from fully realizing their abilities, said Friedan, were "the problem that has no name." Certainly in the 1950s, amid the rush to the suburbs and the cultural celebration of domesticity and the family, the problem was rarely identified or discussed.
Social Tensions and Fears
While the economy boomed and consumerism pervaded the culture, anxiety and tensions belied the surface placidity of 1950s society. In this cold war era, fear of Communist expansion abroad and subversion at home, as well as of nuclear war, shaped American life in profound ways. As the long conflict with the Soviet Union took shape in 1946-1947, American society became increasingly obsessed with communism, disloyalty, and dissident opinion generally. These concerns intensified as the Soviet Union tightened its grip on Eastern Europe; the West responded with an anti-Soviet military alliance, the North Atlantic Treaty Organization (1949); a Communist government came to power in China (1949); and the United States fought a bloody and politically divisive war in Korea (1950-1953). This preoccupation helped mold the conservative political and social climate of the decade. As early as 1947 President Truman established a Loyalty Review Board to investigate government workers and dismiss any who presented "reasonable ground for belief in disloyalty."
Republican Senator Joseph R. McCarthy of Wisconsin capitalized most spectacularly on the era's anticommunist obsessions. Speaking in West Virginia in February 1950, McCarthy accused the Truman administration of coddling subversives and waved a paper that he said contained a list of Communists in government. Though he never produced proof, McCarthy over the next few years dominated the headlines, wrecked careers with wild charges, and fed the nation's paranoia with a stream of accusations and innuendos. At last in 1954 McCarthy overreached himself with charges that the Eisenhower administration and the U.S. Army were part of the Communist conspiracy. The televised "Army-McCarthy hearings" that spring, giving the American people the chance to see firsthand the senator's bullying tactics, hastened his downfall--and in the process demonstrated television's power. In December 1954 the Senate voted to censure McCarthy. His star fell rapidly thereafter, and he died in 1957 of complications related to alcoholism.
But though the term "McCarthyism" came to be applied to the antiradical political mood of the early 1950s, the paranoid climate was not confined to the actions of any one individual. The House Committee on Un-American Activities (HUAC), for example, conducted a series of highly publicized investigations of alleged Communist subversion in many realms of American life, including the Protestant clergy, university professors, atomic scientists, and Hollywood filmmakers. HUAC's 1948 investigation of journalist Whittaker Chambers's charges against a former State Department official, Alger Hiss, sent Hiss to jail on perjury charges and furthered the political career of an obscure young California congressman, Richard M. Nixon. In another case fraught with political overtones, Julius and Ethel Rosenberg of New York City were convicted and sentenced to death in 1951 for passing atomic secrets to the Soviets. Despite worldwide protests, some orchestrated by the Communist Party, they went to the electric chair in June 1953. These and less well-known cases contributed to a climate of uneasy apprehension in the 1950s, stifled dissent and controversy, and deepened the mood of caution and restraint. Movies like Invasion of the Body Snatchers (1956), in which sinister aliens take over the bodies of seemingly ordinary and innocuous citizens in a California town, epitomized the national aura of suspicion and foreboding despite the prevailing prosperity. The more extreme manifestations of the postwar "red scare" abated as the 1950s wore on, but they left a bitter legacy.
Fear of the Russians took many forms. In October 1957, the Soviets launched Sputnik I, the world's first orbiting satellite. Americans long confident of their nation's technological superiority suddenly found that supremacy under challenge. Frustration intensified when Vanguard, the rocket intended to launch the first U.S. satellite, exploded on the launching pad that December. The Eisenhower administration responded with massive increases in spending on missile development (which had military as well as space-exploration relevance) and in 1958 created the National Aeronautics and Space Administration (NASA) to oversee the nation's space program.
Sputnik also triggered a wave of concern about the alleged failure of the nation's education system. The American High School Today, a 1959 report by James B. Conant, former president of Harvard University, criticized the high schools for substituting undemanding courses for rigorous work in science, mathematics, and foreign languages. To improve performance in higher education, Congress in 1958 passed the National Defense Education Act (NDEA), allocating some $800 million for loans to undergraduates and graduate students and providing funds for improving instruction in science, mathematics, and foreign languages. Spurred by such measures, enrollments in the nation's colleges and universities soared from 2.5 million in 1955 to 3.6 million in 1960.
The federal government's growing involvement in education, long the preserve of the states and private groups, represented a major new social trend of the period. Washington's total spending on education rose from $113 million in 1945 to $4 billion by 1965. The new technocratic economic order, as well as the cold war struggle, demanded a highly trained citizenry, and Washington's new educational priorities reflected this realization.
The early postwar years were shadowed not only by the reality of conflict in Korea but also by a pervasive fear of atomic war that shaped the culture in many ways. President Truman's decision to drop the atomic bomb on two Japanese cities in August 1945, while welcomed for its apparent role in ending the war, also set off a shock wave of fear that the same weapon might someday be turned against the United States. Politically active atomic scientists played upon such terror in 1945-1947 as part of their campaign to win support for the international control of atomic energy. This initial wave of atomic fear faded as the cold war turned attention from the bomb to the Russian menace, and as Washington reassured a jittery public with civil-defense plans and glowing propaganda about the peacetime uses of atomic energy.
But fear surged to the surface again in 1949-1950 when the Soviets tested their first atomic bomb, and President Truman responded with a go-ahead for the development of the hydrogen bomb, a doomsday weapon a thousand times more destructive than those which had destroyed Hiroshima and Nagasaki. The United States tested its first hydrogen bomb in 1952; Soviet and British tests soon followed.
Eisenhower's secretary of state, John Foster Dulles, exacerbated anxiety in the mid-1950s by proclaiming a policy of "massive retaliation" to any provocation and insisting on the wisdom of going to "the brink of war" to prove the nation's will. Though the Eisenhower administration's foreign policy in fact continued that of the Truman years in focusing on "containment" (rather than destruction) of Soviet power, the belligerence and moralistic fervor with which Dulles proclaimed this policy deepened public uneasiness. So, too, ironically, did civil-defense programs emphasizing fallout shelters in public buildings and suburban backyards. For many schoolchildren, the most vivid memory of the 1950s would be hiding under their desks during nuclear-attack drills and watching films such as Duck and Cover, in which Bert the Turtle offered instruction on how to respond to the atomic flash. Cities conducted practice drills and distributed leaflets listing evacuation routes. The citizens of Madison, Wisconsin, for example, were told to drive to nearby rural counties if nuclear war threatened; those without automobiles, civil-defense officials promised, would be transported in freight cars.
Apprehension deepened in 1957-1958 with the advent of intercontinental ballistic missiles (ICBMs) capable of delivering thermonuclear bombs anywhere in the world at supersonic speeds. As the arms race rose to a new level of menace, the prospect of annihilation seemed more real than ever.
These years also brought widespread concern about fallout from nuclear tests. As American and Soviet H-bomb tests in the Pacific pumped radioactive poisons, including deadly strontium-90, into the atmosphere, clouds of radioactive ash spread far and wide. By the late 1950s, fear of fallout pervaded the nation, spawning a political campaign to ban nuclear tests led by groups such as Physicians for Social Responsibility and SANE, the National Committee for Sane Nuclear Policy. In one SANE newspaper ad Benjamin Spock gazed with furrowed brow at a young child under the caption "Dr. Spock Is Worried."
But while the test-ban campaign challenged the prevailing political inertia of the 1950s, the overall effect of nuclear fear was probably to deepen the decade's mood of disengagement. The turning from politics and social issues to domesticity and private concerns represented one response to global dangers that seemed beyond the individual's power to control. Religion, particularly the soothing "positive thinking" of the Norman Vincent Peale variety, offered another means of coping with nuclear fear.
The bomb hovered over American culture in these years. Theologians debated the ethics of nuclear war. Social psychologists pondered the long-term effects of fallout-shelter confinement. Life magazine ran a picture story on a newlywed couple who spent their honeymoon in a fallout shelter. Ray Bradbury's The Martian Chronicles (1950) and Walter M. Miller, Jr.'s A Canticle for Leibowitz (1959) were only two of many science-fiction works that imagined scenarios of history's approaching end. Poets and artists offered images of nuclear destruction and human extinction, while Tom Lehrer, a songwriter and performer popular on college campuses, and the humor magazine Mad employed ridicule and satire against the insanities of the arms race.
A wave of mutant films launched by Them! (1954), in which giant ants spawned in the New Mexico atomic test site go on a rampage in their search for sugar, both reflected and exacerbated popular fears of radioactive fallout. So, too, did Stanley Kramer's On the Beach (1959), a bittersweet picture of the nuclear end of civilization played out to the strains of "Waltzing Matilda." TV science-fiction series such as The Outer Limits and Rod Serling's The Twilight Zone often dealt with themes of atomic war, radioactivity, and the social toll of nuclear fear. In one Twilight Zone story, for example, the citizens of a typical suburban community, panicked by a nuclear alert, turn upon each other in fury as they struggle to get into a backyard shelter whose owner has barricaded the door against them. Movies and television programs such as this made it clear that the electronic media, for all their capacity to narcotize the public, could heighten public awareness of serious issues.
Overt nuclear fear continued to eddy through the culture until 1963, when the Limited Nuclear Test Ban Treaty (banning atmospheric and underwater tests) for a time pushed it to the deeper recesses of consciousness. But for much of the 1945-1963 period, the specters of nuclear war and nuclear fallout were never far from the public's awareness. This reality, no less than affluence, consumerism, and suburbanization, is central to an understanding of the social history of the period.
A Culture of Dissent
The conformity and blandness that troubled social critics like Riesman and Mills became the target of a small but influential cadre of 1950s humorists, novelists, and poets who formed the vanguard of a counterculture that in the next decade would become much more influential. Nightclub satirists like Mort Sahl, Lenny Bruce, and Shelley Berman offered a caustic view of Eisenhower's America. J.D. Salinger's The Catcher in the Rye (1951) presented middle-class "phoniness" through the eyes of its late-adolescent hero, Holden Caulfield. The Beat writers, notably Allen Ginsberg and Jack Kerouac, celebrated spontaneity, sensual gratification, alcohol and marijuana, and the freedom of the open road as alternatives to suburban conformity and family commitment. Ginsberg's poem Howl (1956), portraying capitalist, nuclear-armed America as a monster devouring its young, and Kerouac's On the Road (1957), a thinly veiled autobiographical novel of escape through ceaseless movement and a variety of mind-altering substances, attracted the fierce loyalty of young and alienated Americans repelled by what they saw as the blandness and regimentation of 1950s society. Norman Mailer, who had won celebrity with his realistic novel of World War II, The Naked and the Dead (1948), in a 1957 essay called "The White Negro" romanticized young street blacks, or "Hipsters," as cultural models for repressed middle-class whites. From Hollywood came Marlon Brando, the motorcycle-gang leader in The Wild One (1954), and James Dean as the misunderstood adolescent in Rebel Without a Cause (1955).
But the message of such work was often ambiguous at best. The sardonic routines of Lenny Bruce and Mort Sahl and other stand-up comics encouraged ironic detachment more than activism. At the end of The Catcher in the Rye, Holden Caulfield suffers a nervous breakdown and is institutionalized. And in Rebel Without a Cause James Dean's rebelliousness is effectively neutralized by a therapeutic social worker who probes the Freudian sources of his malaise (a weak father who dons an apron and helps with the housework) and persuades him to accept society's conventions as embodied in Natalie Wood, the girlfriend who is clearly ready to marry and settle down in suburbia.
The cultural wars of the 1950s spilled over into the world of popular music. While the mainstream music of these years celebrated domesticity, somewhat sanctimonious piety, and a rather treacly form of romantic love, at the end of the decade a very different genre became vastly popular with the younger generation: rock and roll, a watered-down version of the driving beat and open sexuality of black rhythm and blues. Promoted by radio disc jockeys like Alan Freed of New York's WINS, early rock-and-roll hits like "Shake, Rattle, and Roll" (1954) and "Rock Around the Clock" (1955), both performed by Bill Haley and the Comets, proved harbingers of things to come. Church leaders, educators, and other pillars of the establishment protested, but rock and roll had arrived. Its most famous exponent, Elvis Presley from Tupelo, Mississippi, who recorded his first songs as an eighteen-year-old in a Memphis studio in 1953, enjoyed a phenomenal string of fourteen million-seller records from 1956 through 1958, when he was drafted into the army. Like the nightclub comics and the Beat writers, rock and roll challenged the dominant culture in largely nonideological, nonpolitical ways; in a few years that challenge would become much more explicitly ideological and political.
The Other America
Although the white, middle-class world of the suburbs set the cultural tone of the postwar era, it was only part of the kaleidoscope of American society in these years. Beyond the trees and lawns of suburbia lay a very different world of Americans with lower incomes, less education, bleaker prospects, and, in many cases, darker skins. In 1960, after over a decade of booming prosperity, some one-third of American families lived on less than $4,000 a year.
In Appalachia and other rural byways, small farmers continued to eke out a bare existence. In the Southwest several million Mexican Americans worked as braceros (migrant agricultural laborers) or held low-paying jobs in the cities. Although official statistics recorded 528,000 immigrants from Mexico between 1951 and 1965, the actual number was vastly greater. In "Operation Wetback," launched in 1954, the Eisenhower administration returned many undocumented workers to Mexico, but the flow of migrants northward continued nearly unabated. In 1960 Hispanics made up an estimated 12 percent of the total population of California, Arizona, Colorado, New Mexico, and Texas. Meanwhile, in crowded East Coast communities like New York's East Harlem lived nearly a million Puerto Ricans, most of whom had arrived since World War II. These Spanish-speaking residents were among the poorest Americans, disadvantaged by a lack of education and by difficulty in speaking, reading, and writing English at a time when all but the most menial jobs increasingly required English verbal skills and advanced training. In the 1950s, one-third of all Hispanics lived below the poverty line as defined by the federal government.
The nation's Native American population, victims of centuries of white exploitation and neglect, existed at the lowest income levels and grappled with a wide range of social and medical problems. Government policies made matters worse. In 1953, intent on opening Indian lands to white development, Congress cut off all special services as a means of pressuring Indians to leave their ancestral lands and blend into the general population. The results were predictable: some sixty thousand Indians, tragically unprepared for urban life, drifted to cities, where they led marginal, deracinated lives, often barely surviving.
The nation's black population, which grew from thirteen million in 1940 to nineteen million in 1960, shared only partially in the prosperity of these boom years. In 1960-1961, the average black family had an income of $3,233, compared with $5,835 for white families; 32 percent existed on less than $2,000--three times the percentage of white families in this lowest income group.
As whites moved to the suburbs, they left the cities to blacks and Hispanics, many of whom held low-paying jobs or no job at all. As the factories that had historically provided entry-level work for immigrants closed or moved to outlying areas, urban unemployment inched upward, as did problems of crime, public health, poor schools, and social disorganization. At the same time the middle-class flight eroded the urban tax base needed to address these problems. The social trends of these years, in short, planted seeds that would produce a bitter harvest of almost intractable urban social problems in the decades ahead.
These grim social realities did not begin to penetrate the awareness of middle-class America, for whom prosperity, consumer abundance, suburban life, and the largely escapist fare of television defined the parameters of social experience. Not until the 1960s, under the spur of urban unrest and of books like Michael Harrington's The Other America (1962), would this dimension of American life come sharply into focus as the object of cultural scrutiny and concerted political action.
The Civil Rights Movement
While the problems confronting inner-city blacks were largely ignored, the more blatant manifestations of racism became the focus of political activism in these years, laying the groundwork for a profound and long-overdue transformation in American race relations. For years, the National Association for the Advancement of Colored People (NAACP, founded in 1910) had chipped away in the courts at the vast structure of racial segregation, especially in the South. World War II heightened awareness of the problem, as the incongruity of fighting racism abroad while tolerating it at home struck many Americans. In 1941, to forestall a threatened civil rights march on Washington, President Roosevelt issued an executive order barring racial discrimination in federal agencies and in war plants. In An American Dilemma (1944) the Swedish sociologist Gunnar Myrdal documented the extent of American racism and its heavy social toll.
The cause of racial justice moved still higher on the public agenda after the war, reflecting both blacks' growing political power and Washington's desire to win friends among the darker-skinned people of Africa, Asia, and Latin America. In 1946 President Truman created the Committee on Civil Rights, whose subsequent report, To Secure These Rights, documented the magnitude of the task ahead.
Early in 1948, facing a tough reelection battle, Truman proposed a panoply of civil rights measures, including federal laws against lynching, the poll tax (which effectively disfranchised most southern blacks), and racial discrimination in employment. Truman's program met fierce opposition from powerful southern Democratic congressmen, but it did provide a rallying point for the growing civil rights movement.
At the Democratic convention that summer, when northern liberals like Minnesota's Hubert Humphrey pushed through a strong civil rights plank, southern segregationists walked out. Forming a third party, the so-called Dixiecrats, they ran J. Strom Thurmond of South Carolina for president. Despite this challenge (and another from the left-leaning Progressive Party headed by Henry Wallace), Truman narrowly won, demonstrating, among other things, the growing political clout of black voters.
The NAACP's long legal struggle against segregation reached a stunning climax on 17 May 1954, when the Supreme Court, in Brown v. Board of Education, held school segregation unconstitutional. This landmark ruling reversed the Plessy v. Ferguson decision of 1896, which had upheld the constitutionality of the "separate but equal" doctrine. In a follow-up decision in 1955, the high court ruled that segregated schools must be integrated "with all deliberate speed."
Initially targeted on the South, Brown v. Board of Education laid the foundation for a civil rights struggle that eventually confronted the whole vast reality of a racially stratified society. Though far from over by 1960, or even by 1990, this struggle set a course for the future and represents the major social legacy of the Eisenhower years.
President Eisenhower initially disapproved of Brown v. Board of Education. "You cannot change the hearts of people by law," he believed. Reflecting this view, his administration at first did little to enforce the Supreme Court's ruling amid mounting southern opposition led by the Ku Klux Klan and newly formed white citizens' councils.
A showdown came in September 1957, however, when Arkansas governor Orval Faubus proclaimed his state's determination to resist the court-ordered integration of Little Rock's Central High School. Despite his personal reservations, Eisenhower at last made it clear that the law of the land must be obeyed, and that Arkansas's defiance would not be tolerated. In the face of hundreds of white racists who poured into Little Rock, the president sent federal troops from the 101st Airborne Division to assure the safety of black students who entered Central High. Televised images of black boys and girls entering school protected against jeering whites by federal power conveyed to the nation a powerful message not only of the ugly reality of racism but also of American society's gradually stiffening will to resist it. Once again, television demonstrated its potential for dramatizing great social issues as well as providing mindless escape from such issues. But the Brown decision and the showdown at Central High, while major landmarks, represented only early faltering steps of a long journey. As late as 1965 only 2 percent of schools in the Deep South were integrated.
Meanwhile, as the accelerating civil rights movement confronted the larger issue of pervasive racial discrimination in the South, blacks took the lead in challenging the visible symbols of second-class citizenship. The focus in this phase of the struggle shifted to Montgomery, Alabama, which, like other southern cities, required blacks to sit at the rear of city buses and yield their seats to any white who demanded one. On 1 December 1955 a black woman of Montgomery, Rosa Parks, tired after a long day's work as a seamstress, refused to give her seat in the white section of the bus to a white man. She was arrested, jailed briefly, and fined ten dollars. From this spark grew a movement that helped topple the vast structure of legalized racial segregation across the Old Confederacy. Led by Martin Luther King, Jr., the young pastor of the city's Dexter Avenue Baptist Church, and veterans of the labor movement like E.D. Nixon, the blacks of Montgomery organized a bus boycott. For nearly a year, as the bus system lost thousands of dollars, blacks walked to work or gave each other rides. The combination of King's pulpit eloquence, the boycott leaders' tactical skills, and the strong-willed endurance of Montgomery's black community gradually prevailed. In November 1956 the Supreme Court upheld a lower court ruling declaring Montgomery's bus segregation illegal. The boycott had succeeded, and long-entrenched patterns of racial segregation began to fall all across the South.
In the Civil Rights Act of 1957, the first since Reconstruction days, Congress forbade attempts to intimidate citizens from voting, created the U.S. Civil Rights Commission, and established a civil rights division within the Department of Justice. That same year, Martin Luther King and others formed the Southern Christian Leadership Conference (SCLC) to spearhead the campaign. The new organization reflected King's commitment to nonviolence as both a practical strategy and a moral principle. Civil rights activists skillfully organized symbolic actions that dramatized the reality of racism and won the nation's attention and admiration. In 1960, for example, black college students in Greensboro, North Carolina, occupied seats at the lunch counter of a local Woolworth's store that, following southern custom, did not serve blacks. This "sit-in" strategy became a major technique not only in the civil rights campaign but also in the anti-Vietnam war movement of the 1960s.
Black writers contributed to the movement as well. Ralph Ellison's Invisible Man (1952), a masterpiece of surreal fiction, conveyed the complex psychological experience of living as a black in white America. In Go Tell It on the Mountain (1953) James Baldwin told of his boyhood as the son of a Harlem minister. In Notes of a Native Son (1955) Baldwin addressed American racism directly in a series of incisive essays.
Though focused on legally enforced segregation in the South rather than on the broader patterns of racism endemic in American society, the civil rights movement of the 1950s nevertheless played an essential role in confronting racism's more blatant manifestations, sensitizing millions of whites to long-ignored realities, and arousing a generation of young blacks to demand full equality.
On a wintry morning in January 1961, as seventy-year-old Dwight Eisenhower watched impassively, a vigorous young John Kennedy took the oath as president. Not only a moment of political transition, the inauguration had broader resonances as well. The Eisenhower era was over; so, too, was a cycle of American history that began with the victory celebrations of August 1945.
This era challenges social historians because of the sharp disparities between the experiences of different social groups. For many it brought good times as a stream of consumer goods poured forth from a powerful engine of production near its peak of efficiency. American capitalism between 1945 and 1960 made possible a standard of living unmatched in human history; from houses and automobiles to ballpoint pens and Levi's, the cornucopia of consumer goods awed the world. From the perspective of a later generation coping with industrial decline, trade deficits, environmental worries, and soaring energy costs, that distant era of seemingly effortless abundance seems almost a mirage.
Yet the darker side of the picture must be recognized as well. The postwar red scare warped the political process and shriveled intellectual and cultural life. Nuclear fear affected the national psyche in profound ways. A narrow conception of women's role held half the population hostage. And millions of citizens found themselves excluded from the banquet of material affluence.
The era challenges historical interpretation, too, because of the sharp contrast between its reality and the myopic and self-congratulatory tone of a politics and mainstream culture that determinedly ignored important aspects of that reality. Fundamental issues sometimes found political expression, as in the civil rights and test-ban movements, but more often they did not. Only later--in some cases much later--would the nation confront in anything like a comprehensive fashion the era's racism, gender exploitation, nuclear and environmental hazards, and vast class disparities. Also largely unacknowledged were the long-range social implications of the emerging computerized, automated technocratic order. Only in the 1960s, in books like Silent Spring, The Feminine Mystique, and The Other America (and in films like Dr. Strangelove, Stanley Kubrick's brilliant 1963 satire on the nuclear arms race) would the harsher contours of the years between 1945 and 1960 even begin to come into focus.
Given this disparity between reality and the nation's readiness to confront that reality, anyone seeking to understand postwar American social history must heed with particular care the voices on the cultural periphery, where issues found expression in sometimes allusive and distorted ways: science-fiction stories of aliens and robots taking over the world; B-grade mutant movies; the cynical monologues of comedians in smoky nightclubs; the at times hysterical and half-crazed rantings of Beat writers and poets; the raw sexuality and raucous self-assertion of rock and roll; the irreverent satires of Mad magazine; the hellfire warnings of evangelical revivalists. To grasp the complex texture of American society in these years, one must attend carefully to sources like these. In them one confronts the issues, tensions, and social forces that stirred beneath the deceptively placid surface of postwar America.
Bibliography
Boyer, Paul. Promises to Keep: The United States Since World War II. 2nd ed. (1998). Chaps. 1-5.
Gilbert, James. Another Chance: Postwar America, 1945-1985. 2nd ed. (1986).
Hart, Jeffrey. When the Going Was Good!: American Life in the Fifties (1982).
O'Neill, William L. American High: The Years of Confidence, 1945-1960 (1986).
Perrett, Geoffrey. A Dream of Greatness: The American People, 1945-1963 (1979).
Thought and Culture; Mass Media
Altschuler, Glenn T., and David I. Grossvogel. Changing Channels: America in TV Guide (1992).
Biskind, Peter. Seeing Is Believing: How Hollywood Taught Us to Stop Worrying and Love the Fifties (1983).
Boyer, Paul S. By the Bomb's Early Light: American Thought and Culture at the Dawn of the Atomic Age (1985).
Cook, Bruce. The Beat Generation (1971).
Frith, Simon. Sound Effects: Youth, Leisure, and the Politics of Rock 'n' Roll (1981).
Goldman, Albert. Elvis (1981).
Graebner, William S. The Age of Doubt: American Thought and Culture in the 1940s (1991).
Horowitz, Daniel. Vance Packard and American Social Criticism (1994).
Kozol, Wendy. Life's America: Family and Nation in Postwar Photojournalism (1994).
Marc, David. Democratic Vistas: Television in American Culture (1984).
Marling, Karal Ann. As Seen on TV: The Visual Culture of Everyday Life in the 1950s (1994).
Meikle, Jeffrey L. American Plastic: A Cultural History (1995).
Miles, Barry. Ginsberg: A Biography (1989).
Spigel, Lynn. Make Room for TV: Television and the Family in Postwar America (1992).
Wilk, Max. The Golden Age of Television: Notes from the Survivors (1976).
Religion
George, Carol. God's Salesman: Norman Vincent Peale and the Power of Positive Thinking (1993).
Martin, William. A Prophet with Honor: The Billy Graham Story (1991).
Marty, Martin E. Modern American Religion, vol. 3 [1941-1960] (1996).
Wuthnow, Robert. The Restructuring of American Religion: Society and Faith Since World War II (1988).
Suburbs, Women, and Family
Breines, Wini. Young, White and Miserable: Growing Up Female in the Fifties (1992).
Coontz, Stephanie. The Way We Never Were: American Families and the Nostalgia Trap (1992).
Eisler, Benita. Private Lives: Men and Women of the Fifties (1986).
Hudnut-Beumler, James. Looking for God in the Suburbs: The Religion of the American Dream and Its Critics, 1945-1965 (1994).
Jackson, Kenneth T. Crabgrass Frontier: The Suburbanization of the United States (1986).
Jones, Landon Y. Great Expectations: America and the Baby Boom Generation (1980).
Kaledin, Eugenia. Mothers and More: American Women in the 1950s (1984).
Kessler-Harris, Alice. Out to Work: A History of Wage-Earning Women in the United States (1982).
May, Elaine Tyler. Homeward Bound: American Families in the Cold War Era (1988).
Rupp, Leila J., and Verta Taylor. Survival in the Doldrums: The American Women's Rights Movement, 1945 to the 1960s (1987).
Wright, Gwendolyn. Building the Dream: A Social History of Housing in America (1981).
Farmers, Minorities, Civil Rights Movement
Acuña, Rodolfo. Occupied America: A History of Chicanos. 2nd ed. (1981).
Burt, Larry W. Tribalism in Crisis: Federal Indian Policy, 1953-1961 (1982).
Camarillo, Albert. Chicanos in California (1984).
Fite, Gilbert C. American Farmers: The New Minority (1981).
Flynt, J. Wayne. Dixie's Forgotten People: The South's Poor Whites (1979).
Kluger, Richard. Simple Justice: The History of "Brown v. Board of Education" and Black America's Struggle for Equality (1975).
Lewis, David L. King: A Critical Biography (1970).
Sitkoff, Harvard. The Struggle for Black Equality, 1954-1980 (1981).
Cold War Culture
Brands, H. W. The Devil We Knew: Americans and the Cold War (1993).
Caute, David. The Great Fear: The Anti-Communist Purge Under Truman and Eisenhower (1978).
Clowse, Barbara B. Brainpower for the Cold War: The Sputnik Crisis and the National Defense Education Act of 1958 (1981).
Lipsitz, George. Class and Culture in Cold War America (1982).
May, Lary, ed. Recasting America: Culture and Politics in the Age of the Cold War (1989).
Oakes, Guy. The Imaginary War: Civil Defense and American Cold War Culture (1994).
Whitfield, Stephen J. The Culture of the Cold War (1991).
Winkler, Allan M. Life Under a Cloud: American Anxiety About the Atom (1993).