In This Chapter
End of the Reagan-Bush years.
The New Right, Libertarians, and militia groups.
An electronic democracy.
Writing about what has happened is much easier than writing about what is happening. Because we don’t live in the future, it’s no mean feat to say one current event is historically significant, but another one isn’t. The history of our times has yet to be written. So this closing chapter is nothing more than a moistened finger lifted to the prevailing winds. By the time you read this, some of these winds may be forgotten breezes, and I may well have overlooked the coming storms.
The Economy, Stupid. In 1980 and again in 1984, the American electorate voted Ronald Reagan into office with gusto. In 1988, Americans had relatively little enthusiasm for either Democrat Michael Dukakis, governor of Massachusetts, or Republican George Bush, vice president of the United States. Nor was there wild enthusiasm in 1992, when the incumbent Bush was opposed by the youthful governor of Arkansas, Bill Clinton. However, during the Bush administration, the American dream seemed somehow to have slipped farther away. True, the world was probably a safer place than it had been at any time since the end of World War II (but it was still a dangerous world), and the United States had acquitted itself in the best tradition of its long democratic history in the Persian Gulf War. Yet the electorate felt that President Bush habitually neglected domestic issues and focused exclusively on international relations. Candidate Bill Clinton’s acerbic campaign manager, James Carville, put it directly, advising Governor Clinton to write himself a reminder lest he forget the issue on which the election would be won or lost: “It’s the economy, stupid.”
The economy. It was not that America teetered on the edge of another depression in 1992. Although homeless men, women, and even children could be seen on corners and in doorways of the nation’s cities, there were none of the long bread lines or crowded soup kitchens of the 1930s. But a general sense existed among the middle class, among those who had jobs and were paying the rent, that this generation was not doing “as well as” previous generations. Sons were not living as well as fathers, daughters not as well as mothers. The pursuit of happiness, fueled in large part by cash, had become that much more difficult, and a majority of voters blamed it on George Bush. Bill Clinton entered the White House by a comfortable margin, but if voters rejected Bush, a sizable minority of them also rejected Clinton.
Third Party Politics. For the first time since Theodore Roosevelt ran as a Bull Moose candidate in 1912, a third-party contender made a significant impact at the polls. Presenting himself as a candidate dissatisfied with both the Republicans and the Democrats, billionaire Texas businessman H. Ross Perot (b. 1930) told Americans that they were the “owners” of the nation and that it was about time they derived benefit from such ownership.
The New Right. The Perot candidacy was not the only evidence of widespread discontent with politics as usual. Americans who had come of age in the 1960s, the era of the Great Society and of protest against the Vietnam War, tended to take liberalism for granted, as if it were naturally part and parcel of the American way. Beginning in the 1970s, however, a conservative counter-movement gained increasing strength. Its values rested upon the family as traditionally constituted (father, mother, kids), a strong sense of law and order, reliance on organized religion, a belief in the work ethic (and a corresponding disdain for the welfare state), a passion for decency (even to the point of censorship), and a desire for minimal government.
Right to Life. Many who have identified themselves with the new right hold passionately to a belief that abortion, even in the first trimester of pregnancy (when the fetus cannot survive outside of the womb), is tantamount to murder. The so-called Right to Life movement has spawned a fanatic fringe, whose members have bombed abortion clinics and have intimidated, assaulted, and even murdered physicians who perform abortions. However, the mainstream of the movement has relied on legal means to effect social change, with the ultimate object of obtaining a constitutional amendment barring abortion. The Right to Life movement became so powerful a political lobby that the Republican party adopted a stance against abortion as part of its 1992 platform.
Regardless of one’s attitude toward abortion—whether pro-life or pro-choice—many Americans are alarmed by a trend among particular political candidates and even entire political parties to become identified with single issues—such as abortion, gun control, prayer in schools, or gay rights—rather than with an array of issues, let alone a philosophy of government.
Christian Right. The First Amendment to the Constitution specifies that “Congress shall make no law respecting an establishment of religion.” Church and state are explicitly separated in the United States, but the government freely invokes the name of God in most of its enterprises. Our currency bears the motto “In God We Trust,” both houses of Congress employ full-time chaplains, and (despite occasional protests) the Pledge of Allegiance most of us grew up reciting at the start of each school day describes the United States as “one nation, under God.” While many Americans shake their heads over their perception that religious worship has somehow gone out of fashion in America, most opinion polls agree that approximately 96 percent of the population believes in God. A 1993 survey was able to turn up fewer than one million Americans willing to identify themselves as atheists, out of a total U.S. population, according to the 1990 census, of 248,709,873.
The rise of the so-called Christian Right—religiously motivated conservatism—should surprise no one, though it has worried many, who see in overt unions of religious faith and politics a threat to First Amendment freedoms. At present, the most significant political manifestation of the Christian Right is the powerful lobbying group called the Christian Coalition, which was spun off from the unsuccessful 1988 presidential campaign of religious broadcaster Pat Robertson (b. 1930). Based in Washington, D.C., the coalition boasts a membership of 1.6 million and has claimed responsibility for the Republican sweep of the Congress during the midterm elections of 1994. In 1995, the organization spent more than a million dollars mobilizing its “born-again” evangelicals behind the conservative “Contract with America” promulgated by Speaker of the House Newt Gingrich. While some people have welcomed what they see as a return of morality to American political life, others see the Christian Right as narrow, coercive, and intolerant.
The Age of Rage. In turn, the Christian Right has viewed liberal America as too tolerant of lifestyles and beliefs that seem alien or offensive to certain religious principles, or that apparently threaten the moral fiber of the nation.
This is hardly a new dialogue. Right and left have been taffy-pulling national life since well before the Revolution. Indeed, Americans may be too accustomed to thinking in terms of right versus left, for another body of belief appears to want little to do with either side. In its mildest form, this group has expressed itself in a third political party, the Libertarians, founded in 1971. Libertarians oppose laws that limit personal behavior (including laws governing prostitution, gambling, and sexual preference), advocate a free market economy without government regulation or assistance, and support an isolationist foreign policy (including U.S. withdrawal from the United Nations).
But look around and listen. It’s not Libertarian dialogue that you hear. It’s rage. Most of the time, it’s part of the background, the Muzak that marks the tempo of our times: angry slogans on bumper stickers, the endless staccato of sound bites reeling out from the television tube, the jagged litany that issues from talk-radio DJs, and the remarkable volume of violence that plays across movie screens. Sometimes the rage explodes, front and center.
Waco and Oklahoma City. April 19, 1993, saw the fiery culmination of a long standoff between members of a fundamentalist religious cult called the Branch Davidians and federal officers. Followers of David Koresh (his real name was Vernon Howell) holed up in a fortified compound outside of Waco, Texas, and resisted the intrusion of agents from the U.S. Treasury Department’s Bureau of Alcohol, Tobacco and Firearms (ATF). The agents were investigating reports of a stockpile of illegal arms as well as rumors of child abuse in the compound. When ATF officers moved in on the compound on February 28, the cultists opened fire, killing four ATF agents. Koresh was wounded in the exchange, and at least two of his followers were killed. For the next 51 days, the FBI laid siege to the Branch Davidians, until April 19, when the agents commenced an assault with tear gas volleys. The Branch Davidians responded by setting fire to their own compound, a blaze that killed more than 80 cultists, including 24 children. Millions witnessed both the February 28 shoot-out and the April 19 inferno on television.
Millions also saw the bloody aftermath of the bombing of the Alfred P. Murrah Federal Office Building in Oklahoma City on April 19, 1995, which resulted in the deaths of 169 persons. Timothy McVeigh and Terry Nichols were the two young men indicted in connection with the bombing. Both men were associated with the “militia movement,” a phrase describing militant groups organized in several states after the raid in Waco and a 1992 government assault on Randy Weaver, a white supremacist, and his family at Ruby Ridge, Idaho.
The incidents at Waco and Ruby Ridge, together with passage of relatively mild federal gun-control legislation, inspired the formation of these armed cadres. The groups were opposed not only to what they deemed excessive government control of everyday life, but also to what they saw as a United Nations plot to take over the United States in a drive toward “One World Government.” Few Americans could understand how bombing a federal office building—which contained no military installations, no CIA secret headquarters, but only such ordinary offices as the local Social Security unit—could be deemed a blow against tyranny. Among those killed and injured were a number of children at play in the building’s daycare center.
A Tale of Two Trials. The rage that has characterized the 1990s is not always politically motivated. Two trials commanded national attention.
Rodney King. On March 3, 1991, Rodney King, an African-American, was arrested for speeding on a California freeway. By chance, a witness carrying a camcorder videotaped the arrest: a brutal beating by four nightstick-wielding Los Angeles police officers. The tape was broadcast nationally, sending shockwaves of outrage from coast to coast. Incredibly, the officers were acquitted on April 30, 1992, by an all-white jury in the upscale California community of Simi Valley. Five days of rioting, arson, and looting erupted in Los Angeles, especially in the city’s predominantly black South-Central neighborhood. (In a second, later trial on federal civil rights charges, two of the officers were convicted.)
The first verdict and the riots that followed—the worst urban disorder since the New York “Draft Riots” during July 13-16, 1863—were heartbreaking, suggesting that we as a nation had not come very far in learning to live harmoniously and productively together. Yet the sad episode was also a demonstration of how technology could serve the ends of democracy. For despite the jury’s verdict, the beating, videotaped and broadcast, united most of the nation not in rage but in outrage. As those brutal images flickered across television screens everywhere, unquestioning belief in law and order dissolved. Middle-class whites were forced to ask themselves, Is this what it means to be black in America? At the very least, those millions who saw the beating had to ponder: This is exactly what American democracy was created to prevent.
O.J. Simpson. Television brought another racially charged legal battle into the nation’s living rooms when O.J. Simpson, an African-American who had made it big as a football star and, later, sports broadcaster, was tried for the brutal murder of his ex-wife, Nicole Brown Simpson, and her friend Ronald Goldman. For almost a year, the televised “Trial of the Century” riveted a significant portion of the population. On October 3, 1995, after having been sequestered for 266 days and then deliberating for less than four hours, the Los Angeles jury found Simpson not guilty. While television had once again united the nation in focus on a single event, the televised verdict revealed a deep national division along racial lines. Whites overwhelmingly deemed the decision a miscarriage of justice, whereas the majority of African-Americans believed Simpson had been the innocent victim of racist police officers determined to frame him for the murder of his white ex-wife and her white friend.
E Pluribus Unum. Many Americans complained during the long Simpson trial that the media, in devoting coverage to every minute detail of the tortuous legal maneuvering that characterized the proceedings, gave short shrift to many other important stories of the year. Perhaps. But the televised trial did have much to teach us about our nation; about what it means to live in a country based on the presumption of innocence; about what it means to be black—or white—in the United States; about how access to wealth may influence the outcome of a trial.
From Leave It to Beaver to Sesame Street to the Internet. The Simpson trial also reminded us—if we needed reminding—of how profoundly mass media, particularly television, shapes our perceptions, even as it mirrors them. Back in the late 1950s and early 1960s, television programs depicted the American family as a collection of ethnically nondescript, vaguely Protestant white people living in white-painted, picket-fenced colonial-style suburban homes: Ward, June, Wally, and Theodore “Beaver” Cleaver of Leave It to Beaver. This is how we liked to think of ourselves back then, and television obliged.
Then television brought us Sesame Street in 1969, a series aimed at entertaining and teaching children, but one that also depicted an urban neighborhood populated by whites, blacks, Hispanics, and Asians, as well as people with disabilities. The journey from Leave It to Beaver to Sesame Street consumed a decade in which a majority of Americans became more mindful and more accepting of the essence of democracy: From many, one.
And that mindfulness may just be the engine driving popular fascination with yet another technology. As little as two decades ago, the world of computers was an arcane realm that commanded relatively little interest and less understanding from most folks. Use the word Internet much before the 1990s, and you might as well have come from outer space.
Now it is difficult to get through a day without hearing some reference to the Internet. For an increasing number of Americans, few days go by without a personal visit to it.
A network of computer networks, the Internet is an information superhighway and also a forum, the ultimate town square, a place where ideas can be aired, shared, and debated. As yet, the Internet is unregulated by any government agency. Is the Internet democracy? No. Democracy cannot be reduced to this or that technology. But the Internet is an expression of democracy, a fervent wish to be democratic, to hear and to be heard, to share, to communicate, to connect with one’s neighbors—next-door and around the world—and to be at the center of a great web that offers infinite centers (since, on the Internet, the center is wherever you happen to be).
Will the Internet make democracy any easier? Maybe, maybe not. But, more important, after the more than 200 years since the Constitution was written with pen and ink in the painstakingly graceful hand of the 18th century, the binary 0s and 1s, the light-speed ebb and flow of electrons through the Internet continue to embody the passion, the ideals, the dreams of those who founded the nation and those who have nurtured it for so long. Whether penned with a quill or tapped out on a keyboard, the message is the same: E pluribus unum—From many, one.
The Least You Need to Know
The 1990s have been characterized by discontent, political extremism, and rage that threaten democracy, but also by a renewed passion to gather and share information and ideas, the very elements that keep democracy strong.
A revolution in electronic media, born in large part of an impulse to democracy, may well promote and preserve democratic values in the next millennium.
Stats. Even though Perot dropped out of the race for a time (stunning his many supporters), he won 19,237,247 votes, an astounding 19 percent of the popular vote total.
Word for the Day. After World War II, from 1947 to 1961, the birth rate rose sharply in America. This period was described as a baby boom, and those born during this time were baby boomers. Those born between 1961 and 1972, typically college educated but chronically pessimistic and vaguely dissatisfied with career possibilities, have been dubbed Generation X, a label drawn from a novel of that name by Douglas Coupland.
Word for the Day. During the 1980s and 1990s, Americans heard a great deal about PACs. A PAC (Political Action Committee) is a special-interest group, lobby, or pressure group organized to raise money for specific political activity.
Word for the Day. “In God We Trust” is plain enough English. The Great Seal of the United States, reproduced on the dollar bill, also includes two phrases of Latin: Annuit coeptis (“He has favored our undertakings”) and Novus ordo seclorum (“A new order of the ages”).
Real Life. Newt (Newton Leroy) Gingrich (b. 1943), elected U.S. Congressman from Georgia’s 6th District in 1979, became Speaker of the House in 1995. Intensely, abrasively partisan, the eloquent Gingrich brilliantly masterminded the Republican party’s so-called “Contract with America.” This conservative legislative agenda promised smaller government, more responsive government, and a renewal of a sense of opportunity in American life. Gingrich has drawn ardent admiration from conservatives and equally ardent contempt from liberals.
Stats. In the 1980 presidential election, Libertarian candidate Ed Clark polled 921,128 votes, but candidate David Bergland received only 227,949 votes in 1984. Ron Paul garnered 409,412 votes in 1988, and Andre Marrou received 281,508 votes in 1992.
Voice from the Past
“People, I just want to say, you know, can we all get along? Can we get along? Can we stop making it, making it horrible for the older people and the kids?”
Rodney King, spoken during the Los Angeles riots, May 2, 1992
Main Event. The very last year of the 19th century saw the invention of radio—at the time called the “wireless telegraph”—by the Italian-Irish inventor Guglielmo Marconi (1874-1937). Twenty-one years later, Pittsburgh’s station KDKA made what is generally considered the first commercial radio broadcast in the United States (an announcement of election results), and electronic mass media was born. In 1933, a Russian-born American engineer for RCA, Vladimir Zworykin (1889-1982), having invented and patented in 1925 the “iconoscope” (basis of the modern “picture tube”), broadcast a television image over a radio-wave relay between New York and Philadelphia. It took well over a decade for the new medium to catch on with the American public, but in 1948, when an ex-vaudeville comic named Milton Berle (b. 1908) began cavorting across the small screen (often attired in drag), Americans got hooked. Berle’s Texaco Star Theater dominated the airwaves through 1956, sales of TV sets soared, more programming was developed, and television became the dominant American entertainment medium. Soon, it displaced the newspaper as the dominant news medium as well.
Radio and television are essentially one-way media, broadcasting to a passively receptive audience. The development of the personal computer, especially as linked to other computers through such networks as the vast Internet, represents an emerging interactive medium, in which there may be many parties in an exchange of entertainment, news, and information.
The modern personal computer has its most direct origins in the “analytical engine” designed by Charles Babbage, with contributions from Ada, Lady Lovelace, in 1822-37; in computational theory developed by the British mathematician Alan Turing in the 1930s; and in ENIAC, the first electronic computer, unveiled at the University of Pennsylvania in 1946. From that time until the 1970s, computers were big and expensive, requiring a team of experts to program and to operate. In the 1970s, a number of companies introduced much smaller and cheaper computers, which could fit on a desktop, and, on August 12, 1981, IBM introduced the PC—personal computer—the first truly practical desktop machine. Since then, the machines have grown cheaper and more powerful each year, and one is hard pressed to find a desk—in offices as well as homes—that doesn’t sport a PC.
Stats. In 1984, 18 percent of people over age 18 regularly used computers. In 1993, the figure was 36 percent. Significantly, 56 percent of children aged 3 to 17 regularly used computers in 1993.
Word for the Day. The Internet is an international web of interconnected government, education, and business computer networks. At last estimate, tens of millions of people had access to the Internet, which many regard as a kind of electronic community.