History 111

Lecture: The New Millennium (1992-2008)


Outline

Post-modern America
Presidents and the Media
The Digital Revolution
Home and Family
A Medicated Society
Transgenic Foods
Globalization and Global Effects
Terrorism and War: 9/11
National Security since 9/11
 

Post-modern America

During much of the 1990s and into the early years of the new millennium, the country prospered from an expanding economy. Old problems (such as agricultural profitability), however, got worse. Advances in technology would benefit some, but not all equally.

Post-modern philosophy, as it developed in Europe and America, provided a construct for attacking previous knowledge and norms. Modern thinking, as we saw during the 1920s, was based on rationalism and empirical evidence. Despite the traditionalists of that era, modern thinking (which included gender equality, the dominance of science, and the vagaries of mass media) became the foundation of philosophy for much of the 20th century. However, being overly rational and scientific has its drawbacks. In particular, it can conflict with emotions, and with ethics. In the post-World War II era, many thinkers experienced a crisis in confidence regarding empiricism (that is, knowledge based on our five senses). Similar to the post-Great War angst, which wondered how so much modern technology could lead to so many senseless deaths, World War II (and particularly the Holocaust) called these issues even more into question.

There were many post-modern thinkers, but I'll talk briefly about Jacques Derrida, a French philosopher born in Algeria (which had been a French colony). He "deconstructed" knowledge, noting that everything we think we know is based on language. This language, these words, can be changed, and doing so seems to change meaning. So all knowledge is constructed by human beings, which implies that nothing is really "true", or even "correct". My apologies to philosophy majors, because I know this is a vast oversimplification of one of the most prolific philosophers of all time. But the impact of post-modern philosophies, regardless of their differences and complexities, was to call into question what most people knew as reality.

Sometimes this could be helpful to various causes. The domination of politics and society by men, the oppression of African Americans, the disdain with which young people were treated -- these could all be seen as mere social constructions, ready to be destroyed and replaced with a better way of doing things. The downside of post-modernism is an emphasis on relativism, the idea that nothing is really true except in its relation to other things that society has created. Although post-modern philosophers might have ways to make that OK by examining ethics more closely, most people don't do that. Instead, they operate as if all things are relative, as if there is no truth and no ethical foundation for behavior. We will see this problem play out in both this lecture and the next one.

Presidents and the Media

Consider that there has been a trend throughout the course concerning the government and journalism. Think Jacob Riis, Carey McWilliams...

Since this is a good time to review things anyway, I want to discuss a trend that runs throughout the course regarding Presidents and the media. Before radio, most editorial comment about the Presidency ran in the newspapers, and often these reports were very frank in their criticism. But, as I mentioned in the discussion of Watergate, the criticisms were based on the man's performance in office. Even the lambasting radio speeches of Father Coughlin and Huey Long during the Depression focused on Franklin Delano Roosevelt's economic programs.

FDR himself was a good example of how the press didn't cover the President's personal life. FDR was crippled by polio shortly after WWI, not long after his unsuccessful 1920 bid for the vice-presidency. He became governor of New York several years later, and President in 1933. FDR was unable to walk without the use of two canes, and unable to stand without assistance. The press seemed to have a tacit agreement not to photograph him needing assistance; he was always shown seated, leaning on the rail at the back of the train, or standing behind a podium. It's as if the press were trying to preserve the image of FDR as a strong President.

The media was no more concerned about sexual affairs. For many years, FDR visited Warm Springs, Georgia, attending a spa and consulting doctors there to help with the paralysis. Apparently he met a woman there, and they had an affair that lasted many years, until his death. The press didn't report this either, although certainly anyone spending time at Warm Springs must have been aware of it. Apparently President Eisenhower also had a long-standing affair, with his female chauffeur from the war years. Kennedy was well-known for his women in the White House, among them Marilyn Monroe.

Perhaps the desire of the media not to cover these things reflected the desire of Americans not to know about them, or perhaps there was just a feeling of "boys will be boys". If the latter -- the belief that men in power should be permitted some leeway in this area -- it reinforced the long-standing perception that married men can have affairs, but married women cannot. Even after Kennedy was assassinated, the press was happy to cover Jacqueline Kennedy's relationship with Greek shipping tycoon Aristotle Onassis, and to criticize her for it. The double standard held for celebrities, too; male stars were expected to sleep around, but Ingrid Bergman was run out of Hollywood on a rail for her affair with Roberto Rossellini.

Cartoon of President Clinton wearing a t-shirt saying "I'm with stupid", with a hand pointing downward

In the late 70s, Americans were shocked to hear that President Carter, in an interview with Playboy, admitted to "lusting in his heart" after women other than his wife. During the Reagan presidency, the behavior of the President in his official capacities occupied the press (especially Iran-Contra), as did the fact that the First Lady liked to consult a psychic. With Bush the Elder, everyone was tied up with the War for Kuwait. But with Clinton, the subject of the President's personal life emerged full-force in 1994 with the Paula Jones case, based on the accusations of a woman who claimed she was indecently propositioned by Clinton when he was governor of Arkansas. Then his sexual indiscretions with White House intern Monica Lewinsky became front-page news, in excruciating detail. The issue was treated as a major national concern as American attitudes toward sexuality became more restrictive and Republicans desperately sought the moral high ground. The media was glad to help, because it sold.

The media has also been influential during wartime. Hearst's Yellow Press is seen as a cause of the Spanish-American War, and that was during a time when newspapers were the media. In World War II, new film technologies made war coverage more vivid, but the newsreels seen in movie houses were carefully tailored in cooperation with the government. The U.S. was never seen retreating; there was no blood. Television changed that with its coverage of the Vietnam War, on-the-spot footage which helped change America's view of the war. Much Vietnam War footage was uncensored, because the government was not yet prepared to deal with the immediacy of television. In 1991, coverage of the Gulf War was carefully controlled by the Pentagon, and the military barred reporters from the field. The result was a sanitized video game watched on CNN. Capitulating to media demands, the military has since created the "embedded" reporter, a journalist who travels with and is thus dependent on the U.S. military. Many un-embedded journalists regard this as government censorship.

So at the same time as media outlets sold stories about the president's sexual activities, they were prevented from reporting fully on a war. In a post-modern world, the ethical implications of this are interesting.

The Digital Revolution

Consider that there have been themes about technology since 1865. Think Henry Ford, radio, canned food.

The name "The Digital Revolution" describes the commonplace use of computer technology. Computers during the 1950s and 60s were used only by large organizations, and the machines took up entire rooms, processing data using tape storage. It was only in the late 1970s and 1980s, with the Apple computer and the IBM PC, that consumers could afford a "personal computer", a device intended for home record-keeping and entertainment, but not connected to the outside world. By the 1990s, homes, businesses and libraries became better connected to the world. According to the U.S. Census Bureau, 68% of households had a computer connected to the Internet by 2009.

Although originally designed by the military and universities as a way to store and share information over wired connections, the Internet (which includes the web, email, and other digital transfers) evolved to become the common gateway for shared resources and interchange. Early on, Vice President Al Gore spoke about the government's perspective, which has permitted the Internet to develop as a mostly unregulated medium. Note Gore's references to past communications achievements; you may want to call it the Communication Revolution.

Document: Al Gore: Remarks on the Digital Revolution (1994)

 Click here to open document in a new window

This electronic repository of information has caused a questioning of what constitutes good research. Imagine a student going into a library in the 1960s to do a research paper. She uses long-standing and respected books, journals, etc. from which to gather her facts. She reads the interpretations of others in "refereed" journals (that is, journals where other experts from the same discipline evaluate each other's work). But when a student gets on the Internet to do research these days, a search engine will bring up everything that links to the topic according to that particular search engine's specifications. Much that is "out there" will be missed, and what does come up is unrefereed, disordered, and may be created by people who are not specialists, and whose opinion on an issue has no substantiation. It's like asking your neighbor, who's a nuclear physicist, to give an opinion on the stock market.

The research problem was complicated in 2007 by the popularization of "Web 2.0", also called the "Read-Write Web". The popular use of cheap private websites and social media (such as Facebook and MySpace), and the advent of inexpensive GPS (global positioning system) devices, created enormous social possibilities. People could connect through mobile devices. Movements could be tracked. And one's electronic contacts could become more "real" than one's "real world" communities. This is an expansion of a trend begun with cellular phone communications in the 1990s.

The movie clip I've chosen for this lecture is from Clueless. Although adapted from Jane Austen's novel Emma, it reflects contemporary topics like disconnected families, cell phone friendships, the ubiquity of technology, and work-obsessed adults.



 

In a sense, then, post-modernism is reflected in these new activities. While on the one hand online and cell phone communications could be seen as just a continuation of communications via letter, telegraph, or newspaper, their participatory nature also creates a larger opportunity for deconstructing previous knowledge.

Home and Family

Consider themes about how people live at home. Think Victorian and Craftsman homes, Benjamin Spock, the ads from the 1950s, Betty Friedan.

Ever the retreat of the middle class, the home began to take on new importance in the 1990s and the current decade. While the Moral Majority of the 1980s had emphasized the home as the location for women's work and the foundations of morality in children, the 1990s expanded it into the center of culture.

As usual, marketing provides insight into cultural trends.

Home Depot, founded in 1979, is a good example. Catering to do-it-yourself homeowners as well as contractors, the home improvement company had 1,100 stores nationwide by 2000. These stores, like the merchant giants at the turn of the twentieth century, were able to run out the competition (in this case, small hardware stores) through volume pricing. Home improvement and decorating trends were represented on TV as well, with the increasing popularity of shows like This Old House, sit-coms like Home Improvement, and the advent of Home & Garden Television (HGTV). The 1950s trend of "keeping up with the Joneses" was reborn as people began to value the home again, not only as a retreat, but often as a place from which to combine the responsibilities of family and work. The "home office" became more commonplace. The events of September 11, 2001 (see below) strengthened the trend toward emphasizing the home as a safe and central location in the lives of Americans.

Home schooling became more popular and acceptable in the 90's and early 21st century, although it had its origins in both the 70's continuation of communal living and the 80's Christian homeschooling movement. As studies showed that children who are homeschooled have as good or better test scores and social skills, there was less resistance and more public support, which meant government support. This included public schools offering homeschooling information and waivers. At the same time, charter schools and campaigns for school vouchers (where tax money pays for private schools) reflected a growing concern for alternatives in children's education.

Home birthing also became more acceptable as an alternative to hospital births. The obstetrical house call had disappeared during World War II, and that, combined with the medical community's insistence on hospital conditions (sterility, convenience for doctors over the comfort of women in labor), had led to a cultural disdain of midwives and home birth. As part of a general trend toward alternative medicine, midwives gained cultural acceptance, although in most states the conventional medical community dominates laws and procedures.

Home schooling and home births were not the only domestic changes to raise eyebrows. The 90s also saw a change in home management and perceptions. In families based on the male-female couple with kids, the effort to combine home and work could mean a reversal of conventional gender roles. In the early 90s, the number of middle class two-income families increased. Children often went to day-care centers while both parents worked. By the late 90s, some of these families were considering returning to one income, with one adult taking care of the kids. In situations where the woman was earning more, or the man wanted to be intimately involved in child-rearing, some men became SAH (stay-at-home) dads. However, such men were often treated with derision, assumed to be babysitting or not fully competent as the main parent.

Even the definition of "family" underwent change in the 90s and early 21st century. For many years, gay people had referred to all other gays as family, and certainly gay couples raising children had to be considered as such. Also, because of the high divorce and remarriage rate, many families were "combined", with children from different marriages and relationships. Some families began sharing roles and tasks with other families very closely, creating extended families in an effort to make life more manageable or enjoyable. Some researchers (for example, those studying African American families in the ghetto) found that they had to redefine "family" in order to ask the right questions and make accurate observations. A family could be mom-dad-kids, or it could be grandparents and children, or a kinship group, or a more diverse entity including anyone with whom the subjects of the study had an interdependent relationship. Hillary Clinton's popularization of the phrase "it takes a village to raise a child" reflected this growing awareness of the changing home and family networks.

Marriage also became a controversial topic during this era, with the Defense of Marriage Act (1996) defining marriage as between one man and one woman. This led to civil rights activities to promote gay marriage as a right enshrined in the Constitution.

A Medicated Society

Consider that this class has had themes about medicine and health care. Think settlement houses, Margaret Sanger, Jonas Salk.

The government declared "War on Drugs" during the 1980s. But that war targeted only illegal substances. The FDA approved many medications, and the number of medical patents increased in the 90s as society came to depend more and more on pharmaceuticals.

Ad: "Are you ready for Prozac weekly?"

The development of new drugs for mood disorders led to the creation of the hallmark anti-depressant Prozac, which was first marketed in 1988. The next year, annual sales topped $350 million. Although clinical depression (in which synapses of the brain fail to correctly process neurotransmitters like serotonin) had been treated with medication since the 1950s, nothing had matched the popularity of Prozac. Unlike tranquilizers and earlier anti-depressants, Prozac could improve mood without altering personality or making the user sleepy. It was prescribed for everything from grieving to personal unfulfillment to post-feminist anxiety.

The 90s also saw the introduction of Viagra, a pill to help male impotence. Since it also tended to increase sexual desire, it was soon marketed and prescribed as a sexual "pick-me-up", with ads emphasizing women grateful that their men were taking it.

Ritalin was a drug designed to treat ADD (Attention Deficit Disorder), and it was prescribed primarily for children labeled the decade before as "hyperactive". The drug, a stimulant, quickly became over-prescribed for basic behavioral problems. A revolt began, which is still gaining steam, against the over-medicating of children, some of whom need help but most of whom were simply being kids. In December 2000, Fred A. Baughman Jr., M.D., reported a

Dramatic escalation in child psychiatric drug prescriptions. The Journal of the American Medical Association (JAMA) reported that the use of psychoactive medicines by children ages 2 to 4 tripled from 1991 to 1995. Ritalin is at the top of the list, according to insurance and Medicaid figures compiled by epidemiologist Julie Zito (University of Maryland, Baltimore). SCIENCE states that as many as 150,000 to 200,000 US children in this age group may now be taking Ritalin.
This was despite the fact that Ritalin was tested only on children over the age of six.

Tried-and-true 20th century pharmaceuticals also underwent a change in usage. Labs had to produce stronger antibiotics in reaction to improperly administered prescriptions (e.g., the patient took only 3 days of a 10-day course of treatment) and overuse. Up until the mid-1990s, doctors routinely wrote prescriptions for antibiotics to treat viruses like the common cold. The justification was that antibiotics would prevent a bacterial "super-infection", but the result was widespread resistance to antibiotics that had previously been deemed safe and effective, like penicillin. Antibiotics were also introduced into "antibacterial" soaps and toys, further increasing resistance and necessitating the development of ever-stronger and less tested antibiotics, which entered the marketplace very quickly. As each old antibiotic failed to work on a particular germ, patients had to be prescribed different antibiotics and combinations, sometimes trying two or three before one worked. Diseases like tuberculosis (TB), once thought to be on the way toward eradication, have made a frightening comeback as a result of strong antibiotic resistance. As the 90s came to a close, more and more doctors began to refuse antibiotics to patients with viral infections, and to insist that those who needed them finish the full course of treatment.

One of the reasons that medication was hastily prescribed beginning in the 1980s was that it was often prescribed without lab work to confirm which pathogens were being fought. The increasing dominance of Health Maintenance Organizations (HMOs), and their associated insurance companies, led to a focus on costs rather than care. Pharmaceuticals were often cheaper to both patient and HMO than blood tests, sputum cultures, psychological testing, or other diagnostic procedures. In the absence of a life-threatening illness, prescribing medication became the easiest way for doctors to satisfy both patients' expectations and the HMO guidelines which increasingly controlled medical practices.

Transgenic Foods

Consider that we have had themes about the role of farming, environment and science. Think John Muir, Victory gardens, Love Canal.

During the 70s and 80s, with the use of DDT banned across the country, chemical companies developed other toxic substances for dealing with insect pests and diseases. You may recall from our discussion about the Dust Bowl that the main reason we need all these chemicals is soil erosion and the depletion of nutrients (soil exhaustion). Certainly, many people had been made aware of the dangers of chemical insecticides after Silent Spring, and environmental biologists felt they had an age-old solution: biological controls.

If you have a problem with a certain insect, why not import something that eats that insect? Get the natural enemies of those pests to destroy them. This was, as I said, not a new idea. In the 1880s in Hawaii, the mongoose (which eats rats) was introduced to control rats in sugar cane fields. Unfortunately, the mongoose hunts during the day, and Hawaiian rats come out at night. So instead the mongoose ate most of the native birds of Hawaii. This problem, of an introduced non-native species causing more harm than good, has been repeated many times. Australia has had the worst problems, importing rabbits for food (now they're trying to kill them with deadly viruses because they eat all the crops) and cane toads to eat sugar cane grubs (they don't eat the grubs but have bred so prolifically they're all over the roadways of Queensland). In 1992, ladybugs sold to eat aphids in gardens on the east coast of the U.S. bred with Japanese ladybugs and proliferated. They got into people's closets, kitchens, everything. Introduced species crowd out native species, causing ecological imbalance.

Another solution, developed in the 90s, is transgenic foods. The original idea is also based on an old concept: plant breeding to select for beneficial characteristics. Let's say you find a corn that grows fairly well in Kansas, but only about 20% of the crop survives the frosts and droughts. You take the seed from that 20%, and plant it again, saving seeds and replanting each generation. Or you take that 20% of seed and cross (hybridize) it with another corn that withstands heat well, but doesn't taste quite as good. This kind of farming has been going on for millennia; you select out and hybridize to get a certain kind of plant.


But transgenic foods take this to the cellular level. Instead of selective breeding, bioengineers genetically alter the crop by splicing in DNA from different plant or animal species to obtain certain characteristics. Not surprisingly, American chemical companies have led the way, both to regain market losses from decreased pesticide use and to find other uses for their chemicals. Calgene created the FlavrSavr tomato through gene splicing; it lasted on the shelf up to four months. By the time you got it, it had little flavor and almost no vitamins, but you could have a tomato on your sub sandwich in February in New York, and that's what Americans wanted. In mid-1996 this tomato was removed from production due to "genetic glitches", but Calgene was bought by Monsanto anyway. Monsanto had created Roundup-Ready Soybeans. Roundup is an herbicide made by Monsanto to kill weeds. When you used it on soybeans, it also killed the soybeans. But by splicing a gene from an herbicide-resistant plant into the soybean, Monsanto created a soybean that you can spray with Roundup without harm.

The problems with transgenic foods are several. The product is unpredictable; there's no way of knowing exactly what you'll end up with when you work at the level of molecular genetics. In 1989, a batch of genetically engineered tryptophan (an amino acid used to treat depression and insomnia) turned toxic, killing several people and prompting the unjustified removal of all tryptophan from the market. A soybean containing a Brazil nut gene is now highly allergenic. The soybeans (unidentified as transgenic because no law covers this yet) are used in baby formulas. Many babies allergic to milk drink soy formula, and many babies allergic to other things are allergic to nuts, so this makes massive amounts of soy formula that allergic babies can't drink, leaving them with nothing if they can't have breastmilk. A genetically engineered corn used for a non-petroleum fuel, ethanol, made the land infertile everywhere it was grown, and can no longer be used.

Avoiding transgenic foods became increasingly difficult as they entered the food chain in the 1990s, because the "drift" of pollens creates transgenic species even where they aren't wanted. The solution, naturally, is organic diversity, but even organic corn can be pollinated by a transgenic corn on the next farm. In 2000, genetically altered StarLink corn, deemed unfit for human consumption due to a high content of allergens, accidentally entered human food products. Although many nations around the globe ban transgenic foods, when individual U.S. states tried to pass measures restricting them, megacorporations like Monsanto put millions into defeating the measures, threatening skyrocketing food prices if they were approved.

Globalization and Global Effects

Consider themes on the U.S. role in the world, including the 19th century imperial expansion, the Versailles Treaty, and the Cold War. Also themes about political public protest, such as women's peace parties during the Great War, Ban the Bomb groups after the Second World War, and King's march on Washington.

On a broad scale, American capitalism and consumerism expanded world-wide. This included US dominance of organizations such as the World Trade Organization and the World Bank. During the Clinton administration, this modeling of the world, either in the American image or to suit the goals of the U.S. (depending on one's point of view), caused negative responses in many nations. The protesters at the 1999 WTO conference in Seattle were an example of this.

Document: Paul Reynolds: The Battle of Seattle (1999)

 Click here to open document in a new window

There were also, however, international efforts for the betterment of the world. Around the globe, other nations were trying to build on the international successes regarding the environment. The new European Union, created during the 1990s, began to codify European standards regarding the environment, standards which included the labeling of foods containing more than 0.9% genetically modified content, the prohibition of hormone use in feed animals, limits on particle emissions into the air, and measures on sustainable development. On a broader scale, increasing scientific evidence of global warming led to the creation of the United Nations Framework Convention on Climate Change in 1992, which developed a protocol at Kyoto for setting carbon emissions limits. By 2009, 192 nations had signed on to the protocol. The United States ratified the Framework Convention under President Bush I and signed the Kyoto Protocol under President Clinton, but in March 2001 President George W. Bush (whom I will call Bush II) announced that the U.S. would not implement the Protocol.

Document: The Kyoto Protocol to the U.N. on Climate Change (1997/2005)

 Click here to open document in a new window

The US remained the only developed country that had not ratified the agreement. The Obama administration focused instead on individual arrangements and treaties with other countries, tying climate change to various trade agreements.

Terrorism and War: 9/11

Consider that there has been a trend throughout this class on war and fear since 1865. Think both Red Scares and all four wars.

Cover of The New Yorker showing the twin towers in shadow

On September 11, 2001, two airplanes hijacked by terrorists crashed into the World Trade Center. Another was crashed into the Pentagon building near Washington, D.C., and another crashed into a field in Pennsylvania. At the Twin Towers of the World Trade Center, which were built to withstand the impact of an airliner, the fires from the planes weakened the steel cores of the buildings. The towers collapsed, killing thousands.

The political response was immediate. The George W. Bush administration declared "war on terrorism", and pushed Congress to grant unprecedented power to the President to fight any threats. Barbara Lee, Democratic representative from California, cast the single vote against permitting President George W. Bush to use force against "those nations, organizations, or persons he determines planned, authorized, committed, or aided the terrorist attacks" [House Joint Resolution 64]. It's rather startling that Congress essentially ceded its responsibility for declaring war, permitting military activities in several countries without Congressional debate.

Photo showing a fireman with a flag during the rescue

Following evidence linking the attacks to the international al-Qaeda terrorist organization, the U.S. attacked Afghanistan, where the repressive Taliban government provided haven for the group. But the U.S. was not alone -- the original partner was the UK, and by December the NATO-led coalition in Afghanistan included 17 European countries, South Korea and Australia. Given plenty of warning of the impending attack, the al-Qaeda members fled, and the Taliban was removed from power. The peace movement in the U.S. was played down by the media. The government had tightly controlled press access to information.

There are other questions, too, that were ignored by the mainstream Western media. One concerns the collapse of the Twin Towers, which killed more people than the impact of the planes. A plane had crashed into the Empire State Building before, prompting changes in building practices. But a 1968 fire that melted the steel core of a building in Britain led to a recommendation that steel cores be surrounded by concrete rather than fireproofing spray, a recommendation ignored in New York because it was expensive and inconvenient.

Cartoon of Bush on a "War" surfboard heading into a wave of world opinion

By fall of 2002, the Bush Administration's focus had turned to Iraq, although the war continued in Afghanistan. Historians will probably note combined motives of oil, neo-Christianity, father-son psychology, neo-imperialism, and frustration with the "war on terror". When it became apparent that some members of the U.N. Security Council (and most of world opinion) did not favor war, the U.S. combined with Britain and Spain to attack Iraq and oust Saddam Hussein in what is now called the Second Gulf War. This revived an interest in America's previous experience with imperialism, as well as studies of the perils and successes of the British Empire. There have also been attempts to repeal the unprecedented authority that Congress gave to the President in the year following 9/11, but none have been successful so far.

Both anti-war and pro-war responses emerged over Afghanistan and Iraq. I'm going to use country music to show the deep divide that split the American people in these responses.

Document: Jim Morin: Gas Station Cartoon (2001)

 Click here to open document in a new window

Country: Toby Keith created a song presenting a violent response as part of the American tradition, "Courtesy of the Red, White and Blue". Lyrics
Country: The Dixie Chicks focused on a more individual experience of war in "Travelin' Soldier". The group caught hell for saying they were sorry that President Bush was from Texas. Lyrics

Rock musicians weren't nearly as subtle. Here's System of a Down with their anti-war song "Boom!" Released in November 2002, the song was out before the video was made. The video was filmed at the international war protests of February 15, 2003. In response to Bush's declaration that certain countries constituted an "Axis of Evil", Serj Tankian of System of a Down and Tom Morello of Audioslave formed the "Axis of Justice" (axisofjustice.net) to act as muckrakers exposing the injustices in our response and in our system. Lyrics



 

National Security since 9/11

There are themes here about measures to protect the nation and their effects.

Following 9/11, the federal government began to centralize its authority, passing bills such as the USA PATRIOT Act and establishing a Department of Homeland Security. The Bush administration pursued its war on terrorism in Afghanistan and into Iraq, naming Iraq as part of an "axis of evil" perpetrating wicked deeds. Some security experts criticized the approach, noting that a centralized, nationalistic response to a decentralized, international enemy (these terrorists operate in independent "cells" or units around the world) could lead to serious problems.

These events were reminiscent of the Red Scares. Membership in the American Civil Liberties Union jumped in 2001-2, a good indicator of the concerns of many Americans.

Document: Anthony Romero: In Defense of Liberty in a Time of National Emergency (2002)

 Click here to open document in a new window

By 2008, the end of Bush's second term, many Americans had tired of an endless war and an administration that did much of its governing in secret under the umbrella of national security.

 

 




All text, lecture voice audio, and course design copyright Lisa M. Lane 1998-2018. Other materials used in this class may be subject to copyright protection, and are intended for educational and scholarly fair use under the Copyright Act of 1976 and the TEACH Act of 2002.