
Archive for the ‘childhood’ Category

What do Philip Roth’s polio novel Nemesis and Justin Cronin’s vampire novel The Passage, both published in 2010, have in common?  Quite a bit, actually.  Roth’s novel renders the atmosphere of fear surrounding a polio outbreak in Newark, New Jersey in the summer of 1944.  The protagonist is a young playground director futilely trying to protect the children he supervises from contracting polio.  Cronin’s novel imagines a world ninety years in the future that has been overrun by killer vampiric “virals” as the result of a scientific experiment gone awry.  It focuses on a small colony of people struggling to defend themselves against the virals.  Both books deal with the pervasive threat of an invisible, poorly understood contagion that has no known cure.  Both focus on adult attempts to shelter children from the contagion.  In both, characters wrestle with existential questions and religious doubt—why would God inflict polio on children?  Do virals have souls?  Both books demonstrate how natural time changes—spring to summer, day to night—can heighten the threat of contagion and force humans to change their routines and behavior.  Both show how suspicion and fear can cause communities to turn against one another.  As cultural artifacts of the moment, both novels also resonate powerfully with contemporary anxieties surrounding contagion in our everyday lives.

In recent years, Americans have been alerted to the threats posed by multiple toxins and diseases: salmonella in eggs and peanut butter, E. coli in spinach, cadmium in Shrek glasses sold at McDonald’s and jewelry sold at Wal-Mart, H1N1, West Nile virus.  Such warnings came about against the backdrop of broader fears about biological terrorism post-9/11.  The current wave of viral panic is of course the most recent chapter in a very long history of epidemics—Ebola in the 1990s, mad cow disease in the 1980s, polio in the 1930s and 1940s, the Black Death in the 14th century, and so on.

There is a cultural dimension to this biological history—a cultural history of contagion.  Nemesis and The Passage are the latest examples of artistic expressions, literary and otherwise, that in some way reflect, reimagine, or comment upon viral panics.  Think Daniel Defoe’s A Journal of the Plague Year (1722), Thomas Mann’s Death in Venice (1912), Albert Camus’s The Plague (1947) (literary critic Elaine Showalter recently made a brilliant connection between Nemesis and The Plague), and Stephen King’s The Stand (1978/1990).  Think nonfiction books like The Hot Zone (1994).  Or films like Panic in the Streets (1950), The Omega Man (1971), and Outbreak (1995).

As artifacts of the 21st-century culture of viral panic, unlikely bedfellows Nemesis and The Passage join a diverse cultural cohort that includes the comic book series Y: The Last Man, about a plague that kills every mammal on earth with a Y chromosome; the 28 Days Later and 28 Weeks Later films, featuring the “rage” virus; the horror movie Cabin Fever, starring a flesh-eating virus (tagline: “Catch It”); Max Brooks’s book World War Z, an “oral history of the zombie war”; and the forthcoming young adult novel Virals, about a group of teenagers exposed to canine parvovirus.

These contemporary examples of the culture of viral panic offer audiences ways to process a number of fears and anxieties that circumscribe our everyday lives in the 21st century.  Anxieties about the limitations and abuses of science and medicine.  Anxieties about our ability to protect children and youth from invisible menaces.  Anxieties about community cohesiveness and civility.  Anxieties about the government’s ability to respond to disaster.  In other words, the culture of viral panic is never just about the threat of contagion.  It always speaks to broader concerns about institutions, social relations, childrearing, science, spirituality.  Read in this context, Nemesis and The Passage have a great deal in common.


The New York Times recently published an article on teenagers who have decided to reduce or eliminate the time they spend on Facebook in order to bolster their grades, their offline social lives, and their self-esteem (“To Deal With Obsession, Some Defriend Facebook,” December 21, 2009, A16).  I found the article especially interesting for the way it depicted teenagers and their relationship to technology.  The story reinforces one of our dominant, culturally constructed beliefs about adolescence: namely, the notion that technology poses a particular threat to teens because they lack the maturity and willpower to use it in a responsible, temperate manner.

First consider how technology is represented.  Throughout the article, Facebook is described as an addiction.  Words like “habit,” “obsession,” and “temptation” appear, as do phrases such as “like an eating disorder,” the “lure of the login,” and “time consuming but perhaps not all that fulfilling.”  One teenager mentioned in the piece even went on a “Facebook fast” for Lent.  Here, technology is depicted as a drug that is especially dangerous, and especially likely to be abused, in the hands of teens.

Next consider the representation of adolescents.  The teenagers who have weaned themselves off Facebook come across as having achieved a remarkable victory.  They had to undergo a fierce struggle, one that required them to fight against their inherent adolescent traits and tendencies.  These exceptional teens exhibited “self-control,” “willpower,” and the ability to “delay gratification.”  The presumption here is that teens fundamentally lack these qualities, and that they do not acquire them until they become adults.

Indeed, an unstated assumption of this New York Times story is that adults can use technology like Facebook more responsibly because adults have self-control.  However, many of my adult friends, myself included, are regular if not obsessive users of Facebook.  I even tried to “quit” Facebook back in April, posting a status update that read, “Leaving Facebook in 10 days.  I am both addicted and over it.”  Ten days later I stopped looking at my account, but did not deactivate it.  Two weeks after that I was back on, and I have logged in basically every day since.  The teens profiled in this article are made of stronger stuff than I.

One consequence of recirculating these assumptions about teens and technology in our mass news media is that it becomes easier for adults to claim the need to restrict or monitor teens’ use of technology.  The passing of laws targeting teens and technology comes to be seen as normal and even necessary for their safety.  For example, California recently passed a law prohibiting drivers under the age of 18 from using their cell phones; drivers over the age of 18, however, may use hands-free devices to talk on their phones.  Video games are assigned ratings (not by law, but by voluntary industry self-regulation), and retailers generally refuse to sell certain games to people under the age of 18.

I am interested in the larger questions raised by this story, questions about the role of technology in all of our lives, questions about the meaning of “maturity” and “immaturity” in our society, and questions about our cultural attitudes—and selective stigmas—regarding various “addictions” and “obsessions.”


When I was a kid and got the flu, it meant days home in bed.  I’ll try not to glamorize this, but I do recall some fairly blissful mornings with a pile of Kleenex, cough drops (remember Luden’s—they used to taste just like candy!), and the TV all to myself.  I’m sure this wasn’t convenient for my mom, who was a single parent, but we didn’t have the flu vaccine, so there wasn’t any other way to manage.  We just did.

I just took the girls to get their vaccines, and while I was waiting in the long line of other similarly haggard-looking parents doing the same, I had to wonder—when did we get so desperate to keep our kids from getting sick, and why?  Is it that the flu is worse now than it was when I was a kid?  H1N1 aside, I don’t think so.  More likely, we just don’t have time these days for our kids to get the flu.  It’s okay if we get the flu because we can straggle into work in various stages of consciousness and, if it gets that bad, we can always stay home and check emails from bed.  But when the kids get sick, you can’t work.  Well, you can, but it’s hard.  Plus you look like a really crappy parent when you’re over on the computer while your kid moans from the other room.  Perhaps this is what makes flu season so scary and the flu vaccine such a balm.  We live lives without any backup—a sick kid can’t go to school, and there isn’t any other childcare available to many of us.  We can’t take the time off work because in the age of the iPhone, work is everywhere all the time.  A sick kid equals workplace disaster.  Preventing illness allows us to keep living lives that have little room for error, man-made or biological.

When the nurse gave the girls their post-vaccine pep talk about how they wouldn’t get sick now and wasn’t that wonderful, I found I only half agreed.  Sure, I don’t want them to be sick.  But maybe they should be.

My mother had polio, so I have to be careful not to suggest that the good old days were when people got sick.  Still, I think there is a point to getting “everyday” sick; illness is part of the natural rhythm of things.  And again, I’m not talking about H1N1.  I wonder what’s in store for this generation of kids, taught in classrooms with Costco-sized bottles of hand sanitizer, who learn group songs about the virtues of hand washing.  Will they be susceptible to super-illnesses because of their limited immune experience?  Will they be unempathetic to those who are ill or disabled?  Will they equate the good life with bodies and lives consistently under their control?

It all makes me long for that box of Luden’s, and a day on the couch.


November 9th marked the 20th anniversary of the fall of the Berlin Wall, a momentous event that signified the end of the Cold War.  I was a sophomore in college when the wall fell.  Today’s sophomores in college were not yet born.  I sometimes forget that a whole generation has now grown up in the United States with essentially no personal memory of the Cold War.  What they know of that forty-year geopolitical conflict they have learned in history class, or heard from parents and grandparents, or taken from popular culture.

When I was growing up in the 70s and 80s, the Cold War was a prominent backdrop to my life.  It was a continual presence in my pre-teen and teen imagination and in the broader culture of the time.  I worried about nuclear holocaust.  I was freaked out by the television movie The Day After.  The Soviet boxer Ivan Drago intimidated me in Rocky IV.  The opening scene in Red Dawn, in which invading Russians and Cubans parachute into a schoolyard and open fire on teachers and students, really unnerved me.  I ardently agreed with Sting when he sang, “I hope the Russians love their children, too.”  I wrote poems and songs about atomic war, radioactive fallout, and post-apocalyptic nuclear mutants.  I was actually very concerned about the mutants and what I would do about them if I survived an atomic attack.    

I wonder what the Cold War means today for Americans who did not grow up with it.  Our political discourse is rife with Cold War references, even if they are sometimes (okay, oftentimes) incorrectly used: President Obama is a communist, or a socialist, or a Manchurian candidate; health care reform will bring back gulags or make the U.S. like Cuba; communist symbols can supposedly be found on prominent Manhattan buildings (or so claimed Glenn Beck, telling Fox News viewers he found images of a hammer and sickle in Rockefeller Center).  Contemporary American culture is similarly throwing us back to the Cold War era: take, for example, the TV show Mad Men, the newest Indiana Jones movie, the documentary Virtual JFK (which imagines what might have happened in Vietnam if Kennedy had lived), and the recently published book The Hawk and the Dove, a dual biography of Paul Nitze and George Kennan.  There’s even a popular indie rock band from Fullerton called Cold War Kids.  And in 2010, Hollywood will release a remake of Red Dawn; in this version, teenagers will fight an invading force of Chinese and Russian soldiers.

So for the post-Cold War generation, that historical era lives on in partisan political name-calling and in popular culture.  Still, it’s strange for me to think that the Cold War is already becoming the stuff of popular memory.  For me, it will always be linked inextricably to my childhood, and especially to my childhood fears about how the world might end.  My college students today tell me that they don’t even worry about nuclear war.  Instead, they fear the world will end as a result of global warming or an infectious disease outbreak.  Or some believe that it won’t end at all. Perhaps that’s progress.


One of the laments we often hear about the state of childhood in America today is that childhood has become commercialized.  Our news media and bookstore shelves regularly feature stories about how American children are barraged by advertising and marketing: the average child views 40,000 television commercials a year, children as young as three can recognize brand logos, kids these days are “born to buy.”  There is no question that children are exposed to an inordinate amount of advertising and consumer messaging in their daily lives–just as adults are.  However, it’s sometimes easy to forget that what appears to be the latest “crisis” in childhood is not exactly so new.  I was reminded of this recently when I showed students in my popular culture class a clip from the 1950s children’s program “Ding Dong School.”

“Ding Dong School” (1952-1956, NBC) was hosted by Dr. Frances Horwich, an educator who pioneered the idea of interacting with her young viewers through the television set (an approach Mr. Rogers perfected years later).  “What day is today?” she would ask, then pause to give kids at home time to answer, “Friday.”  The content was meant to be educational and culturally enriching.  Like any other television show, however, “Ding Dong School” was commercially sponsored, and “Miss Frances,” as she was known, regularly pitched products to her young viewers.  The clip I showed in class was of Horwich hawking Wheaties; it can be found on YouTube.  Viewing this ad, we are reminded that direct advertising to children is nothing new.  Holding up a Wheaties box, Miss Frances asks kids at home, “What do I have? [pause] What do you think it is? [pause] Say it with me–Wheaties!”  She proceeds to instruct her young viewers, “When mother goes to the store, you help her find the new Wheaties box… I’ll tell her about it.  You tell her about it, too.  Please.”  This naked appeal to children to develop brand loyalty, to think like consumers, and to influence their parents’ spending is arguably no less sinister than what many Americans rail against today as the “newest” threat to children.

There is in fact a long history in the United States of manufacturers marketing their goods and services to children, using a variety of direct and indirect means.  Boxes of Cracker Jack started coming with toys inside them in 1912.  In the 1930s, many toymakers began appealing directly to children (previously they had targeted parents with their advertising).  Pinocchio toys were in production a year before the release of the Disney film in 1940.

There is similarly a long history of adults fearing that children are being unduly influenced by consumerism.  In 1902, for example, social reformer Jane Addams asked pointedly, “Has our commercialism been so strong that our schools have become insensibly commercialized?… Is it possible that the business men, whom we in America so tremendously admire, have really been dictating the curriculum of our public schools?”  And in 1956, Dr. Frances Horwich resigned from NBC because she was concerned that children’s TV programming had become too commercialized.

Watching Miss Frances sell Wheaties in class this week, and reflecting on this broader historical context, reminded me that many of our contemporary fears about American childhood have been around for a long time.  I was also reminded of a lament expressed by many scholars who study the history of childhood: As a society, we tend to do a much better job of complaining about the state of childhood than of actually working to change and improve the lives of children.
