The teacher protests currently taking place in Wisconsin are clearly significant for a variety of political and educational reasons. The events unfolding in Madison have amplified ongoing debates about unionism, the crisis in U.S. education, and the politics of state budgeting. The protests arrive after several years of escalating public criticism of teachers. And the outcome will potentially impact the contours of our national debate about education policy. What is happening in Wisconsin is also significant for cultural reasons. It is important to remember just how rare it is for Americans to see photographs and television footage of tens of thousands of teachers gathered together in one place. The uncommon images coming out of Madison are profoundly disruptive to our common sense about teachers and schooling. From a cultural perspective, the portraits of teacher protest now circulating in the mass media are especially striking for the ways in which they challenge traditional representations of the teacher in American society.

Historically, the teacher has been depicted in American culture as one of three types: the schoolmarm, the bumbling pedagogue, or the lone hero. The schoolmarm is typically an older, unmarried, rural woman who dedicates her entire life to her pupils. She is the Miss Dove character, or the prairie schoolteacher, or the imagined head of the classroom in the little red schoolhouse of yore. The bumbling pedagogue is usually male, often effeminate, and either clownish, pedantic, or otherwise socially awkward.  He is the Ichabod Crane of our popular memory, the Mr. Kotter, the Scott Guber of Boston Public. Finally, the lone hero is the renegade who operates in isolation. He or she employs a nontraditional pedagogy—often in opposition to an unsupportive administrator—in an effort to educate the students everyone else has given up on. From Blackboard Jungle to Stand and Deliver to Dangerous Minds, the lone hero is a staple of many Hollywood films.

Throughout the years, these stock images have mostly served to denigrate the profession of teaching, depicting it as a lonely, all-consuming, non-specialized career.  These fictional teachers rarely if ever complain about wages, benefits, or the excessive demands placed on them. They seem to willingly sacrifice their personal lives for the sake of their students. And they are almost never imagined as members of a teacher union. Such popular depictions also tend to suggest that teaching requires little if any formal training; in fact, many of these iconic pedagogues become teachers because they can’t do anything else. Most importantly, these images have worked to normalize the idea that public schools are crisis-ridden environments in which teachers must act in solitude, as outsiders, if they hope to accomplish anything.

Interestingly enough, these three dominant representations of teachers have receded from view in recent years, supplanted in the media by images of tough, pragmatic, business-minded reformers. I am thinking in particular of Michelle Rhee, former chancellor of public schools in Washington, D.C. For a while, Rhee became a media darling who was profiled in Newsweek and Time (where she was photographed holding a broom) and featured in the pro-charter school documentary Waiting for Superman. In a similar vein, business luminaries Bill Gates and Mark Zuckerberg have been spotlighted as the new philanthropic saviors of U.S. education (the Facebook CEO recently donated $100 million to Newark public schools). The media celebration of these corporate reformers has conveniently dovetailed with broader calls to privatize public schooling and introduce more business-inspired models of incentive pay and accountability into school systems.

At the same time that our iconography of school reform has been crowded with the likes of Michelle Rhee for the past few years, U.S. popular culture has propagated a more deviant image of the classroom teacher. Television shows in particular have featured teacher protagonists who presumably mean well but who must of necessity supplement their educator salaries with illegal activities.  Take, for example, the cancer-stricken chemistry teacher who cooks and sells methamphetamine on Breaking Bad, or the history-teacher-turned-male-escort on Hung.  Just this past week, the trailer for the forthcoming film Bad Teacher, starring Cameron Diaz as an apathetic, pot-smoking, foul-mouthed schoolteacher, went viral (curious timing, to say the least). Teachers like these reinforce the perception that we need no-nonsense leaders like Michelle Rhee to sweep schools clean of them.

All of this explains why the images coming out of Madison are so culturally significant. First, these images show thousands of teachers united in solidarity. Here we see not one teacher working alone, but a mass of teachers, a community, working together toward a common end. Second, these depictions of non-elite, everyday people taking to the streets in the name of public education contrast sharply with the lone power-suited administrator or the billionaire philanthropist imposing top-down reforms on supposedly inept teachers. To best illustrate this second point, I can't help but compare the pictures of thousands of teachers protesting in Madison with the Time magazine cover of Rhee standing alone with a broom back in 2008. The new imagery overwhelms the old, draining the Rhee photograph of its symbolic power.

Finally, these many iconic images of Wisconsin teachers project strength—not bumbling hesitancy, or shoulder shrugging resignation, but conviction and fortitude.

Without question, these cultural representations matter. As the editors of the 1994 book Schooling in the Light of Popular Culture remind us, education in the United States is “likely to be understood in ways that are at least in part beholden to popular images and ideas embodied in widely disseminated texts.”  The cultural stories we tell ourselves about schooling can shape how we discuss education, debate policy, and perceive teachers. Today, at least, the quaint faces of Ichabod Crane and the frontier schoolmarm have been replaced by a bold faculty of thousands. Long after the impasse in Wisconsin is resolved, the images emanating from there are sure to resonate in our popular imagination.

The Lonely Network

The great irony of The Social Network, of course, is that its central theme is not connectivity but disconnection. A film about the genesis of a technology designed to bring people closer together features a main character who experiences the painful dissolution of social bonds. The plot is driven by protagonist Mark Zuckerberg's exclusion from social groups, the end of his romantic relationship, and the collapse of his closest friendship. This sense of disconnection is pervasive despite the fact that The Social Network is a crowded movie: characters are almost never seen alone (except when they are individually "wired in" to their computer screens), and certainly never seen enjoying a moment of quiet solitude. Instead, the characters are regularly packed together in small places—a legal office, a dorm room—or in big, loud, impersonal places—a nightclub, a drunken party. But these spaces, however crowded, are repeatedly portrayed as lonely and alienating.

While the characters may inhabit a decidedly unsocial non-network beneath a façade of constant social interaction, the film itself serves as a remarkably vibrant cultural network. For the student of American culture, The Social Network is a fountainhead of intertextuality. Perhaps appropriately for a film about Facebook, The Social Network functions as a busy crossroads of cultural referents, many readily recognizable, others unstated but nevertheless present. The movie obviously plays on our familiarity with Facebook, but it also features appearances by Bill Gates and the creator of Napster (both portrayed by actors), a musical score by industrial rock luminary Trent Reznor of Nine Inch Nails, and a Harvard setting (even though campus scenes were mostly filmed at The Johns Hopkins University). It is also directed by David "Fight Club" Fincher and written by Aaron "West Wing" Sorkin. One of the students on Zuckerberg's Facebook team is played by Joseph Mazzello, the actor who starred as the little boy Tim Murphy in Jurassic Park. In other words, what is really "social" about The Social Network is the way in which it serves as a pulsating intersection for a range of icons, myths, and expressive forms that circulate in the audience's collective imagination. It is populated by cultural detritus and ephemera with which we are friendly, if you will.

I imagine these multiple and varied cultural associations may in part explain the source of the film’s pleasure for viewers. The experience of viewing The Social Network is akin to data-mining.  It rewards a 21st century audience accustomed to scanning mounds of digital information and quickly categorizing that information into familiar frames of reference. For example, the brief appearance of Bill Gates evokes our Horatio Alger myth of success and the American dream.  The presence of Sean Parker of Napster infamy conjures associations with the lone rebel, hacking the system and sticking it to the man.  And Zuckerberg himself comes across as a nerd pulling one over on the Olympic jocks.

Reznor’s musical score triggers memories of his earlier work on such Nine Inch Nails albums as Pretty Hate Machine (1989).  That record opens with the line, “god money I’ll do anything for you/god money just tell me what you want me to,” and builds to the chorus, “head like a hole/black as your soul/I’d rather die/than give you control.” Pretty Hate Machine, with its loud synthesizers, drum machines, and vocal wails, is not unlike The Social Network: an expression of male adolescent angst and rage confined inside an electronic world.

And there are still other resonances: Fincher's direction recalls his previous explorations of masculinity and antisocial behavior in Fight Club, The Game, and Zodiac. Sorkin's dialogue echoes the brainy loquaciousness of the political characters he developed for the television show The West Wing. Nearly twenty years ago, in Jurassic Park, actor Mazzello played a child victimized by a technology abused and gone awry.

As I watched The Social Network, I even found correspondences with The Lonely Crowd (1950), the sociological study of “other-directed” social character that became a bestseller in postwar America.  Co-authored by David Riesman and his team, the book argues that Americans are increasingly motivated by the need for peer acceptance.  More and more, our “inner gyroscope” is set in motion not by individualistic self-reliance, but by the drive to win approval and fit in.  Consequently, our time is spent trying to decode what is popular and adjust our personalities accordingly: “The other-directed person acquires an intense interest in the ephemeral tastes of ‘the others’… the other-directed child is concerned with learning from these interchanges whether his radar equipment is in proper order” (74). What is The Social Network if not a depiction of a lonely crowd?  Indeed, isn’t Facebook itself a kind of lonely crowd?

I can't help but wonder if this way of reading the movie—this pleasurable scanning of its cultural allusions—in some way works to conceal its underlying critique of 21st century connectivity. The film's central theme of social dissolution is undercut by its teeming network of familiar, friendly signifiers. Its "ephemeral tastes." Yet we are "friends" with these signifiers inasmuch as we are "friends" with hundreds of people on Facebook—another text, by the way, that we scan with the same data-mining mindset. As portrayed in The Social Network, Facebook doesn't really bring people closer together in any meaningful way; it facilitates casual hook-ups in bar bathrooms, or it breeds addiction, as one character confesses, or it quickly delivers trivial information like the results of a crew race. You would think The Social Network, with its depiction of Zuckerberg and his creation, would compel a mass exodus from Facebook, or at least spark critical public reflection on our complicity with this technology of the lonely crowd. But instead it rewards and reinforces our ways of reading: our ingrained habits of consuming images, scanning information, reducing human experience to pannable nuggets of gold.

I have two bookish confessions to make.  The first is that I keep a running list of every book I have read since I graduated from college some twenty years ago.  The list includes the author and title as well as the month and year I finished each book.  The list includes books I read for the classes I teach as well as books I read for “pleasure.”  The list includes unabridged audio books, or at least it would if I had ever actually finished listening to one in its entirety.

I keep the list to remember and to mark my progress—each year I try to read at least one more book than I read the previous year. The list also helps me recall not just the titles, but the cities in which I lived and the specific places in which I read those books: at a coffee shop in Austin, on the beach on vacation in Puerto Rico, in my grandparents' house one winter in Vermont, on the couch in my small apartment in New Hampshire, in a summer sublet in Boston. The list generates strong associations with the past—with the seasons, the weather, the people around me, the fleeting episodes of happiness, confusion, sadness in my life. The list, in other words, is more than a tally of books. Each book anchors me in a moment.

My second bookish confession is that each year, after Thanksgiving, I begin the pleasantly agonizing process of selecting the books I will take home with me for the holidays. I stack, I list, I revise lists, I restack.  I try to figure out which books I want to relish during my December visit to family in New Jersey.  It is, admittedly, quite the process.  Almost always it must be a novel.  Something I have wanted to read for a while.  Something that will offer satisfaction and escape, and not too much challenge.  Something I will look forward to reading after my parents have gone to sleep (which happens earlier and earlier each year), or before my nephews rush into my bedroom to wake me up the next morning (which, thankfully, happens later and later each year).  I usually start with about fifteen books and taper it down to two or three by the time I pack my suitcase.

This holiday season, I took a moment to revisit the books I read during past December visits.  Ever the connector of dots, I found some curious patterns.  Unbeknownst to me until now, the books I read over the holidays tend to fall into four sometimes overlapping categories: from mom’s bookshelf, work-inspired, political, and ambitiously literary.

Apparently on several visits home, I abandoned my carefully chosen books and instead selected one from my mother’s bookshelf in her office.  My mother likes buying used books at public library sales, and she’s amassed an impressive collection over the years. I pulled Hard Times from the shelf in 1994, A Clockwork Orange the year before (why did my mother have this one?).

When I taught in an Education department at a college in North Carolina a few years ago, it seems I was inspired to read school stories: Old School, the excellent prep school novel by Tobias Wolff, followed the next winter by the classic Goodbye, Mr. Chips.

Election seasons punctuate my list: Howard Dean’s Winning Back America in December 2003, Barack Obama’s The Audacity of Hope in December 2006.

The rest of the books fall into the ambitiously literary category, books I must have picked because I was on some mission to expand my literary horizons over the holidays. One Christmas I read Kafka's The Castle. Another year I worked through Nabokov's Pale Fire (so much for the "not too challenging" criterion) and Kurt Vonnegut's Cat's Cradle. In 1997 I read a biography of Zora Neale Hurston. In 2008, it was Toni Morrison's A Mercy. Raymond Carver's Cathedral and George Eliot's Silas Marner were my Yuletide pleasures in 1996.

What these four categories have in common is setting: my parents’ house, the place I left at age 18 and return to every Christmas.  These are the books I have read in my childhood bed, now long donated to one of my sisters.  In the guest bed in the basement.  On the couch in front of the fireplace in the living room.  In the oversized recliner in the den.  Each holiday book brings back the smells and tastes of home cooking, family voices, a glimpse of snowflakes outside the window, the barking of my parents’ dog.  This may be why I select the books with such care: they will absorb these memories.  Preserve them.  These books will carry a little bit of family history with them.  They will be so much more than an author and a title on a list.

Which reminds me—I still need to whittle down my stack for next week.  Happy holidays, readers.

After watching the documentary Food, Inc. the other day, one scene stood out to me. It wasn't one of the film's most shocking, arresting, or appalling moments. It actually unfolded in a rather quiet setting. A Walmart dairy buyer stands just inside the fence of an organic farmer's lush green fields. He is there to buy as much hormone-free milk as he can gobble up. "We won't be here," he admits, "if it wasn't for customer preferences."

The Walmart buyer's statement says a lot about how the post-need consumer economy works. Even the largest retailer and supermarket chain in the world has to bend to "customer preferences." This points to something essential about the nature of consumer transactions. Despite all the Mad Men and Madison Avenue manipulation, consumption, especially of relatively cheap, faddish items like food and fashion, represents what we might think of as a rough democracy.

Walmart and other companies need to give us — consumers — what we want, or we will go elsewhere. The rough democracy of desire means, then, that we vote with our money and credit cards at the point of purchase. What's popular sells, what isn't doesn't. (Remember New Coke.) It also means that we can use our buying muscle to shape policies at the top — what the companies we patronize sell and how they operate in the global marketplace.

While I was doing the research for my book, Everything But the Coffee, I traveled around the world going to Starbucks. The results were, in some ways, rather disappointing. For the most part, a Starbucks in Singapore looks, runs, and tastes exactly like a Starbucks in Seattle. Except for one thing: Starbucks devotes different amounts of signage, beverage offerings, and shelf space to fair trade coffee in different parts of the world.

In China and Japan, Starbucks stores said nothing about fair trade: no signs, no brochures, no messages on the backs of cups. When I asked a Starbucks official in Japan — an American who didn't speak Japanese — why there weren't any fair trade drinks or signs with fair trade coffee farmers on them in Tokyo, she paused for a moment and said, "no one asked."

No one asked. Well, British customers must have asked. When I visited a Starbucks in Norwich, England, in 2009, there were signs everywhere about fair trade. Grizzled, happy, handsome, hard-working farmers — imagine Latino versions of the Marlboro Man — looked down from posters on the walls and in the bathrooms, reassuring customers concerned about where their beans came from that their purchases improved the daily lives of growers in Central America and beyond. Sixteen months later, I went to that store again and found out that Starbucks in the United Kingdom had dramatically changed its policy. "Every Latte, Every Cappuccino," the cups promised, was "100% Fairtrade coffee."

In the US, the status of fair trade falls somewhere between Japan and the United Kingdom. Less than 10 percent of the beans Starbucks uses here, where the company operates more than 10,000 stores, come from fair trade farms, though at least a quarter of the company's signage seems to talk about Starbucks' modest fair trade purchases. On college campuses, where fair trade support is ostensibly the highest, the company regularly features Café Estima (from the Spanish for esteem) — its fair trade blend — as its coffee of the day.

Thinking back to the comment from the Walmart buyer featured in Food, Inc., the differences in fair trade at Starbucks can be read as a poll — a barometer, really — of support for global awareness and fair trade consciousness in different countries around the world. These disparities also tell us something about the rough democracy of buying. Companies will, as the Walmart man tells us, shape their products to meet consumer desires. Consumers, then, need to be more aware of their power. If they raise their voices, or withhold their purchases, firms will respond. That's what happened with Starbucks. Japanese customers haven't asked for fair trade coffee, so they don't have a choice. But in the UK, the customers wanted it and got it.

The realm of consumption may just be a new — or renewed — front for justice. Perhaps it is here — even more than the political realm, where Senate seats are going in this election cycle for between $10 million and $141 million — that consumers can have the greatest efficacy and be heard most clearly.

But this remains only a rough form of democracy. Corporations aren't the most publicly minded or trustworthy of allies. Like crafty centrist politicians, they want to co-opt and de-politicize issues. They are interested in more votes — in more customers — not justice, or even fair trade. But they can be moved.

What do Philip Roth's polio novel Nemesis and Justin Cronin's vampire novel The Passage, both published in 2010, have in common? Quite a bit, actually. Roth's novel renders the atmosphere of fear surrounding a polio outbreak in Newark, New Jersey in the summer of 1944. The protagonist is a young playground director futilely trying to protect the children he supervises from contracting polio. Cronin's novel imagines a world ninety years in the future that has been overrun by killer vampiric "virals" as the result of a scientific experiment gone awry. It focuses on a small colony of people struggling to defend themselves against the virals. Both books deal with the pervasive threat of an invisible, poorly understood contagion that has no known cure. Both focus on adult attempts to shelter children from the contagion. In both, characters wrestle with existential questions and religious doubt—why would God inflict polio on children? Do virals have souls? Both books demonstrate how natural shifts in time—spring to summer, day to night—can heighten the threat of contagion and force humans to change their routines and behavior. Both show how suspicion and fear can cause communities to turn against one another. As cultural artifacts of the moment, both novels also resonate powerfully with contemporary anxieties surrounding contagion in our everyday lives.

In recent years, Americans have been alerted to the threats posed by multiple toxins and diseases: salmonella in eggs and peanut butter, E. coli in spinach, cadmium in Shrek glasses sold at McDonald's and jewelry sold at Wal-Mart, H1N1, West Nile virus. Such warnings came about against the backdrop of broader fears about biological terrorism post-9/11. The current wave of viral panic is of course the most recent chapter in a very long history of epidemics—Ebola in the 1990s, mad cow disease in the 1980s, polio in the 1930s and 1940s, the Black Death in the 14th century, and so on.

There is a cultural dimension to this biological history—a cultural history of contagion. Nemesis and The Passage are the latest examples of artistic expressions, literary and otherwise, that in some way reflect, reimagine, or comment upon viral panics. Think Daniel Defoe's A Journal of the Plague Year (1722), Thomas Mann's Death in Venice (1912), Albert Camus's The Plague (1947) (literary critic Elaine Showalter recently made a brilliant connection between Nemesis and The Plague), and Stephen King's The Stand (1978/1990). Think nonfiction books like The Hot Zone (1994). Or films like Panic in the Streets (1950), The Omega Man (1971), and Outbreak (1995).

As artifacts of the 21st century culture of viral panic, unlikely bedfellows Nemesis and The Passage join a diverse cultural cohort that includes the comic book series Y: The Last Man, about a plague that kills every mammal on earth with a Y chromosome; the films 28 Days Later and 28 Weeks Later, featuring the "rage" virus; the horror movie Cabin Fever, starring a flesh-eating virus (tagline: "Catch It"); Max Brooks's book World War Z, an "oral history of the zombie war"; and the forthcoming young adult novel Virals, about a group of teenagers exposed to canine parvovirus.

These contemporary examples of the culture of viral panic offer audiences ways to process a number of fears and anxieties that circumscribe our everyday lives in the 21st century.  Anxieties about the limitations and abuses of science and medicine.  Anxieties about our ability to protect children and youth from invisible menaces.  Anxieties about community cohesiveness and civility.  Anxieties about the government’s ability to respond to disaster.  In other words, the culture of viral panic is never just about the threat of contagion.  It always speaks to broader concerns about institutions, social relations, childrearing, science, spirituality.  Read in this context, Nemesis and The Passage have a great deal in common.

Stomach Share

What would you think if someone told you that they were fighting for a "share" of your stomach? Does it bring to mind organ harvesting? Alien invasion? Theft?

But this is in fact what the food industry is doing, and has been for some time. I first heard the term last week when I took part in a to-remain-nameless gathering of food experts in the Bay Area. It came up in a discussion of how we might, as eaters, make healthful choices in the American food marketplace. Someone in the room recalled being at a recent food industry gathering where executives from a soda company were debating how to increase their "stomach share." They were seeking to expand their line of products (from sodas to juices, waters, and exercise drinks) to make sure that whenever someone put a beverage in their stomach, it was from company X. Rather than merely competing with another brand in, say, "the marketplace," the "stomach share" metaphor takes the battle to the consumer's own body. The question is not just how to ensure that the consumer buys the maximum amount of our product, but how to ensure that whenever the consumer ingests anything, our product claims the majority share of the space.

This stomach talk reminded me of another phrase I'd come across in my research for Empty Pleasures—"prosperity stomach." Coined in 1966 by Henry Schacht, an executive from a diet-food company, in a talk to newspaper editors called "How to Succeed in Business without Getting Fat," the phrase referred to a troubling problem faced by the food industry. Because people (at least the middle class) did less manual labor and had more money to buy the cheaper food produced by American industry, they had begun to gain weight. That wasn't really the problem from Schacht's point of view. More troubling was that this weight gain meant that they could not—or would not—buy all of the food they wanted—food that industry could profit by selling. The answer? Diet foods. By developing more foods with fewer calories, manufacturers and marketers could reap profits beyond the natural limits of the stomach.

"Stomach share" and "prosperity stomach"—terms invented nearly fifty years apart—remind us that the food industries have long viewed consumers as reducible to mere storage spaces for their products. Within this climate, the wonder is not that our stomachs have expanded; it's that they have not expanded even further.

Slow foods & local foods are fabulous, and we should be grateful to those who have made them part of the landscape.   Thanks to advocates in recent years, many of us can now purchase a tomato from the store and know where it came from or have a conversation with a grower while getting apples at the local farmer’s market.  These encounters allow us to better understand our food, and they make eating more pleasurable by connecting us to the past, and to each other.

But they do not solve the fundamental problem we face with American food and the way we’ve been taught to use it.

Our generation has inherited a food system—and by that I mean everything from how our food is grown, to how it is processed, to its flavors, to its branding, marketing, and store shelf placement—that depends on convincing people to eat and drink way more food than they need, way too much of the time.

Within this system, it would be wonderful if everyone could go local—slow—organic. But better health doesn't depend on it. Better health does depend on being able to eat moderate amounts, for rational reasons, and to stop when one is full.

It's not a zero-sum game, of course. We can have local/slow food and ethical/sustainable food production and marketing practices. Yet I notice that I hear far more about the importance of yummy foods and regional farmers than I do about the importance of fair, just, and rational food landscapes. Why not advocate for the heirloom tomato and the concept of a once-in-a-while soda and the removal of nutritionally bereft (2 for 1!!) foods from the end of the aisle at the grocery store? Consider what would happen if some of the efforts we put into creating and sustaining farmers markets were diverted to building better barriers between food promoters and American stomachs.

I think part of the problem is that the food revolution has become, in a way, too tasty. We want to advocate for structural change, but we want to do it through a good meal that we enjoy. Fighting for accurate claims in food marketing (diet! natural! healthy!) or dissecting grocery store product placement—these are a long way from the meals many of us who care deeply about food would even want to eat. Thus, while we feel occasional outrage (bewilderment?) when we see a mega display pushing three 12-packs of soda (+ chips!) for $10, our attention can easily drift away from problems that, if solved, would benefit someone else (who doesn't "eat right" anyway…) toward those that benefit ourselves.

We should sit down together at the table of slow/local and celebrate our good fortune.  And, when the meal is done, take that tasty energy we’ve ingested and use it to regulate industry claims and prevent the over-making and over-marketing of all kinds of food in the US.

That way, no matter how fast their food moves, all eaters could have a better shot at health.