In my late thirties, I developed an obsession with American cinema of the 1970s that surfaced yearly in the dead of winter.  The obsession would start with me reading a scholarly book about seventies films.  This would be followed by a manic updating of my Netflix queue and then marathon screenings.  I would usually burn out by late March.  I would move on to other interests.  I would forget about seventies films.  But the cycle would start again the next year.  Always like clockwork, yet never planned.

The obsession began in the winter of 2006, when I bought The Films of the Seventies, an annotated filmography of every American, British, and Canadian movie that had come out between 1970 and 1979.  I bought the book on a lark at a used bookstore in Winston-Salem.  I was living in North Carolina with a girlfriend, in a relationship that was still a year and a half away from its end but already on the decline.  When I got the book home, I highlighted all of the films I had already seen: Jaws, Star Wars, Breaking Away, Alien, Grease—the stuff of childhood memories.  Then I proceeded to watch classics that, embarrassingly, I hadn’t yet seen: Dog Day Afternoon, Network, Serpico, All the President’s Men, The French Connection, Chinatown.  At the time, I saw this as an exercise in self-education.  I had recently finished a Ph.D. in American Studies, and I felt I needed to know about these films.  The project preoccupied me for a few months, and then it ended.  Or so I thought.

In February 2007, the obsession returned.  I read, in quick succession, Peter Biskind’s Easy Riders, Raging Bulls: How the Sex-Drugs-And-Rock ‘N’ Roll Generation Saved Hollywood (a book I had been assigned in grad school but never read) and Peter Lev’s American Films of the 70s: Conflicting Visions.  The reading had nothing to do with my teaching or my academic research.  I just found myself enthralled by these assessments of cinema and the 1970s.  Biskind hooked me when he wrote, “this was a time when film culture permeated American life,” when film was “no less than a secular religion.”  I was intrigued by his claim that “this was to be a directors’ decade if ever there was one.”  Lev similarly captured my imagination when he noted that this was “the most exciting and most experimental period of the American feature film.”  I made lists as I read, and I subsequently watched movies like Shampoo, Scarecrow, The Parallax View, The Conversation, Marathon Man, The Last Detail, and, fittingly enough, Obsession.  This time, my fixation lasted into the spring.  By the summer, I had moved to California to start a new job and my six-year relationship had ended.  I was thirty-seven years old.

A year went by without a relapse.  I was busy teaching at a new school and processing my failed romance.  I spent February and March in a battle with a loud, temperamental neighbor.  In search of a fresh start, I moved into a beachside apartment in April.  I cultivated new friendships.  There was no seventies that year for me.

In the winter of 2009, however, it started again.  I devoured a collection of essays, American Cinema of the 1970s: Themes and Variations.  I made more lists.  My viewing included Klute, Nashville, Night Moves, Joe, Phantom of the Paradise, Hardcore, McCabe and Mrs. Miller.  But that year, I had a revelation.  I recognized a pattern not in the cycle of what I was doing, but in my experience of it: as it turned out, every time I revisited the ’70s, I liked reading about the films more than I liked watching them.

The books made the movies sound better than they were.  The books put the films in a historical context, explicated their cultural themes, imbued them with political significance—Watergate and Vietnam, the working class, paranoia and conspiracy, generational strife.  But the movies themselves?  They were often plodding, incoherent, indulgent, dull.  Not all of them.  But many of them.  The Last Detail bored me.  Night Moves confused me.  Phantom of the Paradise was wacky.  Joe was improbable.

In retrospect, the books I had read warned me of this. Peter Lev, for example, states bluntly (albeit on the very last page of his book), “one should not overly romanticize the films of the 1970s.  In a period of uncertainty and change, many mediocre films were made.”  In his introduction to Themes and Variations, editor Lester Friedman counsels the reader to beware of “hyperbolic assessments” of the era, for they “belie the fact that terrible movies were made during the 1970s.”  He writes this on page twenty-three, so I suppose I have no excuse there.

The same thing would happen each year. I regularly found myself waiting impatiently for a movie to end just so I could highlight the title in my filmography and be done with it.  Or avoiding watching it altogether.  My ex had noticed this tendency of mine when we still lived together in North Carolina.  Some of the titles that arrived from Netflix would sit next to the television for weeks.  She would threaten to mail them back the next time I was away from the house.  “You don’t have to do this to yourself,” she said at one point, sending us both into a fit of laughter.  And yet I did.  I was stuck in a cycle of pleasure and tedium.

Then I turned forty in 2010 and it all stopped.

I have two related theories about my obsession with American cinema of the 1970s.  The first is that the books I read provided a historical context for a time in my life when I was aware of very little beyond my own childhood.  The books offered a portrait of a cultural milieu that was taking shape outside of my tiny world of family, play, and school.  I could insert the memories and images I had of my own seventies childhood—a quite happy childhood—into a bigger picture of what was going on in America.  This was an exercise in intellectual nostalgia, in historicizing my youth.  That’s why I liked reading about the films more than watching them.  The moving pictures I had in my head, of me then, were much better.

The second theory is that reading about the seventies provided an escape.  In my late thirties I was going through major transitions: a stressful job search, the end of a serious relationship, a move to California.  Rather than deal with the corrosion and change that defined my present, I retreated to a safer past.  It wasn’t really about the seventies at all.  Sometimes we sit through bad movies because we need to.

So was my obsession with seventies films about nostalgia or escape?  I’m not sure they’re so different.  I was certainly obsessed with both at a particular stage of my life.  All I know is that when I finally let her go, when I turned forty and resolved to look forward more than I looked back, that decade also lost its purchase on me.  It’s true that one should not overly romanticize American cinema of the 1970s.  But it’s also true that one should not overly romanticize the past.  Those moving pictures may sound good in theory, but sitting through them can just be another way of making the present pass by.

Not Again

When I first learned of the massacre of children and teachers in Connecticut, my first thought was “not again.” Based on my Facebook and Twitter feeds I am not alone. “Not again,” we say, first in disbelief, and then in resolve. Yet until something changes, we all know, it will happen again.

The first step, I think, is to realize that we face a human problem. Evil is part of our nature as surely as good is. We are a species of violence. This is a profound truth—a historical truth, a biological truth, a moral truth. The great theologian Reinhold Niebuhr once declared, “The doctrine of original sin is the only empirically verifiable doctrine of the Christian faith,” and now we have yet more horrific evidence of this truth. Yet if we stop our thinking there, we remain mired in a hopeless nihilism. Surely evil will always be with us—in this world, at least. But the persistence of evil is no reason to cease working for good. We can’t make a perfect world, but we can make a far better one.

We are now appallingly familiar with the script that plays out in the hours and days after an atrocity like this, and so we know that soon enough some pundits or politicians will start declaring that the answer to gun violence is more guns. If only a teacher had been armed, or the principal. Since evil will always be with us, they say, if the other guy has a gun, I better have one too. A bigger one. This is the answer of moral nihilism. We can do better.

Pondering evil is a fine place to start, but we need to understand that this kind of evil is not just a human problem—it is, sadly, a particularly American problem. We live in a culture diseased by violence. Ours is a frontier society, born and nurtured in conquest, genocide, enslavement, and revolution. Ours is a beautiful and noble experiment in freedom and democracy too, admirable and good in many ways—but also, clearly, a society still living off an inheritance soaked in blood. This inheritance has left us a culture that glorifies war and weapons, that preaches perverse doctrines of honor and distorted notions of manhood. The Swedes and Japanese and Indians share our same human condition, but they don’t kill like we do. We are reaping what we have sown.

This realization too might lead one to despair. With so many guns, and such a legacy of violence, the best we can do, say the gun lobby and its enablers, is arm ourselves, protect our own. Aside from the empirical reality that such defensiveness rarely works—as was the case in Newtown, the guns in our homes are far more likely to kill through crime or accident than to protect—this position fails to understand that cultures can and do change. Our American cult of violence is a product of history, and it can be undone in history. Not easily, not quickly, but schools and parents and churches and governments and people of courage and character can move the tectonic plates of culture. We may never live in a world without violence, but we can live in a society that doesn’t worship killing and glorify killers. We can. Some day.

But we don’t have to wait for that day before we save lives. We can start tomorrow.  Most obviously:

We can regulate guns.
We can provide better mental health care.

These steps would vastly reduce the mass killings that happen in those places where people say, “we never thought it could happen here.” To middle class folks. To white folks. In malls and offices and kindergartens. Good policy ideas abound, ideas that could immediately reduce the number and scope of these kinds of mass murders. All we need is the political courage and leadership to make it happen. These simple steps really shouldn’t be that hard.

Most murders, of course, don’t happen in those places. Most killings happen in ghettoes of violence, in the places where people know all too well that it can happen here, because it does, relentlessly. The chronic violence of these forgotten places, places where the poor and black and brown live, is a product of our particular history too, the result of our uniquely American combination of racism and neglect and greed. We can and must root out the violence in these places, through policies that provide education and jobs, hope and opportunity, free from the shackles of systemic poverty and racial injustice.

This is hard, never-ending work. But we must start, now, if we want “not again” to mean anything more than an empty cry.

I am a member of the faculty at UC Davis, and my campus has been roiled these past few days by violent actions taken by the campus police, directed at non-violently protesting students. The video of campus police pepper-spraying students has been seen ’round the world, and letters of protest, outrage and demands have been blowing up my email inbox all weekend.


On Sunday, just as I was catching up on Friday’s events, I was also set to present at a panel at the American Academy of Religion Conference in San Francisco.  The panel was convened by the good people behind Freq.uenci.es, a “collaborative genealogy of spirituality” and a website to which I’ve contributed.  I prepared some remarks, but I felt impelled to preface them by saying how weird it felt to talk about spirituality when life on my campus — for my students, colleagues, and friends — had changed irrevocably.  I opened with some comments about feeling that tension, and then went on with the show.  I explained that I was having a hard time reconciling these two things, but that I was okay living with that unreconciled tension.


This morning, on my way to participate in the General Assembly at UC Davis, a friend told me about another video.  A second video – not of police and protesters, but of the Chancellor, Linda Katehi, being walked to her car following an on-campus meeting Saturday night.  In the video, you see Katehi walking down a road, flanked by hundreds of silent, standing students, making their protest as loudly as possible by not saying a thing.


All you hear are her heels click-clacking on the asphalt.

That video, not the one of the cops, speaks to the power of these students and this community.  And the sound of her heels amidst the crowd of hundreds of students expresses perfectly the potential in this moment.  Potential that is political, and dare I say: spiritual, too.


The most recent data released by the U.S. Bureau of Justice Statistics (2008) disclosed that 11.7 million persons in the United States experienced at least one attempted or successful incident of identity theft during a two-year period.  Incidents increased 22% from 2007 to 2008.  For the past decade it has been a federal crime to “knowingly transfer, possess, or use, without lawful authority, a means of identification of another person.”  A steady stream of news stories reminds us of the dangers of identity theft and identity impersonation—from stolen credit card and Social Security numbers, to hacked bank accounts, to child predators trolling social networking sites under false identities.

Our legal system and news media have sent us a clear message about identity theft: it is criminal.  It is unacceptable.  It is a menace to society.  You need to protect yourself, vigilantly, against it.  But in the imaginary world of popular culture, the ethics of identity theft is, well, far less rigid.  In the fictional stories we tell ourselves about identity fraud, the act itself can actually be a justified means to a moral end.  In the past year alone, three films have been released with plots that are propelled by characters who assume false identities, identities inhabited “knowingly” and without permission.  In the sci-fi film Source Code (2011), a soldier is transposed into the body of an everyday man who is riding a train that will blow up in eight minutes; he is occupying the man’s body in order to try to find the bomber on the train.  In Unknown (2011), a doctor awakens after a car accident to find that another man has assumed his identity, is married to his wife, and has his career (the other man’s photograph even shows up when the doctor’s name is searched on the internet).  Finally, the documentary Catfish (2010) explores the world of Facebook and the ways in which users perform their identities for people they’ve never met in real life.  I don’t want to give too much away about any of the three movies, as each contains various twists and turns, but suffice it to say that identity cloning and manipulation is a theme that runs through all of them.

And here’s what strikes me about this cultural content and its broader context: in each case, the ethics of assuming someone else’s identity is not exactly black and white. In fact, in the moral logic of Source Code, posing as another person is wholly justified if it can help save the lives of others.  In Unknown, it turns out that identity fraud is ultimately a redeeming act for one of the characters.  And Catfish challenges the audience to reflect on whether the good works that can result from a questionable identity performance justify the deception itself.  In short, if another person’s identity is assumed for reasons other than criminal purposes and/or economic gain, then it may just be okay.

It’s not just these three movies that explore the ethical gray area of identity theft.  We can go back farther into the decade to find others.  As readers of the Harry Potter series know, the magical “polyjuice” is a potion that will transform the imbiber into another person, at least in outward appearance.  While it is used for evil purposes in one of the books, it is also used by Harry and his friends Hermione and Ron throughout the series to obtain valuable information and pursue dangerous missions.  Their temporary acts of identity theft are deemed morally defensible.  Another example: the hero protagonist of the popular 2002 film Catch Me If You Can is a clever identity imposter—based on a real-life criminal—whom the audience finds itself rooting for.  And in the TV show Mad Men (spoiler warning…), the lead character is not actually the “real” Don Draper, but a Korean War vet named Dick Whitman who has assumed the identity of an officer killed in the war—an officer named Don Draper.

Of course, the theme and plot device of false identity is not a new preoccupation of literature and cinema.  Mark Twain explored it in Pudd’nhead Wilson, Herman Melville in “Benito Cereno,” Vladimir Nabokov in his dark novel Despair.  Older movies, like The Day of the Jackal, the film version of the play Six Degrees of Separation, and the science fiction story Gattaca, all revolve around identity fraud.  But in the world we live in today, where online identity theft is a rapidly growing criminal phenomenon, I find these contemporary iterations of the theme even more noteworthy.  As the performance of our identity becomes more fluid, unstable—indeed more unconfirmable—in this online age of social networking, it seems that our sense of the morality of identity performance has become more muddied and culturally contested.  The law and the news media are telling us one story about the ethics of identity theft and manipulation; our popular culture is telling us another; and, I daresay, what we post on our own Facebook and Twitter accounts is telling us a third.

1. Gather your friend(s) together.
2. Listen to a new episode of This American Life.
3. At the end of the episode, as Ira runs down the credits, each person guesses which clip will be used to mock Torey Malatia.
4. The person who guesses correctly gets to drink.
5. Repeat.

How did the humanities lose out to neuroscience in making culture seem interesting?

I’ve been listening to a lot of NPR and podcasts lately. I’ve given my historical favorites a little break (sorry, This American Life and Sound Opinions), and I’ve been listening more and more to Planet Money and Radiolab (as podcasts), and to the usual NPR fare that airs while I do dishes or cook dinner: All Things Considered, Marketplace, and of course: Fresh Air.

What I’ve noticed is how often scientists and economists show up on these shows to talk about things I thought were the main interests of humanists and social scientists. Questions like how restaurants work, whether or not race matters, why certain songs get stuck in our heads, how people calculate the value of things or make decisions they know are not in their best interests, and so on.

These are the questions to which I have long sought answers by looking at culture and its various expressions, and in which my field of American Studies has long been interested (albeit in different forms, over time).

Yet somehow, every time I turn on the radio, I find one or another neuroscientist (or, often enough, Jonah Lehrer) talking about precisely these same questions, and about how the pathways of neurons and synapses can answer questions about art or love or whatever.

So here’s my question to my colleagues: how did we become so untouchable or so uninteresting to mainstream media? How come the good people at NPR (and, presumably, their listeners) find neuroscientists and economists more interesting and more capable of talking about these questions than we are? How did they become the go-to academics for understanding how and why people do what they do? Social scientists and humanists look at those phenomena, too, but somehow, we have become generally less interesting than our colleagues.

This is not the neuroscientists’ fault: they are good at what they do, and their creativity in asking profound questions that teeter on the line between culture and biology ought to be encouraged. Similarly, it’s not the fault of the radio programmers; they are looking for the most intelligent, engaging guests they can find. And they’re finding them in neuroscience and economics, not in the Humanities.

Why is everyone else talking about culture but us? Are we that boring? Have we grown so adept at gazing at our own navels that we can’t talk about other things? Does “the public” think that so-called “hard” science is really the only arbiter of actualities in the world?

How have we become so irrelevant even on topics that are ostensibly our areas of expertise and scholarly interest?

Whenever someone (often on the internet, but not always) wants to emphasize how smart the collective masses are, or point to the effectiveness of the “hive mind,” they always end up looking at Wikipedia.

But why hold Wikipedia up as a bastion of either mass-generated wisdom or productivity? Why make Wikipedia the pinnacle of the possibilities of collective action?

Yes, Wikipedia is both a tremendous source of information and an example of the possibilities of a kind of open-ended collaborative effort to tap the massive labors of folks around the world who would, historically, have been shut out of the process of knowledge production. And that opening-up of access is a good thing, generally speaking. However, the product of such opening-up might be a more generally accurate account, but it’s also a less curious, less inspiring, and ultimately less informative one.

Wikipedia gives us information, but beyond that, I’m not sure it gives us very much.

Pointing to Wikipedia or the Encyclopaedia Britannica as the best of a culture’s intellectual output makes about as much sense as looking to sausage for the best of a culture’s protein sources. Both are collections of abstract and often reductive information — no matter how extensive that information is. Rendered, often, in the authoritative prose of clumsy journalism, both sources are and remain locations for fact checking and the collection of surface-level information. They’re useful, and (to extend the sausage metaphor) they might even be a little tasty, but they are hardly the finest examples of intellectual production to which we can point. Wikipedia is great for checking dates of major events or grabbing thumbnail sketches of historical figures, but it is not a terribly productive location for fostering curiosity or encouraging complex intellectual investigation.

Wikipedia offers information at its most basic and banal. It does not signal the “death of the expert,” but its eternal life. And I don’t mean the good kind of expert, either. I mean the niggling, nit-picking kind of expert who treasures the firmness of facts over the flexibility or rigor of intellectual labor.

Wikipedia is great at presenting gross information, but it is much worse at presenting nuanced argument. The Encyclopaedia Britannica is not much better, for that matter, but what the continual stoking of the Wikipedia v. Britannica debate suggests is not the triumph of the collective mind over the individual brain. Nor does it indicate the death of the expert. Instead, it illustrates the embrace of the middle by the mass — which doesn’t tell us anything we don’t already know, and which is exactly where Wikipedia belongs.

Coming Soon: The Death of the Death of the Expert


The teacher protests currently taking place in Wisconsin are clearly significant for a variety of political and educational reasons. The events unfolding in Madison have amplified ongoing debates about unionism, the crisis in U.S. education, and the politics of state budgeting. The protests arrive after several years of escalating public criticism of teachers. And the outcome will potentially impact the contours of our national debate about education policy. What is happening in Wisconsin is also significant for cultural reasons. It is important to remember just how rare it is for Americans to see photographs and television footage of tens of thousands of teachers gathered together in one place. The uncommon images coming out of Madison are profoundly disruptive to our common sense about teachers and schooling. From a cultural perspective, the portraits of teacher protest now circulating in the mass media are especially striking for the ways in which they challenge traditional representations of the teacher in American society.

Historically, the teacher has been depicted in American culture as one of three types: the schoolmarm, the bumbling pedagogue, or the lone hero. The schoolmarm is typically an older, unmarried, rural woman who dedicates her entire life to her pupils. She is the Miss Dove character, or the prairie schoolteacher, or the imagined head of the classroom in the little red schoolhouse of yore. The bumbling pedagogue is usually male, often effeminate, and either clownish, pedantic, or otherwise socially awkward.  He is the Ichabod Crane of our popular memory, the Mr. Kotter, the Scott Guber of Boston Public. Finally, the lone hero is the renegade who operates in isolation. He or she employs a nontraditional pedagogy—often in opposition to an unsupportive administrator—in an effort to educate the students everyone else has given up on. From Blackboard Jungle to Stand and Deliver to Dangerous Minds, the lone hero is a staple of many Hollywood films.

Throughout the years, these stock images have mostly served to denigrate the profession of teaching, depicting it as a lonely, all-consuming, non-specialized career.  These fictional teachers rarely if ever complain about wages, benefits, or the excessive demands placed on them. They seem to willingly sacrifice their personal lives for the sake of their students. And they are almost never imagined as members of a teacher union. Such popular depictions also tend to suggest that teaching requires little if any formal training; in fact, many of these iconic pedagogues become teachers because they can’t do anything else. Most importantly, these images have worked to normalize the idea that public schools are crisis-ridden environments in which teachers must act in solitude, as outsiders, if they hope to accomplish anything.

Interestingly enough, these three dominant representations of teachers have subsided from view in recent years, supplanted in the media by images of tough, pragmatic, business-minded reformers.  I am thinking in particular of Michelle Rhee, former chancellor of public schools in Washington D.C. For a while, Rhee became a media darling who was profiled in Newsweek and Time (where she was photographed holding a broom) and was featured in the pro-charter school documentary Waiting for Superman.  In a similar vein, business luminaries Bill Gates and Mark Zuckerberg have been spotlighted as the new philanthropic saviors of U.S. education (the Facebook CEO recently donated $100 million to Newark public schools).  The media celebration of these corporate reformers has conveniently dovetailed with broader calls to privatize public schooling and introduce more business-inspired models of incentive pay and accountability into school systems.

At the same time that our iconography of school reform has been crowded with the likes of Michelle Rhee for the past few years, U.S. popular culture has propagated a more deviant image of the classroom teacher. Television shows in particular have featured teacher protagonists who presumably mean well but who must of necessity supplement their educator salaries with illegal activities.  Take, for example, the cancer-stricken chemistry teacher who cooks and sells methamphetamine on Breaking Bad, or the history-teacher-turned-male-escort on Hung.  Just this past week, the trailer for the forthcoming film Bad Teacher, starring Cameron Diaz as an apathetic, pot-smoking, foul-mouthed schoolteacher, went viral (curious timing, to say the least). Teachers like these reinforce the perception that we need no-nonsense leaders like Michelle Rhee to sweep schools clean of them.

All of this explains why the images coming out of Madison are so culturally significant. First, these images show thousands of teachers united in solidarity.  Here we see not one teacher working alone, but a mass of teachers, a community, working together toward a common end.  Second, these depictions of non-elite, everyday people taking to the streets in the name of public education contrast sharply with the lone power-suited administrator or the billionaire philanthropist imposing top-down reforms on supposedly inept teachers.  To best illustrate this second point, I can’t help but compare the pictures of thousands of teachers protesting in Madison with the Time magazine cover of Rhee standing alone with a broom back in 2008.  The new imagery overwhelms the old, draining the Rhee photograph of its symbolic power.

Finally, these many iconic images of Wisconsin teachers project strength—not bumbling hesitancy, or shoulder shrugging resignation, but conviction and fortitude.

Without question, these cultural representations matter. As the editors of the 1994 book Schooling in the Light of Popular Culture remind us, education in the United States is “likely to be understood in ways that are at least in part beholden to popular images and ideas embodied in widely disseminated texts.”  The cultural stories we tell ourselves about schooling can shape how we discuss education, debate policy, and perceive teachers. Today, at least, the quaint faces of Ichabod Crane and the frontier schoolmarm have been replaced by a bold faculty of thousands. Long after the impasse in Wisconsin is resolved, the images emanating from there are sure to resonate in our popular imagination.

The Lonely Network

The great irony of The Social Network, of course, is that its central theme is not connectivity but disconnection.  A film about the genesis of a technology designed to bring people closer together features a main character who experiences the painful dissolution of social bonds. The plot is driven by protagonist Mark Zuckerberg’s exclusion from social groups, the end of his romantic relationship, and the collapse of his close friendship.  This sense of disconnection is pervasive despite the fact that The Social Network is a crowded movie: characters are almost never seen alone (only when they are individually “wired in” to their computer screens), and certainly never seen enjoying a moment of quiet solitude. Instead, the characters are regularly packed together in small places—a legal office, a dorm room—or in big, loud, impersonal places—a nightclub, a drunken party. But these spaces, despite their capacities, are repeatedly portrayed as lonely and alienating.

While the characters may inhabit a decidedly unsocial non-network beneath a façade of constant social interaction, the film itself serves as a remarkably vibrant cultural network. For the student of American culture, The Social Network is a fountainhead of intertextuality.  Perhaps appropriately for a film about Facebook, The Social Network functions as a busy crossroads of cultural referents, many readily recognizable, others unstated but nevertheless present.  The movie obviously plays on our familiarity with Facebook, but it also features appearances by Bill Gates and the creator of Napster (both portrayed by actors), a musical score by industrial rock luminary Trent Reznor of Nine Inch Nails, and a Harvard setting (even though campus scenes were mostly filmed at The Johns Hopkins University).  It is also directed by David “Fight Club” Fincher and written by Aaron “West Wing” Sorkin.  One of the students on Zuckerberg’s Facebook team is played by Joseph Mazzello, the actor who starred as the little boy Tim Murphy in Jurassic Park.  In other words, what is really “social” about The Social Network is the way in which it serves as a pulsating intersection for a range of icons, myths, and expressive forms that circulate in the audience’s collective imagination.  It is populated by cultural detritus and ephemera with which we are friendly, if you will.

I imagine these multiple and varied cultural associations may in part explain the source of the film’s pleasure for viewers. The experience of viewing The Social Network is akin to data-mining.  It rewards a 21st century audience accustomed to scanning mounds of digital information and quickly categorizing that information into familiar frames of reference. For example, the brief appearance of Bill Gates evokes our Horatio Alger myth of success and the American dream.  The presence of Sean Parker of Napster infamy conjures associations with the lone rebel, hacking the system and sticking it to the man.  And Zuckerberg himself comes across as a nerd pulling one over on the Olympic jocks.

Reznor’s musical score triggers memories of his earlier work on such Nine Inch Nails albums as Pretty Hate Machine (1989).  That record opens with the line, “god money I’ll do anything for you/god money just tell me what you want me to,” and builds to the chorus, “head like a hole/black as your soul/I’d rather die/than give you control.” Pretty Hate Machine, with its loud synthesizers, drum machines, and vocal wails, is not unlike The Social Network: an expression of male adolescent angst and rage confined inside an electronic world.

And there are still other resonances: Fincher’s directorship reminds us of his previous explorations of masculinity and antisocial behavior in Fight Club, The Game, and Zodiac.  Sorkin’s dialogue echoes the brainy loquaciousness of the political characters he developed for the television show The West Wing. Nearly twenty years ago, in Jurassic Park, actor Mazzello played a child victimized by a technology abused and gone awry.

As I watched The Social Network, I even found correspondences with The Lonely Crowd (1950), the sociological study of “other-directed” social character that became a bestseller in postwar America.  Co-authored by David Riesman and his team, the book argues that Americans are increasingly motivated by the need for peer acceptance.  More and more, our “inner gyroscope” is set in motion not by individualistic self-reliance, but by the drive to win approval and fit in.  Consequently, our time is spent trying to decode what is popular and adjust our personalities accordingly: “The other-directed person acquires an intense interest in the ephemeral tastes of ‘the others’… the other-directed child is concerned with learning from these interchanges whether his radar equipment is in proper order” (74). What is The Social Network if not a depiction of a lonely crowd?  Indeed, isn’t Facebook itself a kind of lonely crowd?

I can’t help but wonder if this way of reading the movie—this pleasurable scanning of its cultural allusions—in some way works to conceal its underlying critique of 21st century connectivity. The film’s central theme of social dissolution is undercut by its teeming network of familiar, friendly signifiers.  Its “ephemeral tastes.”  Yet we are “friends” with these signifiers in as much as we are “friends” with hundreds of people on Facebook—another text, by the way, that we scan with the same data-mining mindset.  As portrayed in The Social Network, Facebook doesn’t really bring people closer together in any meaningful way; it facilitates casual hook-ups in bar bathrooms, or it breeds addiction, as one character confesses, or it quickly delivers trivial information like the results of a crew boat race. You would think The Social Network, with its depiction of Zuckerberg and his creation, would compel a mass exodus from Facebook, or at least spark critical public reflection on our complicity with this technology of the lonely crowd. But instead it rewards and reinforces our ways of reading: our ingrained habits of consuming images, scanning information, reducing human experience to pannable nuggets of gold.

I have two bookish confessions to make.  The first is that I keep a running list of every book I have read since I graduated from college some twenty years ago.  The list includes the author and title as well as the month and year I finished each book.  The list includes books I read for the classes I teach as well as books I read for “pleasure.”  The list includes unabridged audio books, or at least it would if I had ever actually finished listening to one in its entirety.

I keep the list to remember and to mark my progress—each year I try to read at least one more book than I read the previous year.  The list also helps me recall not just the titles, but the cities in which I lived and the specific places in which I read those books: at a coffee shop in Austin, on the beach on vacation in Puerto Rico, in my grandparents’ house one winter in Vermont, on the couch in my small apartment in New Hampshire, in a summer sublet in Boston.  The list generates strong associations with the past—with the seasons, the weather, the people around me, the fleeting episodes of happiness, confusion, sadness in my life.  The list, in other words, is more than a tally of books.  Each book anchors me in a moment.

My second bookish confession is that each year, after Thanksgiving, I begin the pleasantly agonizing process of selecting the books I will take home with me for the holidays. I stack, I list, I revise lists, I restack.  I try to figure out which books I want to relish during my December visit to family in New Jersey.  It is, admittedly, quite the process.  Almost always it must be a novel.  Something I have wanted to read for a while.  Something that will offer satisfaction and escape, and not too much challenge.  Something I will look forward to reading after my parents have gone to sleep (which happens earlier and earlier each year), or before my nephews rush into my bedroom to wake me up the next morning (which, thankfully, happens later and later each year).  I usually start with about fifteen books and whittle the stack down to two or three by the time I pack my suitcase.

This holiday season, I took a moment to revisit the books I read during past December visits.  Ever the connector of dots, I found some curious patterns.  Unbeknownst to me until now, the books I read over the holidays tend to fall into four sometimes overlapping categories: from mom’s bookshelf, work-inspired, political, and ambitiously literary.

Apparently on several visits home, I abandoned my carefully chosen books and instead selected one from my mother’s bookshelf in her office.  My mother likes buying used books at public library sales, and she’s amassed an impressive collection over the years. I pulled Hard Times from the shelf in 1994, A Clockwork Orange the year before (why did my mother have this one?).

When I taught in an Education department at a college in North Carolina a few years ago, it seems I was inspired to read school stories: Old School, the excellent prep school novel by Tobias Wolff, followed the next winter by the classic Goodbye, Mr. Chips.

Election seasons punctuate my list: Howard Dean’s Winning Back America in December 2003, Barack Obama’s The Audacity of Hope in December 2006.

The rest of the books fall into the ambitiously literary category, books I must have picked because I was on some mission to expand my literary horizons over the holidays.  One Christmas I read Kafka’s The Castle.  Another year I worked through Nabokov’s Pale Fire (so much for the “not too challenging” criterion) and Kurt Vonnegut’s Cat’s Cradle.  In 1997 I read a biography of Zora Neale Hurston.  In 2008, it was Toni Morrison’s A Mercy.  Raymond Carver’s Cathedral and George Eliot’s Silas Marner were my Yuletide pleasures in 1996.

What these four categories have in common is setting: my parents’ house, the place I left at age 18 and return to every Christmas.  These are the books I have read in my childhood bed, now long donated to one of my sisters.  In the guest bed in the basement.  On the couch in front of the fireplace in the living room.  In the oversized recliner in the den.  Each holiday book brings back the smells and tastes of home cooking, family voices, a glimpse of snowflakes outside the window, the barking of my parents’ dog.  This may be why I select the books with such care: they will absorb these memories.  Preserve them.  These books will carry a little bit of family history with them.  They will be so much more than an author and a title on a list.

Which reminds me—I still need to whittle down my stack for next week.  Happy holidays, readers.