
Archive for the ‘popular culture’ Category

In my late thirties, I developed an obsession with American cinema of the 1970s that surfaced yearly in the dead of winter.  The obsession would start with me reading a scholarly book about seventies films.  This would be followed by a manic updating of my Netflix queue and then marathon screenings.  I would usually burn out by late March.  I would move on to other interests.  I would forget about seventies films.  But the cycle would start again the next year.  Always like clockwork, yet never planned.

The obsession began in the winter of 2006, when I bought The Films of the Seventies, an annotated filmography of every American, British, and Canadian movie that had come out between 1970 and 1979. I bought the book on a lark at a used bookstore in Winston-Salem. I was living in North Carolina with a girlfriend, in a relationship that was still a year and a half away from its end but already on the decline. When I got the book home, I highlighted all of the films I had already seen: Jaws, Star Wars, Breaking Away, Alien, Grease—the stuff of childhood memories. Then I proceeded to watch classics that, embarrassingly, I hadn’t yet seen: Dog Day Afternoon, Network, Serpico, All the President’s Men, The French Connection, Chinatown. At the time, I saw this as an exercise in self-education. I had recently finished a Ph.D. in American Studies, and I felt I needed to know about these films. The project preoccupied me for a few months, and then it ended. Or so I thought.

In February 2007, the obsession returned.  I read, in quick succession, Peter Biskind’s Easy Riders, Raging Bulls: How the Sex-Drugs-And-Rock ‘N’ Roll Generation Saved Hollywood (a book I had been assigned in grad school but never read) and Peter Lev’s American Films of the 70s: Conflicting Visions.  The reading had nothing to do with my teaching or my academic research.  I just found myself enthralled by these assessments of cinema and the 1970s.  Biskind hooked me when he wrote, “this was a time when film culture permeated American life,” when film was “no less than a secular religion.”  I was intrigued by his claim that “this was to be a directors’ decade if ever there was one.”  Lev similarly captured my imagination when he noted that this was “the most exciting and most experimental period of the American feature film.”  I made lists as I read, and I subsequently watched movies like Shampoo, Scarecrow, The Parallax View, The Conversation, Marathon Man, The Last Detail, and, fittingly enough, Obsession.  This time, my fixation lasted into the spring.  By the summer, I had moved to California to start a new job and my six-year relationship had ended.  I was thirty-seven years old.

A year went by without a relapse.  I was busy teaching at a new school and processing my failed romance.  I spent February and March in a battle with a loud, temperamental neighbor.  In search of a fresh start, I moved into a beachside apartment in April.  I cultivated new friendships.  There was no seventies that year for me.

In the winter of 2009, however, it started again.  I devoured a collection of essays, American Cinema of the 1970s: Themes and Variations.  I made more lists.  My viewing included Klute, Nashville, Night Moves, Joe, Phantom of the Paradise, Hardcore, McCabe and Mrs. Miller.  But that year, I had a revelation.  I recognized a pattern not in the cycle of what I was doing, but in my experience of it: as it turned out, every time I revisited the ’70s, I liked reading about the films more than I liked watching them.

The books made the movies sound better than they were. The books put the films in a historical context, explicated their cultural themes, imbued them with political significance—Watergate and Vietnam, the working class, paranoia and conspiracy, generational strife. But the movies themselves? They were often plodding, incoherent, indulgent, dull. Not all of them. But many of them. The Last Detail bored me. Night Moves confused me. Phantom of the Paradise was wacky. Joe was improbable.

In retrospect, the books I had read warned me of this. Peter Lev, for example, states bluntly (albeit on the very last page of his book), “one should not overly romanticize the films of the 1970s.  In a period of uncertainty and change, many mediocre films were made.”  In his introduction to Themes and Variations, editor Lester Friedman counsels the reader to beware of “hyperbolic assessments” of the era, for they “belie the fact that terrible movies were made during the 1970s.”  He writes this on page twenty-three, so I suppose I have no excuse there.

The same thing would happen each year. I regularly found myself waiting impatiently for a movie to end just so I could highlight the title in my filmography and be done with it.  Or avoiding watching it altogether.  My ex had noticed this tendency of mine when we still lived together in North Carolina.  Some of the titles that arrived from Netflix would sit next to the television for weeks.  She would threaten to mail them back the next time I was away from the house.  “You don’t have to do this to yourself,” she said at one point, sending us both into a fit of laughter.  And yet I did.  I was stuck in a cycle of pleasure and tedium.

Then I turned forty in 2010 and it all stopped.

I have two related theories about my obsession with American cinema of the 1970s.  The first is that the books I read provided a historical context for a time in my life when I was aware of very little beyond my own childhood.  The books offered a portrait of a cultural milieu that was taking shape outside of my tiny world of family, play, and school.  I could insert the memories and images I had of my own seventies childhood—a quite happy childhood—into a bigger picture of what was going on in America.  This was an exercise in intellectual nostalgia, in historicizing my youth.  That’s why I liked reading about the films more than watching them.  The moving pictures I had in my head, of me then, were much better.

The second theory is that reading about the seventies provided an escape.  In my late thirties I was going through major transitions: a stressful job search, the end of a serious relationship, a move to California.  Rather than deal with the corrosion and change that defined my present, I retreated to a safer past.  It wasn’t really about the seventies at all.  Sometimes we sit through bad movies because we need to.

So was my obsession with seventies films about nostalgia or escape? I’m not sure they’re so different. I was certainly obsessed with both at a particular stage of my life. All I know is that when I finally let her go, when I turned forty and resolved to look forward more than I looked back, that decade also lost its purchase on me. It’s true that one should not overly romanticize American cinema of the 1970s. But it’s also true that one should not overly romanticize the past. Those moving pictures may sound good in theory, but sitting through them can just be another way of making the present pass by.



The most recent data released by the U.S. Bureau of Justice Statistics (2008) disclosed that 11.7 million persons in the United States experienced at least one attempted or successful incident of identity theft during a two-year period. Incidents increased 22% from 2007 to 2008. For the past decade it has been a federal crime to “knowingly transfer, possess, or use, without lawful authority, a means of identification of another person.” A steady stream of news stories reminds us of the dangers of identity theft and identity impersonation—from stolen credit card and Social Security numbers, to hacked bank accounts, to child predators trolling social networking sites under false identities.

Our legal system and news media have sent us a clear message about identity theft: it is criminal. It is unacceptable. It is a menace to society. You need to protect yourself, vigilantly, against it. But in the imaginary world of popular culture, the ethics of identity theft is, well, far less rigid. In the fictional stories we tell ourselves about identity fraud, the act itself can actually be a justified means to a moral end. In the past year alone, three films have been released with plots that are propelled by characters who assume false identities, identities inhabited “knowingly” and without permission. In the sci-fi film Source Code (2011), a soldier is transposed into the body of an everyday man who is riding a train that will blow up in eight minutes; he occupies the man’s body in order to try to find the bomber on the train. In Unknown (2011), a doctor awakens after a car accident to find that another man has assumed his identity, is married to his wife, and has his career (the other man’s photograph even shows up when the doctor’s name is searched on the internet). Finally, the documentary Catfish (2010) explores the world of Facebook and the ways in which users perform their identities for people they’ve never met in real life. I don’t want to give too much more away about any of the three movies, as each contains various twists and turns, but suffice it to say that identity cloning and identity manipulation is a theme that runs through all of them.

And here’s what strikes me about this cultural content and its broader context: in each case, the ethics of assuming someone else’s identity is not exactly black and white. In fact, in the moral logic of Source Code, posing as another person is wholly justified if it can help save the lives of others. In Unknown, it turns out that identity fraud is ultimately a redeeming act for one of the characters. And Catfish challenges the audience to reflect on whether the good works that result can justify a questionable performance of identity. In short, if another person’s identity is assumed for reasons other than criminal purposes and/or economic gain, then it may just be okay.

It’s not just these three movies that explore the ethical gray area of identity theft. We can go back farther into the decade to find others. As readers of the Harry Potter series know, the magical Polyjuice Potion transforms the imbiber into another person, at least in outward appearance. While it is used for evil purposes in one of the books, it is also used by Harry and his friends Hermione and Ron throughout the series to obtain valuable information and pursue dangerous missions. Their temporary acts of identity theft are deemed morally defensible. Another example: the hero protagonist of the popular 2002 film Catch Me If You Can is a clever identity impostor—based on a real-life criminal—whom the audience finds itself rooting for. And in the TV show Mad Men (spoiler warning…), the lead character is not actually the “real” Don Draper, but a Korean War veteran named Dick Whitman who has assumed the identity of an officer killed in the war—an officer named Don Draper.

Of course, the theme and plot device of false identity is not a new preoccupation of literature and cinema. Mark Twain explored it in Pudd’nhead Wilson, Herman Melville in “Benito Cereno,” Vladimir Nabokov in his dark novel Despair. Older movies like The Day of the Jackal, the film version of the play Six Degrees of Separation, and the science fiction story Gattaca all revolve around identity fraud. But in the world we live in today, where online identity theft is a rapidly growing criminal phenomenon, I find these contemporary iterations of the theme even more noteworthy. As the performance of our identity becomes more fluid, unstable—indeed more unconfirmable—in this online age of social networking, it seems that our sense of the morality of identity performance has become more muddied and culturally contested. The law and the news media are telling us one story about the ethics of identity theft and manipulation; our popular culture is telling us another; and, I daresay, what we post on our own Facebook and Twitter accounts is telling us a third.


How did the humanities lose out to neuroscience in making culture seem interesting?

I’ve been listening to a lot of NPR and podcasts lately. I’ve given my historical favorites a little break (sorry, This American Life and Sound Opinions), and I’ve been listening more and more to Planet Money and Radiolab (as podcasts), and to the usual NPR fare that airs while I do dishes or cook dinner: All Things Considered, Marketplace, and of course: Fresh Air.

What I’ve noticed is how often scientists and economists show up on these shows to talk about things I thought were the main interests of humanists and social scientists. Questions like how restaurants work, whether or not race matters, why certain songs get stuck in our heads, how people calculate the value of things or make decisions they know are not in their best interests, and so on.

These are the questions to which I have long sought answers by looking at culture and its various expressions, and in which my field of American Studies has long been interested (albeit in different forms, over time).

Yet somehow, every time I turn on the radio, I find one or another neuroscientist (or, often enough, Jonah Lehrer) talking about precisely these same questions, and about how the pathways of neurons and synapses can answer questions about art or love or whatever.

So here’s my question to my colleagues: how did we become so untouchable or so uninteresting to mainstream media? How come the good people at NPR (and, presumably, their listeners) find neuroscientists and economists more interesting and more capable of talking about these questions than we are? How did they become the go-to academics for understanding how and why people do what they do? Social scientists and humanists look at those phenomena, too, but somehow, we have become generally less interesting than our colleagues.

This is not the neuroscientists’ fault: they are good at what they do, and their creativity in asking profound questions that teeter on the line between culture and biology ought to be encouraged. Similarly, it’s not the fault of the radio programmers; they are looking for the most intelligent, engaging guests they can find. And they’re finding them in neuroscience and economics, not in the Humanities.

Why is everyone else talking about culture but us? Are we that boring? Have we grown so adept at gazing at our own navels that we can’t talk about other things? Does “the public” think that so-called “hard” science is really the only arbiter of actualities in the world?

How have we become so irrelevant even on topics that are ostensibly our areas of expertise and scholarly interest?


Photo Credit: Associated Press

The teacher protests currently taking place in Wisconsin are clearly significant for a variety of political and educational reasons. The events unfolding in Madison have amplified ongoing debates about unionism, the crisis in U.S. education, and the politics of state budgeting. The protests arrive after several years of escalating public criticism of teachers. And the outcome will potentially impact the contours of our national debate about education policy. What is happening in Wisconsin is also significant for cultural reasons. It is important to remember just how rare it is for Americans to see photographs and television footage of tens of thousands of teachers gathered together in one place. The uncommon images coming out of Madison are profoundly disruptive to our common sense about teachers and schooling. From a cultural perspective, the portraits of teacher protest now circulating in the mass media are especially striking for the ways in which they challenge traditional representations of the teacher in American society.

Historically, the teacher has been depicted in American culture as one of three types: the schoolmarm, the bumbling pedagogue, or the lone hero. The schoolmarm is typically an older, unmarried, rural woman who dedicates her entire life to her pupils. She is the Miss Dove character, or the prairie schoolteacher, or the imagined head of the classroom in the little red schoolhouse of yore. The bumbling pedagogue is usually male, often effeminate, and either clownish, pedantic, or otherwise socially awkward.  He is the Ichabod Crane of our popular memory, the Mr. Kotter, the Scott Guber of Boston Public. Finally, the lone hero is the renegade who operates in isolation. He or she employs a nontraditional pedagogy—often in opposition to an unsupportive administrator—in an effort to educate the students everyone else has given up on. From Blackboard Jungle to Stand and Deliver to Dangerous Minds, the lone hero is a staple of many Hollywood films.

Throughout the years, these stock images have mostly served to denigrate the profession of teaching, depicting it as a lonely, all-consuming, non-specialized career.  These fictional teachers rarely if ever complain about wages, benefits, or the excessive demands placed on them. They seem to willingly sacrifice their personal lives for the sake of their students. And they are almost never imagined as members of a teacher union. Such popular depictions also tend to suggest that teaching requires little if any formal training; in fact, many of these iconic pedagogues become teachers because they can’t do anything else. Most importantly, these images have worked to normalize the idea that public schools are crisis-ridden environments in which teachers must act in solitude, as outsiders, if they hope to accomplish anything.

Interestingly enough, these three dominant representations of teachers have receded from view in recent years, supplanted in the media by images of tough, pragmatic, business-minded reformers. I am thinking in particular of Michelle Rhee, former chancellor of public schools in Washington, D.C. For a while, Rhee became a media darling who was profiled in Newsweek and Time (where she was photographed holding a broom) and was featured in the pro-charter school documentary Waiting for Superman. In a similar vein, business luminaries Bill Gates and Mark Zuckerberg have been spotlighted as the new philanthropic saviors of U.S. education (the Facebook CEO recently donated $100 million to Newark public schools). The media celebration of these corporate reformers has conveniently dovetailed with broader calls to privatize public schooling and introduce more business-inspired models of incentive pay and accountability into school systems.

At the same time that our iconography of school reform has been crowded with the likes of Michelle Rhee for the past few years, U.S. popular culture has propagated a more deviant image of the classroom teacher. Television shows in particular have featured teacher protagonists who presumably mean well but who must of necessity supplement their educator salaries with illegal activities.  Take, for example, the cancer-stricken chemistry teacher who cooks and sells methamphetamine on Breaking Bad, or the history-teacher-turned-male-escort on Hung.  Just this past week, the trailer for the forthcoming film Bad Teacher, starring Cameron Diaz as an apathetic, pot-smoking, foul-mouthed schoolteacher, went viral (curious timing, to say the least). Teachers like these reinforce the perception that we need no-nonsense leaders like Michelle Rhee to sweep schools clean of them.

All of this explains why the images coming out of Madison are so culturally significant. First, these images show thousands of teachers united in solidarity. Here we see not one teacher working alone, but a mass of teachers, a community, working together toward a common end. Second, these depictions of non-elite, everyday people taking to the streets in the name of public education contrast sharply with the lone power-suited administrator or the billionaire philanthropist imposing top-down reforms on supposedly inept teachers. To best illustrate this second point, I can’t help but compare the pictures of thousands of teachers protesting in Madison with the Time magazine cover of Rhee standing alone with a broom back in 2008. The new imagery overwhelms the old, draining the Rhee photograph of its symbolic power.

Finally, these many iconic images of Wisconsin teachers project strength—not bumbling hesitancy, or shoulder shrugging resignation, but conviction and fortitude.

Without question, these cultural representations matter. As the editors of the 1994 book Schooling in the Light of Popular Culture remind us, education in the United States is “likely to be understood in ways that are at least in part beholden to popular images and ideas embodied in widely disseminated texts.”  The cultural stories we tell ourselves about schooling can shape how we discuss education, debate policy, and perceive teachers. Today, at least, the quaint faces of Ichabod Crane and the frontier schoolmarm have been replaced by a bold faculty of thousands. Long after the impasse in Wisconsin is resolved, the images emanating from there are sure to resonate in our popular imagination.


The great irony of The Social Network, of course, is that its central theme is not connectivity but disconnection. A film about the genesis of a technology designed to bring people closer together features a main character who experiences the painful dissolution of social bonds. The plot is driven by protagonist Mark Zuckerberg’s exclusion from social groups, the end of his romantic relationship, and the collapse of his close friendship. This sense of disconnection is pervasive despite the fact that The Social Network is a crowded movie: characters are almost never seen alone (only when they are individually “wired in” to their computer screens), and certainly never seen enjoying a moment of quiet solitude. Instead, the characters are regularly packed together in small places—a legal office, a dorm room—or in big, loud, impersonal places—a nightclub, a drunken party. But these spaces, despite their capacities, are repeatedly portrayed as lonely and alienating.

While the characters may inhabit a decidedly unsocial non-network beneath a façade of constant social interaction, the film itself serves as a remarkably vibrant cultural network. For the student of American culture, The Social Network is a fountainhead of intertextuality. Perhaps appropriately for a film about Facebook, The Social Network functions as a busy crossroads of cultural referents, many readily recognizable, others unstated but nevertheless present. The movie obviously plays on our familiarity with Facebook, but it also features appearances by Bill Gates and the creator of Napster (both portrayed by actors), a musical score by industrial rock luminary Trent Reznor of Nine Inch Nails, and a Harvard setting (even though campus scenes were mostly filmed at The Johns Hopkins University). It is also directed by David “Fight Club” Fincher and written by Aaron “West Wing” Sorkin. One of the students on Zuckerberg’s Facebook team is played by Joseph Mazzello, the actor who starred as the little boy Tim Murphy in Jurassic Park. In other words, what is really “social” about The Social Network is the way in which it serves as a pulsating intersection for a range of icons, myths, and expressive forms that circulate in the audience’s collective imagination. It is populated by cultural detritus and ephemera with which we are friendly, if you will.

I imagine these multiple and varied cultural associations may in part explain the source of the film’s pleasure for viewers. The experience of viewing The Social Network is akin to data-mining.  It rewards a 21st century audience accustomed to scanning mounds of digital information and quickly categorizing that information into familiar frames of reference. For example, the brief appearance of Bill Gates evokes our Horatio Alger myth of success and the American dream.  The presence of Sean Parker of Napster infamy conjures associations with the lone rebel, hacking the system and sticking it to the man.  And Zuckerberg himself comes across as a nerd pulling one over on the Olympic jocks.

Reznor’s musical score triggers memories of his earlier work on such Nine Inch Nails albums as Pretty Hate Machine (1989).  That record opens with the line, “god money I’ll do anything for you/god money just tell me what you want me to,” and builds to the chorus, “head like a hole/black as your soul/I’d rather die/than give you control.” Pretty Hate Machine, with its loud synthesizers, drum machines, and vocal wails, is not unlike The Social Network: an expression of male adolescent angst and rage confined inside an electronic world.

And there are still other resonances: Fincher’s directorship reminds us of his previous explorations of masculinity and antisocial behavior in Fight Club, The Game, and Zodiac.  Sorkin’s dialogue echoes the brainy loquaciousness of the political characters he developed for the television show The West Wing. Nearly twenty years ago, in Jurassic Park, actor Mazzello played a child victimized by a technology abused and gone awry.

As I watched The Social Network, I even found correspondences with The Lonely Crowd (1950), the sociological study of “other-directed” social character that became a bestseller in postwar America.  Co-authored by David Riesman and his team, the book argues that Americans are increasingly motivated by the need for peer acceptance.  More and more, our “inner gyroscope” is set in motion not by individualistic self-reliance, but by the drive to win approval and fit in.  Consequently, our time is spent trying to decode what is popular and adjust our personalities accordingly: “The other-directed person acquires an intense interest in the ephemeral tastes of ‘the others’… the other-directed child is concerned with learning from these interchanges whether his radar equipment is in proper order” (74). What is The Social Network if not a depiction of a lonely crowd?  Indeed, isn’t Facebook itself a kind of lonely crowd?

I can’t help but wonder if this way of reading the movie—this pleasurable scanning of its cultural allusions—in some way works to conceal its underlying critique of 21st century connectivity. The film’s central theme of social dissolution is undercut by its teeming network of familiar, friendly signifiers. Its “ephemeral tastes.” Yet we are “friends” with these signifiers in much the same way that we are “friends” with hundreds of people on Facebook—another text, by the way, that we scan with the same data-mining mindset. As portrayed in The Social Network, Facebook doesn’t really bring people closer together in any meaningful way; it facilitates casual hook-ups in bar bathrooms, or it breeds addiction, as one character confesses, or it quickly delivers trivial information like the results of a crew boat race. You would think The Social Network, with its depiction of Zuckerberg and his creation, would compel a mass exodus from Facebook, or at least spark critical public reflection on our complicity with this technology of the lonely crowd. But instead it rewards and reinforces our ways of reading: our ingrained habits of consuming images, scanning information, reducing human experience to pannable nuggets of gold.


What do Philip Roth’s polio novel Nemesis and Justin Cronin’s vampire novel The Passage, both published in 2010, have in common?  Quite a bit, actually.  Roth’s novel renders the atmosphere of fear surrounding a polio outbreak in Newark, New Jersey in the summer of 1944.  The protagonist is a young playground director futilely trying to protect the children he supervises from contracting polio.  Cronin’s novel imagines a world ninety years in the future that has been overrun by killer vampiric “virals” as the result of a scientific experiment gone awry.  It focuses on a small colony of people struggling to defend themselves against the virals.  Both books deal with the pervasive threat of an invisible, poorly understood contagion that has no known cure.  Both focus on adult attempts to shelter children from the contagion.  In both, characters wrestle with existential questions and religious doubt—why would God inflict polio on children?  Do virals have souls?  Both books demonstrate how natural time changes—spring to summer, day to night—can heighten the threat of contagion and force humans to change their routines and behavior.  Both show how suspicion and fear can cause communities to turn against one another.  As cultural artifacts of the moment, both novels also resonate powerfully with contemporary anxieties surrounding contagion in our everyday lives.

In recent years, Americans have been alerted to the threats posed by multiple toxins and diseases: salmonella in eggs, E. coli in spinach and peanut butter, cadmium in Shrek glasses sold at McDonald’s and jewelry sold at Wal-Mart, H1N1, West Nile virus. Such warnings came against the backdrop of broader fears about biological terrorism post-9/11. The current wave of viral panic is of course the most recent chapter in a very long history of epidemics—Ebola in the 1990s, mad cow disease in the 1980s, polio in the 1930s and 1940s, the Black Death in the 14th century, and so on.

There is a cultural dimension to this biological history—a cultural history of contagion. Nemesis and The Passage are the latest examples of artistic expressions, literary and otherwise, that in some way reflect, reimagine, or comment upon viral panics. Think Daniel Defoe’s A Journal of the Plague Year (1722), Thomas Mann’s Death in Venice (1912), Albert Camus’s The Plague (1947) (literary critic Elaine Showalter recently made a brilliant connection between Nemesis and The Plague), and Stephen King’s The Stand (1978/1990). Think nonfiction books like The Hot Zone (1994). Or films like Panic in the Streets (1950), The Omega Man (1971), and Outbreak (1995).

As artifacts of the 21st century culture of viral panic, unlikely bedfellows Nemesis and The Passage join a diverse cultural cohort that includes the comic book series Y, about a plague that kills every mammal on earth with a Y chromosome; the 28 Days and 28 Weeks Later films, featuring the “rage” virus; the horror movie Cabin Fever, starring a flesh-eating virus (tagline: “Catch It”); Max Brooks’s book World War Z, an “oral history of the zombie wars”; and the forthcoming young adult novel Virals, about a group of teenagers exposed to canine parvovirus.

These contemporary examples of the culture of viral panic offer audiences ways to process a number of fears and anxieties that circumscribe our everyday lives in the 21st century.  Anxieties about the limitations and abuses of science and medicine.  Anxieties about our ability to protect children and youth from invisible menaces.  Anxieties about community cohesiveness and civility.  Anxieties about the government’s ability to respond to disaster.  In other words, the culture of viral panic is never just about the threat of contagion.  It always speaks to broader concerns about institutions, social relations, childrearing, science, spirituality.  Read in this context, Nemesis and The Passage have a great deal in common.


So, a friend approached me a few weeks ago and asked if I would accompany him on a project. He is listening his way through this book called 1001 Albums You Must Hear Before You Die. He asked me to blog the project with him.

But I can’t do it.

There are three main problems with the book.

1. It’s too big. 1001 albums to hear before you die is too long of a list — it’s a long enough list that it could include just about anything and requires precious little editing. “Which Mudhoney album should I include? I’ll just go ahead and include both.” Britney or Christina? Bring them both along. A list this expansive fails to do what a good list does, and that’s provide some guidance by making editorial decisions about what makes the list and what does not.

2. It’s not big enough. It’s selective in a really painful way: It’s too rock-ish. For those of you keeping score at home:

Oasis 2. John Coltrane 1.
Arrested Development and Charles Mingus tie at one apiece.
Kings of Leon (2) outlast Nina Simone (1).
Billie Holiday (1) manages a draw with Slipknot (1).
John Cage (0) & Philip Glass (0) get shut out entirely.

According to this book, there are almost no jazz albums worth hearing since Miles Davis’s (4) Bitches Brew, and there is no album resembling “classical” music that you need to hear before you die. Period. In choosing the 1001 albums, the contributors demonstrate little beyond their own limited taste, or else they demonstrate what passes for the contemporary moment’s self-satisfied “eclectic” listening, which means that you have both Tupac (1) and Steely Dan (4) on your iPod. Come on.

3. But really, my issue with the book is how they define the “album.” The authors credit Frank Sinatra (3) with “inadvertently ushering the album era” with his 1955 release, In the Wee Small Hours. Albums, in fact, had been around for about a decade, but they were not the now coveted (and fetishized) 33 1/3 rpm records. They were albums of records that were modeled on photo albums and were packaged and sold as collections — often of Broadway soundtracks, but also of the work of a single musician. By opening the “album era” in 1955, the authors commit a more egregious sin than what they did to Herbie Hancock (0). By conflating the LP with the album, they omit albums that preceded In the Wee Small Hours (like Charlie Parker’s 1950 release Bird With Strings).

But worse yet, in the age of single-song downloads and declining sales of record albums, they nostalgically enshrine the LP with a kind of rock-centric power, as if American music can be best understood by 45-minute collections of songs — as if that format is somehow both natural and superior to all others. By enshrining the album, the contributors suggest that our aural days ought not be measured in music, but in format. And in that way, the book, in its inclusions and omissions, implies that anything beyond this capacious and capricious list isn’t worth hearing, anyhow.


The conversation about “selling out” in popular music has been dead for some time. And I’m not interested in reviving that conversation now. The last time it really flared up was around 1989, when Nike featured the Beatles’ “Revolution” in a commercial. Since then, it’s basically been a done deal.

So, today’s New York Times story about Converse opening a recording studio did not come as that big a surprise. It’s a pretty interesting story, actually, one that points out the real deadness of the “selling out” debate. Given the state of the music industry — the rise of digital downloading, the bloatedness of the major labels, the constriction of radio outlets through consolidation (and companies like Clear Channel), the so-called “360 deals,” rampant product placement in pop music, and so on — why shouldn’t Converse enter the industry? Why shouldn’t Whole Foods? Barnes and Noble? You or I?

It’s a rhetorical question, of course, but it raises three important issues that we ought to be clear about, if we’re thinking about the current state of popular music.

1. Converse will make music to sell shoes. The music is “successful” if it results in shoe sales. The Converse record label is the idea of Geoff Cottrill, Converse’s chief marketing officer. Cottrill is pretty plain about his intentions:

“Let’s say over the next five years we put 1,000 artists through here, and one becomes the next Radiohead,” he said. “They’re going to have all the big brands chasing them to sponsor their tour. But the 999 artists who don’t make it, the ones who tend to get forgotten about, they’ll never forget us.”

In other words, if the company has a 0.1% success rate in terms of music sales, but it builds brand loyalty for its shoes, then the music is a worthwhile investment. It’s a strange approach to “arts patronage,” one with none of the trappings of the Renaissance or of human expression — it’s about creating art to sell shoes. And I know (thanks, Warhol) that this, too, is old, self-referential, post-modern news. Nevertheless, I think we ought to be clear about these new arrangements, about what is serving and what is being served.

2. Because it is in the business of selling shoes, Converse is actually being far more generous to its artists than the labels (at least it appears to be so). The article reported that Converse has little to no interest in owning the recordings that it makes. This is something new, and it does give more power to the artists than they typically have under contract with major labels — but good luck selling your song to Nike or Starbucks or VW if you’ve already sold it to Converse. And if you’re a musician, you’re probably not making money selling records, so where are you going to sell your music?

3. The entrance of Converse into this marketplace seems like evidence of the breaking-apart of the music industry as we knew it in the 20th century. Indeed, one of the great things about music these days is that anyone with a laptop and an internet connection can become a label. This is radically liberating for many artists. But what’s the real difference between Columbia and Converse? Amidst the sweeping changes in the music industry, it still seems to be about artists serving larger corporate interests. Converse, like Columbia or EMI or Decca or whomever, has the broadcast outlets; it has the power in the marketplace that independent musicians don’t have.

And, though Converse seems to be more generous with its artists, it appears to care less about their music.


I, for one, am thrilled that “bromance” has been officially added to the newest edition of the Oxford English Dictionary.  Bromance: “a close but nonsexual relationship between two men.”  The word received its 2010 OED stamp of approval along with “chillax,” “frenemy,” “exit strategy,” and “defriend,” among others.  Now that bromance is part of our venerated lexicon, we can wield it with authority, apply it smartly, inject it into our elite discourse.  No longer must we wave the word around with a wink and a nudge, smugly proving our pop culture literacy and familiarity with the film “I Love You, Man.”  Let the studious scholarship on bromance—Bromance Studies—begin in our literary and cultural circles.  Bromance has been given the green light, man.

I imagine that literary critic Leslie Fiedler (1917-2003) would approve of this development, whispering “I told you so” from beyond the grave. Fiedler rocked the literary world with his 1948 Partisan Review essay, “Come Back to the Raft Ag’in, Huck Honey!” in which he pointed out that a dominant archetype in American literature is the homoerotic love affair between two men who light out for the territory in order to escape civilization’s responsibilities and constrictions (and its women). Moreover, as Fiedler observed—even more controversially—these bromantic pairings tend to involve a young white man and a man of color: Huck Finn and Jim, Ishmael and Queequeg, Natty Bumppo and Chingachgook. These “mythic” relationships are always characterized by the “pure love of man and man,” offering the reader a model of “chaste male love as the ultimate emotional experience.” The love is of course always innocent; there “lies between the lovers no naked sword but a childlike ignorance, as if the possibility of a fall to the carnal had not yet been discovered.” Bromance! The OED should cite Fiedler.

Fiedler elaborated on his argument in his 1960 book Love and Death in the American Novel. I propose an updated edition of this book be released with a new title: Bromance and Death in the American Novel.  With an introduction by Judd Apatow.  I suspect this would make Fiedler’s argument more legible than ever to the reading public.  And to dudes, especially.

Bromance can now be applied without shame to literary studies of other male pairings, even to those men who do not abandon civilization for the wilderness.  Take the bromance between Nick Carraway and Jay Gatsby.  Or Joe Starrett and Shane.  Or Sal Paradise and Dean Moriarty.  At long last we have the word to describe these complex character relationships.

And why stop with bromance? I believe several other new additions to the OED would prove quite useful to literary critics. Ahab just needed to chillax. Hemingway’s Old Man needed an exit strategy when the fish proved too much for him. Hester Prynne should have argued that she merely had a wardrobe malfunction. Portnoy overthinks sex. Biff Loman couldn’t take hold in Death of a Salesman because maturity was too much of a buzzkill for him (and his dad needed better soft skills). Faulkner’s Compson family should have defriended one another. Were Gene and Finny friends or frenemies in A Separate Peace? How did Silas Lapham’s toxic debt affect his moral compass in Howells’s novel? How would Sonny’s blues have sounded different in James Baldwin’s short story if he had played a vuvuzela?

The possibilities abound. Book critics would do well not to be fussbudgets or haters, and to follow the OED’s lead. Time to update the literary glossary. As for me, I’m off to take a chill pill and eat some turducken while I reread that prescient cool hunter, Leslie Fiedler.


I don’t have a TV, but I do have access to TV programming, thanks to my Netflix subscription and various other online sources of traditional broadcast entertainment. As a result, I found myself watching Hulu the other night, and after I selected the sitcom I wanted to watch, the screen went dark and I was presented with a decision:

Which advertising experience would you prefer?

And then, Hulu presented me with a choice of three ads to watch, each one for the same insurance company.

Is this really what all the hype about interactivity and the radical power of the internet has come to? If this is really what it means to “harness the power of the internet,” then I might just go back to dial-up. Has Web 2.0 been reduced to my ability to choose which advertisement to watch? The dimensions of absurdity of this encounter, framed as a “choice” of “experience,” are too manifold to list, so I’ll focus on just two:

1. It’s a choice of advertisements, but I still have to watch an advertisement.
2. It’s a choice between advertisements for the same insurance company.

In a sense, this is the epitome of choice within capitalism. Which is to say: it’s not really choice, or else it’s a choice so constrained that the choice itself doesn’t end up really mattering. Theodor Adorno would be so proud and so perplexed. And he’d be laughing (BTW: if you don’t “choose,” Hulu will choose for you… so even not choosing is a choice).

