Archive for October, 2010

What do Philip Roth’s polio novel Nemesis and Justin Cronin’s vampire novel The Passage, both published in 2010, have in common?  Quite a bit, actually.  Roth’s novel renders the atmosphere of fear surrounding a polio outbreak in Newark, New Jersey in the summer of 1944.  The protagonist is a young playground director futilely trying to protect the children he supervises from contracting polio.  Cronin’s novel imagines a world ninety years in the future that has been overrun by killer vampiric “virals” as the result of a scientific experiment gone awry.  It focuses on a small colony of people struggling to defend themselves against the virals.  Both books deal with the pervasive threat of an invisible, poorly understood contagion that has no known cure.  Both focus on adult attempts to shelter children from the contagion.  In both, characters wrestle with existential questions and religious doubt—why would God inflict polio on children?  Do virals have souls?  Both books demonstrate how natural time changes—spring to summer, day to night—can heighten the threat of contagion and force humans to change their routines and behavior.  Both show how suspicion and fear can cause communities to turn against one another.  As cultural artifacts of the moment, both novels also resonate powerfully with contemporary anxieties surrounding contagion in our everyday lives.

In recent years, Americans have been alerted to the threats posed by multiple toxins and diseases: salmonella in eggs and peanut butter, E. coli in spinach, cadmium in Shrek glasses sold at McDonald’s and in jewelry sold at Wal-Mart, H1N1, West Nile virus.  Such warnings came against the backdrop of broader fears about biological terrorism after 9/11.  The current wave of viral panic is, of course, the most recent chapter in a very long history of epidemics—Ebola in the 1990s, mad cow disease in the 1980s, polio in the 1930s and 1940s, the Black Death in the 14th century, and so on.

There is a cultural dimension to this biological history—a cultural history of contagion.  Nemesis and The Passage are the latest examples of artistic expressions, literary and otherwise, that in some way reflect, reimagine, or comment upon viral panics.  Think Daniel Defoe’s A Journal of the Plague Year (1722), Thomas Mann’s Death in Venice (1912), Albert Camus’s The Plague (1947) (literary critic Elaine Showalter recently made a brilliant connection between Nemesis and The Plague), and Stephen King’s The Stand (1978/1990).  Think nonfiction books like The Hot Zone (1994).  Or films like Panic in the Streets (1950), The Omega Man (1971), and Outbreak (1995).

As artifacts of the 21st-century culture of viral panic, unlikely bedfellows Nemesis and The Passage join a diverse cultural cohort that includes the comic book series Y: The Last Man, about a plague that kills every mammal on earth with a Y chromosome; the films 28 Days Later and 28 Weeks Later, featuring the “rage” virus; the horror movie Cabin Fever, starring a flesh-eating virus (tagline: “Catch It”); Max Brooks’s book World War Z, an “oral history of the zombie war”; and the forthcoming young adult novel Virals, about a group of teenagers exposed to canine parvovirus.

These contemporary examples of the culture of viral panic offer audiences ways to process a number of fears and anxieties that circumscribe our everyday lives in the 21st century.  Anxieties about the limitations and abuses of science and medicine.  Anxieties about our ability to protect children and youth from invisible menaces.  Anxieties about community cohesiveness and civility.  Anxieties about the government’s ability to respond to disaster.  In other words, the culture of viral panic is never just about the threat of contagion.  It always speaks to broader concerns about institutions, social relations, childrearing, science, spirituality.  Read in this context, Nemesis and The Passage have a great deal in common.

What would you think if someone told you that they were fighting for a “share” of your stomach?  Does it bring to mind organ harvesting?  Alien invasion?  Theft?

But this is in fact what the food industry is doing, and has been doing for some time.  I first heard the term last week when I took part in a to-remain-nameless gathering of food experts in the Bay Area.  It came up in a discussion of how we might, as eaters, make healthful choices in the American food marketplace.  Someone in the room recalled a recent food industry gathering where executives from a soda company were debating how to increase their “stomach share.”  They were seeking to expand their line of products (from sodas to juices, waters, and exercise drinks) to make sure that whenever someone put a beverage in their stomach, it was from company X.  Rather than merely competing with another brand in, say, “the marketplace,” the “stomach share” metaphor takes the battle to the consumer’s own body.  The question is not just how to ensure that the consumer buys the maximum amount of our product, but how to ensure that whenever the consumer ingests anything, our product holds the majority share of the space.

This stomach talk reminded me of another phrase I’d come across in my research for Empty Pleasures—“prosperity stomach.”  Coined in 1966 by Henry Schacht, an executive at a diet-food company, in a talk to newspaper editors called “How to Succeed in Business Without Getting Fat,” the phrase referred to a troubling problem faced by the food industry.  Because people (at least the middle class) did less manual labor and had more money to buy the cheaper food produced by American industry, they had begun to gain weight.  That wasn’t really the problem from Schacht’s point of view.  More troubling was that this weight gain meant that they could not—or would not—buy all of the food they wanted—food that industry could profit by selling.  The answer?  Diet foods.  By developing more foods with fewer calories, manufacturers and marketers could reap profits beyond the barrier of the full stomach.

“Stomach share” and “prosperity stomach”—terms invented nearly fifty years apart—remind us that the food industries have long viewed consumers as reducible to mere storage spaces for their products.  Within this climate, the wonder is not that our stomachs have expanded; it’s that they have not expanded even further.

Slow foods & local foods are fabulous, and we should be grateful to those who have made them part of the landscape.  Thanks to advocates in recent years, many of us can now purchase a tomato from the store and know where it came from, or have a conversation with a grower while getting apples at the local farmers’ market.  These encounters allow us to better understand our food, and they make eating more pleasurable by connecting us to the past and to each other.

But they do not solve the fundamental problem we face with American food and the way we’ve been taught to use it.

Our generation has inherited a food system—and by that I mean everything from how our food is grown, to how it is processed, to its flavors, to its branding, marketing, and store shelf placement—that depends on convincing people to eat and drink way more food than they need, way too much of the time.

Within this system, it would be wonderful if everyone could go local—slow—organic.  But better health doesn’t depend on it.  Better health depends on being able to eat moderate amounts, for rational reasons, and to stop when one is full.

It’s not a zero-sum game, of course.  We can have local/slow food and ethical/sustainable food production and marketing practices.  Yet I notice that I hear far more about the importance of yummy foods and regional farmers than I do about the importance of fair, just, and rational food landscapes.  Why not advocate for the heirloom tomato and the concept of a once-in-a-while soda and the removal of nutritionally bereft (2-for-1!!) foods from the end of the aisle at the grocery store?  Consider what would happen if some of the effort we put into creating and sustaining farmers’ markets were diverted to building better barriers between food promoters and American stomachs.

I think part of the problem is that the food revolution has become, in a way, too tasty.  We want to advocate for structural change, but we want to do it through a good meal that we enjoy.  Fighting for accurate claims in food marketing (diet! natural! healthy!) or dissecting grocery store product placement—these tasks are a long way from the meals many of us who care deeply about food would even want to eat.  Thus, while we feel occasional outrage (bewilderment?) when we see a mega display pushing three 12-packs of soda (plus chips!) for $10, our attention can easily drift away from problems that, if solved, would benefit someone else (who doesn’t “eat right” anyway…) toward those that benefit ourselves.

We should sit down together at the table of slow/local and celebrate our good fortune.  And, when the meal is done, take that tasty energy we’ve ingested and use it to regulate industry claims and prevent the over-making and over-marketing of all kinds of food in the US.

That way, no matter how fast their food moves, all eaters could have a better shot at health.

Mayor Bloomberg would like to prevent New Yorkers on food stamps from trading “stamps” for soda.  Under his newly proposed plan, food stamps won’t be an acceptable form of payment for beverages (other than milk and some fruit juices) that contain more than 10 calories per 8-ounce serving.  By cutting down on soda consumption, the Mayor and his staff have argued, people will consume fewer calories and be healthier.  But what’s actually going to happen, in most cases, is that soda drinkers are going to remain soda drinkers—they’ll just switch to “diet.”

High fructose corn syrup and sugar contribute hundreds of inessential calories to the average American body each day.  And many of these come in the form of sodas that have, year by year, been getting bigger, cheaper, and more smartly advertised as instant pleasure delivery systems.  This is not good for us, and we should cut our consumption.

But diverting those who currently drink large quantities of sugared soda to artificially sweetened sodas is not a good solution.  Even the most compelling argument—that diet sodas have few or no calories and therefore will help people lose weight compared with regular sodas—can be refuted.  Emerging research on artificial sweeteners suggests (counterintuitively) that for many users they actually lead to weight gain.  And while sugar and corn syrup are sweet, the artificial sweeteners on the market today are far sweeter—200 to 600 times so, part for part.  So, while they don’t contribute calories, they may encourage our desire for ever-sweeter foods (most of them food-stamp eligible).  And one merely has to Google artificial sweeteners to find a litany of consumer complaints blaming them for a host of ailments—claims that, while largely unsupported by science, ought to give us pause.

While we can, and do, consume too much of it, sugar does at least provide caloric energy.  Artificial sweetener, on the other hand, offers nothing our bodies can use.  That, in fact, has been its selling point: we can have the pleasure of consumption without consequence.  This promise has been very good for the pharmaceutical companies that make sweeteners, the food and beverage companies that put it in our food and drinks, and the marketers that have helped nearly 200 million Americans become regular consumers.  Yet one is hard pressed to find evidence that it has made us healthier.

Unless the loophole is closed, many food stamp recipients will simply switch from regular Coke to Coke Zero.  When that happens, the makers of artificial sweeteners and sodas will lose little.  They may even sell more once their products are state-sanctioned good-for-you options.

If we are going to legislate nutrition—and I’m not sure we should—let’s at least make sure we don’t create new problems out of the ones we face right now. Diet soda is not a healthy choice.

So, a friend approached me a few weeks ago and asked if I would accompany him on a project. He is listening his way through this book called 1001 Albums You Must Hear Before You Die. He asked me to blog the project with him.

But I can’t do it.

There are three main problems with the book.

1. It’s too big. 1001 albums to hear before you die is too long a list — long enough that it could include just about anything, and it requires precious little editing. “Which Mudhoney album should I include? I’ll just go ahead and include both.” Britney or Christina? Bring them both along. A list this expansive fails to do what a good list does: provide guidance by making editorial decisions about what makes the list and what does not.

2. It’s not big enough. It’s selective in a really painful way: It’s too rock-ish. For those of you keeping score at home:

Oasis 2, John Coltrane 1.
Arrested Development and Charles Mingus tie at one apiece.
Kings of Leon (2) outlast Nina Simone (1).
Billie Holiday (1) manages a draw with Slipknot (1).
John Cage (0) and Philip Glass (0) get shut out entirely.

According to this book, there are almost no jazz albums worth hearing since Miles Davis’s (4) Bitches Brew, and there is no album resembling “classical” music that you need to hear before you die. Period. In choosing the 1001 albums, the contributors demonstrate little beyond their own limited taste, or else they demonstrate what passes for the contemporary moment’s self-satisfied “eclectic” listening, which means that you have both Tupac (1) and Steely Dan (4) on your iPod. Come on.

3. But really, my issue with the book is how the authors define the “album.” They credit Frank Sinatra (3) with “inadvertently ushering in the album era” with his 1955 release In the Wee Small Hours. Albums, in fact, had been around for decades, but they were not the now coveted (and fetishized) 33 1/3 rpm records. They were albums of records, modeled on photo albums, that were packaged and sold as collections — often of Broadway soundtracks, but also of the work of a single musician. By opening the “album era” in 1955, the authors commit a more egregious sin than what they did to Herbie Hancock (0). By conflating the LP with the album, they omit albums that preceded In the Wee Small Hours (like Charlie Parker’s 1950 release Charlie Parker with Strings).

But worse yet, in the age of single-song downloads and declining sales of record albums, they nostalgically enshrine the LP with a kind of rock-centric power, as if American music can be best understood by 45-minute collections of songs — as if that format is somehow both natural and superior to all others. By enshrining the album, the contributors suggest that our aural days ought not be measured in music, but in format. And in that way, the book, in its inclusions and omissions, implies that anything beyond this capacious and capricious list isn’t worth hearing, anyhow.

The conversation about “selling out” in popular music has been dead for some time. And I’m not interested in reviving that conversation now. The last time it really flared up was in 1987, when Nike featured the Beatles’ “Revolution” in a commercial. Since then, it’s basically been a done deal.

So, today’s New York Times story about Converse opening a recording studio did not come as that big a surprise. It’s a pretty interesting story, actually, one that points out the real deadness of the “selling out” debate. Given the state of the music industry — the rise of digital downloading, the bloatedness of the major labels, the constriction of radio outlets through consolidation (and companies like Clear Channel), the so-called “360 deals,” rampant product placement in pop music, and so on — why shouldn’t Converse enter the industry? Why shouldn’t Whole Foods? Barnes and Noble? You or I?

It’s a rhetorical question, of course, but it raises three important issues that we ought to be clear about, if we’re thinking about the current state of popular music.

1. Converse will make music to sell shoes. The music is “successful” if it results in shoe sales. The Converse record label is the idea of Geoff Cottrill, Converse’s chief marketing officer. Cottrill is pretty plain about his intentions:

“Let’s say over the next five years we put 1,000 artists through here, and one becomes the next Radiohead,” he said. “They’re going to have all the big brands chasing them to sponsor their tour. But the 999 artists who don’t make it, the ones who tend to get forgotten about, they’ll never forget us.”

In other words, if the company has a 0.1% success rate in terms of music sales but builds brand loyalty for its shoes, then the music is a worthwhile investment. It’s a strange approach to “arts patronage,” one that has none of the trappings of the Renaissance or of human expression — it’s about creating art to sell shoes. And I know (thanks, Warhol) that this, too, is old, self-referential, post-modern news. Nevertheless, I think we ought to be clear about these new arrangements, and about what is serving and what is being served.

2. Because it is in the business of selling shoes, Converse is actually being far more generous to its artists than the labels (at least it appears to be so). The article reported that Converse has little to no interest in owning the recordings that it makes. This is something new, and it does give more power to the artists than they typically have under contract with major labels — but good luck selling your song to Nike or Starbucks or VW if you’ve already sold it to Converse. And if you’re a musician, you’re probably not making money selling records, so where are you going to sell your music?

3. The entrance of Converse into this marketplace seems like evidence of the breaking-apart of the music industry as we knew it in the 20th century. Indeed, one of the great things about music these days is that anyone with a laptop and an internet connection can become a label. This is radically liberating for many artists. But what’s the real difference between Columbia and Converse? Amidst the sweeping changes in the music industry, it still seems to be about artists serving larger corporate interests. Converse, like Columbia or EMI or Decca or whomever, has the broadcast outlets; it has the power in the marketplace that independent musicians don’t have.

And, though Converse seems to be more generous with its artists, it appears to care less about their music.
