
Archive for the ‘consumer culture’ Category

What would you think if someone told you that they were fighting for a “share” of your stomach?  Does it bring to mind organ harvesting? Alien invasion? Theft?

But this is in fact what the food industry is doing, and has been for some time.  I first heard the term last week when I took part in a to-remain-nameless gathering of food experts in the Bay Area.  It came up in a discussion of how we might, as eaters, make healthful choices in the American food marketplace.  Someone in the room recalled a recent food industry gathering where executives from a soda company were debating how to increase their “stomach share.”  They were seeking to expand their line of products (from sodas to juices, waters, and exercise drinks) to make sure that whenever someone put a beverage in their stomach, it was from company X.   Rather than merely competing with another brand in, say, “the marketplace,” the “stomach share” metaphor takes the battle to the consumer’s own body.  The question is not just how to ensure that the consumer buys the maximum amount of our product, but how to ensure that whenever the consumer ingests anything, our product holds the majority share of the space.

This stomach talk reminded me of another phrase I’d come across in my research for Empty Pleasures—“prosperity stomach.”  Coined in 1966 by Henry Schacht, an executive at a diet-food company, in a talk to newspaper editors called “How to Succeed in Business without Getting Fat,” the phrase referred to a troubling problem faced by the food industry.  Because people (at least the middle class) did less manual labor and had more money to buy the cheap food produced by American industry, they had begun to gain weight.  That wasn’t really the problem from Schacht’s point of view.  More troubling was that this weight gain meant they could not—or would not—buy all of the food they wanted—food that industry could profit by selling.  The answer? Diet foods.  By developing more foods with fewer calories, manufacturers and marketers could enjoy profits in excess of the stomach barrier.

“Stomach share” and “prosperity stomach”—terms invented nearly fifty years apart—remind us that the food industries have long viewed consumers as reducible to mere storage spaces for their products.  Within this climate, the wonder is not that our stomachs have expanded; it’s that they have not expanded even further.



Slow foods & local foods are fabulous, and we should be grateful to those who have made them part of the landscape.   Thanks to advocates in recent years, many of us can now purchase a tomato from the store and know where it came from, or have a conversation with a grower while getting apples at the local farmers’ market.  These encounters allow us to better understand our food, and they make eating more pleasurable by connecting us to the past and to each other.

But they do not solve the fundamental problem we face with American food and the way we’ve been taught to use it.

Our generation has inherited a food system—and by that I mean everything from how our food is grown, to how it is processed, to its flavors, to its branding, marketing, and store shelf placement—that depends on convincing people to eat and drink way more food than they need, way too much of the time.

Within this system, it would be wonderful if everyone could go local—slow—organic.  But better health doesn’t depend on it.   Better health does depend on eating moderate amounts, for rational reasons, and stopping when one is full.

It’s not a zero-sum game, of course.  We can have local/slow food and ethical/sustainable food production and marketing practices.  Yet I notice that I hear far more about the importance of yummy foods and regional farmers than I do about the importance of fair, just, and rational food landscapes.  Why not advocate for the heirloom tomato and the concept of a once-in-a-while soda and the removal of nutritionally bereft (2 for 1!!) foods from the end of the grocery store aisle?  Consider what would happen if some of the efforts we put into creating and sustaining farmers’ markets were diverted to building better barriers between food promoters and American stomachs.

I think part of the problem is that the food revolution has become, in a way, too tasty.  We want to advocate for structural change, but we want to do it through a good meal that we enjoy.  Fighting for accurate claims in food marketing (diet! natural! healthy!) or dissecting grocery store product placement—these are a long way from the meals that many of us who care deeply about food would even want to eat.  Thus, while we feel occasional outrage (bewilderment?) when we see a mega-display pushing three 12-packs of soda (+ chips!) for $10, our attention easily drifts away from problems that, if solved, would benefit someone else (who doesn’t “eat right” anyway…) toward those that benefit ourselves.

We should sit down together at the table of slow/local and celebrate our good fortune.  And, when the meal is done, take that tasty energy we’ve ingested and use it to regulate industry claims and prevent the over-making and over-marketing of all kinds of food in the US.

That way, no matter how fast their food moves, all eaters could have a better shot at health.


Mayor Bloomberg would like to prevent New Yorkers on food stamps from trading “stamps” for soda.  Under his newly proposed plan, food stamps would not be an acceptable form of payment for beverages (other than milk and some fruit juices) that contain more than 10 calories per 8 oz serving.   By cutting down on soda consumption, the Mayor and his staff have argued, people will consume fewer calories and be healthier.  But what’s actually going to happen, in most cases, is that soda drinkers will remain soda drinkers—they’ll just switch to “diet.”

High fructose corn syrup and sugar contribute hundreds of inessential calories to the average American body each day.  And many of these come in the form of sodas that have, year by year, been getting bigger, cheaper, and more smartly advertised as instant pleasure delivery systems.  This is not good for us, and we should cut our consumption.

But diverting those who currently drink large quantities of sugared soda to artificially sweetened ones is not a good solution. Even the most compelling argument—that diet sodas have few or no calories and will therefore help people lose weight compared with regular sodas—can be refuted.  Emerging research on artificial sweeteners suggests (counter-intuitively) that for many users they actually lead to weight gain.  And while sugar and corn syrup are sweet, the artificial sweeteners on the market today are sweeter—200 to 600 times sweeter, part for part.  So, while they don’t contribute calories, they may encourage our desire for ever-sweeter foods (most of them food-stamp eligible). And one merely has to google artificial sweeteners to find a litany of consumer complaints that they have caused a host of ailments—claims that, while largely unsupported by science, ought to give us pause.

While we can, and do, consume too much of it, sugar does at least provide caloric energy.  Artificial sweetener, on the other hand, has nothing our bodies can use.  That has been, in fact, its selling point: we can have the pleasure of consumption without consequence. This promise has been very good for the pharmaceutical companies that make sweeteners, the food and beverage companies that put them in our food and drinks, and the marketers who have helped nearly 200 million Americans become regular consumers.  Yet one is hard pressed to find evidence that it has made us healthier.

Unless the loophole is closed, many food stamp recipients will simply switch from Coke regular to Coke Zero. When that happens, artificial sweetener and soda makers will lose little.  They may even sell more once their products are state-sanctioned good-for-you options.

If we are going to legislate nutrition—and I’m not sure we should—let’s at least make sure we don’t create new problems out of the ones we face right now. Diet soda is not a healthy choice.


So, a friend approached me a few weeks ago and asked if I would accompany him on a project. He is listening his way through this book called 1001 Albums You Must Hear Before You Die. He asked me to blog the project with him.

But I can’t do it.

There are three main problems with the book.

1. It’s too big. 1001 albums to hear before you die is too long a list — long enough that it could include just about anything, and it requires precious little editing. “Which Mudhoney album should I include? I’ll just go ahead and include both.” Britney or Christina? Bring them both along. A list this expansive fails to do what a good list does: provide some guidance by making editorial decisions about what makes the list and what does not.

2. It’s not big enough. It’s selective in a really painful way: It’s too rock-ish. For those of you keeping score at home:

Oasis 2, John Coltrane 1.
Arrested Development and Charles Mingus tie at one apiece.
Kings of Leon (2) outlast Nina Simone (1).
Billie Holiday (1) manages a draw with Slipknot (1).
John Cage (0) & Philip Glass (0) get shut out entirely.

According to this book, there are almost no jazz albums worth hearing since Miles Davis’ (4) Bitches Brew, and there is no album resembling “classical” music that you need to hear before you die. Period. In choosing the 1001 albums, the contributors demonstrate little beyond their own limited taste, or else they demonstrate what passes for the contemporary moment’s self-satisfied “eclectic” listening, which means having both Tupac (1) and Steely Dan (4) on your iPod. Come on.

3. But really, my issue with the book is how it defines the “album.” The authors credit Frank Sinatra (3) with “inadvertently ushering in the album era” with his 1955 release, In the Wee Small Hours. Albums, in fact, had been around for about a decade, but they were not the now-coveted (and fetishized) 33 1/3 rpm records. They were albums of records, modeled on photo albums, that were packaged and sold as collections — often of Broadway soundtracks, but also of the work of a single musician. By opening the “album era” in 1955, the authors commit a more egregious sin than what they did to Herbie Hancock (0). By conflating the LP with the album, they omit albums that preceded In the Wee Small Hours (like Charlie Parker’s 1950 release Bird With Strings).

But worse yet, in the age of single-song downloads and declining sales of record albums, they nostalgically enshrine the LP with a kind of rock-centric power, as if American music can be best understood by 45-minute collections of songs — as if that format is somehow both natural and superior to all others. By enshrining the album, the contributors suggest that our aural days ought not be measured in music, but in format. And in that way, the book, in its inclusions and omissions, implies that anything beyond this capacious and capricious list isn’t worth hearing, anyhow.


The conversation about “selling out” in popular music has been dead for some time. And I’m not interested in reviving that conversation now. The last time it really flared up was in the late 1980s, when Nike featured the Beatles’ “Revolution” in a commercial. Since then, it’s basically been a done deal.

So, today’s New York Times story about Converse opening a recording studio did not come as that big a surprise. It’s a pretty interesting story, actually, one that points out the real deadness of the “selling out” debate. Given the state of the music industry — the rise of digital downloading, the bloatedness of the major labels, the constriction of radio outlets through consolidation (and companies like Clear Channel), the so-called “360 deals,” rampant product placement in pop music, and so on — why shouldn’t Converse enter the industry? Why shouldn’t Whole Foods? Barnes and Noble? You or I?

It’s a rhetorical question, of course, but it raises three important issues that we ought to be clear about, if we’re thinking about the current state of popular music.

1. Converse will make music to sell shoes. The music is “successful” if it results in shoe sales. The Converse record label is the idea of Geoff Cottrill, Converse’s chief marketing officer. Cottrill is pretty plain about his intentions:

“Let’s say over the next five years we put 1,000 artists through here, and one becomes the next Radiohead,” he said. “They’re going to have all the big brands chasing them to sponsor their tour. But the 999 artists who don’t make it, the ones who tend to get forgotten about, they’ll never forget us.”

In other words, if the company has a 0.1% success rate in terms of music sales but builds brand loyalty for its shoes, then the music is a worthwhile investment. It’s a strange approach to “arts patronage,” one with none of the trappings of the Renaissance or human expression — it’s about creating art to sell shoes. And I know (thanks, Warhol) that this, too, is old, self-referential, post-modern news. Nevertheless, I think we ought to be clear about these new arrangements: what is serving, and what is being served.

2. Because it is in the business of selling shoes, Converse is actually being far more generous to its artists than the labels are (at least it appears to be). The article reported that Converse has little to no interest in owning the recordings it makes. This is something new, and it does give artists more power than they typically have under contract with major labels — but good luck selling your song to Nike or Starbucks or VW if you’ve already sold it to Converse. And if you’re a musician, you’re probably not making money selling records, so where are you going to sell your music?

3. The entrance of Converse into this marketplace seems like evidence of the breaking-apart of the music industry as we knew it in the 20th century. Indeed, one of the great things about music these days is that anyone with a laptop and an internet connection can become a label. This is radically liberating for many artists. But what’s the real difference between Columbia and Converse? Amidst the sweeping changes in the music industry, it still seems to be about artists serving larger corporate interests. Converse, like Columbia or EMI or Decca or whomever, has the broadcast outlets; it has the power in the marketplace that independent musicians don’t have.

And, though Converse seems to be more generous with its artists, it appears to care less about their music.


I don’t have a TV, but I do have access to TV programming, thanks to my Netflix subscription and various other online sources of traditional broadcast entertainment. As a result, I found myself watching Hulu the other night, and after I selected the sitcom I wanted to watch, the screen went dark and I was presented with a decision:

Which advertising experience would you prefer?

And then, Hulu presented me with a choice of three ads to watch — all for the same insurance company.

Is this really what all the hype about interactivity and the radical power of the internet has come to? If this is really what it means to “harness the power of the internet,” then I might just go back to dial-up. Has Web 2.0 been reduced to my ability to choose which advertisement to watch? The dimensions of absurdity of this encounter, framed as a “choice” of “experience,” are too manifold to list, so I’ll focus on just two:

1. It’s a choice of advertisements, but I still have to watch an advertisement.
2. It’s a choice between advertisements for the same insurance company.

In a sense, this is the epitome of choice within capitalism. Which is to say: it’s not really a choice, or else it’s a choice so constrained that the choice itself doesn’t end up really mattering. Theodor Adorno would be so proud and so perplexed. And he’d be laughing (BTW: if you don’t “choose,” Hulu will choose for you… so even not choosing is a choice).


Today’s NPR story on the relationship among weight loss, emotions, and a hormone called leptin revealed how far we have come in our understanding of food and our bodies–and how far we still have to go.  In “Rational or Emotional? Your Brain on Food,” Columbia University Medical Center researchers reveal that weight loss can cause both a slower metabolism (which can make it harder to burn calories) and lowered levels of leptin–a hormone that works to control appetite.  It turns out that when people who have lost weight are given injections of leptin, they show more activity in areas of the brain “associated with conscious decisions.”  While the study isn’t definitive, the report suggests that increasing these hormone levels may help people who have lost weight make better decisions to keep that weight off in the future.

Who wouldn’t be excited about an injection that could enhance willpower, especially in keeping off those hard-fought few pounds?  But will it, really?  For this we have to look at the method.  Researchers measured people’s willpower by giving them an MRI and then showing them various plastic containers of foods–gummy worms, candies, cookies, bell peppers–and measuring brain activity as they contemplated the yumminess (or lack thereof) of the items.  But how many of us confront that moment of food choice (bell peppers or gummy worms, hmm…) while lying in a scanner staring at slow-moving food separated into plastic bins?  Aren’t we in fact making food choices in precisely the opposite conditions–standing in chaotic fast-food lines, ordering from menus with one hand on the iPhone, managing crying children or chatty friends, just after breaking up with girlfriends, while walking aisles where hundreds of brightly packaged, jumbled munchies vie for our attention?

Studies like the one at Columbia are essential if we want to understand the complex ways in which our bodies react to appetite stimulation–and we should.  Still, we need to see them for what they are: questions asked and answered in a vacuum.  The reporter herself acknowledges this when she opens the piece by recalling how the overwhelming smell of peanuts sold by a street vendor challenged her calorie resolve, even in the midst of the story.  Would those peanuts have affected her the same way in the MRI tube, with a set of researchers well aware of “good food choices” watching her reactions?  We live in a complex culture, and it is there that our food choices are made.  We can isolate the data in the lab and come up with technological solutions, but these will only address part of the problem.  And they may create new problems all their own.  We need to study food where we eat it–in the real world. For that we need scholars trained in the socio-cultural aspects of food production and food choice, and scientists ready to bring them on board.


To wit: this week’s “Education Life” section of the NY Times, where the cover article is called “Making College ‘Relevant.’”  I appreciate the quotation marks in the title, but the article seems to focus primarily on how to translate a BA into a J-O-B. This is a question that those of us in the humanities and social sciences get with some frequency. And our response is often couched in terms of “critical thinking skills” or “cultural analysis” or “nuance,” “subtlety,” “tensions,” or “cultural politics.”

But the question isn’t really whether what we do is relevant, but rather why the job hunt and the endless pursuit of wealth and “practical knowledge” seem to have controlled the conversation about “relevance.” Why are those of us so gifted at cultural analysis often so poor at explaining its “relevance” to our students? Or, maybe more importantly: why is it such a challenge to provide frameworks that let our students recognize the relevance of what we’re doing on their own? Surely, relevance isn’t only about capitalizing on skills you can market through your “personal brand.”

Frankly, if relevance were judged by making money, the NYTimes would be in worse trouble than our universities.



The UCD Sustainable Pen

There’s something troubling about this artifact.  What appears at first to be another of the hundreds of variations on the “spirit pen,” this UC Davis implement is something else entirely.  It is actually a plastic pen–perhaps of the Bic variety?–wrapped in a thin layer of cardboard onto which the UCD logo is affixed (along with a note indicating that it is “recycled material”).  Topping off this creation is a popsicle-stick-like clip attached directly to the plastic pen top, which can, one assumes, enable the user to affix the pen to the interior of a pocket–a scenario in which only the popsicle stick would protrude.

I don’t believe the UC Davis gurus of promotional products mean this to be funny.  Jokes that pass plastics off as environmentally friendly products don’t tend to get a laugh these days.  It’s possible that this is a sign that our sales and marketing team is losing its edge–certainly this is not a great product, on several levels.  Greenwashing has to go right alongside peeling cardboard, cracking popsicle sticks, and depleted ink on the list of “poor design qualities.”

Yet let’s imagine just for a moment that we might need to take this pen seriously as what it claims to be: a symbol of our university.  That’s when things get scary.

In the day of business plans and bottom lines, is it too much of a stretch to imagine that this pen symbolizes the precarious nature of knowledge itself in the modern research university?  With class sizes growing and the pressure to “make something” and “bring in funding” (even in the humanities), we may find ourselves teaching students who never get deeper than superficial concepts and producing work that looks neat but ultimately has little there there.  Maybe we are becoming the professorial equivalent of cardboard-wrapped plastic pens touting our “recycled material.”

And that is definitely not funny.


One of the laments we often hear about the state of childhood in America today is that childhood has become commercialized.  Our news media and bookstore shelves regularly feature stories about how American children are barraged by advertising and marketing: the average child views 40,000 television commercials a year, children as young as three can recognize brand logos, kids these days are “born to buy.”  There is no question that children are exposed to an inordinate amount of advertising and consumer messaging in their daily lives–just as adults are.  However, it’s sometimes easy to forget that what appears to be the latest “crisis” in childhood is not exactly new.  I was reminded of this recently when I showed students in my popular culture class a clip from the 1950s children’s program “Ding Dong School.”

“Ding Dong School” (1952-1956, NBC) was hosted by Dr. Frances Horwich, an educator who pioneered the idea of interacting with her young viewers through the television set (an approach Mr. Rogers perfected years later).  “What day is today?” she would ask, then pause to give kids at home time to answer, “Friday.”  The content was meant to be educational and culturally enriching.  Like any other television show, however, “Ding Dong School” was commercially sponsored, and “Miss Frances,” as she was known, regularly pitched products to her young viewers.  The clip I showed in class was of Horwich hawking Wheaties; you can watch it on YouTube.  Viewing this ad, we are reminded that direct advertising to children is nothing new.  Holding up a Wheaties box, Miss Frances asks kids at home, “What do I have? [pause] What do you think it is? [pause] Say it with me–Wheaties!”  She proceeds to instruct her young viewers, “When mother goes to the store, you help her find the new Wheaties box… I’ll tell her about it.  You tell her about it, too.  Please.”  This naked appeal to children to develop brand loyalty, to think like consumers, and to influence their parents’ spending is arguably no less sinister than what many Americans rail against today as the “newest” threat to children.

There is in fact a long history in the United States of manufacturers marketing their goods and services to children, using a variety of direct and indirect means.  Boxes of Cracker Jack started coming with toys inside them in 1912.  In the 1930s, many toymakers began appealing directly to children (previously they had targeted parents with their advertising).  Pinocchio toys were in production a year before the release of the Disney film in 1940.

There is similarly a long history of adults fearing that children are being unduly influenced by consumerism.  In 1902, for example, social reformer Jane Addams observed starkly, “Has our commercialism been so strong that our schools have become insensibly commercialized?… Is it possible that the business men, whom we in America so tremendously admire, have really been dictating the curriculum of our public schools?”  And in 1956, Dr. Frances Horwich resigned from NBC because she was concerned that children’s TV programming had become too commercialized.

Watching Miss Frances sell Wheaties in class this week, and reflecting on this broader historical context, reminded me that many of our contemporary fears about American childhood have been around for a long time.  I was also reminded of a lament expressed by many scholars who study the history of childhood: As a society, we tend to do a much better job of complaining about the state of childhood than of actually working to change and improve the lives of children.

