
Archive for the ‘communication’ Category

How did the humanities lose out to neuroscience in making culture seem interesting?

I’ve been listening to a lot of NPR and podcasts lately. I’ve given my historical favorites a little break (sorry, This American Life and Sound Opinions), and I’ve been listening more and more to Planet Money and Radiolab (as podcasts), and to the usual NPR fare that airs while I do dishes or cook dinner: All Things Considered, Marketplace, and of course: Fresh Air.

What I’ve noticed is how often scientists and economists show up on these shows to talk about things I thought were the main interests of humanists and social scientists. Questions like how restaurants work, whether or not race matters, why certain songs get stuck in our heads, how people calculate the value of things or make decisions they know are not in their best interests, and so on.

These are the questions to which I have long sought answers by looking at culture and its various expressions, and in which my field of American Studies has long been interested (albeit in different forms, over time).

Yet somehow, every time I turn on the radio, I find one or another neuroscientist (or, often enough, Jonah Lehrer) talking about precisely these same questions, and about how the pathways of neurons and synapses can answer questions about art or love or whatever.

So here’s my question to my colleagues: how did we become so untouchable or so uninteresting to mainstream media? How come the good people at NPR (and, presumably, their listeners) find neuroscientists and economists more interesting and more capable of talking about these questions than we are? How did they become the go-to academics for understanding how and why people do what they do? Social scientists and humanists look at those phenomena, too, but somehow, we have become generally less interesting than our colleagues.

This is not the neuroscientists’ fault: they are good at what they do, and their creativity in asking profound questions that teeter on the line between culture and biology ought to be encouraged. Similarly, it’s not the fault of the radio programmers; they are looking for the most intelligent, engaging guests they can find. And they’re finding them in neuroscience and economics, not in the Humanities.

Why is everyone else talking about culture but us? Are we that boring? Have we grown so adept at gazing at our own navels that we can’t talk about other things? Does “the public” think that so-called “hard” science is really the only arbiter of actualities in the world?

How have we become so irrelevant even on topics that are ostensibly our areas of expertise and scholarly interest?


Read Full Post »

Whenever someone (often on the internet, but not always) wants to emphasize how smart the collective masses are, or point to the effectiveness of the “hive mind,” they always end up looking at Wikipedia.

But why hold Wikipedia up as a bastion of either mass-generated wisdom or productivity? Why make Wikipedia the pinnacle of the possibilities of collective action?

Yes, Wikipedia is both a tremendous source of information and an example of the possibilities of a kind of open-ended collaborative effort, one that taps the massive labors of folks around the world who would, historically, have been shut out of the process of knowledge production. And that opening-up of access is a good thing, generally speaking. However, the price of such opening-up may be this: we get a more generally accurate account, but a less curious, less inspiring, and ultimately less informative product.

Wikipedia gives us information, but beyond that, I’m not sure it gives us very much.

Pointing to Wikipedia or the Encyclopedia Britannica as the best of a culture’s intellectual output makes about as much sense as looking to sausage for the best of a culture’s protein sources. Both are collections of abstract and often reductive information — no matter how extensive that information is. Rendered, often, in the authoritative prose of clumsy journalism, both sources are and remain locations for fact-checking and the collection of surface-level information. They’re useful, and (to extend the sausage metaphor) even a little nourishing, but they are hardly the finest examples of intellectual production to which we can point. Wikipedia is great for checking dates of major events or grabbing thumbnail sketches of historical figures, but it is not a terribly productive location for fostering curiosity or encouraging complex intellectual investigation.

Wikipedia offers information at its most basic and banal. It does not signal the “death of the expert,” but its eternal life. And I don’t mean the good kind of expert, either. I mean the niggling, nit-picking kind of expert who treasures the firmness of facts over the flexibility or rigor of intellectual labor.

Wikipedia is great at presenting gross information, but it is much worse at presenting nuanced argument. Encyclopedia Britannica is not much better, for that matter. But what the continual stoking of the Wikipedia v. Britannica debate suggests is not the triumph of the collective mind over the individual brain. Nor does it indicate the death of the expert. Instead, it illustrates the embrace of the middle by the mass — which doesn’t tell us anything we don’t already know, and which is exactly where Wikipedia belongs.

Coming Soon: The Death of the Death of the Expert

Read Full Post »

The great irony of The Social Network, of course, is that its central theme is not connectivity but disconnection.  A film about the genesis of a technology designed to bring people closer together features a main character who experiences the painful dissolution of social bonds. The plot is driven by protagonist Mark Zuckerberg’s exclusion from social groups, the end of his romantic relationship, and the collapse of his close friendship.  This sense of disconnection is pervasive despite the fact that The Social Network is a crowded movie: characters are almost never seen alone (only when they are individually “wired in” to their computer screens), and certainly never seen enjoying a moment of quiet solitude. Instead, the characters are regularly packed together in small places—a legal office, a dorm room—or in big, loud impersonal places—a nightclub, a drunken party. But these spaces, despite their capacities, are repeatedly portrayed as lonely and alienating.

While the characters may inhabit a decidedly unsocial non-network beneath a façade of constant social interaction, the film itself serves as a remarkably vibrant cultural network. For the student of American culture, The Social Network is a fountainhead of intertextuality.  Perhaps appropriately for a film about Facebook, The Social Network functions as a busy crossroads of cultural referents, many readily recognizable, others unstated but nevertheless present.  The movie obviously plays on our familiarity with Facebook, but it also features appearances by Bill Gates and the creator of Napster (both portrayed by actors), a musical score by industrial rock luminary Trent Reznor of Nine Inch Nails, and a Harvard setting (even though campus scenes were mostly filmed at The Johns Hopkins University).  It is also directed by David “Fight Club” Fincher and written by Aaron “West Wing” Sorkin.  One of the students on Zuckerberg’s Facebook team is played by Joseph Mazzello, the actor who starred as the little boy Tim Murphy in Jurassic Park.  In other words, what is really “social” about The Social Network is the way in which it serves as a pulsating intersection for a range of icons, myths, and expressive forms that circulate in the audience’s collective imagination.  It is populated by cultural detritus and ephemera with which we are friendly, if you will.

I imagine these multiple and varied cultural associations may in part explain the source of the film’s pleasure for viewers. The experience of viewing The Social Network is akin to data-mining.  It rewards a 21st century audience accustomed to scanning mounds of digital information and quickly categorizing that information into familiar frames of reference. For example, the brief appearance of Bill Gates evokes our Horatio Alger myth of success and the American dream.  The presence of Sean Parker of Napster infamy conjures associations with the lone rebel, hacking the system and sticking it to the man.  And Zuckerberg himself comes across as a nerd pulling one over on the Olympic jocks.

Reznor’s musical score triggers memories of his earlier work on such Nine Inch Nails albums as Pretty Hate Machine (1989).  That record opens with the line, “god money I’ll do anything for you/god money just tell me what you want me to,” and builds to the chorus, “head like a hole/black as your soul/I’d rather die/than give you control.” Pretty Hate Machine, with its loud synthesizers, drum machines, and vocal wails, is not unlike The Social Network: an expression of male adolescent angst and rage confined inside an electronic world.

And there are still other resonances: Fincher’s directorship reminds us of his previous explorations of masculinity and antisocial behavior in Fight Club, The Game, and Zodiac.  Sorkin’s dialogue echoes the brainy loquaciousness of the political characters he developed for the television show The West Wing. Nearly twenty years ago, in Jurassic Park, actor Mazzello played a child victimized by a technology abused and gone awry.

As I watched The Social Network, I even found correspondences with The Lonely Crowd (1950), the sociological study of “other-directed” social character that became a bestseller in postwar America.  Co-authored by David Riesman and his team, the book argues that Americans are increasingly motivated by the need for peer acceptance.  More and more, our “inner gyroscope” is set in motion not by individualistic self-reliance, but by the drive to win approval and fit in.  Consequently, our time is spent trying to decode what is popular and adjust our personalities accordingly: “The other-directed person acquires an intense interest in the ephemeral tastes of ‘the others’… the other-directed child is concerned with learning from these interchanges whether his radar equipment is in proper order” (74). What is The Social Network if not a depiction of a lonely crowd?  Indeed, isn’t Facebook itself a kind of lonely crowd?

I can’t help but wonder if this way of reading the movie—this pleasurable scanning of its cultural allusions—in some way works to conceal its underlying critique of 21st century connectivity. The film’s central theme of social dissolution is undercut by its teeming network of familiar, friendly signifiers.  Its “ephemeral tastes.”  Yet we are “friends” with these signifiers much as we are “friends” with hundreds of people on Facebook—another text, by the way, that we scan with the same data-mining mindset.  As portrayed in The Social Network, Facebook doesn’t really bring people closer together in any meaningful way; it facilitates casual hook-ups in bar bathrooms, or it breeds addiction, as one character confesses, or it quickly delivers trivial information like the results of a crew race. You would think The Social Network, with its depiction of Zuckerberg and his creation, would compel a mass exodus from Facebook, or at least spark critical public reflection on our complicity with this technology of the lonely crowd. But instead it rewards and reinforces our ways of reading: our ingrained habits of consuming images, scanning information, reducing human experience to pannable nuggets of gold.

Read Full Post »

The conversation about “selling out” in popular music has been dead for some time. And I’m not interested in reviving that conversation now. The last time it really flared up was in the late 1980s, when Nike featured the Beatles’ “Revolution” in a commercial. Since then, it’s basically been a done deal.

So, today’s New York Times story about Converse opening a recording studio did not come as that big a surprise. It’s a pretty interesting story, actually, one that points out just how dead the “selling out” debate really is. Given the state of the music industry — the rise of digital downloading, the bloatedness of the major labels, the constriction of radio outlets through consolidation (and companies like Clear Channel), the so-called “360 deals,” rampant product placement in pop music and so on — why shouldn’t Converse enter the industry? Why shouldn’t Whole Foods? Barnes and Noble? You or I?

It’s a rhetorical question, of course, but it raises three important issues that we ought to be clear about, if we’re thinking about the current state of popular music.

1. Converse will make music to sell shoes. The music is “successful” if it results in shoe sales. The Converse record label is the idea of Geoff Cottrill, Converse’s chief marketing officer. Cottrill is pretty plain about his intentions:

“Let’s say over the next five years we put 1,000 artists through here, and one becomes the next Radiohead,” he said. “They’re going to have all the big brands chasing them to sponsor their tour. But the 999 artists who don’t make it, the ones who tend to get forgotten about, they’ll never forget us.”

In other words, if the company has a 0.1% success rate in terms of music sales, but it builds brand loyalty for its shoes, then the music is a worthwhile investment. It’s a strange approach to “arts patronage,” one with none of the trappings of Renaissance patronage or the celebration of human expression — it’s about creating art to sell shoes. And I know (thanks, Warhol) that this, too, is old, self-referential, post-modern news. Nevertheless, I think we ought to be clear about these new arrangements, and about what is serving and what is being served.

2. Because it is in the business of selling shoes, Converse is actually being far more generous to its artists than the labels (at least it appears to be so). The article reported that Converse has little to no interest in owning the recordings that it makes. This is something new, and it does give more power to the artists than they typically have under contract with major labels — but good luck selling your song to Nike or Starbucks or VW if you’ve already sold it to Converse. And if you’re a musician, you’re probably not making money selling records, so where are you going to sell your music?

3. The entrance of Converse into this marketplace seems like evidence of the breaking-apart of the music industry as we knew it in the 20th century. Indeed, one of the great things about music these days is that anyone with a laptop and an internet connection can become a label. This is radically liberating for many artists. But what’s the real difference between Columbia and Converse? Amidst the sweeping changes in the music industry, it still seems to be about artists serving larger corporate interests. Converse, like Columbia or EMI or Decca or whomever, has the broadcast outlets; it has the power in the marketplace that independent musicians don’t have.

And, though Converse seems to be more generous with its artists, it appears to care less about their music.

Read Full Post »

I don’t have a TV, but I do have access to TV programming, thanks to my Netflix subscription and various other online sources of traditional broadcast entertainment. As a result, I found myself watching Hulu the other night, and after I selected the sitcom I wanted to watch, the screen went dark and I was presented with a decision:

Which advertising experience would you prefer?

And then, Hulu presented me with a choice of three ads to watch, all of them for the same insurance company.

Is this really what all the hype about interactivity and the radical power of the internet has come to? If this is really what it means to “harness the power of the internet,” then I might just go back to dial-up. Has Web 2.0 been reduced to my ability to choose which advertisement to watch? The dimensions of absurdity of this encounter, framed as a “choice” of “experience,” are too manifold to list, so I’ll focus on just two:

1. It’s a choice of advertisements, but I still have to watch an advertisement.
2. It’s a choice between advertisements for the same insurance company.

In a sense, this is the epitome of choice within capitalism. Which is to say: it’s not really a choice, or else it’s a choice so constrained that the choice itself doesn’t end up really mattering. Theodor Adorno would be so proud and so perplexed. And he’d be laughing (BTW: if you don’t “choose,” Hulu will choose for you… so even not choosing is a choice).

Read Full Post »

A few weeks ago I reorganized my bedroom closet.  This alone may be worthy of a blog post, but I won’t bore you with recounting the small joy that this task brought me.  What struck me about the process, from a cultural perspective, was the sheer volume of paper memories I found myself sorting through and reordering.  Ten photo albums.  Two file crates of stories and poems I wrote as a child and adolescent.  Four different memento boxes of written correspondence from friends, family, and former girlfriends, dating from high school to the present.  A thick stack of letters from my grandmother, starting in college and continuing through her death in 1999.  A shoebox of love letters, another shoebox of random photographs, a pile of birthday cards.  All handwritten.  All saved.  All newly organized on a shelf in my closet.  All ready to be grabbed up in case of a fire.

I always imagined that I would re-read these letters someday on my porch sitting in my rocking chair when I was old and gray.  I would revisit the words, the thoughts, the feelings, the handwriting of people I know and love, of people I knew and loved.  But as I was organizing, I became painfully aware of a gap.  The collection had dwindled substantially over the past eight years, slowing to barely a drip of birthday cards and the occasional sweet letter sent by a former partner.  But essentially the paper trail dries up around 2002.  It noticeably dries up.  Why?  Because my correspondents and I stopped sending mail and instead used electronic means almost exclusively.

My question that day, surrounded by boxes on the floor and letters and photos strewn about my bed, was this: how will we organize, preserve, and retrieve our memories, our special moments, our correspondence, in the digital age?  We communicate via email, we post our digital photos on Flickr and Facebook, we text quips and best wishes and intimate confessions.  But how many of these will be saved?  When we are old and gray, will we sit in rocking chairs and peruse our laptops, reread thousands of emails, and revisit texts stored on old cell phones?  And if so, will anything be lost?  Or gained?

Perhaps the age we live in has merely required us to be more selective.  To print out the few emails that mean something to us, file them away, and let the rest pass into the past.  On more than one occasion I’ve been accused of holding on too much to my personal history, as I’ve carted these memories around from apartment to apartment over the years.  Granted, I do.  But mail has meant something to me in a way email hasn’t.  Maybe it’s not the medium but the message that counts here.  Still, I think reading the handwriting of others evokes a certain kind of memory, connection, and closeness in us.  Do standard text fonts have the same power?  I suppose we’ll find out in the years to come.

Read Full Post »

Hours before Sunday’s broadcast of the Oscars, whole swaths of the New York viewing audience were cut off because ABC pulled its affiliate’s signal from the local cable provider. In the wake of this event, a coalition of broadcast providers submitted a petition to the FCC, arguing that the current rules governing broadcasting and retransmission are ripe for reassessment.

The current rules were formulated in 1992, and it should be obvious that lots of things have changed in telecommunications and broadcasting since then.

The lag between older legal structures and emerging technologies and practices is nothing new. In this realm, the law does not govern retroactively and never has; it always operates in the wake of history. Laws about new or emerging communications technologies are built on older frameworks, and therefore have always lagged a few decades behind the actual communications practices and technologies they are written to govern.

As telephony and telegraphy spread, the laws that governed them were laws originally written to regulate interstate commerce. The only imaginable structure that could compare with telephony was trucking. When the Radio Act was written in 1927, it was modeled on the wireless laws that controlled ship-to-ship communications, because that was the only available framework for understanding wireless communication.

The laws governing television followed the laws of radio and the network model, and the laws about cable were written with television in mind, and so on. To be sure, there have been changes in the law — some innovative, some deeply troubling. But new laws are imagined and formulated as if new communications models will look just like older ones: as if telephony would be similar to trucking, or television would be like radio.

This is less a failure of law than a success of commercial interests, which have shaped these frameworks from the very beginning. Here’s the punchline: the commercial interests (AT&T during the radio era, DirecTV now) are deeply invested in maintaining the status quo. There is lots of money to be made (or lost) in controlling what communications look like. There is little impetus for innovation as such, so why not build the new laws to look like the old ones?

Certainly, the internet is radically reshaping habits and practices of popular culture consumption, and the 17-year-old laws are deeply in need of an overhaul. But is this round of lawmaking going to preserve the past or usher in the future?

Read Full Post »

A number of media stories have appeared recently dissecting the ways in which technology and social networks have transformed our culture of dating, romance, and breaking up.  For example, a few days ago NPR reported on “Digital Tears: Breakups and Social Networks,” the New Yorker online ran a short piece on “The Importance of Email to Romance,” and on Valentine’s Day, one writer’s personal story about Facebook heartache was published in the L.A. Times under the title, “Relationships in the Digital Age.”  Granted, the appearance of these stories has conveniently coincided with our national marketing of Valentine’s Day.  Nevertheless, they are interesting to consider for their articulation of concerns about the rapid and often uncomfortable ways in which technology seems to be influencing our ideas about romance and especially the end of romance.

Historically, of course, new technologies have always impacted our cultural construction of romance as well as our everyday experience of it.  The massive expansion of the U.S. postal system in the 19th century greatly facilitated the writing, sending, and receiving of romantic letters on a regular basis.  The automobile completely changed rituals of courtship and dating.  The telephone, electronic mail, and now social networks have similarly transformed how romantic partners communicate with one another (and how they expect to communicate).

What these recent stories suggest is that, while our latest online technologies can make romance feel more immediate, exciting, and accelerated, they have also made breaking up a rather drawn-out, torturous experience.  The multiple connections that romantic partners establish via online technology become difficult to sever, and the information provided by these connections often becomes difficult to stomach.  On remaining socially networked to exes, for instance, one interviewee in the NPR segment said, “It’s basically like stabbing yourself in the heart again every four hours or so.”  The L.A. Times writer used a similar metaphor when contemplating whether he should stay Facebook friends with his ex: “Seeing her updates and knowing I wasn’t in the inner circle anymore would be like another knife in the heart.”

I see two themes emerging from these stories.  The first is access.  With the overwhelming access to information that the Internet has afforded us, we have lost a sense of propriety.  We have come to feel entitled to full access to information about our partners, no matter how early in the relationship, or even after the relationship has ended.  The second is remorse.  There is a price we pay for making our private lives so incredibly public online.  Amidst our current debates over corporate and government invasions of privacy, be it the Patriot Act or Google Buzz or Facebook, we sometimes forget that we also willingly put ourselves out there, often too soon and too much in focus.

Read Full Post »

For three years now I have not had a television.  I enjoy telling people this, as friends and family (as well as strangers I’ve met at parties) will readily testify.  Whenever they talk about some commercial or new show, I always say, “I don’t know what you’re talking about.  I don’t watch TV.”  The pleasure I get from this disclosure comes in equal measure from witnessing their disbelief and from tingling with my own self-righteousness.  (For someone who teaches the history of popular culture, the fact that I don’t watch TV is ironic if not problematic, but that’s probably the subject of another post.)  Recently, however, several of my close friends have begun to call me out on my smug pronouncement.

Them: “So you don’t have a television in your apartment?”
Me: “Well, I have a monitor that includes a built-in DVD player that I watch movies on.”
Them: “But it is technically a television set.”
Me: “Yes, technically, but I don’t have cable.”
Them: “Do you watch television shows on DVD?”
Me: “Well, sure.  I rent seasons of Arrested Development, Californication, Six Feet Under, and so on.”
Them: “Do you watch television shows on your computer?”
Me: “Well, once in a while I’ll watch clips of Jon Stewart online, or I’ll watch episodes of old TV shows on YouTube.”
Them: “So you watch TV shows online and on DVD?”
Me: “Yes.”
Them: “Then I hate to break it to you, but you still watch TV.”

Hence these friends of mine, these masters of logic and forensics, believe they have put me in my place.  And perhaps they have.  Two cultural questions emerge here.  First, what does it mean to “watch TV” today?  And second, what does it mean to say, “I don’t watch TV”?  American Studies practitioners (as well as psychologists) would probably be less interested in debating whether or not I technically “watch” television (technically I suppose I do), and more interested in analyzing why I feel the need to tell people, “I don’t watch TV.”  What does it mean that I say this?  What does it reveal about my sense of identity?  What does it say about how I perceive my relationship to modern technology, to mass culture, to social conventions?  Why is not watching TV an essential component of my vision of who I am—of my vision of self—living in the United States in the year 2009?

As an American Studies practitioner, I’ll try to answer my own questions.  Here are four possible ways to interpret my TV boast, though this list is by no means exhaustive.  1) It is my way of asserting control over technology.  When I say, “I don’t have TV,” maybe what I really mean is, “TV does not have me.”  I mean that I can watch TV shows however and whenever I want, on my own terms—on DVD, or online, or not at all.  I am not yoked to the technology, not beholden to the medium.  I am master of it.  2) It is my way of signaling that I can control the flow of information into my domestic space (I also don’t have internet access at home).  I can limit the intrusion of the outside world into my private sphere.  3) It’s my way of pushing back against information overload in this new digital millennium.  I am making a return to simple living in a complex age.  4) It is a way to express my individuality, to present myself as a nonconformist, and—let’s be honest—to use culture and taste to make me feel superior to others.

I can already hear my friends saying, “So why don’t you just get cable and a DVR and watch specific shows whenever you want and shut up already?”  Fair enough.  But if I had cable, I’d probably watch TV all day long.  And then TV would definitely have me.

Read Full Post »

I frequently find myself caught somewhere between the criticism of twitter as full of mindless blather and the praise of it as a new venue for communication and information sharing (see: protests in Iran as examples of the latter, and just about anywhere else as examples of the former). But as someone trained as an historian, I’m less interested in historical ruptures and things created ex nihilo than I am in the strange ebb and flow of historical tides, especially where technology is concerned.

So, I found myself thinking about ham radio, because the thing about ham radio was that people mostly tinkered in their basements and sent out signals, trying to get in touch with as many people as they could. When you tuned in another ham operator, you usually acknowledged receipt of the signal by sending them a postcard in the mail, noting the time and day of the signal you tuned in. The postcards themselves are sometimes really beautiful, but that’s a different story.

See where this is going? On ham radio, people were communicating over long distances with one another, but the impetus and conventions had less to do with saying something in particular than with saying anything at all. The goal was not the proverbial “deep and meaningful” conversation, but just the act of communication.

So I went ahead and bought a bunch of ham radio postcards on eBay (mostly because I could, but also because, it turns out, they’re fascinating). I bought a lot of 264 cards from the early 1950s, collected by a man named Dale Wolters of Zeeland, Michigan. Call letters: W8GEH. All of these cards were sent to him by people who heard his signal — and they are from all over the world: South Africa, Spain, Germany, the Caribbean, Mexico. Ham radio was global long before all this talk of “globalization.”

People were connecting just to connect long before twitter breathed its first tweet.

Read Full Post »