awesome hanging chain blue earrings

ok, so you can get these awesome chain earrings for only $4.00. remember, only one pair available. please email us at eandsbeads@gmail.com

In my late thirties, I developed an obsession with American cinema of the 1970s that surfaced yearly in the dead of winter.  The obsession would start with me reading a scholarly book about seventies films.  This would be followed by a manic updating of my Netflix queue and then marathon screenings.  I would usually burn out by late March.  I would move on to other interests.  I would forget about seventies films.  But the cycle would start again the next year.  Always like clockwork, yet never planned.

The obsession began in the winter of 2006, when I bought The Films of the Seventies, an annotated filmography of every American, British, and Canadian movie that had come out between 1970 and 1979.  I bought the book on a lark at a used bookstore in Winston-Salem.  I was living in North Carolina with a girlfriend, in a relationship that was still a year and a half away from its end but already on the decline.  When I got the book home, I highlighted all of the films I had already seen: Jaws, Star Wars, Breaking Away, Alien, Grease—the stuff of childhood memories.  Then I proceeded to watch classics that, embarrassingly, I hadn’t yet seen: Dog Day Afternoon, Network, Serpico, All the President’s Men, The French Connection, Chinatown.  At the time, I saw this as an exercise in self-education.  I had recently finished a Ph.D. in American Studies, and I felt I needed to know about these films.  The project preoccupied me for a few months, and then it ended.  Or so I thought.

In February 2007, the obsession returned.  I read, in quick succession, Peter Biskind’s Easy Riders, Raging Bulls: How the Sex-Drugs-And-Rock ‘N’ Roll Generation Saved Hollywood (a book I had been assigned in grad school but never read) and Peter Lev’s American Films of the 70s: Conflicting Visions.  The reading had nothing to do with my teaching or my academic research.  I just found myself enthralled by these assessments of cinema and the 1970s.  Biskind hooked me when he wrote, “this was a time when film culture permeated American life,” when film was “no less than a secular religion.”  I was intrigued by his claim that “this was to be a directors’ decade if ever there was one.”  Lev similarly captured my imagination when he noted that this was “the most exciting and most experimental period of the American feature film.”  I made lists as I read, and I subsequently watched movies like Shampoo, Scarecrow, The Parallax View, The Conversation, Marathon Man, The Last Detail, and, fittingly enough, Obsession.  This time, my fixation lasted into the spring.  By the summer, I had moved to California to start a new job and my six-year relationship had ended.  I was thirty-seven years old.

A year went by without a relapse.  I was busy teaching at a new school and processing my failed romance.  I spent February and March in a battle with a loud, temperamental neighbor.  In search of a fresh start, I moved into a beachside apartment in April.  I cultivated new friendships.  There was no seventies that year for me.

In the winter of 2009, however, it started again.  I devoured a collection of essays, American Cinema of the 1970s: Themes and Variations.  I made more lists.  My viewing included Klute, Nashville, Night Moves, Joe, Phantom of the Paradise, Hardcore, McCabe and Mrs. Miller.  But that year, I had a revelation.  I recognized a pattern not in the cycle of what I was doing, but in my experience of it: as it turned out, every time I revisited the ’70s, I liked reading about the films more than I liked watching them.

The books made the movies sound better than they were.  The books put the films in a historical context, explicated their cultural themes, imbued them with political significance—Watergate and Vietnam, the working class, paranoia and conspiracy, generational strife.  But the movies themselves?  They were often plodding, incoherent, indulgent, dull.  Not all of them.  But many of them.  The Last Detail bored me.  Night Moves confused me.  Phantom of the Paradise was wacky.  Joe was improbable.

In retrospect, the books I had read warned me of this. Peter Lev, for example, states bluntly (albeit on the very last page of his book), “one should not overly romanticize the films of the 1970s.  In a period of uncertainty and change, many mediocre films were made.”  In his introduction to Themes and Variations, editor Lester Friedman counsels the reader to beware of “hyperbolic assessments” of the era, for they “belie the fact that terrible movies were made during the 1970s.”  He writes this on page twenty-three, so I suppose I have no excuse there.

The same thing would happen each year. I regularly found myself waiting impatiently for a movie to end just so I could highlight the title in my filmography and be done with it.  Or avoiding watching it altogether.  My ex had noticed this tendency of mine when we still lived together in North Carolina.  Some of the titles that arrived from Netflix would sit next to the television for weeks.  She would threaten to mail them back the next time I was away from the house.  “You don’t have to do this to yourself,” she said at one point, sending us both into a fit of laughter.  And yet I did.  I was stuck in a cycle of pleasure and tedium.

Then I turned forty in 2010 and it all stopped.

I have two related theories about my obsession with American cinema of the 1970s.  The first is that the books I read provided a historical context for a time in my life when I was aware of very little beyond my own childhood.  The books offered a portrait of a cultural milieu that was taking shape outside of my tiny world of family, play, and school.  I could insert the memories and images I had of my own seventies childhood—a quite happy childhood—into a bigger picture of what was going on in America.  This was an exercise in intellectual nostalgia, in historicizing my youth.  That’s why I liked reading about the films more than watching them.  The moving pictures I had in my head, of me then, were much better.

The second theory is that reading about the seventies provided an escape.  In my late thirties I was going through major transitions: a stressful job search, the end of a serious relationship, a move to California.  Rather than deal with the corrosion and change that defined my present, I retreated to a safer past.  It wasn’t really about the seventies at all.  Sometimes we sit through bad movies because we need to.

So was my obsession with seventies films about nostalgia or escape?  I’m not sure they’re so different.  I was certainly obsessed with both at a particular stage of my life.  All I know is that when I finally let her go, when I turned forty and resolved to look forward more than I looked back, that decade also lost its purchase on me.  It’s true that one should not overly romanticize American cinema of the 1970s.  But it’s also true that one should not overly romanticize the past.  Those moving pictures may sound good in theory, but sitting through them can just be another way of making the present pass by.

Not Again

When I learned of the massacre of children and teachers in Connecticut, my first thought was “not again.” Based on my Facebook and Twitter feeds, I am not alone. “Not again,” we say, first in disbelief, and then in resolve. Yet until something changes, we all know, it will happen again.

The first step, I think, is to realize that we face a human problem. Evil is part of our nature as surely as good is. We are a species of violence. This is a profound truth—a historical truth, a biological truth, a moral truth. The great theologian Reinhold Niebuhr once declared, “The doctrine of original sin is the only empirically verifiable doctrine of the Christian faith,” and now we have yet more horrific evidence of this truth. Yet if we stop our thinking there, we remain mired in a hopeless nihilism. Surely evil will always be with us—in this world, at least. But the persistence of evil is no reason to cease working for good. We can’t make a perfect world, but we can make a far better one.

We are now appallingly familiar with the script that plays out in the hours and days after an atrocity like this, and so we know that soon enough some pundits or politicians will start declaring that the answer to gun violence is more guns. If only a teacher had been armed, or the principal. Since evil will always be with us, they say, if the other guy has a gun, I better have one too. A bigger one. This is the answer of moral nihilism. We can do better.

Pondering evil is a fine place to start, but we need to understand that this kind of evil is not just a human problem—it is, sadly, a particularly American problem. We live in a culture diseased by violence. Ours is a frontier society, born and nurtured in conquest, genocide, enslavement, and revolution. Ours is a beautiful and noble experiment in freedom and democracy too, admirable and good in many ways—but also, clearly, a society still living off an inheritance soaked in blood. This inheritance has left us a culture that glorifies war and weapons, that preaches perverse doctrines of honor and distorted notions of manhood. The Swedes and Japanese and Indians share our same human condition, but they don’t kill like we do. We are reaping what we have sown.

This realization too might lead one to despair. With so many guns, and such a legacy of violence, the best we can do, say the gun lobby and its enablers, is arm ourselves, protect our own. Aside from the empirical reality that such defensiveness rarely works—as was the case in Newtown, the guns in our homes are far more likely to kill through crime or accident than to protect—this position fails to understand that cultures can and do change. Our American cult of violence is a product of history, and it can be undone in history. Not easily, not quickly, but schools and parents and churches and governments and people of courage and character can move the tectonic plates of culture. We may never live in a world without violence, but we can live in a society that doesn’t worship killing and glorify killers. We can. Some day.

But we don’t have to wait for that day before we save lives. We can start tomorrow.  Most obviously:

We can regulate guns.
We can provide better mental health care.

These steps would vastly reduce the mass killings that happen in those places where people say, “we never thought it could happen here.” To middle-class folks. To white folks. In malls and offices and kindergartens. Good policy ideas abound, ideas that could immediately reduce the number and scope of these kinds of mass murders. All we need is the political courage and leadership to make it happen. These simple steps really shouldn’t be that hard.

Most murders, of course, don’t happen in those places. Most killings happen in ghettoes of violence, in the places where people know all too well that it can happen here, because it does, relentlessly. The chronic violence of these forgotten places, places where the poor and black and brown live, is a product of our particular history too, the result of our uniquely American combination of racism and neglect and greed. We can and must root out the violence in these places, through policies that provide education and jobs, hope and opportunity, free from the shackles of systemic poverty and racial injustice.

This is hard, never-ending work. But we must start, now, if we want “not again” to mean anything more than an empty cry.

I am a member of the faculty at UC Davis, and my campus has been roiled these past few days by violent actions taken by campus police against non-violently protesting students. The video of campus police pepper-spraying students has been seen ’round the world, and letters of protest, outrage, and demands have been blowing up my email inbox all weekend.

On Sunday, just as I was catching up on Friday’s events, I was also set to present at a panel at the American Academy of Religion Conference in San Francisco.  The panel was convened by the good people behind Freq.uenci.es, a “collaborative genealogy of spirituality” and a website to which I’ve contributed.  I prepared some remarks, but I felt impelled to preface them by saying how weird it felt to talk about spirituality when life on my campus — for my students, colleagues, and friends — had changed irrevocably.  I opened with some comments about feeling that tension, and then went on with the show.  I explained that I was having a hard time reconciling these two things, but that I was okay living with that unreconciled tension.

This morning, on my way to participate in the General Assembly at UC Davis, a friend told me about another video.  A second video – not of police and protesters, but of the Chancellor, Linda Katehi, being walked to her car following an on-campus meeting Saturday night.  In the video, you see Katehi walking down a road, flanked by hundreds of silent, standing students, making their protest as loudly as possible by not saying a thing.

All you hear are her heels click-clacking on the asphalt.

That video, not the one of the cops, speaks to the power of these students and this community.  And the sound of her heels amidst the crowd of hundreds of students expresses perfectly the potential in this moment.  Potential that is political and, dare I say, spiritual too.

The most recent data released by the U.S. Bureau of Justice Statistics (2008) disclosed that 11.7 million persons in the United States experienced at least one attempted or successful incident of identity theft during a two-year period.  The number of incidents increased 22% from 2007 to 2008.  For the past decade it has been a federal crime to “knowingly transfer, possess, or use, without lawful authority, a means of identification of another person.”  A steady stream of news stories reminds us of the dangers of identity theft and identity impersonation—from credit card and social security numbers stolen, to bank accounts hacked into, to child predators trolling social networking sites under false identities.

Our legal system and news media have sent us a clear message about identity theft: it is criminal.  It is unacceptable.  It is a menace to society.  You need to protect yourself, vigilantly, against it.  But in the imaginary world of popular culture, the ethics of identity theft is, well, far less rigid.  In the fictional stories we tell ourselves about identity fraud, the act itself can actually be a justified means to a moral end.  In the past year alone, three films have been released with plots that are propelled by characters who assume false identities, identities inhabited “knowingly” and without permission.  In the sci-fi film Source Code (2011), a soldier is transposed into the body of an everyday man who is riding a train that will blow up in eight minutes; he is occupying the man’s body in order to try to find the bomber on the train.  In Unknown (2011), a doctor awakens after a car accident to find that another man has assumed his identity, is married to his wife, and has his career (the other man’s photograph even shows up when the doctor’s name is searched on the internet).  Finally, the documentary Catfish (2010) explores the world of Facebook and the ways in which users perform their identities for people they’ve never met in real life.  I don’t want to give away too much more about any of the three movies, as each contains various twists and turns, but suffice it to say that identity cloning and identity manipulation are themes that run through all of them.

And here’s what strikes me about this cultural content and its broader context: in each case, the ethics of assuming someone else’s identity is not exactly black and white. In fact, in the moral logic of Source Code, posing as another person is wholly justified if it can help save the lives of others.  In Unknown, it turns out that identity fraud is ultimately a redeeming act for one of the characters.  And Catfish challenges the audience to reflect on whether the good works that can result in the process justify questionable identity performance.  In short, if another person’s identity is assumed for reasons other than criminal purposes and/or economic gain, then it may just be okay.

It’s not just these three movies that explore the ethical gray area of identity theft.  We can go back farther into the decade to find others.  As readers of the Harry Potter series know, the magical Polyjuice Potion transforms the imbiber into another person, at least in outward appearance.  While it is used for evil purposes in one of the books, it is also used by Harry and his friends Hermione and Ron throughout the series to obtain valuable information and pursue dangerous missions.  Their temporary acts of identity theft are deemed morally defensible.  Another example: the hero protagonist of the popular 2002 film Catch Me If You Can is a clever identity imposter—based on a real-life criminal—whom the audience finds itself rooting for.  And in the TV show Mad Men (spoiler warning…), the lead character is not actually the “real” Don Draper, but is a Korean War vet named Dick Whitman who has assumed the identity of an officer killed in the war—the real Don Draper.

Of course, the theme and plot device of false identity is not a new preoccupation of literature and cinema.  Mark Twain explored it in Pudd’nhead Wilson, Herman Melville in “Benito Cereno,” Vladimir Nabokov in his dark novel Despair.  Older movies, like The Day of the Jackal, the film version of the play Six Degrees of Separation, and the science fiction story Gattaca, all revolve around identity fraud.  But in the world we live in today, where online identity theft is a rapidly growing criminal phenomenon, I find these contemporary iterations of the theme even more noteworthy.  As the performance of our identity becomes more fluid, unstable—indeed more unconfirmable—in this online age of social networking, it seems that our sense of the morality of identity performance has become more muddied and culturally contested.  The law and the news media are telling us one story about the ethics of identity theft and manipulation; our popular culture is telling us another; and, I daresay, what we post on our own Facebook and Twitter accounts is telling us a third.

1. Gather your friend(s) together.
2. Listen to a new episode of This American Life.
3. At the end of the episode, as Ira runs down the credits, each person guesses which clip will be used to mock Torey Malatia.
4. The person who guesses correctly gets to drink.
5. Repeat.

How did the humanities lose out to neuroscience in making culture seem interesting?

I’ve been listening to a lot of NPR and podcasts lately. I’ve given my historical favorites a little break (sorry, This American Life and Sound Opinions), and I’ve been listening more and more to Planet Money and Radiolab (as podcasts), and to the usual NPR fare that airs while I do dishes or cook dinner: All Things Considered, Marketplace, and of course: Fresh Air.

What I’ve noticed is how often scientists and economists show up on these shows to talk about things I thought were the main interests of humanists and social scientists. Questions like how restaurants work, whether or not race matters, why certain songs get stuck in our heads, how people calculate the value of things or make decisions they know are not in their best interests, and so on.

These are the questions to which I have long sought answers by looking at culture and its various expressions, and in which my field of American Studies has long been interested (albeit in different forms, over time).

Yet somehow, every time I turn on the radio, I find one or another neuroscientist (or, often enough, Jonah Lehrer) talking about precisely these same questions, and about how the pathways of neurons and synapses can answer questions about art or love or whatever.

So here’s my question to my colleagues: how did we become so untouchable or so uninteresting to mainstream media? How come the good people at NPR (and, presumably, their listeners) find neuroscientists and economists more interesting and more capable of talking about these questions than we are? How did they become the go-to academics for understanding how and why people do what they do? Social scientists and humanists look at those phenomena, too, but somehow, we have become generally less interesting than our colleagues.

This is not the neuroscientists’ fault: they are good at what they do, and their creativity in asking profound questions that teeter on the line between culture and biology ought to be encouraged. Similarly, it’s not the fault of the radio programmers; they are looking for the most intelligent, engaging guests they can find. And they’re finding them in neuroscience and economics, not in the Humanities.

Why is everyone else talking about culture but us? Are we that boring? Have we grown so adept at gazing at our own navels that we can’t talk about other things? Does “the public” think that so-called “hard” science is really the only arbiter of actualities in the world?

How have we become so irrelevant even on topics that are ostensibly our areas of expertise and scholarly interest?
