
Archive for the ‘books’ Category

The great irony of The Social Network, of course, is that its central theme is not connectivity but disconnection.  A film about the genesis of a technology designed to bring people closer together features a main character who experiences the painful dissolution of social bonds. The plot is driven by protagonist Mark Zuckerberg’s exclusion from social groups, the end of his romantic relationship, and the collapse of his close friendship.  This sense of disconnection is pervasive despite the fact that The Social Network is a crowded movie: characters are almost never seen alone (except when individually “wired in” to their computer screens), and certainly never seen enjoying a moment of quiet solitude. Instead, the characters are regularly packed together in small places—a legal office, a dorm room—or in big, loud, impersonal places—a nightclub, a drunken party. But these spaces, despite their capacities, are repeatedly portrayed as lonely and alienating.

While the characters may inhabit a decidedly unsocial non-network beneath a façade of constant social interaction, the film itself serves as a remarkably vibrant cultural network. For the student of American culture, The Social Network is a fountainhead of intertextuality.  Perhaps appropriately for a film about Facebook, The Social Network functions as a busy crossroads of cultural referents, many readily recognizable, others unstated but nevertheless present.  The movie obviously plays on our familiarity with Facebook, but it also features appearances by Bill Gates and the creator of Napster (both portrayed by actors), a musical score by industrial rock luminary Trent Reznor of Nine Inch Nails, and a Harvard setting (even though campus scenes were mostly filmed at The Johns Hopkins University).  It is also directed by David “Fight Club” Fincher and written by Aaron “West Wing” Sorkin.  One of the students on Zuckerberg’s Facebook team is played by Joseph Mazzello, the actor who starred as the little boy Tim Murphy in Jurassic Park.  In other words, what is really “social” about The Social Network is the way in which it serves as a pulsating intersection for a range of icons, myths, and expressive forms that circulate in the audience’s collective imagination.  It is populated by cultural detritus and ephemera with which we are friendly, if you will.

I imagine these multiple and varied cultural associations may in part explain the source of the film’s pleasure for viewers. The experience of viewing The Social Network is akin to data-mining.  It rewards a 21st century audience accustomed to scanning mounds of digital information and quickly categorizing that information into familiar frames of reference. For example, the brief appearance of Bill Gates evokes our Horatio Alger myth of success and the American dream.  The presence of Sean Parker of Napster infamy conjures associations with the lone rebel, hacking the system and sticking it to the man.  And Zuckerberg himself comes across as a nerd pulling one over on the Olympic jocks.

Reznor’s musical score triggers memories of his earlier work on such Nine Inch Nails albums as Pretty Hate Machine (1989).  That record opens with the line, “god money I’ll do anything for you/god money just tell me what you want me to,” and builds to the chorus, “head like a hole/black as your soul/I’d rather die/than give you control.” Pretty Hate Machine, with its loud synthesizers, drum machines, and vocal wails, is not unlike The Social Network: an expression of male adolescent angst and rage confined inside an electronic world.

And there are still other resonances: Fincher’s direction reminds us of his previous explorations of masculinity and antisocial behavior in Fight Club, The Game, and Zodiac.  Sorkin’s dialogue echoes the brainy loquaciousness of the political characters he developed for the television show The West Wing. Nearly twenty years ago, in Jurassic Park, actor Mazzello played a child victimized by a technology gone awry.

As I watched The Social Network, I even found correspondences with The Lonely Crowd (1950), the sociological study of “other-directed” social character that became a bestseller in postwar America.  Written by David Riesman with Nathan Glazer and Reuel Denney, the book argues that Americans are increasingly motivated by the need for peer acceptance.  More and more, our “inner gyroscope” is set in motion not by individualistic self-reliance, but by the drive to win approval and fit in.  Consequently, our time is spent trying to decode what is popular and adjust our personalities accordingly: “The other-directed person acquires an intense interest in the ephemeral tastes of ‘the others’… the other-directed child is concerned with learning from these interchanges whether his radar equipment is in proper order” (74). What is The Social Network if not a depiction of a lonely crowd?  Indeed, isn’t Facebook itself a kind of lonely crowd?

I can’t help but wonder if this way of reading the movie—this pleasurable scanning of its cultural allusions—in some way works to conceal its underlying critique of 21st century connectivity. The film’s central theme of social dissolution is undercut by its teeming network of familiar, friendly signifiers.  Its “ephemeral tastes.”  Yet we are “friends” with these signifiers inasmuch as we are “friends” with hundreds of people on Facebook—another text, by the way, that we scan with the same data-mining mindset.  As portrayed in The Social Network, Facebook doesn’t really bring people closer together in any meaningful way; it facilitates casual hook-ups in bar bathrooms, or it breeds addiction, as one character confesses, or it quickly delivers trivial information like the results of a crew race. You would think The Social Network, with its depiction of Zuckerberg and his creation, would compel a mass exodus from Facebook, or at least spark critical public reflection on our complicity with this technology of the lonely crowd. But instead it rewards and reinforces our ways of reading: our ingrained habits of consuming images, scanning information, reducing human experience to pannable nuggets of gold.



I have two bookish confessions to make.  The first is that I keep a running list of every book I have read since I graduated from college some twenty years ago.  The list includes the author and title as well as the month and year I finished each book.  The list includes books I read for the classes I teach as well as books I read for “pleasure.”  The list includes unabridged audio books, or at least it would if I had ever actually finished listening to one in its entirety.

I keep the list to remember and to mark my progress—each year I try to read at least one more book than I read the previous year.  The list also helps me recall not just the titles, but the cities in which I lived and the specific places in which I read those books: at a coffee shop in Austin, on the beach on vacation in Puerto Rico, in my grandparents’ house one winter in Vermont, on the couch in my small apartment in New Hampshire, in a summer sublet in Boston.  The list generates strong associations with the past—with the seasons, the weather, the people around me, the fleeting episodes of happiness, confusion, sadness in my life.  The list, in other words, is more than a tally of books.  Each book anchors me in a moment.

My second bookish confession is that each year, after Thanksgiving, I begin the pleasantly agonizing process of selecting the books I will take home with me for the holidays. I stack, I list, I revise lists, I restack.  I try to figure out which books I want to relish during my December visit to family in New Jersey.  It is, admittedly, quite the process.  Almost always it must be a novel.  Something I have wanted to read for a while.  Something that will offer satisfaction and escape, and not too much challenge.  Something I will look forward to reading after my parents have gone to sleep (which happens earlier and earlier each year), or before my nephews rush into my bedroom to wake me up the next morning (which, thankfully, happens later and later each year).  I usually start with about fifteen books and taper it down to two or three by the time I pack my suitcase.

This holiday season, I took a moment to revisit the books I read during past December visits.  Ever the connector of dots, I found some curious patterns.  Unbeknownst to me until now, the books I read over the holidays tend to fall into four sometimes overlapping categories: from mom’s bookshelf, work-inspired, political, and ambitiously literary.

Apparently on several visits home, I abandoned my carefully chosen books and instead selected one from my mother’s bookshelf in her office.  My mother likes buying used books at public library sales, and she’s amassed an impressive collection over the years. I pulled Hard Times from the shelf in 1994, A Clockwork Orange the year before (why did my mother have this one?).

When I taught in an Education department at a college in North Carolina a few years ago, it seems I was inspired to read school stories: Old School, the excellent prep school novel by Tobias Wolff, followed the next winter by the classic Goodbye, Mr. Chips.

Election seasons punctuate my list: Howard Dean’s Winning Back America in December 2003, Barack Obama’s The Audacity of Hope in December 2006.

The rest of the books fall into the ambitiously literary category, books I must have picked because I was on some mission to expand my literary horizons over the holidays.  One Christmas I read Kafka’s The Castle.  Another year I worked through Nabokov’s Pale Fire (so much for the “not too challenging” criterion) and Kurt Vonnegut’s Cat’s Cradle.  In 1997 I read a biography of Zora Neale Hurston.  In 2008, it was Toni Morrison’s A Mercy.  Raymond Carver’s Cathedral and George Eliot’s Silas Marner were my Yuletide pleasures in 1996.

What these four categories have in common is setting: my parents’ house, the place I left at age 18 and return to every Christmas.  These are the books I have read in my childhood bed, now long donated to one of my sisters.  In the guest bed in the basement.  On the couch in front of the fireplace in the living room.  In the oversized recliner in the den.  Each holiday book brings back the smells and tastes of home cooking, family voices, a glimpse of snowflakes outside the window, the barking of my parents’ dog.  This may be why I select the books with such care: they will absorb these memories.  Preserve them.  These books will carry a little bit of family history with them.  They will be so much more than an author and a title on a list.

Which reminds me—I still need to whittle down my stack for next week.  Happy holidays, readers.


What do Philip Roth’s polio novel Nemesis and Justin Cronin’s vampire novel The Passage, both published in 2010, have in common?  Quite a bit, actually.  Roth’s novel renders the atmosphere of fear surrounding a polio outbreak in Newark, New Jersey in the summer of 1944.  The protagonist is a young playground director futilely trying to protect the children he supervises from contracting polio.  Cronin’s novel imagines a world ninety years in the future that has been overrun by killer vampiric “virals” as the result of a scientific experiment gone awry.  It focuses on a small colony of people struggling to defend themselves against the virals.  Both books deal with the pervasive threat of an invisible, poorly understood contagion that has no known cure.  Both focus on adult attempts to shelter children from the contagion.  In both, characters wrestle with existential questions and religious doubt—why would God inflict polio on children?  Do virals have souls?  Both books demonstrate how natural time changes—spring to summer, day to night—can heighten the threat of contagion and force humans to change their routines and behavior.  Both show how suspicion and fear can cause communities to turn against one another.  As cultural artifacts of the moment, both novels also resonate powerfully with contemporary anxieties surrounding contagion in our everyday lives.

In recent years, Americans have been alerted to the threats posed by multiple toxins and diseases: salmonella in eggs, E. coli in spinach and peanut butter, cadmium in Shrek glasses sold at McDonald’s and jewelry sold at Wal-Mart, H1N1, West Nile virus.  Such warnings have come against the backdrop of broader post-9/11 fears about biological terrorism. The current wave of viral panic is of course the most recent chapter in a very long history of epidemics—Ebola in the 1990s, mad cow disease in the 1980s, polio in the 1930s and 1940s, the Black Death in the 14th century, and so on.

There is a cultural dimension to this biological history—a cultural history of contagion.  Nemesis and The Passage are the latest examples of artistic expressions, literary and otherwise, that in some way reflect, reimagine, or comment upon viral panics.  Think Daniel Defoe’s A Journal of the Plague Year (1722), Thomas Mann’s Death in Venice (1912), Albert Camus’s The Plague (1947) (literary critic Elaine Showalter recently made a brilliant connection between Nemesis and The Plague), and Stephen King’s The Stand (1978/1990).  Think nonfiction books like The Hot Zone (1994).  Or films like Panic in the Streets (1950), The Omega Man (1971), and Outbreak (1995).

As artifacts of the 21st century culture of viral panic, unlikely bedfellows Nemesis and The Passage join a diverse cultural cohort that includes the comic book series Y: The Last Man, about a plague that kills every mammal on earth with a Y chromosome; the films 28 Days Later and 28 Weeks Later, featuring the “rage” virus; the horror movie Cabin Fever, starring a flesh-eating virus (tagline: “Catch It”); Max Brooks’s book World War Z, an “oral history of the zombie wars”; and the forthcoming young adult novel Virals, about a group of teenagers exposed to canine parvovirus.

These contemporary examples of the culture of viral panic offer audiences ways to process a number of fears and anxieties that circumscribe our everyday lives in the 21st century.  Anxieties about the limitations and abuses of science and medicine.  Anxieties about our ability to protect children and youth from invisible menaces.  Anxieties about community cohesiveness and civility.  Anxieties about the government’s ability to respond to disaster.  In other words, the culture of viral panic is never just about the threat of contagion.  It always speaks to broader concerns about institutions, social relations, childrearing, science, spirituality.  Read in this context, Nemesis and The Passage have a great deal in common.


I, for one, am thrilled that “bromance” has been officially added to the newest edition of the Oxford English Dictionary.  Bromance: “a close but nonsexual relationship between two men.”  The word received its 2010 OED stamp of approval along with “chillax,” “frenemy,” “exit strategy,” and “defriend,” among others.  Now that bromance is part of our venerated lexicon, we can wield it with authority, apply it smartly, inject it into our elite discourse.  No longer must we wave the word around with a wink and a nudge, smugly proving our pop culture literacy and familiarity with the film “I Love You, Man.”  Let the studious scholarship on bromance—Bromance Studies—begin in our literary and cultural circles.  Bromance has been given the green light, man.

I imagine that literary critic Leslie Fiedler (1917-2003) would approve of this development, whispering “I told you so” from beyond the grave.  Fiedler rocked the literary world with his 1948 Partisan Review essay, “Come Back to the Raft Ag’in, Huck Honey!” in which he pointed out that a dominant archetype in American literature is the homoerotic love affair between two men who light out for the territory in order to escape civilization’s responsibilities and constrictions (and its women).  Moreover, as Fiedler observed—even more controversially—these bromantic pairings tend to involve a young white man and a man of color: Huck Finn and Jim, Ishmael and Queequeg, Natty Bumppo and Chingachgook.  These “mythic” relationships are always characterized by the “pure love of man and man,” offering the reader a model of “chaste male love as the ultimate emotional experience.”  The love is of course always innocent; there “lies between the lovers no naked sword but a childlike ignorance, as if the possibility of a fall to the carnal had not yet been discovered.”  Bromance!  The OED should cite Fiedler.

Fiedler elaborated on his argument in his 1960 book Love and Death in the American Novel. I propose an updated edition of this book be released with a new title: Bromance and Death in the American Novel.  With an introduction by Judd Apatow.  I suspect this would make Fiedler’s argument more legible than ever to the reading public.  And to dudes, especially.

Bromance can now be applied without shame to literary studies of other male pairings, even to those men who do not abandon civilization for the wilderness.  Take the bromance between Nick Carraway and Jay Gatsby.  Or Joe Starrett and Shane.  Or Sal Paradise and Dean Moriarty.  At long last we have the word to describe these complex character relationships.

And why stop with bromance?  I believe several other new additions to the OED would prove quite useful to literary critics.  Ahab just needed to chillax.  Hemingway’s Old Man needed an exit strategy when the fish proved too much for him.  Hester Prynne should have argued that she merely had a wardrobe malfunction.  Portnoy overthinks sex.  Biff Loman couldn’t take hold in Death of a Salesman because maturity was too much of a buzzkill for him (and his dad needed better soft skills).  Faulkner’s Compson family should have defriended one another.  Were Gene and Finny friends or frenemies in A Separate Peace? How did Silas Lapham’s toxic debt affect his moral compass in Howells’s novel?  How would Sonny’s blues have sounded different in James Baldwin’s short story if he had played a vuvuzela?

The possibilities abound.  Book critics would do well not to be fussbudgets or haters and follow the OED’s lead.  Time to update the literary glossary.  As for me, I’m off to take a chill pill and eat some turducken while I reread that prescient cool hunter, Leslie Fiedler.


During my recent hiking and camping trip to Utah, I re-read Desert Solitaire, Edward Abbey’s eloquent and ornery work of nature writing.  First published in 1968, the book recounts the author’s three-season stint as a park ranger in Arches National Park in the late 1950s, where he was stationed in a house trailer twenty miles from the nearest human being.  Desert Solitaire is many things at once: a Whitmanesque inventory of the desert’s ecosystem, a poignant contemplation on the vagaries of solitude, a mystic’s philosophy of symbiosis.  The narrative flows seamlessly from poetic geological catalogs—“chalcedony, carnelian, jasper, chrysoprase, and agate”—to existential meditations on those moments when “solitaire becomes solitary,” when “alone-ness became loneliness.”  Desert Solitaire is also a rebel’s complaint about human disregard for nature.  Abbey rails against exploitation of the natural environment by the oil, mining, and tourist industries; in his preface, Abbey warns the reader, “this is not a travel guide but an elegy.  A memorial.  You’re holding a tombstone in your hands.”  The author similarly takes jabs at the unserious tourists he met as a park ranger, including one who asked, “Where’s the coke machine?” and another who suggested that spotlights be shined on the rock arches to make their appearance more dramatic.  Ultimately, the book offers both a close-up view of nature and a wide-angle perspective on one man’s place in the universe.

Rambling about Zion and Bryce Canyon and Arches by day and revisiting Abbey by night, I found that many of his observations and a few of his plaints still resonated forty years after the book was published.  It was easy, for example, to sigh along with his chapter titled “Polemic: Industrial Tourism and the National Parks.”  Here, Abbey unloads on automobile culture (specifically the endless construction of new highways) and the rising popularity of family vacations (courtesy of Disneyland and the station wagon), two postwar manias that together have wreaked havoc on the national park system.  Abbey proposes, perhaps too idealistically, that all cars be banned from parks in order to protect the environment and enhance the experience of visitors.  Such a move, Abbey argues, would elevate parks to the esteemed position they should rightly hold in our culture:

“No more cars in national parks… We have agreed not to drive our automobiles into cathedrals, concert halls, art museums, legislative assemblies, private bedrooms and the other sanctums of our culture; we should treat our national parks with the same deference, for they, too, are holy places.  An increasingly pagan and hedonistic people (thank God!), we are learning finally that the forests and mountains and desert canyons are holier than our churches.  Therefore, let us behave accordingly.”

As I re-read this chapter, I found myself less taken with Abbey’s lament about cars—today, after all, some parks like Zion ban private vehicular traffic and run propane shuttle buses to cart visitors around from trailhead to vista—and I was instead struck by his point about treating parks with deference.  Have we, in the forty-two years since Abbey published Desert Solitaire, come any closer to viewing parks as sacred places?  Or do we still tread less lightly, using parks rather like playgrounds, like vast, government-protected jungle gyms?  How many of us come to national parks seeking sanctuary, hoping for some kind of a sacred experience?  Is that even our default cultural mentality?  Or do we tend to view our outdoor spaces as a series of scenic overlooks ready-made for digital snapshots (the park as nude model, posing for us, a stranger to us, there solely to be gazed upon, salivated over)?  I can’t tell you how many people I saw rush off the buses in Zion and Bryce, camera in hand, who would speed-walk over to a panorama, look at said panorama only through their camera lens, pause not a whit, snap, snap, snap, and then race back to the shuttle in order to take in another “sight” elsewhere in the park.  Per Abbey, I wondered—would we treat our churches, museums, and concert halls the same way?

Abbey says we need nature to “startle the senses and surprise the mind out of their ruts of habit, to compel us into a reawakened awareness of the wonderful—that which is full of wonder.”  It can sometimes be difficult to find the space and the quiet to surprise the mind and find the wonder, especially in the busy month of August, but I believe nature yet holds an esteemed place for many of us.  And at the end of the day, we get out of parks what we put into our visits.  Personally, some of the most transcendent experiences in my life have occurred out-of-doors, in nature.  Not necessarily life changing, but awe-inspiring nonetheless.  I’ll always remember, for example, wading barefoot into icy Lake Ediza in the Sierras.  Looking up at the endless night sky in Big Bend.  Listening to the wind whip around outside my tent on Mt. Washington.  Touching the White Cliffs as my canoe glided past them on the Missouri River in Montana.  Even when visiting elbow-to-elbow, car-to-car Yellowstone, I was overwhelmed by the sheer power and grace of the falls of the Yellowstone River.  Parks can be our sanctums as long as we choose to experience them as such.

Despite Abbey’s sporadic grumpiness and desert rat elitism, Desert Solitaire remains a classic work of nature writing, one that narrates the wonders of the desert as brilliantly as John Muir relates the mysteries of the mountains.  Like the best nature writers, Abbey articulates for us the relationship between wilderness and civilization: In nature, away from our cultural infrastructure, we are reminded that “out there is a different world, older and greater and deeper by far than ours, a world which surrounds and sustains the little world of men as sea and sky surround and sustain a ship. The shock of the real.  For a little while we are again able to see, as the child sees, a world of marvels.”


(photo: Modesto Bee)


FAMOUS QUOTE

Double Rainbow Guy:

“Whoa, that’s a full rainbow.  All the way.  Double Rainbow.  All the way.  It’s a double rainbow all the way.  Whoa.  So intense.  Whoa.  Man.  Whoa.  Whoa.  Whoa!!! My God!!!  Oh my God!  Oh my God!  Woooo!  Oh Wow!  Woooo!  Yeah!!!  Oh my, oh my, oh my God look at that.  It’s starting to look like a triple rainbow.  Oh my God.  It’s full.  Double rainbow all the way across the sky.  Oh my God.  (audible sobs).  Oh God.  What does this mean?  Oh my God.  It’s so bright.  Oh my God it’s so bright and vivid.  Oh.  Oh.  Oh.  It’s so beautiful.  (audible sobs).  Oh my God.  Oh my God.  Oh my God.  Double complete rainbow.  Right in my front yard.  Ha, ha, ha, ha, ha!!!  Oh my God.  What does this mean?  Tell me!  It’s too much.  I don’t know what it means.  Oh my God.  It’s so intense.  (big sigh).  Oh my God.”

–3-minute YouTube video, recorded January 2010

Henry David Thoreau:

“I stand in awe of my body, this matter to which I am bound has become so strange to me.  I fear not spirits, ghosts, of which I am one… but I fear bodies, I tremble to meet them.  What is this Titan that has possession of me?  Talk of mysteries!—Think of our life in nature,—daily to be shown matter, to come into contact with it,—rocks, trees, wind on our cheeks!  The solid earth!  The actual world!  The common sense!  Contact!  Contact!  Who are we?  Where are we?”

–“Ktaadn and the Maine Woods,” 1848, from The Maine Woods

VITAL STATISTICS

DRG: Organic farmer; divorced; two children; currently has girlfriend; 265 pounds (was 465)

HDT: Transcendentalist; proposed marriage to Ellen Sewall in 1840 but was rejected; never dated again; generally considered homely

FORMER CAREER

DRG: Cagefighter, dog breeder, firefighter, long-haul truck driver

HDT: Schoolteacher, pencil manufacturer

NICKNAMES

DRG: Born Paul Vasquez, known as Hungry Bear and Yosemite Bear

HDT: Born David Henry Thoreau; went by Henry David after college

CULTURAL INFLUENCES

DRG: Participates in Native American ceremonies

HDT: Read East Asian philosophy

FACIAL HAIR?

DRG: Yes

HDT: For many years; thought women found it attractive

PRIMARY GEOGRAPHIC LOCATION

DRG: Yosemite National Park

HDT: New England

UNDER THE INFLUENCE?

DRG: For this video, no, but admittedly yes for others he has posted

HDT: “I would fain keep sober always… who does not prefer to be intoxicated by the air he breathes?”

FRIENDS IN HIGH PLACES

DRG: Jimmy Kimmel

HDT: Ralph Waldo Emerson

MEANS OF SELF-REFLECTION

DRG: 250 videos uploaded to YouTube documenting his experiences near Yosemite

HDT: 40 journals covering twenty-four years of his life, running nearly two million words

POPULARITY

DRG: Over 3 million views on YouTube

HDT: A Week on the Concord and Merrimack Rivers (1849): 100 sold, 75 given away; Walden (1854): 2,000 copies sold over the course of five years; The Maine Woods (1864), published posthumously

FAMOUS MORAL STANCE

DRG: Purist who will not accept ad revenue for YouTube video: “I’m not opposed to making money, I just don’t want to degrade it or disrespect it.”

HDT: Abolitionist who would not pay poll tax.  Did not recognize authority of a government “which buys and sells men, women, and children, like cattle at the door of its senate-house.”

ON WHY HE HAS SUBLIME EXPERIENCES

DRG: “It’s not about me.  I’ve always told people that the universe flows through me.  I’m just a vessel.”

HDT: “Sometimes, when I compare myself with other men, it seems as if I were more favored by the gods than they.”

LEGACY

DRG: Has spawned numerous mash-ups on YouTube, including “The Autotune Double Rainbow Song” and “Double Rainbow Connection Remix” featuring Kermit the Frog

HDT: His writings influenced Gandhi, Martin Luther King, Jr., John F. Kennedy, B.F. Skinner, Willa Cather, Edward Abbey, Frank Lloyd Wright, and many others.

Happy Birthday, David Henry, born July 12, 1817.


In his essay “The End of Solitude,” recently published in The Chronicle of Higher Education, William Deresiewicz argues that we are undergoing a historical shift in the social significance of solitude.  Once viewed as a necessary means to better connect with God, or with Nature, or with the fraught Self, solitude has now become synonymous with loneliness in our contemporary culture.  Deresiewicz makes the case that those of the web generation, particularly young people today, are increasingly unable to be alone without the fear of feeling lonely or, perhaps worse, feeling bored.  Technology has weakened our concentration, our privacy, and, most significantly, our propensity for solitude.  According to Deresiewicz, our sense of self today is enhanced not by solitary communion but by communal validation: our worth is measured by numbers of Facebook friends and Twitter followers; our identity is affirmed only through constant public visibility and the approval of others.

Agree or disagree with Deresiewicz, his argument is undeniably provocative, as it forces us to at least contemplate—either alone or in the company of others—the role of solitude in our own lives.  His essay certainly got me thinking about solitude.  Solitude is something I have always valued.  I have always made a space for solitude in my life.  But why?  Where did I learn to value solitude?  Books played a role, this much I know.  Books I read when I was younger modeled the idea of standing alone without being lonely.  From books I learned not to fear solitude.  From books I learned the difference between loneliness and solitude.  And the first book to teach me this was a novel published in 1959 called My Side of the Mountain, by Jean George.

George’s book tells the tale of a teenage boy who leaves home in New York City to live on his family’s abandoned farmland in the Catskills.  At the book’s outset, Sam Gribley tells his father he is running away to live in the woods, and his father casually replies, “Sure, go try it.  Every boy should try it” (probably not something parents would say in 2010, but this was the fifties).  He leaves with only a penknife, a ball of cord, an ax, a flint and steel, and $40.  After hitching rides and arriving in the Catskills, Sam finds an isolated part of a mountain where he hollows out a tree and calls it his new home.  During the year he spends in the woods, Sam learns how to hunt, fish, whittle, and gather berries and other food.  Some of these skills he learns on his own, some with the help of books from the library in the nearest town, which he occasionally wanders in to visit.  George’s book includes illustrations teaching the reader such things as how to build shelters, make willow flutes, set snares, and identify wild onions.

Reading this book as a—what, ten or eleven-year-old?—I was entranced.  Sam was not scared, not lonely, not in danger.  He was communing with animals, with the changing seasons, with the challenging weather, with the complex relationships he observed all around him in nature.  During the course of his self-imposed walkabout, Sam embraced solitude as both a rite of passage and an antidote to city life.

My Side of the Mountain both romanticized and validated solitude for me.  Its lingering impression was reinforced by other books I read as I grew older.  Thoreau’s Walden.  Ralph Waldo Emerson’s essays.  Herman Hesse’s Siddhartha. These were my literary models of solitude, models that showed me what solitude looked like, showed me what it could produce, showed me why it was important.  And there were others: Barry Lopez’s reflections on the revitalizing power of being alone in the wilderness in Crossing Open Ground.  Elise Boulding’s beautiful essay on the importance of solitude in the life of a child from her book One Small Plot of Heaven.  Annie Dillard’s solitary meditations on God and Nature in Pilgrim at Tinker Creek.  Diana Michener’s “Catching the Sun,” a refreshingly honest essay on the complicated relationship between work, art, solitude, and the self.  And Leaves of Grass, Whitman’s teeming, breathing, vibrant love letter to America that was, to me, so clearly the product of his own solitary reflection.

At the same time, literature contrasted these images of solitude with images of loneliness, helping me understand the difference between the two.  There was the narrator of Ralph Ellison’s Invisible Man, eating vanilla ice cream drenched in sloe gin and listening to Louis Armstrong’s “Black and Blue” in his underground hole, isolated and terribly lonely.  In John Irving’s The Cider House Rules, there was Dr. Wilbur Larch, ether addict and secret abortionist, inhaling the anesthetic at the end of the day in his office, revisiting past anguish, doing good works in his orphanage yet so heartbreakingly alone.  In novels by J.M. Coetzee and Milan Kundera and Philip Roth, I found protagonists who claimed to be living voluntarily in solitude, but their loneliness was always palpable to me.  Reading James Dickey’s poem “The Hospital Window,” with its image of a dying father waving back to his departing son, I was haunted not by the depiction of the old man strangely grinning from his window, but by the looming loneliness trailing his grown child in the street below.

If we have indeed lost sight of the importance of solitude today, perhaps literature can remind us why we must make space for it.  If we have come to conflate solitude with loneliness, then perhaps literature can clarify the difference for us.  Deresiewicz concludes his essay by asserting, “Those who would find solitude must not be afraid to stand alone.” For me, books have always suggested ways to stand alone without being lonely.  Deresiewicz’s essay prompted me to reflect on this truth, in solitude.

Read Full Post »

Nearly 100 years ago, baseball impresario Albert G. Spalding published his sprawling, 500-plus-page book, America’s National Game.  This 1911 tome, replete with illustrations by political cartoonist Homer Davenport, is one part history, one part autobiographical recollection, and many parts unabashed celebration of Spalding’s own contributions to the development of the game.  Spalding (1850-1915) was a former pitcher for the Chicago White Stockings who subsequently became the team’s president as well as a magnate of the sporting goods industry.  His book traces the long history of baseball, starting with early games of ball played in ancient Greece and Rome, through the supposed invention of baseball’s modern rules by Abner Doubleday in Cooperstown, NY in 1839 (a creation myth that historians have long since debunked), through the creation of the National League in the 1870s.  Throughout America’s National Game, Spalding stresses the factors that make baseball a distinctively American sport.  To wit:

“I claim that Base Ball owes its prestige as our National Game to the fact that as no other form of sport it is the exponent of American Courage, Confidence, Combativeness; American Dash, Discipline, Determination; American Energy, Eagerness, Enthusiasm; American Pluck, Persistency, Performance; American Spirit, Sagacity, Success; American Vim, Vigor, Virility.  Base Ball is the American Game par excellence because its playing demands Brain and Brawn, and American manhood supplies these ingredients in quantity sufficient to spread over the entire continent.”

Spalding’s love of adjectives and alliteration aside, another overt agenda of America’s National Game is to promote the idea that baseball can facilitate the spread of U.S. empire and cultural influence.  Early in the book, Spalding declares, “baseball follows the flag.”  As evidence of its Americanizing—and allegedly civilizing—potential, Spalding proudly cites the establishment of the sport in recently acquired American colonies such as Hawaii, Puerto Rico, Cuba, and the Philippines.  “Wherever a ship flying the Stars and Stripes finds anchorage today,” writes Spalding, conveniently erasing the violence of imperialism, “somewhere on nearby shore the American National Game is in progress.”  Spalding himself organized a baseball world tour in 1888-89, sending two teams to Europe, Egypt, and Australia in the hopes of spreading not just the game but American ideals as well.  Apparently, American cultural insensitivity was also on display, as Spalding recounts how in Egypt, U.S. ball players used one of the Great Pyramids for a backstop and photographed themselves atop the Sphinx, much to the chagrin of locals.

Spalding’s unapologetically imperialist book reminds us of the long and sometimes forgotten history of baseball’s relationship to war and globalization.  Another chapter in this long history is currently being written in Iraq.  Spalding’s century-old claims about baseball’s universal appeal are being newly tested, albeit haltingly, by the upstart Iraqi national baseball team.  The team was founded by several young Iraqi Americans who had played in the United States and drummed up curiosity about the sport during their 2005 visit to Baghdad.  Since then, the sport has taken an initial but tenuous hold in Iraq.  In fact, a year ago, the team had only one softball bat, one baseball cap, three balls, no official rulebook, and nine used gloves.  They had no cleats so they wore running shoes.  They received threats from Sunni insurgents who accused them of playing “an occupation game.”  Fortunes changed for the team following the publication of a McClatchy news article about their struggles—a story subsequently picked up by the Rachel Maddow Show on MSNBC—when U.S. corporate donors offered to ship the much-needed equipment.  Since then, the co-ed team has raised its profile and started to play in international competitions.  In May of this year, the Iraqi ball club paid a ten-day visit to the United States courtesy of the U.S. State Department.  The brief tour included outings to Nationals Park in D.C. and a baseball camp sponsored by Cal Ripken, Jr.  According to a McClatchy report, U.S. State Department officials expressed hope that the trip would inspire the players to speak positively about both baseball and the United States when they returned to Iraq.

Albert Spalding once compared baseball to war, arguing that the sport could transform foreign cultures just as effectively as an invading military force.  Following the U.S. invasion of Iraq, baseball may not be transforming Iraqi culture—and whether it even should is certainly subject to debate—but nonetheless, the game seems to have gotten to first base.

Read Full Post »

On Howard Zinn

My journal entry from April 17, 1997: “Bought Zinn’s Can’t Be Neutral.  Ate a heavy calzone and too much garlic bread.  At Wordsworth books in Harvard Square, someone asked the clerk where he could find books on marionettes.”

In this rather unremarkable fashion, Howard Zinn’s memoir entered my life, equally sharing diary space that day with food and puppets.  I had already read and reveled in A People’s History of the United States, thanks to a colleague who had recommended it, but I knew little of Zinn the man.  You Can’t Be Neutral on a Moving Train: A Personal History of Our Times introduced me to Zinn the teacher, the activist, the historian.  I have reread sections of that book many times since then.  So last week, the first thing I thought of when I heard that Zinn had passed away was the experience of reading that book thirteen years ago.

I read Can’t Be Neutral as I was finishing up my third and final year of teaching high school English at a private boarding school in New Hampshire.  Those three years had constituted a kind of intellectual awakening for me that Zinn’s autobiography capped off perfectly.  I was reading widely and often during my tenure as a secondary teacher—books by Albert Murray, Cornel West, C.S. Lewis, Sven Birkerts, bell hooks, Michel Foucault, Theodore Sizer, Jonathan Kozol, Vladimir Nabokov, Kurt Vonnegut, Jr., Jane Smiley, J.M. Coetzee, Alice Walker, John Irving, Toni Morrison.  These theorists, critics, and novelists constantly filled my head with ideas and inspired me to fill my journals with amateur stabs at cultural analysis.  But it was Zinn who focused my wayward reading habits.

Zinn prompted me to conceptualize how I could translate everything that I was reading and growing passionate about into a classroom teaching practice.  He helped me see how my writing and research interests could complement my teaching, rather than remain separate from it.  Moreover, he helped me understand that my own life experience, my own ideas about the world, my own beliefs did not necessarily have to be disconnected from what I did in the classroom.  This may seem like an obvious proposition to many in academia today, but for me, at the time, as a twenty-something high school teacher contemplating applying to Ph.D. programs, this was a revelation.  I found Zinn’s passion, his zest for teaching history, contagious.  That fall, I applied to graduate school.  I thought the field of American Studies seemed like just the place where I could put my new ideas about research and teaching into practice.  As it turns out, I was right.

I always find it interesting how certain books come into our lives at particular moments.  How we choose to buy that one book instead of any other in the store.  How we pull that one book off of our shelf—the one that has been collecting dust for years, one of many we have never read—and glance at the first few pages and suddenly leap in.  How we then integrate that book into our everyday life at that exact moment, using it to help us make sense of our world.  I’m grateful that I picked up Can’t Be Neutral when I did.

Read Full Post »

I recently pulled my copy of Joan Didion’s memoir The Year of Magical Thinking off of my living room bookshelf and reread sections that I had marked when I first read it two years ago.  Didion’s book about the death of her husband is a beautiful meditation on loss, grief, faith, and enduring.  I started thinking about the year the book was published—2005—and realized that death was a prominent theme in American culture at mid-decade.  The following year saw the publication of Philip Roth’s novel Everyman, a stark reflection on aging and the human body’s inevitable deterioration that begins with the protagonist’s funeral on page one.  The HBO television show Six Feet Under, a drama about a family that owns a funeral home, ended in 2005 after five seasons.  Also in 2005 the indie rock band Death Cab for Cutie released its album Plans, which featured several musical ruminations on aging and death (most notably “What Sarah Said,” with its haunting closing refrain, “Love is watching someone die.  Who’s gonna watch you die?  Who’s gonna watch you die?”).  I can’t help trying to connect dots wherever I see them, and I started contemplating the meaning of this convergence of four serious cultural meditations on death, all appearing in the mid-2000s.

My initial thoughts are that these serious treatments of aging, dying, and death emerged in the years after 9/11 and the start of the war in Iraq, and in the midst of Katrina, the Indian Ocean tsunami, and even the much publicized debate about Terri Schiavo.  For whatever reason, American culture seemed to take death seriously at mid-decade, with literature and popular culture encouraging audience reflection on life, mortality, and saying goodbye.

I can’t help but contrast these realistic, sober cultural offerings of five years ago to our current pop culture plate.  Much of our mass culture today is devoted not to the dead but to the undead—to zombies and vampires and people who never seem to die.  At decade’s end, we hungrily consume stories about lives (and loves) that transcend death, be it True Blood or Twilight or Zombieland.  Does this mean that 9/11 has receded too far from our memory?  That in the idealistic age of youthful Obama we are more concerned with possibility and new beginnings than with decline and the end (death panels notwithstanding)? 

Or maybe the rhythms of our culture simply reflect what we as human beings experience in our daily lives: there are moments when our contemplation of death is intense, other moments when we feel we will live forever, and many other moments when we simply don’t think about death at all.  I happen to be experiencing the first moment these past few months—thinking more about mortality as I enter mid-life—and perhaps that’s why I am even seeing these connections in the first place.

Read Full Post »

Older Posts »