Saturday, March 31, 2012

The Foreign Language Requirement

Last week, while I was busy procrastinating instead of revising my conference paper, I became tangentially involved in a minor kerfuffle on Twitter regarding the foreign language requirement for doctoral students in the humanities -- specifically, for PhD students in U.S. history and/or English/American literature.

The chief interlocutors in the debate were Rosemary Feal of the MLA, Rob Townsend of the AHA, and Erik Loomis, an environmental historian who writes for the blog "Lawyers, Guns & Money."  (Loomis is an assistant professor at the University of Rhode Island.)

Feal was touting the MLA's latest recommendation calling for "advanced competence" in a single foreign language for students pursuing PhDs in English, and Townsend seemed to think the same requirement would be a good idea for U.S. history PhDs.  Loomis, on the other hand, decried any foreign language requirement for U.S. history PhDs as an unnecessary bit of academic gatekeeping that smacked of elitism and would alienate students from working-class backgrounds.  He suggested that it would be more useful and valuable for students to learn a programming language.  I will respond to (some of) Loomis's ideas about the needs of working-class students below. 

First, though, I'd like to say a brief word about Twitter -- what it is, and what it ain't.  Twitter is a great medium for making quick synaptic connections -- promoting links, bloggers, blogs, and news articles, and finding interesting people doing interesting work.  It's a great place for starting conversations, but an infelicitous forum for carrying on a sustained argument.  Generally speaking, a Tweet is less a fully-articulated thought than a place-holder or pointer indicating that one has some Ideas Worth Discussing that Ought to Be Explored in More Depth Elsewhere.  Happily, Erik Loomis has elaborated his ideas on the language requirement in a blog post here.

Loomis, Feal and Townsend are not the first academics to publicly disagree about the worth or utility of a foreign language requirement.  Indeed, this is a Very Worn Argument.  If you want to have a little fun with JSTOR, do a search for "foreign language requirement" (in quotes) and "history."  Here are the search results, ranked by relevance, and here they are ranked by date (newest to oldest). As you can see, the earliest salvos in this debate are pretty dang old.

Doug Steward's 2006 article, "The Foreign Language Requirement in English Doctoral Programs" (Profession: 203-218), is especially helpful in sketching out a historiography of the debate about the foreign language requirement for literature PhDs.  Steward's assessment of the issue offers a far more nuanced and nicely argued version of the basic idea that informed my incredulous tweet accusing Erik Loomis of advocating provincialism.  Steward writes:

Our linguistic and research biases in the English profession are as "US-centric" as biases usually are in the United States, and that includes a kind of obeisance to the definition of science that obtains here.  Two consequences of such a deference to the scientific research model are narrow specialization...and the utilitarian devaluation of any skill, such as knowing a foreign language, that does not yield quickly tangible research benefits.  These biases also include an unconscionable, if unconscious, complicity in the English language's global hegemony and in the views that language is a transparent medium of communication and that English is the language of the United States....Culturally, English monolingualism means national isolationism and a parochial self-regard.  If this is a problem in the United States English-language population at large, I can think of no good reason to condone such isolationism among the most educated Americans -- those with research degrees... (213-214)
Now, Steward is talking about English (including, of course, American Literature).  Simply substituting "the U.S. history profession" for "the English profession" here would probably represent a wildly inappropriate conflation of one discipline's history and character with another's.  So I would be grateful if my colleagues and/or readers who are historians of higher education or have an especially good command of the history of the profession would offer some insights here on whether or not Steward's assessment "in the abstract" might be aptly applied to the particular situation of U.S. history or U.S. historians.  Narrow specialization, complicity in the English language's global hegemony, parochial self-regard -- is this us?  Alternatively, if someone can point us to an article that lays out the history of the foreign language requirement (or lack thereof) for U.S. history -- something I was not able to find -- that would be helpful.  Perhaps U.S. history has been less monolingual as a profession or discipline than English or American lit.  And perhaps not.

In the meantime, though, I would like to address one aspect of Loomis's argument.  He writes:
History should be a tool for work. It should be a tool for the work of many of us. And our students are in fact leaving the history major because they don’t see it as valuable for their future. Holding onto the belief that people should major in humanities because they will be smart has its own value, but it’s also not enough to compete in the reality of the 21st century university marketplace, particularly among students with working-class backgrounds. We need to show our students that history does have concrete value for their future, including BUT NOT ONLY, that it will make them more educated and interesting.
It is not clear what Loomis means here by the "reality of the 21st century university marketplace," but I assume he means the academic job market, since he concludes his post with a discussion of the difficulties of finding a tenure-track job.  However -- and regrettably -- the "marketplace" has come to the university in a multitude of ways that reach well beyond the "job market."  I suppose a resigned acknowledgment that the market has triumphed once and for all, that only those skills that are easily instrumentalized and obviously and immediately lucrative are worth the time and money it takes to teach them or learn them, is a sensible and even defensible position.  But it is a presentism with no future -- no future for the academy, and no future for the working-class students about whom Loomis is rightly concerned.

Whatever the degree requirements for getting a PhD in history, the profession of U.S. history as a whole -- I mean the whole field, broadly construed -- requires that there be scholars who are conversant in languages other than English, not only for archival research but also to critically engage with current scholarship.  If the hegemony of "the market" has done anything, it has helped to undermine the paradigm of the singular nation-state with discrete geographic, economic, cultural and epistemic borders.  The field of U.S. history extends far beyond the borders of the United States, and the contours of the field for practitioners within the U.S. seem to be bending towards something like a transnational turn, if that's not already in our rearview mirror.  So the profession will continue to need trained historians who are skilled in languages other than English, and those scholars who do have such language skills might have a competitive edge in "the market."

Besides, if "the market," rather than the profession, is to dictate the skill set that makes for well-trained historians, how are working-class students served by the suggestion that they forgo language training?  It is hard enough to compete for a tenure-track job against star applicants from top-tier programs that will probably not waive the language requirement.  Why make it harder by choosing to further diminish the kind of training offered to PhD students at non-elite institutions?  You do working-class students no favors by telling them to aspire to be less skilled and less prepared than the top tier of their cohort.  If anything, language training should receive more emphasis and support at non-elite schools.  Instead of clamoring to have this requirement waived, PhD students in U.S. history should clamor for more and better language training for undergraduates at their institutions, and more support for intensive language tutoring in their graduate programs.

As a profession, we don't have to stand by and let "the market" decide that a less skilled, less well-trained academic workforce is beneficial.  It's not beneficial to society, and it's not beneficial to the discipline of history.  It's only beneficial to those who seek to deliver education on the cheap.  Working-class students -- and I have been one, and I guess I still am one -- have been shortchanged in so many ways already.  They don't need to be complicit in the further narrowing of their own horizons.  Instead, they -- we -- ought to collaborate with the other disciplines in the humanities -- and especially with the beleaguered foreign language departments -- to contend for a more robust, more rigorous graduate education for all of us.

Friday, March 30, 2012

The Importance of Learning: Liberal Education and Scholarship in Historical Perspective (Call for Papers)

Howard Hotson, Professor of Early Modern Intellectual History at Oxford and President of the International Society for Intellectual History, just called my attention to the ISIH's 2012 conference, which will be held at Princeton University on September 4-6 and is entitled The Importance of Learning: Liberal Education and Scholarship in Historical Perspective.

It looks like a fascinating event and should be of interest to members of S-USIH and readers of this blog.

Paper and panel proposals are due on April 16, 2012.  The complete Call for Papers can be found at the above link and below the fold.


The Importance of Learning: Liberal Education and Scholarship in Historical Perspective
Princeton University
4-6 September 2012

Call for Papers

It is an inescapable fact of contemporary life that the idea of a liberal education, an education that aims primarily at the cultivation of the intellect and sensibility rather than at preparation for a particular vocation, is widely under attack all over the world. In country after country, the idea of learning for its own sake is being swept aside, as institutions of higher education are pressured to devote themselves primarily to preparing students for careers in practical areas. The global membership of the International Society for Intellectual History is in a unique position to illuminate these questions from a genuinely historical and cosmopolitan perspective.
The range of potential questions is vast:


  • What role did the ideal of liberal education play in classical Western intellectual culture?
  • How did the idea arise in ancient Greece? How was it modified in transmission to ancient Rome?
  • Was it fundamental to higher education in the medieval and early modern periods?
  • How did it mix and mingle with other, more practical conceptions of higher learning throughout history?
  • How did the high theorists of university teaching and research in the nineteenth and early twentieth century develop the idea?
  • Have their writings been superseded? If so, how and why?
  • What lessons can be learned from the intellectual history of non-western cultures?
  • What cultural values and intellectual assumptions have sustained the quest for new knowledge and its transmission to the next generation outside the Western world?
  • What forces and agendas are propelling the current redefinition of university learning around the world?
  • What prospect is there for reviving elements of this traditional concept in an idiom appropriate to the twenty-first century?

During the conference, a series of distinguished keynote speakers, to be announced shortly, will help determine some of the broad lineaments of the topic, which will be further explored in contributions of two main kinds submitted in response to this call for papers. The first and principal form of contributions will be brief papers relating to the theme of liberal education, scholarship, and their place in society. Papers can concentrate on any period, region, tradition or discipline, including the arts, humanities, sciences, and various forms of professional learning. As well as individual papers, we welcome proposals for panels of up to three papers and a commentator. Individual papers will be twenty minutes long, followed by ten minutes of discussion.

The second set of contributions will be posters designed to draw on the international scope of the Society. The purpose of the posters is to document the various attempts to reform higher education being pursued simultaneously in various countries. As well as brief narratives of major legislative and reform programs and opposition to them, posters should include references to resources for studying the national situation further: news broadcasts, documentaries, serious journalism, academic studies, and legislation itself. As well as displaying the posters and discussing them in a poster session, it is hoped that this material can be collected and archived on the Society’s website as a readily accessible resource.

Please submit abstracts of no more than 500 words for each paper or poster. Proposals for panels featuring a maximum of four papers should not exceed 2500 words. All proposals – for papers, panels, and posters – should be accompanied by a brief CV or biographical statement. Individual contributors are welcome to present both a paper (or panel) and a poster at the conference. All proposals are due 16 April 2012. Decisions will be announced by 1 May. Please send proposals to James Lancaster (james.lancaster(at)postgrad.sas.ac.uk), to whom you should also address any queries.



When did the cultural left become the "Cultural Left"?

Recently, Andrew Hartman asked in a Facebook post about the historiography of the cultural left and its relationship to movies and television. I mentioned there that I had done some writing on this topic, and as I sifted through other comments I was reminded how difficult I had found it to identify this body of work. Nonetheless, in a book I wrote entitled Freedom to Offend, I argued that there seemed to be a moment of transition between what had been a cultural left represented by Underground artists, filmmakers, critics, and even audiences, and a more self-consciously Cultural Left that sought to create a movement out of a critical position. That position, it seemed to me, was best reflected in the work of New York critic Parker Tyler, who had done pioneering work on underground film, including gay film.

Tyler was one of a handful of people in New York City who, like others involved in the Underground art world in other big cities, believed that the post-1945 period offered an opening for a genuine alternative aesthetic to mainstream culture. In New York City movie culture, one could find that world in Amos Vogel's revolutionary film society, Cinema 16; in Dan and Toby Talbot's vital New Yorker Theater; in the work of filmmakers who gathered around the energetic Jonas Mekas and his New American Cinema group; and in the writing of critics such as Tyler, Manny Farber, Susan Sontag, Andrew Sarris, and eventually P. Adams Sitney.

But a rift opened among this group in the late 1960s, and Parker Tyler took the breach personally. Near the end of his 1969 book Underground Film: A Critical History (with an emphasis on "critical"), Tyler unleashed the fury of his polemic on "fetish footage," a trend that came to define Andy Warhol's films (especially The Chelsea Girls) and infected the Underground in general. Censors regarded the Underground as little more than peep-show hucksters masquerading as avant-garde. For his part, Tyler had stood against such policing authority in print and as an expert witness in cases such as the one that sought to ban Andy Warhol's Blue Movie. But at the same time, Tyler did appear to wonder what he had come to defend. He contended: "To insist on responsibility, from the widest Underground standpoint, is to betray the very life blood of the avant-garde, whose prevalent aim is to exist without being measured or weighed by anything but its own self-approval. Underground film and Pop Art represent the only elites in human history without any visible means of earning or sustaining those privileges; that is, without any values that can be measured, or even, properly speaking, named except by its own labels" (175).

I understood Tyler's argument as one that covered a spectrum of decidedly "cool" movies and movements in the arts. Tyler felt disconnected from an aesthetic community because his cultural left had itself become the kind of ideological authority that, like the Victorians of old, could not be questioned without falling afoul of what simply was true and right. J. Hoberman wrote in a very good introduction to Tyler's Underground Film that Tyler had brought "modernism full circle" with his critique because he described how a cultural left had gone from "anti-illusionism to a narcissistic fantasy world that [produced a] celebritizing virus which, incubated in the Warhol factory, would infect the entire media system."

Some of the writers who have grappled with this moment in modernism were mentioned in response to Andrew's post. Among those I continue to find most interesting is J. Hoberman, whose The Dream Life stands in significant contrast to Mark Harris's Pictures at a Revolution because Hoberman argues persuasively that if cinema was truly "revolutionary," its radicalism existed in an imaginary world--a world of myth--as much as in any kind of social politics. The illusion of radicalism in popular culture and the arts was something that gave me fits when trying to write about it. I often thought I sounded reactionary when trying to parse what seemed like a genuinely radical critical stance from that which was not. Greg Taylor's book Artists in the Audience took on the development of the term camp as a way to demonstrate the dilemma at the heart of Tyler's criticism. "In its extreme catholicism," Taylor writes, "[Jonas] Mekas's critical attitude epitomized the Underground's new vanguard camp. Borrowing and exploding traditional cultism's careful aesthetic selection of conventionally nonaesthetic material while merging it with camp's aesthetic transmutation of nondescript moviemaking, the new stance flaunted a critical perspective that would see any and all films as potentially aesthetic, exciting, and beautiful to anyone" (112).

Tyler's dilemma echoed throughout the culture of the late 1960s and could be seen in the criticism of Susan Sontag, Arthur Danto, and (a bit later) Dave Hickey. Craig Seligman's book, Sontag and Kael: Opposites Attract Me, takes up the tension between a radical aesthetic forged by Sontag and an aesthetic radically reimagined by the intellectually slippery, and incredibly influential, film critic Pauline Kael.

In the end, this period continues to fascinate me in the way it seemed to squeeze a great deal of self-conscious change through a wildly popular medium like movies. It still seems to me a rare treat to read so many good critics who write about the intellectual ground moving beneath their feet. This was the moment at which at least the idea of culture became accessible and broad enough to imagine people going to war over it.

Thursday, March 29, 2012

Book Review: Cox On The American Bourgeoisie

Review of Sven Beckert and Julia B. Rosenbaum, eds., The American Bourgeoisie: Distinction and Identity in the Nineteenth Century. Studies in Cultural and Intellectual History. New York: Palgrave Macmillan, 2011.

Reviewed by Nicholas P. Cox
University of Houston


“Who in the world today, especially in the realm of culture, defends the bourgeoisie?”
-Daniel Bell, The Cultural Contradictions of Capitalism (1976)


If Sven Beckert and Julia B. Rosenbaum, editors of The American Bourgeoisie: Distinction and Identity in the Nineteenth Century, do not propose to defend the American bourgeoisie, they do at least advocate for greater scrutiny and recognition of the impact that bourgeois Americans made—on the production and consumption of culture, on the gradations of taste and manners, and on the founding of seminal cultural institutions, from art museums and symphonies to elite university alumni organizations.

Not surprisingly, in a volume that collects fifteen conference papers subsequently elaborated for publication, there are just as many definitions of the bourgeoisie as there are contributions. Definitions of who the bourgeois were range from my personal favorite for its brevity and inclusiveness, “Americans who distinctively wedded culture to capital” (from the editors’ erudite introduction), to a series of satisfactory and functional definitions. These others are less about taste shaping or cultural production and more about consumption as the definition of bourgeois experience. The essays in this volume touch on a broad range of topics such as culinary arts, miniature portraiture, travel memoir, and orchestra patronage. Consequently, it is inevitable that each contributor’s idiosyncratic definition of who was bourgeois, and the boundaries of each study, begin to conflict, abandoning any effort at examining precisely the same class or behaviors. One of my concerns is the fluidity with which the bourgeois become less distinguishable from more conventional definitions of the larger middle class—a group with a work ethic that dignified the accumulation of wealth combined with a moderate regard for ostentatious displays of consumption. That definition of the middle class has much in common with this volume’s consideration of the bourgeois, which sees the leisure ethic of both the extraordinarily wealthy and the depressingly downtrodden as sharing the same stubbornness against dignified labor and thriftiness, if not, of course, the same financial position. For the purposes of the examination of the bourgeoisie, then, this class often includes people of extraordinary wealth who have clearly transcended the middle class, such as industrialists-turned-philanthropists like a Carnegie or a Rockefeller. The Scottish immigrant and the Midwestern Sunday school teacher expressed middle-class values of hard work, thrift, and sobriety, and yet their unparalleled wealth, in combination with these values, made possible the emergence of a bourgeoisie in the United States. In general, in this essay collection, it is not income or property that defines class, but manners and the Protestant work ethic.

The contributors to this volume do not note historian John Thomas’s assessment, in Alternative America (1983), of why Midwestern anti-capitalists, from novelist Theodore Dreiser to labor organizer and perennial Socialist presidential candidate Eugene Debs, sought to put the brakes on the taste making and cultural hegemony of the bourgeoisie: “Liberals in all the professions suddenly realized that their most urgent task was educating middle-class Americans by helping them to adjust their preferences for the fluidity and individualism of an agrarian social order to an industrial one in which these values seemed dysfunctional.” Indeed, the bourgeoisie in the pages of this volume are seeking to transform an agrarian rural hinterland and a growing, increasingly immigrant, industrial proletarian class into middle-class Americans who appreciate music, meals, architecture, and furniture that accord with bourgeois taste. In the classic Arnoldian battle between lowbrow and highbrow, it is the bourgeois who are mediating—and who are evangelizing their view of taste.

Defending bourgeois campaigns to establish “distinction and identity” in the 19th-century U.S. is another matter entirely. As cultural critic Daniel Bell pointed out after the tumultuous 1960s, no one seriously defends the notion that upper-class or middle-class preferences are superior to those of the democratic mass, in the manner of the classic Dwight Macdonald-style jeremiads against demotic art, music, or popular entertainments; nor does anyone seriously defend the relatively bullying methods by which the bourgeois shaped culture as a necessary part of training the working class and immigrants, at least not without acknowledging the denigration of an already financially subordinate group. If an edited collection of essays can be said to have a thesis, the most central shared argument is this: rather than fight these cultural wars again, the contributors wisely avoid making overtly judgmental or praiseful commentary on the practices of the bourgeoisie and instead uncover, through close scrutiny of social practices, just how the bourgeoisie shaped the development of modern American culture.

Essays by Anne Verplanck, Francesca Morgan, and Paul DiMaggio stand out for their close attention to the role of the bourgeois in early 19th-century Philadelphia, in the aristocratic pretensions of genealogical societies, and in late 19th-century Chicago. Verplanck’s reading of the brief ascent of miniature painted portraits before the advent of photographic portraiture traces out the privileging of artisanal skill and the patronage networks that facilitated the production of gorgeous miniature paintings, unfortunately swept away by the cheaper, quicker, and more democratic demand for photography. Morgan’s study of bourgeois fixations on aristocratic, Mayflower, or Revolutionary genealogical family lineages demonstrates the aspirations to cultural elitism that motivated the clubbing and memorializing that would become ubiquitous at the turn of the century, as Sons of This and Daughters of That sought to create restrictive but elevating social clubs. In a volume that is almost entirely preoccupied with the activities of a small class of northeastern urbane cultural consumers and producers, DiMaggio’s wonderful essay reminds us that the Midwest had its own form of bourgeois taste making, where new business-generated wealth fashioned a civic culture distinct from that of the seaboard bourgeois of Boston, New York, or Philadelphia.

The volume includes another dozen essays of relatively similar interest and talent, rounding out a collection that will be of interest to scholars across disciplines, especially those concerned with class identity formation or with art production and consumption. There are essays on the habits of shoppers in a transitional moment from street vending to shop-keeping, on the stories of Henry James (of course…), on interior design, on patronage of museums and symphonies, on higher education, and on other related bourgeois experiences. In addition to the breadth of topics, the interdisciplinary range of the contributors is also rather remarkable, with scholars of U.S. history, literature, and American Studies, but also from the fields of museum studies, public affairs, sociology, art history, music history, and culinary history.

There are, however, a few disappointing concerns, none of which are easily dismissed. First, my friend Daniel Wells (Origins of the Southern Middle Class, 2003), I am certain, would like me to remind scholars that there was a southern middle class in cities such as Baltimore, DC, Charleston, Louisville, and New Orleans, and consideration of any of the bourgeois experiences in these southern cities would have added to the geographic range necessary if this volume intends to describe the American experience. Second, the absence of working-class, immigrant, and African American considerations in the essays is surprising. Certainly, a defense could be fashioned that these subaltern populations were often not bourgeois taste-makers, but they were the intended consumers. Some thought on their reception or rejection of bourgeois intentions seems worth consideration. And as labor historian E. P. Thompson and sociologist Daniel Bell have noted, and for that matter Walter Benjamin and Karl Marx predicted, the subaltern’s combination of their own democratic cultural production with their rejection of bourgeois cultural prescriptions is precisely the location, for good or ill, of post-bourgeois 20th-century cultural production. This collapse of bourgeois influence, so taken for granted by Modernists, avant-gardists, beatniks, and others in the first half of the 20th century, made possible the assumptions behind Bell’s statement that bourgeois values are indefensible. And yet middle-class values, bourgeois arts, and working-class aspirations toward bourgeois living persist despite the efforts of the avant-garde. What of this persistence in the 20th century?

Decent libraries everywhere should stock this volume, and I unhesitatingly encourage scholars of a wide range of interests to take the time to read the excellent introduction and the essays relevant to their own work; indeed, reading it cover to cover is a reward in itself for its commitment to speaking across disciplinary boundaries. I sincerely hope that an inexpensive paperback edition follows quickly for widespread course adoption.

Wednesday, March 28, 2012

"Politics of Respectability" Take 2

I get to indulge my love of shoes when I go to black history conferences. Unlike the botanists and biologists I grew up around, who would wear REI clothes to a formal ball if they could, the people at black history conferences DRESS. There is a slight generational curve here--at the banquet, the older generation flows through the door, beautifully attired in Kente cloth, with lavish headpieces, or in elegant suits and dresses. There is a slight lessening of the sumptuousness of the outfits of my generation, but they are still well dressed.

One time I was hanging out with new friends near the computers. I was done with my computer and sat back to read a book. For some reason, I threw my feet up on the desk--rather informal for me, but nothing I thought twice about. I was immediately chided on all sides for my indecency--particularly because these black women thought they would be blamed for my poor manners.

In the second example, chiding me for my bad manners, these young black women were certainly inhabiting respectability. In the first, there is a lingering element of black people dressing better than whites because they needed to in order to prove their respectability. In the conversation around Trayvon Martin's hoodie, black men on Twitter have recently discussed all the different ways they are profiled depending on how they are dressed--hoodie or suit. But I think there is something else here, too. There is joy and there is structure. Men and women who don't enjoy dressing up feel the inherent white male privilege of those who can dress down on campus and at conferences. Well, probably everyone feels that, but at the same time, people who enjoy dressing up feel a freedom and a joy in gathering with other black people and showing off their dress and their beautiful, elegant coiffures.

For me, it is that joy and freedom that is missing in discussions of the "politics of respectability." (Maybe it is written in terms of race pride and self-respect--maybe that is what I missed).



I was challenged last week to update my historiography on the "politics of respectability," so I've spent this week working on that, among other things. This 2008 review by Rosalind Rosenberg of Stephanie Evans' Black Women in the Ivory Tower highlights one way to reconsider uplift and respectability.
"In many ways black educators' careers paralleled those of white women at the time. They both joined countless clubs dedicated to social reform and shared what Evelyn Brooks Higginbotham has called “the politics of respectability” (p. 64). Both sought to remake the black working‐class in their middle‐class image in what Ula Taylor has dubbed “the iron cage of uplift” (p. 64). But unlike their white peers, whose efforts at uplift were directed for the most part toward people from different national, ethnic, and racial groups, black educators reached out to members of the same race, who faced the same kinds of political, legal, social, and economic obstacles they did. Evans argues that this difference made them less willing to accept the biologically based, hierarchical thinking of the day. In common with an increasing number of black feminist theorists, Evans analyzes her subjects less in the either/or terms of class division and more in the both/and terms of shared oppression." 
Both/and--this is what Michelle Moravec suggested in the comments to my post.

It was also suggested that I read Victoria Wolcott's Remaking Respectability. She also goes beyond a simple condemnation of the bourgeoisie for advocating uplift, explaining that many in the lower classes also adhered to those modes of being. 
"The shared norms of behavior in the black community reflect a greater degree of 'circularity' between dominant and subordinate classes than was present in white society. Segregation and racial discrimination heightened reciprocal cultural influence among African Americans. Thus, at times working-class and middle-class women's notions of respectability converged. Both focused on domesticity as the central terrain of uplift, on the need to defend African American women against sexual harassment and rape, and on racial pride." ... "Indeed, black women throughout the twentieth century have used respectability to enhance their reputation, ensure social mobility, and create a positive image for their communities. To be 'respectable' was an identity that any African American could embrace, whatever his or her economic standing." (pg 8)
In her exploration of middle-class marriage in the interwar era, Anastasia Curwood notes that "To both men and women, status and respect seemed fundamentally linked to the moral superiority that marriage granted, and marriage in a very real sense provided financial security and emotional support. Therefore, for middle-class African Americans at the turn of the twentieth century, marriage, in addition to its emotional functions, illustrated the fact that black people could be sexually moral and supported spouses engaged in accomplishing the work of uplift" (Stormy Weather, 18). Men were invested in building a masculinity that emphasized the patriarchal role of the father figure who could provide for his family alone. Women attempted to balance responsibility "to themselves, to their families, and to the race."

Ok. I can see where I was too constrained in my description of the "politics of respectability" last week. Most of the scholars who write about it acknowledge the nuance and cross-class dialogue that went on.

I guess what I'm interested in is the intra-class dialogue of the 1920s, in which the idea of respectability was being called into question. The younger generation of writers no longer wanted to adhere to strict moral guidelines, much like the white authors who were advocating free love and The End of Innocence. Countee Cullen, one of the more strait-laced of the Harlem Renaissance poets, starts off the poetry section of The New Negro with two poems urging free love, with only a hint of caution. I will quote one:

TO A BROWN BOY
That brown girl's swagger gives a twitch
To beauty like a queen;
Lad, never dam your body's itch
When loveliness is seen.

For there is ample room for bliss
In pride in clean, brown limbs,
And lips know better how to kiss
Than how to raise white hymns.

And when your body's death gives birth
To soil for spring to crown,
Men will not ask if that rare earth
Was white flesh once, or brown.
My challenge is to find out what the African American women who engaged in clubwork thought of their lives and their organizations. We know that many in the Harlem Renaissance "were jaded at the conformity and materialism they saw in respectable middle-class black families" (Stormy Weather, 55; I gave more examples last week).  Recovery work that relies heavily on gossip columns, with few references to reactions to parties, waylays me a mite.  But I argue that the 1920s was a time of intra-class disagreement over the right way to act; most African American women still believed in respectability (out of choice and out of necessity). For some, like Marita Bonner, this was intensely constraining. For others, their clubwork was freeing and joyful.

Thank goodness for Eslanda Robeson, even if she writes more about her husband than herself.

"Just as white Americans flock to New York for greater opportunity or adventure, so does the Negro come to Harlem, and for the same reasons. There is the Y.M.C.A and the Y.W.C.A., where he meets young Negroes from all parts of the world; there are inter-collegiate fraternities and sororities, and other college and social clubs; there are educational, social, political, and philanthropic organizations, all made up of, and entirely run by and for, Negroes. There are innumerable public and private dining-rooms and restaurants where a Negro is an expected and welcome guest. A Negro knows exactly where he is in Harlem: he is among friends, he is at home" (Paul Robeson, Negro, 50-51).
Even so, the Robesons made their home in London for many years because they did not have to worry about the vagaries of discrimination, which were checkerboarded across Manhattan--here they could eat and there they could not, here they could see a show and there they could not.

Tuesday, March 27, 2012

Diane Ravitch Speaks

I'm excited. Tonight, Diane Ravitch speaks on my campus. She is to give a talk at Braden Auditorium, which seats just under 3,500 people. Because ISU has one of the largest colleges of education in the nation, and because Ravitch has become the nation's staunchest supporter of teachers, my bet is that the event will be standing-room only. Given Ravitch's career trajectory, which Tim Lacy first called attention to at this blog over two years ago, this is remarkable.

I first got to know the work of Ravitch in graduate school. I took a class on the history of education reform with Donald Collins, who has been a mentor and friend ever since. Donald assigned authors with various perspectives, including some he disagreed with, namely Ravitch, who was known at the time to be a fairly conservative historian and education reformer, appointed assistant secretary of education by George H.W. Bush. We read Left Back: A Century of Failed School Reforms. In this book, her magnum opus, Ravitch argued that the progressive education movement took a wrong turn somewhere shortly after Dewey. It had dumbed down the national curriculum, lowered standards, made schooling too vocational, too adherent to this or that trend. Left Back is carefully researched, which is no surprise given that Ravitch learned the craft of history from Lawrence Cremin, and I learned a lot by reading it. But at the time I thought it was crankily nostalgic for a classic liberal curriculum that was never as liberal as its proponents claimed. I also thought it was too hard on progressive education, although I was certainly critical of some elements of that movement in my book, Education and the Cold War, from a different vantage point.

But now Ravitch is the most prominent voice against the so-called education reform movement. This is evident in her latest book, the excellent and bestselling The Death and Life of the Great American School System: How Testing and Choice Are Undermining Education, which is not only an immaculately researched brief against the education reform movement, but also Ravitch's mea culpa. She apologizes for getting caught up in the zeitgeist that reigned supreme in policy-making circles: privatization, testing, "choice," charters, etc. Ravitch's resistance to the education reform movement is also evident in her being one of the most influential "tweeters" around (@DianeRavitch). Ravitch also now regularly writes essays against the education reform movement at The New York Review of Books. Her latest, which critically reviews Teach for America founder Wendy Kopp's recent book, aligns with my thinking on Teach for America, which I make clear in my Jacobin article, "Teach for America: The Hidden Curriculum of Liberal Do-Gooders." Again, this is remarkable. I'm excited for tonight.

Monday, March 26, 2012

The Academy in Peril

As readers of this blog may be aware, I was in Dallas on Saturday, where I delivered the keynote address at the 4th annual RAW symposium at the University of Texas at Dallas. Thanks again to USIH's own L.D. Burnett for inviting me to speak on the Future of the Humanities.

Rather than reproduce my entire talk,* I wanted to briefly summarize my main arguments and then re-answer a question from the q&a period, having come to regret the answer I gave at the time (ain't blogging grand!).

I began the talk by suggesting all the hopeful things going on in the humanities these days...and I really think there are many. We're experiencing an explosion of interesting humanistic work beyond the academy. And academic humanities departments are operating in relative internal peace compared to the often nasty divisions found within them well into the 1990s.

But the second half of my talk, sub-titled "The Academy in Peril," focused on what I feel is the biggest challenge facing the humanities today: the threat posed to higher education by both its enemies and many of its supposed friends, who have all embraced a narrow, economistic understanding of higher education that sees tenured and tenure-track (TT) faculty as an unconscionable fixed cost that interferes with rationalizing the operation of the university.**

We are already seeing the effects of decades of forced casualization of the academic labor force. Today, two-thirds of the 1.5 million college and university faculty are in non-tenure track (NTT) jobs.  Most of these faculty don't have PhDs, though many of them are pursuing the degree.  Marc Bousquet, one of the most trenchant observers of the current state of the academic labor system, summarizes the situation grimly, but not entirely inaccurately, as one in which PhD recipients are the system's waste products.  Far from preparing them for an academic career, graduate school is, for many PhD candidates, the academic career itself, as they become markedly less employable after they receive their PhDs. But while it makes perfect sense from an administrative perspective focused on the bottom line to replace newly expensive (though pedagogically experienced) recently minted PhDs with cheap (though totally inexperienced) entering graduate students, neither teaching nor scholarship is served by such a management technique.

Unfortunately, rather than thinking comprehensively about this whole labor system, faculty, especially TT faculty, tend to think in terms of a semi-mythical job market, which involves the competition among PhDs for the shrinking number of TT positions.

I concluded my talk not with a solution to these problems (I'm not sure what that would be), but rather with three things we should all do in order to work toward a solution:

1) Analyze.  We need to have a better understanding of the overall labor system of academia.  TT faculty, in particular, need to gain a better understanding of the experiences, needs, and hopes of NTT faculty.  Before we can devise solutions, we need to understand the problem.

2) Organize. Any solution will involve organizing.  Though I'm less convinced than Bousquet that doing so is the solution, unionization is a critical first step...where possible. Unfortunately, many states, including my own, ban faculty unions at public institutions. And the most powerful form of unionization--one that would link TT and NTT faculty in a single bargaining unit--is being bitterly contested even in places in which faculty unionization is well established.***  But even if you can't join a union, you can--and should--join AAUP. And NTT faculty should also join the New Faculty Majority.****  Finally, it is appalling that many faculty still oppose graduate student unionization efforts.

3) Advocate.  For the humanities, on and off campus. And for a new vision of higher education that puts the focus back on education and scholarship broadly understood.

So that's more or less what I had to say.

Let me add here that these things are especially important right now because a lot of people on and off our nation's campuses want to make the lives of faculty--and with them the quality of education and scholarship that goes on in our colleges and universities--immeasurably worse.

If you're a tenured or TT faculty member (or just hope to be one some day), now's a good time to take a look at how the NTT faculty are treated on your campus. Include graduate instructors in your field of vision.  What you'll see won't be pretty. And yet, what you'll see is what your administrators and educational reformers--Democratic and Republican--want your job to look like in the future.*****

The latest example of this sort of thing appeared in the Washington Post on March 23. Opining that faculty don't work hard enough, educational consultant and former New School Chancellor David C. Levy argues that all college and university faculty ought to be teaching 6:6 or 7:7 loads.******

All of which brings me to the audience question from Saturday that I wish to re-answer.

The q&a opened with a faculty member (I think), asking me what my work on Leo Strauss (which L.D. had referenced in her introduction of me) had to do with my "activism" (this word is usually meant as a pejorative in such a context).

My answer was simple: "Nothing," I said.

What I meant by that was that my work on Strauss and the Straussians really isn't informed at all by the sort of things I was discussing in this talk.  I understand--and appreciate--that many humanists understand their activism and scholarship as a kind of seamless web, that their scholarly work is, in a fundamental sense, engagé.  But that ain't me.  I'm kinda old-fashioned that way.

But I think I misunderstood the spirit of the question. Or at least, my very pithy answer didn't cover all its potential meanings.

Almost immediately after the q&a ended, however, I began to wonder whether this question may have actually been intended as a way of asking "Shouldn't you be doing your work instead of this sort of thing?"*******

At any rate, there is, in fact, a very direct relationship between my scholarly work and my "activism": if my conditions of employment significantly deteriorate, I will not be able to do my work on Leo Strauss and the Straussians or much of anything beyond teaching (badly due to the increased load).  Which is, I suppose, also why I object a little to the word "activism" here.  I'm not, after all, choosing to be involved in a political struggle. I'm just thinking about and trying to improve, or at least preserve, my own labor conditions.  And suggesting that if others in a similar position don't do so as well, we will all likely pay a very heavy price for our failure.

_____________________
* I gave the talk off of an extended outline, so reproducing it would either involve reprinting a series of notes to myself or attempting to reproduce the thing ex post facto.

** I've blogged at greater length about these issues on USIH before here, here, and here.

*** Last week, an Illinois State Court prevented such a combined union from gaining recognition at the University of Illinois at Chicago, which is happy to bargain with split unions.

**** Seriously. If you're a faculty member--and that includes grad students who teach--and aren't a part of those organizations, follow those links now!

*****Although my talk at UTD was generally well-received by its majority grad student audience, I heard through the grapevine that some senior faculty felt that I had come across as young and naive.  Though I'll admit that I've reached the age at which being called "young and naive," like being carded, has become more compliment than insult, I really do think that, with the possible exception of those of us lucky enough to be tenured at the best-funded and least sectarian private institutions in this country, all of our jobs are--or will shortly be--under direct, sustained attack.  The naive position at this point is to conclude that, because you have tenure, the bell does not toll for you.

****** Levy's argument is pretty self-refuting. But if you crave refutations, you can visit Paul Krugman at the Grey Lady (for a short one) or Robert Farley at LGM (for a longer version).

******* Though in all fairness to my interlocutor, my own capacity for guilt and self-punishment enables me to translate, say, a desire to grab a second cup of coffee into the question "shouldn't you be doing your work instead of this sort of thing?"

“You Can’t Say That”: A Reply to Michael Fisher

Author: James Livingston

[Editor's note: Here is Fisher's original review. - TL]

I

Since I started writing Against Thrift in 2009, the typical response from my liberal and left-wing colleagues has been “You can’t say that!” I’ve heard it a hundred times by now. They mean it. They assume that consumer culture is, at best, the place where bad taste, bad faith, and bad manners rule with the permission of advertising—it’s redeemable only by recourse to the suspicious methods of cultural studies—and is, at worst, the place where conscience, commitment, and even common sense go to die. When I was a fellow at the Cullman Center of the New York Public Library three years ago, one of my colleagues responded to the description of the project by saying, with no trace of irony or humor, “You’re the Devil.”

The book has been reviewed in the Wall Street Journal and the Financial Times, but not in the New York Times or The New Republic. It’s been reviewed in Bloomberg Business Week, but not in Dissent, The American Prospect, The Atlantic, or The Nation. Meanwhile I’ve written book-related op-eds for mainstream publications like Wired, the LA Times and the Christian Science Monitor, and been interviewed by NPR stations from San Francisco to New York. The last radio interview I did, however, was with John Batchelor at WABC, where his talk show colleagues include Sean Hannity and Rush Limbaugh.

How to account for this discrepancy? Is it a clear Left/Right divide, or just a difference between academic and middlebrow discourse? My argument on behalf of consumer culture makes no sense in the absence of my argument for a redistribution of income and a socialization of investment—and vice versa. Conservatives like John Batchelor and Leftists like Sasha Lilley at KPFA/San Francisco have grasped the connection between these arguments, and have responded with reasoned aplomb rather than astonishment. So the question becomes, Why is the liberal, academic Left so uniform in its views of consumption that its reflexive response to my defense of consumer culture is exasperation and dismissal (“You can’t say that!”), if not horror and disgust (“You’re the Devil”)?

You could say it’s a trade book with an incendiary title, so what’d you expect? Of course the middlebrow radio stations and the mainstream newspapers would pay attention—they need as much “provocative” content as they can get to attract listeners and readers—but you can’t expect serious journalists, intellectuals, and academics, typically liberals who are necessarily suspicious of finance capital, to entertain an argument that takes the universalization of exchange value (a.k.a. commodity fetishism) for granted, and that meanwhile treats advertising as the last utopian idiom of our time.

In these serious parts, it goes without saying that commodities are the enemy of the spirit; that consumer culture privatizes our experience and infantilizes our desires, thus precluding local community as well as progressive political action, not to mention the salvation of our souls; that advertising, the advocate of mindless consumption and the enemy of plain speech, puts everything up for sale, including our very souls; and that consumerism is clearly the most dangerous threat to the environment.

II

Actually, it doesn’t go without saying, and that fact raises a different question: why do we need to keep repeating ourselves? The same thing gets said over and over, as if hundreds of clerics were transcribing one master text—as if the critique of consumer culture were a reaction formation that has finally become a repetition compulsion. From Max Horkheimer to Paul Goodman, from David Riesman to David Potter, from Stuart Ewen to Juliet Schor, from Benjamin Barber to Jackson Lears, from James A. Roberts (an earnest marketing professor) to Kalle Lasn (the editor of Adbusters and a crucial inspiration of Occupy Wall Street), and—while we’re at it—from Robert Samuelson to David Brooks, the refrain never changes. It goes like this: Americans are the pliant products of a social pathology specific to the extremity of capitalism; they’re the willing subjects and the passive objects of a consumer culture induced by advertising and enabled by debt.

Like Christopher Lasch, who claimed thirty years ago that consumerism was the material condition of what he named the culture of narcissism—it was no longer an occasional personality disorder—these writers repeat the refrain because they assume it’s self-evident. Barber, for example, knows that his readers are already familiar with the master text, and so he never bothers to make an argument in Consumed: How Markets Corrupt Children, Infantilize Adults, and Swallow Citizens Whole (2007); instead, he reintroduces Lasch to an audience that might have forgotten him and proceeds directly to the requisite hyperbole: “Lasch’s account of narcissism resonates with much of what I will portray as the new capitalist ethos of infantilism. The ethos animating postmodern consumer capitalism is one of joyless compulsiveness. The modern consumer is no free-will sybarite, but a compulsory shopper driven to consumption because [sic] the future of capitalism depends on it. He is less the happy sensualist than the compulsive masturbator, a reluctant addict working at himself with little pleasure, encouraged in his labor by an ethic [not ethos?] of infantilization that releases him to a self-indulgence he cannot altogether welcome.” (51)

Sound familiar? Of course it does. Benjamin Barber, a political theorist by training, holds an endowed chair at the University of Maryland, and, according to the flap copy on his book, he “consults with political and civic leaders throughout the world on democratization, citizenship, culture, and education.” James A. Roberts is a professor of marketing at Baylor University in Waco, Texas; he’s not a communitarian critic of capitalism, and he’s never been to Camp David. But in a new book called Shiny Objects: Why We Spend Money We Don’t Have in Search of Happiness We Can’t Buy (2011), he explains the difference between intrinsic and extrinsic goods, cites Jean-Paul Sartre on the meaning of life—I am not making this up—and then reproduces Barber’s boisterous critique of consumer culture in prose that would put a ferret to sleep: “Compulsive buyers are preoccupied with the importance of money as a solution to problems and as a means of comparison. Like status consumers, they make purchases in an attempt to bring into balance the discrepancy between their identity and the lifestyle projected by various products. . . .But as compulsive buying becomes more severe in an individual, and more prevalent in our society, it causes serious personal, interpersonal, and social problems.” (102-3)

Is this strangled prose a kind of plagiarism? If I were grading Roberts, I’d have to consult my university’s guidelines under the heading of “permissible paraphrase.” But then I’d have to bet on a source, and what could I exclude from the database? Barber, a likely source, isn’t the author of the master text—his renunciation of argument is evidence of his own borrowing—he’s just another cleric with a pornographic imagination and a strong prose style. But if it’s not plagiarism, what is it, what do we call this borrowing? Is Barber’s purple prose convincing because it works at the level of rhetoric, where close observance of the conventions, speaking of pornography, permits but also requires the occasional flourish, that moment when the argument is completed not by reference to evidence but by the athletic effect of a perfect metaphor or a quick cut?

These plaintive questions, which I ask without irony, boil down to just one. Why do we—academics, journalists, artists, intellectuals, writers, editors, readers—take the master text for granted, so that the typical response to my argument on behalf of consumer culture is, “You can’t say that”?

III

Michael Fisher makes the question quite poignant in his smart, funny, and friendly review of my book. He has of course borrowed from the master text transcribed by Barber, Roberts, et al., knowing that the original was written, once upon a time, by high-brow fugitives from mass culture and learned critics of its industrial apparatus. But he has tried to translate that text, to transpose it into a new key, where we might read and listen differently. He’s not just reiterating; he’s riffing.

Fisher deftly summarizes the economic argument of Against Thrift, and, like most of the comrades on the Left who favor the idea of redistribution in the name of equality, he finds it convincing. But, again like most of the comrades, he labels it “hard-boiled” and “descriptive,” as in dispassionate and reportorial—as if my disagreements with every other explanation of the Great Recession are unimportant, as if I hadn’t chosen to argue against the conventional wisdom on the role of consumption in economic growth, as if my description of the current crisis (or any other description, for that matter) is not already an analysis with an accompanying policy agenda.

Fisher then makes a slow turn, from what he calls my “descriptive argument for why consumer culture is good for us” to what he calls the “normative argument.” At this point, the equally ancient distinction between intrinsic and extrinsic goods makes a timely appearance, and it hereafter serves as sturdy rhetorical protection against the intellectual intrusions that follow. At the gates of hell, these metaphysical niceties have always served as prayerful homilies: when nothing else abides and your soul is at stake, you can always console yourself by writing a footnote to Plato. Ask P. G. Wodehouse.

“Thankfully, Livingston is not one to shy away from ambitious intellectual tasks (he likens himself to Galileo early in the book). In ‘Part Two: The Morality of Spending,’ he unveils his normative argument for consumer culture’s goodness, this time with respect to our souls, and tries to re-designate consumption, instant gratification, and instinctual satisfaction as intrinsic moral goods.”

Or do I? Is my language a “subtle pragmatist’s trick”? It is true, I have no patience for metaphysics. I’m a pragmatist through and through, and so I don’t see how any description of any phenomenon excludes or postpones a normative argument—that is, an actionable attitude toward the object of knowledge. I also don’t see how a distinction between extrinsic and intrinsic goods holds up under the condition we call modernity, or post-modernity, when the universalization of exchange value (“reification”) is complete. But I do show that it is only in the neighborhood of consumer culture—at our leisure, after hours and at play—that we learn to treat each other as ends in themselves rather than means to the ends of our incomes or careers. In this sense, I show that what comes of buying, using, and sharing goods is better for us than what comes of producing goods under the sign of alienated labor. It beats working.

Instant gratification or instinctual satisfaction—and how, pray tell, would we gain access to our instincts?—can’t be an “intrinsic moral good” in these terms, and I never claimed either was such a good, because we can’t know anything’s value, moral or otherwise, except in retrospect, as a moment in an unfolding semiotic sequence. In other words, value, moral or otherwise, is like truth: neither can be known until exchanged, unless represented. Here is how William James put the proposition: “Day follows day, and its contents are simply added. They are not themselves true, they simply come and are. The truth is what we say about them.”

And yes, it is true, I suggested in the introduction to the book that Galileo was my hero because he wasn’t a deep thinker, just a radical empiricist looking to demonstrate the new facts made visible by his telescope. It was my clumsy way of choosing history over theory. I said that “the telescope at my disposal compresses time rather than space,” and hoped readers would, as a result, understand the obvious limits of the project rather than attribute inordinate ambitions to its author. But I now want to make those ambitions clear, because no review of the book, including Fisher’s, and no interview about it, not even at Pacifica Radio, has yet revealed the scope or the implications of the argument.

IV

I wrote this book in the hope of allowing us to see that consumption is the proper goal and the necessary limit of production. When it has been or becomes this goal and limit, the use values that consumers want can contain—not displace—the pursuit of exchange value, of wealth in the abstract. Money and credit, accordingly, can become means of exchange, not ends in themselves: the formula for capital (M-C-M′) can then give way to something like simple commodity circulation (C-M-C), something closer to the archaic yet real and pleasurable circuits of gift economies.
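For readers who want that shorthand unpacked, the two circuits, in the standard rendering of Capital, volume 1, run:

$$ M \rightarrow C \rightarrow M', \qquad M' = M + \Delta M \quad \text{(the circuit of capital)} $$
$$ C \rightarrow M \rightarrow C \quad \text{(simple commodity circulation)} $$

In the first, the increment ΔM (surplus value) is the whole point, so the process has no natural terminus; in the second, exchange ends in a use value, in consumption, which is precisely the sense in which consumption becomes the goal and the limit of production.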

This seemingly utopian urge—this hope of mine—is actually validated by the measurable trends of recent economic history, the last hundred years of development. We can make consumption the goal and the limit of production. But to do so, to accept and act on my economic argument, is to interrogate what we mean by “character.” The structure of our moral personalities is at risk in that interrogation, because the realization of desire we call “spending” and the deferral of gratification we call “saving” are both emotional achievements and material accomplishments. Max Weber and Sigmund Freud understood this social-psychological congruence, and tried, accordingly, to itemize the historical conditions of an ascetic or anal-compulsive character type that could systematically and happily abstain from the pleasures of the world. On the wings of the Owl of Minerva, they were explaining the Cartesian ego at the very moment of its dissolution.

It’s time that we followed their example—it’s time that we tried to itemize the historical conditions of new character types and the moral (not to mention political) horizons that become visible from their standpoint. But how? My procedure in Against Thrift was to begin with the economic history of the last hundred years as an indispensable preface to a defense of consumer spending and consumer culture. Redistribution was the least of my goals—it was just the first step, I thought, toward the imagination of a moral universe in which repression, denial, and delay of gratification are no longer the foundation of the social-psychological structure we recognize as “character,” and, consequently, in which any fixed boundary between inner self and outer world (the central conceit of modernity, according to Nietzsche) is erased.

So let me retrace my steps.

Private investment out of profits is an unimportant source of growth, and so the pursuit of profit as such is, as Keynes put it in 1930, a “somewhat disgusting morbidity.” It follows that the forced savings or deferred consumer choices that corporate retained earnings represent are worse than pointless; they’re destructive. It also follows that we don’t need to keep decisions about our future in the hands of those who think that the bottom line is a larger sum of exchange value—rewarding CEOs and traders with lower taxes and higher profits is a recipe for economic and moral disaster.

Let me put it as plainly as I can. The members of the investing class—we used to call them capitalists—are now as superfluous and superannuated as the European landed nobility had become by the late 18th century. They’re good for deep background, baroque settings, and self-parody if you want to write a novel or make a movie about a civilization that has already expired. Otherwise they don’t matter. Otherwise we need to get on with a future that excludes them except as public servants, as “humble, competent people, on a level with dentists,” according to the Keynesian designation of economists. We begin by redistributing income away from the 1%, toward the 99%.

What then?

We socialize investment because we need to—because we need to redefine profit to include the social consequences (the “externalities”) of investment, including the environmental consequences, and because the pattern of economic growth can no longer be determined by the insatiable needs of those who honestly believe that more money in the bank is the purpose of life and the insignia of success. That means we take responsibility for the future, or rather that we stop sacrificing the possibilities and pleasures of the present to a future held ransom by our own deference to an archaic economic model and an outmoded character type.

It means that we stop saving for a rainy day, and stop assuming that the inner-directed, anal-compulsive character is normal and, dare I say, normative.

Notice that our obligation to future generations is enlarged, not diminished, by this commitment to, and in, the present. But notice, too, that when we stop saving for a rainy day because we can, we have already begun to reconstruct our “character” in ways that move us beyond inner-direction and anal compulsion. In this sense, we have already begun to move beyond Protestant Christianity—that old work ethic—as the “deepest moral resource” of our everyday lives. So yes, of course, we have already begun to redefine individualism, the very nature of our selves, as soon as we ask who and what we’re saving for.

V

That’s what Against Thrift is about, this ongoing, incomplete, still inarticulate movement toward a new moral universe made navigable by the passage beyond what Marx and Marcuse called the realm of necessity, where hard work and emotional sacrifice add up to the cause of character and the price of civilization. Either way, in retrospect or prospect, it’s not a pretty picture—the future I sketch looks like hell itself according to Michael Fisher—but either way, we don’t have much of a choice in the matter. We can treat the differences between these pictures as moral possibilities that are real historical events and thus empirical problems, or we can continue to copy from the master text, which simply denies that consumer culture contains any possibility worth contemplating.

Fisher is of course correct to suggest that I am uninterested in “lasting salvation”—who except a dead man can tell us what that means?—and to label Christian faith as the moral adhesive of the civil rights movement. But I would insist that my godless project is in keeping with the social origins and import of this faith, indeed that it aims to complete what religion (and, in its own fashion, advertising) can only attempt. In the beginning, the criterion of need—from each according to his abilities, to each according to his needs—regulated the disposition of the church’s economic, emotional, and doctrinal resources: you were your brother’s keeper, so charity wasn’t a choice. But as the church became a going concern in the post-republican, Hellenic world, the criterion of need became politically problematic. In the absence of ways to deliver the goods to everyone—in a world dominated by disease, hunger, and poverty—this criterion became local or eschatological, either the creed of communities that had withdrawn from the larger society, or, what is practically the same thing, the ideological correlate of faith in an impending apocalypse.

We still inhabit a world dominated by disease, hunger, and poverty. But withdrawal is not an option, not anymore, because we know how to deliver the goods to everyone: we know that scarcity, whether economic or emotional, is socially contrived and culturally enforced. We’ve long since solved the problem of production; we haven’t even begun with the problem of consumption because we’re so afraid of what it will cost us in the currencies that underwrite our “character.” We can finally afford to be our brother’s keeper—we can live by the ancient criterion of need, and, in doing so, we can live up to the original challenge of Christianity. We don’t yet know how because we’re still too afraid of the material abundance that enables consumer culture.

My purpose in writing Against Thrift was to lay these fears to rest—or rather to explain them, to myself among other adults made anxious by the extremities of very late capitalism. Michael Fisher understands that, I think, because he has refused to merely reiterate the master text that has allowed so many smart people to say the same thing about consumer culture without thinking, and without evidence. He never falls back into the parental moment when “You can’t say that” sounds like the appropriate response to bad taste, bad faith, or bad manners. Still, his review would suggest that I have only inflamed our fears of the future. That makes me nervous.

Sunday, 25 March 2012

Why History Matters

Yesterday, at the fourth annual Graduate Student Symposium at the University of Texas at Dallas, Ben Alpers gave an incredible keynote address on "The Future of the Humanities." I won't summarize his talk here, but I will summarize the audience response:  Ben rocked the house.  

After Ben spoke, I presented my own paper as part of a panel addressing the theme, "Why the Humanities Matter."  My colleagues Michele Rosen and Sara Keeth did an extraordinary job of explaining the cultural value of translation studies and literary studies.  Then it fell to me -- because I had gladly if somewhat naively volunteered to do it -- to make the case for "Why History Matters."

The actual title of my talk was more elaborate, but it was no more ambitious than the daunting task I had before me:  to explain why the study of history matters to somebody besides me, and to do so while standing next to a particularly formidable American intellectual and cultural historian, Daniel Wickberg, who moderated our panel.

I have my reasons for valuing history as I do -- and I spoke about some of those reasons yesterday.  But the argument that the study of history can enrich one's life -- and it certainly can, and it certainly has -- is not sufficient to explain what history has to offer to those outside the ivied walls of the ivory tower.

What is the cultural value of being a historian?  Why should the academy continue to train historians?  Beyond the self-perpetuation of the profession, what does that get us?

Training scholars to think historically gets us people who can wade into the contentious conversations in the public sphere while retaining -- and modeling -- some sense of irony and perhaps the faintest bit of humility about the limits of one's own understanding. We get, in other words, people who would (ideally) offer a model of discourse other than the rhetoric of righteous indignation and moral or political absolutism that is constantly swirling around us.

That was the basic argument of my paper.  Keep in mind that I was explaining "thinking historically" to an audience of people who were coming from other disciplines in the humanities.  I used my own struggle to contextualize the radical abolitionism of William Lloyd Garrison as a way of framing my discussion.  So here's a slightly revised version of part of my argument, which includes some references to discussions taking place on this blog:

This effort to suspend judgment in favor of understanding is part of what it means to "think historically."  It's hard to do.  But it is a disciplinary and self-disciplinary hallmark of historians.  This professional self-discipline, Thomas Haskell explains, "requires detachment":  the ability to "suspend or bracket one's own perceptions long enough to enter sympathetically into the alien and possibly repugnant perspectives of rival thinkers,...to achieve some distance from one's own spontaneous perceptions and convictions, to imagine how the world appears in another's eyes, to experimentally adopt perspectives that do not come naturally."[1]  And, I would argue, the most unnatural perspective of all is to recognize that our own mental conception of the world -- whether it's the idea that slavery is wrong, or the idea that historians ought to approach their subjects with some measure of detachment -- is not a timeless truth; our moral and mental conceptions are ideas, and they too have a history of their own.

Keeping the history of our own mental framework in mind even as we write history practically forces us to embrace some sense of disciplinary humility.  We do our best work when we keep front and center "a conception of the limits of historical knowledge."[2] Instead of a totalizing grand narrative, or grandiose explanatory claims, historians attentive to the limitations of our own discipline would (ideally) write, Allan Megill suggests, with a "greater humility and reflexiveness with regard to the interpretation of the past" -- a "self-ironic style."[3]  This is the practice of history as a sort of tonic of humility -- an ironic self-awareness that we must not be self-righteous in our own certainties.   

Irony. Humility. Detachment.  This is what history has to offer. 

This is what history has to offer?  Oh, how the mighty have fallen!  But it is, I would argue, a fortunate fall -- though not all historians have made the leap.  Indeed, one of my colleagues on the U.S. Intellectual History blog is concerned that I seem to have lost a sense of the heuristic purpose of doing history:  to draw from the particular circumstances of the past some generally applicable principles for the present.  "Isn't that part of how we learn from history," my colleague asked me in a recent blog comment, "learn to avoid the mistakes of the past? Or," he continued, "are you a postmodernist such that you see these applications as overwrought with hazards? How do you talk truth with those who come to history seeking SOME limited universal truths? How do you sell historical thinking if you disallow present applications?"[4]  It's a fair question.  After all, I had been saying there -- and I am saying here -- that "when history veers into a discussion of what is true in a transhistorical or 'timeless' sense, it ceases to be history."[5]

Indeed, "timeless" and "historical" don't really go together. 

Instead, what history can offer is a careful, attentive, thick if not thorough interpretation of a particular past context.  The point of such interpretation is not to distill some lesson that will help us "avoid the mistakes of the past"; because time is on the move, the mistakes of the past are unrepeatable.  That's one of the implications of contingency:  each present moment is the product of innumerable choices made and chances taken by others.  So history can't repeat itself; we are wonderfully free to make new mistakes now.  How's that for a liberated future? 

That's a very liberated future.  In his famous -- and much assigned -- essay, "The Burden of History," Hayden White describes how foregrounding contingency frees up the future.  History has the "special task" of bringing people to "an awareness that their present condition was always in part a product of specifically human choices, which could therefore be changed or altered by further human action."[6]  As George Cotkin affirms, "we are born into structures of power and culture that constrain us.  But, at the same time," he continues, "we retain a degree of agency that may assimilate or change those structures to varying degrees."[7]  Intellectual history in particular foregrounds the way that such structuring ideas shape not only the world that people are born into but also the very conceptual limits of how they can imagine reacting against that world.  Grasping this dynamic -- getting some sense of the sheer weight of what people have to wade through to ever think differently about anything -- makes it easier to view individual people, past and present, with less judgment and more understanding.

...

So our job as historians is to speak to our time, to speak to our culture, by speaking faithfully about the past.

The historian's task is simple:  to reconstruct a moment, or a series of moments, from the past, and to hold that past up to view not so that we might commend or condemn it, but so that we might understand it.  That exercise -- the demonstration of how to suspend judgment in favor of understanding -- repeated again and again, defines both the process and the product of history faithfully practiced.  Leading others through that exercise with us is the pedagogical, the scholarly, and -- in a certain sense -- the pastoral work of  secular historians.  We must leave to our historical subjects the harsh and uncompromising and proudly immoderate rhetoric of moral absolutism.  Let Garrison thunder like a prophet, and let us hear him in the context of his time.  Ironically, in the context of our time, to demonstrate ironic detachment and some sense of humility about our own habitual [historical?] certainties is, in its own way, profoundly prophetic.
--------------------
[1]Thomas L. Haskell, "Objectivity is not Neutrality: Rhetoric vs. Practice in Peter Novick's That Noble Dream," History and Theory, Vol. 29, No. 2 (May 1990), 132.
[2]Allan Megill, Historical Knowledge, Historical Error: A Contemporary Guide to Practice (Chicago: University of Chicago Press, 2007), 56.
[3]Megill, 186.
[4]Tim Lacy, March 15, 2012 (10:35 a.m.), comment on Tim Lacy, “Tim's Light Reading (3-15-2012): Gretel Adorno, Richard Theodore Greener, Ontics, Defending First Principles, and Tony Judt via Jennifer Homans,” U.S. Intellectual History, March 15, 2012, http://us-intellectual-history.blogspot.com/2012/03/tims-light-reading-3-15-2012-gretel.html.
[5]L.D. Burnett, March 15, 2012 (9:15 a.m.), comment on Lacy, "Light Reading (3-15-2012)."
[6]Hayden White, "The Burden of History," History and Theory, Vol. 5, No. 2 (1966), 133.
[7]George Cotkin, "History's Moral Turn," Journal of the History of Ideas, Vol. 69, No. 2 (April 2008), 305.

Friday, 23 March 2012

The Death of a Family: Michael Haneke's Seventh Continent

Austrian filmmaker Michael Haneke makes me sick--and that is not necessarily a criticism. His best-known films, Funny Games (1997, with a searing American remake in 2008), Caché (2005), and The White Ribbon (2009), force viewers to squirm for two hours while the perversities of humanity take revenge on our liberal sensibilities. The most notorious example of Haneke's cinematic style is the American version of Funny Games, in which a happy little upper-middle-class American family is taken hostage in their summer lakeside cottage by two sadistic, upper-middle-class teenagers and tortured over the course of a weekend. Pleasant. Haneke said about the film: "Funny Games was always made with American audiences in mind, since its subject is Hollywood's attitude toward violence." In short, he asserted: "I'm trying to rape the viewer into independence." So says Haneke the contemporary provocateur.

Yet, shotgun blasts (a signature scene in Funny Games) do little for me compared to events captured in all their mundane detail in Haneke's first film, The Seventh Continent. I must confess, this film left me devastated. It took me two weeks to watch it--I found myself so unhappy that I would turn it off--and it made me consider Jim Livingston's and Daniel Rodgers's books on post-industrial America in new ways.

Haneke based his cinematic story on a real-life incident that left Austrian society (momentarily) befuddled. In the mid-1980s, a middle-class Austrian family was found dead in their suburban home, and even though there was a suicide letter from the father, people involved in the investigation refused to accept that folks similar to them might simply choose to take their own lives--including the life of their pre-teen daughter. Among the details of the case, the one that seemed to disturb people the most was something Haneke took particular care to depict in the film--the family had flushed their cash down the toilet before committing suicide. Was this the ultimate expression of alienation in a post-industrial society?

I found Haneke's film so effective because it is so quiet and normal--right up to the last fifteen minutes, of course. The film begins with the family sitting in their car as it goes through an automated carwash. You can't see the faces of the people, and no one speaks. The next scene is everyone waking--the start of a routine familiar to us all, but one that takes on a sense of foreboding because we know how the story will ultimately end--this family will wake one morning, and that day they will die. Other films have dealt with alienation from a society enthralled by consumerism or the modern state. But this film does not dwell on an unusual situation, decision, or confrontation. As Adam Bingham observes at Kinoeye:

Haneke creates an entirely original narrative syntax to convey directly the experiences of his characters as their souls are ground down in the crushing vacuum of modern existence. And also to allow the viewer the space to make their own connections and to draw their own inferences and conclusions as to what the film means and, more crucially, how relevant it is. Shocking in both form and content, this is a film about utter despair born from the everyday, the mundane.

Critic Michael Wilmington found it "a calm chronicle of hell."

But this is not some dystopian hell, like Blade Runner or The Hunger Games or, to use one of the films Jim Livingston points to, The Terminator. Of course, Haneke's film was nowhere near as popular--in fact, the entire film is available online. But the film does get to Livingston's interesting point about the New Right of the 1980s. "Conservatism in the late twentieth century," he argues, "was not a blanket endorsement of what free markets make possible; like the radicalism of the same moment in American history, it was a protest against the heartless logic of the market forces created and enforced by consumer capitalism" (The World Turned Inside Out, 56-7). Was Haneke demonstrating his conservatism?

The sadness that pervades Haneke's film saturates the viewer because it emanates through the family--an entity that we sociologically analyze and politically idealize, but also, in the end, typically use to find purpose in life. To witness the destruction of a family--the family--by means that we (or at least I) take for granted was frankly hard to watch. The family in the film is under some stress: the mother has lost her own mother relatively recently, and her brother has found it difficult to move on. The father has an opportunity to advance in his career, but at the expense of an older colleague. The daughter seems a bit dissociated from the other children at school and craves more attention from her parents. None of these elements is extraordinary, and Haneke does not attempt to pin the inevitable tragedy on a collective psychosis. Their lives had routine, but a routine that failed to connect to anything larger than themselves. They seem incapable of knowing how to grieve, why to be ambitious, and how to love.

Livingston contends that a few "hugely popular movies of the late twentieth century...require us to experience and explain" (57) the collapse of a society that used late-capitalist methods to attempt to secure the bourgeois trappings of family and community. As Livingston suggests, that bargain did not work. Daniel Rodgers offers a prognosis: conservative intellectuals "yearned for a common culture...but their ideas of society had been infiltrated by the new market metaphors, the notion of communities of choice, the narrowing of the language of obligation, and the appeal of the idea of natural, spontaneous civil society. They could desire a common culture. But only in fragmented ways could they envision the institutions that might create it" (Age of Fracture, 219).

In The Seventh Continent, family has failed, work has become a sham, and, perhaps most tragically, childhood is hopeless--fragmentation is complete. The father in the film writes in his suicide letter that "nothing is holding us here." Indeed, place has become a vacuous idea. He explains that the life they led made ending it easy.

This film is not about people being bored, or about an awakening of social consciousness, or about a heroic act against the soul-crushing authority of the state, system, or cultural mores--it is not even, necessarily, a witness to the troubles of its time. There was no big idea to explain the death of a family. Haneke asks whether we shouldn't fear that realization.