Saturday, June 30, 2012

Civil Religion and Culture Wars

The greatest culture war in American history led to its most destructive war.  Or so argued James Davison Hunter in Before the Shooting Begins, the 1994 follow-up to his provocative and influential book Culture Wars (1991).  "Culture wars always precede shooting wars," he wrote, "otherwise, as Philip Rieff reminds us, the latter wars are utter madness: outbreaks of the most severe and suicidal efforts to escape the implications of any kind of normative order.  Indeed the last time this country 'debated' issues of human life, personhood, liberty, and the rights of citizenship all together, the result was the bloodiest war ever to take place on this continent, the Civil War." (emphasis in the original)

In the abstract, then, culture wars describe the contest over the question of how a people--a society--ought to order its life together.  Hunter contended that by the early 1990s the national debate over the language people used to describe themselves, their opponents, and their nation had utterly broken down.  Thus the rhetoric of war aptly described the impasse reached in this national conversation. For our own Andrew Hartman, the culture wars describe the messy assembling of new conversations in the wake of the demise of the kind of common culture that Hunter seemed to mourn.  For Hartman, culture wars operate on many levels: rhetorical, philosophical, political, and social.  In this way, he offers the term as the defining narrative of postmodern America.

And it is a term with immediate, almost visceral connotations.  As Patrick Buchanan notoriously declared at the 1992 Republican National Convention: "There is a religious war going on in this country.  It is a cultural war, as critical to the kind of nation we shall be as the Cold War itself.  For this war is for the soul of America."  War--Soul--Nation: these are also the elements of a term that is often seen as the opposite of culture wars--civil religion.

Rhetorically, civil religion appears to be opposed to conflict and war; practically, though, it is deeply indebted to both.  For if civil religion is the appropriation of religion by politics, there is nothing more serious for politicians to do than to justify killing and dying, and nothing gets that job done better than coupling religion and war.  If we carry Hunter's statement above to its conclusion we might note how the culture wars begat the Civil War which begat an American civil religion.  Religious historian Harry Stout argued as much in his recent book, Upon the Altar of the Nation: "The Civil War taught Americans that they really were a Union, and it absolutely required a baptism of blood to unveil transcendent dimensions of that union." (xxi)

It is that transcendent understanding of America which has fractured since the 1970s.  Since then, Americans have lived with two overlapping, at times mutually influential spheres of war--the culture wars and wars to affirm an American civil religion.  War is at the heart of the relationship between culture wars and civil religion; war links them, but with an irony. The culture wars emerged in the twilight of the Cold War and the humiliating aftermath of Vietnam--in short, in the absence of war, the culture wars had room to grow.  The rise of the culture wars narrative eclipsed the civil religion that had been manufactured from the last years of the Second World War through Vietnam.  And yet, throughout the years of the culture wars--when the language of war grew dominant--a civil religion began slowly to re-emerge, constructed as it has been since the Civil War through a new American experience in war.

When 9/11 happened and God and war came roaring back into political rhetoric, religious scholar Mark Silk captured the moment appropriately.  Writing about George W. Bush's speech at the National Cathedral, Silk observed, "If civil religion is about anything, it's about war and those who die in it."  His comments echoed not merely what he heard from Bush, but what Americans have memorialized from Lincoln--the original culture warrior, who in his Gettysburg Address dedicated the killing and dying that remained in the war to those who had given their last full measure of devotion to the nation and what he hoped that nation might ultimately become. 

We live, therefore, in a space between and within the spheres of culture wars and civil religion, at once appropriating the language and power of war rhetorically to thump our opponents and rally people to our side, even as we memorialize a mythical nation through the sacrifices made in real war.  If the term 'culture wars' resonates with us more than civil religion, it might be because we understand ourselves as a people--both united and divided--through the idea of war.

Friday, June 29, 2012

Round Table on Murphy's The New Era: Final Entry--Murphy Replies


 By Paul Murphy

When anyone begins a review of your book by suggesting it is “likable and usable,” you had better start worrying, and if the reviewer is James Livingston, start sweating.  It did not take long for the other shoe to drop, although the shoe drops from a very peculiar direction.  Livingston emits a heartfelt cri de coeur.  The book may be “good, even brilliant,” but it remains merely a work of history, an attempt to “reproduce” the past, which is, in Livingston’s eyes, an obsolete and now meaningless preoccupation.  “This shit doesn’t matter anymore!”—truly a bracing line to encounter near the end of a book review.  I am indebted to all of the participants in this USIH roundtable on my book.  It is good to have Livingston as a reviewer; he is a corrective to the positive reviews by Lynn Dumenil and Kristoffer Shields (http://us-intellectual-history.blogspot.com/2012/06/round-table-on-murphys-new-era-entry-2.html), who I am sure are overly generous.  They praise the book as a clear and straightforward account of American intellectual and cultural life in the 1920s, precisely the qualities Livingston finds so hopeless.  He raises the question “What is the point of this book?” which is a good one.  The book is a text for students priced at $80.00.  Not for the casual buyer, to say the least.  It does seem wrong to force such a traditional work of History on students if, as Livingston would have it, our goal should be to “get over it.”  So what?  Who cares?  Exactly.  It is a question I often asked myself.

Livingston finds the book merely descriptive; it does not explain anything and imports the present concerns of the author into its description of past actors.  The book has no answers to the reviewer’s questions about the 1920s.  It contains, he notes with exasperation, “just fair and balanced reporting” (in the Fox News sense of “fair and balanced,” I presume):  “How is that even possible unless the author aspires to be the writer of a textbook, or unless he assumes that the distance between the college boys and the proles—sorry, the gap between intellectuals and the masses—can’t be crossed?”   (I am not sure what is impossible—that students would be able to decide for themselves whether the Harlem Renaissance was a failure or success?)  Livingston’s comment recalled for me a recent review-essay by Neil Jumonville of three new books on late twentieth-century American cultural and intellectual life, including Livingston’s new book (The World Turned Inside Out:  American Thought and Culture at the End of the 20th Century, a more dazzling exercise in cultural analysis by far) as well as Colin Harrison’s American Culture in the 1990s and Daniel T. Rodgers’s Age of Fracture. [1]  Jumonville classifies Harrison’s and Livingston’s books as “textbooks” and Rodgers’s as a “regular work of history.”  Later in the review, he allows that Livingston’s book is “too spirited to be kept inside the usual flat narrative corral that most textbooks inhabit.  The reader will find the story bucking its way over the fence before long and into the spot on the bookshelf where the regular books reside” (156).  Well, Livingston does buck a lot, so perhaps this is fair.  I suspect, however, that Livingston is unhappy to have been promoted by Jumonville to the status of a “regular.”


As it is, I am happy to be classed among the Irregulars.  I’ll stick in my corral; put me on the shelf with the other sad-sack textbook writers, near the floor and behind the waste basket.  I am reminded of another review:  I have a friend who paged through my book and merrily declared (approvingly I guess) that the book didn’t tell him anything he didn’t already know.  I have been hardened by years of teaching, and so I took it as a compliment.  The book is a textbook; it does purport to report some of the highlights of American intellectual and cultural life in the 1920s in a way useful for undergraduate and graduate students and maybe for the professor looking to compose a lecture.  If it provides an “imaginative framework for understanding the intellectual life of the 1920s,” as Dumenil suggests, or suggests perspectives that are “illuminating and thought-provoking,” as Shields declares, I’ll take the compliments, for I hoped it might do so.  It excludes too much.  Shields is right to highlight my failure to discuss Legal Realism in the 1920s.  That, and an almost complete omission of the economic thought of the time, are glaring and embarrassing omissions, driven, in part, by the dictates of space.  As Shields notes, my chosen framework—the self-discovery and self-definition of intellectuals as a distinctive class who were defined not only by their well-known alienation from their country but also by a relentless effort to use culture to move the nation closer to their ideals—may well encompass the Legal Realists in any case.  Dumenil quite fairly suggests that parts of the book do not always hang together; there are distinct chapters, about mass culture, for example, that lack the “connective tissue” needed to meld them into the larger argument about the gap-consciousness of the “strange new 20th-century stratum of purposefully superfluous individuals” (Livingston’s wonderful description of intellectuals).  The 1920s were vast—the task of fashioning a history of the period’s cultural and intellectual life, of selecting and ordering data, was challenging. One colleague suggested to me that I could use Peter Gay’s essay, Weimar Culture:  The Outsider as Insider (1968) as a model (a daunting one to be sure), but I opted for impurity, combining elements of an interpretative essay with the framework of a comprehensive text.  The results are sometimes ungainly and choppy. 

Livingston presents several provocative critiques of the work.  My definition of modernism is flawed because it presumes the intellectuals’ self-created categories of “intellectuals” and “masses” to be historically valid, when they were not.  Social classes were new at this time, he argues.  This may be so, but it was the intellectuals’ self-conception that I wanted to report.  Can I explain why twentieth-century, self-defined modernists alighted on this particular framework?  Perhaps not.  But they explained it to themselves in terms of the onrushing force of industrialization and commercial culture.  (Yes, the Southern Agrarians were haunted by the same demons as the cosmopolitan New York Intellectuals, and this is why they converged in the 1940s and 1950s, if not earlier.  Malcolm Cowley was best friends with Allen Tate and his crew and wrote parts of Exile’s Return in Tennessee.  They were all conservatives, just as they were all radicals.  The New Yorkers were Marxists; the southerners antimodernists.)  Livingston highlights technology, which I supposedly make a deus ex machina, in another example of my slipshod journalism.  Yet I (and they, I think) focused on industrialization:  the modernist critics of the 1920s knew they were being industrialized, meaning efficient production was becoming the watchword and, yes, humanistic values risked replacement by imperatives to profit, standardization, and the rest.  Livingston has me presenting Mumford and his generation of critics as anti-technological, which they were not, he argues.  This may be the case, but my point was that they were distressed by the larger socioeconomic and cultural process of industrialization.  Likewise, Livingston presents the Harlem Renaissance as a great success because black intellectuals embraced mass culture.  True—of some, at least.  There were a lot of fights about this.  Yet, as George Hutchinson points out, the Renaissance intellectuals were embedded in the broader white critical discourse of their time.  They had time for Brooks, Mencken, and all the other Young Intellectuals, and the hybridized Renaissance was a result of this collaboration (a cultural movement “in black and white”).

The focus on technology is a bit of a red herring. Livingston wants to find me obsolete, along with all of my pedantic, Irregular colleagues who write textbooks and not the “regular” books that should be attended to in our profession.  I give space to an irrelevant species of critic who, in Livingston’s view, failed by not embracing the “possibilities of modern technology,” as the successful leaders of the Harlem Renaissance did (even though, in fact, the Renaissance writers wrote poems and books and critical essays published in our soon-to-be obsolete codex form as well as in the little magazines).

There are, I think, some presentist anxieties coursing through Livingston’s review, which are absent from Dumenil and Shields.  Shields writes, “Our hope as historians is that by better understanding these gaps [the divisions caused by gender, region, race, etc.] and the intellectual and artistic responses to them, we can gain access to a better understanding of the cultural conflicts and evolutions of the decade.  This may in turn help us better understand the social and political clashes of the 20th century.”  Livingston writes:  “Isn’t it clear by now that finely wrought books like Paul Murphy’s are monuments to a comically Nietzschean will to believe—mere vestiges of the urge to make sense in the Present and of the Future by citation of the Past?”  They must have interesting seminars in the Rutgers University History Department!

I appreciate Shields’s suggestion of an alternate path into mid-twentieth-century modernism—and perhaps one that works better than my gap-consciousness—in the liberal modernists’ desperate focus on authenticity as the hoped-for basis of a renewed authority they knew they lacked but wanted to regain.  This is a good idea, and as I re-read portions of the book, references to authenticity began to jump out at me.

I fear that Livingston finds himself in the same boat.  After all, is he not urging us to “get real” and quit being self-infatuated professors who simply tell students what happened in the past, as if that were relevant?  I think he may be in the gap.  He frets that the academic professor is irrelevant and, really, beside the point.  There is more than a little Santayana here.  The gap between the intellectuals and the masses (“masses” is perhaps a poor term for all the rest of the people out there, existing nebulously in the intellectuals’ imagination of their audience) still exists, and Livingston yearns to bridge it.  His tone is postmodernist, but the anxieties seem much the same as those held by the 1920s generation of modernists and their mid-century epigones.  Academia became a redoubt for intellectuals in the 1940s and 1950s and remained so for a period of time, even for those employed in the emerging multiversities of the 1960s. As Livingston believes, the left seized the “commanding heights of higher education.” [2]  Now that is changing:  industrialization is hitting the K-12 profession very hard, as the forces of privatization grow more powerful and public commitment to funding education falters.  The rising waters are lapping at the shores of Academia.  We are about to be industrialized.  I agree with Shields.  As we fight the creative destruction wrought by “massive open online courses” and the like (Ted Babbitt, it should be noted, son of George F., was, like David Brooks and many others featured in the New York Times, a fan of “home-study courses,” promulgated in his case by the Shortcut Educational Publishing Co. of Sandpit, Iowa [3]), it will repay us to “mind the gap.”

Livingston challenges me on the claim that, after the 1920s, “culture became the essential terrain of social and political action.” [4]  I made the same statement in a conference paper, and I remember Andrew Hartman asking me about the connection between such a claim and the late twentieth-century phenomenon of the “culture wars,” which he is studying.  Livingston wants an explanation as to why this was the case.  Intellectuals at this time, Livingston argues, were not for or against things; the past is not a “sporting event.”  Yet there were fights in the 1920s, big ones.  Many were “against” the New Humanists, the conservatives of the day.  To liberal modernists, the Humanists represented a profound threat to their own critical project, which was to wield culture as a tool against industrialization in all its manifestations.  The Humanists wanted to control culture, too, but represented values the modernists bitterly opposed.  Why did the modernists of that era worry over these issues?  Culture seemed the best tool they had to change society.

If I could change one thing in the book now, it would be to clarify this point.  In the book, I spend much time reporting on the emerging anthropological notion of culture. [5]  The liberal modernists to whom I devote so much attention seem implicated in this new view.  I think this is an erroneous assumption.  One delight in reading Warren Susman’s work on the 1920s and 1930s is his profound and complex focus on the obsessions of this generation of intellectuals with culture, with communications technology, and with their relationship to both.  We all should re-read Susman. [6]  He had a better understanding of intellectuals’ conception of culture in the 1920s and 1930s (and it was his own, too, I think).  Since the 1960s, we have conceived culture as a source of empowerment, a resource available to the oppressed to resist the authority of the elite (and the source of that elite’s subtle hegemonic powers).  Culture is that which must be respected, for it validates personal identity—a complex system of symbols patterning behavior that somehow allows for personal agency by the mechanism of choice among its multifarious elements.  To mid-century modernist intellectuals, culture was a tool of a different sort, not merely a surrogate for contested moral values nor the anthropologists’ patterning of ideas and behaviors that define a society.  You did culture when you used ideas and images and symbols (images and symbols and myths were favorite words of academics of Susman’s generation and the social critics they studied) as levers to move the country in a different direction.  Culture was something intellectuals and artists used, and, in order to believe it effective, they had to assume that everyone was potentially under its influence.  There was a (potential) national mind, or “frame of reference”; better, the aim of intellectuals was precisely to create and impose such a thing.

We no longer think this way.  The “culture wars” of our times have been about the repudiation of such notions of homogeneity in the name of diversity and pluralism.  Perhaps there is a solution to Livingston’s dilemma in Susman’s analysis, however.  (Susman seems to me as much a hero of my book as Lewis Mumford.)  To Susman, writing history was doing culture.  History is part of culture, and historians inevitably create new forms of culture as they create their histories; history as culture and culture as history. [7]  As Susman famously noted, Walt Disney was as influential a force in history as FDR; the director of movie westerns, John Ford, was probably the most influential historian in the country. [8]  They certainly had wider platforms than a textbook, but Susman was not one to dismiss the cultural significance of any text, even the Irregular ones consigned to the bottom shelf.  Textbooks, given that we force our students to read them, may even be more influential than many others…and more revealing.

-------------------------------------------------------------
Notes
[1] Neil Jumonville, “Learn This Forward but Understand It Backward,” Journal of the History of Ideas, 73:1 (Jan. 2012), 146-62.  The books under review are Colin Harrison, American Culture in the 1990s (Edinburgh University Press, 2010); James Livingston, The World Turned Inside Out:  American Thought and Culture at the End of the 20th Century (Rowman & Littlefield, 2010); and Daniel T. Rodgers, Age of Fracture  (Harvard University Press, 2011).  

[2] Livingston, World Turned Inside Out, xv.

[3] Sinclair Lewis, Babbitt (1922; reprint, Bantam, 1998), 79-80.

[4] Paul V. Murphy, The New Era:  American Thought and Culture in the 1920s (Rowman & Littlefield, 2012), 10.

[5] This is a topic that has since received exhaustive analysis in John S. Gilkeson, Anthropologists and the Rediscovery of America, 1886-1965 (Cambridge University Press, 2010).

[6] Warren I. Susman, Culture as History:  The Transformation of American Society in the Twentieth Century (New York:  Pantheon, 1984).

[7] See Susman, Culture as History, xii-xiii, 3, 17-18, 185.

[8] Susman, Culture as History, 103, 197; Warren I. Susman, “Film and History:  Artifact and Experience,” Film & History, 15:2 (May 1985), 31.

The Intellectual History of College Coaching and the Penn State Scandal

Today's guest blogger, Brian M. Ingrassia, is the author of The Rise of Gridiron University: Higher Education’s Uneasy Alliance with Big-Time College Football (University Press of Kansas, March 2012). He completed his Ph.D. at the University of Illinois, Urbana-Champaign, and currently teaches at Middle Tennessee State University.

What the Intellectual History of College Coaching Can Tell Us about Penn State
By Brian Ingrassia

In recent weeks, Americans listened with horror as graphic tales of sexual abuse emanated from the criminal trial of former Penn State defensive coordinator Jerry Sandusky, who last Friday was convicted of 45 counts of molesting boys over the course of more than a decade. The scandal and its cover-up—which led to the firing of iconic coach Joe Paterno and Penn State president Graham Spanier last fall—have demonstrated the incredible power that departments of athletics hold in many universities. Here, I would like to explore the intellectual history of college coaching (yes, I argue that there is such a thing) in order to gain a fuller understanding of the Penn State fiasco and the place of big-time sports in American colleges.

First, realize that coaching was transformed from an amateur vocation into a profession in the early 1900s. Before this time, players served as coaches, or unpaid graduate advisors like Yale’s Walter Camp worked day jobs while leading their teams to gridiron success. By the 1920s, though, dozens of men—such as Amos Alonzo Stagg of the University of Chicago, Fielding Yost of Michigan, and John Heisman of Georgia Tech and Penn, among others—devoted their careers to coaching. These men had to make a living from the game. Ultimately, some of them garnered large salaries and even gained professorial rank (as did Stagg and Yost, and later Paterno) at their respective universities.

How did such men justify faculty status and outsized paychecks? In short, they portrayed themselves, in both writings and speeches, as educators skilled at the art of training young men in the ways of discipline and self-control. They were “teaching” football, but they were also—so they claimed—teaching so much more. This argument appeared in numerous popular manuals written by early-1900s coaches. For example, in Principles of Football (1922), John Heisman called the athletic field “the best laboratory known where a young man can get the training, the discipline, [and] the experience” that he would need for a successful post-college life. Such arguments were echoed throughout this genre, as coaches said that they were better able to teach discipline than anyone else—including fathers, ministers, and professors.


These men were writing at a time when American universities were growing larger and more fragmented. Once upon a time, legendary professors like Francis Wayland, Mark Hopkins, and Noah Porter had stressed the teaching of moral, mental, and physical discipline in intimate classrooms and residence halls. But by 1900, expanding universities embraced the modern division of labor and taught knowledge to students in a multitude of new disciplines: history, economics, and psychology, just to name a few. In this pragmatic era, higher education was no longer dedicated to teaching individuals moral virtue. Rather, universities were designed to train members of a population how to make contributions to a complex, highly differentiated society. Athletics seemed to fill the void that opened when the curriculum shifted from moral discipline to academic disciplines. Coaches claimed to assume the important duty of teaching young men proper behavior, ethics, morality, and self-control. The athletic department, at least in theory, became the one department of the modern university dedicated to this task.

By the early 1900s, coaches regularly asserted that football arenas (note that we are not talking about physical education courses) were essential spaces for teaching discipline. They even posited that college athletics could teach the multitude or the crowd, not just a handful of players. Thousands of fellow students and spectators might learn discipline by watching sports. Coaches like John Heisman and Princeton’s Bill Roper contended that a team could only win on Saturday afternoon if the entire student body observed training rules, thus supporting the team’s efforts throughout the whole week. Fielding Yost, in his 1905 manual Football for Player and Spectator, went even further. He wrote that a football game could inspire spectators to absorb temperate habits and forge “a spirit which reaches out from the athletic field through the campus and into the very recitation room.” The games occurring in the stadium, Michigan’s coach thus asserted, could even improve a college’s academic environment.

As farfetched as such arguments may sound, we need to keep in mind that Yost was not the only (ahem) intellectual to make such claims during the Progressive Era. Josiah Royce said something similar in a turn-of-the-century piece when he stated that properly supervised physical training could “extend its influence to large bodies of boys who, as spectators of games or as schoolmates, are more or less influenced by the athletic spirit.” All those present at expertly conducted games, in other words, could learn ethical lessons useful in modern society.

Can disciplined behavior really be taught to a few men on an athletic field during a commercial spectacle? Do tens of thousands of people really learn how to be better, more disciplined members of society in football stadiums? These are good questions, ones that would be difficult to answer in any historical context. (Not only is it dubious that football teaches players and spectators temperate habits, as Fielding Yost posited in 1905, but there is also no way to measure such an expansive claim.) Nevertheless, similar rhetoric is often utilized today. Great coaches, we are told, are not just great because they win games, recruit star athletes, or reap publicity for their universities. Rather, what makes them great is their dedication to young men—and their identities as “teachers” or mentors, as well as coaches.

Until November 2011, many Americans thought that Joe Paterno and his coaching staff, including Jerry Sandusky, were true educators dedicated to teaching young men morality and discipline—not to enriching themselves. The excesses of big-time intercollegiate athletics could more easily be overlooked when they were balanced out by feel-good programs like Penn State’s. As it turns out, though, at least one coach in Happy Valley was abusing his position and doing something perverse and antithetical to the teaching of virtue. In any case, Sandusky’s crimes—and the late Paterno’s apparent complicity in covering them up—indicate that we need to be skeptical of tropes such as that of the wise football teacher who adroitly instills disciplined behavior and good sportsmanship. After all, the questionable notion that famous, powerful, and highly compensated coaches successfully teach young men (and the public) virtuous lessons through spectator sport is one that dates back roughly a century.

Ultimately, understanding the intellectual history of college coaching can help us interrogate the place of athletics in America’s universities, a task that needs critical attention now more than ever.

Wednesday, June 27, 2012

Round Table on Murphy's The New Era: Entry 3--James Livingston


Review of Paul V. Murphy’s The New Era: American Thought and Culture in the 1920s (Lanham: Rowman and Littlefield Publishers, Inc., 2012) ISBN: 9780742549258.  267 pages.

Reviewed by James Livingston
Rutgers University


This is a likable and usable book if you want to turn political slogans into periodizing devices—“Normalcy,” “New Era,” “New Deal,” “Cold War,” “Old Left,” “Third Way,” and so forth, as if journalism really is the First Draft of History, or rather, as if your task as an historian is to reproduce the past.  Or if you think undergraduates will read this book just because you’ve assigned it.  Otherwise you might have some objections to the project as such, and that project is History as we know it—as we do it. 

I will offer my objections in the form of questions.

Begin with the premise.  What is the point of this book?  What audience outside of USIH will pay attention, and why?  Don’t get me wrong, I’ve written a book for the same series—I don’t have any answers, my book is far less readable, less enjoyable, and less important than this one.  But I think it’s time we started asking the questions.  They are miniatures or components of the question we’re addressing when we contemplate the bleak future of the university, or, for that matter, the death of The Professor, the learned, solemn sidekick of the modern slapstick individual.  The Professor is of course the guy who explains shit after you experience it, or, as the straight man, while you’re at it.
 
Then, the very idea of modernism.  “However, modernism was, first and foremost, an internal discussion among artists and writers about their own precarious social status, which resulted from a loss of a vital connection between themselves and the masses.” Hello?  Modernism was what?  OK, there was that “fragile bridge” between the new working class and the new middle class before the Great War, but both parties to the bargain were just that—new.  You might want to argue that modernism was the intellectuals’ way of coming to terms with exile, but then you’d have to explain the phenomenon of intellectuals, this strange new 20th-century stratum of purposefully superfluous individuals, and you’d have to locate their country of origin, the place they fled or the people that expelled them.  You’d have to address Pound, Eliot, Yeats, and Stein, then explain why William Carlos Williams stayed home.

And speaking of gaps between the intellectuals and the masses, let Paul Murphy introduce you to technology as the deus ex machina we all recognize as a magic trick: “in a larger sense, the battle was over what force would shape change—industrial and technological progress or culture.”  Really?  Technology is not itself a social and cultural artifact?  Historians still get to ask questions as mystifying as this: “Would the nation be defined by purely industrial and commercial imperatives or humanistic ones?”  The favorite metaphor of writers, artists, and intellectuals in the 1920s (and after) was “the machine”—it regulates Middletown, among other landmarks of that decade.  But these writers, artists, and intellectuals weren’t for or against: this was not a sporting event, regardless of how much we would like to read contemporary intellectual contests into the circumstances of a formative moment from the past.  You can’t have it both ways, saying on the one hand that we’re still speaking their language and on the other that they didn’t quite get it.  Well, OK, you can, but sooner or later readers will notice the autobiographical—or is it Oedipal?—integument.

How can you say that “after the 1920s, culture became the essential terrain of social and political action” if you’re not willing to ask why—and then venture an explanation?  The intellectuals of that decade certainly did, Lewis Mumford and W.E.B. Du Bois among them, but their explanations were derived from studies of the political economy of their own time.  Like the young Herbert Marcuse, they didn’t think that technological progress was the enemy of artistic achievement; they thought instead that such progress would liberate us from necessary labor, and thus free us from what mutilated every imagination.  But however you define culture, for then or for now—them or us—you’d better be prepared to understand what they did.

Also, the Harlem Renaissance.  Are we still willing to “explain” its failure rather than interrogate the assumption that it did fail—because those rarified uptown intellectuals never connected to the masses inside or outside Manhattan?  Or are we now with Houston A. Baker, Jr., Ann Douglas, Cheryl Wall, and George Hutchinson, thus willing to say that, Randolph Bourne notwithstanding, the promise of American life was fulfilled in Harlem in the 1920s and 30s, not postponed by the Great War?  There’s no answer in this book, just fair and balanced reporting.  How is that even possible unless the author aspires to be the writer of a textbook, or unless he assumes that the distance between the college boys and the proles—sorry, the gap between intellectuals and the masses—can’t be crossed?     

And while we’re at it, now that we’re experiencing something, what about the huge differences between Mumford, the hero of the book, and almost everybody else (apart of course from Van Wyck Brooks, the mentor)?  Of course it’s true that Mumford, Brooks, and the “Young Americans”—Waldo Frank, Harold Stearns, Paul Rosenfeld, Floyd Dell, among many others—sought an “organic” linkage to a usable past that would enable an inhabitable future, and that in doing so they were trying to ground their criticisms of the present in a cultural tradition, as Warren Susman and Martin Sklar convincingly argued long ago.  Who didn’t seek this grounding then?  Who doesn’t now?

The sorry fact is that Mumford’s usable past, not to mention those conjured by Brooks, Frank, and Dell, had more in common with the Agrarians—and with T. S. Eliot’s notion of civilization—than with the leaders of the Harlem Renaissance, who, by and large, embraced the possibilities of modern technology (oops) and, as a result, were better able to discover and appreciate their “roots” in antebellum America—and that would be the historical moment of slavery’s apogee—than the young intellectuals who gathered around the little magazines.  Mumford couldn’t find anything worth admiring in American culture after 1860, and so, like Allen Tate, William Yandell Elliott, Robert Penn Warren, and other luminaries of the very Old School convened at Vanderbilt, he valorized the 1850s, the “Golden Day” when Emerson, Thoreau, Whitman, Hawthorne, and Melville created the terms of literary debate.

So what?  Why object to such a good, even brilliant book?  Who cares?  I’ll play The Professor, you go ahead and play along, answer at will, and get as angry as you want. 

This shit doesn’t matter anymore!  Isn’t it clear by now that finely wrought books like Paul Murphy’s are monuments to a comically Nietzschean will to believe—mere vestiges of the urge to make sense in the Present and of the Future by citation of the Past?  Aren’t these books just oddly-shaped things that are soon to be placed in dioramas, alongside frogs and other endangered amphibians?  Or will they always exist as the material evidence of a deeper urge to get tenure—live forever and all that—which, as we all well know, is already a thing of the Past (not the urge, the thing itself)? 

What follows?  I don’t know, that’s not in my script, no matter how many times we’ve congratulated ourselves, as professional historians, for saying that the Future can’t be navigated without some map of the Past.  What a joke!  For now, I know that the congratulations are not yet in order, and that the joke is on us.  For too long, we have been merely reproducing the Past, as if we were well-educated preservationists with the right quotation.  It’s time we learned how to get over it, and told our fellow citizens why we did.

Tuesday, June 26, 2012

Round Table on Murphy's The New Era: Entry 2--Kristoffer Shields

Review of Paul V. Murphy’s The New Era: American Thought and Culture in the 1920s (Lanham: Rowman and Littlefield Publishers, Inc., 2012). ISBN: 9780742549258. 267 pages.

Reviewed by Kristoffer Shields, PhD Candidate
Rutgers University

Mind the Gap


Too often in the high school and university U.S. History surveys of the past, the 1920s was taught as a “gap” decade: the gap between WWI and the Great Depression, between the Progressives and the New Deal, between the Volstead Act and the end of Prohibition. The time period merited little more than a chapter in a textbook, usually titled “The Jazz Age,” “The Roaring Twenties,” or, coincidentally (or perhaps not so coincidentally), “The New Era.” Thankfully, more recent historians, picking up on the work of William Leuchtenburg (particularly in his 1958 book The Perils of Prosperity), have largely corrected this. These historians, led by Lynn Dumenil, present the 1920s as a significant turning point in American thought and culture and “work…with the premise that the decade of the 1920s illuminates fundamental issues of the 20th century.”[1] Paul Murphy clearly agrees and continues this project in his comprehensive and engaging new look at the intellectuals and intellectualism of the ‘20s, The New Era: American Thought and Culture in the 1920s.

In fact, Murphy in a sense brings the era full circle, assigning a whole new meaning to the term “gap decade.” He presents a framework that understands the decade as defined, in fact, by cultural gaps. Murphy recognizes and acknowledges the regional, class, gender, racial, and country/city gaps that existed, but he focuses on a different cultural gap—that between the growing mass culture and the intellectuals and artists attempting to guide, shape, and explain it. He describes a type of intellectual crisis, in which social commentators and elites struggled to come to terms with their changing roles in a rapidly changing cultural landscape. Murphy writes, “However, modernism was, first and foremost, an internal discussion among artists and writers about their own precarious social status, which resulted from a loss of a vital connection between themselves and the masses” (5). In particular, many of the intellectuals and artists Murphy discusses were concerned about the deleterious effects they saw in the growth of mass culture. “It was this hearty embrace of a commercialized mass culture, with its often cheap character and the seemingly crude and superficial pleasures it provided, that troubled intellectuals and widened the gap between them and the mass public” (12).


It is too simple to say that these elites were merely horrified by mass culture, though; they were also drawn to it. This was in part for practical reasons: “While modernist impulses set many American intellectuals and artists in the 1920s apart from society, the aspiration to become arbiters of cultural change promoted connection” (7). Many of these intellectuals displayed a clear desire to understand the new culture and connect with it. The gap mattered to them not just because they needed to bridge it but also because they wanted to understand it. To them, that was the key to shaping future cultural change. These intellectuals, particularly modernist liberals in Murphy’s account, refused to cede control of culture and instead fought to find ways to direct or at least impact it. Our hope as historians is that by better understanding these gaps and the intellectual and artistic responses to them, we can gain access to a better understanding of the cultural conflicts and evolutions of the decade. This may in turn help us better understand the social and political clashes of the 20th century. “After the 1920s,” Murphy writes, “culture became the essential terrain of social and political action” (10). Ultimately, Murphy is describing a battle to control culture, fought amongst conservatives, progressives, and modernist liberal intellectuals, a “battle…over what force would shape change—industrial and technological progress or culture” (10). Put another way, this is the battle over the gap, and it is the battle that would define the intellectual decade.

It is difficult to do justice to the breadth of Murphy’s research and analysis in a short review such as this. He begins by analyzing the need for intellectuals to understand and conceptualize the changes they were experiencing and describes the growth of cultural anthropology to provide that service. He then discusses the ambivalent relationship between intellectuals and the mass culture in more detail. He focuses here on the difficulties intellectuals faced, distrusting what they saw as the conformist nature of mass culture and yet still recognizing its power. Murphy writes, “Attuned to the immense power of culture, eager to repudiate the role of moral guardians, and avid proponents of personal freedom, they were profoundly distressed when the choices audiences made failed to reflect their own values or advance their aspirations for America” (71). There were many ways in which to deal with this ambivalence and Murphy presents a number of divergent ones.

Having established the existence of the culture gap and some of the complicated ways in which intellectuals generally understood and responded to it, Murphy turns to a more specific look at how different groups of intellectuals attempted to bridge the gap—or, more accurately perhaps, find space for themselves within the gap. He begins with literary intellectuals, particularly the differing approaches taken by Young American critics and the Dadaists, ultimately locating the decade’s “dominant literary currents” as “the search for symbols from the American past with which to fashion an organic American culture and the elevation of modern technology and commercial enterprise to the level of a new folklore” (105). Next, Murphy discusses the role of race, ethnicity, and immigration in creating and overcoming cultural gaps, specifically noting how “immigrants reflected the contest between conformity and personal autonomy that formed one of the chief features of modern life” (111). He then turns to an analysis of Pragmatism and the social sciences, situating them as efforts by intellectuals to drive Americans out of a “cultural abyss,” before finishing with a chapter on the battles between and within religion and science.

It is difficult to find fault with the impressive breadth of Murphy’s coverage of the intellectual trends of the time period. We all bring our biases and backgrounds as readers, however, and I personally missed a discussion of the Legal Realists, primarily because the discussion concerning the role of law in society taking place in the legal community at the time fits Murphy’s conception of the culture gap so well. When discussing the Pragmatists, for example, Murphy writes, “The pragmatists presented their ideas—which challenged belief in a single, real, unseen, and unchanging world beyond our own that vouchsafed absolute truth—as the means of intellectual and cultural regeneration” (176). This is similar to the challenge taking place in the world of law, particularly in the conversation between Roscoe Pound and Karl Llewellyn concerning the tenets of true Legal Realism.[2] Legal intellectuals were dealing with their own version of a gap between intellectuals and mass culture in their attempts to re-think both the status and the role of law in culture. This is one manifestation of the gap that seems to be missing, though really, this may be more of a suggestion of a way in which Murphy’s framework could be used in the future than a criticism of the work itself.

More important, one of the challenges for Murphy is to find ways to link such a wide assortment of intellectuals, artists, approaches, and philosophies and to deal with each sufficiently. Murphy works hard to ensure that this book is more than a list of biographical or ideological entries, and he is mostly successful. Murphy’s discussion of Gilbert Seldes, for just one example, is fascinating, showing how we can read different intellectuals not just for the substance of what they said, but for how they can be read together to better understand the debate (69). But I admit that at times it feels as though just as one is drawn into the discussion of a particular person or concept, Murphy is forced to move on. He resolves this challenge in a couple of ways. Primarily, he continually brings the narrative back to his central theme of the gap these intellectuals were facing and reminds the reader of the long-term impact of the differing approaches they took.

To me, though, another unifying concept Murphy suggests is even more interesting: the concept of authenticity. The intellectuals Murphy describes seem unmoored by the cultural changes taking place around them and are, to some extent, searching for an anchor. Most, if not all, turn to some aspect of authenticity to find that cultural stronghold. Much of the writing, playing, singing, talking, and “acting” that Murphy describes comes down to using authenticity as a way to carve out space within the cultural gap. From Murphy’s discussion of the commercial packaging of hillbilly records to his description of the attempts of Dadaists to create a usable modern “folklore” to the attempt by various groups to construct a “usable past” from which to create a new culture, authenticity is everywhere in Murphy’s account. Whether Dadaists, artists, Fundamentalists, immigrants, or intellectuals, much of the work to find space within the gap took the form of the search for (or creation of) an “authentic” self. Murphy often notes this explicitly, but at other times leaves the implication to the reader.

For just one example, Murphy writes of the Harlem Renaissance, “Much of the renewed racial consciousness in the Renaissance was similarly tied to a discovery of roots, whether a respectful attention to the southern black folk roots of Negro culture or the deeper African cultural tradition” (134). Similarly, as I noted above, Murphy elsewhere describes the dominant literary currents of the 1920s as “the search for symbols from the American past with which to fashion an organic American culture and the elevation of modern technology and commercial enterprise to the level of a new folklore” (105). Read one way, both of these approaches are a search for an authentic identity, one that could be used by these intellectuals to find authority within the cultural gap. The different intellectuals Murphy describes have different ways of trying to find what is “real,” but they are virtually all linked by the understanding that discovering or creating something “real” (or that simply appeared real) was important. This is present in Murphy’s account, but I would love to see more discussion of the different approaches to the authentic self taken by these intellectuals and how they used the concept both to understand the changes and attempt to control them. For whom was the authentic self the true project, for example, and for whom was the creation of a seemingly authentic self simply a means by which to assert cultural control?

Ultimately, though, for me, the best test for any work of History is that it is both illuminating and thought-provoking, that it both clarifies and clouds one’s understanding. Murphy’s work passes that test with flying colors. There is much to learn in Murphy’s book; it is useful for any scholar of the 20th century, particularly any graduate student. More important, though, there is also much to think about here. Like the actors themselves, the historian of this time period can sometimes feel ungrounded by the contradictions, cultural changes, and shifting foundations of the decade. We, too, are sometimes attempting to describe a cultural gap even as we struggle to keep ourselves above it. Works like this—that meet the complexities of the 1920s head-on and do not back down from the difficult contradictions—help show us the way forward.


----------------------------------------------

[1] Lynn Dumenil, The Modern Temper: American Culture and Society in the 1920s (New York: Hill and Wang, 1995), 12.

[2] For a much more complete account of the culmination of this discussion, see Karl Llewellyn, “Some Realism About Realism—Responding to Dean Pound,” 44 Harvard Law Review 1222 (1931).

Monday, June 25, 2012

Round Table on Murphy's The New Era: Entry 1--Lynn Dumenil


Review of Paul V. Murphy’s The New Era: American Thought and Culture in the 1920s (Lanham: Rowman and Littlefield Publishers, Inc., 2012). ISBN: 9780742549258.  267 pages.

Reviewed by Lynn Dumenil, Robert Glass Cleland Professor of American History
Occidental College

Paul V. Murphy’s The New Era: American Thought and Culture in the 1920s is an ambitious history of American intellectuals in the 1920s. One of its most valuable contributions is the two dozen or so short portraits of key members of the intelligentsia, a diverse group that includes Mary Follett, Margaret Mead, Robert Lynd, Jean Toomer, Nella Larsen, Walter Lippmann, H.L. Mencken, and Malcolm Cowley. These individuals made their mark on the “new era” as they grappled with the meaning of a radically transformed “modern” society.  Murphy’s starting point is the end of the “genteel tradition,” whereby intellectuals viewed themselves as, and were viewed as, cultural arbiters and moral exemplars. Massive industrial development, a voracious consumer culture, and omnipresent mass media disrupted older values, with World War I putting the finishing touches to the old order.

At the center of Murphy’s analysis is his argument that the response of intellectuals led to the new sense of modernism, which he defines as “an internal discussion among artists and writers about their own precarious social status, which resulted from a loss of a vital connection between themselves and the masses” (p. 5).  A new notion of culture as a people’s way of being reinforced this bifurcation – a point Murphy makes well in his discussion of the powerful ideas of anthropologists Franz Boas and Margaret Mead, and sociologist William F. Ogburn (the originator of the concept of “cultural lag”). This point also emerges clearly in his assessment of Robert and Helen Merrell Lynd’s approach in Middletown. Armed with this notion of culture, intellectuals viewed themselves as the interpreters of cultural change, and, Murphy argues, many sought to construct solutions to the dilemmas modern mass society had created, solutions based on tolerance, critical inquiry, and personal fulfillment.

A number of chapters offer compelling evidence for Murphy’s assessment of the gap between intellectuals and the people and their effort to close it. In chapter 3, “The Bridge,” a wide-ranging discussion that also includes H.L. Mencken, Hart Crane, Joseph Stella, and Ernest Hemingway, he examines the conflicting approaches of two groups of cultural critics for resolving this bifurcation.  On the one hand, he presents the self-styled “Young Americans” like Lewis Mumford and Alfred Stieglitz, who “aimed to tap into American cultural roots.” As Murphy writes of Mumford, he called upon the modern writer “not to depict the ‘blank reality’ of the universe but to create the veils that sustain humans in their faith and work; to undertake the task of building culture, not to recreate exile; to halt the pioneering and instead cultivate, recover and retrieve past forms; and to create the symbols that would endow a new age of personality” (p. 88). On the other hand, opposed to this organic approach rooted in the past, Murphy argues, were the Dadaists. These French-influenced “anti-art” artists and critics reveled in mass and mechanical culture. As critic Matthew Josephson wrote, “We must write for our age…the poets should be no less daring or inventive than the mechanical engineers of wartime; our literature should reflect the influences of the cinema…the saxophone” (p. 97).

Another arena of conflict for intellectuals seeking to bridge the chasm between themselves and the wider public was public policy. Here, Murphy impressively tackles the well-known issue of intellectuals’ retreat from progressive reform in the aftermath of World War I. He argues that 1920s post-progressive reformers “spent the decade reimagining a liberal politics shaped now by a recognition of the bifurcation between reform-minded elites and an indifferent, enervated, or simply ‘eclipsed’ public” (p. 153). Murphy posits two new roles that political analysts developed in the 1920s to address the problem of the “eclipsed public.” One focused on social science experts’ potential for training leaders to educate the public. Walter Lippmann, with his interest in the power of propaganda and the challenge it brought to democracy, was an exemplar of this approach, with Murphy describing him as a modern Machiavelli who proposed molding public opinion through the “creation of a new cadre of expert intelligence workers, many of whom were already working in research bureaus, legislative reference libraries, specialized lobbies funded by corporations and unions, advocacy groups, watchdog publications, and foundations” (p. 158). In contrast, a “group of academics, educators, social welfare workers, and ministers” were more optimistic about democracy and “imagined new forms of small-group, democratic decision making” (p. 152). Here, Murphy focuses on the work of Mary Parker Follett, who called for improved social integration based on small group interaction. Whatever the approach of these liberal reformers, Murphy notes that for both groups the “public became an abstract thing, a topic of study as well as a subject of analysis,” a clear indication of the “yawning gap” separating intellectuals and the public (p. 179).

While all of Murphy’s chapters offer provocative insights into aspects of the intellectual history of the 1920s, including his discussions of cultural pluralism, the Harlem Renaissance, the fundamentalist controversy, Chicago school sociologists, blues and hillbilly music, and the 1920s roots of contemporary conservative intellectuals, in some places the arguments seem to be missing connective tissue. The chapter on mass culture, which includes an interesting analysis of the silent comedies of Buster Keaton, Harold Lloyd, and Charlie Chaplin and their attention to the vulnerable “little man,” doesn’t offer enough commentary on intellectuals’ response to mass media to sustain Murphy’s argument about the gap between them and the masses.  Similarly, while the assessment of the new roles of women and changing sexual and marital patterns is well done, a lengthy discussion of the battle over the Equal Rights Amendment is not obviously related to the book’s thesis about intellectuals and modernism. At times the chapters are choppy, perhaps a result of the effort to fit too many subjects and ideas into the book’s overarching theme.

These quibbles aside, readers will find valuable assessments throughout the book. Murphy does not offer evidence for his concluding claim that “The critical concerns of the 1920s – conformity, intolerance, materialism, mass culture, propaganda, cultural repression, censorship – and the liberal values proffered in response – sexual equality, free expression, personal fulfillment, pluralism, tolerance, critical inquiry, scientific rationality, and open-mindedness – came to define the next forty years of American intellectual life” (p. 209), but the book does provide an imaginative framework for understanding the intellectual life of the 1920s.

From Rome with Love: Summing Up

By Elisabeth Lasch-Quinn


It is exciting for us that the Italy-U.S. Fulbright Commission requested permission to post links to the pieces, thus featuring the USIH blog internationally. Here are all the posts: Part I, Part II, Part III, Part IV, and Part V.

As I reread the five brief essays, I asked myself, looking back, what major points, beyond the specifics, I wished readers might take from my random meditations on my semester’s exposure to Italian intellectual life. Most of what I was getting at falls under the following headings:

1. Living with Precarity. Bad news: we are not alone in this. Good news: we are not in this alone. Italian faculty members, graduate students, and others devoted to scholarly inquiry in the Humanities face the same crisis we do. Yet, like us—like USIH—many of them are actively nurturing passions and practices in areas intersecting with American intellectual history, often in spite of a lack of institutional and moral support from above.

2. Conversation. Social critics and cultural historians must at some point generalize. Yet there is a very real fear that the line between drawing general conclusions about other people (or ourselves), on the one hand, and stereotyping or “othering” them (or ourselves), on the other, can become blurred. Instead of treating complexification as the answer to simplification, we might recognize that the veils between us and those we are analyzing (even when we have met them, and even when they are us) are as much the subject as any authentic identity they might reveal if lifted. They can only be an understood part of the subject if we are at once self-reflective and conversant—linguistically, physically, imaginatively—that is, in conversation with those whom we would like to understand and with others trying to understand them.

3. Food. The Slow Food Movement in Italy suggests that hope can be found in the blend of tradition and innovation that allows us to fuse longstanding practices of pleasure, insight, and excellence with visions and desires for changing what ails us. This transcends tired liberal-conservative dichotomies that frustrate and siphon off potentially new, dissenting, or oppositional energies. Being able to feed ourselves literally and figuratively, in ways that do not separate need from taste, is the starting point.


4. Place. As American scholars of things American, cultivating a habit of looking beyond our usual geographically delimited horizons could bring new possibilities for comradeship and conviviality. This is not the call for the internationalization of American history with which we are already familiar, but a new kind of inter-continental localism that suggests that alertness to and immersion in the particular locations in which inquiry takes place can deepen our personal-professional pursuits. The magic of face-to-face interaction and specific settings in which the intellectual arts are practiced can and should affect our scholarship.

5. Intimacy. Intellectual historical inquiry is one pathway of connection to another person and his or her inner life. The physical aspect of another person’s library, in part or whole, brings to light overlapping worlds of the abstract and the material, the mundane and the timeless. Even the most searing criticism of the past and present involves and invokes reverence, awe, humility, acknowledgement of mystery, and limits upon knowledge. Libraries confide all of this and more.

Intimacy, place, food, conversation…sounds like Italy, right? It’s the best way to live with precarity, isn’t it? No, it’s the only way.

Stepping back now, the overarching sense of things I was trying to convey in ruminating over some of my Italian adventures was just this: possibilities for life-enhancing and even at times life-saving connection can be found in places we might not necessarily look first. In other words, it is nothing more than the truism most readers of this blog probably knew long before I did, that a change of location can foster a vitally new perspective on matters one thought one already had some kind of window into—such as, in our case, intellectual life in one’s own country. After all, no one from USIH stared dumbfounded, though others certainly did, at the thought of a modern Americanist’s itinerary to study and conduct research in Rome. In sum, I was invited to Rome to teach and, predictably, Rome taught me more than I could ever have learned otherwise—and not just about Rome.

At the start of his life of Demosthenes, Plutarch ridicules the notion that “to a man's being happy it is in the first place requisite he should be born in ‘some famous city’” (Quotes from John Dryden’s translation as it appears here). As with the word “is,” that depends on what your definition of “happy” is:

But for him that would attain to true happiness, which for the most part is placed in the qualities and disposition of the mind, it is, in my opinion, of no other disadvantage to be of a mean, obscure country, than to be born of a small or plain-looking woman. For it were ridiculous to think that Iulis, a little part of Ceos, which itself is no great island, and Aegina, which an Athenian once said ought to be removed, like a small eyesore, from the port of Piraeus, should breed good actors and poets, and yet should never be able to produce a just, temperate, wise, and high-minded man. Other arts, whose end it is to acquire riches or honour, are likely enough to wither and decay in poor and undistinguished towns; but virtue, like a strong and durable plant, may take root and thrive in any place where it can lay hold of an ingenuous nature, and a mind that is industrious.

Then he goes on to talk about what living in a location, such as a major urban center, especially when not one’s place of birth, can offer. His view of travel for research is an expansive one that would call into question the asocial archival burrower of our time as the consummate professional historian.

But if any man undertake to write a history that has to be collected from materials gathered by observation and the reading of works not easy to be got in all places, nor written always in his own language, but many of them foreign and dispersed in other hands, for him, undoubtedly, it is in the first place and above all things most necessary to reside in some city of good note, addicted to liberal arts, and populous; where he may have plenty of all sorts of books, and upon inquiry may hear and inform himself of such particulars as, having escaped the pens of writers, are more faithfully preserved in the memories of men, lest his work be deficient in many things, even those which it can least dispense with.

But for me, I live in a little town…


As I return to my own little town, the small city of Syracuse, Plutarch’s words resound with an uncanny, otherworldly echo, just as the place names of Central New York will never again sound in my ears with their Italian counterparts far behind. Of his travels “in Rome and other parts of Italy,” he wrote, “that which happened to me may seem strange, though it be true; for it was not so much by the knowledge of words that I came to the understanding of things, as by my experience of things I was enabled to follow the meaning of words.” Perhaps this should be the same for us.

Elisabeth Lasch-Quinn
Syracuse, New York

Friday, June 22, 2012

The Past and Future of Public Higher Education

Today's guest blogger, Nick Strohl, has just completed his second year as a doctoral student in History and Educational Policy Studies at the University of Wisconsin-Madison. His areas of study include the history of education, American intellectual and cultural history, and higher education history and policy. His current research focus is American higher education during the interwar years.

The Past and Future of Public Higher Education
By Nick Strohl

First of all, I would like to offer my thanks to L.D., and to all of you who manage and contribute to this terrific blog, for the opportunity to write a guest post. I have been a regular reader for about a year, hovering on the margins of the discussion as one might at a lively cocktail party conversation. I do not first and foremost consider myself an intellectual historian—if, at this early stage of my graduate career, I consider myself anything—although I have always found intellectual historians to be among the smartest and most interesting folks in the room, and I always learn something new when I spend time with them. This virtual community has not disappointed in that regard.

My topic for this post is public higher education in the United States, its past and, just as importantly, its future. This is not intended to be another summary of the various ways in which the university is in “crisis,” which, arguably, it has been since Peter Abelard challenged accepted modes of theological discourse in twelfth-century France.[1] Instead, my goal is to begin a conversation about how intellectual and cultural historians can help in understanding some of the most recent changes in the landscape of American public higher education.

As I will briefly outline below, the halcyon days of public higher education (c. 1945-1970) are over. Many states have demonstrated an unwillingness or inability to fund public higher education at all levels: elite flagships are increasingly expected to make up for declining state appropriations through research dollars and tuition revenue, while community colleges struggle to meet the demand for their services. As a historian, I am interested in what these changes portend for the place of public higher education in American society. To what extent are we seeing a revision of a historic “social contract” between the state and its citizens to provide postsecondary education, as some suggest?[2] And to what extent are public institutions of higher education obligated to perform public service, especially as they come to rely less and less on taxpayer money? Finally, to whom are public institutions of higher education ultimately beholden—students, taxpayers, elected officials, faculty, others?

A long history of this subject—the meaning of public higher education in American society—might begin with a consideration of Trustees of Dartmouth College v. Woodward in 1819 or the passage of the Morrill Act in 1862. The latter example, in particular, first made explicit the duty of publicly funded institutions to serve a broader public, to pursue useful research, and to be responsive to local needs. By the middle of the twentieth century, this public mission had expanded to include the provision of mass higher education, an ideal embodied most fully by the California Master Plan (1960), but also embraced by many other states, especially in the Midwest and West.[3] By about 1970, public higher education had come to dominate the landscape of American higher education, enrolling nearly eighty percent of all American students in postsecondary institutions (up from fifty percent in 1950). As historian of higher education Roger Geiger has explained, “The English language has no word for the opposite of privatization. Yet, that is what occurred from 1945 to 1980 in American higher education (as well as other spheres). American states poured enormous resources into building public systems of higher education: flagship universities were expanded and outfitted for an extensive research role; teachers colleges grew into regional universities; public urban universities multiplied and grew; and a vast array of community colleges was built.”[4]

Today, public institutions still educate a large majority of postsecondary students (about 72 percent), but they do so in ways that, I would contend, represent a growing departure from their historic mission(s). In at least several areas, public institutions and systems—at all levels—are much less “public” than in the past: in their sources of funding, in their governance structures, and in their cost and accessibility to students, among other things. Some of these changes are most striking at the elite institutions, such as UW-Madison or UC-Berkeley, but they filter down to students at all levels, with perhaps the most important consequences for those at the margins of the public system: community college students. As a recent report from the Center for the Future of Higher Education demonstrates, budget cuts and enrollment limitations at the top of the public higher education pyramid have “cascaded” down to those students—often low-income, non-traditional, and first-generation—at the bottom. For the first time since the rise of mass public higher education in the middle of the twentieth century, willing and able high school graduates are being turned away from the very institution—the community college—that was supposed to be a last bastion of educational opportunity beyond high school.[5]

I do not have enough space to illustrate fully the trends described above, although a few facts might paint a picture. 

*Here at UW-Madison, state appropriations now make up about 15% of the university’s operating budget, down from about 27% in 1997-1998 and from a majority share of the budget several decades ago. In other states, the share of state support in university budgets can be much lower. At a recent conference on higher education here in Madison, University of Colorado-Boulder political science professor Ken Bickers reported that CU-Boulder now counts state appropriations as an astonishing 4.6% of its overall budget. For most professors and administrators at Boulder, Bickers said, the question of whether to “privatize” the flagship campus is an essentially meaningless one at this point.

*At the elite level of public higher education—the state flagships—reform is decidedly in the direction of privatization—for lack of a better term—meaning greater reliance on research dollars, public-private partnerships, patent revenue, private donations, and tuition revenue in lieu of state funds.[6] Examples of comprehensive reform plans around the country include Louisiana State’s “Flagship Coalition,” championed by Louisiana business leaders and prominent LSU alumni such as James Carville; CU-Boulder’s “Flagship 2030” plan; and Ohio’s “Enterprise University” plan. Similar plans in Wisconsin and Oregon—the “New Badger Partnership” and the “New Partnership,” respectively—have been met with greater resistance from the public, other system leaders, and the legislatures in those states. In both Wisconsin and Oregon, the failure to achieve these proposed reforms led to the abrupt departures of UW-Madison Chancellor Biddy Martin (after three years on the job) and UO President Richard Lariviere (after two years). Even more recently, University of Virginia President Teresa Sullivan became the victim of a dispute over “philosophical differences” with the UVA Board of Visitors. Sullivan, it appears, was unwilling to implement a program of zero-cost reforms, such as online learning, with the speed and alacrity expected by the Board.

*Perhaps the most “felt” change in public higher education today is its cost to students and their families. If declining state support has meant one thing at all levels of public higher education—whether at flagships or at community colleges—it is that students must pay a greater share of the cost of attendance. As rising tuition and costs have far outpaced growth in family incomes and various forms of grant-based financial aid, more students and families have had to rely on loans. A recent report by Demos, a public policy research and advocacy organization based in New York, offers an excellent overview of the deleterious effects of higher education cuts on the American middle class. Among its findings, the report explains that “states have reached a turning point in their relationship to public higher education, and the policy choices of the next few years will determine the extent to which public institutions of higher education continue to function as a bridge to the middle class for young adults, especially for those from low- and moderate-income backgrounds.”[7]

Taken together, the above trends suggest a much narrower conception of the mission of public higher education than has historically been the case, at least since the mid-twentieth century: research is increasingly valued for its potential to reach the marketplace; students are expected to pay for the cost of their own education beyond high school, even if that requires that they take on significant debt; and institutions, especially elite flagships, demand to govern themselves with less state oversight, including the freedom to set their own tuition levels. Of course, private universities have long enjoyed many of these freedoms now sought by these public universities; and private universities themselves provide some measure of public service in their teaching and research. But the aggressiveness with which public universities have moved in the direction of private models is something new, and, in my view, represents a revision of the historic mission of public higher education without an open discussion about what is in fact taking place.

This, I think, is where historians, and especially intellectual and cultural historians, can provide a service. In my view, we need to better understand not just the structural changes in American higher education in the last fifty years, but also the broader intellectual and cultural changes that have accompanied them. Why should taxpayers share some, if not all, of the cost of higher education for all? To what extent should public universities be controlled by democratically elected state legislatures, instead of being free to run their own affairs? And to what extent should taxpayers provide support for those students who cannot afford the cost of postsecondary education? These are questions that, to some extent, can be answered by economists and political scientists.[8] But they are also questions whose answers are embedded in culture and politics.

My own two cents, based on the work of Christopher Newfield (Unmaking the Public University, 2008) and Andrew Hartman (“Occupy Wall Street: A New Culture War”), is that a better understanding of the place of public higher education in American society will come with a better understanding of the meaning of the “culture wars” of the last several decades. As both Newfield and Hartman explain, the “culture wars” paradigm has tended to obscure the connection between economic and cultural issues, which are more often than not one and the same. Thus, the demand that students pay the cost of their college education—whether through loans, scholarships, parental support, work, or some combination of all of these—resonates with political discourse about personal responsibility; the desire of public flagships to operate with greater autonomy makes sense in a world where the corporate model is the gold standard; and the idea that the government should get out of the business of funding higher education aligns with the view that less government intrusion leads to more efficient market outcomes.

My question for the readers of this blog, therefore, is how one might begin to historicize the changes in public higher education since the middle of the twentieth century. Are these changes part of broader intellectual and cultural changes in the postwar era? What other frameworks might one use to understand the place of higher education, public or private, in American society today? I am interested to hear how others might begin to assess some of the trends described above, as well as how and whether these changes have been felt on the ground.

____________
[1] For an insightful assessment of the most recent spate of books on the shortcomings of American higher education, readers should see Anthony Grafton’s reviews in the New York Review of Books, particularly the one brought to the attention of this blog last fall by Andrew Hartman.

[2] See, for example, John Aubrey Douglass, The Conditions for Admission: Access, Equity, and the Social Contract of Public Universities (Stanford, CA: Stanford University Press, 2007).

[3] See John Aubrey Douglass, The California Idea and American Higher Education: 1850 to the 1960 Master Plan (Stanford, CA: Stanford University Press, 2000), 1-18.

[4] Roger L. Geiger, “Postmortem for the Current Era: Change in American Higher Education, 1980-2010,” Working Paper No. 3 (State College, PA: Center for the Study of Higher Education, Pennsylvania State University, July 2010), 3. The enrollment statistics cited above also come from this source.

[5] I am indebted to UW-Madison professor Sara Goldrick-Rab and her blog, “The Education Optimists,” for pointing me in the direction of some of the most recent reports and data on the issues discussed in this post. See http://eduoptimists.blogspot.com/ for more.

[6] Higher education researchers Sheila Slaughter and Gary Rhoades provide a detailed and comprehensive analysis of the development of these trends in the last three decades of the twentieth century in their book, Academic Capitalism: Markets, the State, and Higher Education (Baltimore, MD: Johns Hopkins University Press, 2004).

[7] Demos, The Great Cost Shift: How Higher Education Cuts Undermine the Future Middle Class (2012), 3.

[8] For a historical analysis of the connection between educational attainment and long-term economic growth, see economists Claudia Goldin and Lawrence F. Katz, The Race Between Education and Technology (Cambridge, MA: Belknap Press, 2008).



Thursday, June 21, 2012

Ray Haberski, "God and War: American Civil Religion Since 1945"


It is my pleasure to announce that regular USIH blogger Ray Haberski's new book, God and War: American Civil Religion Since 1945, has now been published. Congratulations, Ray!

Abstract: Americans have long considered their country to be good—a nation "under God" with a profound role to play in the world. Yet nothing tests that proposition like war. Raymond Haberski argues that since 1945 the common moral assumptions expressed in an American civil religion have become increasingly defined by the nation's experience with war. God and War traces how three great postwar “trials”—the Cold War, the Vietnam War, and the War on Terror—have revealed the promise and perils of an American civil religion. Throughout the Cold War, Americans combined faith in God and faith in the nation to struggle against not only communism but their own internal demons. The Vietnam War tested whether America remained a nation "under God," inspiring, somewhat ironically, an awakening among a group of religious, intellectual and political leaders to save the nation's soul. With the tenth anniversary of 9/11 behind us and the subsequent wars in Iraq and Afghanistan winding down, Americans might now explore whether civil religion can exist apart from the power of war to affirm the value of the nation to its people and the world.

Blurbs: "The idea that America has a civil religion has a notoriously slippery history. Raymond Haberski, Jr. gives us a wonderfully lucid and keenly perceptive account of how this idea has been variously appropriated and refashioned since World War II." (Gary Dorrien Union Theological Seminary and Columbia University)

"A self-proclaimed 'nation under God,' the United States has a pronounced affinity for war. In this illuminating and important book, Raymond Haberski explores the intimate and largely pernicious relationship between these two abiding aspects of American identity."(Andrew J. Bacevich Boston University)

“God and War perceptively reveals the component parts of America’s civil religion since 1945. It is a troubling story steeped in a mythical idea that the nation’s violence was blessed by God.” (John Bodnar, author of The Good War in American Memory)

"The best book on American civil religion since The Broken Covenant. Haberski takes us up to the present day, illuminating how times of war can both summon and distort civil religion." (Philip S. Gorski Yale University )

Wednesday, June 20, 2012

"Zeitgeist prevailed over hearth"

David Levering Lewis writes in the first half of his biography of W.E.B. Du Bois that the intellectual felt bound by a "covenant with his people to serve them and, if possible, save them from a future as blighted as their past had been sorrowful" and that this covenant distanced him from his wife and daughter. "Zeitgeist prevailed over hearth." (346)

It is a supposition of mine that intellectuals and other people passionately devoted to their work don't tend to make good parents (this was my justification for why there are almost no good parents in the Bible, which seemed odd to me given how much of contemporary Christian culture is precisely about how to be good parents).

I'm curious--do you pay attention to the kind of parents your intellectuals made? Am I wrong in my supposition? Do you have counter-examples of good parents who also had a passion in their lives?

Anti-feminists condemned intellectual or career-driven women for leaving their children (indeed, the question of the effect of day care on children is still a pertinent one). Men were not judged in the same way. Is consideration of intellectual men as fathers a twenty-first-century concern?


Two weeks ago, some of the commentators challenged my use of the word "tragic" to describe the relationship of Du Bois to his daughter. Lewis does a better job explaining what I was trying to get at (but he "shows" rather than "tells"--a mark of his excellent writing style):

“A theoretical feminist whose advocacy could erupt with the force of a volcano (as in “The Burden of Black Women” in the November 1907 Horizon, or in “The Damnation of Women” in the 1921 collection of essays, Darkwater), Du Bois proved to be consistently patriarchal in his role as husband and father. The all-too-commonplace truth is that he increasingly acted as a well-intentioned tyrant at best and a bullying hypocrite at worst. Over the next two years, when he found time to pay some attention to Nina and Yolande, he saw them as symbols—as Wife and Daughter, special enough to be sure, because they were his wife and daughter, and therefore the paradigmatic wife and daughter of the Talented Tenth. If his expectations of Nina were narrow, they remained exacting. She had the duty not to hinder his own private and public involvements and to follow his prescriptions for their daughter’s intellectual development. His expectations of Yolande were as exalted as they were unrealistic.
 "Daughter Yolande was to be sacrificed time and again to the cruelest of double standards. On the one hand her life, like her mother's, was controlled by the head of the family--a man whose faith in his own wisdom was serene and always unequivocal; but, whereas other late-Victorian husbands and fathers were determined to shelter their womenfolk from overexposure to education and public life, Du Bois's marching orders commanded Yolande to become superlatively educated and emancipated.... Yolande was to mature into a wise and moral Zora [a character from his novel The Quest] endowed with the cosmopolitanism of a Caroline Wynn. But there was surely something more--the sublimation of a father's loss of a son through a daughter. What the golden-haired Burghardt could have done, spunky Yolande would do as well--and with less risk, because, although it was hard to be a black woman, it was not usually fatal to be an intelligent, enterprising one, as often was the case with black men. ... Yolande would attain her goals and she would not cringe. He told her that repeatedly--in letters, at the dinner table, and during those increasingly rare bedtime sessions that she relished for the closeness between them.” (451)

I was avoiding reading Lewis until I had gone through the primary sources myself. Humph. It is a bit depressing that we come to the same conclusions and he says it so much more eloquently than I. The point of my Yolande chapter is not so much her relationship with her father (which, given the sources, is impossible to avoid) but her relationship to her international travel. I hope to offer something new with that latter piece. For better or worse, one place I differ from Lewis' interpretation of Yolande is that I am more sympathetic to her. In part two of his biography of Du Bois, Lewis writes,

“Yolande was self-indulgent, under-achieving, uncertain, chronically overweight, and often ill. She appears to have craved her father’s approval in almost exactly the proportions she sensed that her inadequacies would preclude her winning it.” (30-31)

This perception of Yolande is very Du Bois-centric. I am trying to build a picture of Yolande that incorporates her own understanding of herself and perceptions by people outside the family.