
Monday, November 5, 2012

Professional Societies in an Age of Academic Transformation


The following post is Society for U.S. Intellectual History President Paul Murphy's "President's Column" for the Fall 2012 issue of the S-USIH Notes, the Society's newsletter.  In addition to being President of S-USIH, Paul is Professor of History at Grand Valley State University and is the author of The Rebuke of History: The Southern Agrarians and American Conservative Thought (University of North Carolina Press, 2001) and The New Era: American Thought and Culture in the 1920s (Rowman & Littlefield, 2012).  He was also a founding member of this blog.


We have been in a national conversation about the future of education at all levels for a number of years now.  For those in higher education, this has meant wrenching debates about the professoriate, the future of public education, and the nature of a university education itself.  We who are professors may have reached a peak of discomfort with the latest round of high-tech start-ups in online university education, garbed as they are with the democratic promise of universally available quality education although bearing, too, the promise that the key to efficiencies and cost savings in higher education may finally be in the grasp of knowledge industry executives.  We begin to feel like E. P. Thompson’s weavers, confronted with the grim logic of technological innovation and needful again of the advice of the Assistant Commissioner for the West Riding, delivered in 1840:  “…warn them to flee from the trade, and to beware of leading their children into it, as they would beware the commission of the most atrocious of crimes.”[1]  Computerization and the internet revolution drive this conversation in part, but the roots of the debate are deeper, relating to globalization, the eclipse of public commitments to education, and, ultimately, what amounts to a new stage in the industrialization of education.  If we can replace the hardcover book with an interactive, multimedia digital data file, why not do the same with the professor?
This conversation ought to be of intense interest to the Society for U.S. Intellectual History because the transformation of the modes and manners of academic life is of necessity the transformation of American intellectual life as well, given the extent to which intellectuals are ensconced in academia.  Are we contemplating the digitization of intellectuals, as well?  More directly, as the university potentially becomes transformed into something new and quite different, becoming less and less a center for social scientific and humanistic inquiry (a theme of the concluding plenary session at the Third Annual U.S. Intellectual History Conference in 2010, entitled “Intellectual History for What?”), what is the fate of the ancillary features of academic life, such as professional societies?[2]  What is the role for a new Society such as our own at a time of such convulsive change?
            In one sense, S-USIH is a product of this transformation, born out of a listserv discussion, defined by an award-winning blog, present on social media (Facebook, Twitter), and existing organizationally, in many ways, because of the internet as a mode of communication.  S-USIH embodies the new-fangled means of connectivity.  One of our major efforts has been creating a new web portal for the Society (http://s-usih.org/), a project that Secretary Ray Haberski has been shepherding to completion.  The new site will incorporate the existing U.S. Intellectual History blog and allow for a new range of connections and opportunities.  Our conferences have been organized and publicized via electronic forms of communication and publication.  This reflects one of the Society’s founding principles:  “Using all forms of media to reach broad audiences and engender vital debate and exchange of ideas.”
            At the same time, one of the key questions confronting our membership today (which should be over 100 members by the time you read this column) is the degree to which S-USIH should be defined by an academic “industry” in the midst of internal transition and at a historical moment when “academic” work (or “knowledge production,” scientific and otherwise) is shifting to new and proliferating venues—including the laboratories and campuses of technology companies, free-standing centers and think tanks of all sorts, and various online sites.  Is academia breaking down, and will (and ought) professional societies do the same?  The question is particularly acute as we consider another big agenda item:  Ought the Society to create a new academic journal devoted to U.S. intellectual history, and should it appear in print and be produced by an academic press along the lines of previous such journals?
            The answer may be entwined with another question:  Will S-USIH breathe new life into an old form, or ought it to be something else entirely?  In seeking to “advance the historical study of American thought among academic and non-academic scholars,” we need to think seriously about the degree to which twenty-first-century internet and digital communications, the further industrialization of higher education, and the de-funding of public education might actually be disintegrating the academy as we know it.  The fact that many of those who have created S-USIH have been unable to find stable academic jobs adds poignancy and urgency to the inquiry.  Are organizations like S-USIH going to become the places where intellectual life takes root?
            As a reflexive and obdurate traditionalist in many ways (who has spent part of my academic career studying even more cranky versions of this type), I find it puzzling to be in the position of advocating a move away from the red-brick, ivy-covered college of my imagination.  I can see, however, that I neither work in such a setting nor do I imagine that it exists in many places anymore.  Though I will clutter my home with cloth- and paperbound books to my dying day, we need to think of how we preserve the essential goals of academic professional societies of the past for the future:  creating communities of scholars that are real and durable, and fostering scholarship based on respectful but rigorous standards of self-criticism.  We think we are doing this by leveraging the internet in a way that brings us together as scholars in conferences, by fostering a widely read blog that reaches beyond academia, and, now, too, by creating such things as the Annual Book Award (the first of which will be awarded next year) that allow us to celebrate accomplished scholarship.  We have further goals: to embrace the full diversity of intellectual life in our past and to broaden and make more inclusive the dialogue about that past in the present.  In this, we try to do what professional societies of the past have done.  However, professional societies have also served as adjuncts to the universities, helping in the credentialing process, designing and erecting barriers to entry into the field, laying out norms, channeling discussions into academic forums, and patrolling the borders of what is legitimate (and, in this way, achieving legitimacy for a specific field).  Are these latter tasks fruitful any longer?
            We continue as an organization devoted to revitalizing the subfield of U.S. intellectual history in the broader field of American history and in the discipline as a whole.  We are, however, a nimble and creative newcomer to this task, arriving at a moment when internet portals, such as we are creating, may become the more democratic, more accessible, and more dynamic sites for scholarship about the American past.  It may well be that our first task will be to salvage the discipline and its attendant community and traditions from the increasingly crisis-ridden and compromised academic system in which they have heretofore been nurtured.




[1] E. P. Thompson, The Making of the English Working Class (New York: Vintage, 1966), 301.
[2] See Sarah Leonard, “Intellectual History for What?” The New Inquiry, Nov. 9, 2010, http://thenewinquiry.com/features/intellectual-history-for-what/.

Thursday, August 9, 2012

James Livingston's Comedy of History

[Today I offer you a guest post from long-time USIH blog and conference participant, William "Bill" Fine. Enjoy! - TL]

In reviewing Jim Livingston’s recent USIH articles and a few posts in his Politics and Letters blog* I’m persuaded that he speaks out of a deeply personal desire to advance a view of history that supports present and future hopes---in spite of his assertion that the priority must be on “going back” to recover historical contexts. [“Museum”]

Livingston might be seen to contradict himself in declaring that the past both exists and does not exist, or that history can be a guide and yet be open to whatever meanings we might project onto it. For him, knowledge of reality is mediated by language and other cultural forms, it’s “what we can act upon,” so “the past as such [is] indistinguishable from what we have said about it.” At the same time, he makes numerous truth-claims. For example, he’s annoyed that Paul Murphy allegedly didn’t practice sufficient “estrangement” from the ‘20s context, on the assumption that it’s a foreign world. But the warrant for that truism isn’t obvious. It’s not that Murphy draws lessons from the past, it’s that he misunderstands its relation to our “different” historical moment, and so draws the wrong lessons. Maybe we can’t understand ‘20s intellectuals without “going back,” but we have to understand them before we can assume the need to do so. [“Museum”]

Both Livingston and his interlocutors soften contradictions into paradoxes that seem to capture the multi-dimensional character of historical practice. Two of Livingston’s best are: “You can’t ‘go back’ to the past without the conviction that it’s always already different from the present, but when you feel you’re just catching up with something from the past, that difference is erased”; and “the past is both real and artificial. Like God, it’s consequential because we created it. The past is what we make of it…but we make it from these raw materials, and meanwhile they make us.” [“Museum”]


Ray Haberski generously finds in the latter a way “around” the either/or of determinism versus freedom, while Dan Wickberg opines that inconsistency is a matter of perception, and at least our modern critical tools protect us from the temptations of totalism. [“Museum”] Along similar lines, William Cronon, in “Loving History,” in Perspectives on History, sees “two fundamentally competing orientations for approaching history”---summarized as “how did things get to be this way?” and “the past is a foreign country.” But he concludes that, if history is to retain its attraction for professionals and others, “we need them both.” What Livingston said of the Rutgers conference on W.A. Williams might apply here: “there was too much consensus, too little conflict.”

We could shift from logic to rhetoric, to see where and how Livingston deploys points that otherwise might be thought irreconcilable. Sometimes, in criticizing another historian, such as Paul Murphy, or advancing his own positions, he implies a fairly certain knowledge, saying that some views “deny reality,” while others accord with “the world as it actually exists.” There is no particular reason not to take him at his word, as others do in responding to his truth-claims. But at other junctures, to defend against the charge he isn’t a proper historian, he adopts a different rhetoric, asserting at one point that there’s no “practical difference between history, the past as such, and historiography, our interpretations of the past, because all we actually know about the past is contained in those interpretations.” [“Williams”]

The past has a number of discrete though related meanings in his work. Sometimes it’s the almost infinitely plastic material we shape in acts of knowing, as if the very idea of representation implies a closed reality. At other times the past possesses “legibility,” a metaphor he uses repeatedly. The term implies something pre-constituted as text-like, decipherable by historical actors and historians. So, for ‘20s intellectuals, the divide between traditional and modern “became legible…as fundamental change in the meanings of work, labor, and necessity.” [“Museum”] In his article on W.A. Williams, “historical circumstances” themselves are described as directly legible, as are both the “original intent” of the Open Door and the immanence of our “ethical principles” in that setting.

On the other hand, the past is presented as virtually created and cumulatively enriched through narrativizing. In his blog, discussing Edward St. Aubyn’s “Patrick Melrose” novels dealing with the trauma of a main character raped by his father, Livingston criticizes those who say it’s a denial of real suffering to read such events as only “fantasy,” since in themselves they are “meaningless.” [“Politics and Letters” - 5.2.12] Some might insist instead that human events are by definition meaningful, though it’s not clear how much difference it makes---to the traumatized, for instance---whether stories recount things that actually occurred. Still, if events lack their own legibility, in what sense can they be “falsified” by narratives, except by contrast to some other unnamed device? In any case, here narratives produced over time by different actors accumulate richer meanings, redeeming traumatic pasts, rendering events “significant…as moments in a meaningful sequence,” a story that can be retold. [“Politics and Letters” - 5.18.12]

Livingston’s freewheeling constructionism implies an ontological and epistemological pluralism that stands over against what he calls “metaphysical” accounts that presume an objective, realist history, a “Totalism” or “Absolute”: monolithic, deterministic, and typically declensionist. As he puts it at one point, “the truth of historical reality is always plural. The rhetorics are the reality.” [“Museum”] Perhaps he tends to universalize the interpretive chaos of our historical moment; either way, that wouldn’t prove we don’t live in a block universe after all.

Sometimes he goes further, insisting that good history confirms an emancipatory potential immanent in the historical process, and that history tends toward the good---or will if we think and act properly. For example, in his article on Williams, Livingston writes approvingly:

“Williams assumed that a post-imperialist future was legible in the original intent of what he called the “imperialism of idealism” and in the real differences over diplomatic means and ends…. Without that assumption,…his critique…stops making sense because its fulfillment would then require the evasion of the world as it actually exists [which] would not allow for history as a way of learning. If the ethical principles of a post-imperialist future do not reside in the historical circumstances…then our only honorable recourse is to repudiate and escape those circumstances….”

Later, in the same essay, he generalizes what’s at stake:

“When people start wishing that things were better and assuring you that they won’t be, largely because nothing has ever changed except for the worse, you know that you’re in the presence of Kantian radicals who lack, or despise, historical consciousness---you’re in the presence of metaphysicians who know the past cannot be a guide to the present because it is the repository of myths, lies, deceptions, and their attendant moral atrocities.”

Livingston is fond of drawing a contrast between Kant and Hegel [though he seems to mention the former by name more than the latter], which he understands in part as the contrast between a morality that stands apart from and tries to impose itself on the world, and one that sees morality “reside in and flow from historical circumstances,” as he wrote in his “Response to Cotkin.” [JHI 69, 1, April 2008] This may be taken as a descriptive sociological point---and Livingston, among others, has shown how “the social self” developed out of the interaction of Hegelian idealism and pragmatism, along with feminism. But to me this is quite different from using it to build a philosophy of history.

In “Socialism Without Socialists” [“Politics and Letters” - 7.25.12] Livingston challenges the pessimists of the left, using the terms of Kant vs. Hegel to frame a stark choice: one embraces the present, recognizes that the left is everywhere, and “sees the ethical principles of socialism residing in and flowing from historical circumstances…not as ideas imposed from elsewhere,” partly because cultural change has replaced political change. The other, from a fixed point of moralistic purity, gives up on a hopeless world, taking refuge in a saving remnant with no one to save, “leading us away from the world as it is.”

Whatever case might be made for Livingston’s argument that the left has been victorious, here he gives full-out expression to a historical teleology almost beyond intention or agency, a progressive conductorless orchestration that “recognize[s] what we ought to be doing in what we’re already doing.” [cf. his quotation of an early, still very Hegelian Dewey in the Williams article] Unless we accept these ideas, we’re denying reality and destroying all hope and possibility; the study of the past must affirm what present circumstances demand.

At the outset I speculated that Livingston is motivated by “deeply personal desire,” which of course goes far beyond what I could know, much less demonstrate, and it’s not terribly relevant anyway in assessing his work. I’m not sure how much is accountable as an act of will in depressing times, and how much flows from a secure confidence in his comic-religious, metaphysical vision. Because he so often frames his arguments in terms of the either/or of hope vs. despair, my guess is that there’s a good bit of trying in his believing. His efforts to show empirically that left-liberalism has “won” notwithstanding, my guess is that most people will see it as a framework brought to history---a leap of sorts---rather than supported by it. Indeed, if he is as much of a Hegelian as I think, it’s not easy to see how it ever could be, partly because we’re not done yet.

Anyway, seeing what comedy sometimes makes of irony, I miss a sense of the obdurate, of difficult trade-offs, the complexities of unintended consequences, the limitations of narrative---and a sense of the past as/in the present, the world to which we accommodate ourselves. I’m inclined to read his comic teleology as more a reflex of its historical moment than a way beyond.

 ---------------------------
 * I draw on the following sources and indicate them following quotations:

 USIH

- “Near Dark at the Museum,” 7/12/12; and
- “William Appleman Williams: Fifty Years After His Book on the Tragedy of American Diplomacy,” 7/19/12.

Politics and Letters

- “The Weight of the Past,” Part 1 (4/22/12), Part 2 (5/2/12), and Part 3 (5/18/12); and
- “Socialism Without Socialists, or, What’s the Matter With Leftists?” 7/25/12.

Wednesday, July 11, 2012

Near Dark at the Museum

by James Livingston

Five years ago I wrote a piece on Richard Hofstadter for boundary 2, the literary journal edited by Paul Bove out of the University of Pittsburgh.  The occasion was David S. Brown’s strangely reductive biography. Here’s how the thing began in draft.

The cultural function of the modern historian is to teach us how to learn from people with whom we differ due to historical circumstances (and these circumstances include the range of ideological commitments they can profess with plausibility).  We “go back” to the people of the past in the hope of changing our perspectives on the present and thus multiplying our choices about the future. 

But these people with whom we differ, and from whom we must learn, are, to begin with, other historians; for we can’t peek around our corner of the present as if they aren’t there, standing between us and the archive, telling us how to approach it.

No one gets to the “primary sources,” whether they’re constituted as the historical record or the literary canon, without going through the priests, scribes, librarians, professors, critics—the professionals—who created them in retrospect, in view of their own intellectual obligations and political purposes.  In this sense, history is not the past as such, just as the canon is not literature as such; it’s the ongoing argument between historians, among others, about what qualifies as an event, a document, an epoch.  It’s the endless argument about what the future holds; for the form and content of the past matter only to those with political commitments in the present, and so to the future.

Richard Hofstadter understood these obvious yet awkward facts better than anyone of his generation, even better, I think, than William Appleman Williams or Eugene D. Genovese or C. Vann Woodward, three great scholars whose published works had improbably profound political consequences in the 1950s, 1960s, and 1970s.  “Historians do not have direct access to their subjects,” as he put it in 1956.  So we don’t have to “go back” very far to appreciate Hofstadter’s lasting effects on American intellectual life.  Indeed I would suggest that we’re just now catching up with him. 

I borrowed these metaphors of going back and catching up from Lewis Mumford’s book on Herman Melville, which was published in 1929.  It’s not actually a biography: the author consciously omitted quotation marks around his subject’s utterances, so there’s no way to tell where Mumford leaves off and Melville begins.  You could of course read this book as a therapeutic answer to Van Wyck Brooks’s unfinished biography of Emerson—the writing of which drove Brooks mad.  But no matter how you read it, these metaphors are unequally useful.  You can’t “go back” to the past without the conviction that it’s always already different from the present, but when you feel you’re just catching up with something from the past, that difference is erased.

The historian’s obligations, as I understand them, are unequal in the same sense.  I can demonstrate to you how we finally caught up to Hofstadter—I can show you how fresh and immediate his methods and insights remain—but my prior task is to lead you across the historical gap that separates us from him, regardless of how close he might appear to us in chronological time or in political sensibility.  If I don’t get this (hermeneutical) priority straight, if I don’t explain why and how we have to “go back,” I’m ignoring or denying the difference between the past and the present, not to mention our more local differences with Hofstadter as historians—I’m treating him as Mumford treated Melville, as a contemporary.  In doing so, I might be producing extraordinary insights, as Mumford did in ventriloquizing Melville, but I’m not doing History, because I’m overlooking the distinction between meaning and significance (Skinner), meanwhile assuming that understanding and explanation are equivalents (White).  Or put it this way: I’m doing History in the antiquarian mode, acting as if the continuity between past and present is a given—it’s normal or natural—rather than a mere possibility that has to be both produced and proven (Nietzsche, Foucault).

So what?  For many years now, my colleagues have told me that I’m not doing History, anyway—not even the history of ideas.  Their definition of my deviance takes two venerable forms.  On the one hand, they say that I’m doing Theory or Philosophy (of History?), on the grounds, as I understand them, that feminist theory or pragmatist philosophy can’t be both the method and the object, the means and the ends, of intellectual enterprise; or rather, they can be, but the result won’t be History.  On the other hand, they say that what I’ve written about pragmatism, particularly William James and John Dewey, is “fanciful” and “imaginative,” which means either that it lacks a known referent—admissible textual evidence—or that it exceeds the permissible, disciplinary boundaries of historical interpretation by favoring the sublime over the beautiful (Kloppenberg, Westbrook). 

Either way, the colleagues are saying that what I’ve written isn’t based on the facts, documents, and events that they know are relevant to the understanding (not explanation) of the subject at hand, whether that is pragmatism or feminism or corporate capitalism.  It’s instead overdetermined by an unseemly politics of interpretation which allows me to say that pragmatism and feminism were crucial ways of comprehending the transition from proprietary to corporate capitalism as a social, political, and intellectual opportunity—as the opening act of a comedy, not the final scene of a tragedy (White, Burke).

_______________

Why, then, do I insist that I’ve been doing History all along?  Because I don’t see any practical difference between the past as such and what historians (and others) have said about it.  You know the refrain: “Of course interpretations of the past have changed, but not the past itself.”  (E. H. Carr naturalized this notion with his metaphor of the mountain that can be approached from many angles but never changes its shape.)   To which I say, really?  How would you know?  How would anybody?

Yes, there must be unknown moments from the past that still affect us—but again, how would we know without hiring a metahistorical psychoanalyst who was prepared to reveal this dreamwork to us?  The content of our thinking, no matter the object of our scrutiny, is not determined and cannot be known until it takes concrete social form, in language (signs) of some kind.  The content of our thinking about the past is no exception to this rule.  So why would we insist that the past exists apart from our thinking, writing, and teaching about it?

You might answer by saying that even before historians of class, gender, race, and sexuality built out the archive with new documents dredged from abandoned grottoes, subaltern groups were shaping the past from below, always already making history: workers did this, women did that, slaves and freedmen did it too, and eventually homosexuals got in the act.  The relevant facts were always there, right in front of us, we just didn’t notice them.  In short, the past didn’t change, our interpretations did.

Bullshit.  The past changed because the facts changed because the world changed.  You can put these nouns in any sequence you like.  The past in question here began changing for good in the 1950s, when the whole history of Reconstruction had to be revisited (“revised”) because the NAACP got the attention of the Supreme Court and the Montgomery Improvement Association meanwhile got the attention of everybody else.  When black folk became visible historical agents by organizing consumer boycotts, denouncing apartheid, and demanding the right to vote, the political past looked different, and so did the future, or rather the past looked different because the political future did.  In fact, the past was, suddenly, different.

Now the past I invoke is like any other reality, it’s what we can act upon—it’s what we can take for granted because we have decided on its scope and limits, or it’s what we have accepted without thinking, as an unspoken but effective grammar.  When we say that the reality has changed, however, we typically mean that we have changed it, and we mean this because we’re modern individuals.  The scientific revolution of the 16th and 17th centuries taught us what the Reformation did, that the condition of certainty in knowledge was experiment, in the sense that you had to manipulate objects in a purposeful manner if you were to produce the truth.  You couldn’t posit a prime mover and proceed logically from that premise to your conclusion about, say, planetary motion or salvation, you had instead to approximate the motion of bodies in space by miniaturizing and measuring them in a laboratory—or you had to claim that the Kingdom of God could be established here and now, on this earth and in these times.  Philosophers had interpreted the world differently.  Scientists and Protestants changed it.

I’m neither a scientist nor a Protestant, not anymore, and I can’t imagine how anybody would teach History as a science—actually, I can, because I served as Paul Kleppner’s TA for a year—but the point remains: and yet it moves.  The past as such, being indistinguishable from what we have said about it, changes to the precise extent that what we say about it changes.  This actionable object of knowledge—this reality—is as malleable as the mountains of West Virginia.

_________________ 

It follows, I think, that we cannot reproduce the past in any meaningful sense, at least not in good faith.  As both Werner Heisenberg and Elton Mayo discovered in the 1920s, observation is participation.  We’re acting on the past, changing the reality, whenever and however we write it up, because we enter from a world elsewhere—either we “go back”  in time or we acknowledge our separation in social space.

Paul Murphy’s book mystified and irritated me, then, because the writing exhibits no temporal or any other kind of estrangement from the historical moment we call the 1920s.  We don’t have to “go back” to understand this decade (and why do we follow the US Census in deciding on credible historical periods?), he assumes, because we’re in the same place—us intellectuals are unduly alienated from the masses, just like Christopher Lasch and Richard Rorty and Nelson Lichtenstein said we were!  Once upon a time, before and after the 1920s, we weren’t, but our fallen state is not permanent.  We don’t have to stay all modernist, we too can make an “organic” connection to a usable past!

In this excruciating, exhortative respect, Murphy’s book reads like a reproduction of the present—in exactly the same way Mumford’s “biography” of Melville did.  But these writers, artists, and intellectuals of the 1920s are not our contemporaries, no matter how close they might feel and sound.  For every one of them, the novelty of the New Era was determined by its radical break from the past, and this break became legible, for every one of them, as fundamental change in the meanings of work, labor, and necessity.  They created new connections to the past, which is to say they recreated the past as such, because they knew it had been whirled away.  Unlike them, we can take that fundamental change for granted, so an explanation of it is in order—we’ll never understand them, or for that matter ourselves, without such an explanation, without “going back” to where they stood.

________________

My colleague Jackson Lears has recently written an important historiographical intervention that intersects with what I’ve been arguing here.  The piece is called “The Trigger of History”; it was published in the Spring 2012 issue of The Hedgehog Review.

Lears wants out of the “master narrative of modernity” because he insists we can claim the future by extricating ourselves from the story told there—it is only by changing the very idea of the past, he explains, that we can hope to make a future worth inhabiting.

That master narrative was of course codified by Marx, Weber, and Freud, who were able to show, sometimes without intending to, that capitalism and modernity are pretty much the same thing, because they share a “common future orientation and a common commitment to endless dynamism.”  In explaining this affinity or linkage, the masters sought a trigger of history, “a prime mover that would explain that quickening pace, that forward thrust, wherever it occurred”—something that expelled pre-modern people from their supposedly idyllic, naturally static lives, and that meanwhile started the train of “linear progress” toward the neoliberal wreck of our time.

Lears deftly criticizes all three of the masters, but he doesn’t blame them for the bleak techno-determinism their clerical epigoni have drawn from the original texts.  In fact, he suggests that the trigger metaphor can be “a weapon in the fight against determinism” because it lets us identify who pulls it under real historical circumstances, when genuine choices are available to real human beings.  And he evades the charge of nostalgia by insisting that longing for the good old days is like boarding the train of linear progress on its return trip—it puts you back on the same track first laid by the master narrative of modernity.

Get off the train, Lears urges us.  To do so, he says, is to develop a “politics of place,” which avoids an “oppressive linearity” by sidestepping not the condition of modernity but the master narratives that have convinced us of its ubiquity and inevitability.  Practically speaking, however, the necessary move is rhetorical, because the future changes only insofar as we are able to change the past, rewrite the story, and act upon it accordingly.  Our ability in this regard depends, though, on how seriously we can take the exiles and “off-modernists” among us, mainly artists like Bruno Schulz, Joseph Cornell, Walter Benjamin, Vladimir Nabokov, but then again the small farmers of Egypt who opted out of the increasingly neoliberal agricultural regime of the late 20th century—all of them experts in getting out of the way of History.

My only objection to the argument is the how part, and this is where I’ll try, finally, to get myself in some trouble.  I agree, we have to change the past if we expect to change the future—and vice versa—but, speaking of the people with their fingers on the triggers, who can afford to ignore both by evacuating the present?  Apart from the 1%, who has enough resources to give up the gun and step off the train?  OK, those small Egyptian farmers did because they could feed themselves.  Can a working stiff, an administrative assistant, or a middle manager in the US do that?

“Determinism, in short, denies history,” Lears declares.  Really?  Are you now free to choose your destiny because today is the first day of the rest of your life?  Or does the sequence work rather differently, like this: either we acknowledge that the past weighs like a nightmare on the brain of the living, shaping and determining us in the present, forcing us back to the primal scene of our crimes, making us change our stories, or we pretend that we’re weightless, free of all ties and obligations to this hollow shell we call the past, able to sample it at our leisure as if it’s a song on the iPod?

Isn’t the question we’re asking as historians more difficult, more basic: how to refuse this either/or choice?

The past is both real and artificial.  Like God, it’s consequential because we created it.  The past is what we make of it—it’s narratives all the way down—but we make it from these raw materials, and meanwhile they make us. 

What the Hell

A few weeks ago, this blog published Jim Livingston's savage critique of Paul Murphy's new book.  The critique was part of a roundtable organized by Tim Lacy, our outgoing book review editor.  I am our current book review editor.  When I read Livingston's piece, I thought, What the hell have I gotten myself into?

I wasn't just wondering about the book review editing gig, or doubting the wisdom of the blogging gig (from which I was taking a working break).  What I really had in mind was the entire field and practice of U.S. intellectual history. 

That review scared me to death.  It was a hit job; I mean, it was a frickin' crime scene.  I was stunned, shocked into silence -- sure that I had witnessed something awful and wrong, but afraid to say a thing about it publicly.  I did write Jim Livingston privately about it, but I didn't say a thing on this blog. 

Nobody did.  The whole day that review just sat there, drawing traffic like crazy, and generating who knows what back-channel conversations -- but eliciting not a single remark from any of us, all day long.  Then at three in the morning on the day after the essay went up, a blog commenter I hadn't seen in this space before remarked (in a way) on our collective spectatorial silence, but said nothing about the spectacle itself:  a senior historian practically eviscerating a junior scholar.

Finally, the day after the review went up, Dan Wickberg took Livingston on.  I saw Wickberg's name in the sidebar, and I thought, "Thank God; we're off the hook.  The Cavalry has arrived."  If anybody exemplifies both intellectual rigor and professional comity, it is Dan Wickberg. If anybody were to speak with unquestionable authority and unwavering collegiality, I knew that Dan would be the one to do it.

Then I came to the closing lines of his comment:

"Instead of farting in the museum, give us a careful and considered argument. We're all adults here; I think we can handle it."

I could not believe my eyes.  Farting in the museum?  Dan Wickberg wrote this?

What the hell!


But of course, when I re-read the whole comment, it made perfect sense, and I could see exactly what Wickberg was trying to do.  He was basically asserting that Livingston's critique amounted to a bit of adolescent behavior meant to shock the priggish, stuffy, dusty old traditional profession of intellectual history.  By starting his comment with an offhanded "oh, hell," and ending it with a nonchalant reference to "farting in the museum" (something, alas, I will never be able to un-read), Wickberg was attempting to demonstrate that if Livingston's "screed" is falling on deaf ears, it's not because historians are scandalized by his potty-mouthed language.  It's because he's not making a convincing argument.

But if you don't bring a knife to a gun fight, I guess you don't bring a fart to a shit-slinging contest.  Livingston's reply to Wickberg's charge of "farting in the museum" was an affectionate and admiring "fuck you."

"What the hell," I thought.  "He did NOT just say that."

It's not that I was scandalized or even surprised by Livingston's reply.  (It's not that I wasn't ever so slightly amused by it, either.)  Instead, I was disappointed. I wanted to hear the argument that Dan had invited him to make.

So I wrote Jim and told him that he blew it; he missed an opportunity to make his case to someone who represents the profession in a way that is open to critique and willing to take Livingston's ideas seriously. 

"You have to admit," I wrote, "he rolled out the red carpet for you to deliver a damning indictment of the whole discipline.  A scathing Jeremiad.  But you have too much of the Prophet Ezekiel about you today.  But whenever you're ready to write a guest post, let me know."

I am very pleased to say that Jim Livingston took me up on my offer.  He has written a thoughtful essay that is predictably provocative, but perhaps profitably so as well. 

It's not a scathing jeremiad, nor is it the cryptic musing of a prophet too easily mistaken for a madman.  In this essay, which will be published here on the blog on Thursday, Livingston revisits and to a certain extent revises his earlier critique of Paul Murphy's book.  He does so as part of an insightful meditation on the practice and purpose of doing history.  Indeed, to call this piece a "historiographic essay" doesn't quite do it justice.  Such a description gets at the genre but not the gist, the type but not the tone.  Livingston's essay is historiographical, and then some -- it is an inescapably elegiac homage to the historical profession, a profession that must not be content with writing elegies.

Friday, June 29, 2012

Round Table on Murphy's The New Era: Final Entry--Murphy Replies


 By Paul Murphy

When anyone begins a review of your book by suggesting it is “likable and usable,” you had better start worrying, and if the reviewer is James Livingston, start sweating.  It did not take long for the other shoe to drop, although the shoe drops from a very peculiar direction.  Livingston emits a heartfelt cri de coeur.  The book may be “good, even brilliant,” but it remains merely a work of history, an attempt to “reproduce” the past, which is, in Livingston’s eyes, an obsolete and now meaningless preoccupation.  “This shit doesn’t matter anymore!”—truly a bracing line to encounter near the end of a book review.  I am indebted to all of the participants in this USIH roundtable on my book.  It is good to have Livingston as a reviewer; he is a corrective to the positive reviews by Lynn Dumenil and Kristoffer Shields, who I am sure are overly generous.  They praise the book as a clear and straightforward account of American intellectual and cultural life in the 1920s, precisely the qualities Livingston finds so hopeless.  He raises the question, “What is the point of this book?,” which is a good one.  The book is a text for students priced at $80.00.  Not for the casual buyer, to say the least.  It does seem wrong to force such a traditional work of History on students if, as Livingston would have it, our goal should be to “get over it.”  So what?  Who cares?  Exactly.  It is a question I often asked myself. 

Livingston finds the book merely descriptive; it does not explain anything and imports the present concerns of the author into its description of past actors.  The book has no answers to the reviewer’s questions about the 1920s.  It contains, he notes with exasperation, “just fair and balanced reporting” (in the Fox News sense of “fair and balanced,” I presume):  “How is that even possible unless the author aspires to be the writer of a textbook, or unless he assumes that the distance between the college boys and the proles—sorry, the gap between intellectuals and the masses—can’t be crossed?”   (I am not sure what is impossible—that students would be able to decide for themselves whether the Harlem Renaissance was a failure or success?)  Livingston’s comment recalled for me a recent review-essay of three new books on late twentieth-century American cultural and intellectual life by Neil Jumonville, including Livingston’s new book (The World Turned Inside Out:  American Thought and Culture at the End of the 20th Century, a more dazzling exercise in cultural analysis by far) as well as Colin Harrison’s American Culture in the 1990s and Daniel T. Rodgers’s Age of Fracture. [1]  Jumonville classifies Harrison and Livingston’s books as “textbooks” and Rodgers’s as a “regular work of history.”  Later in the review, he allows that Livingston’s book is “too spirited to be kept inside the usual flat narrative corral that most textbooks inhabit.  The reader will find the story bucking its way over the fence before long and into the spot on the bookshelf where the regular books reside” (156).  Well, Livingston does buck a lot, so perhaps this is fair.  I suspect, however, that Livingston is unhappy to have been promoted by Jumonville to the status of a “regular.” 


As it is, I am happy to be classed among the Irregulars.  I’ll stick in my corral; put me on the shelf with the other sad-sack textbook writers, near the floor and behind the waste basket.  I am reminded of another review:  I have a friend who paged through my book and merrily declared (approvingly I guess) that the book didn’t tell him anything he didn’t already know.  I have been hardened by years of teaching, and so I took it as a compliment.  The book is a textbook; it does purport to report some of the highlights of American intellectual and cultural life in the 1920s in a way useful for undergraduate and graduate students and maybe for the professor looking to compose a lecture.  If it provides an “imaginative framework for understanding the intellectual life of the 1920s,” as Dumenil suggests, or suggests perspectives that are “illuminating and thought-provoking,” as Shields declares, I’ll take the compliments, for I hoped it might do so.  It excludes too much.  Shields is right to highlight my failure to discuss Legal Realism in the 1920s.  That, and an almost complete omission of the economic thought of the time, are glaring and embarrassing omissions, driven, in part, by the dictates of space.  As Shields notes, my chosen framework—the self-discovery and self-definition of intellectuals as a distinctive class who were defined not only by their well-known alienation from their country but also by a relentless effort to use culture to move the nation closer to their ideals—may well encompass the Legal Realists in any case.  Dumenil quite fairly suggests that parts of the book do not always hang together; there are distinct chapters, about mass culture, for example, that lack the “connective tissue” needed to meld them into the larger argument about the gap-consciousness of the “strange new 20th-century stratum of purposefully superfluous individuals” (Livingston’s wonderful description of intellectuals).  
The 1920s were vast—the task of fashioning a history of the period’s cultural and intellectual life, of selecting and ordering data, was challenging. One colleague suggested to me that I could use Peter Gay’s essay, Weimar Culture:  The Outsider as Insider (1968) as a model (a daunting one to be sure), but I opted for impurity, combining elements of an interpretative essay with the framework of a comprehensive text.  The results are sometimes ungainly and choppy. 

Livingston presents several provocative critiques of the work.  My definition of modernism is flawed because it presumes the intellectuals’ self-created categories of “intellectuals” and “masses” to be historically valid, when they were not.  Social classes were new at this time, he argues.  This may be so, but it was the intellectuals’ self-conception that I wanted to report.  Can I explain why twentieth-century, self-defined modernists alighted on this particular framework?  Perhaps not.  But they explained it to themselves in terms of the onrushing force of industrialization and commercial culture.  (Yes, the Southern Agrarians were haunted by the same demons as the cosmopolitan New York Intellectuals, and this is why they converged in the 1940s and 1950s, if not earlier.  Malcolm Cowley was best friends with Allen Tate and his crew and wrote parts of Exile’s Return in Tennessee.  They were all conservatives, just as they were all radicals.  The New Yorkers were Marxists; the southerners antimodernists.)  Livingston highlights technology, which I supposedly make a deus ex machina, in another example of my slipshod journalism.  Yet, I (and they, I think) focused on industrialization:  The modernist critics of the 1920s knew they were being industrialized, meaning efficient production was becoming the watchword and, yes, humanistic values risked replacement by imperatives to profit, standardization, and the rest.  Livingston has me presenting Mumford and his generation of critics as anti-technological, which they were not, he argues.  This may be the case, but my point was that they were distressed by the larger socioeconomic and cultural process of industrialization.  Likewise, Livingston presents the Harlem Renaissance as a great success because black intellectuals embraced mass culture.  True—of some, at least.  There were a lot of fights about this.  
Yet, as George Hutchinson points out, the Renaissance intellectuals were embedded in the broader white critical discourse of their time.  They had time for Brooks, Mencken, and all the other Young Intellectuals, and the hybridized Renaissance was a result of this collaboration (a cultural movement “in black and white”). 

The focus on technology is a bit of a red herring.  Livingston wants to find me obsolete, along with all of my pedantic, Irregular colleagues who write textbooks and not the “regular” books that should be attended to in our profession.  I give space to an irrelevant species of critic who, in Livingston’s view, failed because they did not embrace the “possibilities of modern technology,” as the successful leaders of the Harlem Renaissance did (even though, in fact, the Renaissance writers wrote poems and books and critical essays published in our soon-to-be obsolete codex form as well as the little magazines). 

There are, I think, some presentist anxieties coursing through Livingston’s review, which are absent from Dumenil and Shields.  Shields writes, “Our hope as historians is that by better understanding these gaps [the divisions caused by gender, region, race, etc.] and the intellectual and artistic responses to them, we can gain access to a better understanding of the cultural conflicts and evolutions of the decade.  This may in turn help us better understand the social and political clashes of the 20th century.”  Livingston writes:  “Isn’t it clear by now that finely wrought books like Paul Murphy’s are monuments to a comically Nietzschean will to believe—mere vestiges of the urge to make sense in the Present and of the Future by citation of the Past?”  They must have interesting seminars in the Rutgers University History Department!

I appreciate Shields’s suggestion of an alternate path into mid-twentieth-century modernism—and perhaps one that works better than my gap-consciousness—in the liberal modernists’ desperate focus on authenticity as the hoped-for basis of a renewed authority they knew they lacked but wanted to regain.  This is a good idea, and as I re-read portions of the book, references to authenticity began to jump out at me.

I fear that Livingston finds himself in the same boat.  After all, is he not urging us to “get real” and quit being self-infatuated professors who simply tell students what happened in the past, as if that was relevant?  I think he may be in the gap.  He frets that the academic professor is irrelevant and, really, beside the point.  There is more than a little Santayana here.  The gap between the intellectuals and the masses (“masses” is perhaps a poor term for all the rest of the people out there, existing nebulously in the intellectuals’ imagination of their audience) still exists, and Livingston yearns to bridge it.  His tone is postmodernist, but the anxieties seem much the same as those held by the 1920s generation of modernists and their mid-century epigones.  Academia became a redoubt for intellectuals in the 1940s and 1950s and remained so for a period of time, even for those employed in the emerging multiversities of the 1960s. As Livingston believes, the left seized the “commanding heights of higher education.” [2]  Now that is changing:  Industrialization is hitting the K-12 profession very hard now, as the forces of privatization grow more powerful and public commitment to funding education falters.  The rising waters are lapping at the shores of Academia.  We are about to be industrialized.  I agree with Shields.  As we fight the creative destruction wrought by “massive open online courses” and the like (Ted Babbitt, it should be noted, son of George F., was, like David Brooks and many others featured in the New York Times, a fan of “home-study courses,” promulgated in his case by the Shortcut Educational Publishing Co. of Sandpit, Iowa [3]), it will repay us to “mind the gap.”

Livingston challenges me on the claim that, after 1920, “culture became the essential terrain of social and political action.” [4]  I made the same statement in a conference paper, and I remember Andrew Hartman asking me the connection between such a claim and the late twentieth-century phenomenon of the “culture wars,” which he is studying.  Livingston wants an explanation as to why this was the case.  Intellectuals at this time, Livingston argues, were not for or against things; the past is not a “sporting event.”  Yet, there were fights in the 1920s, big ones.  Many were “against” the New Humanists, the conservatives of the day.  To liberal modernists, the Humanists represented a profound threat to their own critical project, which was to wield culture as a tool against industrialization in all its manifestations.  Humanists wanted to control culture, too, but represented values the modernists bitterly opposed.  Why did the modernists of that era worry these issues?  Culture seemed the best tool they had to change society. 

If I could change one thing in the book now, it would be to clarify this point.  In the book, I spend much time reporting on the emerging anthropological notion of culture. [5]  The liberal modernists to whom I devote so much attention seem implicated in this new view.  I think this is an erroneous assumption.  One delight in reading Warren Susman’s work on the 1920s and 1930s is his profound and complex focus on the obsessions of this generation of intellectuals with culture, with communications technology, and with their relationship to both.  We all should re-read Susman. [6]  He had a better understanding of intellectuals’ conception of culture in the 1920s and 1930s (and it was his own, too, I think).  Since the 1960s, we have conceived culture as a source of empowerment, a resource available to the oppressed to resist the authority of the elite (and the source of that elite’s subtle hegemonic powers).  Culture is that which must be respected, for it validates personal identity—a complex system of symbols patterning behavior that somehow allows for personal agency by the mechanism of choice among its multifarious elements.  To mid-century modernist intellectuals, culture was a tool of a different sort, not merely a surrogate for contested moral values nor the anthropologists’ patterning of ideas and behaviors that define a society.  You did culture when you used ideas and images and symbols (images and symbols and myths were favorite words of academics of Susman’s generation and the social critics they studied) as levers to move the country in a different direction.  Culture was something intellectuals and artists used, and, in order to believe it effective, they had to assume that everyone was potentially under its influence.  There was a (potential) national mind, or “frame of reference”; better, the aim of intellectuals was precisely to create and impose such a thing.

We no longer think this way.  The “culture wars” of our times have been about the repudiation of such notions of homogeneity in the name of diversity and pluralism.  Perhaps there is a solution to Livingston’s dilemma in Susman’s analysis, however.  (Susman seems to me as much a hero of my book as Lewis Mumford.)  To Susman, writing history was doing culture.  History is part of culture, and historians inevitably create new forms of culture as they create their histories; history as culture and culture as history. [7]  As Susman famously noted, Walt Disney was as influential a force in history as FDR; the director of movie westerns, John Ford, was probably the most influential historian in the country. [8]  They certainly had wider platforms than a textbook, but Susman was not one to dismiss the cultural significance of any text, even the Irregular ones consigned to the bottom shelf.  Textbooks, given that we force our students to read them, may even be more influential than many others…and more revealing.

-------------------------------------------------------------
Notes
[1] Neil Jumonville, “Learn This Forward but Understand It Backward,” Journal of the History of Ideas, 73:1 (Jan. 2012), 146-62.  The books under review are Colin Harrison, American Culture in the 1990s (Edinburgh University Press, 2010); James Livingston, The World Turned Inside Out:  American Thought and Culture at the End of the 20th Century (Rowman & Littlefield, 2010); and Daniel T. Rodgers, Age of Fracture  (Harvard University Press, 2011).  

[2] Livingston, World Turned Inside Out, xv.

[3] Sinclair Lewis, Babbitt (1922; reprint, Bantam, 1998), 79-80.

[4] Paul V. Murphy, The New Era:  American Thought and Culture in the 1920s (Rowman & Littlefield, 2012), 10.

[5] This is a topic that has since received exhaustive analysis in John S. Gilkeson, Anthropologists and the Rediscovery of America, 1886-1965 (Cambridge University Press, 2010).

[6] Warren I. Susman, Culture as History:  The Transformation of American Society in the Twentieth Century (New York:  Pantheon, 1984).

[7] See Susman, Culture as History, xii-xiii, 3, 17-18, 185.

[8] Susman, Culture as History, 103, 197; Warren I. Susman, “Film and History:  Artifact and Experience,” Film & History, 15:2 (May 1985), 31.

Wednesday, June 27, 2012

Round Table on Murphy's The New Era: Entry 3--James Livingston


Review of Paul V. Murphy’s The New Era: American Thought and Culture in the 1920s (Lanham: Rowman and Littlefield Publishers, Inc., 2012). ISBN: 9780742549258. 267 pages.

Reviewed by James Livingston
Rutgers University


This is a likable and usable book if you want to turn political slogans into periodizing devices—“Normalcy,” “New Era,” “New Deal,” “Cold War,” “Old Left,” “Third Way,” and so forth, as if journalism really is the First Draft of History, or rather, as if your task as an historian is to reproduce the past.  Or if you think undergraduates will read this book just because you’ve assigned it.  Otherwise you might have some objections to the project as such, and that project is History as we know it—as we do it. 

I will offer my objections in the form of questions.

Begin with the premise.  What is the point of this book?  What audience outside of USIH will pay attention, and why?  Don’t get me wrong, I’ve written a book for the same series—I don’t have any answers, my book is far less readable, less enjoyable, and less important than this one.  But I think it’s time we started asking the questions.  They are miniatures or components of the question we’re addressing when we contemplate the bleak future of the university, or, for that matter, the death of The Professor, the learned, solemn sidekick of the modern slapstick individual.  The Professor is of course the guy who explains shit after you experience it, or, as the straight man, while you’re at it.
 
Then, the very idea of modernism.  “However, modernism was, first and foremost, an internal discussion among artists and writers about their own precarious social status, which resulted from a loss of a vital connection between themselves and the masses.” Hello?  Modernism was what?  OK, there was that “fragile bridge” between the new working class and the new middle class before the Great War, but both parties to the bargain were just that—new.  You might want to argue that modernism was the intellectuals’ way of coming to terms with exile, but then you’d have to explain the phenomenon of intellectuals, this strange new 20th-century stratum of purposefully superfluous individuals, and you’d have to locate their country of origin, the place they fled or the people that expelled them.  You’d have to address Pound, Eliot, Yeats, and Stein, then explain why William Carlos Williams stayed home.

And speaking of gaps between the intellectuals and the masses, let Paul Murphy introduce you to technology as the deus ex machina we all recognize as a magic trick: “in a larger sense, the battle was over what force would shape change—industrial and technological progress or culture.”  Really?  Technology is not itself a social and cultural artifact?  Historians still get to ask questions as mystifying as this: “Would the nation be defined by purely industrial and commercial imperatives or humanistic ones?”  The favorite metaphor of writers, artists, and intellectuals in the 1920s (and after) was “the machine”—it regulates Middletown, among other landmarks of that decade.  But these writers, artists, and intellectuals weren’t for or against: this was not a sporting event, regardless of how much we would like to read contemporary intellectual contests into the circumstances of a formative moment from the past.  You can’t have it both ways, saying on the one hand that we’re still speaking their language and on the other that they didn’t quite get it.  Well, OK, you can, but sooner or later readers will notice the autobiographical—or is it Oedipal?—integument.

How can you say that “after the 1920s, culture became the essential terrain of social and political action” if you’re not willing to ask why—and then venture an explanation?  The intellectuals of that decade certainly did, Lewis Mumford and W.E.B. Du Bois among them, but their explanations were derived from studies of the political economy of their own time.  Like the young Herbert Marcuse, they didn’t think that technological progress was the enemy of artistic achievement; they thought instead that such progress would liberate us from necessary labor, and thus free us from what mutilated every imagination.  But however you define culture, for then or for now—them or us—you’d better be prepared to understand what they did.

Also, the Harlem Renaissance.  Are we still willing to “explain” its failure rather than interrogate the assumption that it did fail—because those rarified uptown intellectuals never connected to the masses inside or outside Manhattan?  Or are we now with Houston A. Baker, Jr., Ann Douglas, Cheryl Wall, and George Hutchinson, thus willing to say that, Randolph Bourne notwithstanding, the promise of American life was fulfilled in Harlem in the 1920s and 30s, not postponed by the Great War?  There’s no answer in this book, just fair and balanced reporting.  How is that even possible unless the author aspires to be the writer of a textbook, or unless he assumes that the distance between the college boys and the proles—sorry, the gap between intellectuals and the masses—can’t be crossed?     

And while we’re at it, now that we’re experiencing something, what about the huge differences between Mumford, the hero of the book, and almost everybody else (apart of course from Van Wyck Brooks, the mentor)?  Of course it’s true that Mumford, Brooks, and the “Young Americans”—Waldo Frank, Harold Stearns, Paul Rosenfeld, Floyd Dell, among many others—sought an “organic” linkage to a usable past that would enable an inhabitable future, and that in doing so they were trying to ground their criticisms of the present in a cultural tradition, as Warren Susman and Martin Sklar convincingly argued long ago.  Who didn’t seek this grounding then?  Who doesn’t now?

The sorry fact is that Mumford’s usable past, not to mention those conjured by Brooks, Frank, and Dell, had more in common with the Agrarians—and with T. S. Eliot’s notion of civilization—than with the leaders of the Harlem Renaissance, who, by and large, embraced the possibilities of modern technology (oops) and, as a result, were better able to discover and appreciate their “roots” in antebellum America—and that would be the historical moment of slavery’s apogee—than the young intellectuals who gathered around the little magazines.  Mumford couldn’t find anything worth admiring in American culture after 1860, and so, like Allen Tate, William Yandell Elliott, Robert Penn Warren, and other luminaries of the very Old School convened at Vanderbilt, he valorized the 1850s, the “Golden Day” when Emerson, Thoreau, Whitman, Hawthorne, and Melville created the terms of literary debate.

So what?  Why object to such a good, even brilliant book?  Who cares?  I’ll play The Professor, you go ahead and play along, answer at will, and get as angry as you want. 

This shit doesn’t matter anymore!  Isn’t it clear by now that finely wrought books like Paul Murphy’s are monuments to a comically Nietzschean will to believe—mere vestiges of the urge to make sense in the Present and of the Future by citation of the Past?  Aren’t these books just oddly-shaped things that are soon to be placed in dioramas, alongside frogs and other endangered amphibians?  Or will they always exist as the material evidence of a deeper urge to get tenure—live forever and all that—which, as we all well know, is already a thing of the Past (not the urge, the thing itself)? 

What follows?  I don’t know, that’s not in my script, no matter how many times we’ve congratulated ourselves, as professional historians, for saying that the Future can’t be navigated without some map of the Past.  What a joke!  For now, I know that the congratulations are not yet in order, and that the joke is on us.  For too long, we have been merely reproducing the Past, as if we were well-educated preservationists with the right quotation.  It’s time we learned how to get over it, and told our fellow citizens why we did.

Tuesday, June 26, 2012

Round Table on Murphy's The New Era: Entry 2--Kristoffer Shields

Review of Paul V. Murphy’s The New Era: American Thought and Culture in the 1920s (Lanham: Rowman and Littlefield Publishers, Inc., 2012). ISBN: 9780742549258. 267 pages.

Reviewed by Kristoffer Shields, PhD Candidate
Rutgers University

Mind the Gap


Too often in the high school and university U.S. History surveys of the past, the 1920s was taught as a “gap” decade: the gap between WWI and the Great Depression, between the Progressives and the New Deal, between the Volstead Act and the end of Prohibition. The time period merited little more than a chapter in a textbook, usually titled “The Jazz Age,” “The Roaring Twenties,” or, coincidentally (or perhaps not so coincidentally), “The New Era.” Thankfully, more recent historians, picking up on the work of William Leuchtenburg (particularly in his 1958 book The Perils of Prosperity), have largely corrected this. These historians, led by Lynn Dumenil, present the 1920s as a significant turning point in American thought and culture and “work…with the premise that the decade of the 1920s illuminates fundamental issues of the 20th century.”[1] Paul Murphy clearly agrees and continues this project in his comprehensive and engaging new look at the intellectuals and intellectualism of the ‘20s, The New Era: American Thought and Culture in the 1920s.

In fact, Murphy in a sense brings the era full circle, assigning a whole new meaning to the term “gap decade.” He presents a framework that understands the decade as defined, in fact, by cultural gaps. Murphy recognizes and acknowledges the regional, class, gender, racial, and country/city gaps that existed, but he focuses on a different cultural gap—that between the growing mass culture and the intellectuals and artists attempting to guide, shape, and explain it. He describes a type of intellectual crisis, in which social commentators and elites struggled to come to terms with their shifting roles in a rapidly changing cultural landscape. Murphy writes, “However, first and foremost, modernism was an internal discussion among artists and writers about their own precarious social statuses, which resulted from a loss of a vital connection between themselves and the masses” (5). In particular, many of the intellectuals and artists Murphy discusses were concerned about the deleterious effects they saw in the growth of mass culture. “It was this hearty embrace of a commercialized mass culture, with its often cheap character and the seemingly crude and superficial pleasures it provided, that troubled intellectuals and widened the gap between them and the mass public” (12).


It is too simple to say that these elites were horrified by mass culture, though; they were also drawn to it. This was in part for practical reasons: “While modernist impulses set many American intellectuals and artists in the 1920s apart from society, the aspiration to become arbiters of cultural change promoted connection” (7). Many of these intellectuals displayed a clear desire to understand the new culture and connect with it. The gap mattered to them not just because they needed to bridge it but also because they wanted to understand it. To them, that was the key to shaping future cultural change. These intellectuals, particularly modernist liberals in Murphy’s account, refused to cede control of culture and instead fought to find ways to direct or at least impact it. Our hope as historians is that by better understanding these gaps and the intellectual and artistic responses to them, we can gain access to a better understanding of the cultural conflicts and evolutions of the decade. This may in turn help us better understand the social and political clashes of the 20th century. “After the 1920s,” Murphy writes, “culture became the essential terrain of social and political action” (10). Ultimately, Murphy is describing a battle to control culture, fought amongst conservatives, progressives, and modernist liberal intellectuals, a “battle…over what force would shape change—industrial and technological progress or culture” (10). Put another way, this is the battle over the gap, and it is the battle that would define the intellectual decade.

It is difficult to do justice to the breadth of Murphy’s research and analysis in a short review such as this. He begins by analyzing the need for intellectuals to understand and conceptualize the changes they were experiencing and describes the growth of cultural anthropology to provide that service. He then discusses the ambivalent relationship between intellectuals and the mass culture in more detail. He focuses here on the difficulties intellectuals faced, distrusting what they saw as the conformist nature of mass culture and yet still recognizing its power. Murphy writes, “Attuned to the immense power of culture, eager to repudiate the role of moral guardians, and avid proponents of personal freedom, they were profoundly distressed when the choices audiences made failed to reflect their own values or advance their aspirations for America” (71). There were many ways in which to deal with this ambivalence and Murphy presents a number of divergent ones.

Having established the existence of the culture gap and some of the complicated ways in which intellectuals generally understood and responded to it, Murphy turns to a more specific look at how different groups of intellectuals attempted to bridge the gap—or, more accurately perhaps, find space for themselves within the gap. He begins with literary intellectuals, particularly the differing approaches taken by Young American critics and the Dadaists, ultimately locating the decade’s “dominant literary currents” as “the search for symbols from the American past with which to fashion an organic American culture and the elevation of modern technology and commercial enterprise to the level of a new folklore” (105). Next, Murphy discusses the role of race, ethnicity, and immigration in creating and overcoming cultural gaps, specifically noting how “immigrants reflected the contest between conformity and personal autonomy that formed one of the chief features of modern life” (111). He then turns to an analysis of Pragmatism and the social sciences, situating them as efforts by intellectuals to drive Americans out of a “cultural abyss” before finishing with a chapter on the battles between and within religion and science.

It is difficult to find fault with the impressive breadth of Murphy’s coverage of the intellectual trends of the time period. We all bring our biases and backgrounds as readers, however, and I personally missed a discussion of the Legal Realists, primarily because the discussion concerning the role of law in society taking place in the legal community at the time fits Murphy’s conception of the culture gap so well. When discussing the Pragmatists, for example, Murphy writes, “The pragmatists presented their ideas—which challenged belief in a single, real, unseen, and unchanging world beyond our own that vouchsafed absolute truth—as the means of intellectual and cultural regeneration” (176). This is similar to the challenge taking place in the world of law, particularly in the conversation between Roscoe Pound and Karl Llewellyn concerning the tenets of true Legal Realism.[2] Legal intellectuals were dealing with their own version of a gap between intellectuals and mass culture in their attempts to re-think both the status and the role of law in culture. This is one manifestation of the gap that seems to be missing, though really, this may be more of a suggestion of a way in which Murphy’s framework could be used in the future than a criticism of the work itself.

More important, one of the challenges for Murphy is to find ways to link such a wide assortment of intellectuals, artists, approaches, and philosophies and to deal with each sufficiently. Murphy works hard to ensure that this book is more than a list of biographical or ideological entries, and he is mostly successful. Murphy’s discussion of Gilbert Seldes, for just one example, is fascinating, showing how we can read different intellectuals not just for the substance of what they said, but for how they can be read together to better understand the debate (69). But I admit that at times it feels as though just as one is drawn into the discussion of a particular person or concept, Murphy is forced to move on. He resolves this challenge in a couple of ways. Primarily, he continually brings the narrative back to his central theme of the gap these intellectuals were facing and reminds the reader of the long-term impact of the differing approaches they took.

To me, though, another unifying concept Murphy suggests is even more interesting: the concept of authenticity. The intellectuals Murphy describes seem unmoored by the cultural changes taking place around them and are, to some extent, searching for an anchor. Most, if not all, turn to some aspect of authenticity to find that cultural stronghold. Much of the writing, playing, singing, talking, and “acting” that Murphy describes comes down to using authenticity as a way to carve out space within the cultural gap. From Murphy’s discussion of the commercial packaging of hillbilly records to his description of the attempts of Dadaists to create a usable modern “folklore” to the attempt by various groups to construct a “usable past” from which to create a new culture, authenticity is everywhere in Murphy’s account. Whether Dadaists, artists, Fundamentalists, immigrants, or intellectuals, much of the work to find space within the gap took the form of the search for (or creation of) an “authentic” self. Murphy often notes this explicitly, but at other times leaves the implication to the reader.

For just one example, Murphy writes of the Harlem Renaissance, “Much of the renewed racial consciousness in the Renaissance was similarly tied to a discovery of roots, whether a respectful attention to the southern black folk roots of Negro culture or the deeper African cultural tradition” (134). Similarly, as I noted above, Murphy elsewhere describes the dominant literary currents of the 1920s as “the search for symbols from the American past with which to fashion an organic American culture and the elevation of modern technology and commercial enterprise to the level of a new folklore” (105). Read one way, both of these approaches are a search for an authentic identity, one that could be used by these intellectuals to find authority within the cultural gap. The different intellectuals Murphy describes have different ways of trying to find what is “real,” but they are virtually all linked by the understanding that discovering or creating something “real” (or that simply appeared real) was important. This is present in Murphy’s account, but I would love to see more discussion of the different approaches to the authentic self taken by these intellectuals and how they used the concept both to understand the changes and attempt to control them. For whom was the authentic self the true project, for example, and for whom was the creation of a seemingly authentic self simply a means by which to assert cultural control?

Ultimately, though, for me, the best test for any work of History is that it is both illuminating and thought-provoking, that it both clarifies and clouds one’s understanding. Murphy’s work passes that test with flying colors. There is much to learn in Murphy’s book; it is useful for any scholar of the 20th century, particularly any graduate student. More important, though, there is also much to think about here. Like the actors themselves, the historian of this time period can sometimes feel ungrounded by the contradictions, cultural changes, and shifting foundations of the decade. We, too, are sometimes attempting to describe a cultural gap even as we struggle to keep ourselves above it. Works like this—that meet the complexities of the 1920s head-on and do not back down from the difficult contradictions—help show us the way forward.


----------------------------------------------

[1] Lynn Dumenil, The Modern Temper: American Culture and Society in the 1920s (New York: Hill and Wang, 1995), 12.

[2] For a much more complete account of the culmination of this discussion, see Karl Llewellyn, “Some Realism About Realism—Responding to Dean Pound,” Harvard Law Review 44 (1931): 1222.