
Friday, June 10, 2011

Christian American Exceptionalism in a Secular Age


Last week I posted my thoughts on Charles Taylor’s theory of multiculturalism—what he terms the “politics of recognition.” (My post: “When the Zulus Produce A Tolstoy We Will Read Him.”) This week, as promised, I continue engaging Charles Taylor with some preliminary thoughts on his 2007 tome, A Secular Age. My interests in Taylor are, of course, peculiar to my research on the U.S. culture wars. If you’d like more in-depth blogging on secularism’s relationship to religion and the public sphere, I highly recommend The Immanent Frame.

Though entirely by accident, I could not have timed my reading of A Secular Age any better than I did, since it came during our roundtable discussion of David Sehat’s The Myth of American Religious Freedom. In his entry, Sehat responds to the charge that he had an uncritical attitude towards secularization by citing sociologist Mark Chaves. Sehat writes:

Chaves argues that secularization involves two distinct processes. The first is the differentiation of religion from other institutional structures such as government, education, or business. This differentiation of institutional structures is a mark of modernity. In other words, whereas in past ages governmental, religious, and business authorities might have been connected or combined in the king or prince or pope, modernity entails the division and separation of these various structures. Secularization then occurs as religious authority declines in scope, that is, as religious authority becomes increasingly confined into its own institutional sphere with less relevance for other institutional spheres. The key issue in this version of secularization theory is religious authority. Though many, many people may continue to believe or to practice various religious traditions, this fact is not relevant to the process of secularization, according to Chaves. By focusing on the scope and intensity of religious authority, Chaves offers a theory of secularization that focuses our attention not on the prevalence of individual belief but on the social significance of religion.

This explanation of secularization despite the persistence of religious belief coheres with Taylor’s thinking on the subject (and with a number of other thinkers, such as Stephen L. Carter, who argues in The Culture of Disbelief that the American public sphere is hostile to religiously informed action, despite the fact that well over 90% of Americans say they believe in God when polled). Taylor argues that the anomalous steadfastness of religious belief in the United States lends credence to theories of American exceptionalism. The doggedness of religious belief amongst a vast majority of Americans in the secular age also helps explain the culture wars.

But first, let's begin with the three meanings by which Taylor defines the secular age:

1) Religion is not a state matter, but rather a private one.
2) Hardly anyone believes in God.
3) Religion is a choice, and not an easy one to make.

Obviously, these meanings are not all equally true everywhere. Though all are mostly true in western Europe, only meanings one and three make sense when discussing the United States, where the vast majority believes in God. As Taylor writes: “the United States is rather striking in this regard. One of the earliest societies to separate Church and State, it is also the Western society with the highest statistics for religious belief and practice” (2). Taylor stresses that being a believer in a secular age—and in a secular society (not all contemporary societies are secular, as Muslim nations make clear)—is different from being a believer prior to secularization, when belief was a given. Now, belief is a choice, and not always an easy one. In other words, the secular age is one of differentiated belief: “We live in a condition where we cannot help but be aware that there are a number of different construals, views which intelligent, reasonably undeluded people, of good will, can and do disagree on. We cannot help looking over our shoulder from time to time, looking sideways, living our faith also in a condition of doubt and uncertainty” (11). Belief (and unbelief) thus takes on new meaning in the secular age because we have passed from a “naïve” stage—where everyone automatically believed, because it was what people did, without question—to a “reflective” one—where people consciously think about how their lived experiences relate to the existence of God (or not).

Broadly, Taylor’s book is a polemic of sorts: he argues that the shift to secularism was not merely a history of “subtraction”—of people becoming disenchanted with God due to science and naturalistic explanations of creation and other phenomena. He thinks, rather, people found meaning or “fullness” in humanism—a humanism that had to be created, that wasn’t already fully formed. For this shift to happen it was “necessary to have confidence in our own powers of moral ordering” (27). As I have argued in an earlier post on secular humanism, the confidence in a human moral order, or even an immanent moral order, is the dogma underlying progressive pedagogy, and inasmuch as the public schools are secular and progressive, this dogma should be embraced as such by curriculum builders and educational theorists. In other words, religion should be engaged along these lines in our public schools, rather than ignored, which is current practice, and which confuses the very purpose of secular, progressive education.

Taylor’s analysis of secularization’s longue durée is interesting, though a little too abstract for my tastes. The book becomes more compelling by Chapter 13, which Taylor titles, in a nod to Lionel Trilling, “The Age of Authenticity”—about our current age, which he dates to the postwar period and marks more concretely with the cultural revolutions of the Sixties. He writes: “I believe, along with many others, that our North Atlantic civilization has been undergoing a cultural revolution in recent decades. The 60s provide perhaps the hinge moment, at least symbolically”—when expressive individualism or a “kind of self-orientation seems to have become a mass phenomenon” (473). To quote Taylor at greater length: “The causes cited for these changes are many: affluence and the continued extension of consumer life styles; social and geographic mobility; outsourcing and downsizing by corporations; new family patterns, particularly the growth of the two-income household, with the resulting overwork and burnout; suburban spread, whereby people often live, work, and shop in three separate areas; the rise of television, and others” (473). Taylor relates all of these developments to the seeming loss of community, and duly cites Robert Putnam, whose Bowling Alone heads up the “loss of community” canon (Rodgers’s Age of Fracture will no doubt join any such reading list).

This leads to a theoretical understanding of the source of the culture wars. It is against the era of expressive, individuated authenticity—“set in a wider critique of the buffered, disciplined self; concerned above all with instrumental rational control” (476)—that conservatives revolt, both consciously and unconsciously. “The ideal [of expressive individualism], however distorted, is still powerful enough in a society like the U.S. to awaken strong resistance in certain quarters, and to be the object of what have been called ‘culture wars’” (478).

Taylor turns to Émile Durkheim to explain the different paradigms of how religion interacts with society, naming three social forms: paleo-Durkheimian, neo-Durkheimian, and post-Durkheimian. (At the Immanent Frame, these typologies are criticized by Robert Bellah, who is otherwise a big fan of Taylor and calls A Secular Age one of the most important books he has ever read.) In paleo-Durkheimian societies religion is embedded and undifferentiated, as in premodern Europe. In neo-Durkheimian societies, of which the United States was arguably the first, religion is only partially embedded, but it still manages to express a larger national identity. Another term for this might be “civil religion,” in that the religious form is semi-generic and does not bear on the everyday public practices of all citizens, but is still crucial to a coherent normative framework of citizenship. The post-Durkheimian society is the secular age, when religion is immanent. (As William James made clear in The Varieties of Religious Experience, written over a century ago, models for post-Durkheimian religious expression have long existed, even in the neo-Durkheimian United States.)

The United States culture wars pit neo-Durkheimians against post-Durkheimians. “In a sense,” Taylor writes, “part of what drove the Moral Majority and motivates the Christian Right in the U.S.A. is an aspiration to re-establish something of the fractured neo-Durkheimian understanding that used to define the nation, where being American would once more have a connection with theism, with being ‘one nation under God,’ or at least with the ethic which was interwoven with this.” But, and here’s the rub: “the very embattled nature of these attempts shows how we have slid out of the old dispensation” (488).

But what makes the United States exceptional relative to the other nations of the North Atlantic? (How lucky that one of our plenary sessions for the 2011 U.S. Intellectual History Conference will be on the topic of American Exceptionalism, featuring Eric Foner, Beth Bailey, Rogers Smith, and Michael Kazin. Only five days until the CFP deadline!) Taylor acknowledges that one of the hottest debates in secularization theory is over this question. He offers a few plausible answers. First off, historically, immigrants have found that integrating into the American mainstream is made easier through church attendance: “one can be integrated as an American through one’s faith or religious identity.” This explains divergent paths to modernity: whereas rural Sicilians who emigrated to the United States became more Catholic, rural southern Italians who migrated to, say, Milan, usually became less religious, opting instead for the more common paths to assimilation there, typically through socialism or syndicalism.

Taylor also contends that the most important contemporary force for secularization is the modern academy. But although American academics are as secular as their western European counterparts, elite intellectual life does not influence the rest of American society the way it does in British, French, or German society. This is the old saw that Americans are less deferential, or in other terms, more anti-intellectual.

In explaining American exceptionalism, Taylor also builds on an argument made by Marx: the ties of the church loosened in Europe because state and church were so tightly wound together that anticlericalism followed the age of revolution against monarchy. In contrast, since the United States was always already neo-Durkheimian and never experienced such a paleo-Durkheimian phase, there was never the same urgent political need to revolt against formal religion. If anything, religion often offered Americans apparent refuge from the intrusions of the state.

Taylor offers a still more compelling meta-analysis of American exceptionalism: in European nations, any residual link between God and nation—the neo-Durkheimian society—was shattered by the trauma of World War I. But the United States never experienced such a trauma, at least not until the 1960s, when it absorbed the combined shocks of a set of destabilizing forces—“the triple attack which the family-religion-patriotism complex of the 1950s suffered in the era of civil rights, Vietnam and the expressive revolution.” Taylor continues: “Was this not the analogue in the American case to the First World War for the British? Perhaps, but plainly not everyone sees it this way. Indeed, the different reactions to this era seem to underlie the ‘culture wars’ of contemporary U.S. politics. It seems that the fusion of faith, family values and patriotism is still extremely important to one half of American society, that they are dismayed to see it challenged, both in its central values (e.g., the fight over abortion or gay marriage), and in the link between their faith and the polity (fights over school prayer, the phrase ‘under God,’ and the like)” (527).

Of course, in spite of the trauma of Vietnam (and more recent traumas, Iraq and Afghanistan), the United States remains the most powerful nation in the world, which goes a long way toward explaining the persistence of its civil religion, of its neo-Durkheimian state. “It is easier to be unreservedly confident in your own rightness when you are the hegemonic power. The skeletons [in the closet] are there, but they can be resolutely ignored, in spite of the efforts of a gallant band of scholars, who are engaged in the ‘history wars’” (528).

Or put more simply: “Most Americans have few doubts about whose side God is on.”

Thursday, June 2, 2011

“When the Zulus Produce a Tolstoy We Will Read Him”: Charles Taylor and the Politics of Recognition


In his excellent recent Dædalus article, “Racial Liberalism, the Moynihan Report & the Dædalus Project on ‘The Negro American,’” Daniel Geary (author of Radical Ambition, the superb biography of C. Wright Mills, which I reviewed here) concludes his analysis with a cogent reference to the post-1960s shift in racial liberalism. He writes that “public criticism of the Moynihan Report emerged from an increasing disenchantment with the core assumptions of racial liberalism”—or, at least, racial liberalism as it stood in 1965, the year Moynihan authored his infamous government report on The Negro Family.

Left-leaning critics of that form of racial liberalism, who would go on to pioneer another form, otherwise known as multiculturalism, “came to reject the common sociological view that African American culture was a pathological distortion of white American culture and that blacks should have to conform to white values in order to achieve equality.” In short, the Moynihan Report predated the cultural turn taken by post-60s racial liberalism, a turn anticipated by Ralph Ellison in his critique of liberal sociology circa 1965: “The sociology is loaded… The concepts which are brought to bear are usually based on those of white, middle-class, Protestant values and life style.” Ellison pointed towards a new form of racial liberalism: one that tended to value, and even celebrate, minority cultural forms.


For Geary’s purposes (he’s currently writing a book on the Moynihan Report), multiculturalism—or, the post-60s cultural turn taken by racial liberalism—merely serves as a bookend to the sociological liberalism that informed Moynihan and his cohort. But for my purposes—in my efforts to write a book on the culture wars—multiculturalism is that which needs full explanation, historical and epistemological. My research thus leads me to Charles Taylor’s indispensable long essay, “The Politics of Recognition,” the centerpiece of a book edited by Amy Gutmann, entitled, simply, Multiculturalism. (Following the suggestions of several readers, I’m in the midst of a deep engagement with Charles Taylor. Next week I hope to blog on his monumental A Secular Age—as if a blog post can do justice to a 900-page book!)

Taylor locates the source of what he calls “the politics of recognition,” which are more commonly referred to as “identity politics,” in the idea that “nonrecognition or misrecognition can inflict harm, can be a form of oppression, imprisoning someone in a false, distorted, and reduced mode of living.” Ideas about identity, culture, and history thus take on added political consequence, since it is assumed that oppressed people are conditioned to the view of themselves assigned by their oppressors, such that even when legal and other tangible barriers to equality are removed, psychological inhibitors remain. Although Taylor’s essay is mostly in the realm of abstract ideas—he’s a philosopher, not an intellectual historian—my research confirms his notions: concerns about self-loathing in relation to cultural identity drove the liberation politics of Black Power advocates like Stokely Carmichael and Chicano Power activists such as Corky Gonzales (check out my essay on the latter, in relation to the Arizona anti-ethnic studies law, here).

Taylor recognizes that perhaps the key thinker in this transition to thinking about misrecognition as oppression is Frantz Fanon, “whose influential The Wretched of the Earth argued that the major weapon of the colonizers was the imposition of their image of the colonized on the subjugated people. These latter, in order to be free, must first of all purge themselves of these depreciating self-images.” Taylor contends that “the notion that there is a struggle for a changed self-image, which takes place both within the subjugated and against the dominator, has been very widely applied. The idea has become crucial to certain strands of feminism, and is also a very important element in the contemporary debate about multiculturalism.”

Fanon is not always considered a practitioner of identity politics or a theorist of multiculturalism. His revolutionary credentials tend to militate against such labels, since, in an age of multinational capitalism, only deluded multicultural theorists and their caustic conservative critics tend to think of multiculturalism as revolutionary. In a recent New Left Review retrospective (“Reading Fanon in the 21st Century”), Immanuel Wallerstein argues that Frantz Fanon’s 1952 book Black Skin, White Masks—recirculated in the 1980s and 1990s as part of the multicultural canon (the irony in coupling those two words is not lost on me)—was “in no way a call to identity politics.” Wallerstein has a point, since Fanon concluded the book on a universal note, distancing himself from any particular conception of blackness: “The Negro is not. Any more than a white man.”

In his 1963 The Wretched of the Earth, perhaps the seminal anticolonial text, Fanon was critical of black nationalist celebrations of ancient African civilization. But he recognized such racialized expressions of history and culture as necessary first steps in severing ties with colonial power, especially since the representatives of Western Civilization “have never ceased to set up white culture to fill the gap left by the absence of other cultures.” In other words, cultural identity mattered to the larger struggle, and was integral to Fanon’s work, which accentuated “the experience of a black man thrown into a white world.” Fanon’s position resembled Richard Wright’s well-known response to the Bandung Conference, in which he called for a “reluctant nationalism” that “need not remain valid for decades to come.” Both Fanon and Wright joined the Second Congress of Negro Writers and Artists in Rome in 1959, dedicated to the “peoples without a culture,” a mission aligned with one of Fanon’s more famous dictums: “The plunge into the chasm of the past is the condition and the course of freedom.” In any case, it is clearly too facile to separate Fanon entirely from identity politics.

Relating this back to Geary’s stated historical trajectory, it becomes clear that the break from the High Racial Liberalism of mid-century social thought, perhaps most famously represented by Gunnar Myrdal’s An American Dilemma: The Negro Problem and Modern Democracy (1944), to the multicultural liberalism of the 1980s and 1990s, was not so clean. Continuity is an important factor in the sense that thinkers of both eras were conscious of “damage,” the trope documented by Daryl Michael Scott in his important book, Contempt and Pity: Social Policy and the Image of the Damaged Black Psyche, 1880-1996 (which should be read alongside Alice O’Connor’s equally important Poverty Knowledge, which I discuss here). The difference was in how to treat damage. Mid-century social scientists such as Moynihan implied that assimilation to white middle-class values was the ticket to salvation. Even black sociologist Kenneth Clark assumed as much, as is evident in his famous “doll study” that informed Brown v. Board, in which black children were observed reacting positively to white dolls and negatively to black dolls. Multiculturalists from Fanon onwards believed that damage had to be overcome by acting against the dominant culture. In Fanon’s case, meeting colonial violence with anticolonial violence was the means of atonement. In the case of most American multiculturalists, the key to overcoming damage was valuing their own culture apart from the dominant culture. Thus, it became increasingly common in the 1980s and 1990s for predominantly black school districts to implement an Afrocentric curriculum.

Since the boundaries between historical eras of racial liberalism are blurry, it is helpful that Taylor takes a long view on the politics of recognition in relating them to the modern history of democratization. He writes: “What has come about with the modern age is not the need for recognition but the conditions in which the attempt to be recognized can fail. That is why the need is now acknowledged for the first time. In premodern times, people didn’t speak of ‘identity’ and ‘recognition’—not because people didn’t have (what we call) identities, or because these didn’t depend on recognition, but rather because these were then too unproblematic to be thematized as such.” In other words, in feudal times everyone had an identity—lord or peasant, master or slave—but because social mobility didn’t exist, identity (though it wasn’t called such) was static and unproblematic. In modern liberal democracies, where everyone is assumed to have equal status in theory—but, crucially, not in practice—identity matters because it relates to one’s ability or inability to climb social ladders and achieve “equal” status. (This reminds me of Orwell’s famous Animal Farm dictum that “some animals are more equal than others.”) A person’s position is tied up in their identity, especially, multiculturalists argue, in relation to racial and gendered identities. I would argue that identification is one of the most important forms of modern capitalistic classification but that, ironically, as identity politics became the dominant form of racial liberalism, what might be termed postmodern capitalistic classification took hold and, though class polarization grew starker, racial and gendered identities grew less adhesive.

Though Taylor subsumes the politics of recognition under liberalism broadly conceived, he admits that such politics seem illiberal to more traditional liberals, who understand politics to be about the defense of universal rights that accrue to the individual without regard to group affiliation. Of course, the standard critique of universalism applies here: particular groups, namely men of European descent, achieve and maintain power in the name of universal rights that supposedly apply to all. Such criticism has been the theoretical backbone of efforts to reform higher education since the 1960s, in terms of both actual and symbolic representation, that is, affirmative action and multiculturalism. As Taylor writes: “Where the politics of universal dignity fought for forms of nondiscrimination that were quite ‘blind’ to the ways in which citizens differ, the politics of difference often redefines nondiscrimination as requiring that we make these distinctions the basis of preferential treatment.”

Of course, the politics of difference is not grounded in a monolithic epistemological position. Whereas many multiculturalists challenge the dominant culture by valuing their own minority cultures and histories, others theorize that there is no such thing as cultural value. These differences are seen in the 1980s and 1990s attempts to revise the traditional canon. Some such attempts, such as the successful revision of the core freshman reading list achieved at Stanford University in 1987-1988, led by the Stanford Black Student Union, were implicitly based on the notion that all cultures were intrinsically valuable and that judgments to the contrary were based on prejudice or ill will. But other academic activists, invoking what Taylor calls “half-baked neo-Nietzschean theories,” sought to undermine the very premise of a canon. “Deriving frequently from Foucault or Derrida, they claim that all judgments of worth are based on standards that are ultimately imposed by and further entrench structures of power.”

Taylor thus confirms one of the central points made by François Cusset in his provocative French Theory: How Foucault, Derrida, Deleuze, & Co. Transformed the Intellectual Life of the United States (which I recently blogged about here): that some American scholars misappropriated French theory in ways useful to American academic politics. “If Derrida or Foucault deconstructed the concept of objectivity,” Cusset writes, “the Americans would draw on those theories not for a reflection on the figural power of language or on discursive constructions, but for a more concrete political conclusion: objectivity is synonymous with ‘subjectivity of the white male.’” But as Taylor correctly points out, such argumentation backs the neo-Nietzscheans into a conceptual corner. The so-called “canon wars” were animated by a demand for respect—a politics of recognition. But if evaluations of cultural worth are mere expressions of power, then there is no objective ground upon which to respect or even recognize a minority culture.

Right-wing culture warriors came to similar recognitions in their analyses of a nihilistic academic left. For example, in Telling the Truth, her 1995 summary of all that she had learned about the academy while directing the National Endowment for the Humanities, Lynne Cheney scorned the academic fashion of reading power and hierarchy into everything, including canonical texts. “The humanities are about more than politics, about more than social power,” she argued. “What gives them their abiding worth are truths that, transcending accidents of class, race, and gender, speak to us all.” Similarly, in his 1991 bestseller Illiberal Education, Dinesh D’Souza wrote that the so-called “victim’s revolutionaries” who had taken over college campuses in the wake of the 60s had joined forces with a new literary relativism, schooled in Foucault and Derrida. “Because the old notion of neutral standards corresponded with a white male faculty regime at American universities,” he intoned, “minority and feminist scholars have grown increasingly attached to the au courant scholarship, which promises to dismantle and subvert these old authoritative structures.”

Taylor rejects the neo-Nietzschean form of identity politics in favor of that which values minority cultures. Though he is skeptical that all cultures are objectively equal—morally or aesthetically—he believes that we should approach learning about other cultures as if this were so. He concludes his essay: “There is perhaps a moral issue here. We only need a sense of our own limited part in the whole human story to accept the presumption [that other cultures have merit and value]. It is only arrogance, or some analogous moral failing, that can deprive us of this. But what the presumption requires of us is not the peremptory and inauthentic judgments of equal value, but a willingness to be open to comparative cultural study of the kind that must replace our horizons in the resulting fusions.” In other words, we need to be open to having our ethnocentric views challenged.

Framing humanistic learning as such, Taylor vehemently opposes the racial arrogance of the statement supposedly uttered by Saul Bellow: “When the Zulus produce a Tolstoy we will read him.” Taylor works against Bellow’s supposed stricture not only on the grounds that the Zulus might have produced a Tolstoy that is yet to be discovered, but also from the standpoint that Zulu culture might evaluate merit differently and that we would benefit from learning their evaluative system. This is what he means by replacing our narrow horizons with a vision formed from cultural fusion, what he calls the dialogic process.
