15 December 2004

Not So Smart-- And Proud Of It

      Reading this article from The Chronicle (the original article is here, but I've reproduced it below because it'll soon be inaccessible, as articles there tend to become), I'm reminded that I don't think I was ever described as "smart." The very frequently used assessment that I was well-named now seems more than ever a badge of honour instead of a toss-off compliment. Also in light of this article, I do recall many times being characterized in terms of rigor -- or "rigour," as we spell it in a country that knows how to spell -- a fact that gives me pause considering my general distrust of the theoretical. But all that's neither here nor there. What I want now is for this guy to take a good look at the word "sharp." As in "acrid," perhaps? (Leave "pungent" alone, the lot of ye.)


Here's the Problem With Being So 'Smart'

By JEFFREY J. WILLIAMS

How often have you heard about someone's work, "You have to read it -- it's really smart"? Or "I didn't agree with anything he argued in that book, but it was smart"? At a conference you might hear, "I want to go to that panel; she's quite smart." You've probably also heard the reverse: "How did he get that job? He's not very smart." Imagine how damning it would be to say "not especially smart, but competent" in a tenure evaluation. In my observation, "smart" is the highest form of praise one can now receive. While it has colloquial currency, smart carries a special status and value in contemporary academic culture.

But why this preponderance of smart? What exactly does it mean? Why not, instead, competent? Or knowledgeable? Or conscientious? We might value those qualities as well, but they seem pedestrian, lacking the particular distinction of being smart.

Historically, smart has taken on its approbative sense relatively recently. Derived from the Germanic smerten, to strike, smart suggested the sharp pain from a blow. In the 18th century it began to indicate a quality of mind. For instance, the Oxford English Dictionary notes Frances Burney's 1778 use in Evelina: "You're so smart, there's no speaking to you." (We still retain this sense in the expression "smartass.") Smart indicated a facility and manner as well as mental ability. Its sense of immediacy also eventually bled over to fashion, in the way that one might wear a smart suit.

The dominance of smart in the academic world has not always been the case. In literary studies -- I take examples from the history of criticism, although I expect that there are parallels in other disciplines -- scholars during the early part of the 20th century strove for "sound" scholarship that patiently added to its established roots rather than offering a smart new way of thinking. Literary scholars of the time were seeking to establish a new discipline to join classics, rhetoric, and oratory, and their dominant method was philology (for example, they might have ferreted out the French root of a word in one of Chaucer's Canterbury Tales). They sought historical accuracy, the soundness of which purported a kind of scientific legitimacy for their nascent discipline.

During midcentury the dominant value shifted to "intelligent," indicating mental ability as well as discerning judgment. Lionel Trilling observed in a 1964 lecture that John Erskine, a legendary Columbia professor, had provided "a kind of slogan" with the title he had given to an essay, "The Moral Obligation to Be Intelligent." Trilling went on to say that he was "seduced into bucking to be intelligent by the assumption ... that intelligence was connected to literature, that it was advanced by literature." Literary scholars of this era strove to decipher that essential element of literature, and their predominant method was interpretive, in both the New Critics (of particular poems) and the New York intellectuals (of broader cultural currents).

The stress on intelligence coincided with the imperatives of the post-World War II university. Rather than a rarefied institution of the privileged, the university became a mass institution fully integrated with the welfare state, in both how it was financed and the influx of students it welcomed. As Louis Menand recounts in "The Marketplace of Ideas," the leaders of the postwar university, such as James B. Conant of Harvard, strategically transformed the student body to meet the challenge of the cold war as well as the industrial and technological burgeoning of the United States. These leaders inducted the best and brightest of all classes -- as long as they demonstrated their potential for intelligence. Conant was instrumental in founding the Educational Testing Service, which put in place exams like the SAT, to do so.

In the latter part of the century, during the heyday of literary theory (roughly 1970-90), the chief value shifted to "rigor," designating the logical consistency and force of investigation. Literary study claimed to be not a humanity but a "human science," and critics sought to use the rigor of theoretical description seen in rising social sciences like linguistics. The distinctive quality of Paul de Man, the most influential critic of the era, was widely held to be his rigor. In his 1979 classic, Allegories of Reading, de Man himself pronounced that literature advanced not intelligence but rigor: "Literature as well as criticism is ... the most rigorous and, consequently, the most unreliable language in terms of which man names and transforms himself."

Since the late 1980s rigor seems to have fallen out of currency. Now critics, to paraphrase Trilling, are bucking to be smart. This development dovetails with several changes in the discipline and the university. Through the 1980s and '90s literary studies mushroomed, assimilating a plethora of texts, dividing into myriad subfields, and spinning off a wide array of methods. In the era of theory, critics embraced specialization, promulgating a set of theoretical schools or paradigms (structuralism, deconstruction, Marxism, feminism, and so on). But while the paradigms were multiple, one could attribute a standard of methodological consistency to them.

Today there is no corresponding standard. Individual specializations have narrowed to microfields, and the overall field has expanded to encompass low as well as high literary texts, world literatures as well as British texts, and "cultural texts" like 18th-century gardens and punk fashion. At the same time, method has loosened from the moorings of grand theories; now eclectic variations are loosely gathered under the rubric of cultural studies. Without overarching criteria that scholars can agree upon, the value has shifted to the strikingness of a particular critical effort. We aim to make smart surmises among a plurality of studies of culture.

Another factor in the rise of "smart" has to do with the evolution of higher education since the 1980s, when universities were forced to operate more as self-sustaining entities than as subsidized public ones. As is probably familiar to any reader of The Chronicle, this change has taken a number of paths, including greater pressure for business partnerships, patents, and other sources of direct financing; steep increases in tuition; and the widespread use of adjuncts and temporary faculty members. Without the fiscal cushion of the state, the university has more fully modeled itself on the free market, selling goods, serving consumers, and downsizing labor. It has also internalized the chief protocol of the market: competition. Grafting a sense of fashionable innovation onto intellectual work, smart is perhaps a fitting term for the ethos of the new academic market. It emphasizes the sharpness of the individual practitioner as an autonomous entrepreneur in the market, rather than the consistency of the practice as a brick in the edifice of disciplinary knowledge.

One reason for the multiplicity of our pursuits is not simply our fecundity or our fickleness but the scarcity of jobs, starting in the 1970s and reaching crisis proportions in the 1990s. The competition for jobs has prompted an explosion of publications; it is no longer uncommon for entry-level job candidates to have a book published. (It is an axiom that they have published more than their senior, tenured colleagues.) At the same time, academic publishing has changed. In the past, publishing was heavily subsidized, but in the post-welfare-state university the mandate is to be self-sufficient, and most university presses now depend entirely on sales. Consequently the criterion for publication is not solely sound disciplinary knowledge but market viability. To be competitive, one needs to produce a smart book, rather like an item of fashion.

Smart still retains its association with novelty, in keeping with its sense of immediacy, such that a smart scholarly project does something new and different to attract our interest among a glut of publications. In fact, "interesting" is a complementary value to smart. One might praise a reading of the cultural history of gardens in the 18th-century novel not as "sound" or "rigorous" but as "interesting" and "smart," because it makes a new and sharp connection. Rigor takes the frame of scientific proof; smart the frame of the market, which mandates interest amid a crowd of competitors. Deeming something smart, to use Kant's framework, is a judgment of taste rather than a judgment of reason. Like most judgments of taste, it is finally a measure of the people who hold it or lack it.

The promise of smart is that it purports to be a way to talk about quality in a sea of quantity. But the problem is that it internalizes the competitive ethos of the university, aiming not for the cultivation of intelligence but for individual success in the academic market. It functions something like the old shibboleth "quality of mind," which claimed to be a pure standard but frequently became a shorthand for membership in the old boys' network. It was the self-confirming taste of those who talked and thought in similar ways. The danger of smart is that it confirms the moves and mannerisms of a new and perhaps equally closed network.

"Smart," as a designation of mental ability, seems a natural term to distinguish the cerebral pursuits of higher education, but perhaps there are better words. I would prefer the criticism I read to be useful and relevant, my colleagues responsible and judicious, and my institution egalitarian and fair. Those words no doubt have their own trails of associations, as any savvy critic would point out, but they suggest cooperative values that are not always inculcated or rewarded in a field that extols being smart.

Jeffrey J. Williams is a professor of English and literary and cultural studies at Carnegie Mellon University and editor of the minnesota review. His most recent book is the collection Critics at Work: Interviews 1993-2003 (New York University Press, 2004).
