From the Archives: The Market-Model University
Humanities in the age of money
More than two decades ago—before the Great Recession caused states to slash funding for public universities (sending tuition bills soaring); before the gold rush into social media lured a generation of students into computer science and engineering; before online education made it easier to secure job-related credentials, further pressuring liberal-arts institutions; and before the decline in humanities enrollments became, for many disciplines, an outright collapse—James Engell, now Gurney professor of English literature and professor of comparative literature, and Anthony Dangerfield published this essay. (They further developed their ideas into a book, Saving Higher Education in the Age of Money, published in 2005.) Because some of their arguments seem prescient, we republish the essay, from the May-June 1998 Harvard Magazine, today. ~The Editors
Authors’ note: Though a sidebar gives pertinent facts for Harvard, this article is about the national state of affairs. It is based on two years of research in hundreds of educational and professional journals, studies, books, magazines, and statistical digests. A bibliography is listed here on this magazine’s archival website.
Economic pressures and rewards have transformed American higher education during the past 30 years: budgets grew, campuses grew, and tuitions grew appallingly—even after adjusting for inflation. Advanced degrees and centers for special studies multiplied. Administrative personnel multiplied faster still. After a slight downturn in the early 1970s, research funding resumed its climb and dispersed itself over a larger number of institutions. Faculty salaries eventually outstripped inflation, with a few “stars”—and quite a few administrators—earning sums unimagined in the early 1960s. The Age of Money had arrived, and happy days were here again, for some.
For others, these decades seemed like the winter of discontent. More teaching was assigned to part-time faculty, at lower pay scales. For financial—not pedagogic—reasons, graduate students assumed a growing portion of the enlarged part-time labor force, a trend with obvious implications for untenured full-time faculty as well as adjuncts. Eventually, even tenured professors at many institutions feared for their jobs. The environment became tough from top to bottom: the average term of college and university presidents is 3.9 years; they aren’t around as long as the undergraduates. Accountability, strategic planning, and downsizing came in vogue. The basic raison d’être of colleges and universities could no longer be assumed—or was forgotten. And all the while—in our opinion, not coincidentally—American colleges and universities systematically disinvested in the humanities. Consider the data.
Humanities represent a sharply declining proportion of all undergraduate degrees. Between 1970 and 1994, the number of B.A.s conferred in the United States rose 39 percent. Among all bachelor’s degrees in higher education, three majors increased five- to ten-fold: computer and information sciences, protective services, and transportation and material moving. Two majors, already large, tripled: health professions and public administration. Already popular, business management doubled. In 1971, 78 percent more degrees were granted in business than English. By 1994 business enjoyed a four-fold advantage over English and remained the largest major. English, foreign languages, philosophy, and religion all declined. History fell, too. Some fields plummeted. Library science shrank to near extinction, from 1,013 B.A.s to 97. On the Preliminary Scholastic Aptitude Test, only 9 percent of students now indicate interest in the humanities.
Measured by faculty salaries—a clear sign of prestige and clout—the humanities fare dismally. On average, humanists receive the lowest faculty salaries by thousands or tens of thousands of dollars; the gap affects the whole teaching population, regardless of rank, within colleges as well as universities. Nationally, in 1976, a newly hired assistant professor teaching literature earned $3,000 less than a new assistant professor in business. In 1984, that gap had grown to $10,000. In 1990, it was $20,000, and by 1996 exceeded $25,000. Beginning assistant professors in economics, law, engineering, and computer sciences enjoy a hefty advantage, too. In 1990 their salaries averaged $10,000 a year higher than those in literature, by 1996 more than $15,000. Nor is English literature the runt of the litter. Fine arts, foreign languages, and education are lower yet.
Salary figures don’t tell the whole story. Consulting fees and second jobs substantially boost incomes in many disciplines—except the humanities, where outside income represents less than one-third the average earned by all disciplines. The point is that professors in other fields, already more highly paid by the educational institution, spend more time on outside ventures and less on duties at the institution itself.
Humanists’ teaching loads are highest, with the least amount of release and research time, yet they’re now expected, far more than three decades ago, to publish in order to secure professorial posts.
Humanists are also, more than others, increasingly compelled to settle for adjunct, part-time, non-tenured appointments that pay less, have little or no job security, and carry reduced benefits, or none.
Consider, too, the health of graduate programs. From 1975 to 1992, the elite top quarter of Ph.D. programs in English cut their yearly output by more than 29 students per program; equivalent programs in chemistry increased on average by 38, computer science by 47. (On the other hand, some humanities programs with the lowest reputations have expanded.)
In 1960, one of every six faculty members professed the liberal arts; in 1988, one of 13. While one can argue that this returns to a norm present in the first half of the twentieth century, that’s hard to document. The truth is, there was a slow slippage in liberal arts beginning as early as 1900, interrupted in the 1950s and early 1960s. But in the last 30 years, the erosion has accelerated, cutting into a base now much weaker.
The weakened condition of humanities within higher education is also reflected in the caliber of students pursuing the disciplines.
By all available measures, national performance in the humanities has declined. Scholastic Aptitude Test verbal scores have fallen. Even allowing for the undisputed complexity of the causes, the key fact is that they’ve dropped far more than SAT math scores, both reported for the same population. Moreover, top performers (scores of 750 or higher) in math have climbed; in language they’ve plunged.
A more select population takes the Graduate Record Exams. But again, the verbal slide remains unmatched in math or analytical sections. In addition, between 1965 and 1992, scores on GRE chemistry and biology tests remained virtually unchanged, yet English literature scores dropped by some 60 points.
Teaching and mastery of languages other than English have declined. The educationally intensive skill that might have a positive impact for students who will work in a global economy—namely, ability in a foreign language—has been neglected. Across the country, college entrance and graduation requirements in language have been eased, even dropped. In 1960, for every 100 students in college, 16 enrolled in foreign languages. In 1970 it was 12, and by 1995, with a global economy in full swing, fewer than 8.
The most authoritative, trusted study of the subject (a yearly poll of college-bound high-school graduates) reveals that in 30 years a total flip-flop has occurred in the proportion of freshmen entering college who expect their higher education to enhance future job security and assure high-wage employment (greatly increased) versus those who want to develop values, form a broader social vision, experiment with varied forms of knowledge, and formulate a philosophy of living (greatly decreased).
Past declines of the humanities were changes in degree. In 1998, with weakened faculties and less well prepared students, we face an imminent, dangerous change in kind. As a society, we seem to be saying that the more we expand the number of students enrolled in college, the less important it is for them to study the humanities.
The Credentials Culture
Even in terms of securing future employment, does this make sense for the students themselves?
As the fate of liberal education darkens, the humanities, once seen as its core, have been largely replaced by occupational majors. Ironically, these courses of study fail to demonstrate that they’re better preparation than the liberal arts and sciences for their associated occupations and professions. Medical schools do not prefer particular majors, not even biology, as long as basic pre-med courses are taken successfully. The Association of American Law Schools recommends courses that stress reading, writing, speaking, critical and logical thinking. Law schools report that by yardsticks of law review and grades, their top students come from math, the classics, and literature—with political science, economics, “pre-law,” and “legal studies” ranking lower.
The idea that students are in college to prepare for full participation in society—including participation that won’t advance their careers or enlarge their bank accounts—no longer has much sway in higher education. More than ever before, policies, curricula, and salaries no longer follow what an institution thinks students and citizens need to prepare for life, work, judgments, and complex decisions requiring a social context of several kinds of knowledge; rather, they increasingly follow the voting feet of students from class to class—though students’ grasp of what training eventually helps to secure good jobs or a meaningful life, likely punctuated with several career changes, is comparatively naive and unformed. This practice can be rationalized as respect for student opinion, or meeting consumer demand: in the market-model university or college, what is prudent prevails.
But serious education entails unpopular decisions on the part of administrators and faculty. Students aren’t getting the education they deserve, a failing that affects their wisdom and judgment more than their intelligence. Narrow-minded doesn’t mean completely unobservant—students can tell which way the wind blows. As one American Association of University Professors report demonstrates, undergraduates became keenly aware which professors were getting paid more, and this strongly affected their choice of majors and classes, “with the result that enrollments in these fields began to increase rapidly, further accentuating the demand for faculty members in these disciplines.” The self-fulfilling prophecy continues to unfold.
Another reason students and parents choose as they do is that the United States has become the most rigidly credentialized society in the world. A bachelor’s degree is required for jobs that by no stretch of imagination need two years of full-time training, let alone four. Why do Americans think this is good, or at least necessary? Because they think so. We’ve left the realm of reason and entered that of faith and mass conformity. College credentializing has lowered pressure on secondary schools to keep up their standards, already so low that they prompted college credentializing in the first place. A sharply increased number of classes offered in four-year and especially two-year colleges over the past two decades must be categorized “remedial”; they teach what was once mastered in high school—or junior high. If high schools turned out graduates who had ninth-grade math, could read well, wrote correct simple sentences, engaged in problem-solving, and possessed basic computer skills and the ability to work in small groups, then a high-school education would suffice for middle-income jobs. Yet, collectively, high schools can no longer guarantee these minimal skills. So, even if some of their graduates greatly exceed them, they must still obtain the credential of a B.A.
The Contemporary College
Because they are more segmented, more market-driven, colleges and universities avidly pursue—and then advertise—trophies: star faculty, plush facilities, and the reputation of excellence, often while neglecting undergraduate teaching. Not to teach has become a reward. Professorial salaries correlate negatively with teaching load. It is not overstatement to conclude that the primary task of higher education is no longer to educate—certainly not to educate undergraduates. Higher education now reserves all its highest rewards for published research. In the last 30 years, the average number of maximum classroom teaching hours has remained steady, but the minimum—that is, the amount performed by those already teaching less, and that means those predominantly outside the humanities—has dropped. Research can and should inform and improve teaching. But a primary emphasis on research doesn’t foster that improvement. Sadly, of all data we studied, only one study is able to conclude that research correlates positively with teaching quality, but then only at four-year colleges, not at doctoral or research institutions.
Abundant anecdotal evidence connects a skewed emphasis on research with scant attention to teaching. For example, the president of Princeton has remarked that “many faculty members ‘suggest that undergraduate teaching “gets in the way” of front-line and increasingly complex research...[while other faculty members] argue that blockbuster grants for research centers...siphon internal funds away from teaching.’” We also have hard data from the late 1960s, reconfirmed recently: “The academic department’s legitimation of, and emphasis on, research specialization made reduced teaching loads not only acceptable as a professorial goal but indeed a demarcator of status on campus.” In the mid 1980s, one study reported “only 15 percent of the faculty members at high-quality research institutions said that they were very heavily interested in teaching.” In 1994, William Massy, vice president for business and finance at Stanford, and Robert Zemsky of the research institute on higher education at the University of Pennsylvania concluded that “the tendency to subordinate teaching to research seems to have spread from the major research universities, where it might conceivably be justified...to the much larger number of four- and even two-year institutions.” The most recent, exhaustive study (1997) reached a statistically unambiguous conclusion: “Our findings clearly indicate that research is rewarded more than teaching.”
But the erosion in teaching is not uniform. Faculty in the humanities teach differently and teach more, especially faculty in the languages and composition. They teach more because by tradition and by virtue of the areas they encompass, the humanities have charge of literacy in undergraduate education. Their basic mission is to ensure that recipients of the bachelor’s degree can read and write critically, can reason in language, can argue, can persuade and be open to persuasion. Teachers in other disciplines assist humanities faculty in this task, and not infrequently surpass them. But the bulk of this job—much of it the hardest, most time-consuming, and least rewarded—belongs to the humanities. Which makes it all the more disturbing that the market forces are at work here, too. Why are first-year graduate students, some barely three months from the B.A., often assigned to teach freshman composition, a required course central to writing, critical thought, and the logic of argument? Because it’s far cheaper for the institution. The money saved is spent elsewhere.
Given that professors in the humanities are paid less than those in other fields, and given that the tuition paid by humanities majors usually equals the tuition paid by students in other fields, parents and students associated with the humanities thus actually subsidize the parents and students associated with other fields. In blunt terms, the poorer fields are required to enrich the richer—a lesson not without implications for our national life at large.
In short, test what you will—majors, salaries, graduate programs, cross-subsidies, teaching loads, requirements, languages, aims of education, standardized test scores—the results come back the same. The humanities’ vital signs are poor. There are pockets of health dotted about, but nationally the patient is not well. Since the late 1960s the humanities have been neglected, downgraded, and forced to retrench, all as other areas of higher education have grown in numbers, wealth, and influence. When we termed the last 30 years the Age of Money, we were in part referring to the dollar influx of research grants, higher tuitions, and grander capital improvements. But there’s another, more symbolic aspect to the Age of Money, and one not less powerful for being more symbolic. The mere concept of money turns out to be the secret key to “prestige,” influence, and power in the American academic world. Here’s how.
The Three Criteria
The starved logic that sees money as the most desirable result of education—that knowledge is money or should be directly convertible to it—has produced what we call the Three Criteria. Their rule is remarkably potent, uniform, and verifiable. Academic fields that offer one (or more) of the Three Criteria thrive; any field lacking all three languishes. This effect can be measured by any one or combination of indices: relative proportion of degrees earned, faculty salaries, time allotted for research, new numbers of faculty appointed, graduate or professional populations, capital investment in facilities, support staff, and alumni giving. In the Age of Money, the royal road to success is to offer at least one of the following:
A Promise of Money. The field is popularly linked (even if erroneously) to improved chances of securing an occupation or profession that promises above average lifetime earnings.
A Knowledge of Money. The field itself studies money, whether practically or more theoretically, i.e., fiscal, business, financial, or economic matters and markets.
A Source of Money. The field receives significant external money, i.e., research contracts, federal grants or funding support, or corporate underwriting.
The humanities, apart from a few superstar professors, satisfy none of the criteria. They’ve been penalized accordingly with a steady loss of respect, students, and, yes, money. Fields that study money, receive external money, or are associated—rightly or wrongly—with monetary rewards are precisely those that have fared best in American higher education in the last 30 years. (Theoretical physics is an interesting anomaly among the sciences: it has met the third criterion to some degree, but produces little of immediate utility and is often now cut from funding and high-paying jobs.) Psychology falls in the middle of all fields, sociology and anthropology slightly below. Health and computer sciences, law, business, engineering, and applied sciences: they’re all higher. The fine arts, languages, literature, history, religion, and philosophy: all lower.
Administrations and administrators of higher education nicely fit every one of the Three Criteria. Administration has been a booming industry, for decades outpacing—at times hugely—the growth, if any, in the size of faculties: more administrative and middle-level management jobs at higher pay, even as support-staff positions that directly help faculty members are often cut. Administration is the leading growth sector of higher education. Despite alarms sounded in the 1980s, this trend continues unabated at many institutions. Some administrative growth was required to meet increased governmental regulations and a changed student body with new needs for support. But no one pretends that these factors explain more than half of it.
The bitter humor of Parkinson’s Law is not that it’s a good joke but that his analysis of bureaucracy is true: “Officials make work for each other.” This nicely predicts, for example, that if administrative and executive personnel increase by x percent, then their subordinates will grow at twice that rate. This is precisely what happened in more than 3,000 U.S. colleges and universities from 1985 to 1990. While full-time faculty grew only 8.6 percent, administrative personnel rose by 14.1 percent, and their subordinates, “other professionals,” increased by double that, or 28.1 percent. It is a law that central administrations tend to expand, even when there is less work to do.
Many central administrations take a portion of overhead on research money to fund their own operations, including their own expansion, typically without any faculty oversight. And increasingly, administrators spend little or no time teaching or conducting research. Administrators have professionalized, becoming a distinct class. Little by little, historical ties between faculty and administration have loosened—or broken altogether. Exceptions exist, but many administrations and faculties square off as “us” versus “them,” an employer/employee pose. Faculties unionize. Power over personnel and budgets—hence over curriculum and policies—shifts away from faculties toward administrative bodies, presumably because faculty members would botch the task.
If this vast realignment has any justification beyond the imperatives of power and realpolitik, administrations must constantly be supposed the sounder judges of the needs and nature of higher education, research, teaching, and knowledge than are faculties themselves. A remarkable proposition, to be sure, but not by any means the oddest feature of higher education’s odd predicament. It’s usually hard, often impossible, for faculties to obtain a transparent budget, or to know, beforehand, of important decisions that affect their teaching, their students, their place in the institution, even their professional future. If faculty members influence budgetary decisions only marginally, then they cannot control major curricular decisions. At the extreme, departments or schools are cashiered out of existence. Perhaps some should be, but who should judge?
These developments prompted the late Bill Readings, associate professor of comparative literature at the University of Montreal, to claim in 1997 that “the University is becoming a transnational bureaucratic corporation....The University...no longer participates in...the historical project of culture.” More than four years earlier, Robert Zemsky had seen the trend. Universities, he said, are becoming “more like holding companies.”
The more that colleges and universities act as purely utilitarian operations, the more these forces intensify, and the more the Three Criteria come into play. When humanists raise these issues, they’re often told, or scolded with, the Feel-Good Funding Myth.
The Feel-Good Funding Myth
Administrators and scientists have long claimed or implied that external funding for research benefits not only the funded fields but all fields in the university. According to this pleasing and serviceable conjecture, funds delivered to one part of an institution permit an internal reallocation to benefit other parts of the institution—libraries, or perhaps the humanities and the poorer social sciences (history, anthropology, and sociology).
Any such claim should be expressed in a far more circumspect, complex way: when universities first receive outside research funds for science or other fields, they are able to support those fields in a new, expanded way. As funding continues, universities can sustain or expand those fields without siphoning funds from other departments.
But accepting outside funds entails a Faustian bargain: for if those funds are later cut, universities must either retrench (perhaps drastically) in those fields, or cut elsewhere. We can say that such funding increases the amount of scientific research and often the size of science faculties. But we found no evidence to confirm any direct or indirect financial benefit to fields not receiving external support. To top it off, some studies conclude that universities end up paying an overall unreimbursed cost for such support. The Three Criteria programs pocket the vast bulk of external funding; when it comes time to make up for funding cuts, it’s share and share alike.
Evidence for the truth of our critique of the Feel-Good Funding Myth is overwhelming. Five years ago, William Massy, echoing what Alice Rivlin had published 30 years earlier with the Brookings Institution, came to the belated, obvious conclusion that “there is a very real question of whether research is in fact being subsidized by undergraduate education.” He warned that federal funding cuts will place “even greater pressure on research universities to cross-subsidize sponsored programs from all available sources”—a genteel circumlocution that translates into “raid the already diminished funds available to the humanities and social sciences, ask alumni to support ‘the college’ or ‘the university,’ and, in all likelihood, hike undergraduate tuitions again, to underwrite the research of ‘sponsored’ programs.”
Absent a more specific rationalization for the current system, “prestige” is often offered up as the intangible benefit that accrues to the whole institution when some segments are fattened with more staff or better facilities while others make do. But like a gravel pit, “prestige” is a concept that grows more empty with use. To be sure, there seems a gain in “prestige” and perhaps, in certain areas, in quality for universities that enjoy external research support; these institutions may generate a “product mix” that attracts bright undergraduates in many fields, arguably including the humanities. But that explanation begs the question of whether it’s good policy or even honest to lure students with institutional prestige while chopping away at the very basis of that prestige. Self-beguiled, many universities and even colleges have in effect decided that their real business is golden eggs; the goose will just have to fend for itself.
None of the foregoing is intended as an assault on the sciences, or indeed on any funded field. Any cut in funding to science represents a grave danger to research universities and to us all. Scientific research is indispensable to national intellectual and economic life, as well as to health care. It has proven a wise collective investment, and we advocate its continuance and expansion. But the trickle-down fiction that the prosperity of externally funded programs will find its way to undergraduate instruction and to the humanities needs to be exposed for the fairy tale that it is.
When inequities between academic areas are pointed out, the last thing humanists should do is stay silent, fearful of precipitating a Kulturkampf against what “brings money in.” The pretty rhetoric produced by high-ranking officials of some universities—promulgating the notion that all boats are lifted by a rising tide—is devoid of hard figures. Even without going in for hard sciences with heavy external funding to the extent that many other institutions do, the University of Virginia still generates humanities programs and library collections of the first water. And smaller liberal-arts schools also give the lie to the humanities’ presumed financial dependence on the funded “useful” disciplines. They produce humanities undergraduates the equals of their college peers at research universities, and their faculties can rival and are a source for humanities faculties of those universities. As new federal guidelines for financial accounting in higher education go into effect, we’re likely to see—if administrators let us—that humanities and unfunded social-science programs have been cross-subsidizing so-called externally funded programs all along.
“Those Milder Studies of Humanity”
Knowledge has changed and proliferated. It has changed, too, in what John Dryden calls “those milder studies of humanity.” But no such changes can explain why universities and colleges have sharply disinvested in the humanities—the very fields which continue to ask how such changes affect our lives and values as human beings individually and socially. Our most difficult problems remain precisely those that do not admit of solutions by quantitative or technical means alone. Nor are they susceptible to solution by one traditionally defined profession working alone. Ethical debates in medicine, environmental crises, legal issues involving the history of race relations: in these and more we require eloquent language, hard analysis and persuasion in words, and the combined insights of science, history, religion, business, medicine, and ethical traditions.
But humanists of the last three decades responded to the Three Criteria with near-complete ineptitude. They yielded ground on nearly all fronts. Many of their tactical and strategic failures can be traced to their apologetic attitude to other disciplines, itself arising from self-doubt about the value and relevance of their own activities. Humanists began speaking—and arguing—more and more only with themselves. Their acquiescence in the role of grateful pensioner of the implicitly “useful” disciplines and administrations was tacit acceptance of their low rank in the academic hierarchy of our era.
It must be admitted, in fairness, that humanists have been maneuvered into a false position where any response seems like an endorsement of the pecuniary ethos. To insist on their fair share of funding, if only for equal salaries and library collections, is, in appearance, to accept the false proposition that money is the measure of everything. Yet if humanists endure without protest their Cinderella status vis-à-vis the Three Criteria disciplines, they end up conveying the same message: what is, is right.
No such problem would exist if humanists were not embarrassed to proclaim their traditional eminence in the academy. Humanists willing to stand up for their high relevance have only to assert both “Yes, we too need money—and more than we’re getting—to support our activities” and “No, that doesn’t mean we accept wealth as the paramount human and educational value.” Not having done so, humanists and their disciplines have come to be construed as a dispensable luxury. The scandal is that, collectively, by their silence in general, as well as in faculty meetings and administrative posts, humanists have acquiesced.
The humanities inform every deliberative body from the U.S. Congress to the local PTA. No matter what is happening in higher education, we don’t stop dealing with ethics and aesthetics, with language and rhetoric and religion and the arts, with the legacy of our past. We’re human—we couldn’t stop it if we wanted to. What we can do, evidently, is pretend that we can cope with these matters just as well if no one studies them. A peculiarity of American society is our capacity to question (with apparent sincerity) the desirability of producing and supporting minds trained in the study of such matters. In this capacity we seem to be unique. Our tradition of anti-intellectualism is all the more amazing in light of the nation’s history, since we count among our founding fathers some of the most distinguished and learned humanists ever to engage in political life: Madison, Franklin, Jefferson, John and John Quincy Adams, Marshall, and Jay, to name a few. This is a country that spends more to support beer and shaving cream on one Super Bowl Sunday (not to mention tax subsidies to build the stadiums) than its government spends on music and painting and theater in a year. As Richard Hofstadter noted in 1963, “In the United States the play of the mind is perhaps the only form of play that is not looked upon with the most tender indulgence.”
Remarkably, humanists have been active participants in their own subversion. Internal political and theoretical bickering in the humanities has contributed little wisdom to the political life of the country or local communities for two decades. Just as the cult of money was laying siege to the culture of learning, many beleaguered exponents of humanistic study divided into parties and embarked on a series of unedifying disputes, including ones that degraded the name “humanist.” The subjects were worthy enough: the nature of language and of gender, the roles of politics and race and non-Western culture. And these received new, welcome attention. But such gains were often squandered through endemic pettiness, bad faith, and guilt by association. Humanists developed their own politically motivated cult of personalities. And nowadays few people, understandably, want to write the way many professors of literature do. Fifteen years ago Northrop Frye warned that humanists, like Fortinbras in Hamlet, were fighting wars over territories barely large enough to hold the contending armies.
If recent internecine wranglings are impoverished, their appeal diminished by rebarbative jargon, name-calling, narrow specialization, and dull, predictable accusations of being on the wrong “side” of a polarized “war,” it’s all the more sobering to realize that the humanities have picked an especially bad time to fall upon each other. In 1997 Earl Shorris put it this way: “The division should come between market-driven culture and the humanities, not between the beauty of an Asian poem and a European poem.”
What Do We Want?
For three millennia, East and West, the humanities have been associated not only with imaginative art but with the world of affairs and professions—law, medicine, trade, government. Apollo is the god of healers and poets. Solzhenitsyn’s chapter on the family doctor in Cancer Ward might be put before medical-school students and their teachers. Law has ancient, deep connections with rhetoric and composition. Solon wrote his legal code in verse. Behind the Iron Curtain—in fact, wherever there was or is repression and intolerance—poets and physicists alike have together kept the faith of humane action and human rights. The environmental movement unites the sciences, social sciences, and humanities with business, economics, and religion. John Muir and Rachel Carson: scientists, humanists?
If we segment our education, prizing only what will produce one kind of economic value, we may segment the totality of our experience and trivialize all values. There is no faster way to guarantee the shattering of our societal mosaic than to assume that its higher education should be the sum of a series of separate professional specializations—and that these should be supplemented in the humanities primarily by arguments over the study of various cultures constrained to serve present political goals and social agendas. Are we ready to jettison 3,000 years of collective experience in higher education? In his eloquent book The Idea of Higher Education, Ronald Barnett concludes with a pertinent question: will higher education be forced to settle for “the narrowness of an industry-led competence-bound curriculum?”
Is our disinvestment in the humanities—what we might call the dehumanization of higher education—a legitimate response to desirable market factors? Or is it more accurately one core symptom of a national loss of faith in whole areas of human endeavor as they’re treated in the academic world—those areas not quantifiable, not primarily driven by economics, representing a quality of life we call culture? Whatever the answer, the systematic devaluing of humanistic study in higher education makes it suicidal for humanists to trivialize themselves—producing specialized studies few care to read—or to knuckle under to demands for more publication at the expense of more and better teaching and better, not more, publications.
An economic social Darwinism can apply itself to higher education. Our society distances itself from pursuits and learning that take considerable time and don’t pay immediate cash dividends. Economic competitiveness is responsible for much good and prosperity. But when visited on every segment of society, and on higher education, it may contribute to a social breakup. Do we want it increasingly applied to colleges and universities?
Do we care any more whether colleges and universities are custodians of collective, diverse cultures—whether they record, teach, and transmit traditions, and give us the linguistic and symbolic tools to express our veneration, criticism, and contribution to our culture, to make connections within its variety, to examine its checkered past and to imagine its possible future? If our institutions of higher education don’t do this, who will? For intelligent young people, do we want careers in the humanities to be obviously less attractive than many other options open to them? Do we want market forces thoroughly to work their will on the very set of institutions that we once, after careful deliberation, decided should be largely protected from them?
It all boils down to one question: Does it matter? To us it’s evident that our nation cannot steer the best course through our exciting but complex and perilous times without the aid and leadership of men and women who have mastered language, who can put together a sound argument and blow a specious one to bits, who have learned from the past, and who have witnessed the treacheries and glories of human experience profoundly revealed by writers and artists. But if nothing changes, we will soon face our difficult world and our endlessly complicated future without new generations so trained. We will soon be looking not at a weakened tradition of humanistic learning and education, but at a defunct one.