Consider this irony: Harvard and other elite American research universities, so crucial to innovation in almost every area of our lives, find it almost impossible to innovate within their own operations and embedded assumptions. They regularly transform everything from healthcare to public policy to our understanding of the universe, yet they are either unable or disinclined to transform themselves in any profound way. It is easier for Harvard to help develop a vaccine to fight COVID-19 than to alter its own departmental structure or teaching model.
This is not an exaggeration.
The question is not whether these universities in their current state are performing a profoundly important social function (they are). Rather, it is whether they could perform their specifically educative function more effectively if they were willing to examine seriously assumptions as central as the primacy of departments, the positive correlation between research productivity and teaching, and the traditional academic calendar. Each of these subjects is commonly thought of as a “third rail”—particularly by administrators who want to keep their jobs. The problem is that higher education, especially in its most rarefied form, has more third rails than a rail yard.
Imagine putting aside fidelity to the status quo and thinking instead about impact: that is, imagine if even the wealthiest and most selective universities were willing to step back and look hard at the effectiveness of their current work. Higher education should in its ideal form lead to more economic security for more people, a more equitable and innovative society, and a well-functioning democracy: in the words of Harvard’s mission statement, it should prepare “citizens and citizen-leaders.” This is a mission of great import, one that deserves scrutiny at least as constant and careful as the scrutiny that the university’s many scholars and scientists exercise within their disciplines. Those disciplines are steadily changing in response to new knowledge and new insights.
Are our best universities?
A truly self-reflective university might begin by asking questions such as the following:
Are we organizing ourselves, and are we organizing knowledge, in the best way?
Colleges and universities are among the most compartmentalized organizations in the world. Harvard College alone organizes its faculty into 50 disciplinary and interdisciplinary concentrations; across the university there are hundreds of academic and administrative departments. Each year, it seems, more are added; almost none ever goes away. Like virtually every other university, what we call “Harvard” is a large accumulation of highly specialized pieces, constructed not by design but through gradual accretion over time.
Perhaps this is the best way to do things. Or perhaps not. Many organizational thinkers argue that the most successful work “requires the integration of skills and transcending functional boundaries.”1 Higher education, the ultimate siloed enterprise, is adept at neither. Would a university with fewer internal divisions be able to eliminate redundancies and reduce costs? Almost certainly. Would such a university also teach students more effectively and perform its internal functions at a higher level? I believe that it would.
The overwhelming majority of four-year colleges and universities in the United States require students to declare a major (at Harvard called a concentration) in a discipline housed in a department. Most concentrations are designed by faculty members essentially to reproduce themselves: if you want to be an English professor or a biology professor, moving through the curricula in those departments makes perfect sense. Yet only a tiny and shrinking percentage of English majors will go on to be English professors, and if you want to do almost anything else, I would argue, there are more effective ways to organize your education than by majoring in a single discipline. Should we take it as a given that building your education around 10 literature courses is preferable to building it around a global challenge like food insecurity or climate change, around the development of an ability like creativity or clarity of expression, or even around the growth in a capacity like empathy or resilience? This is not about the value of studying literature, but about how that study should be organized in relation to other areas of study in order to prepare students for work they will do and the problems they will need to solve.
When was the last time an elite university took a serious look at the logic and efficacy of its internal organization? My investigation of that question has to date left me debating between two answers: “a really long time ago” or “never.”
Are we wasting time?
During my decades as a student, teacher, and administrator, I’ve been associated with six colleges and universities, and at every one the central work—teaching students—happened at full bore for two-thirds of the year. (To be fair, this is not the case at many two-year and some four-year institutions.) Of what other essential industry is this true? Imagine if hospitals or supermarkets or the postal service took a pause in January and another from June through August.
The evolution of the “summer break” in both colleges and K-12 schools is often, but erroneously, linked to the country’s agrarian roots. In fact, it was a concession in the mid-nineteenth century to affluent vacationers. It has stuck around, essentially, because we like it, even though most experts agree that it does more to harm than to help student learning.2
The strongest arguments in favor of the eight-month academic calendar at colleges and universities in particular are the following: it allows students, faculty, and staff to decompress after the intensity of a term (though try making that case to someone who does almost any other job); it provides time for students to help fund their education through working; and it allows faculty both to prepare their classes and to engage in scholarship. At research universities and many liberal arts colleges, this last argument in particular is the one most commonly voiced.
The strongest arguments against the typical calendar are both financial and educational. The simplest way to reduce the cost of a four-year college degree would be to make it a three-year college degree, and this could be accomplished rather easily by expanding the length of the school year. And while the evidence for a “summer slide”—that is, a decline in achievement levels after a long break—is not incontrovertible, it is strong, and it suggests that the decline is worse among older students and among students of lower socio-economic status.3
The interruption of teaching in the service of research is commonly justified on two bases. Research universities in particular, as I have said, are essential drivers of innovation, and faculty members are therefore fulfilling two roles: teaching students and advancing society through the creation of knowledge. This argument is powerful, though it might not apply with equal force to all disciplines. Try as I might, I cannot convince myself that the world is a better place because I published a book on Dickens’s Little Dorrit—though it did help me get promoted.
The second and more debatable argument is that productive scholars make better teachers. This is among the most cherished of academic beliefs—but it is, unfortunately, unsupported by evidence. The author of a review of the research on the correlation between scholarship and teaching notes that while “academics overwhelmingly think that the roles are mutually supportive,” he “cannot conclude from evidence at hand that the link is strongly positive” (nor can he conclude, fortunately, that the link is negative, which is something of a relief).4 In a very recent study, David Figlio and Morton Schapiro conclude similarly that “regardless of which measure [is] used, top teachers are no more or less likely to be especially productive scholars than their less accomplished teaching peers.”5
Serious consideration of this information might lead to a rethinking of everything from graduate training to the standards for tenure to compensation. But when was the last time an elite research university actually examined the research about the connection between teaching and…research?
The interesting question is this: how have distinguished universities been so successful for so long while managing not to ask fundamental questions about themselves? If a group as essentially static as the Ivy League existed in almost any other industry it would long ago have disappeared or drastically changed.
The answer is brand strength.
Brands—or, if one wants to sound less crass, reputations—in higher education, particularly at the upper end of the food chain, are very nearly impregnable and immutable. U.S. News & World Report began ranking universities and colleges in 1983, and the top four universities in that first year, when the single input was peer reputation, were Stanford, Harvard, Yale, and Princeton. In 2021, after the addition of columns filled with ostensibly relevant data, the top universities were Princeton, Harvard, Columbia, Yale and MIT (tied)—followed by Stanford. Coincidentally, Fortune also began ranking the “World’s Most Admired Companies” in 1983, when the top four were Exxon, General Motors, Mobil, and Texaco. The top four in 2021 were Apple, Amazon, Microsoft, and Disney—none of which made the top 100 in 1983 and one of which had not even been founded. Thirty-eight years later, none of the top four from 1983 made the top 100.
Reputational stickiness among liberal-arts colleges is not very different: Williams College has been ranked number one for 19 years in a row.
Pick an industry—manufacturing, clothing retailers, supermarkets, technology—and you will find dramatic and rapid reputational change, along with many once-prominent companies that have simply vanished. Not in higher education, where the indestructibility of reputation provides absolutely no incentive to evolve. Change is difficult, and organizations take on that difficulty either because they are forced to do so or because they are prompted by a powerful desire to be better. An organization that is perceived as the best and treated by the market as the best, regardless of what it does, will probably lack the motivation to do the hard work of change. If Apple could run back the iPhone 7 year after year, increase the price, and see a rise in demand, why would it mess around with an iPhone 8?
Now imagine if Apple were in a position to say “no” to 96 percent of those who wanted to purchase its product.
Unlike reputations in almost any other area, those in higher education are the products of an almost perfectly self-perpetuating system. Because the schools in the Ivy League have the best brands, they are the most selective and attract the best students; because they attract students who are almost certain to succeed, their graduation and job placement rates are also the best; because selectivity and graduation rates drive rankings, they remain at the top; because they are the wealthiest, they raise the most money and become wealthier still…and so it goes indefinitely, with the question of what actually happens between admission and graduation virtually irrelevant to the brand.
More self-reflective and socially beneficial practices in higher education are unlikely to originate among the most privileged. They are also unlikely to originate, I would argue, among the Googles and Amazons of the world, whose record of prioritizing the social good is less than encouraging.
Fred Swaniker, the entrepreneurial creator of a growing educational ecosystem in Africa, has argued convincingly that constraint drives innovation, and one thing that higher education has in abundance is constraint.6 For every university awash in applications and endowment income, there are hundreds across the country pressured by scarcity to become more efficient and creative; for every loosely connected confederation of professional schools, research institutes, and hospital systems, there are hundreds of colleges whose sole focus is teaching; for every institution with a centuries-old history, there are dozens around the world unburdened by the weight of that history. These colleges will not revolutionize medicine or artificial intelligence—for that, indeed, there is Harvard—but they might transform the way we educate the next generation of people who will.
It will not be Princeton but some small college in the Midwest that pioneers a new, less siloed organizational structure; it will not be Stanford but a struggling college in New England that rethinks the reward system for scholarship and teaching; and it will not be Harvard but a new university in Africa that designs a curriculum around missions and not majors.7
And eventually, perhaps, the Ivy League will catch up.
______________
1. https://medium.com/the-ready/a-practical-guide-to-cross-functional-work-e94f7f51d41a
2. https://www.pbs.org/newshour/education/debunking-myth-summer-vacation
3. https://www.brookings.edu/research/summer-learning-loss-what-is-it-and-what-can-we-do-about-it
4. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.145.1140&rep=rep1&type=pdf
5. https://www.insidehighered.com/views/2021/03/23/what-new-research-study-tells-us-about-staffing-higher-ed-classroom-opinion
6. https://www.howwemadeitinafrica.com/fred-swaniker-opportunities-in-post-covid-19-africa/76684/
7. https://www.alueducation.com/global-challenges-missions-not-majors/