At the Faculty of Arts and Sciences (FAS) meeting on November 4, Harry R. Lewis, Gordon McKay professor of computer science, posed a question. He had learned from two colleagues, he said, that students in their spring lecture courses had been photographed without prior notification or consent of either professors or students, to gather data for a Harvard Initiative for Learning and Teaching (HILT) study on attendance. This “surveillance,” he observed, appeared at odds with professors’ usual control over their classrooms—and with the lessons purportedly learned, painfully, during the 2012-2013 academic year: the news that resident deans’ e-mail accounts had been examined during an investigation of academic misconduct led to the drafting of new University policies on electronic privacy. He embraced the principle of “more peer feedback on our teaching,” Lewis said, but colleagues and students did not wish to go about their academic work “never knowing for sure whether we are being or have been under scrutiny.” He asked for assurance that all the subjects of “this nonconsensual study” be informed that they were photographed.
The study had come to light obliquely during a September conference, when Samuel Moulton, HILT’s director of educational research and assessment, discussed attendance in 10 unidentified lecture courses. His exhibits showed that attendance declined from the beginning to the end of most weeks, and across the semester. The strongest factor in whether students showed up, he reported, was their reason for taking a course; pre-medical requirements were correlated with high attendance. Moulton added that attendance is a measure of student engagement: “People vote with their feet.”
Lewis’s remarks made clear what no one had noted at the time: the data were collected photographically. FAS does not routinely take attendance or assign seats. Lewis’s query brought to the microphone vice provost for advances in learning Peter K. Bol, who oversees HILT and the HarvardX online-learning program. (Bol is Carswell professor of East Asian languages and civilizations, and a director of Harvard Magazine Inc.) He had heard anecdotally, he said, that students were increasingly prone to skip class, among other signs of diminished academic rigor (less work outside of class, less note-taking). “Such anecdotes raised questions about the effectiveness of lectures as a way of helping students learn,” he said, “and suggested that there might be some value in exploring how new media and pedagogical techniques might be used by faculty to turn the lecture into something…more interactive and engaging….” But, Bol continued, “we did not have any data to support the anecdotes. I thus looked for a way of getting data on attendance, because that seemed to be the only thing that could be measured in a straightforward way that did not rely on self-reporting.” To avoid study bias and protect student identities, the experiment was designed to use photographic recording of lecture halls, from which full and empty seats could be counted. The Committee on the Use of Human Subjects in Research, he reported, determined that this was not “human-subjects research,” and so could proceed without prior notice or consent protocols. He shared the data, once analyzed, with the course heads, and the underlying images were destroyed.
Bol said there would be more consultation before studies involving undergraduates proceed in the future, and President Drew Faust said the oversight committee on electronic-communications policy would also be consulted. The few faculty members who commented from the floor suggested they could answer questions about their teaching and attendance directly, if asked.
The following week, Bol used a blind e-mail list of registrants to notify students in the courses that were photographed. He advised that “[t]he researchers involved in this study do not know who was enrolled” and that no individuals were identified, and he invited comment on any lingering concerns. The Harvard Crimson, meanwhile, in a bit of enterprising reporting, discovered that 29 courses had been photographed, not just the 10 about which Moulton spoke: 22 from the College and the Graduate School of Arts and Sciences, and 7 from the Extension School.
Analysis of data on the 19 other courses has not been completed, and may not be, and the underlying images have been destroyed for all 29 courses, according to HILT’s director, Erin Driver-Linn, and Moulton. “[T]his research was never meant to bring scrutiny to individual courses, faculty, or students,” they wrote, “nor was it ever meant to judge individual courses or faculty.…The goal has consistently been to understand lecture attendance in order to be able to ultimately improve student engagement and learning.”
In mid-November, HILT published findings on the 10 courses analyzed. Among them:
• On average, 60 percent of students attended any given lecture.
• There was significant variability among courses, with average attendance during the semester ranging from 38 percent to 94 percent.
• Overall, attendance declined during the semester, from 79 percent to 43 percent.
As explanatory factors, the report noted, “[C]ourses that measured and graded attendance had higher attendance than those that did not (87 percent vs. 49 percent, respectively).” Premed requirements also mattered, as noted above. Finally, “Other reasons for taking the courses (e.g., elective vs. General Education requirement) did not show significant effects, nor did time of day, day of week, published Q ratings [student course evaluations], or the availability of lecture videos.”
For this sample, at least, HILT acquired data on lecture attendance—at considerable financial cost, and at least some cost in faculty and student goodwill. If the study prompts further discussion of the efficacy of lectures versus more engaged “flipped” courses (in which students watch recorded videos before class, then come together to work on problems and master more difficult concepts—an approach both Lewis, some years ago, and Bol, more recently, have pursued), that might be a good thing. So might professors’ voluntary agreement to invite peer review of, and feedback on, their pedagogy. Combined with HILT-funded teaching experiments and analytics, and HarvardX’s technological wizardry, such interventions present plenty of opportunities for gains in instruction and learning.