Harvard’s New Playbook for Teaching with AI

Faculty across Harvard are rethinking assignments to integrate AI. 

AI policies and assignments are “buffering” across Harvard schools | MONTAGE BY NIKO YAITANES/HARVARD MAGAZINE; IMAGES BY UNSPLASH

As generative AI becomes as common as word processors and spreadsheets, educators are shifting from debating whether to allow it in classrooms to creating clear guidelines for how to use it effectively.

At Harvard’s Initiative for Learning and Teaching (HILT) conference in late September, faculty from across the University shared some of the ways they’re rethinking assignments and instruction with AI in mind.

In a panel organized by Esther Kotecha, the associate director for teaching and learning technologies at Harvard Medical School (HMS), and Mae Klinger, the associate director of teaching and learning innovation at Harvard Kennedy School (HKS), educators explored how AI can support, not substitute, deep learning among students.

Rethinking Assignments with AI at the Kennedy School

Teddy Svoronos | PHOTOGRAPH BY NEAL HAMBERG; COURTESY OF THE HARVARD INITIATIVE FOR LEARNING AND TEACHING (HILT)

Teddy Svoronos, a senior lecturer in public policy at HKS, introduced a “traffic light” framework for AI use in his courses this semester. Green means AI can be used without restriction; yellow means AI use is allowed with limitations; and red signals that AI use is prohibited, typically for tasks in which the learning objective would be compromised by AI.

Red-light tasks often precede in-class discussions where students must build and defend arguments collaboratively. Green-light tasks, by contrast, encourage experimentation. In those cases, students work with AI platforms or custom tutor bots and are prompted to reflect on questions such as: What did the AI assist with? What did it fail to address? What was learned through its use?

Svoronos also introduced AI-facilitated oral exams. Students engage in a Socratic dialogue with a conversational AI trained on course material. The AI doesn’t grade; instead, instructors review transcripts to assess performance. This preserves the human role in evaluation while exploring new learning formats.

In another assignment from Svoronos, students first create a data visualization in Excel (a task drawing on their existing skill set). Then, they use AI to generate a more advanced version (a task that pushes beyond, or is wholly outside, their current capabilities). And in a final phase, students critically audit the AI-generated output with questions like: What aspects were accurate? What content appeared fabricated? What would require verification before public use or academic submission?

Svoronos also had students co-develop the class AI policy, with the goal of increasing transparency over expectations and giving them ownership over learning objectives. The exercise also reinforced a broader pedagogical principle: students who help teach, even indirectly, often learn more deeply.

AI as a “Thinking Partner” at Harvard Medical School

Tari Tan, a lecturer on neurobiology and the assistant dean for educational scholarship and innovation at HMS, described how her students interact with AI in the context of lesson planning.

Tari Tan | PHOTOGRAPH BY NEAL HAMBERG; COURTESY OF THE HARVARD INITIATIVE FOR LEARNING AND TEACHING (HILT)

Students first annotate their own lesson materials, then submit them to ChatGPT and compare the results. They reflect on the quality of their prompts, potential bias in the output, and how the AI’s responses align with their goals. This helps reinforce a key lesson, she said: while generative AI may mimic fluency, it doesn’t replicate human reasoning. As a consequence, students begin to see it not as an infallible source of information, but as a thinking partner.

Tan emphasized the importance of metacognition, or “thinking about one’s own thinking.” When students can articulate how they’re using AI, she said, they’re better able to identify when they’re learning and when they’re outsourcing thought. She also noted that using AI in this way deliberately adds “cognitive load,” requiring students to assess the quality, relevance, and accuracy of AI outputs. In some cases, the effort of managing the tool becomes a distraction from the task itself.

Critically, Tan does not grade her students’ AI-generated content. Instead, she assesses their reflections and their ability to use AI meaningfully. Over time, students get better at writing effective prompts and critiquing the AI’s output. They learn to identify hallucinations, jargon that masks weak logic, and content that sounds plausible but lacks substance.

Tan’s broader takeaway from the experiment is that AI won’t make teaching obsolete. It may, however, render disengaged or uncritical teaching practices obsolete.

Reshaping Executive Education at Harvard Business School

As generative AI becomes central to the modern workplace, Harvard Business School is rethinking how to prepare students to use it in the real world. During Boston AI Week, which took place from September 26 to October 3, HBS faculty and staff discussed experiments with AI-powered tools, particularly custom GPTs, to enhance instruction and provide additional student support.

Fritz Kocher, a senior application analyst at HBS, introduced a GPT model that helps faculty process course feedback by turning lengthy student comments into actionable insights and visual summaries. The tool, he said, serves as a “warm handoff” to faculty, not a replacement for human judgment.

“You go from 50 pages of feedback, straight into an appointment to work on the insights it gives you,” Kocher said. The tool also provides structured visualizations and is aligned with teaching principles from the C. Roland Christensen Center for Teaching and Learning.

Dustin Hilt, the director of HBS’s Live Online Classrooms initiative, and Jaye Schneider, the associate director of business relationship management, said it was important to help faculty grow comfortable using generative AI. Schneider’s team developed scaffolded “prompt libraries,” or standardized, pre-written phrasing, to ease the learning curve and support successful adoption.

Wendy Riseborough, a senior creative producer at HBS, described an experiment in which AI-generated avatars were used to populate virtual classrooms when live participants were not present. The goal of this project was to ensure that asynchronous students could still feel “present” within the classroom environment, even remotely. The team explored dozens of video-generation platforms. Early results were mixed (some avatars were too stiff or obviously artificial), but advances in platforms like HeyGen have brought greater realism.

As more AI tools are introduced, panel members said, data privacy remains an important focus at the University. For example, HBS mandates that all generative AI tools must not be trained on user data; must operate in a so-called “walled garden,” or protected from outside parties; must comply with strict data privacy contracts; and must enable faculty to safely upload HBS cases and course data without including any personally identifiable information or financial data.

Historically, outright bans on new technologies have often driven their use underground, making integration and regulation more difficult. A punitive approach to generative AI, panelists agreed, may similarly reduce misuse but risks fostering stigma and confusion, which can lead to uninformed or inappropriate use.

Read more articles by Olivia Farrar
