Applying AI—How and Why

An AI-produced illustration: ChatGPT’s DALL-E image generator made this in response to the prompt “Pretend you are artist Ernst Haeckel. Draw an intelligent-looking octopus with tentacles enveloping a university building.”

How will generative artificial intelligence change higher education? While attention has focused on AI’s potential to circumvent student learning of fundamental skills such as writing, demonstrations of its applications in fall semester courses suggest that it could also be a potent means of improving pedagogy: a French-speaking avatar that answers student questions about life in Paris, a chatbot that can tutor computer science students at any hour, an AI system that answers philosophical questions in the manner of Kant (or any other philosopher) in order to spark discussion and critique. Outside the classroom, AI tools such as ChatGPT and other large language models (LLMs) could enhance teaching effectiveness in other ways, such as by automatically encouraging students to come to office hours, particularly those who appear most likely to benefit.

To explore the risks and opportunities, a group of professors and deans in the Faculty of Arts and Sciences has been charged with guiding FAS “strategy and planning regarding AI systems” and with recommending major investments and policy or process changes as needed. The Artificial Intelligence Systems Working Group Executive Committee began meeting last summer to assess the technology’s impacts on pedagogy, academic and research integrity, and administrative innovation.

“AI is a game changer,” says Paul professor of the practice of government and technology Latanya Sweeney, who cochairs the committee with FAS dean of administration and finance Scott Jordan. “The question is, ‘How will it change the game?’” The group’s role, she explains, is not to produce a report with rules and recommendations but to provide the University community with opportunities and support that encourage safe interaction with the technology. As part of that effort, Harvard’s information technology services organization has built an AI “sandbox” that holds seven different LLMs. Unlike with commercially available versions of these tools, information uploaded to the sandbox does not become part of the AI’s training data. The University has also licensed a privacy-protected version of ChatGPT-4.

How the technology is deployed in courses is up to professors, whose directives to students range from outright bans to encouragement of its use. Is it okay to ask ChatGPT to summarize a reading? To edit a graded paper? No matter what the course policy, students remain responsible for their work. And since tools like ChatGPT are known to “hallucinate” data and facts (i.e., make things up) on occasion, students have approached the technology cautiously, reports Sweeney.

Some professors have incorporated AI tools into their courses in innovative ways; a December 6 faculty presentation showcased several of the approaches being tested in classrooms. Senior preceptor in romance languages and literatures Nicole Mills, director of Harvard’s French, Italian, and Portuguese language programs, demonstrated a virtual French world, set in Paris, and explained how she had staged a whodunnit murder mystery in the Centre Pompidou for students to solve in class. There were some hiccups (occasionally the AI, trained on the speech of native French speakers, could not understand student accents), but the reception by her class, she reported, was unanimously enthusiastic. Other instructors have used the technology to rapidly summarize student learning based on in-class polls or quizzes, and have even used it as a stand-in for a student, asking ChatGPT questions and teaching from the mistakes in its answers.

In the administrative realm, says FAS’s Jordan, many of the large-scale commercial software packages widely used in administration are likely to begin incorporating AI tools such as bots for answering commonly emailed questions or for accessing financial information and generating reports in a fraction of the time now required. “To run a query on the accounting system” today, he explains, requires specific account and department knowledge. The output is then incorporated in a spreadsheet for presentation. AI tools could “reduce all that to a world where you’re just having a conversation with your computer. ‘How much did we spend on bottled water last year? In the last five years? Could you put that in a graph for me?’”—a trivial example, but illustrative. The technology could also help ease the administrative burdens of faculty members—some of whom write scores of recommendation letters—by securely accessing student records and drafting custom text based on the data.

How will supervisors evaluate employees who use AI tools versus those who don’t? “Productivity is important,” says Jordan. “I see a time, several years from now… when it would be hard for an employee to keep up with their work” if they weren’t using the available technology—analogous to composing letters on a typewriter instead of a word processor.

But the norms of AI tool use in classrooms, administrative offices, and research labs have yet to be firmly established, within Harvard or elsewhere. A case in point: submitting a manuscript prepared using ChatGPT to the journal Science is considered academic misconduct. Sending it to Nature, and acknowledging the tool’s use, is no impediment to publication. “The fact that both journals have positions,” says Sweeney, “and that the positions are opposite and extreme, I do think is very noteworthy about the moment that we’re in.” It is the sort of challenge the working group aims to help students, teachers, and staff sort out in the coming year.

