Facebook’s Failures

Author and tech journalist Jeff Horwitz speaks at Harvard.


In 2019, when Wall Street Journal reporter Jeff Horwitz began asking Facebook for information about the inner workings of its complex recommendation system, which pushes personalized content into users’ feeds—and is central to the platform’s immense growth—the answer he got back from the company was, “Absolutely not.” So he started digging.

On Thursday, Horwitz told this story during a lunchtime talk at Harvard about his book, Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets, published in November. Hosted by the Kennedy School’s Shorenstein Center on Media, Politics, and Public Policy, and moderated by Latanya Sweeney, Paul professor of the practice of government and technology, the conversation also included Dylan Moses, a third-year Law School student and fellow at the Berkman Klein Center for Internet and Society. Moses previously served on Facebook’s trust and safety team, where he was responsible for mitigating hate speech and terrorism; he and Horwitz first met and started talking in 2019, soon after Moses left the company.


Broken Code extends the work of the “Facebook Files,” a prize-winning series of articles that Horwitz and his Wall Street Journal colleagues published in late 2021, based on screenshots of more than 20,000 internal documents, most of them shared by Frances Haugen, a conscience-stricken Facebook product manager, who left the company soon afterward and came forward as a whistleblower to speak out against it.

Horwitz’s book is full of revelations about failures and manipulations by Facebook that fractured politics around the world, rewarded extremism and violence, facilitated the spread of hatred and misinformation, and harmed countless users. At Thursday’s talk, Horwitz described the company’s reluctance to address problems with the platform even after its leaders, including CEO Mark Zuckerberg ’06, LL.D. ’17, belatedly became aware of them. Again and again, he said, they chose profits and relentless growth, no matter the social cost. He described “a kind of bias toward doing nothing,” which often resulted from “simple incentives.” Pointing to one example, he said, “It turns out that leaving up 100 pieces of bad content by a public figure is a lot less painful from a corporate point of view than taking down one piece of marginally OK content from the same figure….An aversion to PR fires turned into a guiding force.”

In the immediate aftermath of the 2016 election, when it became apparent that Facebook had played a significant role in rising American polarization and disinformation, Horwitz said, “There was a boom in internal research…partly because there was legitimate interest in figuring out what the hell had happened.” Facebook’s leaders came to understand how the platform had been “gamed” by users boosting content with dummy accounts that artificially inflated engagement numbers. Facebook officials saw how the platform’s algorithmic tweaks had been “actively optimizing for trash,” Horwitz said. But ultimately, even with this new knowledge, they didn’t make changes. “The ‘move fast and break things’ culture is very, very real.” But also: by the time they realized the mistakes that allowed provocateurs like far-right conspiracy theorist Alex Jones to gain prominence and power, there was already a whole “ecosystem” in place of popular users publishing this kind of content, Horwitz said. And Facebook was reluctant to disrupt that ecosystem, because it might threaten the company’s growth. “There was little desire to sacrifice engagement for societal stuff,” Horwitz said. “It sounds hyperbolic when I say it, but…[the idea of] trading a, say, 30 percent reduction in misinformation for .1 percent of daily active usage was dead on arrival. And that is not theoretical,” he added. “They repeatedly did make those decisions.”

In the face of all this, Horwitz said, his own work can sometimes feel frustratingly futile. The “Facebook Files” series caused an uproar when it was published—there were Congressional hearings, and Haugen, who’d been an anonymous source, went public with her story. Soon afterward, Facebook changed its company name to Meta. But the fervor seemed to dissipate. “The company gets publicly shellacked, and then—nothing happens,” Horwitz said. “And that was deeply depressing.” Still, he believes, “Writing up how all this stuff worked was really important, in the same way that it’s important to understand how the history of the printing press might have helped cause many decades of war in Europe. You write it up anyway, even if nothing’s going to change.”

During the past year, Horwitz has turned his attention more specifically to issues of child safety on Facebook and Instagram, both Meta platforms. His recent articles in the Wall Street Journal have covered how the company’s algorithms and tools make it easier for pedophiles to find children online, help shield predators from view, and foster networks of pedophiles. Last week, he wrote about how the company’s new paid subscription features are being misused by parents trying to profit from exploiting their children, selling images of young girls in bikinis and leotards to an overwhelmingly male audience whose comments on the platform make plain their sexual interest. He’s also explored how Instagram and Facebook seek to take advantage of teenagers’ psychological vulnerabilities, prioritizing young users’ engagement over their well-being.

Child safety, Horwitz explained, is “a better hill to fight on,” a rare space of clarity in the otherwise messy world of social media content management—and an area that could lead the way in finally forcing real regulation of platforms like Facebook. Proposing restrictions against users such as anti-vaccine activists spreading misinformation can get into free speech questions that make people feel “a little squeamish,” Horwitz said. “Whereas, guys that are building massive followings by posting images of little girls doing splits, which they stole from other users—what restrictions do we put on those guys? The answer is, all of them.”

Moses asked him about the Kids Online Safety Act, a bipartisan bill recently introduced into Congress that would require social media networks to take “reasonable measures” to prevent harms such as bullying, harassment, and sexual exploitation. “That seems like something that might have some legs?” Moses offered. Horwitz seemed lukewarm—the bill faces a difficult legislative path. Where he did allow himself some optimism was in the substantial knowledge that Facebook researchers put together on how to mitigate the platform’s harmful effects. That knowledge has so far gone mostly unused, but it is robust and potentially powerful. “Facebook trained so many people, in terms of trust and safety,” Horwitz said. “There is at this point a great body of literature demonstrating the specific effects of pulling specific levers.” Research has shown, for instance, that reducing a problematic piece of content’s virality even mildly can meaningfully reduce disinformation. “And that’s a lever that will work anywhere,” he said, if only companies like Facebook would use it.

By Lydialyle Gibson
