Right Now | Urban Tunnel Vision
Cities Too Smart for Their Own Good?
Wouldn’t everyone want a hometown that’s a “smart city”? The answer seems obvious: who, after all, wants to live in a stupid city? And indeed, the technologies touted by smart-city advocates can seem utopian: self-driving cars, pothole-reporting apps, and sensors to detect the public’s every need—all connected by free public WiFi!
But for Ben Green, this utopian view is precisely the problem. Informed by his experience designing technology for the cities of New Haven, Memphis, and Boston, Green—now a doctoral candidate in applied math at Harvard’s Paulson School of Engineering and Applied Sciences and an affiliate of the Berkman Klein Center for Internet and Society—believes the technologies sold to policymakers and the public as tools of a brighter, optimized urban future actually have much darker potential. “The smart city threatens to be a place where self-driving cars have the run of downtowns and force out pedestrians,” he writes in a new book, The Smart Enough City (MIT Press), “where civic engagement is limited to requesting services through an app, where police use algorithms to justify and perpetuate racist practices, and where governments and companies surveil public space to control behavior.”
At the heart of Green’s warning is a mind-set he calls “tech goggles,” a tunnel vision that answers every problem with new technology. Thus, in the “smart city” vision, seemingly apolitical aspirations like “smartness,” “efficiency,” and “innovation” take on distorted meanings, focused on technology to the exclusion of all else. Often, Green said in an interview, “these visions are put forward by tech companies with clear profit motives to shift both what the public wants and what city governments believe is useful and valuable.”
As an example of tech-first thinking distorting a solution, he draws a parallel between cities adapting to automobiles in the early twentieth century and to autonomous vehicles in the twenty-first. In the 1920s and ’30s, automakers and engineers pushed “scientific” and “objective” methods for optimizing traffic speeds to pave the way for their new transportation technology, the car. Municipal roads redesigned for speeding vehicles pushed bicycles, pedestrians, public transit, and playing children off the streets, forever changing the geography of the American city.
Recent research into using autonomous vehicles to reduce traffic is already repeating this mistake, Green warns. In one instance, researchers at MIT demonstrated the supposed efficiency gains of self-driving cars with a simulation of the intersection of Massachusetts and Columbus avenues in Boston’s South End. Instead of waiting at traffic lights or crosswalks, the simulated cars coordinated with each other to move seamlessly through the intersection. “But there’s one important thing missing,” Green writes: “People.”
When political decisions are hidden inside technology design, citizens can’t shape the future of their own city.
If the simulation came to life in the real South End, the bike lanes, crosswalks, and walkable business districts that make it an attractive place to live would vanish. For Green, that’s a warning of how a seemingly objective technical solution could obscure intensely political decisions. “Any time you are trying to make a system more efficient,” he said, “you are by definition cutting out the things deemed inefficient. There’s a great deal of hidden politics around what is actually being defined” that way. And when such political decisions are hidden inside technology design, it becomes impossible for citizens to shape the future of their own city.
Green brims with cautionary tales, from a consortium of companies, backed by investment and leadership from Sidewalk Labs (a subsidiary of Google’s parent company, Alphabet), that uses its control of New York City’s free public WiFi hotspots to slurp up detailed personal data, to predictive policing algorithms that exacerbate biases in cities from Oakland to Chicago by sending officers to patrol poor and minority neighborhoods—and arrest the local residents for minor crimes.
Green thinks it’s possible to escape these “smart” mistakes without discarding technical innovation entirely, pointing to cities that considered the social implications of their data and technology as they developed them—and were better for it. Johnson County, Kansas, for instance, used crime statistics and other data not to direct police patrols but to expand social services for citizens at risk of falling through the cracks, before they entered the criminal justice system. “It’s not that cities should have no people who care about technology,” he said, “but they shouldn’t have their entire sense of innovation and progress based around technology.”
He hopes his book will be a wake-up call both to practitioners already in the field and to residents pushing for better cities. He says cities should not compete to be “smarter” than their peers if all that means is having newer and more powerful technology. Instead, he challenges them to take off the “tech goggles” and build the cities people want to live in.