Government Documents Reveal New Details About the Humans Behind Tesla and Waymo Robotaxis


Are self-driving cars really just giant remote-controlled cars, with nameless, faceless people in distant call centers keeping watch from consoles? As the cars and their science-fiction-like software proliferate in more cities, conspiracy theories are swirling in group chats and on TikTok, fueled in part by self-driving car companies' reluctance to talk directly about the people who help make their robots go.

But this month, in government documents submitted by Alphabet subsidiary Waymo and electric car maker Tesla, the companies disclosed more information about the people and systems that assist the cars when their software gets confused.

The details of these companies' "remote assistance" programs are important because the people who support the robots are essential to ensuring that the cars drive safely on public roads, industry experts say. Even well-behaved robots often run into situations that confuse their self-driving systems. See, for example, the December power outage in San Francisco that knocked out the city's stoplights, leaving confused Waymos stranded at several intersections. Or the ongoing government investigation into several incidents in Austin, Texas, in which these vehicles illegally passed stopped school buses dropping off students. (The latter has led Waymo to issue a software recall.) When such things happen, people get the cars out of their jams by steering or "advising" them remotely.

These activities matter because if people do them wrong, it can be the difference between, say, a safely stopped car and one that runs a red light. "For the foreseeable future, there will be people participating in the behavior of these cars, so they have a safety role to play," said Philip Koopman, a self-driving car software and safety researcher at Carnegie Mellon University. One of the hardest safety challenges with self-driving cars, he says, is building software that knows when to ask humans for help.

In other words: If you care about robot safety, pay attention to people.

The people of Waymo

Waymo operates a paid robotaxi service in metro areas including Atlanta, Austin, Los Angeles, Phoenix, and the San Francisco Bay Area, and has plans to launch in at least 10 more, including London, this year. Now, in a blog post and a letter sent to US Senator Ed Markey this week, the company has made public some aspects of what it calls "remote assistance" (RA), which uses remote workers to respond to requests from Waymo's vehicle software when it decides it needs help. These people provide data or advice to the software, writes Ryan McNamara, Waymo's vice president and head of global operations. The system may use or reject the information those individuals provide.

“Waymo’s RA agents provide advice and support to the Waymo Driver but do not directly control, direct, or drive the vehicle,” McNamara wrote—flatly refuting the charge that Waymos are remote-controlled vehicles. About 70 assistants are on duty at any given time to monitor some 3,000 robots, the company said. That low ratio of humans to vehicles suggests the workers carry heavy caseloads.

Waymo also confirmed in its letter what an executive told Congress in a hearing earlier this month: Half of these remote assistance workers are contractors overseas, in the Philippines. (The company says it has two other remote help desks in Arizona and Michigan.) These workers have driver’s licenses in the Philippines, McNamara wrote, but are trained in US traffic laws. All remote assistance workers are drug- and alcohol-tested when hired, the company said, and 45 percent are drug-tested every three months as part of Waymo’s random testing program.
