Even if you’re not all that revved up about the arrival of self-driving vehicles, you might soon share the road with them as a driver, cyclist, or pedestrian.
So let’s say you get hit and injured by a self-driving car one sunny day in the not-too-distant future. And it was totally preventable. But in the headlong rush to perfect self-driving technology, no one wanted to share — so you’re left to recuperate and seethe.
What do I mean by share?
As test cars roam the streets of Boston, San Francisco, Pittsburgh, and other cities, they’re collecting information about the environment, as well as situations that they’ve never encountered before. It could be a way to safely merge from the Mass. Pike onto I-95, or a way to avoid hitting the person pushing a baby stroller out from between parked cars on Newbury Street. Roadways are complicated places — especially when it comes to trying to share them with human drivers, whose intentions can be unclear to a software chauffeur.
But as companies like Uber, Toyota, Ford, General Motors, and others use the data they’ve gathered on the mean streets to make their cars safer, there’s no incentive for them to share strategies for handling potentially dangerous situations. That could mean that when the vehicles are eventually sold, some will be safer than others.
Do we want makers of autonomous vehicles to pitch their products based on being less likely to mow down a pedestrian — or on other features, like comfort, range, or whether the in-car Wi-Fi is fast enough for five passengers to stream five different Netflix shows?
Today, MIT researcher Lex Fridman says, the developers of self-driving cars are rather secretive. “Among the big companies,” he says, “nobody knows what success in this industry requires. So there’s a feeling of possessiveness about giving away stuff. Doing an occasional demo of autonomous driving through Boston or San Francisco produces a large amount of value. But giving away algorithms or details about the data you collect seems to diminish the value. Right now, it’s a game of hype.”
But to create an ecosystem where lots of manufacturers can succeed, Fridman suggests, “we have to share data and ideas.” He says it would be “extremely useful” to create some kind of database where manufacturers could deposit — or withdraw — models and strategies that would make their vehicles safer in dealing with other cars, pedestrians, or a deer darting out onto a dark country road.
One way to set that up would be as an industry consortium housed somewhere neutral. MIT, for instance, is already home to the Advanced Vehicle Technology Consortium, which uses cameras and sensors that drivers voluntarily install in their cars to see how human drivers use today’s “autopilot” technologies, available in Teslas and some Volvos and Range Rovers. Consortium members such as Toyota, Liberty Mutual, and Jaguar Land Rover all can benefit from the data gathered.
Of course, there will be challenges — which will cause companies to push back against the idea in the short term. One question is how to “share the data in a way that will respect confidentiality of the underlying algorithms and designs,” says Herman Herman, director of the National Robotics Engineering Center at Carnegie Mellon University. By that he means ensuring that companies can keep the “secret sauce” of their software confidential, while sharing models of how to best anticipate and handle dangerous situations. Some of those situations will have been caused by humans being unpredictable. An example: the recent crash of an Uber vehicle in Arizona.
Herman also notes that there are questions about whether individual carmakers will choose to make different moral decisions about safety, or whether we will want a shared moral code. Is it preferable, for instance, for the vehicle to hit a cyclist, or should it slam into a telephone pole to avoid hitting the cyclist, risking injury to those in the car? “There are many ways to define safety,” Herman says, “including safety for whom?”
Creating any sort of shared repository will require establishing standards so that the data one company collects is useful to others, “which we are far away from at this point,” Sertac Karaman observes. He is president and chief scientist at Optimus Ride, a Boston startup developing autonomous vehicle technology. And, he adds, “the data sets are usually massive.” Optimus Ride has set up an indoor testing facility in the Seaport district, while another startup, nuTonomy, began road tests in that neighborhood earlier this year.
Building autonomous vehicles is expensive and complicated, and adding new layers of cost and complexity won’t be popular. But if you assume that data sharing to enhance safety would be useful — if difficult — there’s a big question looming ahead for self-driving cars. In the Trump era, will a federal agency like the National Highway Traffic Safety Administration require carmakers to create a data exchange? Will carmakers do it voluntarily? Or will the industry just kick the can down the road, until we realize that Brand X vehicles seem to be causing a lot more fatalities than Brand A, B, or C?
“Everybody in this industry wants to develop a safe solution, so I think the motivation is there to do the right thing,” Herman says. “An accident with one company’s product will taint the public perception for the whole industry as well.”