Hiawatha Bray | Tech Lab

Back seat driver? New car cams may soon sense driver fatigue, texting, other distractions

Affectiva Automotive AI identifies a driver’s state of mind, such as drowsiness, distraction or anger. (Affectiva)

While you keep an eye on the road, your car may soon be keeping an eye on you.

Automakers have already installed cameras inside some new car models that track the movement of the driver’s head and eyes, to ensure he or she is paying attention to the road, and not to a smartphone. The systems, which will become standard equipment on many European cars in a couple of years, are already available on at least one US model, the Cadillac CT6, from General Motors Co.

Now a Boston company is moving inside your car, too, with a system to study a driver’s face to assess his or her mental state. Affectiva Inc. is running tests with several carmakers of software that analyzes facial expressions to determine if a driver is distracted, angry, scared, sleepy, or drunk.

“It’s not just about emotions. It’s about your state of mind,” said Affectiva cofounder Rana el Kaliouby. “We can take this a step further and go a level deeper.”

The software is based on technology Affectiva has deployed to advertisers to measure emotional responses to TV commercials and movies. The company declined to identify the carmakers testing its new system. Elsewhere the company has formed partnerships with Wind River Systems, a maker of specialized operating system software for cars, and Veoneer, a Swedish maker of automotive electronics systems.

And in September, Affectiva teamed up with Burlington-based Nuance Communications Inc., maker of speech-recognition systems used in about 200 million cars worldwide. The two companies are building an integrated system that will help a car’s systems adapt to the emotional and mental state of the driver.

“We can make our interaction with the driver more human-like,” said Nils Lenke, director of innovation at Nuance’s automotive group. “There are much more options if you can engage the driver in a dialogue.”

Imagine, for instance, that video from a dash-mounted camera shows the driver’s eyes flickering and his head slumping. Affectiva software determines the driver is falling asleep and passes this information to the Nuance system, which can send verbal suggestions, depending on the situation. If the car is idling at a traffic light, it might say, “Hey buddy, you could use a cup of coffee,” in a friendly voice, while turning up the air conditioning to make the car less comfortable. The message might be more urgent if the sleepy driver is rolling down the interstate at 80 miles an hour: “Pull over, right now!” the car might shout.
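Neither Affectiva nor Nuance has published the logic behind such a system, but the escalation described above could be sketched roughly as follows. This is purely illustrative: the drowsiness score, the speed threshold, and the function itself are assumptions, not the companies’ actual software.

```python
from typing import Optional

def drowsiness_alert(drowsiness: float, speed_mph: float,
                     threshold: float = 0.7) -> Optional[str]:
    """Map a hypothetical drowsiness score (0 to 1) and vehicle speed
    to a suggested spoken alert, or None if the driver looks alert."""
    if drowsiness < threshold:
        return None  # driver appears awake; stay quiet
    if speed_mph < 5:
        # Idling at a light: a friendly nudge is enough.
        return "Hey buddy, you could use a cup of coffee."
    # At highway speed, escalate to an urgent warning.
    return "Pull over, right now!"
```

The point of the sketch is only that the same sensor reading can produce very different responses depending on context, which is what the Affectiva–Nuance integration is meant to enable.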

Founded in 2009 and based on innovations from the Massachusetts Institute of Technology, Affectiva trained its artificial intelligence to accurately detect emotional responses by feeding it seven million images of human faces, collected in 87 countries and including men and women of every age and ethnic background.

There are limits. Affectiva’s software cannot yet tell whether a driver is drunk. El Kaliouby said the software can be taught to recognize inebriation by showing it a great many photos of drunk drivers, but those aren’t easy to come by. The company has begun collecting such images, but “we’d probably need hundreds of thousands of examples of people intoxicated,” said el Kaliouby, “and it would have to be done in a safe way. So that’s a challenge.”

There are other challenges, according to Colin Barnden, an analyst at the British electronics research firm Semicast Research Ltd. Barnden said real-time emotion tracking will require a fairly powerful computer, one likely to cost a lot and consume a lot of electricity — major issues for modern cars already stuffed with power-hungry gadgets and fetching hefty price tags.

“It’s not clear to me how you reduce the system to the point where you can include it in a vehicle,” Barnden said.

Affectiva also has no experience making software for cars, a market with ferociously high reliability standards. Barnden said it usually takes at least three years for a new product to be certified as “automotive-grade,” and that carmakers are very slow to purchase new technology from vendors who have no experience making automotive-grade components.

El Kaliouby acknowledged the challenge of landing contracts with the carmakers. That’s one reason her company has also partnered with companies such as Nuance, Veoneer, and Wind River.

“They obviously have a lot of expertise in this area,” she said.

But car software isn’t a sideshow for Affectiva; el Kaliouby said it will be the company’s primary focus going forward. “Basically, we identified automotive as the next growth opportunity for our company,” she said.

To see why, look overseas. The European New Car Assessment Programme, the independent body that rates the safety of new cars sold in Europe, has said that beginning in 2020, only cars with driver-monitoring systems will be eligible for its highest five-star safety rating. Because of Europe’s intensely competitive car market, el Kaliouby believes that most new EU cars will monitor their drivers within a few years.

She also figures it’s only a matter of time before the same technology becomes commonplace in the United States. “Even if we don’t see regulations in the US,” she said, “it’s going to reset consumer expectations.”

Meantime, el Kaliouby is eyeing other customers for Affectiva’s car software. The company wants to look beyond the driver to the faces of passengers, generating data that ride-sharing companies, for example, could use to create custom “ride profiles” for each customer.

Imagine a frequent Uber rider who smiles at country music on the radio, whose eyes widen with fear at an abrupt acceleration, or who looks bored when the driver gets chatty. Uber could send a message to the driver that if he wants a bigger tip, he should slow down, shut up, and put on Willie Nelson.

Now imagine the day when self-driving Ubers and taxis arrive: The vehicle could simply download and run each passenger’s ride profile. And what about motion sickness? A 2015 study from the University of Michigan found that up to 12 percent of people suffer from it in cars. An Affectiva system could see a passenger getting sick in time for the car to pull over and let him out.

El Kaliouby said the company is in early talks with ride-hailing companies, whom she declined to name, about passenger monitoring, and she insisted that any such system would be switched on only with the rider’s permission.


Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him on Twitter @GlobeTechLab.