A Boston company called Neurala is teaching body cameras worn by police officers to detect suspicious people or locate missing children faster than the human eye.
The company uses artificial intelligence to distinguish different objects: a person from a pet, for example, or a bench from a street light. That capability can be programmed into a camera to help pick out a person in a crowd based on an identifying object — a brightly colored hat, for example — and it may soon be able to distinguish facial features as well.
Neurala’s technology has broader applications, to help guide self-driving cars, military robots, and drones. That potential is why alarm bells went off in Washington when a Chinese investment firm, Haiyin Capital, took a small stake in Neurala. It also raises questions about whether nosy governments or corporations could use the company’s software to keep tabs on anybody and everybody.
The body camera project is being done in conjunction with Motorola Solutions, a major maker of police and military electronics. Essentially, Neurala feeds millions of images into powerful computers, which learn from the data in much the same way as a human brain does. Gradually, the computers create digital models of objects and people, shapes, and colors.
These models, capable of identifying hundreds of objects and body types, are transmitted to the body cams. So if police were looking for, say, a lost boy wearing blue jeans and a red shirt, an officer could program his body cam to scan a crowd and pick out potential matches.
If new information comes into the system — additional photos of the child, for instance — the software could update its model to improve its accuracy.
For now the Neurala body cam software is designed to recognize clothing, objects, and body types, but chief operating officer Heather Ames said it could eventually be upgraded to support facial recognition, allowing it to identify specific people.
The Neurala-Motorola police cam project is but the latest new technology to raise privacy concerns, especially as facial recognition systems are being tested by law enforcement agencies for surveillance and identity checks.
“The privacy issues are huge,” said David Schubmehl, research director of artificial intelligence systems at IDC Corp. in Framingham. Schubmehl said technology similar to Neurala’s body cam system could be embedded into millions of cameras in public places. Governments could use the system for surveillance, while businesses might use it to track customers’ activities for marketing purposes.
“At least the police are regulated by government,” Schubmehl said. “Private industry is not regulated today.”
But Neurala president Warren Katz said that we already live in a world of ever-present video cameras. His company’s products will simply make video surveillance more efficient and effective.
“The privacy question has already been answered,” Katz said. “What we’re talking about here is adding some automation on top of that.”
The company hopes to begin real-world tests of the Motorola system sometime next year.
The company was founded in 2006 by Massimiliano Versace, now its chief executive, Ames, and Anatoly Gorshechnikov after the trio earned doctorates in cognitive and neural systems at Boston University. In the company’s first project, it worked with NASA on building artificial intelligence into robots for exploring the surface of Mars.
It takes up to 24 minutes for a radio signal to travel between Earth and Mars — not nearly fast enough to send new orders to a robot in an emergency. So Neurala worked on “offline AI,” a system that can think for itself, without constant connections to remote data centers.
Offline AI has an array of earthly applications too, because it can run on cheap, low-powered chips like the kind in smartphones or consumer-grade drones.
For instance, Neurala created a “selfie” app for the French drone maker Parrot, enabling a camera-equipped drone to recognize a person’s face and literally zoom in for personalized photos and videos.
“We are replicating the way the human cerebral cortex works, and in particular the ability to learn instantaneously on the fly, and we are putting this in software,” said Versace. “The applications are endless.”
In June 2016, Haiyin Capital, which has invested in several other Massachusetts technology companies and US robotics firms, participated in a $1.2 million funding round in Neurala.
At the time, Neurala said the deal would give it access to Haiyin’s network of equipment makers and technology businesses in China.
But the Haiyin investment drew scrutiny from the Washington research firm Defense Group Inc. In October, Defense Group warned in a report to Congress that the deal with Haiyin risks allowing Neurala’s AI know-how to fall into the hands of the Chinese military.
But Versace said the investment from Haiyin Capital includes language that bars the Chinese from obtaining seats on the company board, or access to Neurala’s intellectual property.
Meantime, Neurala said it is looking beyond the law enforcement market for less politically sensitive uses of its AI system. The company is working with the Charles A. and Anne Morrow Lindbergh Foundation, which uses drone aircraft to identify wildlife poachers in several African countries.
The goal is to use Neurala technology to report the movements of animals, vehicles, and people in real time. The company is also developing navigation software for self-driving cars, and is working with two toy manufacturers to build playthings that learn from the children who own them. Since no two children are alike, each toy will be different. And unlike other smart toys, Neurala said these will not pose privacy risks, because they won’t be capable of connecting to the Internet.