After dark, a fleet of SUVs takes to the streets in cities and towns across America. They move slowly, at less than 20 miles per hour, and look like normal vehicles except that mounted atop each is a large black box with wires running out the bottom: an imaging system with eight cameras and various other sensors that take thermal pictures. As you sleep and the cars crawl along, the cameras log all the places where energy is escaping from your home.
It’s the data collection wave of the future, rolling right past your front door.
“Each vehicle can cover 5,000 to 8,000 homes in an evening in a normal suburban area. We’re able to look at the thermal data and get a good idea of how much energy your home is leaking, where it’s leaking, and what you can do to address it,” says Jan Falkowski, chief technology officer of Essess, the Cambridge company that conducts the thermal mapping.
The techniques used by Essess are part of a bigger trend toward large-scale, passive "sensing" of environmental features. It's a trend that was inaugurated most prominently with the release of Google Street View in 2007 and has accelerated in recent years to include the detection of many different aspects of our surroundings, from building energy loss to potholes to the location of sidewalk benches.
One of the most recent innovations involves the sensing of street lights. In a paper published in June, MIT engineer Sanjay Sarma and his collaborators (including former graduate students Sumeet Kumar and Jason Ku) describe a method for mapping the locations of street lights and detecting outages when they occur. They propose affixing simple light sensors and video cameras to the tops of city vehicles (buses, trash trucks, snowplows, street sweepers). As the vehicles go about their normal business, the sensors monitor street light conditions, doing away with the need for tedious 311 calls and giving cities a far more comprehensive picture of the lights they actually have.
“Lighting is a big deal in cities, it’s a significant consumer of electricity. Many cities have poor inventories of their infrastructure and don’t know if enough light is being delivered on the ground,” says Sarma, a leader in the field who also developed the technology behind Essess.
Some sensing revolutions allow scientists to perceive aspects of reality that were previously inaccessible. The first-ever detection of gravitational waves last year is a good example of that kind of progress. Others scale commonplace observations (hey, that street light is out!) to create powerful new data flows and a more fine-grained view of the mundane activity all around us.
After the Essess vehicles complete their surveys, for instance, the company feeds that data to your local utility, which then sends you a letter in the mail noting where energy is escaping and explaining options for making your home more efficient: air sealing for windows, spray foam by your foundation, better wall insulation.
“We’re helping the utility identify which customers need help, and the utility helps the customer connect with the contractor that will do the work,” says Falkowski.
The possibilities for augmenting routine movement with sensing capabilities are endless. Engineers at Northeastern University are working on a project called VOTERS, for Versatile Onboard Traffic Embedded Roaming Sensors, that uses a radar-enabled van to detect potholes in the course of normal driving. At Columbia University, engineer Andrew Smyth is collaborating with the City of New York to use sensors already attached to city vehicles to monitor traffic flows. His colleague Xiaofan Jiang has a similar project underway in Beijing to measure air quality.
“This kind of mobile sensing is very exciting in the sense that you can have lots of data, the spatial resolution is very high, and after installation it doesn’t require much human intervention,” says Smyth. “It is really incredible.”
Often it’s hard to perceive the line between the present and some remote technological future. With the deployment of sensors, though, the line seems clear. The proliferation of autonomous vehicles (on land and in the air) will enhance both the need for broad-scale, continually updated data and the ability to collect it. At the same time, rapid advances in machine learning software are expanding the range of things that these sensors can observe: street lights today, faces in a crowd tomorrow.