Until recently, social media was unaccommodating terrain for blind and visually impaired users. Screen-reading apps could only deliver a fraction of what we would call the Facebook experience, and the increasingly visual nature of social sharing across platforms was leaving text-hoovering interpreters with nothing to say.
But over the past year, the rise of automatic alternative text, along with increasingly sophisticated facial and image recognition technologies across platforms, has demonstrated the potential for artificial intelligence (AI) to help clear a path for the visually impaired to function more independently, not just on Facebook but in the real world as well.
Take Seeing AI as one of the most impressive recent examples. The newly released Microsoft app (currently only available for iOS) functions as a suite of AI-assisted services for the visually impaired. It can read physical documents using a smartphone camera (and will even guide users to get the document within frame before scanning); it can identify products in stores by detecting and scanning barcodes (and read additional product info); it can recognize friends through facial recognition and describe their appearance and expression; and it can apply the same recognition capabilities to images and text imported into the app.
Across the board, the app’s aim of “turning the visual world into an audible experience” works remarkably well. It eagerly read aloud any snatches of short text that fell within its frame. It gobbled up the barcode on my bag of chips and told me the flavor (barbecue) and how many chips I should expect (30). It guided me into position for a selfie, and once the photo was taken, it almost correctly identified me as a “40-year-old man with brown hair and a mustache looking happy.” (I’m 41, but as errors go, this one’s fine.)
And Seeing AI’s experimental “scene” feature showed great potential in describing the contents of a room: it could tell the difference between a TV and a computer monitor, but it also thought it saw my husband lying on the coffee table. (Which he’d sworn not to do anymore.)
Even with all this automated assistance, there remains a place for human intelligence in this sweeping upgrade. One app called “Be My Eyes” uses a combination of crowdsourcing, social media, and live video to connect visually impaired users with a remote volunteer staff of over 500,000 sighted assistants, ready and waiting to read ingredients and expiration dates, signs on subway platforms, or simply describe what they see in front of them.
And for a far more structured service (from $89 to $329/month), Aira functions as a full-time informational concierge for the visually impaired. A staff of certified “agents” works directly with users through a head-mounted camera and earpiece (think Google Glass, but sans glass and actually useful) to offer step-by-step, real-time guidance. Using streaming video, Google Maps coordinates, and even social media integration, agents can help users navigate everything from busy sidewalks to crowded parties to, well, marathons.
Specialized devices to assist the visually impaired are also arriving atop the wearables wave. The Sunu wristband, for example, uses ultrasonic proximity sensors to read the contours of rooms, transmitting spatial information (open doorways, other people) through haptic signals on the wrist. This can help prevent accidents, but more importantly, can enable blind and visually impaired people to move with increased confidence.
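The core idea behind that kind of haptic feedback is easy to sketch. Here is a minimal illustration in Python (with made-up range and timing parameters; this is not Sunu’s actual firmware logic) of mapping an ultrasonic distance reading to a vibration pulse interval, so that closer obstacles produce faster pulses:

```python
def pulse_interval_ms(distance_m, max_range_m=4.0,
                      min_interval_ms=60, max_interval_ms=600):
    """Map a distance reading to a haptic pulse interval.

    Closer obstacles -> shorter interval -> faster pulses.
    Returns None when nothing is within sensor range.
    """
    if distance_m >= max_range_m:
        return None  # nothing in range: stay silent
    fraction = max(distance_m, 0.0) / max_range_m  # 0.0 (touching) to 1.0 (far)
    return round(min_interval_ms + fraction * (max_interval_ms - min_interval_ms))

pulse_interval_ms(0.5)  # near obstacle: rapid pulses
pulse_interval_ms(3.5)  # distant obstacle: slow pulses
```

A real device would layer filtering and calibration on top of this, but the closer-means-faster mapping is the essence of this style of proximity feedback.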
On this front, wireless beacons are increasingly being installed and employed as ways to relay location-specific information to GPS apps designed for visually impaired users. The Canadian National Institute for the Blind, for instance, recently announced plans to install over 200 beacons that can identify everything from entrances to restrooms to bus stops. And earlier this month, the MBTA announced plans to test beacons installed on bus stop signs along the 70 and 71 bus lines, which serve the Perkins School for the Blind. The beacons will deliver precise bus stop information via the Perkins School’s own custom-built BlindWays app.
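The pattern behind beacon-based wayfinding can be sketched in a few lines. This illustration uses hypothetical beacon labels and a deliberately naive strongest-signal heuristic to guess which beacon a user is standing near; real apps like BlindWays combine signal strength with GPS and mapping data:

```python
def nearest_beacon(sightings):
    """Pick the beacon with the strongest signal.

    RSSI is reported in negative dBm; values closer to
    zero generally mean the beacon is closer.
    """
    if not sightings:
        return None
    return max(sightings, key=lambda s: s["rssi"])["label"]

# Hypothetical sightings from a phone's Bluetooth scan:
stops = [
    {"label": "Route 71 inbound stop", "rssi": -82},
    {"label": "Station entrance",      "rssi": -60},
]
nearest_beacon(stops)  # -> "Station entrance"
```

Because each beacon broadcasts only an identifier, the app itself maps that identifier to a human-meaningful announcement, which is what makes precise, stop-level information possible where GPS alone is too coarse.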
Of course, these may be just the first cracks in a grander shift in the terms of accessibility: how, for instance, will autonomous vehicles reshape mobility challenges, and how will augmented reality grow to customize collective experiences to meet individual needs? In the meantime, this new wave of technology is allowing the visually impaired to connect with the world in ways we’ve never seen, and allowing perfect strangers to lend helping hands from half a world away. If that’s a vision of the future, I’ll take it.