Perspective | Magazine

Silicon Valley wants to improve your ‘digital well-being’ — and collect more of your personal data along the way

It’s presented as a solution to Internet addiction. But it can also mean more invasive snooping on our behavior, habits, and emotions.


Several weeks ago, Google previewed a number of new features in its Android P operating system. The changes included shortcuts for switching a phone to Do Not Disturb mode, a timer to limit the use of apps, and a dashboard detailing how much time you’ve spent on Snapchat, Instagram, or Facebook Messenger. Google pitched its new interface elements as part of a broader goal: fostering digital well-being.

The notion of digital well-being — positive, healthy, personalized connection through digital devices — seems to be everywhere in Silicon Valley these days, though it isn’t well defined. It’s presented as a solution to Internet addiction, or the ways in which our digital devices seem to compel our attention through their design. Google describes it broadly as a form of life improvement, helping its users “focus on what matters most, disconnect when needed, and create healthy habits for the whole family.” Likewise, Facebook founder Mark Zuckerberg recently told Congress, “It’s not enough to just connect people,” and that Facebook would “have to make sure those connections are positive.” Instagram, which is owned by Facebook, announced in March that it is setting up a digital well-being team. Google plans to roll out its digital well-being features more widely for Android this fall and is touting similar models for its other products, including YouTube and Google Photos.

We should be clear, though, about why Silicon Valley seeks to compel our attention in the first place: to accumulate as much data about our lives as possible. By focusing public conversation on the interfaces of our devices and platforms, tech behemoths are performing a bait and switch. The greater threat to our well-being comes from the back-end collection and analysis of our personal data. And a focus on digital well-being could, in fact, make that data collection even more widespread and invasive. Like a parasite, Silicon Valley latches onto our human desire for connection and uses it to collect our data, monetize our attention, and extend technologies of surveillance across the texture of life.

We’re spending more time interacting with our digital devices than ever before. A recent survey by the Pew Research Center found that a quarter of American adults — and 39 percent of 18- to 29-year-olds — are “online almost constantly,” and that 77 percent of Americans use the Internet on a daily basis.

Digital technologies have wide-reaching effects on our behavior, habits, and everyday routines. The World Health Organization recently classified digital gaming addiction as a disorder.

While it’s common to hear people talk about how they’re addicted to their phones, most of us are compelled by social media because we’re looking for fundamental connections — social, intellectual, and emotional — with other human beings. It’s that desire for social connection that Silicon Valley has exploited through design. One group promoting digital well-being is the Center for Humane Technology, founded by former Google employee Tristan Harris. The center extols the virtues of what it calls humane design, resulting in digital media “that protects our minds and replenishes society.” Yet the small-bore design changes advocated by the center are not a meaningful antidote to this wider problem baked into Silicon Valley’s business model. If anything, framing these issues around individual wellness pushes the ethical responsibilities that lie with tech companies onto consumers themselves — and advances those companies’ data-collection business models in the process.

Like Google’s changes to Android, Facebook’s increased emphasis on positive connection and personalization is meant to make users feel good about using their devices. But it is also used to justify collecting even more of the personal data — especially data about behavior, habits, and emotions — that these companies want anyway. The more personalized the experience, the more data Facebook and other tech companies will need to acquire about their users.

What can people who feel like they’re glued to their phones — and let’s face it, that’s most of us — do to tackle these big societal questions around digital technology and its role in our everyday lives? One important step is to hold politicians accountable for their stances on digital media and its regulation. Digital well-being is the tip of an iceberg: Our civil, economic, and democratic rights are all increasingly shaped by digital systems, and we have to stop treating them as separate from the real world.

While you’re getting political, it’s not a terrible idea to use some of the tools being provided by companies like Facebook and Apple (or applications like Freedom, which allows you to block your digital connectivity for short periods). But even more importantly, use tools that protect the well-being of your personal data, like encrypted chat platforms and password managers. And remember that no app tweak or design fix can substitute for robust interactions with other people. Holding on to that clarity will do more for our well-being, digital and otherwise, than any Google dashboard.

Luke Stark is a postdoctoral fellow in the Department of Sociology at Dartmouth College who studies the intersections of digital media and behavioral science. Send comments to magazine@globe.com.