It’s hard to find a post-election post-mortem that doesn’t float back to bubbles.
Not the precarious tech or housing kind. The cultural ones: Liberal bubbles (so deftly burst by “Saturday Night Live” recently). Conservative bubbles. Independent bubbles (though, technically, those might comprise more of a foam or a froth).
Most loomingly these past few weeks, we heard about Facebook bubbles. Or, what’s less finger-pointingly known as “filter bubbles” — self-styled personal ecosystems of information we burrow ourselves into on social media with the help of sophisticated algorithms (and, despite Mark Zuckerberg’s insistence, some fake news).
“The social bubbles that Facebook and Google have designed for us are shaping the reality of your America,” wrote Mostafa M. El-Bermawy in Wired shortly after the election (headline: “Your Filter Bubble Is Destroying Democracy”). “We only see and hear what we like.”
“In the days and weeks before the election, how many posts did you see from people who had radically different views than you?” asked Felicity Sargent at Refinery29. “How many did you read? I’m betting not so many.”
And though Eli Pariser, the author who quite literally wrote the book on filter bubbles with 2011’s “The Filter Bubble: What the Internet Is Hiding From You,” stopped short of blaming them for how the election turned out, he did concede in an interview with the Verge that, in 2016, “it’s much easier than it’s ever been to live in an information environment that is several standard deviations from normal. It changes the cultural conversation for everyone.”
Meanwhile, what he wrote in 2011 still holds true: filter bubbles “create the impression that our narrow self-interest is all that exists.” These are not ideal conditions for a productive cultural conversation.
For its part, Facebook dares not even utter the word “bubble” — the network instead concerns itself with “information diversity,” and insists it played no significant role in sealing users off from each other’s respective realities. Quite the opposite, actually. The findings of its 2015 self-study (published in Science) offered the research equivalent of “it’s not us, it’s you”: “Compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content.” This follows another self-study, from 2012, that attempted to pierce the bubble theory by demonstrating that “the vast majority of information comes from contacts that they interact with infrequently.”
It’s understandable that Facebook would be a touch defensive on the whole influencing-democracy/shaping-reality thing. And it’s worth mentioning, if only to highlight how tender the public trust issue is within the network, that Zuckerberg’s net worth has dropped by nearly $3.7 billion since the election, according to a figure from Forbes — though not demonstrably because of the election itself.
The stakes are high for Facebook to strike the right balance between engineering and editorial, even if Zuckerberg is himself touchy on the topic of truth: “I believe we must be extremely cautious about becoming arbiters of truth ourselves.”
(I should throw in — especially as one of those media types — that the problem of bubbles may be most pronounced on Facebook, but it’s certainly not exclusive. As Joshua Benton cautioned in a recent piece for NiemanLab, “Any journalist critical of the Facebook filter bubble they saw Trump voters caught in needs to look closely at the Twitter filter bubble where they spend a lot of their work day.”)
Zuckerberg’s “don’t look at me” posturing over this issue doesn’t make his arguments any more convincing — but he doesn’t really need the help. He has a point.
A solid decade’s worth of near-constant growth at Facebook suggests that its users have been perfectly comfortable soaking in the warm bath of the News Feed — that is, until this election pulled an ice bucket challenge in the tub.
Facebook AI research director Yann LeCun recently told reporters, “We believe this is more of a product question than a technology question.” Meaning, a far more balanced Facebook is not some impossible technical pipe dream. We just have to want it.
And do we?
Are these filter bubbles the by-product of algorithms, AI, and enterprisingly dishonest Macedonian teens, or just more quantifiable representations of the bubbles we create to engender confidence and preserve sanity offline? Did I hear more noise from the opposite end of the ideological spectrum before the Internet connected us to each other? Apart from Thanksgiving dinners, I don’t think I did. (And granted, I had headphones on for like five years of my adolescence.)
Some folks are trying their best to actively correct the distortion of the filter bubble through clever hacks — like the EscapeYourBubble Chrome extension, which inserts “curated, positive posts in your Facebook feed” from the opposing view of your choice. Others will acknowledge the quandary, pair it with a heartfelt sigh, and return to regularly scheduled clicking routines.
It’s highly unlikely that Facebook will take steps to burst its own bubble for the sake of balance or some notion of realism. And it’s even less likely that users will demand that it do. These bubbles, after all, are made in our own image. As much as they disappoint us, they are the worlds we created. The only way to truly escape the bubble might be to power down and retreat to the old-fashioned one we left behind — it could use us right now.