
Facebook could make its algorithms truly work for you

The social network guesses at what people want to see, often with disastrous results. What if we could tune those dials ourselves?


After the reports last fall stemming from whistleblower Frances Haugen’s damning revelations about the social harms of Facebook, there was a movement to delete the social network and push the platform into nonexistence. But moral outrage has swung in the opposite direction now that Russia has blocked Facebook in a blatant effort to stifle political dissent and control the domestic narrative over Russia’s invasion of Ukraine.

If we can’t live with Facebook and we can’t live without it, we have to accept this double-edged sword. We need ways of sharpening its salubrious aspects while dulling its deleterious ones.

An onslaught of congressional hearings and draft bills in recent months has raised the political volume on reining in Big Tech, but there is little consensus on how to proceed. At a January gathering organized around the anniversary of the Capitol insurrection, Senator Ron Wyden and others called for digital privacy legislation to prevent further destabilizing events. Such laws would deprive Facebook and other platforms of the granular personal data their algorithms use to guess what interests us and then manipulate us accordingly.

Separately, draft legislation would require any large platform using an “opaque” algorithm — one whose inputs are not obvious, because the algorithm relies on user-specific data that the user never expressly provided — to offer a more transparent version relying on information users provide. Meanwhile, Facebook’s leaked research says users should be given more options to personalize their algorithmically curated experience.

How do we decide which of those options is best?

We don’t have to. These approaches could be combined into a single system that fulfills an earlier promise from Facebook’s president of global affairs, Nick Clegg, to give “people greater understanding of, and control over, how its algorithms rank content.”

A Facebook blog post from January 2021 outlines how its News Feed algorithm works. Machine learning is used to predict the probability that each user will share, comment on, like, love, click “haha” for, etc., any post the user could be shown. These probabilities are then combined in a weighted sum, which helps determine a single number called the post’s “value” to the user.

The weights in this sum play a central role. Facebook originally gave all emoji reactions five times the weight of a like, but it eventually lowered the weight on the love and sad emojis to twice that of like. The value of an angry emoji was dropped to zero — to reduce the amount of toxic material users encounter. Facebook estimates how close you are to each of your friends and incorporates this so posts by closer friends have higher value. The subject of the post is also weighted — which is how Facebook is able to reduce the distribution of political content when it wants to.
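
To make this concrete, here is a minimal Python sketch of that weighted sum. The love, sad, and angry weights are the figures reported above; the remaining weights, the closeness and topic multipliers, and the example numbers are illustrative assumptions, not Facebook’s actual values.

```python
# Minimal sketch of the weighted-sum "value" score described above.
# The love/sad/angry weights are the reported figures; the rest of the
# weights and the multipliers are illustrative guesses.

ENGAGEMENT_WEIGHTS = {
    "like": 1.0,
    "love": 2.0,      # lowered from five times a like to twice
    "sad": 2.0,       # lowered from five times a like to twice
    "angry": 0.0,     # zeroed out to curb toxic content
    "comment": 15.0,  # illustrative
    "share": 30.0,    # illustrative
}

def post_value(predicted_probs, closeness=1.0, topic_multiplier=1.0):
    """Aggregate the machine-learned engagement probabilities for one
    (user, post) pair into a single ranking score."""
    base = sum(ENGAGEMENT_WEIGHTS[action] * prob
               for action, prob in predicted_probs.items())
    # Posts from closer friends, and on favored topics, score higher.
    return base * closeness * topic_multiplier

# A post the model thinks you'll probably like and might share:
score = post_value({"like": 0.60, "love": 0.10, "angry": 0.02,
                    "share": 0.05},
                   closeness=1.5, topic_multiplier=0.8)
```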

That’s the story of the algorithm’s output, but what is its input? How does Facebook make these determinations about you? We know it has a lot of your personal data — your demographics, group memberships, history of engagement and keyword searches, etc. The algorithm uses whatever information it can access to predict your engagement probabilities.

But what if we had more say in what information fueled that process? Facebook should group the algorithm’s inputs into a reasonable number of easy-to-understand collections and provide users with on/off toggle switches for each. Turn the gender switch off to experience a gender-blind newsfeed. Turn “history of engagements” off if you don’t want posts based on your past actions on the platform. Turn “personal data” off if you want rankings based only on the content of posts and how other users have interacted with them, rather than anything specific to you.
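
Here is a rough sketch of what such toggles could look like under the hood, assuming the algorithm’s inputs can be grouped into named collections. The group names and the sample data are invented for illustration; nothing here reflects Facebook’s real internals.

```python
# Hypothetical sketch: user-facing toggles that gate which groups of
# personal data the ranking model may use. All names are invented.

FEATURE_GROUPS = ["gender", "other_demographics",
                  "engagement_history", "keyword_searches"]

# Example of the data the platform might hold on a user (invented).
user_features = {
    "gender": "F",
    "other_demographics": {"age_bracket": "25-34"},
    "engagement_history": ["liked post_123", "commented on post_456"],
    "keyword_searches": ["local news", "hiking trails"],
}

def filter_inputs(all_features, toggles):
    """Drop every feature group whose toggle is switched off."""
    return {group: data for group, data in all_features.items()
            if toggles.get(group, False)}

# A gender-blind feed: share everything except gender.
toggles = {group: True for group in FEATURE_GROUPS}
toggles["gender"] = False
model_inputs = filter_inputs(user_features, toggles)

# Maximum privacy: rankings based only on post content and how other
# users interacted with it, nothing specific to you.
toggles = {group: False for group in FEATURE_GROUPS}
model_inputs = filter_inputs(user_features, toggles)  # -> {}
```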

Different users prefer different levels of data privacy, and these toggle switches would let users choose what kinds of personal data they are comfortable sharing with the News Feed algorithm.

Facebook should also reveal the different forms of engagement its algorithms predict and allow users to adjust the weights. Enjoy getting into political arguments with strangers online? Then crank up the dials for angry and “haha” reactions, long comments, and political content while lowering the dial for your closeness to the poster. Dislike engaging in such arguments? Then adjust the dials in the opposite manner.
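
One way those dials might work is as direct overrides of the default weights and multipliers. In this sketch the dial names echo the examples above, but every name and number is invented; overrides are used rather than multiplicative scaling so that a user can raise a weight whose default is zero, such as the angry reaction.

```python
# Hypothetical sketch: user dials that override the default weights
# and multipliers before the value score is computed. All numbers
# are invented.

DEFAULTS = {
    "like": 1.0, "love": 2.0, "haha": 2.0, "angry": 0.0,
    "long_comment": 10.0,
    "political_content": 1.0,  # topic multiplier
    "closeness": 1.0,          # friend-closeness multiplier
}

def apply_dials(defaults, dials):
    """Replace any default with the value the user dialed in."""
    return {**defaults, **dials}

# Enjoys political arguments with strangers:
argumentative = apply_dials(DEFAULTS, {
    "angry": 5.0, "haha": 4.0, "long_comment": 20.0,
    "political_content": 2.0, "closeness": 0.5,
})

# Prefers to avoid them:
calm = apply_dials(DEFAULTS, {
    "angry": 0.0, "political_content": 0.3, "closeness": 2.0,
})
```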

These dials would let each of us specify what types of content and interactions we value, rather than leaving it up to Facebook’s engineers and executives to decide this for us.

Selecting a persona

A fundamental problem documented heavily in the Facebook whistleblower leaks is the divergence between what many of us want to see on social media and what the News Feed algorithm thinks we want based on our actions online. In short, our clicks and comments are often the result of impulse, yet they are used to determine what content we see, and the result is a proliferation of extremism, hate, and misinformation. The proposed dials wouldn’t eliminate this problem, but they would help reduce it by letting users tell the algorithm what to prioritize.

A lot of users might find it overwhelming to have so many switches and dials to fine-tune. So to make matters easier, Facebook could provide a handful of preset configurations: news junkie, keeping it casual, maximum privacy, etc. We already have preset configurations for our TVs and home audio systems — movie mode, sports mode, concert mode, etc. — so why not provide options like these for our social media systems too? Facebook could even let you import configurations from third-party organizations. Perhaps, for example, a digital rights organization such as the Electronic Frontier Foundation could publish a recommended list of switch and dial settings and users could simply click a button to import them into their News Feed settings.
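
One way to picture such presets and imports, with entirely hypothetical settings and no real published configuration behind them:

```python
import json

# Hypothetical sketch: preset bundles of the toggles and dials above,
# plus a one-click import of settings published by a third party.
# Every name and number here is invented.

PRESETS = {
    "news_junkie": {
        "toggles": {"engagement_history": True},
        "dials": {"political_content": 2.0},
    },
    "keeping_it_casual": {
        "toggles": {"engagement_history": True},
        "dials": {"political_content": 0.3, "closeness": 2.0},
    },
    "maximum_privacy": {
        "toggles": {g: False for g in
                    ("gender", "other_demographics",
                     "engagement_history", "keyword_searches")},
        "dials": {},
    },
}

def import_settings(published_json):
    """Load a settings bundle that a third party (say, a digital
    rights group) published as JSON, so a user can adopt it in one
    click."""
    bundle = json.loads(published_json)
    return {"toggles": bundle.get("toggles", {}),
            "dials": bundle.get("dials", {})}
```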

Facebook has resisted providing transparency into the algorithm in part to make it harder to game the system: Knowledge of the algorithm would help people manufacture content that rises to the top. But if each user’s News Feed uses different inputs and weights, then it would be harder to game the rankings because what rises to the top for some users would fall to the bottom for others. And this level of transparency would not reveal proprietary information that Facebook deserves to keep secret: We would know what the algorithm predicts but not how it does so.

Perhaps Congress could require all large Internet platforms that rely on data-driven algorithms for ranking content to provide toggle switches controlling what inputs are used. Machine learning is often described as a black-box process. It’s time to let each person choose what type of data — and especially how much of one’s personal data — goes into the box.

Noah Giansiracusa is assistant professor of math and data science at Bentley University and author of “How Algorithms Create and Prevent Fake News.”