
    Facebook, Google try to get their stories straight

Facebook will begin surveying users for the news sources they trust, and tailor its news feed based on those results. Associated Press/File

    Two Internet titans are finding out that journalism is harder than it looks.

    The social network Facebook relies mostly on computer algorithms to choose the stories that appear in its popular news feed. But on Friday, Facebook said it will ask users to pick their most trustworthy news sources.

    Meanwhile, Google has mothballed a service that checked the accuracy of facts published on Internet news sites, after two politically conservative sites, The Daily Caller and The Federalist, said that Google’s fact checks were biased, and sometimes wrong, to boot.


    “On further examination it’s clear that we are unable to deliver the quality we’d like for users,” Google said Friday. “As we continue to work on addressing this problem and assess how best to serve our users, we are putting the experiment on hold.”


    Earlier this month, Facebook said it would serve up less “public content” such as news articles, video clips from TV networks, and advertisements, and instead prioritize messages from friends and family members. That led to concerns Facebook would marginalize mainstream news operations, and reduce their advertising revenues.

    The latest change still reduces the news Facebook readers see, but seeks to promote the most trustworthy news sources, Facebook Chief Executive Mark Zuckerberg said in a post Friday.

    “The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with,” Zuckerberg wrote. “We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking.”

    So Facebook will ask users which sources they trust, and those results will help determine the news seen by Facebook users. Users should get more stories from the most respected sources, and fewer from less trusted outlets.


Facebook’s ambitious effort to reshape the news feeds that billions of users read is in part a reaction to the intense and bitter debates of the 2016 election and the prevalence of “fake news” articles and postings that attempted to influence the presidential race.

    Zuckerberg acknowledged that social media sites such as his can contribute to “sensationalism, misinformation and polarization . . . and if we don’t specifically tackle these problems, then we end up amplifying them.”

    William McKeen, a journalism professor at Boston University, said he wasn’t sure Facebook users could be relied upon to choose the most trustworthy news sources. “Reminds me of what [former CBS News reporter] Morley Safer said years ago, about how he would trust a citizen journalist as much as he would trust the citizen surgeon,” McKeen said. “I’m not sure the truth matters to a lot of Facebook users.”

    Facebook formerly relied on its own staff of human editors to curate the news, but the practice came under fire in 2016, after a whistle-blower alleged that editors sought to suppress stories with politically conservative content.

    Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him on Twitter @GlobeTechLab.