In the aftermath of the U.S. Presidential election, Mark Zuckerberg took the stage at Techonomy16 to address concerns that Facebook didn’t do enough to stop the proliferation of fake news on News Feed.
Zuckerberg insisted that the company can always do more to improve the quality of the News Feed experience, but that Facebook could not have influenced the outcome of the election.
“Personally, I believe the idea that fake news on Facebook could have influenced the election, of which there is a very small amount, is a crazy idea,” Zuckerberg said.
He argued, in essence, that the media failed to learn its lesson and continues to brush off Trump supporters as too uninformed to make their own decisions. News Feed was just as likely to surface fake news about Clinton, yet the media remains steadfast in pointing at Trump supporters, implicitly assuming that Facebook was the only resource they had for making their decisions.
“People are smart and they understand what’s important to them,” noted Zuckerberg.
Instead, he asserted that the problem isn’t the accessibility of facts, but rather content engagement. He noted that Trump’s posts got more engagement than Clinton’s on Facebook. Facebook research shows that nearly everyone on the platform is connected with at least someone who holds opposing ideological beliefs. The real question is how to influence the way people react when they see a post they disagree with, and how to stop them from brushing it under the rug.
To get there, Facebook is making efforts to involve humans more deeply in the creation of the ranking algorithms the company uses for content. News Feed now has a human quality panel that is used to refine rankings. Panelists are given stories and asked to rank them, giving Facebook a better idea of what makes a particular story fulfilling for the end user.
Zuckerberg had previously only addressed the election in a Facebook post featuring a photograph of his daughter Max. He noted at that time that, “We are all blessed to have the ability to make the world better, and we have the responsibility to do it,” but didn’t elaborate on what that meant specifically for him and his company.
Adam Mosseri, VP of Product Management for News Feed, echoed much of what Zuckerberg said earlier today in a statement to TechCrunch, though his brief comments were notably less skeptical of the importance of removing propaganda.
“We understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation,” Mosseri noted.
Despite all of the global concern about Trump’s win, Zuckerberg did take a moment to make it clear that he doesn’t believe any single person can fundamentally alter the arc of technological innovation.
Featured Image: Paul Sakuma Photography