
Facebook tries to set the record straight on misinformation in the News Feed

Yesterday Facebook attempted to intervene in the ongoing discussion about its role in misinformation. In an article by Alex Schultz, the VP of Analytics and Chief Marketing Officer at the social media giant, the company tried to offer clarity and correct what it appears to view as some false assumptions about the News Feed. Titled “What Do People Actually See on Facebook in the US?”, it gives us an insight into how Facebook measures and categorises content, and how it might go about trying to solve the problem in the months and years to come.

The idea that Facebook is raising the visibility of right-wing misinformation and conspiracy content certainly isn’t new. However, the issue has become more visible than usual over the last week due to the fallout from the U.S. election. New York Times tech columnist Kevin Roose tweeted late on Monday night that “Facebook is absolutely teeming with right-wing misinformation,” providing screenshots from analytics company NewsWhip.

Roose’s tweet was instrumental in renewing interest in the issue – in response to Schultz’s post, he joked that “I confess I did not have ‘get personally subtweeted by the 6th largest company in the world’ on my bingo card today but America is a land of possibilities.”

https://twitter.com/kevinroose/status/1326020882112786432?s=20

What does Facebook claim is actually happening in the News Feed?

The first point Schultz makes is that political content only accounts for a small percentage of what people see in their feed – 6%. The implication is that it’s wrong to view politically volatile misinformation campaigns as the predominant reality of our news feeds. This might well be true, of course (although it’s not clear how ‘political’ content is defined), but just because the amount of political content we see on Facebook is relatively small doesn’t mean its impact is correspondingly small.

Reach and engagement

However, the bigger issue that Schultz attempts to grapple with is the relationship between ‘reach’ (i.e. how many people actually see a post in their news feed) and ‘engagement’ (clicks, shares, likes, comments, and reactions to a post). “Much of the public discussion around what performs best on Facebook focuses on posts from Pages with the most engagement and which also contain links to other content elsewhere,” Schultz writes. This, he suggests, is a little misleading, and he offers data indicating that when looked at through the prism of ‘reach’, the pages that are most successful are not necessarily the hyperpartisan ones we associate with misinformation.

Schultz also later points out that “engagement does not predict reach,” as if to caution critics against drawing a line from one to the other. However, as Roose points out on Twitter, the data on post reach isn’t publicly available. In essence, journalists and researchers are forced to look at Facebook posts through the lens of engagement precisely because that’s all that Facebook offers.

https://twitter.com/kevinroose/status/1326300391256981504?s=20

Facebook’s next steps in fighting misinformation

Schultz notes that there have been “reasonable requests for us to share more data so it can be studied more fully.” Interestingly, the way Facebook will be doing this is not by sharing data. Instead, it will undertake further research through the platform’s FORT (Facebook Open Research and Transparency) project, in partnership with “researchers from a number of preeminent universities.”

Although it’s hard to disagree with Schultz that, thanks to the Cambridge Analytica scandal, “it is clear how careful [Facebook needs] to be about partnering with researchers and giving them access to data,” one wonders whether more clandestine research is the best way forward. There has surely got to be a better – and easier – way to improve transparency.