The biggest social media news story of the week has been ‘The Facebook Files’, a selection of internal documents revealing various investigations into the societal impacts of The Social Network, as reported by The Wall Street Journal.

The full Facebook Files series is available here, and is worth reading for anyone interested in the impacts of social media more broadly, but in summary, the key discoveries of the reports are:

  • Facebook has a system in place which subjects high-profile users to a different review process than regular users
  • Facebook-commissioned studies have repeatedly found that Instagram can have harmful mental health impacts on users
  • Facebook’s ‘Friends and Family’ algorithm update in 2018, designed to reduce angst on the platform, actually increased division
  • Facebook is not doing enough to address potential harms it’s causing in developing nations
  • Anti-vaccine activists have used Facebook to sow doubt and spread fear about the COVID-19 vaccine deployment

None of this is entirely new – anyone who has done any research into Facebook and its algorithms is well aware of the harms that it can, and has, caused over time, and Facebook itself has stated that it is addressing all of these issues and evolving its tools in line with internal findings.

That being said, the Facebook Files are fascinating because they provide insight into how much Facebook knows about these issues, and what its own data has shown about them. Which raises the question: if Facebook knows this much, why isn’t it doing more to address these problems?

Is it hesitant because it’s concerned about the effect on its business? The WSJ’s research suggests that Facebook is aware that it’s causing broad social damage and magnifying negative elements, but it has been reluctant to take action, since doing so might affect its users’ engagement with the platform.

For example, according to the leaked documents, Facebook implemented its ‘Friends and Family’ News Feed algorithm update in 2018 in order to amplify engagement between users, and reduce political discussion, which had become an increasingly divisive element in the app. Facebook did this by allocating points to different types of engagement with posts.

[Image: Facebook post scoring in algorithm update]

As this summary shows, comments contributed far more value than Likes. A Like was worth one point, while other response types (including re-shares) were worth five points, and comments were worth thirty points if they were considered “significant” (non-significant comments were worth 15 points). Because Facebook used this score to assess relevance between connections, the higher the total value of a post, the greater the likelihood that it would get additional reach.
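To make the mechanics concrete, here’s a minimal sketch of that scoring logic in Python, based only on the point values described above. The names and structure are hypothetical illustrations, not Facebook’s actual implementation:

```python
# Hypothetical sketch of the engagement scoring described above.
# The point values come from the WSJ's reporting; everything else
# (names, structure) is illustrative, not Facebook's actual code.

LIKE_POINTS = 1
REACTION_POINTS = 5              # Reactions such as 'Angry' or 'Love'
RESHARE_POINTS = 5
SIGNIFICANT_COMMENT_POINTS = 30  # Comments deemed "significant"
COMMENT_POINTS = 15              # All other comments

def engagement_score(likes, reactions, reshares,
                     significant_comments, other_comments):
    """Return the total engagement score for a single post."""
    return (likes * LIKE_POINTS
            + reactions * REACTION_POINTS
            + reshares * RESHARE_POINTS
            + significant_comments * SIGNIFICANT_COMMENT_POINTS
            + other_comments * COMMENT_POINTS)

# A post with a handful of heated comment threads easily outscores
# one that attracts far more passive Likes:
calm_post = engagement_score(likes=100, reactions=0, reshares=0,
                             significant_comments=0, other_comments=0)
heated_post = engagement_score(likes=10, reactions=5, reshares=2,
                               significant_comments=3, other_comments=4)
print(calm_post, heated_post)  # 100 195
```

Under a weighting like this, content that provokes comments, however divisive, is systematically favored over content that people merely Like.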

While the intention was to encourage more discussion, as you can imagine, the update instead encouraged publishers and media outlets to share increasingly divisive, emotionally-charged posts in order to elicit more comments and Reactions, and thus earn higher scores, and more reach, for their content. Because Facebook’s change made comments and Reactions (such as ‘Angry’) far more valuable, it ended up generating more political debate in users’ feeds and exposing more people to such material.

This brings to light another of Facebook’s most serious flaws: it increases exposure to political viewpoints that you may not have previously been aware of. For example, you may not know that your former coworker is a flat-earth conspiracy believer until Facebook informs you of this. That exposure, in turn, pushes individuals further for or against each topic, ultimately prompting more people to choose sides.

Facebook’s internal analysis showed that the company was aware of what was happening, and that the update was creating greater division and disagreement as a result. Did that, however, prompt it to reverse the decision?

According to the Wall Street Journal, when internal calls were made to change course with the algorithm yet again, Facebook CEO Mark Zuckerberg resisted, arguing that the update had resulted in more comments, which addressed a more significant, longer-term decline in in-app engagement.

[Chart: Facebook engagement decline]

Given that Facebook is used by some 2.9 billion people, and has arguably the largest influence of any platform in history, insights like this are a major concern, as they suggest that Facebook has actively made business-based decisions on issues relating to societal harm. Which, again, is no major surprise – Facebook is, after all, a money-making business. But the influence and power the platform has to guide real-world trends is too significant to ignore such impacts – and that’s only one of the examples highlighted in WSJ’s reporting.

Other revelations relate to Instagram’s impact on young users:

“32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse […] Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.”

Instagram is providing more protection and support over time, but again, the real-world impact here is significant.

Then there’s the way the platform influences people’s responses to key news events, like, say, the COVID-19 vaccine rollout.

“41% of comments on English-language vaccine-related posts risked discouraging vaccinations. Users were seeing comments on vaccine-related posts 775 million times a day, and Facebook researchers worried that the large proportion of negative comments could influence perceptions of the vaccine’s safety.”

Unlike most other businesses, Facebook’s decisions can significantly shift public perception, and lead to real-world harms, on a massive scale.

Again, we know this, but now we also know that Facebook does too.

The concern, moving forward, is how Facebook will address these issues, and whether it will keep operating as it has thus far: working to keep such findings from the public, and even leaving harmful changes in place to further its business interests.

We don’t have any real oversight of how Facebook operates, as it’s not a public utility. But in many ways, it functions like one. Some 70% of Americans now rely on the platform for news content, and as these insights show, it has become a key source of influence in many respects.

But at the same time, Facebook is a business. Its intention is to make money, and that will always play a key role in its thinking.

Is that a sustainable path forward for such a massive platform, especially as it continues to expand into new, developing regions, and more immersive technologies?

The Facebook Files series raises some key questions, for which we don’t have any real answers as yet.

You can read The Wall Street Journal’s full ‘Facebook Files’ series here.