Facebook is very keen to dispel the notion that it helps amplify divisive content and misinformation, and over the past few months it’s been working on a new way to prove exactly that, resulting in the launch of its latest quarterly report, which it’s calling its ‘Widely Viewed Content’ update.

Facebook Most Viewed Content report

As explained by Facebook:

“Over time, the Widely Viewed Content Report will provide more detail about the most popular content people see on Facebook. It begins with the top 20 most-viewed domains, links, Pages and posts in News Feed over the previous quarter, and excludes ads, while including content recommended by Facebook within News Feed units such as ‘Suggested For You’.”

Based on the data shown above, Facebook is seeking to emphasize that, despite media reports to the contrary, political content does not dominate users’ feeds.

“The vast majority of content viewed in News Feed during the second quarter of 2021 (87.1%) did not include a link to a source outside of Facebook itself. Only around 12.9% of News Feed content views in the US during the second quarter of 2021 were on posts that contained links.”

This makes sense, as Facebook has always sought to prioritize posts from friends and family, and it has made specific algorithm changes over time to weight such content even higher. But it’s worth noting the language used here: Facebook is claiming that the most ‘viewed’ content on its platform is clearly not tied to divisive political material, in line with the broader stats.

But ‘views’ and ‘engagement’ are two very different things, and that distinction is critical here.

To provide some more context, in November last year, Facebook published an official response to this Twitter account, created by New York Times journalist Kevin Roose, which shares the top ten Facebook posts that see the most engagement in the app each day.

This listing is powered by Facebook’s own data, accessible via CrowdTangle, its monitoring and analytics platform, which is primarily used by journalists. As you can see, the daily list of the posts that see the most active engagement on the platform is generally dominated by divisive political commentators, mostly right-leaning, which appears to underline Facebook’s role in amplifying such content.

Facebook argues that this depiction is inaccurate, and as noted, it has sought to clarify that the listing is not a true picture of what gains the most traction on the platform, with engagement stats being just one piece of a larger puzzle.

As per Facebook:

“Even during an election season, the majority of the content that people see [on Facebook] is not about politics. In fact, based on our analysis, political content makes up around 6% of all content viewed on Facebook. This includes posts from friends and posts from Pages (which are public profiles created by businesses, brands, celebrities, news outlets, causes, and the like).”

So while this material may see high engagement, that doesn’t necessarily mean it’s the type of content that users are shown more of in the app.

To counter this, Facebook first reportedly looked to implement new options for how it displays data within CrowdTangle, with a view to painting a more favorable picture of Facebook’s actual content engagement.

That, according to the New York Times, didn’t go as planned:

“On CrowdTangle, several executives proposed making reach data public, in the hope that reporters would cite that data instead of the engagement data they thought made Facebook look bad. But [Brandon] Silverman, CrowdTangle’s chief executive, replied in an email that his team had already tested a feature to do that, and found problems with it. One issue was that false and misleading news stories also rose to the top of those lists.”

So, no matter how Facebook tried to spin it, these types of divisive posts were still gaining traction, showing that, despite the aforementioned algorithm updates designed to discourage such sharing, this type of content remains among the most popular on the platform.

Which brings us back to the current report: it’s within this environment that Facebook has sought to reshape the narrative that its systems are helping to spread divisive content, by shifting the focus away from ‘engagement’ (the posts that Facebook users actively comment on, Like and share) and toward ‘views’ – the posts that people simply see in their News Feeds.

So, what exactly does Facebook consider to be “Views”?

According to Facebook, “Content views are counted when a piece of content appears on someone’s News Feed, is visible on their phone, computer, or tablet, and is present for a period of time that allows it to be seen”, while “content viewers” refers to the number of accounts that viewed a piece of content.

This is a vitally important distinction. The original CrowdTangle data, which Facebook is now trying to downplay, shows the types of posts that people on Facebook are actively engaging with, whereas this report shows the types of content and posts that appear in users’ feeds, which they may never click on, comment on, Like, or otherwise interact with.

This is simply the material displayed to users as they scroll – and that’s a significant difference.
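To make the distinction concrete, here’s a minimal sketch – with entirely hypothetical posts and numbers, not real Facebook data – of how a ranking by raw views and a ranking by engagement can surface completely different content:

```python
# Hypothetical feed data: broadly-shown entertainment content vs a
# political post that far fewer people see but many more react to.
posts = [
    {"post": "Recipe video",         "views": 9_000_000, "engagements":  40_000},
    {"post": "Cute pet clip",        "views": 8_500_000, "engagements":  35_000},
    {"post": "Political commentary", "views": 1_200_000, "engagements": 300_000},
]

# The same dataset produces two very different "top post" answers
# depending on which metric you rank by.
top_by_views = max(posts, key=lambda p: p["views"])["post"]
top_by_engagement = max(posts, key=lambda p: p["engagements"])["post"]

print(top_by_views)       # Recipe video
print(top_by_engagement)  # Political commentary
```

Both reports can be “true” at the same time – which is exactly why the choice of metric shapes the narrative.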

So what, according to this data, do Facebook users actually see most?

There are some issues here, to be sure, but first off, here’s the list of the ten most-viewed domains, based on Facebook link views, over the past three months.

Facebook Most Viewed Content report

So people are seeing YouTube links and UNICEF content – nothing problematic there.

The list of the most-viewed links also includes few polarizing and/or controversial domains:

Facebook Most Viewed Content report

Recipes, ‘Reppin’ for Christ’ – see, it’s not all right-wing pundits and political conspiracy theories.

The results are strange, though – as Ethan Zuckerman points out, the listing includes a speaking agency for former Green Bay Packers players, a CBD seller and the aforementioned ‘Reppin for Christ’, which sells ‘stylish, pro-Jesus apparel’.

What is it about these links that makes them some of the most viewed on Facebook? Given that there doesn’t appear to be any organic source driving this kind of referral traffic, it looks like someone is spamming the living daylights out of these pages.

Even the list of the most popular Facebook pages over the previous three months dispels the myth that Facebook is a haven for right-wing propaganda.

Facebook Most Viewed Content report

It’s all just light entertainment – nothing serious or harmful. Going by these results, Facebook isn’t disproportionately amplifying division and political angst, and there’s no health misinformation to be seen either.

But there are some key caveats here that Facebook has conveniently glossed over – most notably, the use of quarterly stats, as opposed to daily, real-time data.

Because news stories generally only gain traction on a single day, they may not have the same presence in a quarterly listing, which counts cumulative views over time. And focusing on views at all is questionable when you consider that Facebook is also including content that it recommends itself within News Feed units (such as ‘Suggested for You’).
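The aggregation-window effect is easy to illustrate. Here’s a quick sketch (with made-up view counts, purely for illustration) of how a one-day viral news spike can win any single day yet vanish from a quarterly ranking:

```python
# Hypothetical daily view counts across a 90-day quarter.
news_post = [5_000_000] + [0] * 89  # one-day viral spike, then nothing
evergreen = [100_000] * 90          # steady, modest views every day

# News dominates on its peak day...
print(max(news_post) > max(evergreen))  # True
# ...but loses over the full quarter (5M vs 9M cumulative views).
print(sum(news_post) > sum(evergreen))  # False
```

So a quarterly, cumulative listing structurally favors evergreen content, regardless of how much attention any news story commanded on its day.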

And again, the choice of ‘views’ over ‘engagement’ seems misleading, as these are the posts displayed in users’ feeds, not the ones that they’re actively engaging with.

In addition, Facebook has only published domain-level data for external links, rather than the exact URLs being shared. What, for example, are the actual YouTube URLs being passed around? That sort of deeper drill-down could provide more insight – but additional context doesn’t appear to be what Facebook is going for with this report.

More likely, Facebook is simply looking for a way to dispel the perception that it’s somehow responsible for the amplification of divisive political movements. And this data also doesn’t account for shares in private groups, shares in message threads, and so on.

Basically, it’s hard to take any real insight from this new report, as it seems highly deliberate, very designed, built around a specific aim, as opposed to genuinely informing the reader. If Facebook wanted to prove that it’s not boosting contentious material, it could create its own daily list of the posts with the highest engagement AND views, to counter the Top 10 list account.

But it won’t, because it knows that listing would look very similar to what’s already available.

Because, like it or not, Facebook does amplify contentious content and movements. And with around 70% of Americans now getting at least some of their daily news from social media platforms, even if only a small fraction of overall views, shares and engagement on the platform relates to political discussion, that’s still a significant amount of activity.

On scale alone: with over 2.9 billion monthly active users, even Facebook’s own stat that just 12.9% of News Feed content views contained a link would still equate to well over a billion links displayed in user feeds.
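A rough back-of-envelope calculation shows the scale involved. Note that the per-user daily view count below is an assumption for illustration only – Facebook hasn’t published that figure:

```python
# Back-of-envelope scale check (illustrative, not a Facebook-published total).
monthly_active_users = 2_900_000_000  # Facebook's stated MAU
link_share_of_views = 0.129           # Facebook's Q2 2021 figure (US News Feed)
assumed_views_per_user_per_day = 50   # hypothetical assumption, not a Facebook stat

daily_link_views = (monthly_active_users
                    * assumed_views_per_user_per_day
                    * link_share_of_views)
print(f"{daily_link_views:,.0f}")  # 18,705,000,000
```

Even with far more conservative assumptions about feed activity, the number of link impressions easily clears a billion per day – a “small fraction” of Facebook’s activity is still enormous in absolute terms.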

Facebook’s new report does give some extra, broad context, but the caveats attached to the information shown appear to create more concerns than they do answers in this regard.

You can read Facebook’s ‘Widely Viewed Content Report’ here.