Facebook is taking stronger action to promote climate science, and tackle related misinformation on its platforms, as part of a renewed push for a broader, more inclusive global effort to combat the growing climate crisis.

As explained by Facebook:

“Climate change is the greatest threat we all face – and the need to act grows more urgent every day. The science is clear and unambiguous. As world leaders, advocates, environmental groups and others meet in Glasgow this week at COP26, we want to see bold action agreed to, with the strongest possible commitments to achieve net zero targets that help limit warming to 1.5˚C.”

Facebook has been repeatedly identified as a key source of climate misinformation, and it clearly does play some role in this respect. But with this renewed stance, the company’s looking to set clear parameters around what’s acceptable, and what it’s looking to take action on, to play its part in the broader push.

First off, Facebook is expanding its Climate Change Science Center to more than 100 countries, and adding a new section that will display each nation’s greenhouse gas emissions compared with its stated commitments and targets.

Facebook Climate Science Center

In September of last year, Facebook launched its Climate Change Science Center to connect users with more accurate climate information. The data that fuels the Center’s updates comes directly from authoritative sources in the field, including the Intergovernmental Panel on Climate Change and the United Nations Environment Programme.

The new goal-tracking data for each country will add a layer of accountability, and may increase pressure on nations to fulfill their promises by raising coverage and awareness of their progress.

Facebook’s also expanding its informational labels on posts about climate change, which direct users to the Climate Science Center for more information on related issues and updates.

Facebook climate misinformation labels

During the COP26 climate meeting, Facebook is also stepping up its efforts to combat climate misinformation:

“Ahead of COP26, we’ve activated a feature we use during critical public events to utilize keyword detection so related content is easier for fact-checkers to find — because speed is especially important during such events. This feature is available to fact-checkers for content in English, Spanish, Portuguese, Indonesian, German, French and Dutch.”

That raises the question of why Facebook wouldn’t employ this technique all of the time, but presumably it’s a more labor-intensive process that can only be sustained in short bursts.

By combating such claims as they ramp up (Facebook notes that climate misinformation ‘spikes periodically when the conversation about climate change is elevated’), the company should be able to lessen their impact and blunt some of the amplification that comes with the network effects of its scale.

Finally, Facebook also says that it’s working to improve its own internal operations and processes in line with emissions targets.

“Starting last year, we achieved net zero emissions for our global operations, and we’re supported by 100% renewable energy. To achieve this we’ve reduced our greenhouse gas emissions by 94% since 2017. We invest enough in wind and solar energy to cover all our operations. And for the remaining emissions, we support projects that remove emissions from the atmosphere.”

The company’s next step will be to work with suppliers that are also aiming for net zero, which, once fully implemented, would offset the climate impact of its broader business operations.

Facebook’s track record on this front is spotty, not because of its own initiatives, but because of how controversial content can be amplified by the News Feed algorithm, which inadvertently incentivizes users to share more divisive, controversial, and anti-mainstream viewpoints in order to gain attention and spark engagement in the app.

This is a major flaw in Facebook’s systems, one the company has often acknowledged, albeit obliquely. Part of the reason such content thrives on Facebook is simple human nature: people share and engage with the issues that provoke them. Facebook’s position is that this is a problem with people, not with the platform.

As Facebook’s Nick Clegg recently explained with regard to the related topic of broader political division:

“The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.”

So, according to the research Clegg cites, Facebook isn’t the problem; it only appears to play a greater role because individuals now have more means to discuss and engage with such issues.

However, this lets Facebook off the hook a little too easily. The built-in incentives of Likes and comments, and the dopamine rush people get from them, are a major issue: they reward users for posting more contentious material, because it generates more notifications and increases their visibility. There is an underlying mechanism on Facebook that encourages this sort of behavior, whether Facebook wants to admit it or not.

Which is why it’s critical that Facebook act – but the real question is how effective, or even feasible, such countermeasures can be, given Facebook’s scale.