How social media platforms like Twitter and Facebook manage the spread of mis- and disinformation—and their governance approaches to this and related problems—has been a major concern over the past few years. Ahead of the 2020 presidential election in the United States, experts from the Berkman Klein Center for Internet & Society convened to discuss how platforms are approaching these challenges and what they can improve going forward.

The event featured evelyn douek, a lecturer on law and S.J.D. candidate at Harvard Law School, and Julie Owono, a lawyer, executive director of Paris-based Internet Without Borders, and member of Facebook’s Oversight Board. Both are affiliates of BKC. The conversation was moderated by Oumou Ly, a staff fellow for BKC’s Assembly: Disinformation program.

douek set the stage for the discussion by providing an overview of content moderation since 2016, including approaches like takedown policies, tracking coordinated inauthentic behavior, and adding “friction” to sharing processes, such as flagging and labeling content.

“All this stuff sounds incredible. As usual, though, with platforms, the thing is a policy can look great on paper, but the question is, will they and can they enforce it?” douek said. When it comes to content moderation, she said, there have certainly been examples over the past few years of ineffective enforcement of policies. “You can have an excellent hate speech policy or an excellent incitement to violence policy, but if you just don’t have the resources and aren’t dedicating sufficient attention to it, does it really matter? And that’s going to be the big question.”

However, douek noted, platforms and their policies are only one piece of the puzzle. “Platforms have a huge amount of responsibility here, but [that] is always going to be a somewhat limited lever to pull when there are massive other institutional failings,” she said.

Owono, in turn, shared insight into the role of Facebook’s Oversight Board and how it takes on cases. She also emphasized that the majority of Facebook users are outside of the U.S. and that platforms could learn from elections beyond the U.S. and Europe.

“There have been some case studies and practices that were rolled out before elsewhere, and we have seen the impact that they could have,” Owono said. “So that’s a little disappointment that I have with many of the policies that have been rolled out.”

As an example, Owono pointed to the manipulated video, or deepfake, falsely showing an inebriated Nancy Pelosi. “Suddenly, platforms woke up to the fact that, yes, people can share manipulated media on our platforms,” she said. Yet the African country of Gabon, she noted, had already witnessed and engaged in conversations about the dangers of political deepfakes due to a manipulated Facebook Live video. “So had the platforms, again, given a thought about this, or at least been open about it, they [probably] would have had this conversation even before the Nancy Pelosi case came out.”

“I think it’s unfortunate, especially since many of these platforms are repeating all the time that they’re ‘global.’ Yes, you are global, and there have to be consequences to that globality. And one of these consequences is paying attention also to your users who are beyond the borders of the United States, of Canada, and other places in the world,” Owono said.

Ly pointed out that after the 2016 U.S. presidential election, “the platforms learned pretty quickly how to detect and deal with disinformation from foreign actors. But as we’ve seen over the past couple of weeks, and certainly over the last couple of years, that’s not necessarily the case when it comes to domestic disinformation, especially when the purveyor of that domestic disinformation is within the U.S. government, including at the highest levels of the U.S. government.” She asked douek and Owono what platforms could learn, or do, about applying these policies to political figures.

Owono highlighted disparities in how platforms apply their policies to different countries: “It has always been a kind of criticism against the platforms, that they tend to … apply their policies with more severity when it comes to outside leaders, and particularly leaders from Global South countries, especially those that are not on very good terms with Global North countries, including Iranian officials and Russian officials,” she said, calling for “more coherence and consistency” in platforms’ responses to governments around the world.

The challenges associated with content moderation on platforms like Twitter and Facebook aren’t solely for platforms to solve; they frequently involve government and civil society stakeholders as well, Ly said.

“The fight against disinformation is not just [a problem] of platforms. It’s really a question of our democracies, of our human rights, of the rule of law,” Owono said. “It’s the government’s responsibility to make sure that citizens not only have access to the information but are able to read that information.”

Owono added that, beyond the duties of governments and platforms, civil society organizations also play a critical role in these discussions. “We wouldn’t have all this conversation if we didn’t have civil society organization researchers, who have been doing the work of alerting [us to] what we are seeing now.”

douek agreed, zeroing in on one more set of actors in these challenges: users themselves and their behavior on platforms. “I think we all should play our part in this process as well. I’m always a little bit shocked and surprised by people who despair about the state of the online ecosystem, and then smash the retweet button on spurious claims when they like them,” she said. “I think if we try and … be the Twitter that you want to see in the world, that’s a good way of proceeding. To try and be good actors, particularly over the next week and couple of weeks here in America. To try and make sure you’re checking things before you spread them and amplify them is a small thing that we can all do as well.”