In a new report issued today, Facebook summarized next steps in a plan to establish an independent oversight board for content moderation and appeals. For Harvard Law School Professor Noah Feldman, who first proposed the idea of a content oversight board and continues to serve as an adviser, helping develop a new approach to one of the most vexing challenges confronting social media outlets like Facebook has been one of the most exciting things in his professional life.

In recent years, Facebook has been embroiled in a debate about how the internet giant regulates content on its platform and, in doing so, balances competing values that range from supporting free expression to combating hate speech. Feldman hatched a potential solution.

An expert in constitutional law, with a special focus on free speech and the relationship between law and religion, Feldman first pitched the idea for an oversight board to Facebook in January 2018. In his pitch, the noted legal scholar proposed an independent, transparent committee to help regulate the company’s content decisions. Facebook was intrigued and asked Feldman to serve as an adviser and, ultimately, to write a white paper.

Last November, Facebook Co-Founder and CEO Mark Zuckerberg laid out a plan that reflected the Harvard Law professor’s suggestion for a new way for people to appeal content decisions through an independent body. Earlier this year, Facebook released a draft charter outlining a series of questions the company sought to answer through a global input process, including public consultation, to form that body. Today’s announcement was the latest step in the process Feldman was instrumental in launching.

“For me it is an exciting opportunity to be able to work closely with a social media platform that has more than 2.3 billion users to try to implement an institution that will work for protecting freedom of expression,” said Feldman. “Facebook is undertaking a really bold experiment in borrowing an institution from public law and trying to apply it to the private sector and to social media and the internet.”

In a video recording, released as part of today’s announcement, Zuckerberg discussed governance and what that looks like for the technology industry with Feldman and Jenny Martinez ’97, dean of Stanford Law School. The video is part of a series of discussions on the future of technology and society.


Since Facebook announced plans to create an Oversight Board last November, the company has hosted six in-depth workshops and 22 roundtables attended by more than 650 people from 88 different countries. The company had personal discussions with more than 250 people, including Jonathan Zittrain ’95, HLS’s George Bemis Professor of International Law.

In February, Zuckerberg visited Harvard Law School and participated in a discussion with Zittrain and students from Harvard’s Techtopia program and Zittrain’s Internet and Society course. The nearly two-hour discussion was part of a series of study sessions for the Techtopia initiative, a program that brings together students from across the University to explore problems.

Over the course of Facebook’s listening tour, the company received more than 1,200 public consultation submissions on how the Oversight Board could be designed and how it should function.

Today’s report includes appendices that summarize all of the feedback and recommendations the company received through those conversations, workshops and roundtables; internal research; white papers; media reports; and public proposals.

Some general themes included in the report are:

  • First and foremost, people want a board that exercises independent judgment — not judgment influenced by Facebook management, governments or third parties. The board will need a strong foundation for its decision-making, a set of higher-order principles — informed by free expression and international human rights law — that it can refer to when prioritizing values like safety and voice, privacy and equality.
  • Also important are details on how the board will select and hear cases, deliberate together, come to a decision and communicate its recommendations both to Facebook and the public. In making its decisions, the board may need to consult experts with specific cultural knowledge, technical expertise and an understanding of content moderation.
