Madeleine Handaji and Kristen Gianaris
Rabat – After Facebook revealed the names of the 20 members who will make up its new oversight board on May 6, questions abound as to the true purpose of the “big brother” board and how the social media giant chose its members.
“The purpose of this body would be to uphold the principle of giving people a voice while also recognizing the reality of keeping people safe,” Facebook’s Mark Zuckerberg said in November 2018 when he first announced the creation of the board.
The board will sit to review content on the social network and has the decision-making power to remove or censor content it finds in violation of the platform’s “community standards.” The board can overrule decisions made by Facebook’s moderation team and even Zuckerberg himself.
The launch of an oversight board comes after Facebook found itself sinking into a quagmire of controversy. In 2016, the social network faced backlash after moderators removed the famous “Napalm Girl” photograph, citing nudity. The photograph, taken during the Vietnam War in 1972, depicts an injured nine-year-old child running naked during a napalm attack.
Facebook has also faced pressure to censor and remove posts promoting nationalist, racist, or extreme right-wing views. Critics of the social network have also called for the censorship or moderation of anti-vaccination content, political campaigning, and violent posts.
Hopes for objectivity in diversity
The new moderation board faces a moral and political minefield of subjectivity: How, one might ask, can it hope to maintain the freedom of expression Facebook users have come to enjoy, while also upholding the ever-changing sea of the platform’s “community standards”?
With the reveal of the 20 names, Facebook announced that the board members speak 29 languages between them and have lived in 27 countries. According to the social network, the board members hold opposing, and sometimes controversial, views.
“If [Facebook] were being Machiavellian and this was just a fig leaf to do business as usual, you could have picked people that would have given you a quieter life,” board member and ex-editor in chief of the Guardian Alan Rusbridger told his former newspaper.
The board includes a government official from Israel, human rights advocates from Pakistan and West Africa, and a former US federal judge appointed during the George W. Bush administration. Two former newspaper editors from the UK will also sit on the board, alongside a former Danish prime minister, several human rights lawyers and professors, and an Arab Nobel Prize winner.
The choice of board members has raised eyebrows across the globe. Rusbridger approved the Guardian’s reporting on Edward Snowden’s NSA leaks about the US administration, while board member Emi Palmor served as the director-general of the Israeli Ministry of Justice.
Law professor John Samples is another interesting choice. Samples currently heads a libertarian think tank and has published numerous works on social media and speech regulation.
Optimism meets criticism
“This theory of oversight is heavily informed by legal scholarship, which is slow and administrative and technical in nature, when we need something much more suited to the speed of the technology itself,” said Joan Donovan, the research director of Harvard’s Shorenstein Center and an expert on media manipulation.
“They’re going forward with this really long drawn out procedural mechanism that doesn’t address what the problem is – which is that viral content only needs to be on the internet for 4-8 hours for it to do its damage.” Donovan added that the company was putting the new board “in the middle of a landfill and saying, ‘You sort it out.’”
Concerns abound that the curated board remains powerless and may be little more than a pretty face, allowing the social media outlet to publicly exercise due diligence. While some media experts suggest Facebook would be better served by investing in the engineering of its algorithms, others argue the company is not sincerely committed to a human rights-focused agenda.
“If Facebook really wanted to take outside criticism seriously at any point in the past decade, it could have taken human rights activists seriously about problems in Myanmar; it could have taken journalists seriously about problems in the Philippines … But it didn’t. This is greenwashing,” said Siva Vaidhyanathan, author of a book on Facebook and a professor of media studies at the University of Virginia.
Beginning a lengthy process
With over 2 billion Facebook users worldwide, there is little information to demonstrate the board’s potential efficacy. The oversight board is expected to begin with “dozens” of initial cases, a small proportion of the content considered necessary to review.
Facebook has invested $130 million to fund the board over the next six years. Over the course of the board’s initial funding phase, the number of members is expected to double, increasing its capacity to take on cases. Some members have expressed “realistic” views that the board will need to go through some trial and error before fully establishing legitimacy in the public eye.
Nick Clegg, Facebook’s head of global affairs, told Reuters, “I don’t expect people to say, ‘Oh hallelujah, these are great people, this is going to be a great success’ – there’s no reason anyone should believe that this is going to be a great success until it really starts hearing difficult cases in the months and indeed years to come.”