What are Hate Speech Campaigners and Advertisers Asking Facebook to Do?


The advertisers’ boycott of Facebook, led by civil rights campaigners including the Anti-Defamation League, Sleeping Giants, and the NAACP, accuses the platform of profiting from hate (hence the campaign’s name, ‘Stop Hate for Profit’).

Numerous big-name brands say they agree. Coca-Cola, Unilever, Starbucks and Ben & Jerry’s are among those who have paused their Facebook spend, calling on Facebook to do more to tackle hate speech.

Completely cleansing Facebook of hate speech is no easy task. So much content is posted daily that posts and videos which violate Facebook’s standards will inevitably fall through the cracks. And there will always be disagreement about what constitutes hate speech, despite attempts to define it clearly. For example, Facebook’s hate speech policy explicitly bans “calls for exclusion or segregation”, but recent reports highlight internal disagreement about whether Donald Trump’s proposed ‘Muslim ban’ fell foul of this rule.

These sorts of issues apply to all the big social platforms, but the prevailing sentiment in the industry at the moment is that Facebook in particular has the most work to do. So what steps do Mr. Zuckerberg and his team need to take to make a genuine difference to their platform?

Changes to political posts and ads

One of the most visible differences between Facebook and other social platforms in the last few months has been its handling of political posts, most notably those from US president Donald Trump.

Facebook has exempted politicians’ posts from some of its content policies, allowing posts to remain visible even when they contain misleading statements, or incite violence. Zuckerberg has defended this stance, saying it’s better that politicians are able to have their statements and claims discussed in the public forum.

But those calling for change from Facebook disagree.

A two-year independent civil rights audit commissioned by Facebook itself said the social platform should take a stronger interpretation of its voter suppression policies – “an interpretation that makes those policies effective against voter suppression and prohibits content like the Trump voting posts [which falsely claimed that absent voter applications sent out in Michigan were illegal mail-in ballots]”.

Others call for more sweeping reform of Facebook’s political post policies. The Anti-Defamation League says Facebook should “ensure accuracy in political and voting matters by eliminating the politician exemption”. This would involve removing misinformation related to voting, and prohibiting any calls to violence by politicians.

Jake Dubbins, co-founder of the Conscious Advertising Network (CAN), an industry coalition which aims to clean up the ethics of modern advertising, said the model used by Twitter could be replicated by Facebook.

“I think what Twitter have started to do, where they keep up tweets that are in the public interest but flag them as untrue, is a step in the right direction,” he said.

Dubbins added that cleaning up political advertising on Facebook is also vital.

“Political advertising clearly either needs to not exist on the platform until there is formal regulation, or you have to have a form of independent fact-checking. I find it crazy that you can tell a lie, put a bunch of money behind it, and swing an election!” said Dubbins. “We gave evidence to the House of Lords recently, and their report on the subject talked about a ‘pandemic of misinformation’, which is a threat to democracy.”

Greater priority given to civil rights and hate speech

Facebook says tackling hate speech and harmful content is already high up on its agenda. Nick Clegg, Facebook’s VP of global affairs and communications, said in a recent blog post that Facebook invests “billions of dollars each year in people and technology to keep our platform safe. We have tripled – to more than 35,000 – the people working on safety and security. We’re a pioneer in artificial intelligence technology to remove hateful content at scale.”

But others say more cultural and structural changes are needed to make tackling hateful content a higher priority.

The civil rights auditors said in their final report that Facebook needs to make “more visible and consistent prioritisation of civil rights in company decision-making overall”. The report added that more resources need to be invested specifically “to study and address organized hate against Muslims, Jews and other targeted groups on the platform”.

The ADL agrees that specific areas need more investment. The group suggests that Facebook should create “expert teams to review submissions of identity-based hate and harassment”, and also enable harassed users to connect with a live Facebook employee.

CAN’s Dubbins said that higher-level, cultural changes are also needed to make hate speech a higher priority.

“I think there’s been a culture of denial over the years, and the internal belief that ‘what the world needs is more Facebook’ has persisted,” he said.

And Dubbins added that making senior figures at Facebook more aware of the impact of hate speech could help bring cultural change. “We recently took the director of TellMAMA (a support group for victims of Islamophobia) in front of one of the big platforms,” he said. “And she talked about having had to move offices because of direct death threats to her and her staff. And the big platforms are not used to hearing that lived experience”.

Both Dubbins and the ADL said it is important to appoint civil rights advocates at all levels of Facebook, and to ensure they have genuine power to hold the company to account.

Tighter definitions of hate speech and hateful groups

Facebook’s hate speech policies already ban “direct attacks on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity and serious disease or disability”.

But campaigners say Facebook’s definition of ‘direct attacks’ needs tightening up.

The civil rights audit argued that Facebook should “go beyond banning explicit references to white separatism and white nationalism to also prohibit express praise, support and representation of white separatism and white nationalism even where the terms themselves are not used”.

The ADL agreed, saying that private and public groups which attack people based on Facebook’s protected characteristics slip through the net. The group called on Facebook to find and remove all groups explicitly or implicitly supporting white supremacy, militia, antisemitism, violent conspiracies, Holocaust denialism, vaccine misinformation, and climate denialism.

Will changes compromise free speech?

Facebook has said its policies are designed to prevent outright abusive behaviour while preserving free speech. And there are genuine concerns that overzealous hate speech policies could end up censoring controversial, rather than hateful, viewpoints.

Zuckerberg himself made his case in a speech at Georgetown University last year. “We can continue to stand for free expression, understanding its messiness, but believing that the long journey towards greater progress requires confronting ideas that challenge us,” he said. “Or we can decide the cost is simply too great. I’m here today because I believe we must continue to stand for free expression.”

Laura W. Murphy, the civil liberties lawyer who led the recent audit of Facebook, argued in the final report that protection of free expression is important, but that “the value of non-discrimination is equally important, and that the two need not be mutually exclusive.” 

“As a longtime champion of civil rights and free expression I understand the crucial importance of both,” she said. “For a 21st century American corporation, and for Facebook, a social media company that has so much influence over our daily lives, the lack of clarity about the relationship between those two values is devastating.”

CAN’s Jake Dubbins agreed, saying that Facebook already has plenty of oversight to ensure freedom of expression isn’t violated.

Dubbins pointed to Facebook’s Oversight Board, which will have the power to override Facebook’s decisions on contentious content. The board is led by Thomas Hughes, previously executive director of Article 19, a nonprofit focused on defending free speech.

“But what’s missing is that counter to Hughes,” said Dubbins. “It needs to have that balance, where you’ve got experts on free speech, but also experts on hate speech and discrimination too”.
