By Larry Magid

Media companies, governments and courts have long debated where to draw the line between speech that should be permitted and speech that should be banned. The FCC, for example, has strict limits on what can be broadcast on radio and TV. Cable, print and online media outlets -- which have no government censors -- still have policies that limit what they publish, broadcast or post. There are even some legal limits on speech in the United States, where we have our cherished First Amendment. Child pornography, for example, is not protected speech.

Social media companies must also grapple with what to permit and what to ban. One difference between social and traditional media is that social media companies' content providers are their members -- not professional journalists. Facebook doesn't make editorial decisions the way a newspaper or broadcaster would. It publishes anything anyone posts as long as it doesn't violate its "community standards," which ban users from posting pornography and limit displays of nudity. There are also restrictions on violence and threats, encouragement of self-harm, bullying, harassment and hate speech.

A close read of these community standards illustrates that they aren't always cut and dried. "Facebook does not permit hate speech, but distinguishes between serious and humorous speech," the standards say. They also say, "we understand that graphic imagery is a regular component of current events, but must balance the needs of a diverse community. Sharing any graphic content for sadistic pleasure is prohibited."
Hard to get it right
With certain types of speech, companies like Facebook are dealing with competing rights. One could argue that even bigots have a right to spew their venom, but one can also argue that women and others who feel threatened by speech that, in some cases, advocates hate and violence have a right to feel safe and secure from vitriol that could encourage some to harm them. It also gets complicated for global companies. Europe, for example, has laws against some types of hate speech that, in the U.S., are considered protected speech. In Turkey, as Jeffrey Rosen pointed out in the New Republic, it’s against the law to insult the country’s founder, Mustafa Kemal Atatürk.
Sometimes they have trouble getting it right.
A couple of weeks ago, Soraya Chemaly, Jaclyn Friedman and Laura Bates wrote an “Open Letter to Facebook” on the Huffington Post, accusing the social network of allowing “groups, pages and images that explicitly condone or encourage rape or domestic violence.”
A campaign spearheaded by Chemaly and the group Women, Action & the Media (WAM) called upon Facebook users to pressure advertisers to pull their ads from the service until Facebook banned gender-based hate speech on its site. Nissan and a number of other advertisers pulled their ads, according to The Associated Press.
Although these and other images have likely already been removed from Facebook, WAM’s website has examples, including one that depicts a woman with the caption “Win her over with chloroform: The way real men get the girl” and another, depicting a beaten woman and a fist-wielding man, that says “women deserve equal rights. And lefts.”
Chemaly, who said in an interview that some of the worst images and videos weren’t posted, referred to some of these posts as “human rights violations being used as entertainment.”
Last Tuesday, Facebook responded by pledging to update the guidelines its support staff use to decide whether to pull down reported cases of hate speech. In a blog post, Facebook’s vice president of global public policy, Marne Levine, acknowledged, “In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate.”
Facebook’s community standards clearly prohibit attacking people based on their “race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition.” But the service has allowed for some images that have either attempted to be humorous or are purportedly posted as a commentary on what most would consider distasteful speech or images.
Defining hate speech
In other words, there is some question as to how to define hate speech. For example, it might be considered hate speech to say “All women are sluts,” but it might be acceptable to say, “sluts are good.” Calling an individual woman a slut would probably be considered bullying, but if that person were a celebrity, it might be permitted. One can get away with expressing hatred for Judaism, but not for hating Jews. Sometimes the distinctions are even more subtle. It’s hard enough for legal scholars, ethicists and judges to get this right, let alone support staff dealing with an enormous number of potentially offensive images and posts.
On stage at the Wall Street Journal’s All Things Digital conference last week, Facebook Chief Operating Officer Sheryl Sandberg said the company “already took down everything that people were protesting about.”
Sandberg referred to “a real tension between creating a safe and protected community and free expression” and claimed that Facebook is “the most protective site to make sure there’s not violence against women.”
Sandberg didn’t pledge to take down all instances of what she called “crude humor,” but she said Facebook is no longer allowing it to be posted anonymously. “Put your name on your sexism,” she said. (Here’s my live blog of her talk.)
Last week’s agreement between Facebook and the coalition of groups that protested its content policy is only a start. As Chemaly put it, “the devil is in the details.” But when it comes to speech, there will never be universal agreement as to what is and isn’t acceptable. Even the U.S. Supreme Court sometimes has a hard time agreeing on speech issues. It’s a tough job for Facebook, but as the proprietor of a service with a global membership nearly four times the population of the U.S., it’s a responsibility the company has to face.
New York Times editorial “Hate Speech on Facebook”
The Delete Squad: Google, Twitter, Facebook and the new global battle over the future of free speech (Jeffrey Rosen in the New Republic)
The global free speech experiment for participants of all ages (Anne Collier, NetFamilyNews)
Why Facebook and Google Should Err on the Side of Free Speech (Larry Magid, Forbes)
Disclosure: Larry Magid is co-director of ConnectSafely.org, a nonprofit Internet safety organization that receives financial support from Facebook and other companies.