Facebook Skirmish with UK Police Agency

I flew to London a couple of weeks ago with the intention of catching a flight to Spain for a tech conference, but British aviation officials changed my plans by grounding planes because of the ash from Iceland’s volcano.

I was able to use the time to delve into an ongoing controversy between Facebook and the United Kingdom’s Child Exploitation and Online Protection Centre (CEOP), the official government police agency charged with protecting kids from Internet crime.

The dispute, which has been covered heavily in the British press, seems silly at first glance, but it reflects an important philosophical disagreement over how best to protect kids from online abuse.

CEOP wants Facebook to prominently display an orange button or logo on every page visible to U.K. users so they can easily click to a webpage on CEOP’s site to report abuse to police or get information resources about cyberbullying, hacking, viruses, harmful content and “sexual behavior.”

Several press reports have referred to the CEOP logo as a “panic button.” But in a phone call from Washington, where he was stuck because of grounded planes, CEOP CEO Jim Gamble said it’s not about panic but about a place to report and get safety information. He’s right about that — if someone really were in a life-threatening situation, they would be better off calling 999, the British version of 911.

During my time in London, I met with Facebook’s U.K. policy chief, Richard Allan, who told me that Facebook is willing to make some changes to its reporting system to satisfy CEOP, but that it’s not willing to adopt the CEOP button or redirect all abuse complaints to a police-operated site.

Elliott Schrage, who works out of Facebook’s Palo Alto office as vice president in charge of global public policy, reiterated that position. The CEOP plan, Schrage said, “invites people to feel there is more of a problem than there is.”

He’s right. The CEOP “button” would suggest that Facebook is a dangerous place for kids and that there is an imminent need to report issues to the police. But research has shown there are relatively few cases where kids are being threatened in a way that warrants a police report. Most abuse situations involve peer-to-peer bullying or harassment or inappropriate postings by young people. While police, along with other professionals, can advise kids to avoid such dangers, these reports rarely represent actual crimes.

Schrage says that even when complaints do involve potential crimes, users are better off reporting them to Facebook, which, in turn, can report real crimes to the police.

He said Facebook tested a police reporting system a couple of years ago at the urging of the New Jersey attorney general and found that “there were fewer reports made to authorities because people were intimidated, and as a result, Facebook was a less safe place because there were fewer reports.”

In our phone conversations, CEOP’s Gamble argued that the button serves as a deterrent to crime because it puts would-be offenders on notice that Facebook is cooperating with the police. But I find that argument hard to buy, considering that sexual abuse against children in Britain, the United States and most other countries is a very serious crime. If a long prison sentence and a spot on a sex offender registry won’t deter someone, I hardly think that a website button will.

Facebook’s approach is a bit like hotel security: if you have a security issue at a hotel, you typically report it to hotel security, which will call the police if it can’t handle the problem. And hotel security can certainly deal with common problems like a neighbor playing a TV too loud.

Facebook has announced a number of significant improvements to its safety education and reporting programs in Britain, including a redesigned abuse reporting system that will allow users to report directly to CEOP (but not with a CEOP button). It will also provide safety organizations with millions of dollars’ worth of free advertising.

Although he asked me not to quote him by name, a police official in the U.K. told me that he sides with Facebook on this issue because the company has been very cooperative. “We get a lot out of Facebook,” he said.

Personally, I think Facebook could do a better job with its abuse reporting by putting its own clearly marked abuse-report and safety links on each page and, when there is clear evidence of criminal activity, referring those cases to the appropriate law enforcement agencies. It might also help to have an independent body analyze or audit how well it handles reports on a global basis.

If Facebook were a country, its “population” of 400 million users would make it the third-largest in the world — behind only China and India. To quote Spider-Man, “With great power comes great responsibility.”

(Disclosure: My nonprofit Internet safety organization serves on Facebook’s safety advisory board and receives funding from Facebook.)