Istanbul is the site of this year’s U.N. Internet Governance Forum (IGF), while the Geneva meetings will focus on digital media and children’s rights per the United Nations Convention on the Rights of the Child, which has been ratified by nearly every country in the world, except the U.S. and Somalia.
The IGF is an annual event where “stakeholders” from governments, industry, nonprofits and academia discuss a wide range of Internet policy issues. Anne Collier and I are representing ConnectSafely.org, the nonprofit Internet safety organization where we serve as co-directors. (Disclosure: ConnectSafely receives financial support from some tech companies, including Facebook, Google and Yahoo.)
The workshops I’m participating in focus on child online protection, balancing child safety with child rights, and empowering youth through digital citizenship.
I organized the child safety and child rights workshop because I want to explore how to protect children against potential online harms in ways that don’t take away their free speech rights or their right to explore all the amazing resources available online. In a way, it’s a perfect segue to the Geneva conference about digital media and the rights of the child. Article 13 of the United Nations Convention on the Rights of the Child (UNCRC) states “The child shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of the child’s choice.”
Clearly, “any other media” includes the Internet, which means that by international treaty, children have codified rights when it comes to what they can read and what they can say. And even though the U.S. hasn’t ratified this convention, Americans do have First Amendment rights which, as far as I can tell, apply to everyone, including minors.
Yet, in the interest of protecting children, we sometimes deny them the right to access material and express themselves.
Many schools in the U.S. and other countries employ filters that restrict access to some websites or apps. These types of filters have been around for a long time and were first used mostly to block pornography; over time, they have evolved to also block sites that advocate or depict violence or the use of alcohol or illegal drugs, or that promote self-harm such as cutting or anorexia. A purist interpretation of the First Amendment or the Convention on the Rights of the Child could be used to argue against the use of these filters for any purpose, but I think most people would agree that parents have the right to protect young children from potentially harmful or disturbing content, and that schools, even public schools that are run by governments, have a right and responsibility to keep kids from accessing certain content within their facilities. But such filters are not used only to block porn, violence and self-harm.
Depending on how they are configured, filters can also block access to social media sites, which is common in many schools in the U.S. and other countries. They can also be used to ban sites that officials in some countries simply don’t want students to access. Ironically, Turkey, which is hosting this year’s governance forum, filtered the Internet for all of its citizens earlier this year, blocking Twitter and YouTube for a while for what appear to be purely political reasons.
I’m particularly concerned about schools blocking social media. While it’s certainly fair to argue that students should be focusing on their studies while in class, it strikes me that a wholesale ban on social media sites raises some troubling free speech issues.
Social media is where people exchange information and ideas and it’s frequently used for political, cultural and religious expression, not unlike what’s printed in newspapers or discussed in hallways. And, while some schools block social media, teachers at other schools encourage its use and incorporate it into their curriculum as a way to encourage kids to express themselves, broaden their horizons and share learning resources with peers and others from around the world.
I’m also troubled by the Children’s Online Privacy Protection Act (COPPA), a well-meaning federal law that has the unintended consequence of preventing kids under 13 from expressing themselves on most social media platforms unless they lie about their age. Millions of kids have lied to use these services, often with their parents’ help.
Of course there are risks in social media, but there are also enormous benefits. Sports can be risky, but that doesn’t stop most schools from encouraging kids to participate. If schools treated sports the way they treat social media, they would ban baseball, football and soccer on school grounds and deny their students access to the safety equipment, rule enforcement, coaching and camaraderie associated with school athletics, knowing full well that kids would still play those sports when they are away from school.
Whether you’re a decision maker at home, for a school or for an entire country, protecting children from harm will always be a major priority. But avoiding harm also means protecting children’s rights, including the right to access media. It’s a delicate balance that requires thought and, most of all, respect for children and their rights. That’s not too much to ask.
This column first appeared in the San Jose Mercury News.