Parents have a lot to learn about mobile safety, privacy and security: from their kids

After spending a lot of time writing and editing our new free booklet, A Parents’ Guide to Mobile Phones, my ConnectSafely.org colleagues and I came to the realization that parents do have a lot to learn — from their own kids.

Sure, our guide has all sorts of tips and suggestions, but our most important point is that parents should talk with their kids about their mobile use.  Note I said “with,” not “to.” An open, two-way conversation with your kids about their mobile use is a lot more effective than a lecture. Besides, you might discover they know more about safety, privacy and security than you think. Maybe more than you do.

ConnectSafely produced the guide in partnership with CTIA, with support from AT&T, Sprint, T-Mobile and Verizon Wireless.

Studies from the Pew Research Center, as well as ethnographic research from danah boyd and others, have found that kids are far from clueless when it comes to privacy and safety issues. That’s not to say that your kids don’t have anything to learn — we all do — but you shouldn’t assume that they’re using their technology in a reckless fashion, even if you’ve seen some press reports making that claim.

Advice for parents

In our guide, we advise parents on things that matter, including the best age to get a child a first cell phone, how to make sure your child’s privacy is protected and how you can help ensure your child uses apps that are safe and appropriate. We also talk about how to use the phone’s settings to maximize privacy and security, and advise parents to make sure their kids are using a pass code of some type so that others can’t use their phones. In addition to protecting their data, keeping others from accessing their phone reduces the risk of someone using it to harass or bully others and get your kid into trouble.

Location

The guide warns parents about the possible misuse of geolocation — the feature that allows apps to pinpoint the phone’s (and therefore the user’s) location. While these features can enhance safety and give parents the ability to track their kids, they can also be misused. As we say in the guide, “With the exception of E911, it’s possible to turn off geolocation, either for the entire phone or just for specific apps.”  Parents and kids “can review the apps on their phones to see which apps share location. If you’re uncomfortable with any of them, you can try to turn off the app’s location feature or just delete the app.”

Tools

There are parental control tools from carriers, phone makers and app developers that parents can use to monitor or even limit what their kids can do on their phones. While such tools can be helpful in some cases, they’re not for every kid. If you do use monitoring or filtering tools, talk with your kids about why you’re using them, and consider weaning them off those tools as they show that they are responsible mobile users. Kids don’t stay kids forever, and our goal as parents is to teach them the critical thinking skills that will last a lifetime in whatever situations they encounter. As my ConnectSafely co-director Anne Collier pointed out, safety, privacy and security “depend less and less on external safeguards (such as parental control tools, which can give parents a false sense of security) and more and more on the ‘filtering software’ in their heads and hearts.”

A Parents’ Guide to Mobile Phones is available in both English and Spanish.

 

Posted in Child safety | Comments Off

Aspen Institute Task Force takes on education in the Internet age

The report: Learner at the Center of a Networked World

A task force report & a student bill of rights: Task Force member and ConnectSafely co-director’s commentary about the report and the work of modern-day student activists

Make personalized learning a reality, says ed tech task force: Observations and commentary by Family Online Safety Institute (FOSI) CEO Stephen Balkam


Google complies with “right to be forgotten” but is it the “right” ruling?

The European Union Court of Justice ruling about the “right to be forgotten” is more wrong than right. Of course I sympathize with the individual who pressed the claim — a Spanish citizen who was upset that Google was linking to a 1998 newspaper notice about his repossessed home. But even though the newspaper article may be old and by now irrelevant, it was still factual, and Google’s only crime was pointing to a web page without making any value judgment — something Google has proudly done since the day it was founded.

Why it’s a bad decision

I have several problems with the court’s order. For one thing, it puts Google into the position of having to make editorial decisions about the content it points to, which kind of defeats the purpose of a search engine. I’m all for curated content, but that’s not what Google does. It simply scours the web to unearth content and makes that content available via search. If Google has to start making decisions about whether to include content that someone might want forgotten, what other editorial decisions might it make about content? Do we really want Google effectively censoring the web by failing to surface legal content?

Another problem I have with the right to be forgotten is that history is history, regardless of how unhappy someone might be to have it revealed. If something happened — even if it is outdated or irrelevant — it still happened and stories about it are part of the historical record. It’s the electronic equivalent of shredding old newspapers because you don’t like what’s printed in them.

It also creates a false sense of security because the court ruling doesn’t require that the information be taken down — just removed from search results. In other words, it’s still there and there may be other ways for people to find it.

Google’s Search removal request form

Google has created a web page where you can request that they “remove results for queries that include their name where those results are ‘inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed.'”

On that page, the company said that it “will assess each individual request and attempt to balance the privacy rights of the individual with the public’s right to know and distribute information.” Factors will include whether the results include outdated information about you, as well as whether there’s a “public interest in the information—for example, information about financial scams, professional malpractice, criminal convictions, or public conduct of government officials.”

Europe only

The form is for Europeans only. The person has to select their country from a dropdown menu of 32 countries and must provide some form of identity verification and check a box certifying that the “information in this notification is accurate and that I am the person affected by the web pages identified, or I am authorized by that person to submit this request.”

People are required to provide the URL they want removed along with an explanation.

No appeal

Google has no choice but to abide by the court’s decision and, because it is the highest court in Europe, there is no appeal.

For more, see Anne Collier’s post, Remember: The ‘right to be forgotten’ is shared


Missing Children’s Day — let’s bring them all home

Since it was founded 30 years ago, the National Center for Missing and Exploited Children (NCMEC) has helped reunite thousands of missing kids with their families.

I have had the privilege of serving on NCMEC’s board for nearly two decades, and in that time we have seen a dramatic increase in the recovery rate, thanks to NCMEC’s dedicated staff along with alert members of the public who report sightings of missing kids. From 1984 through December 2012, NCMEC assisted law enforcement with more than 195,300 missing-child cases, resulting in the recovery of more than 183,100 children, according to the organization. The recovery rate for missing children has grown from 62% in 1990 to 97% today.

I became involved with NCMEC in 1993 during the search for Polly Klaas, a 12-year-old girl who was abducted from her Northern California home, about 80 miles north of where I live. During the search I helped post Polly’s picture online. When Time magazine wrote about the effort, I was overwhelmed by requests from parents of other missing kids, which led me to contact NCMEC’s then-CEO Ernie Allen, who quickly realized the potential of online tools to help find children. Even though we couldn’t save Polly, online tools such as MissingKids.com are now used routinely to help recover missing children.

Public-Private partnership

NCMEC is a non-profit organization, not a government agency, though it was authorized by Congress to serve as the national clearinghouse for information about missing and exploited children. Congress has also designated NCMEC to run the national, toll-free, 24-hour missing children’s hotline; and operate the CyberTipline for online reporting of the sexual victimization of children and inappropriate sexual content.

NCMEC is a unique public-private partnership that receives funding from both the federal government and numerous private donors, ranging from large companies to individuals from all walks of life — including young children who hold fund-raisers or donate their pennies to help other kids.

Key facts

As you can see from the key facts below, only a tiny percentage of missing-children cases involve “stereotypical” abductions, but even children who are abducted by members of their own family can be in extreme danger and deserve to be protected and returned to their lawful parent or guardian.

And it’s not just missing children. NCMEC helps prevent and prosecute cases of child exploitation, including sexual exploitation of children as young as infants. The organization also works to rescue underage victims of prostitution, helping them recover from the trauma of what is often forced or coerced and extremely traumatic exploitation by adults who profit through human trafficking.


NetSmartz Workshop’s “Clicky” teaches young children about Internet safety and digital citizenship

NCMEC also operates the NetSmartz Workshop, which provides high-production value materials to help educate young people about how to stay safe on and offline.

Here are some “key facts” from NCMEC’s website.

The most recent comprehensive national study of missing children, based on 1999 estimates, found: [1]

  • Approximately 800,000 children younger than 18 were reported missing.
  • More than 200,000 children were abducted by family members.
  • More than 58,000 children were abducted by nonfamily members.
  • An estimated 115 children were the victims of “stereotypical” kidnapping. These “stereotypical” kidnappings involved someone the child did not know or knew only slightly. The child was held overnight, transported 50 miles or more, killed, ransomed or held with the intent to keep the child permanently.
  • To find the number of children missing from a specific state or territory, contact that state’s Missing Child Clearinghouse.
  • The first three hours are the most critical when trying to locate a missing child. The murder of an abducted child is rare; an estimated 100 such cases occur in the U.S. each year. A 2006 study indicated that 76.2 percent of abducted children who are killed are dead within three hours of the abduction. [2]
  • The National Center for Missing & Exploited Children® has assisted law enforcement in the recovery of more than 193,705 missing children since it was founded in 1984. Our recovery rate for missing children has grown from 62 percent in 1990 to 97 percent today.
  • The AMBER Alert program was created in 1996 and is operated by the U.S. Department of Justice. As of April 2, 2014, 688 children have been successfully recovered as a result of the program. [3]
  • As of December 2013, NCMEC’s toll-free, 24-hour call center had received more than 3,899,964 calls since it was created in 1984. Information about missing or exploited children can be reported to the call center by calling 1-800-THE-LOST (1-800-843-5678).
[1] Finkelhor D., Hammer H., Schultz D., Sedlak A. National Estimates of Missing Children: An Overview, U.S. Department of Justice, 2002.
[2] Brown K., Keppel R., McKenna R., Skeen M., Weis J. Case Management for Missing Children Homicides: Report II, National Center for Missing & Exploited Children and U.S. Department of Justice, 2006.
[3] AMBER Alert, U.S. Department of Justice.

 


Internet security is a global issue that requires global cooperation

This post first appeared in the San Jose Mercury News

PARIS — The National Cyber Security Alliance, or NCSA, is a Washington, D.C.-based organization that promotes online security and safety. Its board consists of representatives from Microsoft, Google, Facebook, Comcast and other U.S. companies, and it works closely with the Department of Homeland Security to provide security advice for American businesses and consumers. I’ve attended meetings in Washington, Pittsburgh and Silicon Valley with NCSA staff, and the agenda has always focused on U.S. security issues.

NCSA, along with the Anti-Phishing Working Group, is the main force behind the “Stop. Think. Connect.” campaign, at StopThinkConnect.org, that seeks to raise awareness by encouraging people to pause and think about what they do before they “connect.” It’s kind of the cyber equivalent of the “buckle up for safety” campaign that promotes safety for motorists and passengers.

So I was a bit surprised when NCSA invited me to participate in an international online safety awareness meeting in Paris, attended by representatives of nonprofits, governments, universities and companies from several countries. The event was hosted by Microsoft and took place at its Paris office.

But I was reminded of the global nature of cyberthreats on the day we convened our meeting last Tuesday as news broke that the Justice Department, with the help of law enforcement agencies from other countries, issued indictments in connection with the Blackshades Remote Access Tool (RAT) “that enabled users around the world to secretly and remotely control victims’ computers,” according to the Manhattan U.S. attorney’s office, which said the bust involved more than 90 arrests in 19 countries.

The Blackshades RAT is malicious software, or malware, that has been used by criminals in more than 100 countries to “infect computers throughout the world to spy on victims through their Web cameras, steal files and account information, and log victims’ key strokes,” according to the Justice Department. The alleged co-creator of Blackshades, Alex Yucel, who is from Sweden, was arrested in Moldova and is awaiting extradition to the United States. Brendan Johnson, who is charged with helping to market and sell malware, including the RAT, and provide technical assistance to its users, was arrested in Thousand Oaks, California.

Blackshades provides a good example of how you could be sitting in your home in Palo Alto and be victimized by a criminal on another continent or vice versa. Thanks to botnets, where malicious software spreads itself from computer to computer without the knowledge of the machine’s owners, it’s possible for a computer from Estonia to infect your home PC and for your home PC to then infect someone else’s PC in Germany.

There are plenty of other examples of international cybercrime. I’m on the board of the National Center for Missing and Exploited Children, which regularly cooperates with counterparts in other countries to try to stem the tide of illegal child pornography across borders. John Carr, a child safety adviser to the United Kingdom government, told me that a “substantial proportion” of the illegal images that make their way to the UK come from the United States.

Privacy is also a global issue, as we were reminded last week when the European Court of Justice in Luxembourg ruled that search engines (the biggest two being U.S.-based Google and Microsoft’s Bing) can be required to delete search listings of posts, including stories in newspapers, that may be dated or irrelevant, even if they happen to be true. This ruling could affect not only U.S. companies that offer search, but also those of us in the United States and other countries who use these services, even though the delete order was issued by a court on another continent.

At the Paris meeting, the discussion turned to international cooperation, and it was generally agreed that it’s a good idea for organizations in countries around the world to coordinate at least some of their messaging because of the similarities of the issues that we all face. That doesn’t mean that a campaign that works in Istanbul will necessarily resonate in Indianapolis. But in our increasingly globally connected world, there are plenty of common themes and practices that we can share.

In an interview, NCSA Executive Director Michael Kaiser summed up the purpose of the meeting. “We are trying to reach everyone on the globe because we’re all connected to the same Internet and, unless we’re all safe and secure, we won’t have a safe and secure Internet.”


Facebook changes new user default privacy setting to friends only — adds privacy checkup

This post first appeared on Forbes.com.

Until now, when new Facebook users sent out their first post, the default setting was public, which means that anyone could see it. It’s long been easy to change the audience to Friends only, but if you didn’t know about that option, you could have accidentally told the world what you meant to tell only people you know and trust.

Disclosure: Facebook is one of the companies that provides financial support to ConnectSafely.org, the non-profit Internet safety organization where I serve as co-director.

Change for new users only

Facebook is changing the default for new users so that, going forward, the default setting is Friends. The change will have no impact on existing users. The first time someone posts, they will see a reminder to choose an audience for that post, and if they don’t make a choice, it defaults to Friends.

In a statement, Facebook said that “We recognize that it is much worse for someone to accidentally share with everyone when they actually meant to share just with friends, compared with the reverse.”

Changeable but sticky

You can easily change the audience of each post, and once you make a change it becomes sticky, which means it remains that way until you change it again. So, if you normally post to friends and decide to post something to the public, your subsequent posts will also be public until you change it back to friends only.

That stickiness is important to remember. If you normally post just to friends and decide to post something publicly, you must remember to change the setting back to Friends the next time you post, or your posts will remain accessible to the public.

Privacy checkup

Facebook is also launching a “privacy check-up” to enable users to review their privacy practices and settings such as “who they’re posting to and the privacy settings for information on their profiles,” according to Facebook. It also helps users review which apps they’re using and “the privacy of key pieces of information on their profile.”

Earlier App privacy changes

At the F8 Facebook developers conference last month, Facebook CEO Mark Zuckerberg announced changes to its app privacy policies including allowing people to interact anonymously with apps (Facebook knows who you are but you have the choice about whether to reveal your identity to the app developer). Facebook is also providing users with more control over what information they reveal to apps as well as more control over what others can share about them via apps.

Steps in the right direction

I have to give Facebook credit for giving users more control over their privacy and changing the default from public to friends. It’s always been possible to control your privacy on Facebook, but it’s often been too complicated — especially for new users, who could so easily post to a broader-than-intended audience. I’m not entirely sure what motivated these most recent changes, but I suspect they will be welcomed by users.

The privacy checkup is another important step. One of the biggest complaints about Facebook privacy is that users don’t know what is out there that others can see and may not be aware of how to control who has access to their content. Of course, this isn’t the first time Facebook has sought to simplify its privacy settings. There have been numerous changes over the years, including, for example, the addition of an activity log a couple of years ago that helps people uncover what they’ve posted and what’s been posted about them.

There is still more work to be done in terms of educating users about their privacy and how to limit what people can see about them on Facebook and beyond. Facebook also needs to do more to educate people about how their personal information is used to direct advertising.


Experts say UK’s optional filtering equals choice, not censorship


Internet Matters website is part of educational campaign that accompanies optional UK filtering

Under pressure from Prime Minister David Cameron and other British officials, major UK Internet service providers are now requiring customers to opt in to or out of Internet filters that affect every device in the home. And, starting this week, most public Wi-Fi networks in the UK will have porn filters, so kids can’t get around parental protections when away from home and adults can’t access porn with their coffee in public places.

In the meantime, the UK’s four major ISPs — BT, Sky, TalkTalk and Virgin — have invested millions in an educational campaign centered around a new website called Internet Matters.

The opt-in filtering policy requires broadband customers to choose whether they want filtering for all devices connected to the home router. The actual details depend on the ISP; each has its own implementation. But if a customer opts in, it will affect all devices, which means that if you set your filters for content suitable for young children, everyone in the household, including older teens and adults, will have the same filter unless it is turned off by the adult in charge.

Writing in the Guardian, Laurie Penny, who describes herself as a journalist and feminist activist, argues that “In the name of protecting children from a rotten tide of raunchy videos, a terrifying precedent is being set for state control of the digital commons.”

But even though the Prime Minister was pushing for these controls, it’s not in fact state control. Unlike the mandatory filters in place in countries like Turkey, Bahrain and Saudi Arabia, as well as “the great firewall of China,” UK filters are optional. ISP customers have a choice as to whether to use them, and three of the UK’s leading Internet safety advocates – all of whom I work with closely and know to be sensitive to issues of free speech – feel that it’s not censorship but a way to give parents more control over the content that comes into their home.


Childnet International’s Will Gardner

“I don’t think it’s censorship,” said Will Gardner, head of Childnet International, a London-based non-profit that delivers technology educational programs throughout the UK. “It’s a really good approach because they’re giving people a choice,” he added. “Nobody is saying that every house needs a filter, but people need to think about what is going to work best for their situation.”


UK government advisor John Carr

John Carr, an Internet safety advisor to the UK government, calls charges of censorship “a ridiculous idea.” He points out that “Everything that is accessible on the Internet today remains accessible tomorrow. Nobody is deleting, removing or altering anything. People are simply being asked to say whether or not they want to use the filters they are being offered by their ISP. They can say ‘yes’ or ‘no.’ Not a big deal.”


Dave Miles – heads FOSI in Europe & Middle East

David Miles, the director for Europe, Middle East & Africa for the Family Online Safety Institute (FOSI), is cautiously optimistic that the implementation of the new whole-home filtering solutions will give parents the choice and the tools they need to control the content that is coming into their homes. “The filtering is predominantly applied to content that is inappropriate to minors, so concerns of censorship are somewhat of an overreaction in my view,” he said.

Of course there is nothing new about filtering. Parents in the U.S., UK and other countries have long had the ability to install filters on home PCs using software that they could purchase or download for free from companies including Microsoft, Symantec and many Internet service providers.

But a big difference with the UK approach is that it affects the entire home Internet connection, which means the setting will apply to all wired and wireless devices connected to the home router, including PCs, game consoles, tablets and even Wi-Fi-connected phones. Mobile devices on the cellular network are not affected, but UK cellular carriers already filter content by default, with the option to opt out.

The obvious problem with the UK approach is that it’s one-size-fits-all, so a household of adults, teens and young children would have a single setting – likely set to protect the youngest children – that affects all users.

“It would be much better if every family member had their own individual log in with an associated age-appropriate profile,” said Carr, who told me that he repeatedly said that to the ISPs but they were “not prepared to go down that route.”

But, said Carr, allowing parents to configure multiple profiles “would have made implementation much more complex and costly.” He called the ISP’s strategy an experiment, “No one has a textbook full of tried-and-tested answers. This is innovation, something the hi-tech industries are meant to be good at.”

Miles agrees. “I think the new whole-home filtering should be seen as a starting point. I would suggest that parents do an audit of the number of Internet-enabled devices they have in the home.” He suggests that parents of kids of varying ages consider installing device-side filters instead of using the whole-house approach.

As Gardner pointed out, there are a “number of obstacles to traditional device filtering” including “cost, ease of use and knowing they exist.” The other problem is that there are now multiple devices in many homes including PCs, tablets, game consoles and even Wi-Fi-connected media players like the iPod Touch. Without a router-based solution, parents would have to install and configure filters on each device.

The American experience certainly validates Gardner’s observation. Parental uptake of device-side filters has always been pretty low, but one notable exception was when the then-mighty AOL automatically enrolled children in a filtered “Kids Only” experience based on the child’s age. Parents could opt out of filtering, but most didn’t.

Public Wi-Fi filters

Kids trying to get around parent-enforced content filtering won’t be able to do so from most public Wi-Fi hotspots, which now also block sexually explicit and other inappropriate content. In addition to blocking kids’ access, these filters keep people nearby from having to look at potentially offensive content.

Even in the U.S., there are many public Wi-Fi services that block pornography and other adults-only sites. I discovered that at a café near my home in Silicon Valley last July, when I was shopping for block ice for my Fourth of July party. The only store that carries it in my area is a wine shop, but when I went to their site to find their location and hours, I was blocked because the site promotes the use of alcohol. Ironically, there are plenty of nearby billboards that also promote alcohol and no filters to prevent kids from viewing them. The café’s filters also prevented me from going to a site about wearable technology.

How the UK got here

Over the last few years the UK has seen some highly publicized cases of child endangerment which, ironically, had essentially nothing to do with kids’ access to porn. Yet the October 2012 murder of 5-year-old April Jones, whose killer had accessed child abuse images just hours before her death, touched off a public outcry for all sorts of measures to protect kids online. Another tipping point, according to Miles, was the revelation, after the death of BBC television personality Jimmy Savile, that he had sexually abused hundreds of children over a period of decades. “The Internet had nothing to do with Savile’s crimes,” said Miles, “but caused people throughout the UK to think about sexual crimes against children and the need to do everything possible to prevent further abuse.”

It’s also true that keeping porn away from kids would not have prevented the murder of April Jones, but these two cases, said Carr, contributed to a general consensus that something had to be done to protect kids in this digital age.

Prime Minister Cameron, who is a father of three children, became directly involved in pushing for a number of child protections including not just filters but also a commitment from Google and other search engines to block access to search terms likely to lead to child pornography.

Despite its similarities to the United States, the UK remains a unique situation. For one thing, said Miles, the country’s media often takes a strong advocacy position, and the press has focused heavily of late on Internet crimes against children. On the positive side, there is also widespread cooperation and conversation among stakeholders. The UK Council for Child Internet Safety (UKCCIS), established in 2008, brings government, non-profits and industry together and, said Miles, “has enabled these initiatives to flourish.” Carr, Gardner and Miles are all on the UKCCIS board.

Thoughts on filtering

I do think it’s a good idea for parents to have a choice about whether to use Internet filters, and I agree that parental control is not the same as government censorship. However, I also think it’s important for parents to think about why they are turning on a filter and, if they do, to understand that it’s not a complete solution. It’s also important to discuss critical thinking, reputation management, treatment of others and media literacy with your kids. If you use filters, think about how and when to loosen the reins because, eventually, your kids will grow up to be adults, leave home, and it will be up to them to protect themselves. As I argued in an earlier post, digital citizenship and media literacy beat tracking laws and monitoring.

I also agree with my UK Internet safety colleagues’ concerns about the ISPs’ two-sizes (on and off) approach. The needs of a 5-year-old are a lot different from those of a 17-year-old, which makes a household-wide filtering setting a bit inconvenient.

We used filters for a short time when my kids were young but turned them off after my son complained that he was being blocked from some gaming and music sites. Instead we relied on family policies and an occasional parental peek into the browser history to see what was going on. I’m not claiming that my kids never saw porn – that would be highly unlikely – which is why I wrote What to do if your kids are looking at porn.

Internet matters

In addition to requiring parents to opt into or out of filters, leading British ISPs have banded together to fund a nationwide education campaign called Internet Matters. Its website has information on age-appropriate strategies and technologies, but the ISPs are also committed to providing education and advocacy through other media, including advertisements. The reported budget for this program is £75 million ($127 million) but, said Carr, most of this is based on estimates of ad spending that the companies may be doing anyway. He said there is approximately £6 million ($10.2 million) of new money dedicated to the non-profit organization set up to run the program, which is still a massive investment compared to any Internet safety campaigns I'm aware of.

 

Posted in Child safety | Comments Off

Do kids need special protections or do we all need them?

This post is a work in progress and subject to editing and updates

by Larry Magid

We have always taken child safety more seriously than adult safety, even where children are not necessarily at greater risk than adults. It’s probably instinctual. As every parent knows, protecting one’s offspring is not something we even have to think about. We just do it.

Children not always a special case

But when it comes to risk, children are not always a special case. For example, in California it's against the law for children under 18 to ride a bicycle without a helmet, yet every year nearly 17,000 adults are killed or injured in bicycle accidents in the U.S., compared with about 2,200 children. More than 88% of the injured are adults, even though adults make up 77% of the population, which means that people over 18 are more likely, per capita, to be injured than children. I think of this every time I ride my bike through the Stanford University campus near my home and notice all those over-18 people putting their expensively educated brains at risk by legally riding without helmets.
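As a quick check on the arithmetic in that paragraph, here is a short Python sketch using only the figures cited above (the injury counts and population share are rounded, so the results are approximate):

```python
# Figures cited in the paragraph above (rounded).
adult_injuries = 17_000   # U.S. adults killed or injured on bicycles per year
child_injuries = 2_200    # U.S. children killed or injured on bicycles per year
adult_pop_share = 0.77    # adults' share of the U.S. population
child_pop_share = 1 - adult_pop_share

# Share of all bicycle injuries that involve adults.
injured_adult_share = adult_injuries / (adult_injuries + child_injuries)

# Per-capita relative risk: injuries per unit of population share.
relative_risk = (adult_injuries / adult_pop_share) / (child_injuries / child_pop_share)

print(f"{injured_adult_share:.0%} of the injured are adults")        # ~89%
print(f"adults are ~{relative_risk:.1f}x more likely per capita")    # ~2.3x
```

Under these rounded inputs, adults are roughly 2.3 times more likely than children to be hurt in a bicycle accident on a per-capita basis, which is the comparison the paragraph is making.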

Bullying doesn’t just affect kids

The same can be said of other types of risk. Workplace bullying among adults is more common than school-yard bullying of kids. A 2010 study commissioned by the Workplace Bullying Institute and conducted by Zogby International found that more than a third (35%) of American workers "have experienced bullying firsthand." That's higher than any of the credible studies of youth bullying have found. Plenty of adults are affected by cyberbullying (adults sometimes call it "trolling") as well.

Why are privacy laws aimed at kids?

There are all sorts of existing and proposed privacy laws to “protect” children, yet there is no evidence that kids have any more privacy risks than adults. In fact, adults may have even more privacy risks given what they do online and the potential payback for being able to sell them products based on their online activities. When it comes to government surveillance, nearly all the victims are adults. And from what I’ve been able to observe in social media, adults are not typically more privacy savvy than teens when it comes to what they post. In fact, a 2013 Pew Research survey found that teens are more privacy conscious than many adults give them credit for.

Yet the federal Children’s Online Privacy Protection Act (COPPA) is aimed only at kids under 13 while California’s “eraser button law” affects only people under 18.

COPPA requires "verifiable parental consent" before a child under 13 can provide personally identifiable information (including an IP address) to a commercial service. The well-meaning law is designed to protect children's privacy, but its unintended consequence is to ban children from social media sites because of the enormous cost, hassle and, ironically, privacy and security risks associated with obtaining that consent. COPPA also discriminates against children whose parents, for a variety of reasons including limited English-language or technology skills and fear of government, are unable or unwilling to comply.

Congress is considering yet another law, the Do Not Track Kids Act (see Anne Collier's analysis here), that assumes young people are at higher risk because "Children and teens lack the cognitive ability to distinguish advertising from program content and to understand that the purpose of advertising is to persuade them, making them unable to activate the defenses on which adults rely." But that lumps together the age range of zero through 17. It may be true of very young children, but I haven't seen evidence to suggest that teen Internet users fail to understand the difference between advertising and content any more than adult users do.

Rational laws could protect everyone

To the extent that we need new laws to protect privacy, they should be aimed at everyone, regardless of age. Senior citizens need protection, young adults need protection, parents raising families need protection as do workers at every age. We all need greater transparency so that we know (and are able to understand) how our data is being used and we need laws that limit what government can collect or do with data that companies collect.

Failing to see the bigger picture

"When we think about kids online or off, we tend to automatically overestimate the dangers, because that's all we ever see in the media," said Lenore Skenazy, author of the book and blog Free-Range Kids. What's more, she added, "we also don't see the billions of friendly, helpful or informative chats kids have online daily — only the cyberbullying and sexting. Our fears don't match reality." Skenazy is pointing to the disproportionate amount of media attention on risk to youth. Even the recent incidents of school violence are often interpreted out of context. Data from the U.S. Bureau of Justice Statistics shows a general decline in school-related homicides between 1992 and 2011.

Reflecting on the past 20 or so years, Crimes Against Children Research Center director David Finkelhor, in his paper The Internet, Youth Safety and the Problem of "Juvenoia," observed an overall positive trend in most youth-related risks, including crime victimization, sexual assault and even bullying, leading him to conclude, "Given the convergence of positive indicators regarding children, there is a good chance that we will look back on this era as one of major and widespread amelioration in the social problems affecting children and families."


Homicides of youths ages 5-18 at U.S. schools, by school year, from the annual report “Indicators of School Crime and Safety: 2012,” by the Bureau of Justice Statistics

 

Children and teens need respect as well as protection

Of course, adults want to keep children safe. But it’s also important to respect them and their rights and to be very thoughtful before using “protection” as an excuse to limit their rights and privileges. Plus, we need to remember that with kids as with adults, one size doesn’t fit all. What’s suitable for a young child may not be suitable for a teen and, even within the same age groups, not all kids are equal when it comes to risk.

Ironically, the adult fear (often irrational) of teen use of social media is, according to author danah boyd, related to our clamping down on teens' freedom in the physical world. In her book It's Complicated, boyd suggests that "teens simply have fewer places to be together in public than they once did," which is one of the reasons they flock to social media.

It's also important to remember that risk is a necessary component of life and an important part of growth and learning for both children and adults. Josie Gleave's Risk and play: A literature review examines scholarly research in this field, including many studies that suggest that risk-taking is an essential and beneficial aspect of play. She concludes, "There is evidence to suggest that many of the measures that have been taken to create 'safer' play for children are neither necessary nor effective."


Google pulls scanning and ads from education apps

Google Education apps

In addition to its consumer and business services, Google is in the business of providing apps for education, including Gmail, Drive, Docs, Sheets and YouTube for Schools. These apps, according to the company, serve more than 30 million students, teachers and administrators.

Disclosure: I’m co-director of ConnectSafely.org, a non-profit Internet safety organization that receives financial support from Google  and other companies.

Google wrote that “Users who have chosen to show AdSense ads on their Google Sites will still have the ability to display those existing ads on their websites. However, as of October it has not been possible to edit or add new AdSense ads to existing sites or to new pages.”

Google also said that it has “permanently removed the ‘enable/disable’ toggle for ads in the Apps for Education Administrator console. This means ads in Apps for Education services are turned off and administrators no longer have the option or ability to turn ads in these services on.”

Scanning acknowledged in lawsuit

As Education Week reported in March, Google acknowledged the scanning in response to a lawsuit over data mining within its education apps.

There had been questions over whether Google's scanning violated the federal Family Educational Rights and Privacy Act (FERPA), which regulates information that can be collected from and about students.

Recently published guidance from the U.S. Department of Education is a little vague about whether it covers services like Google Docs. In response to the question "Is Student Information Used in Online Educational Services Protected by FERPA?" the document answers, "It depends. Because of the diversity and variety of online educational services, there is no universal answer to this question." It adds that "schools and districts will typically need to evaluate the use of online educational services on a case-by-case basis."

Smart move

 


Facebook adding more user-control of app privacy


Facebook to offer more control over what information users must reveal to apps

This post first appeared on Forbes.com

Speaking at Facebook's F8 developers' conference, CEO Mark Zuckerberg acknowledged that "Over the years one of the things we've heard over and over again is that people want more control over how they share their information, especially with apps." He added, "If people don't have the tools they need to feel comfortable using your apps, then that's bad for them and it's bad for you." He pledged that "we need to do everything we can to put people first and give people the tools they need to sign in and trust (your) apps." Facebook also posted details on its developers blog.

Disclosure: I’m co-director of ConnectSafely.org, a non-profit Internet safety organization that receives financial support from Facebook and other companies.

Addresses user fear

Zuckerberg addressed an issue that has plagued me ever since Facebook started allowing third-party app developers to let users sign in with Facebook. It's always felt like a bit of a black box: when I sign into an app, I'm never sure what information I'll share, not only with the app developer but with my friends. There is even the risk that you could reveal information about your friends by using a third-party app. "We know that some people are scared about pressing this blue (Login with Facebook) button," said Zuckerberg. He added, "if you're using an app that you don't completely trust or you're worried might spam your friends, then you're not going to give it a lot of permissions."

Change “line by line” what you reveal to apps

Last year Facebook separated read and publish permissions so that apps can no longer require you to publish to all your friends, and the company is now adding a dialog that lets you change "line by line what you share with this app," said Zuckerberg. You could, for example, choose not to share your email address or withhold other permissions. "I can sign in on my own terms," said Zuckerberg.

Users get to control what their friends share about them

Zuckerberg also admitted that people can sometimes be surprised when friends share their data with an app. In the past, when a friend logged into an app, that app could ask the friend to share not only his or her own data but also data you had shared with that friend. Going forward, however, "we are going to make it so now everyone has to choose to share their own data with an app themselves."

Anonymous log-in


Sometimes when you want to try a new app, you don't really want to create an account or sign in with your real identity, so Facebook is offering a new feature called Anonymous log-in that lets you sign into a new app without revealing your identity. Facebook, of course, knows who you are, but with the anonymous service it won't tell the app who you are. It does give you an anonymous identifier that lets you use the app across devices. Later, you have the option of signing in under your real name.

App links

Facebook also introduced "App Links," a platform that enables developers to "map your web content to your mobile content" across devices and platforms.

As this Facebook video explains, App Links make it easier for developers to let users link directly into their apps.
