A conversation with Esther Wojcicki on ‘Moonshots in Education’

Esther Wojcicki is an award-winning journalism teacher and the author of a new book on education called Moonshots in Education: Launching Blended Learning in the Classroom. The book explores digital and online learning with models and examples from schools that are already implementing digital learning.

Moonshots is an approachable book that’s part Wojcicki philosophy and part tips and advice from her co-author Lance Izumi and contributors Alice Chang and Alex Silverman. Actor James Franco (a former student of Esther’s) wrote the foreword. One of my favorite passages is about a culture of trust:

The first thing to establish in a classroom is a culture of trust. That doesn’t mean the students are given complete freedom to run wild and do what they want; it means the students trust each other to help in the learning process and the teacher trusts the students.

A conversation

The interview you can hear below, a conversation really, is more than just about the book. It’s about an educational philosophy that stresses doing rather than just studying and is based on something quite radical in education — respect for students.

And the reason I call this a conversation rather than just an interview is because Esther touched on subjects that are near and dear to my heart as a former educational reformer back in a different era.

Author and teacher, Esther Wojcicki

‘Revenge porn’ is about betrayal, not pornography


This post first appeared in the San Jose Mercury News

Sharing explicit pictures or videos with an intimate partner is not always a harmful practice, but it can be devastating if those images get into the wrong hands — like those of Kevin Bollaert.

In the first criminal prosecution under a new California law targeting “revenge porn,” San Diego-based Bollaert, 28, was convicted Monday on six counts of extortion and 21 counts of identity theft for operating two websites. One of Bollaert’s now defunct sites posted nude and sexually explicit pictures of women, often taken by a former intimate partner, along with the victims’ names, ages and other information. Another reportedly enabled victims to pay to have their pictures removed from the first site.

Cowardly act

“Just because you’re sitting behind a computer, committing what is essentially a cowardly and criminal act, you will not be shielded from the law or jail,” California Attorney General Kamala Harris said. “The result of this conduct was to make people feel shame and embarrassment in the context of their family, their community, and their workplace,” she added.

“Revenge porn” is a term for pictures posted or shared, often by a former intimate partner, to embarrass or shame the victim. It’s sometimes referred to as “sextortion,” especially if the perpetrator demands money, sex or for the victim to remain in an abusive relationship.

Sexting gone wrong

Some revenge porn involves images or video taken by a partner or recorded with concealed cameras, with one or perhaps both parties unaware, but in many cases the images are self-produced: sexting gone wrong.

Often the pictures were consensually taken by or shared with the partner at a time when the victim trusted the partner not to misuse those images. It’s increasingly common for partners to share intimate photos — often via their smartphones — as a form of flirting or showing affection. There’s been a fair amount of research on sexting among both adults and teens, and most sexting incidents do not result in anything bad happening; some have even argued that it’s a form of “safe sex,” because there is no chance of pregnancy or a sexually transmitted disease.

But, by definition, revenge porn is not consensual. Even if the victim consented to the video or images being produced, that doesn’t mean they’re consenting to them being shared.

Cindy Southworth, from the National Network to End Domestic Violence, said that the revenge porn term “belittles and doesn’t really capture the true crux of the issue.”

She said that the “problem is photos being shared without consent. It’s not pornography, it’s a crime.”

Breaking of trust

Of course, most people who allow others to take or possess intimate pictures of them do so out of trust. You’re in a relationship or trying to start one and you have every reason to believe that the other person will enjoy the images but not share them with others. And, in the vast majority of cases for both teens and adults, that’s exactly what happens.

But, as many people have sadly discovered, relationships can fall apart. Although most people who break up are decent enough not to publicly violate the trust of their former partner, our world has its share of creeps and criminals, which is why we have revenge porn.

Distributing these images, said SSP Blue CEO Hemanshu Nigam, “can be destructive in all sorts of ways. It can affect your work environment, your kids and your community.”

Nigam, who is a former federal prosecutor for computer crime, called revenge porn “a form of digital rape.”

Added consequences for minors

In the case of minors, there is the added risk of legal consequences even if nothing malicious takes place because it’s illegal to produce, possess or distribute sexually explicit images of minors — even if the minor is the one taking the picture.

Of course, child pornography laws were designed to protect kids, not prosecute them for bad judgment, but there have been cases of youth being placed on sex offender lists for consensual sexting. Fortunately those cases are getting increasingly rare as prosecutors and law enforcement realize that there are better ways to deal with teen sexting.

There is also the possibility of an image getting into the wrong hands by accident or as a result of a hack. There are cases, for example, when someone gets their hands on another person’s phone, only to see and perhaps share images that were never meant for them. And — as Jennifer Lawrence, Kate Upton and other celebrities learned — there is also the possibility of someone breaking into an account to access and then share photos and video.

As Nigam pointed out, “It’s not just a celebrity problem. It can affect anyone who winds up in a bad relationship.”

So, the only way to be 100 percent sure that a photo won’t be circulated is to not take it, or at least not share it. And if you do share it, make sure it’s with someone you can trust and hope that person never violates your trust. If images of you are distributed against your will, save the evidence and contact an attorney or law enforcement to explore civil or criminal actions.

For links to tips on how to prevent and deal with revenge porn, visit connectsafely.org/revenge.

ConnectSafely to host Safer Internet Day with Calif. AG Kamala Harris and Facebook’s Sheryl Sandberg


This post first appeared in the Mercury News

In 2004, a project of the European Commission launched “Safer Internet Day,” which became an annual event held on the second Tuesday of February. This year represents the 11th Safer Internet Day, which is now being celebrated in more than 100 countries, including the United States.


California Attorney General Kamala D. Harris to keynote Safer Internet Day

Last year, ConnectSafely.org, the nonprofit Internet safety organization that I help run, was asked to be the U.S. host, and we launched our own inaugural event in Washington, D.C., that featured panels of youth and industry leaders plus an address by U.S. Sen. Chuck Schumer, D-N.Y.

This year, Safer Internet Day USA (which is free and open to the public) is being held on the Facebook campus in Menlo Park with a keynote address by California Attorney General Kamala D. Harris; remarks by Facebook COO Sheryl Sandberg; three panels involving youth, industry and social activists; and a presentation by National PTA President Otha Thornton.

The U.S. theme this year is “Actions & Activism Toward a Better Net & World,” which reflects ConnectSafely’s perspective on the real meaning of Internet safety.

As you’d expect at an Internet safety gathering, we have a panel, “Beyond Bullying: Dealing with Trolling & Social Cruelty,” focusing on serious issues that plague the Internet and social networks. And, yes, there will be talk about privacy and security challenges that should be on everyone’s mind now that the details of our personal, social and financial lives, along with our health care and school records, are increasingly being stored on cloud servers. Even our bodies are being connected to the Internet as we strap on smartwatches that transmit data about our pulse and perspiration rates, along with other bodily functions, to cloud servers that could be vulnerable to all sorts of compromises.

COO Sheryl Sandberg to welcome Safer Internet Day to Facebook campus

But even as we share the best advice from industry, government and nonprofit leaders, we will remember that the title of our event is “Safer” Internet Day, not “Safe” Internet Day. That’s because no amount of precautions can possibly guarantee nothing will go wrong any more than air bags, seat belts and even safe driving can prevent all car accidents.

It’s not about avoiding all risk, but about managing the inherent risks associated with just about anything we do — in this case, using powerful and connected technologies to enhance our lives.

I’m particularly excited by the upcoming Safer Internet Day panel titled “Using Technology to Effect Social Change” because it will cover how today’s connected devices are helping people all over the planet make small and large changes to improve their lives. Whether it’s toppling oppressive regimes, helping to make local law enforcement more sensitive to the needs of their communities, or getting a traffic light installed at a busy intersection, activists around the world have found social media and mobile technology to be powerful tools for reviving that old sixties slogan, “Power to the People.”

Martin Luther King might have approved

I was thinking about this panel a couple of Mondays ago as the nation celebrated Martin Luther King Day. Without doubt, Dr. King would have been an enthusiastic user of social media, had the tools been available in his day. He and his fellow civil rights activists would have used Twitter and Facebook to recruit volunteers, mobilize support and attract people to events just as they did with the technologies of their time — the mimeograph, the megaphone and the telephone.

King’s supporters, and the press covering them, would have used smartphones to document the atrocities they faced daily, and those images would have been just as powerful as the grainy black-and-white photos they produced at the time. I bet they would have used LinkedIn to drum up support among businesses and professionals. And surely they would have used Tumblr, Instagram and maybe even Snapchat to rally the support of young people, like those brave volunteers who risked, and in some cases gave, their lives to help make Dr. King’s “Dream” a real possibility.

But one thing King would almost certainly not have done was rely on “clicktivism,” a term for social change campaigns that rely on people simply liking a cause or checking an online box to express their opinion or outrage. He would have still organized sit-ins, boycotts, nonviolent demonstrations and, of course, that massive March on Washington — because he would have known that technology alone can’t bring about social change. For that you need dedication, hard work, organizers in the streets, the willingness to take real risks, and the energy of the people.

I’ll be thinking about Martin Luther King on Safer Internet Day because — just as he was trying to build a better, kinder and more just nation — we’re united behind the global theme of creating “a better Internet together.” We won’t achieve that dream on Feb. 10, but we will take a small step toward spreading online kindness, media literacy and social activism. You’ll find details at www.SaferInternetDay.us. Pre-registration is required to attend in person.

 

Safer Internet Day Program

1:00 Doors Open

Welcoming Remarks

Opening Remarks: Sheryl Sandberg, Facebook COO

Keynote Speaker: California Attorney General Kamala D. Harris

Panel #1 – “Beyond Bullying: Dealing with Trolling & Social Cruelty”

  • Dan Tynan, Yahoo Tech columnist (moderator)
  • Rayna Archuleta, high school student
  • Annie Fox, M.Ed, parenting expert, family coach and online adviser for teens
  • Natalie Madrigal Ortiz, high school student
  • Cindy Southworth, National Network to End Domestic Violence
  • Catherine Teitelbaum, Ask.fm
  • Dave Wilner, Secret

Panel #2 – “Using Technology to Effect Social Change”

  • Maya Enista Smith, adviser to the Born This Way Foundation (moderator)
  • Zahra Billoo, Council on American-Islamic Relations
  • Rafael Johns, Youth Radio
  • Erik Martin, Student Voice (university student, consultant to U.S. Dept. of Education)
  • Jamia Wilson, Women, Action & the Media

Presentation by National PTA President Otha Thornton

Panel #3 – “Wrapping It All Up: A Conversation with Industry Leaders”

  • Stephen Balkam, Family Online Safety Institute (moderator)
  • Juniper Downs, Google
  • Nicky Jackson Colaco, Instagram
  • Del Harvey, Twitter

5:30 Program Ends
5:30 – 7:00 Light Supper/Reception

Ford CEO Mark Fields on mobility, connected cars & teen safety

SafeKids.com founder Larry Magid interviews Ford CEO Mark Fields

It may seem odd for an Internet safety site like SafeKids.com to be featuring an interview with the CEO of a car company, but cars are no longer just motor vehicles. They’re connected computers on wheels.

And Ford is no longer just in the car and truck business. As you walk around the parking lot, garage and labs at the company’s newly opened Silicon Valley Research Center, you do see cars and trucks along with all sorts of electronic gear. But there are also bicycles, which, said Ford CEO Mark Fields, are among the many “mobility” technologies the company is looking at. “We’re thinking of ourselves not only as just an auto company,” he said in an interview (scroll down to listen), “but we’re also thinking of ourselves as a mobility company.” He said that Ford is “thinking broadly about a lot of these big societal issues such as congestion in large cities,” and added, “we want to help be part of the solution.” He said it’s all about experiments, ranging from bicycles to cars with sensors looking for open parking spaces.

I didn’t see a Ford logo on any of the bicycles, but the company is equipping them with sensors to collect data about how people are getting from place to place. “It is a bit of opening the lens on our business,” he said. “We’re first and foremost a car and truck company,” but, he added, “it’s important for us to experiment and to think from a consumer standpoint,” including “making customers’ lives easier getting from point A to point B.” He also said that expanding to other modes of mobility is “a good business opportunity.”

The company is also experimenting with what Fields called a “car swapping” app. Ford employees, many of whom drive company cars, have access to an app that lets them swap cars with fellow employees. An example, said Fields, might be “I’m looking for a Mustang for the weekend,” in the hopes that a Mustang-driving colleague might want to switch cars for a couple of days. So far, the app is only for employees, not the general public.

Ford is also experimenting with ride-sharing services. “In other parts of the world we’re testing small mini-buses. Folks are OK getting into a vehicle and sharing it, but they want the appropriate amount of personal space, so we’re looking at seating configurations,” said Fields.

Safety issues for teen drivers

One of several apps that enable parents to monitor teen driving

I asked Fields, the father of two kids who are now driving, about teen safety and, of course, he said that “safety is the top priority for our customers, whether it’s parents or kids.” He said that the company’s new Sync 3 connectivity system is designed to reinforce drivers keeping their “eyes on the road and their hands on the wheel” with a system that can be activated by touch screen or audio commands. Fields didn’t discuss technologies that allow parents to monitor or control their kids’ driving, but there are several apps and devices that enable parents to know how fast their teen is driving, where they are driving and even whether they are driving erratically.

Broadening the business

Calling itself a mobility company is a lot like a newspaper or radio station calling itself a media company, or a railroad saying that it’s in the transportation business, not the train business. Ford is known for making motorized vehicles that move people and things, but as the company looks forward, it’s starting to think about all the possible ways to move humans and objects from place to place.

Silicon Valley connection

Fields said that Ford wants to be part of the “Silicon Valley ecosystem” and, to that end, the new lab, which Ford says is “one of the largest automotive manufacturer research labs in Silicon Valley,” expects to employ 125 researchers, engineers and scientists by the end of the year. The lab is run by Dragos Maciuca, who came to Ford from Apple. Ford is also working with Google-owned Nest to deliver data from Nest home sensors (currently thermostats and smoke detectors) to the car. If smoke is detected at home, an alarm will go off in the car, with a notice on the car’s infotainment system.

Like Google, Ford is also experimenting with autonomous vehicles, along with partners from the University of Michigan, MIT and Stanford. The company is providing a Fusion Hybrid autonomous research car to Stanford’s engineering program so that researchers can test planning and prediction algorithms.

Remote driving

In addition to bicycles, I also saw a golf cart at the facility. Actually, what I saw was a Ford engineer sitting at what looked like an auto simulator, remotely driving a golf cart located at the Georgia Institute of Technology. This technology could come to market far sooner than autonomous cars, which are still years away, and could be used for specific applications such as off-road services or valet parking.

Ford is also working on improved voice recognition systems not only for infotainment and navigation but to assist in driving too.

No flying machines

I asked Fields whether I’ll ever achieve my boyhood dream of having my own personal flying machine and all he could say was that “we’re busy working on alternative fuels and autonomous vehicles but the Jetsons, I think, are still a cartoon.”

Click below to listen to Larry Magid’s entire 12 minute interview with Ford CEO Mark Fields.

Facebook to issue Amber alerts — exclusive interview with John Walsh


Facebook and the National Center for Missing and Exploited Children (NCMEC) are teaming up to put Amber Alerts about missing children on Facebook News Feeds, but only if they are in the targeted search area for an abducted child.

A game changer

John Walsh

John Walsh, the founder of NCMEC, former host of America’s Most Wanted and host of The Hunt on CNN, called this partnership “a game changer” (scroll down to hear an exclusive podcast interview). He said the alerts will include pictures of the child, his or her height and weight, a description of the clothing he or she was last seen wearing, a description of any vehicle that may be involved and links to NCMEC missing child posters with more details. Users have the option to share the alert with friends.

Walsh said that the chances of finding a missing child are much higher if people are looking, and that the first 24 hours (really the first few hours, he said) are critical.

He also pointed out that people can see their Facebook News Feeds during times when they might not be watching TV, listening to the radio or driving by a lighted freeway sign with an Amber Alert. Besides, the amount of detail available on Facebook will be much greater, further increasing the chance that someone might spot the child.

Reaching the right demographic

Another important aspect of this service is that it reaches younger audiences who might not even be tuned into traditional TV and radio. “This is a game changer for a younger generation,” he said. “I’m sure my 20-year-old son and every 14-, 15-, 13-year-old kid that’s on Facebook … When they get that regional Amber Alert, if they’ve seen that kid, I think they’re going to get online and do something about it.”

A personal tragedy led to Walsh’s life’s work

Walsh became involved in the search for missing children after his own child, Adam Walsh, was abducted and murdered in 1981. He helped found the National Center for Missing and Exploited Children and has remained active with NCMEC ever since. His wife Reve is on the board of NCMEC (as am I) and his son Callahan works at NCMEC.

In the interview, Walsh said, “I can only fantasize what would have happened in Adam’s case back in 1981 if we had the tools we have now.” He said that, in 1981, the FBI refused to get involved in Adam’s case because looking for children was not something the FBI did. Now the agency is an important partner of the National Center.

Walsh personally lobbied Congress to make the Amber Alert system a federal program, and said that putting Amber Alerts on Facebook will only increase its reach. “With the huge population of social media on smartphones, this will make it easier to find missing children a lot faster.”

The recovery rate for missing children has grown from 62% in 1990 to 97% today, according to NCMEC, and, said Walsh, online media and TV play a big part in helping to find those children. The Justice Department reports that 723 children have been recovered as a result of Amber Alerts.

Click below to listen to the full 11-minute podcast with John Walsh and ConnectSafely.org co-director Larry Magid.

Disclosure: Larry Magid serves on the National Center for Missing and Exploited Children board of directors and is also co-director of ConnectSafely.org, a non-profit Internet safety organization that receives financial support from Facebook.

Connected devices at CES raise security, privacy and safety questions

It seems as if almost every exhibitor at CES was showing things that connect to other things.

LG showed off washing machines and kitchen appliances that send messages to smartphones. Schlage announced a Bluetooth-enabled smart lock that enables iPhone users to use Siri voice commands to enter their house. Kolibree and Oral-B both showed off connected toothbrushes, and there was even a baby pacifier called Pacif-i, billed as the “world’s first Bluetooth smart pacifier.”

Basis Peak is one of many connected gadgets shown at CES

Fitness bands like the Basis Peak that send your activity and pulse to your phone, and to the cloud, were all around. Vital Connect showed off a Band-Aid size patch that can send your heart rate, body temperature, posture and EKG to health providers via smartphones. Automakers showed off cars that can “phone home” to transmit data that monitors systems in real time.

And, of course, drones were everywhere. These flying machines have wireless controllers and the ability not just to fly through real clouds, but to transmit data to virtual ones.

Together, these and thousands of other connected gadgets are referred to as the “Internet of Things,” or IoT. Eventually, the Internet of Things will be bigger than the Internet of people, since there are a lot more devices in the world than humans. (And by the way, humans aren’t the only living creatures to be connected — thanks to pet trackers like the Tagg GPS Plus, we also have an Internet of dogs and cats.)

Like the Internet of people, the IoT has its own privacy, safety and security risks, which are not lost on regulators from Washington, D.C., and individual states.

Federal Trade Commission Chairwoman Edith Ramirez was at the Consumer Electronics Show and pointed out that connected devices often share “vast amounts of consumer data, some of it highly personal, thereby creating a number of privacy risks.”

There are also heightened security risks. At last year’s Black Hat security conference, researchers demonstrated how it was possible to hack cars, energy management systems and smart locks.

Safety issues abound as well. A hacked car or drone or even a connected robot could wind up threatening life and limb. So far, the hacks against Sony, Target and thousands of other institutions have caused embarrassment and loss of money and privacy, but not physical injuries or loss of life. But if “things” are hacked, the stakes could be a lot higher.

Risks associated with the Internet of Things are not lost on Intel CEO Brian Krzanich. Intel is betting on IoT by creating chips and devices for drones, smartwatches, robots and other connected devices. During his CES keynote, the Intel CEO even showed off a button-sized wearable computer called Curie, which can be sewn onto clothing.

In an interview, Krzanich acknowledged the risks, but said “there’s a lot of research going into how to really improve security right now.” He pointed out that “every technology advancement brings great value and great potential, but brings some level of risk and our job is to manage the risk.”

When asked about the risk of drones, Krzanich said that some GPS-equipped drones have software to prevent them from flying near airports, as well as cameras and sensors to avoid colliding with other drones or buildings. Still, there are risks that can’t be avoided, like someone flying a camera-equipped drone over someone’s backyard, or drones without GPS or collision-avoidance software ending up in the hands of owners who use them irresponsibly.

I wonder if police departments are gearing up for drone abuse enforcement. If not, they should be.

Certainly federal regulators — from the Federal Communications Commission to the FTC to Homeland Security to the Federal Aviation Administration — are looking at how to protect the public interest when it comes to the vast array of connected things.

The FCC needs to think about the use of radio spectrum because the IoT is competing with broadcast, data, voice and all sorts of other demands for the limited amount of available radio frequencies. The Department of Homeland Security is rightfully concerned about the potential of devices to be used to harm people or deliver explosives or other threats, especially if they get into the hands of terrorists. The FTC is responsible for helping to protect our privacy and has plenty on its plate when it comes to the potential abuse of all the data these “things” are collecting and transmitting. The FAA is working on how to regulate drones to make sure they don’t crash into airplanes, each other or people on the ground.

Adam Thierer, a senior research fellow at the Mercatus Center at George Mason University, worries that government regulation could go too far, especially at the early stages of technologies where over-regulation could wind up interfering with innovation.

“The better alternative to top-down regulation,” he argues, “is to deal with concerns creatively as they develop, using a combination of educational efforts, technological empowerment tools, social norms, public and watchdog pressure, industry best practices and self-regulation, transparency, and targeted enforcement of existing legal standards.”

In general, I agree with Thierer, but I still think there is a role for government to protect the public not against all these connected “things,” but against the people who misuse them.

 

Researcher sets the record straight on teen sexting

I rarely blog about other people’s blog posts, but the post, “Chances are, Your Teen has NOT Sexted” by Dr. Justin Patchin is worthy of amplification and further comment. Patchin, who is a professor of criminal justice at the University of Wisconsin-Eau Claire and co-director of the Cyberbullying Research Center, himself blogged about yet another blog post from CNN that distorts the prevalence of teen sexting with the headline, “Chances are, your teen has sexted.”

Common trap

The CNN article itself was relatively balanced and, as a journalist who often writes for publications whose editors write the headlines, I know that it’s possible that the click-worthy headline was written by someone other than the author, CNN’s Kelly Wallace. And, in her defense, Wallace’s article and headline were not all that different from a post from Drexel University’s PR department, drawing attention to research from that university.

While it’s hard to blame a journalist for basing a story on what appears to be a reliable source, it’s yet another example of falling into the traps that I wrote about in a Poynter blog post last year titled “Beware sloppiness when reporting on surveys.”

Bad sampling

Although this number wasn’t in the CNN story or the Drexel post (which did link to the full article), the survey sample consisted of 175 undergraduate students “recruited from a large Northeastern university,” according to the abstract.

The problem with that sample is not only its size but the population itself. Even assuming the data is representative of undergraduates at that university, one can’t assume that those undergraduates represent the entire population of current or recent teens. There are obvious economic, academic, regional and often race and even gender differences between students at a particular school and the entire population of people their age.

There is more reliable data

The CNN story said that “More than half the undergraduate students who took part in an anonymous online survey said they sexted when they were teenagers, according to the study by Drexel University.” But, as Patchin points out, other studies show that far fewer teens engage in sexting. Patchin and his colleague summarized the research in 2010 and found that “between 4 and 19% of respondents had admitted to sending a sexually explicit image of themselves to others.” The Center’s own study, with data collected in 2010 from a random sample of over 4,000 middle and high school students, found that “7.7% of students had sent a naked or semi-naked image of themselves to others,” and a very credible study by the federally funded Crimes Against Children Research Center found that “less than 10% of youth reported appearing in or creating nude or nearly nude images or receiving such images in the past year.” Unlike the tiny study of undergraduates from one university, that study was based on a nationally representative sample of 1,560 students between the ages of 10 and 17.

This well-documented data from credible sources doesn’t lead to flashy headlines, but it does paint a far more realistic picture of teenage sexting in America.

Exaggerated consequences of sexting

There are plenty of other important issues that Patchin touches upon in his post, including the assertion by both CNN and the Drexel press office that kids are taking an extreme risk when sexting. The biggest risk is probably the possibility that they will be caught and punished and, as the articles point out, it is possible for a teen to be charged with manufacturing, possession and distribution of child pornography — a serious crime that can lead to jail time and being put on a sex offender registry. But even that fear, while not out of the question, is greatly exaggerated. The vast majority of prosecutors today are more interested in helping teens modify their behavior than throwing the book at them. Such extreme charges are rarely filed unless there are such factors as extortion or intimidation, mass distribution or an adult playing a role in soliciting, receiving or distributing the images.

A number of studies have shown that the consequences of sexting, in most cases, are not severe. In fact, some consider it a form of “safe sex.” As Patchin points out, “to engage in sexting, it is a somewhat calculated decision based on the (probably accurate) belief that the risks to them are less for sexting than for engaging in sex. They are not going to get pregnant or catch any one of the many scary sexually-transmitted diseases.”

Why this matters

Good research and accurate reporting of research are important because they can influence parents, teens themselves and policy makers. Drexel University is a respected institution and CNN is an influential news source, and when information comes from these sources, people tend to believe it and very few are likely to dig deeper to find the real story. It’s important for university news departments to be accurate about the limits of research from their faculty and graduate students and incumbent on journalists to fully understand the study before repeating someone else’s summary. Most policy makers aren’t likely to read the actual reports and there is almost no way the general public will read them, especially since many of these reports are behind a paywall (it costs $39.95 to read past the abstract on this Drexel study).

The good news

So, thanks to Dr. Patchin for pointing out that most kids don’t sext and that kids are learning and getting smarter when it comes to sexting.

This post first appeared on Forbes.com

Empowering youth to combat bullying & cyberbullying

Youth bullying (and adult bullying, too) has been around for a very long time, but over the past couple of decades it’s evolved — hence the term “cyberbullying.”

At the end of the day, bullying — whether in school or online — is still bullying, so strategies to combat cyberbullying have to be integrated into the entire school climate.

There are a lot of programs that seek to accomplish this, and what the good ones have in common is the understanding that young people themselves are an essential part of the solution. One such program, Community Matters, has been around since 1996. The Northern California-based non-profit reports that it has worked with more than 1,000 schools, agencies and organizations across 30 states, Puerto Rico, Guam and Canada.

Community Matters’ CEO Rick Phillips

And, as its CEO and founder Rick Phillips said in an interview (scroll down to listen to the 14-minute podcast), the organization’s strategy is to “see the young people in our schools (including those who may have engaged in bullying) not just as the perpetrator but to see them as the solution.” He added that “young people are in the best position to intervene because they see, hear and know about these things before an adult ever knows about them.” He said that “young people are powerful,” but “adults underutilize youth.”

Click below to listen to the interview:

Safety & civility advice for anonymous apps

After School’s iTunes page promises to let you post anonymously

A growing number of apps allow people to post anonymously. Some of the better-known ones include Ask.fm, Whisper, Secret and Yik Yak, but there are new ones all the time, including After School, which has been downloaded more than 100,000 times, including by students from more than 14,000 U.S. high schools, according to Recode.net.

As Recode pointed out, After School’s seven-person staff can’t possibly police all of the posts on this growing service, though the company says it does employ software to look for particularly alarming words like “kill,” “cut” and “bomb.” As TechCrunch reported, the app has been associated with numerous bullying incidents.
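
Neither the article nor the company explains how that screening software actually works, but the general approach is straightforward keyword flagging. Here’s a minimal, purely hypothetical sketch in Python (the word list and function name are my own, for illustration only):

```python
# Illustrative sketch only -- not After School's actual moderation software.
# The idea: flag posts containing high-risk keywords so a human can review them.
import re

ALARMING_WORDS = {"kill", "cut", "bomb"}  # example terms mentioned in the article


def flag_for_review(post_text: str) -> bool:
    """Return True if the post contains any word on the watch list."""
    words = set(re.findall(r"[a-z']+", post_text.lower()))
    return bool(words & ALARMING_WORDS)


print(flag_for_review("I'm going to bomb the chemistry quiz"))  # True
print(flag_for_review("See everyone at practice tonight"))      # False
```

The first example also shows the obvious weakness of keyword matching: a harmless figure of speech trips the same filter as a genuine threat, which is why flags like these still need human review.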

There are also reports of gun threats, which prompted the Superintendent of Flushing (Michigan) Community Schools to write, “The purpose of the app continues to be in question and very concerning. Not only does it allow for individuals to post anonymous, and often times inappropriate statements and pictures, it also allows the app company access to personal information from an individual’s Facebook account.” The app was temporarily removed from the Apple app store and later reinstated with a 17+ rating.

Yik Yak has also had its share of criticism, which prompted the company to geo-fence the app so it can’t be accessed from high school campuses. Ask.fm was once the poster child for anything-goes posts, but it was recently acquired by IAC, which brought in new management, a chief safety officer and a commitment to better police its service.
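
The company hasn’t published how its geo-fence works, but the basic idea is to compare the device’s reported location against a list of school coordinates and block posting within a set radius. A rough sketch, with made-up coordinates and names, might look like this:

```python
# Hypothetical geo-fence check -- not Yik Yak's actual implementation.
from math import radians, sin, cos, asin, sqrt

HIGH_SCHOOLS = [(37.4530, -122.1817)]  # example (latitude, longitude); a real list would hold thousands
FENCE_RADIUS_METERS = 500


def haversine_meters(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))


def posting_blocked(lat, lon):
    """Block posting if the device is within the fence radius of any listed school."""
    return any(haversine_meters(lat, lon, s_lat, s_lon) <= FENCE_RADIUS_METERS
               for s_lat, s_lon in HIGH_SCHOOLS)
```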

What all these apps have in common is the ability for people to post comments or ask questions without having to reveal their real name or, in some cases, without even having to use an account name or alias.

As I discuss in this post, there are some very positive aspects to anonymous apps, but of course there are also risks, including the ability to use the app for bullying, to spread false or malicious gossip, to embarrass people, for unwanted sexual solicitation and harassment or, in some cases, to post inappropriate photos.

Nothing new

These concerns are nothing new — we’ve been talking about them since the Internet first became commercialized back in the ’90s. And while the specific details vary according to the app, some general principles apply to all apps, whether anonymous or not.

Know how to report. Some apps have reporting features that can alert the company’s customer service staff if someone is being abusive. Learn to find and use these features where they exist.

Call for help if you’re frightened. If someone threatens you in a way that gives you reason to fear for your safety, reach out immediately for help. Contact school authorities, parents or law enforcement if you are concerned about your safety.

You’re never completely anonymous or above the law. Even though these apps might be able to hide your identity from other users, there are ways to track people down through Internet protocol addresses, cell phone identifiers, and other clues. Both hackers and law enforcement (with proper authorization) have tools to find you.

Know what the app knows about you and your friends. It’s not uncommon for mobile social media apps to collect information about you and your friends. Pay attention to any disclosures and be extra careful about allowing the app to contact Facebook friends or people on your contact list on your behalf. Also be aware of the app’s geolocation features, including tracking where you are and sharing that with others.

You are responsible for your behavior. Users are both morally and legally responsible for how they behave on these apps. In addition to the possibility of prosecution, you can be banned from using the app by the operator if you violate their terms of service and there can be other repercussions from school and other authorities if you violate community rules of behavior.

Disagree respectfully. Anonymous apps often give people an opportunity to engage in spirited debate around just about any issue including politics, religion, sexuality — even your favorite smartphone or computer. These debates can be great, but they should also be respectful.

Don’t out others. Spreading rumors or revealing secrets about others is a form of bullying. Just because you know something about someone doesn’t give you the right to share it without permission. Also, respect other people when sharing photographs. It’s best to ask permission before sharing a photo with anyone else in it, and it’s common decency to take down (or untag) a photo of a person who objects to being in it.

Don’t invite trouble. Sometimes people ask for trouble by posting questions about themselves like “am I pretty” or “do you think I’m fat.” Sadly, there are people who will pounce on those who ask questions like this. Think before you ask any questions about yourself or others.

People online have feelings. This should be obvious but sometimes we forget that people on the other side of the screen are really people with genuine feelings. It’s not uncommon for folks who are pretty considerate when they meet others in person to forget their manners when they encounter them online. One thing to consider is that you don’t know the mental or emotional state of the person on the other end. What may seem to you to be just funny or mildly annoying could be emotionally devastating to that person, depending on how they interpret it and what is going on in their lives.

Why you should ‘share thoughtfully.’ As we say in ConnectSafely.org’s A Parents’ Guide to Mobile, both kids and adults “need to know that what they post is a reflection on them. Talk with them about respecting their own and others’ dignity and privacy by being aware of what they’re ‘saying’ with both words and images.”

Remember that what you post may be permanent. Your posts may appear to go away, but chances are they’ll remain online for a long, long time. And even if you delete them, there’s always a chance that someone could have copied and reposted them.

This post first appeared on Forbes.com