Study: Why Some Children’s Apps Might Not Be as Safe as You Think

By Tina Nazerian | May 1, 2018

To paraphrase The Bard, what’s in a certification? According to the authors of a study on the privacy protections in children’s apps, perhaps not much.

A recent study, published in the journal Proceedings on Privacy Enhancing Technologies, examined whether 237 apps certified as compliant with the children's privacy law COPPA do a better job of safeguarding privacy than a larger sample of non-certified apps. Overall, the researchers found little difference between certified and non-certified apps, and in some cases they suggest that certified apps may present even bigger security lapses.

What Is the Safe Harbor Program?

The two-decade-old Children's Online Privacy Protection Act, known as COPPA, is a federal law intended in part to protect children under 13 on websites and services aimed at kids. According to the Federal Trade Commission, which is responsible for enforcing the law, companies that produce apps and other tools for kids can apply for an optional certification of their compliance through what's known as the Safe Harbor program. There are seven Safe Harbor organizations, a mix of for-profits and nonprofits, designated by the FTC to issue such certifications, including TRUSTe, kidSAFE and iKeepSafe.

According to the study’s researchers, the Safe Harbor program aims to “create privacy practices that go above and beyond COPPA’s minimum requirements.”

Emily Tabatabai, a data privacy lawyer, tells EdSurge that many of the apps and companies that get COPPA certifications from Safe Harbor organizations aren't even used by children. Some may be aimed exclusively at adults; others at users over the age of 13, who fall outside the scope of COPPA.

“A lot of companies will certify with the Safe Harbor program really as a PR benefit to indicate that they are aware of their obligations and they want to put users at ease,” says Tabatabai, who is a partner and founding member of the law firm Orrick, Herrington & Sutcliffe’s cyber, privacy and data innovation practice.

She also points out that many of these Safe Harbor organizations have limitations when it comes to gauging whether or not a company is compliant. They’re somewhat reliant on reviewing terms of service and privacy policies, she says, as well as on what the company tells them in questionnaires.

“They don’t necessarily have the technological know-how or technological capability to evaluate the way an app functions,” says Tabatabai. The organizations also lack the ability to investigate the use of third-party code and how data is being collected and exfiltrated to other services, she adds.

Holly Hawkins, the CEO and president of iKeepSafe, agrees that collecting self-reported data is part of her organization's certification process. Her team reviews self-reported information such as an app's data collection and sharing practices, as well as its current terms of service, privacy policy and any contracts with schools. There is also a technical review focused on security. If her organization later finds that a company isn't up to iKeepSafe standards, Hawkins says iKeepSafe gives the company a period of time to correct the issue. If the company does not do so, it loses its COPPA Safe Harbor seal.

Safe Harbor organizations can get in trouble when their certification processes miss crucial red flags. In 2014, TRUSTe settled FTC charges that it had not recertified companies annually, as it claimed on its website it did. And in 2017, the New York Attorney General settled with the same organization over its "failure to adequately prevent illegal tracking technology from surfacing on some of the nation's most popular children's websites."

What the Researchers Found

Despite the extra review measures, the study's researchers did not find much difference between the apps certified by Safe Harbor organizations and a broader group of non-certified apps they examined. Ten percent of certified apps transmitted location data or contact information without consent, twice the rate of non-certified apps. The researchers also found that a third of the 237 certified apps transmitted so-called "identifiers," which could include personal information, to advertising and analytics providers whose terms of service, they write, "prohibit their inclusion" in apps designed for kids.

However, Tabatabai explains that just because a service provider’s privacy policy states that the service isn’t directed to children under 13, it doesn’t mean the service can’t be legally used by child-directed apps. In some cases, it can.

One specific example examined by the researchers is the well-known classroom behavior app ClassDojo, which has a COPPA certification through the Safe Harbor organization iKeepSafe. At the time the study was conducted, the researchers claimed, ClassDojo was sending location data to Amplitude, an analytics provider whose privacy policy, they write, prohibits its use in child-directed apps.

For its part, ClassDojo tells EdSurge that it only shared location data from the parent and teacher portions of the app and "did not and does not violate Amplitude Privacy Policy." The company adds that it was using that location data for business purposes, such as understanding use in urban versus rural areas. ClassDojo stopped using Amplitude in December 2017 and now processes the location data from the parent and teacher portions of the app internally, a spokesperson for the company tells EdSurge.

Personal Expectations vs. Legalese

The researchers write that it can't be definitively known whether third parties are using information for purposes COPPA does not allow, such as behavioral advertising. However, they suggest violations are likely.

Tabatabai points out that it is "exceptionally difficult" to tell if an app is using third-party tools in a manner that violates COPPA unless it's known exactly how the app uses those tools to collect user data and the "manner in which the data is being used."

Still, apps and companies can get in trouble with the FTC. In 2016, the mobile advertising company InMobi settled FTC charges that it "deceptively tracked the locations of hundreds of millions of consumers—including children—without their knowledge or consent to serve them geo-targeted advertising."

Tabatabai believes the study underscores how important it is for app developers to "carefully scrutinize" every piece of third-party code they use, to provide top-down education and awareness training for all of their employees, and to continuously monitor, test and update their services to make sure their apps are functioning the way they intend.

Legal questions aside, there is also the question of how aware parents are of the privacy policies of the apps their children use, and of the types of information those apps transmit. Take Ma'Chell Duma, a parent in Seattle, for example. Duma tells EdSurge that the less she knows about an app her 11-year-old son uses, the more she researches it. She says she looks for whether information is being traded or stored, and what else the app can access.

Yet she acknowledges that some parents are more knowledgeable than others when it comes to these privacy matters. Amelia Vance, a policy counsel at the Future of Privacy Forum, has similar thoughts.

“I think the average parent—the average person—is surprised by any sharing of information on the internet, and they don’t necessarily have a good idea of the exchange of information that occurs,” Vance says, adding that companies and others can make it clearer what information is being sent.

Irwin Reyes, one of the paper’s authors, also has concerns about the ramifications of this data being out there in the first place.

“For me personally, not professionally, the main concern I would have is that this data is being collected and it’s going to these companies that I would assume most people have never heard of,” he says. “And in an age of data breaches and things like that, once it’s out there, you’re really not in control of it. And so, for instance, an adversary would be able to piece together different parts of your life from that.”
