Time to Reclaim Your Name?

“Be Aware, and Beware, of COPPA,” one headline blared this week. On July 1, updated rules for the Children’s Online Privacy Protection Act (COPPA) go into effect. The updated rules are a big part of a recent resurgence of interest in privacy. But it is revelations of the government’s data collection efforts that have sparked a necessary and overdue debate on how to balance national security against citizens’ privacy rights.

Congress enacted COPPA in 1998, requiring the Federal Trade Commission to issue and enforce regulations concerning children’s online privacy. The FTC’s original COPPA Rule became effective on April 21, 2000. The FTC issued an amended rule on December 19, 2012. [Editor's note: The Benton Foundation supported efforts to update the COPPA rules.]

The primary goal of COPPA is to place parents in control over what information is collected from their young children online. The rule was designed to protect children under age 13 while accounting for the dynamic nature of the Internet. The rule applies to operators of commercial Web sites and online services (including mobile apps) directed to children under 13 that collect, use, or disclose personal information from children, and operators of general audience Web sites or online services with actual knowledge that they are collecting, using, or disclosing personal information from children under 13. The rule also applies to Web sites or online services that have actual knowledge that they are collecting personal information directly from users of another Web site or online service directed to children. Operators covered by the rule must:

  • Post a clear and comprehensive online privacy policy describing their information practices for personal information collected online from children;
  • Provide direct notice to parents and obtain verifiable parental consent, with limited exceptions, before collecting personal information online from children;
  • Give parents the choice of consenting to the operator’s collection and internal use of a child’s information, but prohibiting the operator from disclosing that information to third parties (unless disclosure is integral to the site or service, in which case, this must be made clear to parents);
  • Provide parents access to their child's personal information to review and/or have the information deleted;
  • Give parents the opportunity to prevent further use or online collection of a child's personal information;
  • Maintain the confidentiality, security, and integrity of information they collect from children, including by taking reasonable steps to release such information only to parties capable of maintaining its confidentiality and security; and
  • Retain personal information collected online from a child for only as long as is necessary to fulfill the purpose for which it was collected and delete the information using reasonable measures to protect against its unauthorized access or use.

On June 28, the Wall Street Journal published the results of an examination of 40 popular, free child-friendly apps on Google's Android and Apple's iOS systems and found that nearly half transmitted to other companies a device ID number, a primary tool for tracking users from app to app. Some 70% passed along information about how the app was used, in some cases including which buttons were clicked and in what order. Some three years after the Journal first tested data collection and sharing in smartphone apps, and found that most of the apps tested were sending details to third parties without users' awareness, the makers of widely used software continue to gather and profit from people's personal information.

Earlier this week, the Entertainment Software Rating Board, made up of about 25 participating companies covering thousands of domains, expanded its program that manages privacy self-regulation for the video game industry to cover mobile apps. The ESRB has been working on the mobile counterpart to its self-regulatory program for the past nine months in anticipation of the COPPA update. Companies that comply with ESRB Privacy Certified are allowed to display a seal on the app's privacy policy signifying that the app complies with mobile privacy standards and best practices. The biggest challenge was figuring out how to obtain parental consent on a mobile device without forcing users to wait 24 hours. To solve the problem, ESRB turned to Veratad, which provides real-time age and identity verification. ESRB also created a short-form privacy notice that links to a longer notice. Another challenge was dealing with the expanded definition of personal information, which now includes photos and videos. ESRB is in the process of granting some mobile certifications to its members. It is also waiting for the FTC to grant ESRB's mobile privacy program safe harbor status, which essentially confers FTC blessing by proxy.
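To make the consent-before-collection idea concrete, here is a minimal, hypothetical sketch of such a flow. It is not ESRB's or Veratad's actual implementation; the verify_parent_identity stub, the ChildDataCollector class, the notice text, and the example.com link are all invented for illustration and simply stand in for whatever real-time verification method an operator might adopt.

```python
# Hypothetical sketch of a "verifiable parental consent" gate for a kids' app.
# Every name here (ChildDataCollector, verify_parent_identity, the notice text,
# the example.com URL) is invented for illustration; a real operator would
# integrate an actual real-time verification provider and follow the FTC's
# approved consent methods.

from dataclasses import dataclass
from typing import List, Optional

SHORT_FORM_NOTICE = (
    "We collect a device identifier and gameplay events to run this app. "
    "Full policy: https://example.com/privacy"  # placeholder link to the long-form notice
)


def verify_parent_identity(parent_email: str, challenge_answer: str) -> bool:
    """Stand-in for a real-time age/identity verification call.

    A real implementation would query a verification service and return
    immediately, avoiding the older email-plus-wait approach.
    """
    # Demo-only check: accept any non-empty answer.
    return bool(parent_email) and bool(challenge_answer)


@dataclass
class ConsentRecord:
    parent_email: str
    verified: bool = False


class ChildDataCollector:
    """Collects personal information only after verified parental consent."""

    def __init__(self) -> None:
        self.consent: Optional[ConsentRecord] = None
        self.events: List[str] = []

    def request_consent(self, parent_email: str, challenge_answer: str) -> bool:
        # Direct notice to the parent precedes any collection.
        print(f"Direct notice sent to {parent_email}: {SHORT_FORM_NOTICE}")
        verified = verify_parent_identity(parent_email, challenge_answer)
        self.consent = ConsentRecord(parent_email, verified)
        return verified

    def record_event(self, event: str) -> None:
        # Collection is gated on verified consent; otherwise the event is dropped.
        if self.consent and self.consent.verified:
            self.events.append(event)
        else:
            print(f"Dropped '{event}': no verified parental consent on file.")


if __name__ == "__main__":
    app = ChildDataCollector()
    app.record_event("level_completed")                       # dropped: no consent yet
    app.request_consent("parent@example.com", "last4=1234")   # real-time verification
    app.record_event("level_completed")                       # recorded after consent
```

The point of the sketch is the ordering: notice and verification come first, and collection is simply refused until verified consent is on record.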

On June 24, the Center for Digital Democracy, a privacy group that lobbied hard for the COPPA updates, announced it will work with more than 60 children's civil rights, advocacy and medical groups to monitor websites and apps. CDD gave the groups a backgrounder on the new rules and a step-by-step legal guide to what to look for on children's sites and mobile apps to make sure they comply with the new rules. The CDD singled out a few potential violations in particular: asking children to take or upload pictures of themselves; collecting location information; and sending push notifications from an app that pop up on the device even when the app is not in use. If any of the groups finds a violation, the CDD is asking it to notify both the CDD and the FTC.

"We'll be focused initially on the major kids sites—Disney, Nickelodeon and Cartoon Network—to make sure they respect the new rules and empower parents with the information and control they deserve," said Jeff Chester, executive director of the CDD. In advance of the new school season, the group also intends to step up public education and launch an online campaign.

But children and COPPA are not the only issues driving the spotlight on privacy of late. Revelations of National Security Agency phone and Internet surveillance have many people wondering what information is being collected about them and who has access to it.

Former FTC Chairman Jon Leibowitz thinks the government and companies pose an equal threat to people's privacy. "The implications of the NSA data breach are going to be greater in the context of protecting consumers' commercial privacy," he said. "People are increasingly concerned about their own privacy vis-à-vis commercial entities... Americans are going to be concerned about this breach, and things will move much more quickly into the commercial context." His argument boiled down to this: Outcry from the media, members of Congress, and everyday citizens may not prompt the government to change its ways... On the plus side, though, it might sufficiently spook previously unsuspecting Internet users into demanding more control over their information. That means protection from third parties, like advertising or data mining companies, who use cookies to track what people do on the internet — what they buy, watch, read, and more. "The sky will not fall down if consumers have a little more choice in how they protect data, particularly when it comes to unseen data collectors who put cookies in your property — your computer. You have no idea how they're using it," he said.

Current FTC Commissioner Julie Brill proposed this week an industrywide initiative to give consumers access to their own records held by data brokers. She envisions an online portal where data brokers would describe their data collection practices and their consumer access policies. Commissioner Brill has come up with a handy nickname for her proposed effort: “Reclaim Your Name.”

“Reclaim Your Name would empower the consumer to find out how brokers are collecting and using data; give her access to information that data brokers have amassed about her; allow her to opt-out if she learns a data broker is selling her information for marketing purposes and provide her the opportunity to correct errors in information used for substantive decisions – like credit, insurance, employment, and other benefits,” Commissioner Brill said.

The proposal is a response to the vast amounts of data collected about individuals. Often without consent or warning, and sometimes in completely surprising ways, big data analysts are tracking our every click and purchase, examining them to determine exactly who we are – establishing our name, good or otherwise – and retaining the information in dossiers that we know nothing about, much less consent to. Commissioner Brill identified four privacy challenges:

  1. The Fair Credit Reporting Act (FCRA) requires entities collecting information across multiple sources and providing it to those making employment, credit, insurance and housing decisions to do so in a manner that ensures the information is as accurate as possible and used for appropriate purposes. But while the FTC is working hard to educate online service providers and app developers about the rules surrounding collecting and using information for employment, credit, housing, and insurance decisions, it is difficult to reach all of those who may be – perhaps unwittingly – engaged in activities that fall into this category. Further, there are those who are collecting and using information in ways that fall right on, or just beyond, the boundaries of FCRA and other laws. The challenge is to figure out how FCRA’s principles can coexist with new ways of collecting and using information – how consumers can maintain notice, access, and correction rights on all the dossiers – not just credit reports – that inform important decisions on eligibility as well as offers in areas such as housing, employment, finances, and insurance.
  2. The second big challenge of big data is transparency. Consumers don’t know much about either the more traditional credit reporting agencies and data brokers or the newer entrants into the big data space. In fact, most consumers have no idea who is engaged in big data predictive analysis. Current access and correction rights provide only the illusion of transparency.
  3. Some data is collected and used for determinations unrelated to credit, employment, housing, insurance, or other eligibility decisions. Three concerns arise when health information flows outside the protected Health Insurance Portability and Accountability Act of 1996 (HIPAA -- see note below for more on HIPAA) environment: a) sensitive health information might be used to make decisions about eligibility that fall outside the contours of the FCRA, without notice or choice to the consumer; b) sensitive health information could fall into the wrong hands through a data security breach; and c) damage may be done to our individual sense of privacy and autonomy in a society in which information about some of the most sensitive aspects of our lives is available for analysts to examine without our knowledge or consent, and for anyone to buy if they are willing to pay the going price.
  4. The final big challenge of big data is predictive analytics attaching its findings to individuals. Given how closely our smartphones and laptops are associated with each of us, information linked to specific devices is, for all intents and purposes, linked to individuals. The FTC has called on companies trafficking in big data to take both technological and behavioral steps to make sure the information they use in their advertising is truly and completely de-identified.

Commissioner Brill then turned to solutions. First she mentioned “privacy by design”: companies building more privacy protections into their products and services. She also supports legislation that would require data brokers to provide notice, access, and correction rights to consumers, scaled to the sensitivity and use of the data at issue. For example, Commissioner Brill suggested, Congress should require data brokers to give consumers the ability to access their information and correct it when it is used for eligibility determinations, and the ability to opt out of having their information used for marketing.

But “Reclaim Your Name” is a comprehensive initiative, Commissioner Brill argued, that can begin even before legislation is enacted. Reclaim Your Name would give consumers the knowledge and the technological tools to reassert some control over their personal data – to be the ones to decide how much to share, with whom, and for what purpose – to reclaim their names. It would give consumers the power to access online and offline data already collected, exercise some choice over how their data will be used in the commercial sphere, and correct any errors in information used by those making decisions that materially affect consumers’ lives. Together, these policies would restore consumers’ rights to privacy that big data has not just challenged but, in too many instances, abrogated.

Commissioner Brill also outlined a role for industry:

  • As the data they handle or create becomes more sensitive – relating to health conditions, sexual orientation, and financial condition – data brokers would provide greater transparency and more robust notice and choice to consumers.
  • The credit bureaus need to develop better tools to help consumers more easily obtain and understand their credit reports so they can correct them.

Reclaim Your Name would fit in with the FTC's push for an online do-not-track mechanism, Commissioner Brill said. Do-not-track would allow consumers to choose when their data is collected for marketing purposes, and Reclaim Your Name would give consumers access to online and offline data already collected and give them some choice about how that data is used, she said.

“There is no reason that big data cannot coexist with … a system that empowers consumers to make real choices about how their private information will be used,” concluded Commissioner Brill. “The ability to claim your name – or in the case of big data, Reclaim Your Name – is as American as Mom and apple pie. I can’t believe consumers will give that up easily, even for all the convenience, entertainment and wonder that cyberspace currently has on offer. And I want to believe that industries currently fueled by big data will join together to help consumers reclaim their names.”

We’re always following online privacy so you’ll see more on how Reclaim Your Name progresses. ‘Til then, we’ll see you in the Headlines.

Notes:
1. The HIPAA Privacy Rule protects the privacy of individually identifiable health information. The HIPAA Security Rule sets national standards for the security of electronic protected health information. The HIPAA Breach Notification Rule requires covered entities and business associates to provide notification following a breach of unsecured protected health information.


By Kevin Taglang.