Omer Tene

In Jessica Rich, FTC loses cornerstone of privacy program

[Commentary] Since the 1990s, the Federal Trade Commission has established privacy and data security as a new regulatory area through dozens of enforcement actions (which scholars have called "a new common law of privacy"), policy reports, and research workshops. Throughout this period, spanning three decades and a transition from the dawn of personal computing and the commercial internet to an age of machine-to-machine communications, smart cars, wearable devices, big data, and the cloud, Jessica Rich, who recently announced her departure as director of the Bureau of Consumer Protection, conceived, initiated, and spearheaded the agency's emergence as the nation's primary technology regulator. Through a long series of cautious, incremental steps, always meticulous, never flashy, and often with a wry joke and a smile, Rich built the foundation for a substantial body of law, setting the standard for technology regulators in the US and abroad.

[Omer Tene is Vice President of Research and Education at the International Association of Privacy Professionals. He is an Affiliate Scholar at the Stanford Center for Internet and Society and a Senior Fellow at the Future of Privacy Forum.]

With Ramirez, FTC became the Federal Technology Commission

[Commentary] Under Chairwoman Ramirez, the Federal Trade Commission has truly become the Federal Technology Commission. Despite repeated calls for privacy legislation and tighter control of the data broker industry, the FTC continues to face unregulated pockets of the data ecosystem armed with only its broad authority against unfair or deceptive trade practices, which dates back to 1914. Even so, after three years in office and six years as FTC Commissioner, Ramirez leaves the agency stronger and better equipped to deal with the challenges of the years ahead.

[Omer Tene is an Affiliate Scholar at the Stanford Center for Internet and Society and a Senior Fellow at the Future of Privacy Forum.]

The Right Response to the “Right to Delete”

[Commentary] The decision by the European Court of Justice, requiring Google to delete search results that display a Spanish user in a bad light, continues to cause consternation among online experts and supporters of free speech.

Unwittingly, the European Court appointed Google a global online censor, imposing on it the unenviable burden of policing content on the Web. In doing so, it furnished Google (and similar online intermediaries) with strikingly vague criteria and little process, to boot. And if understaffed privacy regulators intend to handle complaints case by case, they will soon be swamped by an unmanageable deluge of individual take-down requests.

But condemning the Court’s decision should not invalidate the concerns it sought to address.

In general, much of our information is subject to fairly clear norms that guide us in who can access what, and for how long.

Why can’t technology do more to ensure that certain types of recorded data decay or become less accessible with time? Much more than law, technology can account for subtle differences in individuals’ subjective privacy expectations, which fluctuate based on the context and nuance of interpersonal relationships. Let’s have many more companies experiment with default settings that allow for data decay.

While these solutions are imperfect, they chart a promising path toward a world where some friction allows us to retain and hide a bit of ourselves.

[Polonetsky serves as executive director and co-chair of the Future of Privacy Forum; Tene is vice president of research and education at the International Association of Privacy Professionals (IAPP)]