
Why Privacy Needs to Matter In the Age of Big Data


Big data allows companies to use consumers’ personal information in innovative ways that enhance the public good. They can predict the outbreak of influenza by analyzing consumers’ search queries, improve traffic flows by mapping commuter travel patterns, and develop new products like energy-saving “smart appliances.” At the same time, the collection and use of personal data for commercial purposes raises fundamental privacy concerns. Consumers need baseline protections that set clear-cut “rules of the road” to guide business practices and establish reasonable consumer expectations.

For consumers, the absence of rules creates uncertainty about their ability to “self-define”—that is, to control which personal information, in what context, they share with the world. This lack of control can chill valuable activities like researching a health condition, expressing political views, or purchasing certain products or services.

Imagine a consumer whose uncertainty about online privacy has caused her to restrict her activity to browsing trusted news sites and updating her Facebook page. Even with this reduced digital footprint, she may be revealing more than she realizes, particularly when that information is combined with the tremendous amount of personal data she has supplied in other ways: purchases made with a retail loyalty card, the demographic data on a product warranty card, her image captured by a store’s facial recognition software, or the zip code she gave in response to a cashier’s request. The resulting profile, compiled from bits of information gathered from disparate sources and in different contexts, is far more complete than any of its parts might suggest.

While the consumer might be comfortable disclosing limited pieces of information in one context, she may not approve of the aggregation of all her information for sale to unknown third parties for unanticipated uses. Exposing this information—intentionally or through a data breach—may cause embarrassment for the consumer and lead to inferences about her lifestyle or character that cause even greater harm. Indeed, it’s not difficult to envision a scenario in which an amassed profile suggests that a potential employee is likely to suffer from depression or that a particular senior citizen is vulnerable to telemarketing fraud. We have seen personal data used in ways consumers may not have anticipated and could not control. Predicting the sexual orientation of social media users based on their contacts, or a woman’s pregnancy based on her purchases, is no longer just a possibility but a reality.

Today, the FTC has powerful tools to protect consumers against companies that infringe on their privacy. We can sue companies that mislead consumers or that cause them substantial, tangible injury. We recently took action against a company that did not adequately secure the IP cameras it sold to consumers, resulting in the exposure of live video feeds of children and private areas of people’s homes. Although we were able to act in that instance, the damage to consumer privacy had already occurred.

As active as we have been at the FTC, our existing tools cannot resolve all of the issues big data raises. We can’t require companies to give you a choice before collecting anything and everything about you. We can’t address all the ways in which your data can be used to make decisions that affect you. And, unless a company is a financial institution or its services are directed to children, we can’t even require it to have a privacy policy.

For these reasons, in 2012, the FTC issued a report outlining three best practices that can serve as a guide to industry and address consumer uncertainty. First, companies should adopt the “privacy by design” approach, building privacy into products and services at the outset. Making privacy a priority instead of an afterthought will help shift the burden of protecting privacy away from the consumer. Second, companies should give consumers choices in real time so they can decide whether, and in what context, their data can be collected or used. Finally, companies should be more open about their information collection and use practices.

Not all companies have followed our advice, and until they do, consumers need to be wary. Adopting these common-sense concepts won’t stifle innovation and commerce; it will promote them. No one benefits when companies use consumers’ data in ways that surprise them. Clear rules can only increase consumer confidence in the marketplace and will create a level playing field that rewards companies that use big data to compete and innovate in a responsible, privacy-protective manner.



About the author

Peder Magee is a senior attorney in the Federal Trade Commission’s Division of Privacy and Identity Protection. He works on a variety of privacy policy and litigation matters, including enforcement of the FTC’s COPPA Rule. Magee is the principal author of the March 2012 FTC report, Protecting Consumer Privacy in an Era of Rapid Change, which sets forth the Commission’s new privacy framework. He is also the principal author of the staff report on the FTC’s self-regulatory principles for online behavioral advertising.