In Washington, DC, there is an ongoing debate over the technological and economic promise of Big Data and the tension it creates with privacy concerns. The debate underscores a key question for policy leaders: Are existing privacy laws and policies sufficient to address the privacy risks that some envision in the era of Big Data? Earlier this month, the White House weighed in, releasing a report that explored the issue and made a number of recommendations.
While President Obama commissioned the report in the wake of the NSA revelations, the final product made only passing reference to them. Instead, the report offered a fairly balanced analysis of the public- and private-sector benefits of using Big Data and the privacy concerns that accompany certain uses. It made six recommendations, several of which echoed previous proposals by the Administration:
- Advance the Administration’s Consumer Privacy Bill of Rights, which would give consumers more control over how their personal data is used over the internet.
- Pass federal data breach legislation to establish a single standard that will ensure consumers are informed when their personal data is stolen or improperly exposed.
- Extend privacy protection to non-U.S. individuals.
- Ensure any data collected on students in school is used solely for educational purposes.
- Expand technical expertise within the federal government to be able to identify practices and outcomes facilitated by Big Data analytics that have a discriminatory impact on protected classes.
- Amend the Electronic Communications Privacy Act to ensure protection for digital content is consistent with protection for physical content.
The most controversial recommendation was the one addressing potential discrimination (#5). The report used terms such as “digital redlining” and “discrimination” in reference to the common practices of discounting, targeted advertising and marketing, and differential pricing, and it suggested that robust segmentation techniques might exacerbate existing socio-economic disparities. Critics were quick to respond that, while no one should condone unfair discrimination of any kind, effective marketing by its very nature discriminates among consumers to ensure that a specific message reaches the intended audience. Marketers also noted that Big Data analytics adds refinement and precision, so that consumers can be targeted with the offers that appeal most to them.
We at FICO are enthusiastic about the current and future potential of Big Data analytics; however, we acknowledge the importance of taking a balanced approach when imposing rules on the use of personal data. There is no disputing that incidents involving identity theft and data breaches can lead to financial losses and widespread distrust of data sharing. As a result, there is a need for appropriate regulation and vigorous enforcement in these areas. Yet we have also seen well-intentioned policy leaders restrict data use in the name of consumer protection, only to find that those restrictions put people and businesses at a disadvantage. Used well, predictive analytics and Big Data can give people more choices, greater protection from financial crime and other benefits, while still guarding them against discrimination and respecting their privacy.
As more businesses use Big Data analytics in more decisions, finding the right balance among business benefit, personal benefit and personal privacy will be the challenge. As the White House’s recent Big Data report suggests, it’s a vital debate, with no easy answers.