Artificial Intelligence Poses Data Privacy Challenges


By Stephen Gardner

Oct. 19 — International privacy regulators are increasingly concerned about the need to balance innovation and consumer protection in artificial intelligence and other data-driven technologies, global privacy chiefs said Oct. 19.

Data protection officials from more than 60 countries expressed their concerns over challenges posed by the emerging fields of robotics, artificial intelligence and machine learning due to the new tech's unpredictable outcomes. The global privacy regulators also discussed the difficulties of regulating encryption standards and how to balance law enforcement agency access to information with personal privacy rights.

Such technological developments “pose challenges for a consent model of data collection” and may lead to an increase in data privacy risks, John Edwards, New Zealand privacy commissioner, said at the 38th International Data Protection and Privacy Commissioners' Conference in Marrakesh, Morocco. For example, decision-making machines may be used to “engender or manipulate the trust of the user” and would act as “all seeing, all remembering in-house guests” that collect personal data via numerous sensors.

Peter Fleischer, global privacy counsel at Alphabet Inc.'s Google, said that established privacy principles would continue to be relevant for new technologies, but machine learning raised particular problems, such as machines finding “ways to re-identify data.”

The emerging technologies may have a broad impact across various industries. “Humans teaching machines to learn” was a “revolution in the making” that may have broad societal consequences that could cut across numerous economic sectors, Fleischer said.

For example, data-driven machines may be able to analyze sensitive medical data and make medical diagnoses, potentially revolutionizing the health-care industry, Fleischer said at the conference. A machine that learns would act “like a chef: see the ingredients and comes up with something new,” he said.

Unknowable Algorithms

Machines that learn through collecting real-world data may pose privacy challenges “that we are only beginning to understand,” Edwards said.

Machines making decisions based on an algorithm that is “not known and is in fact unknowable by the designer or user of the application” would pose particular problems of “responsibility or accountability for automated decision making,” Edwards said.

Edwards said that care is needed to ensure that decisions taken by machines don't reflect biases inherent in the initial data used as the basis for machine learning.

Federal Trade Commission Chairwoman Edith Ramirez compared the machine learning problems to how drones collect data. There may be “no traditional way for operators to make disclosures about what information they are collecting and how they will use it,” she said.

Ramirez also pointed to problems in regulating emerging technologies due to the ambiguity surrounding data collection practices. The enforcement of privacy in the context of emerging technologies is complex because of the interconnection of devices, she said.

Asimov Principles

The privacy commissioners' conference didn't provide specific answers to looming challenges for privacy regulation. The role of the annual conference is rather to highlight issues that are likely to become significant.

Marc Rotenberg, president of the Electronic Privacy Information Center, said that in terms of robotics and machine learning, privacy commissioners may consider supplementing Isaac Asimov's three laws of robotics—a robot should not injure a human, obey human orders unless it would lead to the injuring of a human, and protect itself unless that would contravene the first two laws. Rotenberg proposed that, in addition, robots must always reveal the basis of their decisions, and robots must always reveal their identities.

Rotenberg said that “we believe increasingly in the importance of algorithmic transparency,” which would conform with the suggested rule that robots should always reveal the basis of their decisions.

Ramirez said that regulators should build up expertise in emerging technologies so they have a thorough understanding of them as the basis for regulation. Privacy commissioners would likely need help from researchers and technologists because they must understand “not only what is happening today but what is likely to happen tomorrow,” she said.

Encryption Difficulties

Encryption has also vexed privacy regulators, who say it's hard to balance consumer privacy with law enforcement agencies' access to information for legitimate purposes.

Edwards said that encryption protects the data of companies and individuals but “poses a significant challenge for law enforcement agencies and others with lawful authority to intercept communications.” “No satisfactory response to this challenge has yet been identified,” he said, though few companies favor the introduction of back doors that would let law enforcement authorities access encrypted communications for valid purposes.

Jane Horvath, senior director of global privacy at Apple Inc., spoke strongly in favor of encryption technologies and said “a government demand for back doors puts everyone's security at risk.”

“Security is an endless race,” and “encryption, in short, protects people,” Horvath said.

To contact the reporter on this story: Stephen Gardner in Marrakesh, Morocco at correspondents@bna.com

To contact the editors responsible for this story: Donald G. Aplin at daplin@bna.com ; Daniel R. Stoller at dstoller@bna.com

Copyright © 2016 The Bureau of National Affairs, Inc. All Rights Reserved.