Human Rights Should Underpin Ethical Privacy Code


By Jeremy Hainsworth

March 3 — Privacy, ethics and legal professionals agree that development of any code of ethics for privacy protections needs to be a multi-stakeholder, cross-sector conversation that includes the public in an open and transparent way.


However, the issue goes beyond just privacy and delves into the heart of human rights in a new era of digital information, the Internet of Things and big data, analysts told Bloomberg BNA.

The analysis must go beyond the law, according to Information Accountability Foundation (IAF) Executive Director Martin Abrams. “You need to have an ethical framework,” he said.

“You have to have a process to arrive at decisions that upholds your integrity and satisfies a full range of stakeholders.”

In a 2015 report, the IAF said that a unified ethical framework rests on five key values: beneficial, progressive, sustainable, respectful and fair.

The issue here, Abrams said, is what constitutes privacy.

European Union and UN

Further, Abrams continued, European Data Protection Supervisor Giovanni Buttarelli's Opinion No. 4/2015—Towards a new digital ethics: Data, dignity and technology—cited Article 1 of the Charter of Fundamental Rights of the European Union, which says that “human dignity is inviolable. It must be respected and protected.”

Buttarelli's office has indicated it would establish an ethics board to promote the maintenance of “dignity” as an objective of data protection.

Industry professionals agree that Article 1 is a good starting point.

According to World Privacy Forum Executive Director Pam Dixon, Article 12 of the Universal Declaration of Human Rights could also play a role in crafting ethics. It says “no one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation,” she said.

“Article 12 is grossly overlooked in political discussions on human rights and privacy,” Dixon said. Change can't wait for laws to change or governments to act, she added.

“Ultimately, big enough players need to be at the table to make that happen,” she said, adding that governments need to be involved at the end of the day.

Broad Public Conversation

British Columbia Information and Privacy Commissioner Elizabeth Denham said that discussion about ethics requires broad public conversation.

“There are types of data and specific types of content where you should get explicit consent,” she said. Further, the way that organizations and governments make decisions about data needs to have that process demonstrated and open to scrutiny, she added.

“Oversight is more important than ever,” Denham said. “That needs to be done in a different way than we're doing it now. It needs to be done at an organizational level, not just at a transactional level.”

However, Denham continued, putting all of that into a global ethical framework may be something that remains “elusive.” She suggested it could be the United Nations that needs to take the lead.

Denham said some of the areas that need to be discussed are de-identification of data, transparent decision-making, research ethics and what constitutes acceptable and reasonable use of data. All of that needs oversight, she said. “The public needs to know someone has their back.”

Deloitte Canada Digital Privacy Leader Sylvia Kingsmill agreed.

End users lack awareness of privacy and of how their data are being used, Kingsmill said. “It should be treated as a right and a freedom rather than a consent issue,” she said. Although consent models differ around the world, organizations should adopt the highest standards in the interim, she said.

Further, she agreed with the concept of privacy by design for applications. “It avoids costly retrofits, the hackers—they're five steps ahead,” Kingsmill said.

Outdated Thinking

According to Dixon, the Nuremberg Code—formulated after the World War II war crimes trials—also has something to add to the discussion of consent and data use. The code covers the ethical use of humans as research subjects and says that the voluntary consent of the human subject is absolutely essential.

While some believe the code has outlived its usefulness, Dixon said, its message is equally applicable to the use of health data so people aren't harmed.

However, according to a Milwaukee-based futurist, discussions about ethics and a workable definition of privacy are old news based on outdated thinking.

It's a discussion framed by the use of 20th century technology, Richard Thieme said.

According to Thieme, ethics in the overt sphere, such as government and business, are much different from those in the covert sphere, where classified information, security and terrorism are prime factors in gathering information.

“That whole domain has different needs,” he said of the covert sphere. Moreover, he said, the issue is one that transcends the old concept of national boundaries. “There is a constant stream of data across boundaries,” he said.

“It's a universe of information that doesn't respond to 19th century nation-state boundaries.”

Thieme said that with global economic systems, people are acting transnationally, which makes it hard to institute global privacy ethics under the nation-state model.

He said the state of privacy is leading to paranoia-driven hyper-vigilance among people.

“People feel helpless,” he said. “Well, they are.”

Dixon, however, said a line could yet be drawn.

“We have to have lines,” Dixon said. “Ethical guidelines would show us where the line should be. We cannot continue in an unfettered environment.”

To contact the reporter on this story: Jeremy Hainsworth in Vancouver at

To contact the editor responsible for this story: Jimmy H. Koo at