U.S. Sends Mixed Privacy, Encryption Messages: Panel


By Joyce E. Cutler

Feb. 19 — The U.S. government is giving mixed messages to technology companies—build privacy and security by design in products and services, but leave them open to backdoor access by default—privacy professionals said.

The issue took on even more significance coming amid the firestorm over whether the Federal Bureau of Investigation can force Apple Inc. to unlock an iPhone used by one of the shooters in last December’s San Bernardino, Calif., terrorist attack.

“A weakness is a weakness. It can be exploited by anybody,” Demetrios Eleftheriou, Symantec Corp. global privacy director, said regarding backdoor access to smart devices.

A federal judge Feb. 16 ordered Apple to provide the FBI with software to disable the security feature that auto-erases the phone's data after successive incorrect attempts to enter the passcode.

“It just seems like there’s a bit of an inconsistent message from the government,” Eleftheriou said.

“We have law enforcement on the one end saying you build back doors, they want broken by design,” he said. On the other end are “the regulators saying you have to incorporate security by default, privacy by default in the product,” he said.

“So I’d be interested in hearing a message from both sides saying, hey, is broken by design compatible with privacy by design and security by default?” Eleftheriou asked.

But the debate isn't isolated to U.S. domestic policy, panelists said at a Feb. 18 Bloomberg Law event in San Francisco.

What Happens Here Doesn’t Stay Here

The U.S. government needs to consider whether its inconsistent stance on consumer encryption is compatible with the new European Union General Data Protection Regulation requirements for privacy by design and security by default, Eleftheriou said.

Will DeVries, Google Inc. privacy counsel, said companies “want the process to be really clear, really defined and based on principles that we can apply globally to our services that actually make sense and keep us all safe.”


The argument against accessing a terrorist’s phone for privacy reasons is a “red herring” when “we’re actually worried about the precedent of saying can you ask a tech company to undermine the security of devices that are out in the public, not just for the device they’re talking about but a security flaw that then can be used on any device,” DeVries said.

The U.S. government often acts as though the requests it makes are only going to be valid for the U.S., he said.

“The fact is every government in the world thinks their own laws are valid and their own needs for access to data are valid. There’s no principled reason for tech companies that operate globally like Google and Apple why a request from another government, the Russian, the Chinese, wouldn’t be just as valid for users in that country, or say Turkey, or say some other place that doesn’t have a civil rights record like the U.S. or Western Europe,” he said.

Dangerous Effects?

Companies can be ordered to assist law enforcement in getting at some data, said Chris Jay Hoofnagle, adjunct professor at the University of California, Berkeley, School of Information & School of Law and member of the advisory board of Bloomberg BNA's Privacy & Data Security Law Report.

“Obviously, what makes this situation so dangerous and difficult is that the work the government would like Apple to do could be used prospectively and could be used to erode privacy and security in devices generally,” Hoofnagle said.


The tech industry finally is at this point where their devices are good enough to beat these forensic appliances, “so what happens now is going to be really important for the future of device security,” he said.

Crack in Fourth Amendment

Hoofnagle said he’s seeing a crack in Fourth Amendment applicability, worrying that the amendment may be applied differently depending on the severity of the crime.

“We might come to a world in the U.S. where we basically have different Fourth Amendment standards for the terrorism case where maybe we do feel as though the phone should be unlocked versus other types of crimes that aren’t as serious.”

To contact the reporter on this story: Joyce E. Cutler in San Francisco at jcutler@bna.com

To contact the editor responsible for this story: Daniel R. Stoller at dstoller@bna.com