Balancing Innovation with Privacy

A formidable goal, but will state and federal legislation help or hinder business?

By Joanie Wexler

June 20, 2019

Concerns over digital intrusions into our private lives are mounting. Well-publicized data breaches and recent Facebook privacy violations, which have resulted in scandal and at least one lawsuit, continue to fuel worries ranging from unwelcome solicitations to full-blown identity theft.

As privacy legislation attempts to catch up with the times, many can’t help but wonder: will regulation help individuals personally but hinder business? Or will everyone benefit in the long run?

The Conundrum

Privacy issues are spurring legislation around the world, and they pose an immense conundrum. Big data and analytics have become integral to the digital transformation movement. They allow businesses to slice and dice consumer information, helping marketers remain innovative and customize their offerings.

But while consumer data can be used for the common good, the failure of some companies to adequately protect that data “undermines consumer trust in the Internet marketplace, and that trust is a driving force behind the…success of American technological advancement and prosperity,” said U.S. Sen. Roger Wicker, R-Miss., chairman of the Committee on Commerce, Science, and Transportation, in a Congressional hearing on data privacy on February 27.

“The source of the Internet’s innovation is also the source of its vulnerabilities,” said Randall Rothenberg, CEO of the Interactive Advertising Bureau, in his opening testimony at the hearing. “How do we close off the sources of corruption without impeding innovation?”

That’s the overriding question, and budding legislation aims to achieve just this type of balance. But it’s no simple task.

Full Transparency: Opaque at Best

Legislators are beginning to realize that “full transparency” through the use of lengthy acceptable-use policies doesn’t go far enough. In most cases, users don’t understand what they’re sanctioning when they hit an online “I agree” button, even if they have the wherewithal to read the complex policy top to bottom, according to Dr. Woodrow Hartzog, a professor of law and computer science at Northeastern University School of Law and Khoury College of Computer Sciences.

Dr. Hartzog was one of the experts who testified at the February hearing in Washington, D.C.

“Honesty goes beyond just transparency,” he told the Congressional panel. “While you might remember to adjust your privacy settings on Facebook, what about Instagram, Twitter, Google, Amazon, Netflix, Snapchat, Siri, Cortana, Fitbit, Candy Crush, your smart TV, your robot vacuum cleaner, your Wi-Fi-connected car and your child’s Hello Barbie?

“Mobile apps can ask users for over 200 permissions,” he continued. “The problem with thinking of privacy as [user] control is that if we are given our wish for more privacy, it means we are given so much control that we choke on it.”

Privacy Movements Afoot

The push for a federal U.S. privacy law is being driven in part by individual state privacy laws such as the California Consumer Privacy Act of 2018, signed into law in June 2018, and the Washington Privacy Act, a bill introduced in January 2019. In addition, Europe’s General Data Protection Regulation (GDPR), which officially took effect in May 2018, has created a model that others would like to emulate.

Considered the biggest shake-up of personal data privacy rules since the birth of the Internet, GDPR requires, in part, that any organization serving EU customers know exactly what information it holds on every customer and where that information is stored. The organization must be able to delete that information at the customer’s request or face significant fines.
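In engineering terms, that means keeping a per-customer inventory of what data lives where and being able to act on an erasure request end to end. The sketch below is purely illustrative: a minimal, hypothetical Python model of such a “right to erasure” workflow. The data stores, field names, and function names are assumptions made for illustration, not any particular company’s implementation or the regulation’s own wording.

```python
# Minimal, hypothetical sketch of a GDPR-style "right to erasure" workflow.
# A real deployment spans many systems of record; here a dict stands in for
# the per-customer data inventory an organization would need to maintain.

from datetime import datetime, timezone

# Hypothetical inventory: customer ID -> {data store name: stored fields}
DATA_INVENTORY = {
    "cust-123": {
        "crm": {"name": "A. Smith", "email": "a.smith@example.com"},
        "analytics": {"page_views": 1742, "last_seen": "2019-05-30"},
    },
}

AUDIT_LOG = []  # erasure requests should be provable, so keep an audit trail


def locate_customer_data(customer_id: str) -> dict:
    """Answer: what do we hold on this customer, and where is it stored?"""
    return DATA_INVENTORY.get(customer_id, {})


def erase_customer_data(customer_id: str) -> bool:
    """Delete everything held on the customer and record that we did so."""
    records = DATA_INVENTORY.pop(customer_id, None)
    if records is None:
        return False  # nothing held, so nothing to erase
    AUDIT_LOG.append({
        "customer_id": customer_id,
        "stores_cleared": sorted(records),
        "erased_at": datetime.now(timezone.utc).isoformat(),
    })
    return True


if __name__ == "__main__":
    print(locate_customer_data("cust-123"))   # show what is held and where
    print(erase_customer_data("cust-123"))    # honor the erasure request
    print(locate_customer_data("cust-123"))   # now empty
```

In practice, the hard part is the inventory itself: customer data is typically scattered across CRM, analytics, backup, and partner systems, and mapping all of it is exactly the exercise the regulation compels organizations to undertake.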

Currently, the U.S. Federal Trade Commission (FTC) has no authority to impose fines for a first-offense privacy violation, said Victoria Espinel, president and chief executive officer of BSA – The Software Alliance, an advocate for the global software industry and a panelist at the February Congressional privacy hearing. “That’s wrong and should be fixed,” she said.

Around the time the GDPR took effect, U.S. senators Amy Klobuchar (D-MN) and John Kennedy (R-LA) introduced the Social Media Privacy Protection and Consumer Rights Act of 2018.

Their bill’s stated aim is “to protect consumers’ online data by increasing the transparency of data collection and tracking practices and requiring companies to notify consumers of a privacy violation within 72 hours.”

For the transparency-complexity reasons described above, many participants in the February privacy hearing don’t think these goals go far enough. And while they agreed that a federal privacy law is important and that the FTC should be the primary authority to enforce it, their opinions are split on whether such a law should preempt state legislation.

To Preempt or Not to Preempt

Some feel that having 50 separate state laws in what has become an integrated national and global economy would be confusing to consumers and too complex and costly to administer. Experts like Dr. Hartzog acknowledge the point but worry that a preemptive federal law could be dangerous.

“The main reason I think preemption is dangerous is that federal legislation, unless crafted really well, is not nimble and can ossify quickly,” he said in an exclusive interview. “That’s a risky strategy when you’re dealing with an environment [the digital world] that changes so quickly.”

During the February hearing, Rothenberg said he favors “consistency over chaos” and is “generally in favor of preemption.” However, he said, there are roles for the states, individual vertical industries, and the federal government to play. “The trio delivers the strongest solution,” he said.

Dan Goldstein, a former attorney who is now president and owner of Page 1 Solutions, a digital marketing agency, has stated publicly that he advocates a single data privacy law rather than “a patchwork of different regulations adopted by each state. The risk to businesses and consumers is that some legislation or regulation may be so restrictive that it harms both consumers and tech companies.”

On the other hand, things that are good for certain sectors of the economy are not necessarily good for all, Dr. Hartzog said.

“A mind-reading machine would be a helluva marketing tool, but few would agree it would be desirable,” he told The Forecast. “It’s important to search for solutions that let us use data while keeping in mind that the ‘innovation-at-all-costs’ mantra could mean just that: ‘at all costs’ isn’t necessarily a net positive.”

Fickle Citizens

Will privacy regulation take a bite out of businesses’ efforts to use consumer information for innovation, safety and well-being? It’s hard to know, because these are early days in attempting to balance consumer protection with all the good that modern digital technology has the potential to deliver. Meanwhile, people are fickle about their privacy.

“If you’re online, you’ve pretty much given up your privacy,” said Ayanna Howard, a roboticist and chair of the School of Interactive Computing at the Georgia Institute of Technology. Howard frequently runs up against bias both for and against automation and artificial intelligence (AI), tools that enable the analytics that allow companies to learn all about consumers.

“If you use a free service, you’re exchanging your dollar for your privacy,” she told The Forecast. “If Facebook started charging $19.99 a month, I don’t know how many people would willingly say: OK, we do want our privacy, but we’re not willing to pay to ensure it.”

By using free sites such as Facebook, Google, Twitter, and LinkedIn, Howard said, “we’re paying for these services—but we’re paying with our data instead of our dollars and cents.”

On the Front Burner

Privacy and security continue to be major issues for Americans, according to a National Telecommunications and Information Administration (NTIA) survey conducted by the U.S. Census Bureau and released in August 2018. Nearly three-quarters of Internet-using households had significant concerns about online privacy and security risks, while a third said these worries caused them to hold back from some online activities. About 20% said they had experienced an online security breach, identity theft, or a similar crime during the past year.

How likely are consumers to give up their online life to solve the problem? Not very, indicates Howard.

How likely are we to see a federal data privacy law in the U.S. this year? If states pass bills that federal legislators agree are good working models, “a federal law might not be so urgent,” said Dr. Hartzog.

Then again, if a state were to pass a very strict privacy law mandating something extreme, like prohibiting any collection of user data, “there might be more haste for a federal law,” he said.

Joanie Wexler is a contributing writer and editor with more than 20 years’ experience covering IT and computer networking technologies.

© 2019 Nutanix, Inc. All rights reserved.