
Why ethical use of data is so important to enterprises

With everything else going on this year, data privacy has been front of mind for many consumers, along with the ethics of how their data is collected and used.

The ethical use of data goes deeper than simple regulatory compliance. Enterprises are waking up to a new reality in which consumers -- and even their own employees -- want them to put the brakes on how much data they vacuum up and what they do with it.

According to a recent KPMG survey, 97% of consumers said they believed data privacy was important, and 87% thought it should be a human right, but 54% of respondents didn't trust companies to use their data ethically.

"Data privacy and protection are clear priorities for consumers," said Vijay Jajoo, principal in cyber security services at KPMG.

For companies to act ethically, they must pay close attention to how they manage and protect consumer data, he said.

"Does the fact we can now do certain things mean that we should?" Jajoo said. "This problem goes to the core of the data ethics debate."

But for many companies, ethical data collection and management are not a high priority.

"Too many companies have made it a habit to scoop up every piece of data they can get their digital hands on," said Kathy Baxter, principal architect of ethical AI practice at Salesforce.

They're motivated by potential uses in business analytics and prediction, or simply by the prospect of selling the data to others, she said. "The result is companies having a lot more data than they need or should have."

Why it matters

As cyberattacks increase, consumers are starting to pay more attention to privacy issues. Governments around the world are looking at passing legislation to protect users' data, partly in response to new AI technologies that demand ever more data and pose significant risks of bias and misuse. All this means it's more important than ever for companies to take a holistic view of their data collection practices.

Many companies are in reactive mode, tightening up their security after a breach happens and updating their processes and policies after new legislation is passed. This approach is not only more costly than building in proper processes from the start, but it also creates significant public relations problems.

The compliance risk

Privacy violations cost money -- directly, with the regulations in Europe and California, and indirectly, if they contribute to data breaches.

Some companies approach compliance from the perspective of fulfilling the minimum requirements of particular laws that apply to them. Others take a more holistic view, looking at ethical data mining principles and the intent behind privacy laws to apply them across the board. For example, Europe's GDPR and the California Consumer Privacy Act only apply to customers in those jurisdictions, but some companies extend these protections to all users, regardless of where they are based.

"As the power of contemporary enterprises grow, they bear a larger responsibility for going beyond compliance to engage with ethics," said Laura Norén, visiting scholar at NYU's Center for Data Science and VP of privacy and trust at Obsidian Security.

That means collecting only the data that is required, avoiding harm to people and the environment, and obtaining explicit consent for that data collection, she said.
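As a minimal sketch of what that deny-by-default data minimization could look like in code, the hypothetical example below stores only the fields a user has explicitly consented to share and discards everything else; the field names and consent structure are illustrative assumptions, not any particular company's implementation.

```python
from typing import Any

def minimize_record(record: dict[str, Any], consented_fields: set[str]) -> dict[str, Any]:
    """Keep only the fields the user explicitly consented to share.

    Anything not listed in consented_fields is dropped by default,
    which is the essence of deny-by-default data minimization.
    """
    return {key: value for key, value in record.items() if key in consented_fields}

# Hypothetical signup form that captured more than the business needs.
raw_signup = {
    "email": "user@example.com",
    "birth_date": "1990-04-12",   # collected, but no consent to store
    "marketing_opt_in": True,
    "ip_address": "203.0.113.7",  # collected, but no consent to store
}

# Consent recorded at collection time; only these fields may be persisted.
consent = {"email", "marketing_opt_in"}

print(minimize_record(raw_signup, consent))
# {'email': 'user@example.com', 'marketing_opt_in': True}
```

The design choice worth noting is the default: fields are discarded unless consent is affirmatively recorded, rather than kept unless someone objects.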

Focusing on principles can help companies stay ahead of regulations, reduce future costs and limit privacy-related public relations incidents.

"From a pure regulatory perspective, companies should be looking at data ethics," said Anand Rao, partner and global AI leader at PwC.

And that may be happening. According to a recent PwC survey, 85% of business leaders said they comply with the strictest privacy and security regulations around the world. Regulations tend to focus on specific details of privacy rather than on the big picture, however.

That's the central problem with ethical data collection, said Dipayan Ghosh, co-director of the Digital Platforms & Democracy Project at Harvard Kennedy School. "There's no regulatory structure that requires it."

That means there's no commercial incentive for companies to care about the bigger ethical data issues, he said.

"Enterprises should of course care, but they're currently not incentivized to care," Ghosh said. "That may all be changing soon, though."

The public relations risk

What could cause that change and push enterprises to care about ethical data mining? The simple fact that the public does.

According to a recent consumer survey by McKinsey, 71% of respondents said they would stop doing business with a company that gave away sensitive data without permission.

"Building consumer trust is ultimately profitable," said Paige Bartley, senior research analyst of data at S&P Global Market Intelligence. "When consumers have a trusting relationship with your organization, they will spend more over time."

The public relations risk isn't just about customers. Other stakeholders such as investors, employees and regulators also pay attention to a company's ethical misdeeds.

"While it seems like data can be weaponized to gain revenue, in many instances it can backfire with bad PR that could drive a company out of business," said David Linthicum, chief cloud strategy officer at Deloitte Consulting.

The data loss risk

Breaches are one of the most obvious costs of gathering and keeping too much data and not doing enough to protect it.

According to Risk Based Security, 2020 set a record for the number of records exposed in breaches after just the first two quarters. Last week, the company released an updated report: more than 8 billion additional records were exposed, putting the year's total so far at 36 billion.

But it's not just criminals getting their hands on sensitive data. Law enforcement agencies also request data about individuals from private companies, and that data may be used in ways that were not originally expected.

"Look at commercial genealogy companies giving law enforcement access to genetic data, as one example," said Jessica Lee, partner and co-chair of Loeb & Loeb's privacy, security and data innovations practice. "I don't think most consumers -- at least until recently -- considered that their efforts to get more information on their ancestry could result in the arrest of a family member."

Companies need to think through ahead of time how they will respond to law enforcement requests, Lee said, and make those policies clear to customers so they understand the risks.

The AI risk

Going forward, one of the most consequential areas for the ethical use of data is the building of AI systems.

A recent IDC survey of business leaders found that ethical use of AI was important to ensure good customer experience, data privacy, regulatory compliance and employee trust. These factors were even more important than ensuring AI recommendations resulted in good business decisions.

Meanwhile, about 65% of respondents said they are "very confident" that they're using AI ethically, while another 33% said their confidence is "generally high, but with reservations."

But this high degree of confidence could be the result of limited understanding of the risks involved, said Bjoern Stengel, senior research analyst at IDC's Worldwide Services Research Group.

"We're at a relatively early stage and organizations are just starting to realize what the use of AI actually means," he said.

The survey also shows a gap between that confidence and the adoption of ethics-related solutions. More than half of respondents were looking to invest in the strategy and data architecture phases of the AI development lifecycle, as opposed to later stages such as model training, deployment and monitoring.

Only about 7% of respondents said they had implemented AI on a large scale as part of their enterprise strategy; the rest were using it in select business units or small-scale pilot projects, or were still in the planning stages.

"That indicates that the maturity and understanding of these issues is not that great," Stengel said.

And the potential risks are high.

"Companies are using data to make decisions that can amplify existing bias and cause real harm," Baxter said.

Recent cases include AI hiring systems that favored white men, parole recommendation systems that rated people of color as higher risk, credit card companies that gave women lower credit limits and facial recognition technology that led to the arrests of innocent people.

"For the sake of your business's success, you want to make accurate predictions and decisions," Baxter said. "But for the sake of society, you need to ensure that you are not perpetrating harmful bias."

The path forward

To address these issues, Salesforce runs training programs for employees, builds ethical questions into its software development cycle and examines its training data for bias, Baxter said.

Salesforce also created an Office of Ethical and Humane Use of Technology led by a chief ethical and humane use officer. Other companies are also paying attention to the ethical use of data.

According to a study conducted earlier this year by FTI Consulting, 81% of enterprise leaders said their top corporate leaders and board members were knowledgeable about data privacy compliance issues and committed to compliance. In fact, 97% of companies planned to increase spending on data privacy, by an average of 50%.

The tricky part, according to Max Kirby, customer data platform practice lead at Publicis Sapient, a digital business consulting company, is to connect the philosophical principles of ethical data management with practical implementation guidelines that developers can understand.

"There aren't many philosophers who are also technologists," he said. Although that is likely to change. "Meanwhile, the thing that bridges the gap is publishing a playbook, [a document of principles] that explains your stance and philosophy and how you look at privacy."

One place to start when looking for a framework to build your company's ethical data mining policy around is the Fair Information Practice Principles, documented by the International Association of Privacy Professionals. But many organizations are attempting to create such guidelines, Kirby said.

"They are aimed at the underlying principles, not at a specific set of laws or practices, but I don't see any one that's risen above the rest right now," he said. "We're still in the sandbox stage of privacy."
