
Artificial intelligence data privacy issues on the rise

End users are in the crosshairs of business data privacy issues, especially when it comes to information gleaned from artificial intelligence technologies.

Thanks to the sheer amount of data that machine learning technologies collect, end-user privacy will be more important than ever.

It's still very early days for artificial intelligence (AI) in businesses. But the data that desktop and mobile applications automatically collect, analyze using machine learning algorithms and act upon is a reality, and IT shops must be ready to handle this type and volume of information. In particular, thorny artificial intelligence data privacy issues can arise if employers can detect and view more -- and more personal -- data about their employees on devices or apps.

"AI requires a ton of data, so the privacy implications are bigger," said Andras Cser, vice president and principal analyst at Forrester Research. "There's potential for a lot more personally identifiable data being collected. IT definitely needs to pay attention to masking that data."

Business applications and devices can take advantage of machine learning in a number of ways. A mobile sales app could collect location or IP address data and find patterns to connect the user with customers in their area, for instance. If the user accesses this app on a personal device they use for work, they may not want their employer to be able to view that data when they're off the clock. Or, a user's personal apps could learn information about the individual that he or she wouldn't want the human resources department to find out about.

Health-related devices that take advantage of artificial intelligence pose a significant privacy threat. A lot of companies give out Fitbits, for example, to gather data about employees that's used for insurance purposes, said mobile expert Brian Katz. Artificial intelligence data from that kind of device could reveal a health condition the employer didn't know about, and then comes the real dilemma:

"If your manager knows about it, do they act on it?" Katz said.

Keeping artificial intelligence data in the shadows

One way for IT to address data privacy issues with machine learning is to "mask" the data collected, or anonymize it so that observers can't link the information to a specific user. Some companies take a similar approach now with regulatory compliance, where blind enforcement policies use threat detection to determine whether a device follows regulations but do not glean any identifying information.
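What masking can look like in practice: the minimal Python sketch below pseudonymizes a user identifier with keyed hashing and drops direct identifiers before records reach an analytics pipeline. The field names and key handling are illustrative assumptions, not any particular product's implementation.

```python
import hmac
import hashlib

# Hypothetical setup: in practice the key would live in an IT-managed
# secrets store, not in source code as shown here.
PSEUDONYM_KEY = b"replace-with-a-vaulted-secret"

def mask_user_id(user_id: str) -> str:
    # Keyed hashing (HMAC-SHA256) yields a stable pseudonym: records stay
    # linkable for pattern analysis, but the real identity is hidden.
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

def mask_record(record: dict) -> dict:
    # Pseudonymize the identifier and drop direct identifiers outright.
    masked = dict(record)
    masked["user_id"] = mask_user_id(record["user_id"])
    masked.pop("email", None)
    masked.pop("device_name", None)
    return masked

print(mask_record({"user_id": "jdoe", "email": "jdoe@corp.example", "app": "sales"}))
```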


Device manufacturers have also sought to protect users in this way. For example, Apple iOS 10 added differential privacy, which recognizes app and data usage patterns among groups of users while obscuring the identities of individuals.
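Apple hasn't published its full mechanism, but the classic randomized-response technique behind differential privacy is easy to sketch: each individual report is noisy enough to be deniable, yet the group-level rate can still be recovered. The parameters below are illustrative.

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    # Report the true value with probability p_truth; otherwise flip a coin.
    # Any single report is deniable, so no individual can be pinned down.
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_rate(reports, p_truth: float = 0.75) -> float:
    # P(report True) = p_truth * true_rate + (1 - p_truth) * 0.5
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# A population where 30% of users truly use a given app:
reports = [randomized_response(random.random() < 0.3) for _ in range(100_000)]
print(estimate_rate(reports))  # close to 0.3, without trusting any one answer
```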

"If you know a couple things that you can correlate, you can identify a person," Katz said. "With AI it becomes easier to correlate data ... and remove privacy. People want to provide a better experience and learn more about [users], and doing that in an anonymous way is very difficult."

Tools such as encryption are also important for IT to maintain data privacy and security, Cser said. IT departments should have policies in place that make clear to users what data IT is and is not permitted to collect, and what the business can do with it, he said.
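A minimal illustration of the encryption point, using the open source Python cryptography library; the payload and inline key generation are placeholders for a real key-management setup.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Placeholder key management: in practice the key would be provisioned
# and stored by IT in a secrets vault, never generated inline like this.
key = Fernet.generate_key()
cipher = Fernet(key)

collected = b'{"user": "a3f9c2b41d", "location": "40.71,-74.00"}'
token = cipher.encrypt(collected)   # ciphertext is safe to store or transmit
print(cipher.decrypt(token))        # only key holders can recover the data
```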

It's important for users to understand this information, Katz said.

"Part of it's just being transparent with users about what you're doing with the data," he said.

Another best practice is to separate business and personal apps using technologies such as containerization, he said. Enterprise mobility management tools can be set up to inspect only corporate apps while still whitelisting and blacklisting any app to prevent malware. That way, IT doesn't invade users' privacy on personal apps.
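In rough terms, that separation works something like the Python sketch below; the package names and policy actions are hypothetical, not any EMM product's API.

```python
# Hypothetical EMM-style policy: IT gets visibility only into the corporate
# container, while malware allow/block lists apply to every installed app.
CORPORATE_CONTAINER = {"com.corp.mail", "com.corp.sales"}
BLOCKLIST = {"com.example.malware"}

def evaluate(installed_apps):
    for app in installed_apps:
        if app in BLOCKLIST:
            yield app, "block"    # malware controls apply everywhere
        elif app in CORPORATE_CONTAINER:
            yield app, "inspect"  # full visibility for corporate apps only
        else:
            yield app, "ignore"   # personal apps stay private

for app, action in evaluate(["com.corp.mail", "com.photos.personal", "com.example.malware"]):
    print(f"{app} -> {action}")
```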

Data privacy and security regulations

The United States does not have a comprehensive national data privacy law. Instead, it has adopted industry-specific regulations over the years.

For example, the Health Insurance Portability and Accountability Act (HIPAA) protects individuals' personally identifiable healthcare information. And the Fair Debt Collection Practices Act restricts debt collectors from sharing identifying information about a person's transactions or debts.

The European Union (EU) takes a very different approach. Individuals in the EU are protected from having any data collected about them without their knowledge under the Data Protection Directive and, beginning next year, the updated General Data Protection Regulation (GDPR). Sanctions against companies that flout these regulations can include fines of up to 20 million euros (about $22 million) or 4% of global annual revenue, whichever is higher.

Data privacy and security laws still evolving

Privacy regulations vary widely across the globe, and many businesses and countries are still working to update guidelines based on emerging technology.

The EU, for example, has strong protections for the personal privacy of employees. Individuals must be notified of any data gathered about them; data processing can occur only for a "legitimate" purpose, such as investigating suspected criminal activity; and collected data must be kept secure. There are also restrictions on entities sharing collected data outside the EU.

The United States is more lax, said Joseph Jerome, a policy counsel on the Privacy & Data Project at the Center for Democracy & Technology in Washington, D.C.

"Basically employers can get away with anything they want so long as they're providing some kind of notice of consent," he said.

That's the reason some companies prefer to provide corporate-owned devices rather than enable BYOD, Katz said.

"You don't have as much of an expectation about privacy there, and that's why they do it," he said. "Your privacy is much more limited on a [corporate] device."

And when it comes to artificial intelligence data specifically, an interesting question arises: Who is responsible for the learned information? The employer? The machine learning application itself? The person who created the algorithm? These questions are still up in the air, Cser said.

"Legal frameworks are not yet capable of handling this kind of autonomous information," he said. "It's going to be a precedence-based type of evolution."

Could machine learning help, not harm?

Still, the data privacy issues raised by artificial intelligence are not entirely new. The internet of things and big data have gleaned similarly large volumes of personal data for years.

"It's basically a continuation of those trends," Jerome said. "It's lots and lots of data being gleaned from a lot of different sources. There's a lot of hype here, but at the end of the day ... I don't know if it raises any new issues."

Rather, machine learning might be a unique way to actually help users manage their data privacy, Jerome said. According to Carnegie Mellon University research, privacy assistant apps could learn users' preferences over time and infer how and when each user would like their data to be collected and used, or not.

"AI might be an amazing way to do privacy management," Jerome said.

Next Steps

Artificial intelligence will affect contact centers

How machine learning can help IT security

Analytics and AI can stop a healthcare breach
