Google tackles Android app privacy with machine learning
Google will use machine learning and automated peer review scans to improve Android app privacy and limit app permissions overreach.
Google wants to protect users from Android app privacy intrusions and permissions overreach by using machine learning to improve app scans, but experts disagree on how effective Google's efforts will be.
Apps submitted to the Google Play Store are already scanned for malicious code and other threats, but Google will now also use peer group analysis to ensure Android apps don't ask for unnecessary permissions.
"To protect our users and help developers navigate this complex environment, Google analyzes privacy and security signals for each app in Google Play. We then compare that app to other apps with similar features, known as functional peers. Creating peer groups allows us to calibrate our estimates of users' expectations and set adequate boundaries of behaviors that may be considered unsafe or intrusive," Martin Pelikan, Giles Hogben and Ulfar Erlingsson of Google's security and privacy team wrote in a blog post. "This process helps detect apps that collect or send sensitive data without a clear need, and makes it easier for users to find apps that provide the right functionality and respect their privacy."
Machine learning comes into play
Google said it has developed a machine learning algorithm to group apps with similar capabilities, so it can compare Android app privacy settings within each group and determine whether an app should be more closely inspected by Google's security and privacy team.
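Google has not published the details of that grouping algorithm, but one plausible reading is that apps are clustered by their descriptions or declared capabilities before their permissions are compared. The following rough sketch uses scikit-learn with hypothetical app names and descriptions purely to illustrate the clustering step.

```python
# Illustrative only: cluster apps by store description to form "functional peers",
# then permissions could be compared within each cluster (see the earlier sketch).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

apps = [
    ("FlashlightPro", "simple flashlight torch with brightness control"),
    ("TorchLite",     "turn your camera flash into a bright flashlight"),
    ("MapRunner",     "gps running tracker with route maps and pace stats"),
    ("TrailTracker",  "track hikes and runs with gps route mapping"),
]

descriptions = [desc for _, desc in apps]
features = TfidfVectorizer().fit_transform(descriptions)

# Two peer groups are assumed here only because the toy data has two app types.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

for (name, _), label in zip(apps, labels):
    print(f"{name}: peer group {label}")
```

In practice the grouping signals and the number of peer groups would be far richer than a toy TF-IDF clustering, which is exactly the part experts quoted below question.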
Rebecca Herold, CEO of Privacy Professor, said rather than relying on machine learning and "the subjective views of the determined security and privacy experts," Google should consider requiring Android apps to comply "with specific security and privacy standards."
"If [developers] could say, 'We've validated our apps to meet the IEEE P1912/NIST XXXX/etc. standards and be compliant with GDPR [General Data Protection Regulation]/HIPAA [Health Insurance Portability and Accountability Act]/FISMA [Federal Information Security Management Act]/etc. requirements,' that would provide more confidence by having more authoritative, established and internationally accepted standards as their determination for acceptable privacy controls than leaving it to human judgment and AI [artificial intelligence] results," Herold told SearchSecurity via email. "Plus, by establishing a minimum requirement to meet specified security and privacy technical standards, the app designers and vendors can then build the security and privacy controls into the apps before they submit them to Google Play to review by the experts."
Michael Patterson, CEO of Plixer in Kennebunk, Maine, said developers will likely find ways to get around the new Android app privacy scan.
"The massive industry push by software companies to collect big data from customers is still building momentum. Google's attempts to stop this rush to collect personally identifiable information probably won't work for a few reasons," Patterson told SearchSecurity. "For example, vendors like Words With Friends can simply build in functionality that justifies their need to gather details, such as GPS location information. Google could be trying to police mobile application developers for ulterior motives."
How effective is machine learning?
Herold said it was "great" that Google would be performing the Android app privacy scans, but she also questioned how effective the machine learning approach will be.
"Since they are the entity providing the apps, they should bear responsibility for ensuring as much as possible that the apps in their site are secure and protect privacy -- especially with new laws, such as the [European Union's] GDPR, that require privacy controls to be built in," Herold said. "Using machine learning to do so will be interesting to see how effective and accurate it will be. Determination of privacy includes consideration of context for how/where/when/etc. data is used/shared/etc. I would recommend that they also have a team that would do some type of quick review of the apps that were not flagged by their AI mechanism to see the data involved and audiences for those AI-declared safe apps just to provide a second opinion and catch the apps that fall through the cracks."
Liviu Arsene, senior e-threat analyst at Romania-based antimalware firm Bitdefender, said automating Android app privacy scans may miss the human factor in privacy decisions.
"Privacy is something extremely personal, as some users might find location tracking intrusive, while others might not. Trying to put a privacy label on the 3 million currently available apps will be difficult, and false positives will definitely occur," Arsene told SearchSecurity. "In some sense, [the scan] could make app vetting a lot easier and more efficient, but ultimately it's still up to the individual user to decide whether or not an app is intrusive or not."
Herold noted that, going forward, developers will have more and more reason to downplay Android app privacy in favor of collecting data.
"As more apps are integrated within the internet of things and the internet of medical things, there will be even more types of data that can be tied to individuals, and so the privacy risks will also increase through that emerging increase in use," Herold said. "So, long story short: While some app creators and vendors are trying to address privacy, the risks are increasing much more quickly than the number of app developers who are trying to mitigate the privacy risks."