
Experts cautious about Apple's mood-detecting AI research

The tech giant is working with UCLA to develop technology that uses facial recognition, sleep, typing patterns and other data points to detect emotion. But some are cautious.

While Apple is reportedly working on AI technology that can detect mental health states and emotion, some are skeptical.

Whether AI can reliably produce clear mental health diagnoses is still unproven, and it is uncertain how such "emotion AI" would be used in the field, according to Jorge Barraza, assistant professor of the practice of psychology at the University of Southern California and CSO at Immersion, a neuroscience tech vendor.

"When we infer things from emotion AI at the macro level -- meaning that we tend to see patterns at the macro level -- at the individual level it starts becoming a little more dubious," Barraza said.

Outside of a social context, "it's unclear how much meaning [emotion] has in order for us to understand what people's psychological experiences are," he added. "Different types of expressions or emoting might have different meanings whether it's in a social context or whether it is not."

The project apparently grew out of an Apple-sponsored joint research effort with UCLA that the university first publicized in 2020, according to The Wall Street Journal.

Apple and UCLA researchers are looking to create algorithms that can use digital signals to detect depression or anxiety. The data points they're using include facial recognition, sleep patterns, typing behavior and vital signs.

Research into mental health

The researchers are using Apple devices including the iPhone and Apple Watch with a Beddit Sleep Monitor device. The project began with 150 participants in 2020 and is expected to involve about 3,000 people by the end of 2023.

Neither Apple nor UCLA responded to requests for comment about the research project.

The researchers are going beyond simply gauging a person's overall mental state and are seeking to determine whether a person has anxiety or depression.

The personal device-based research, while still unproven, could produce useful tools, Barraza said.


"I do see this technology as being very promising," he said. "Not in terms of diagnosing things like depression or anxiety, but at least serving directionally to give people self-awareness of their day to day."

Apple's interest in emotion AI

Apple's interest in emotion AI began in 2016 when it bought Emotient, a vendor that uses AI to read emotions.

Emotient is among the growing number of vendors in the field of emotion AI. Meanwhile, enterprises are using similar systems that use AI and machine learning to gauge employee engagement, and to assess potential job candidates.

Apple's use of the technology is different from what others have done before because researchers are focusing on multiple data points, Barraza said. He said that usually with emotion AI researchers focus on either just facial recognition (capturing expressions such as smiles and frowns) or voice analysis (tone and words that are being used). Instead, the researchers working with Apple and UCLA are looking at both facial recognition and voice analysis as well as heart rate, sleep patterns and more.

"We're talking about a big data set," he said. "We're not just relying on one piece of information to tell us how people are experiencing [their emotions]."
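The multi-signal approach Barraza describes can be illustrated with a small sketch. Everything below is an assumption for illustration only: the feature names, scales and weights are invented and do not reflect Apple's or UCLA's actual models.

```python
from dataclasses import dataclass

# Illustrative feature vector combining the signal types named in the
# article. Field names, units and ranges are assumptions for this sketch.
@dataclass
class DailySignals:
    facial_negativity: float   # 0..1, share of frames with negative expressions
    voice_strain: float        # 0..1, acoustic stress score
    resting_heart_rate: float  # beats per minute
    sleep_hours: float         # hours slept

def wellbeing_score(s: DailySignals) -> float:
    """Combine several signals into one directional score (higher = better).

    A single-signal system would use only one of these fields; pooling
    them is the "big data set" idea in the quote above. Weights here
    are arbitrary, not clinically derived.
    """
    sleep_component = min(s.sleep_hours / 8.0, 1.0)
    heart_component = min(max(0.0, 1.0 - (s.resting_heart_rate - 60.0) / 60.0), 1.0)
    expression_component = 1.0 - (s.facial_negativity + s.voice_strain) / 2.0
    return (sleep_component + heart_component + expression_component) / 3.0
```

As Barraza notes, such a score could at best serve "directionally," giving a user a rough day-to-day signal rather than anything resembling a diagnosis.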

Emotions differ depending on social context

Although the technology could help make people more aware of their emotional well-being, Barraza said the approach still must be viewed with skepticism -- especially if the data is used to predict how someone is feeling.

Regardless of the intentions of Apple or whoever owns the technology, it may not be used as intended. Instead, it could be used in ways that harm an employee, or rest on an outdated interpretation of the data.

Culture and emotions

Another challenge for emotion AI is accounting for how emotions are viewed or expressed differently across cultures.

"What might be different in certain cultures or certain subgroups or certain ages … that nuance is so hard to detect," said R "Ray" Wang, founder and principal analyst at Constellation Research.

Wang said the challenge for any company trying to develop emotion AI is knowing when the data is good enough.

Researchers need to determine the level of precision they want in order to avoid false positives and false negatives, he said. They need to look for where bias and spurious patterns could lie in the data set. That could mean accounting for cultural differences, accents or even racial differences that could affect how a person's emotional well-being registers in the data.
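The error trade-off Wang describes can be made concrete with a short sketch. The counts below are invented for illustration; they stand in for results a team might compute on a labeled validation set, ideally broken out by subgroup to surface the kind of bias he warns about.

```python
# Hedged sketch: the error rates a team might inspect before releasing
# a screening model. All counts are hypothetical.

def rates(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Summarize a confusion matrix into the rates discussed above."""
    return {
        "precision": tp / (tp + fp),            # flagged users who truly need help
        "recall": tp / (tp + fn),               # affected users actually flagged
        "false_positive_rate": fp / (fp + tn),  # unaffected users wrongly flagged
    }

# Comparing the same model across two hypothetical subgroups: a large
# precision gap between groups would signal the cultural or demographic
# bias Wang describes.
group_a = rates(tp=40, fp=10, fn=5, tn=945)
group_b = rates(tp=25, fp=30, fn=20, tn=925)
```

A team deciding "when the data is good enough" would set thresholds on these rates, per subgroup, before release.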

However, as one of the largest makers of mobile devices in the world, Apple may have a chance at making the technology work, thanks to its huge network of users.

"We're at the beginning of emotion AI," Wang said. "It's going to take off over time. But if you release it too early and you lose the trust and confidence of people, that's the risk."
