
Head-Worn Device Leverages ML, MEG to Distinguish Between Hand Gestures

New research described a wearable device that uses machine learning and a noninvasive brain imaging technique to capture and differentiate between hand gestures.

Research from the University of California San Diego described a head-worn device that leverages machine learning and a noninvasive brain imaging technique called magnetoencephalography (MEG) to distinguish between hand gestures, an approach that could eventually help patients affected by paralysis or limb amputation.

Limb amputation and paralysis affect millions of people in the US. According to research from the National Center for Biotechnology Information, almost 5.4 million people in the US had paralysis in 2013.

In pursuit of one day allowing these patients to use their minds to control devices, UC San Diego researchers are working on a noninvasive brain-computer interface. The recent study is a first step toward that interface: it shows that a helmet containing an embedded 306-sensor array can use MEG to detect the magnetic fields generated by electrical currents flowing between neurons in the brain.

“With MEG, I can see the brain thinking without taking off the skull and putting electrodes on the brain itself,” said study co-author Roland Lee, MD, director of the MEG Center at the UC San Diego Qualcomm Institute, in a press release. “I just have to put the MEG helmet on their head. There are no electrodes that could break while implanted inside the head; no expensive, delicate brain surgery; no possible brain infections.”

Lee is also an emeritus professor of radiology at UC San Diego School of Medicine and a VA San Diego Healthcare System physician.
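
To give a concrete sense of the kind of data such a helmet produces, the sketch below simulates a 306-channel MEG recording and cuts it into fixed-length windows around each gesture. The 306-sensor count comes from the study; the sampling rate, window length, and helper names are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Assumed parameters: only the 306-channel count comes from the article;
# the sampling rate and epoch length are illustrative.
N_CHANNELS = 306      # MEG sensors in the helmet's array
SFREQ = 1000          # samples per second (assumed)
EPOCH_SECONDS = 1.0   # window captured around each gesture (assumed)

def simulate_recording(n_seconds: float) -> np.ndarray:
    """Stand-in for a continuous MEG recording: channels x time samples."""
    n_samples = int(n_seconds * SFREQ)
    rng = np.random.default_rng(0)
    # Neuromagnetic fields are on the order of femtotesla, hence the tiny scale.
    return rng.normal(scale=1e-13, size=(N_CHANNELS, n_samples))

def epoch_recording(recording: np.ndarray, onsets_s: list) -> np.ndarray:
    """Cut fixed-length windows starting at each gesture onset (trials x channels x time)."""
    win = int(EPOCH_SECONDS * SFREQ)
    epochs = [recording[:, int(t * SFREQ): int(t * SFREQ) + win] for t in onsets_s]
    return np.stack(epochs)

raw = simulate_recording(n_seconds=10.0)
epochs = epoch_recording(raw, onsets_s=[1.0, 4.0, 7.0])
print(epochs.shape)  # (3, 306, 1000): three trials, 306 sensors, 1 s of samples each
```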

To evaluate the approach, researchers conducted a trial with 12 volunteer subjects. Participants wore the MEG helmet and made one of the three hand gestures used in the game "rock, paper, scissors."

The data was interpreted using a deep learning model known as MEG-RPSnet.
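
The article does not describe MEG-RPSnet's architecture, so the sketch below is only a generic stand-in: a small convolutional network in PyTorch that maps a 306-channel MEG epoch to one of the three gesture classes. Every layer choice and name here is an assumption made for illustration, not the published model.

```python
import torch
import torch.nn as nn

class GestureClassifier(nn.Module):
    """Illustrative 3-class classifier over 306-channel MEG epochs.

    This is NOT the published MEG-RPSnet architecture, which the article
    does not detail; it is a minimal stand-in showing the shape of the task.
    """

    def __init__(self, n_channels: int = 306, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(64, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> (batch, n_classes) logits
        return self.classifier(self.features(x).squeeze(-1))

model = GestureClassifier()
dummy_epochs = torch.randn(8, 306, 1000)   # batch of 8 one-second epochs
logits = model(dummy_epochs)
predicted = logits.argmax(dim=1)           # 0 = rock, 1 = paper, 2 = scissors (arbitrary coding)
print(predicted.shape)                     # torch.Size([8])
```

Any similar time-series classifier could fill this role; the essential interface is a (batch, channels, time) input and three output logits, one per gesture.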

When reviewing the results, researchers found that the system distinguished between the hand gestures with greater than 85 percent accuracy.

The researchers also identified room for improvement in the number of sensors the MEG helmet requires. Because measurements from only half of the sampled brain regions produced results with just a 2 percent to 3 percent loss of accuracy, they noted that future helmet designs may be able to use fewer sensors.
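
One way to probe that trade-off is to train and score a classifier on subsets of sensors and compare accuracy, roughly as sketched below. The study halved the sampled brain regions; the random half of sensors, the simple linear classifier, and the synthetic data here are simplifications and assumptions, not the authors' method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def subset_accuracy(epochs: np.ndarray, labels: np.ndarray, channel_idx: np.ndarray) -> float:
    """Train and score a simple classifier using only the listed sensor channels.

    A linear model on flattened epochs stands in for the deep network here;
    the point is the channel-subset comparison, not the classifier itself.
    """
    X = epochs[:, channel_idx, :].reshape(len(epochs), -1)
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.25, random_state=0, stratify=labels
    )
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return clf.score(X_test, y_test)

# Synthetic stand-in data: 120 trials, 306 sensors, 100 time samples, 3 gesture labels.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(120, 306, 100))
labels = rng.integers(0, 3, size=120)

all_channels = np.arange(306)
half_channels = rng.choice(all_channels, size=153, replace=False)
print("full array:", subset_accuracy(epochs, labels, all_channels))
print("half array:", subset_accuracy(epochs, labels, half_channels))
```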

As the use of wearable devices continues to grow, researchers are discovering new ways to apply them to clinical cases.

For instance, study results published in April 2022 indicated that wearable sensors used to monitor heart rate, skin temperature, and respiratory rate showed potential for detecting and managing COVID-19.

In that study, researchers analyzed various databases related to the use of wearable devices. In a final sample of 12 studies, they found that the most used device was the Fitbit, followed by the WHOOP strap, the Apple Watch, and the Empatica E4.

Of the 12 studies, nine used machine learning algorithms to explore physiological data, and eight of those indicated a correlation between COVID-19 and elevated heart rate. Further, 75 percent of the studies that reviewed respiratory rate indicated that it increased with COVID-19, and 76 percent of participants experienced elevated temperatures just before the onset of symptoms.
