UW Researchers Develop an mHealth Tool that Tracks Vital Signs With a Camera
Researchers at the University of Washington and Microsoft are working on an mHealth platform that uses the camera on a laptop or smartphone to capture a user's pulse and respiration rate.
The University of Washington and Microsoft have developed an mHealth platform that can collect vital signs using the camera on a laptop or smartphone.
The technology analyzes video of a user’s face, measuring light reflected off the skin and using AI to translate that signal into pulse and respiration rate. It could someday help healthcare providers during a telehealth visit or in a remote patient monitoring program, as well as caregivers capturing the vital signs of a patient in an emergency, such as an accident.
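To illustrate the basic idea behind camera-based pulse sensing (often called remote photoplethysmography), the sketch below averages skin-pixel color per video frame and finds the dominant frequency in the heart-rate band. This is a simplified, hypothetical illustration using synthetic frame data, not the UW and Microsoft model, which relies on a learned neural network.

```python
# Minimal sketch of camera-based pulse estimation: average skin-region color
# per frame, then locate the strongest frequency in the heart-rate band.
# Frame data here is synthetic; the real system learns this mapping from video.
import numpy as np

FPS = 30                      # assumed camera frame rate
DURATION_S = 18               # roughly the clip length the researchers describe
N_FRAMES = FPS * DURATION_S

# Synthetic "mean green-channel value per frame": a 72 bpm pulse (1.2 Hz)
# buried in noise, standing in for real per-frame skin-region averages.
t = np.arange(N_FRAMES) / FPS
green_means = 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 1.0, N_FRAMES)

def estimate_pulse_bpm(signal, fps, lo_hz=0.7, hi_hz=4.0):
    """Estimate pulse by finding the strongest frequency between 42 and 240 bpm."""
    signal = signal - signal.mean()               # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)    # plausible heart-rate band
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

print(f"Estimated pulse: {estimate_pulse_bpm(green_means, FPS):.1f} bpm")
```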
“Any ability to sense pulse or respiration rate remotely provides new opportunities for remote patient care and telemedicine,” Shwetak Patel, a professor in UW’s Paul G. Allen School of Computer Science & Engineering and the electrical and computer engineering department and senior author on the project, said in a press release. “This could include self-care, follow-up care or triage, especially when someone doesn’t have convenient access to a clinic.”
“It’s exciting to see academic communities working on new algorithmic approaches to address this with devices that people have in their homes,” said Patel, who has been working on several projects to turn the smartphone into a health monitoring device and co-founded the mHealth company Senosis Health, which was acquired by Google after spinning out of the university.
The system requires 18 seconds of video so that it can account for the user’s age, skin color, facial hair and any background conditions that may affect the readings.
“Every person is different,” Xin Liu, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering and lead author on the project, said in the press release. “So this system needs to be able to quickly adapt to each person’s unique physiological signature, and separate this from other variations, such as what they look like and what environment they are in.”
“Machine learning is pretty good at classifying images,” he added. “If you give it a series of photos of cats and then tell it to find cats in other images, it can do it. But for machine learning to be helpful in remote health sensing, we need a system that can identify the region of interest in a video that holds the strongest source of physiological information — pulse, for example — and then measure that over time.”
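The toy example below illustrates the step Liu describes: weighting the regions of each frame by how much physiological signal they carry, then pooling them into one value per frame to track over time. The region scores here are hand-made rather than learned, and the code only demonstrates the weighting-and-pooling idea, not the team’s actual network.

```python
# Toy illustration of region-of-interest weighting: score each pixel region,
# turn the scores into softmax weights, and pool every frame to one value.
# Scores are hand-made here; a real system would learn them from video.
import numpy as np

H, W, N_FRAMES = 8, 8, 5                       # tiny frames for illustration
rng = np.random.default_rng(0)
frames = rng.normal(size=(N_FRAMES, H, W))     # stand-in for skin-color features

# Pretend the model rated the center region (e.g. cheeks and forehead) as the
# strongest source of pulse signal.
scores = np.zeros((H, W))
scores[2:6, 2:6] = 3.0

def attention_pool(frame, scores):
    """Pool one frame into a single value, weighting pixels by softmax(scores)."""
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return float((frame * weights).sum())

# One pooled value per frame forms the time series from which pulse is measured.
signal = np.array([attention_pool(f, scores) for f in frames])
print(signal)
```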
Connected health researchers have been working for years on technology to better capture vital information from patients. In 2016, a hospital in Scotland tested out a specialized camera and AI software platform that could measure a patient’s pulse rate and blood oxygen levels. And in 2019, researchers in the US and Canada used an mHealth platform to analyze two-minute videos from an iPhone to measure blood pressure.
The project, funded by the Bill & Melinda Gates Foundation, Google and the university, could lead to a connected health platform that can accurately and safely measure vital signs on a mobile device, storing that data on the device for security.
The research team unveiled the technology at the Conference on Neural Information Processing Systems (NeurIPS) in December and is presenting its findings this week at the Association for Computing Machinery’s (ACM’s) Conference on Health, Inference, and Learning. The researchers are continuing to fine-tune the technology for better accuracy, particularly with regard to skin tone.
“We acknowledge that there is still a trend toward inferior performance when the subject’s skin type is darker,” Liu said. “This is in part because light reflects differently off of darker skin, resulting in a weaker signal for the camera to pick up. Our team is actively developing new methods to solve this limitation.”