mHealth Researchers Look to Make the Selfie a Health Resource
The ubiquitous smartphone selfie could soon be part of healthcare's mHealth toolkit, if researchers can fine-tune digital health technology that can draw valuable data from the photos and videos.
mHealth researchers are working on a digital health tool that will allow care providers to determine a patient’s blood pressure through a selfie.
While still very early in development, the mHealth technology could add to the diagnostic capabilities of mobile devices like smartphones and tablets, giving clinicians more opportunities to collect data outside of the hospital or doctor's office and in remote patient monitoring programs.
In a recent study, researchers in Canada and the US used transdermal optical imaging to analyze two-minute videos captured by an iPhone camera, which record red light reflected from hemoglobin under the skin. Using those videos to measure blood flow, they were able to accurately detect three types of blood pressure in 98 percent of some 1,328 participants.
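Reduced to its essentials, the approach tracks how the red channel of facial video rises and falls as blood pulses under the skin. The minimal sketch below (synthetic frames standing in for real video, and a hypothetical `red_channel_signal` helper; this is an illustration of the general idea, not the study's actual algorithm) extracts that signal and recovers a pulse rate from it:

```python
import numpy as np

def red_channel_signal(frames, roi):
    """Average red-channel intensity inside a region of interest,
    one value per frame -- a crude proxy for facial blood flow."""
    top, bottom, left, right = roi
    return np.array([f[top:bottom, left:right, 0].mean() for f in frames])

# Synthetic 30 fps "video": a pulse-like oscillation at 1.2 Hz (72 bpm)
# added to the red channel of otherwise uniform frames.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
frames = []
for ti in t:
    frame = np.full((64, 64, 3), 128.0)
    frame[..., 0] += 5.0 * np.sin(2 * np.pi * 1.2 * ti)
    frames.append(frame)

signal = red_channel_signal(frames, roi=(16, 48, 16, 48))

# The dominant frequency of the detrended signal approximates heart rate.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(len(signal), d=1 / fps)
bpm = 60 * freqs[spectrum.argmax()]
print(round(bpm))  # 72
```

Inferring blood pressure from such a signal is far harder than recovering pulse rate; the study's contribution is the machine-learning model layered on top of signals like this one.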
The study was led in part by Kang Lee, a research chair in developmental neuroscience at the University of Toronto and a researcher at the University of San Diego, who co-founded Nuralogix, a Toronto-based mHealth company developing AI tools for detecting facial blood flow. The company has one app on the market that measures resting heart rate and stress through facial recognition, and is reportedly working on a blood pressure app for distribution in China.
Lee and his colleagues say they hope to refine the technology to also measure blood glucose, cholesterol and hemoglobin.
Smartphone cameras have found a niche in telehealth, most notably in medication management. A Baltimore-based digital health company called emocha Mobile Health, spun out of Johns Hopkins, is seeing success using the technology in Video Directly Observed Therapy (VDOT) programs, in which patients take selfies as they take their prescribed medications and send those videos to their care providers.
The company is working on programs across the country, including at least two federally funded projects to test the reliability of the mHealth platform in treatment for opioid abuse.
Earlier this year, healthcare providers in Belgium screened more than 60,000 people for atrial fibrillation (AF) through their smartphone cameras and identified roughly 800 in need of further treatment.
In what was billed as one of the largest population health screening programs ever attempted, the two-week program, led by researchers at Hasselt University in Belgium, invited participants to download an mHealth app through a QR code advertised throughout the country. More than 62,800 people downloaded the app and were instructed to measure their heart rhythm at least twice daily for eight days.
“It’s fascinating to see the impact that PPG (photoplethysmography) technology is having on clinical practice,” Tine Proesmans, MASc, of Hasselt University’s Mobile Health Unit, said in a press release. “Using digital tools in a very structured way enables us to outpace any other traditional methodology to screen or prescreen patients and guide them into an appropriate care pathway.”
“Using digital technology only, we were able to reach a large population very quickly and collect clinically meaningful and actionable data, without the need for medical infrastructure,” she said. “Patients also benefit by having the flexibility to take measurements anytime, anywhere, and at a fraction of the cost.”
In Atlanta, meanwhile, researchers have been working on an mHealth app that would allow clinicians to detect anemia through photographs of a patient’s fingernails.
The app uses an algorithm, designed by Georgia Tech and Emory PhD student Robert Mannino, that scans a photograph of the user’s fingernails for pallor, or distinctly pale coloration. The app then uses AI technology to match that image against a database of anemia scans.
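The underlying intuition is that anemic nail beds reflect less red than healthy ones. A toy sketch of that first step, scoring pallor from pixel color (the `pallor_index` function and its threshold-free comparison are invented here for illustration and are not the app's actual algorithm):

```python
import numpy as np

def pallor_index(nail_pixels):
    """Crude pallor score: redness of the nail bed relative to its
    overall brightness. Lower values mean paler nails, which the
    article's app would flag as possible anemia."""
    r = nail_pixels[..., 0].mean()
    g = nail_pixels[..., 1].mean()
    b = nail_pixels[..., 2].mean()
    brightness = (r + g + b) / 3
    return r / brightness  # > 1.0 means the patch skews red

# Two synthetic 8x8 nail-bed patches (RGB, 0-255 scale):
healthy = np.zeros((8, 8, 3))
healthy[...] = (200.0, 140.0, 140.0)  # pinkish
pale = np.zeros((8, 8, 3))
pale[...] = (210.0, 200.0, 200.0)     # washed out

print(pallor_index(healthy) > pallor_index(pale))  # True
```

The real app goes further, matching whole images against a labeled database rather than computing a single color ratio, but the color signal above is the raw material it works from.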
“The bottom line is that we have created a way for anyone to be able to screen themselves for anemia anytime, anywhere, without the need to draw blood,” Dr. Wilbur Lam, an associate professor of biomedical engineering and pediatrics at the Georgia Institute of Technology and Emory University, told Reuters in late 2018.
While these programs show progress, not all the work has been positive. This past February, UK researchers tested three different smartphone cameras and found that they weren’t calibrated at the same level, calling into question the reliability of clinical decisions made by analyzing a photograph or video.
“Camera manufacturers have their own autofocus algorithms and hardware specifications, and this means different cameras can produce different results for the same scene,” said Carles Otero, of the Vision and Eye Research Institute at Anglia Ruskin University’s School of Medicine. “It is important that clinicians bear this in mind.”
“Our results show that while the clinician’s subjective evaluation was not affected by different cameras, lighting conditions or optical magnifications, calibration of a smartphone’s camera is essential when extracting objective data from images,” he added. “This can affect both telemedicine and artificial intelligence applications.”
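One common way to compensate for the per-camera differences Otero describes (an illustration of the general calibration idea, not the method from his study) is to include a reference patch of known color, such as a gray card, in the frame and rescale each channel against it before extracting objective measurements:

```python
import numpy as np

def normalize_to_reference(image, ref_region, ref_true_rgb):
    """Scale each color channel so a reference patch of known color
    matches its true value, making pixel readings comparable across
    differently calibrated cameras."""
    top, bottom, left, right = ref_region
    measured = image[top:bottom, left:right].mean(axis=(0, 1))
    gains = np.asarray(ref_true_rgb, dtype=float) / measured
    return image * gains

# The same scene as "captured" by two cameras with different channel gains.
scene = np.full((4, 4, 3), 100.0)
cam_a = scene * np.array([1.1, 0.9, 1.0])   # warm bias
cam_b = scene * np.array([0.8, 1.0, 1.2])   # cool bias

# Both frames contain the reference patch (here the whole frame, for
# simplicity), whose true color is known to be neutral gray (100, 100, 100).
ref = (0, 4, 0, 4)
a = normalize_to_reference(cam_a, ref, (100, 100, 100))
b = normalize_to_reference(cam_b, ref, (100, 100, 100))
print(np.allclose(a, b))  # True
```

After normalization, both cameras report the same values for the same scene, which is the precondition Otero identifies for trusting objective data extracted from clinical images.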
And in 2017, Australian researchers released a study in which they found that an mHealth platform using a smartphone camera wasn’t accurate enough for clinicians to adequately diagnose foot ulcers in patients living with diabetes.
“It is important for these negative outcomes to be reported, as mobile phone images are, in our experience, already widely used in daily clinical practice for the assessment of diabetic foot ulcers and wounds in general,” that study concluded. “Mobile phone images are often used in addition to verbal descriptions of diabetic foot ulcers when a patient, carer or home care nurse seeks remote assistance from a specialized team. And even though these images may tell more than the words used to describe the ulcer, the low diagnostic values found for both diagnosis of clinical characteristics and for treatment decisions are an important warning that caution is needed when clinicians remotely assess such images.”