
Purdue Project Uses an mHealth App to Measure Blood Hemoglobin

Researchers are developing an mHealth app that would allow providers to measure a patient's blood hemoglobin levels from a smartphone photograph of the inner eyelid.

Researchers at Purdue University are developing an mHealth app that can assess a patient’s blood hemoglobin levels through an image of the eyelid.

As explained in a recent post in The Optical Society’s (OSA) newsletter and in a YouTube video, engineers at the university are working on software that could be embedded into an app. Care providers would then be able to take a photograph of a patient’s inner eyelid – or even have the patient take the photo at home – without the need to draw blood.

“This technology won’t replace a conventional blood test, but it gives a comparable hemoglobin count right away and is noninvasive and real-time,” Young Kim, an associate professor of biomedical engineering at Purdue, said in a story published by the university. “Depending on the hospital setting, it can take a few hours to get results from a blood test. Some situations also may require multiple blood tests, which lead to more blood loss.”

Once perfected, the app would use super-resolution spectroscopy to convert typically low-resolution smartphone photos into high-resolution digital spectral signals, which would then be used to measure blood hemoglobin.
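The researchers have not published the underlying algorithm, but the general pattern described here – reconstructing a dense spectrum from a camera's three color channels and then regressing hemoglobin from it – can be sketched in a few lines of Python. In this minimal illustration, the reconstruction matrix `W_recon`, the regression weights `w_hgb`, and the function `estimate_hemoglobin` are all hypothetical placeholders, not the Purdue team's actual model:

```python
import numpy as np

# Hypothetical sketch of "spectral super-resolution": recover a dense
# spectrum (here, 64 wavelength bins) from the three RGB channels of an
# inner-eyelid photo, then regress hemoglobin from that spectrum.
# W_recon and w_hgb would normally be learned from paired smartphone
# photos, spectrometer readings, and blood tests; random placeholders
# stand in for them here.

N_BANDS = 64
rng = np.random.default_rng(0)

W_recon = rng.normal(size=(N_BANDS, 3))      # placeholder RGB -> spectrum map
w_hgb = rng.normal(scale=0.1, size=N_BANDS)  # placeholder spectrum -> g/dL weights
b_hgb = 12.0                                 # placeholder intercept (g/dL)

def estimate_hemoglobin(rgb_patch: np.ndarray) -> float:
    """Estimate hemoglobin (g/dL) from an H x W x 3 patch of linear RGB in [0, 1]."""
    mean_rgb = rgb_patch.reshape(-1, 3).mean(axis=0)  # average out pixel noise
    spectrum = W_recon @ mean_rgb                     # reconstruct dense spectrum
    return float(w_hgb @ spectrum + b_hgb)            # linear spectral regression

# A reddish synthetic patch standing in for an eyelid photo.
patch = np.clip(rng.normal([0.7, 0.3, 0.3], 0.02, size=(32, 32, 3)), 0.0, 1.0)
print(f"Estimated hemoglobin: {estimate_hemoglobin(patch):.1f} g/dL")
```

The value of the spectral step is that dozens of reconstructed bands give the regression far more to work with than three raw RGB values.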

“The idea is to get a spectrum of colors using a simple photo,” added Sang Mok Park, a Purdue PhD candidate in biomedical engineering. “Even if we had several photos with very similar redness, we can’t clearly see the difference. A spectrum gives us multiple data points, which increases chances of finding meaningful information highly correlated to blood hemoglobin level.”

The project, a collaboration between researchers at Purdue and the University of Indianapolis, the Vanderbilt University School of Medicine and Kenya’s Moi University School of Medicine, is one of many aimed at creating mHealth and telehealth tools that capture patient data more easily and less obtrusively, whether in the care facility or remotely. This particular app would help providers assess patients for issues like anemia, kidney damage, hemorrhages and blood diseases like sickle cell anemia.

Just last year, researchers in Canada presented results of a study in which they used transdermal optical imaging to analyze two-minute selfies taken with an iPhone to detect hemoglobin under the skin. And in Atlanta, researchers are working on an mHealth app that can detect anemia through images of a patient’s fingernails.

“The bottom line is that we have created a way for anyone to be able to screen themselves for anemia anytime, anywhere, without the need to draw blood,” Dr. Wilbur Lam, an associate professor of biomedical engineering and pediatrics at the Georgia Institute of Technology and Emory University, told Reuters in late 2018.

Those involved with the Purdue project are now refining the app to account for differences in smartphone image quality, skin color and photos taken with a flash. They also want to create guidelines to help the person taking the photo.

They’re currently testing the app with cancer patients at Indiana University’s Melvin and Bren Simon Comprehensive Cancer Center, and are working with India’s Shrimad Rajchandra Hospital to perfect the app for frontline healthcare workers.

While the programs are promising, not all the work has been positive. In early 2018, UK researchers tested three different smartphone cameras and found that they weren’t calibrated to the same level, calling into question the reliability of clinical decisions made by analyzing a photograph or video.

“Camera manufacturers have their own autofocus algorithms and hardware specifications, and this means different cameras can produce different results for the same scene,” said Carles Otero, of the Vision and Eye Research Institute at Anglia Ruskin University’s School of Medicine. “It is important that clinicians bear this in mind.”

“Our results show that while the clinician’s subjective evaluation was not affected by different cameras, lighting conditions or optical magnifications, calibration of a smartphone’s camera is essential when extracting objective data from images,” he added. “This can affect both telemedicine and artificial intelligence applications.”
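Otero’s point about calibration can be made concrete with a small sketch: fit a per-device color-correction matrix from a photographed reference chart with known patch values, then apply it before extracting any objective measurements. Everything below (the simulated device bias, the chart values) is illustrative and is not drawn from the Anglia Ruskin study:

```python
import numpy as np

# Minimal per-device color calibration: fit a 3x3 correction matrix that
# maps a phone camera's measured RGB values for a reference chart onto
# the chart's known ground-truth values, by least squares.

def fit_color_correction(measured: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """measured, reference: N x 3 linear RGB values for N chart patches."""
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)  # measured @ M ~= reference
    return M

# Simulated data: 24 chart patches seen by a slightly red-shifted camera.
rng = np.random.default_rng(1)
reference = rng.uniform(0.05, 0.95, size=(24, 3))
device_bias = np.diag([1.10, 0.95, 0.92])  # hypothetical per-device color bias
measured = reference @ device_bias + rng.normal(0, 0.01, size=(24, 3))

M = fit_color_correction(measured, reference)
corrected = measured @ M
print("Mean abs error before:", float(np.abs(measured - reference).mean()))
print("Mean abs error after: ", float(np.abs(corrected - reference).mean()))
```

Without a step like this, the same eyelid would yield different “objective” color values on different phones, which is exactly the concern Otero raises.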

And in 2017, Australian researchers released a study in which they found that an mHealth platform using a smartphone camera wasn’t accurate enough for clinicians to adequately diagnose foot ulcers in patients living with diabetes.

“It is important for these negative outcomes to be reported, as mobile phone images are, in our experience, already widely used in daily clinical practice for the assessment of diabetic foot ulcers and wounds in general,” that study concluded. “Mobile phone images are often used in addition to verbal descriptions of diabetic foot ulcers when a patient, carer or home care nurse seeks remote assistance from a specialized team. And even though these images may tell more than the words used to describe the ulcer, the low diagnostic values found for both diagnosis of clinical characteristics and for treatment decisions are an important warning that caution is needed when clinicians remotely assess such images.”
