
AI Approved for Adults May Aid Interpretation of Pediatric Chest Images

Artificial intelligence-based software approved to interpret adult chest radiographs may also achieve high accuracy on images of pediatric patients over the age of 2, new research shows.

A study published in Scientific Reports earlier this month found that artificial intelligence (AI)-based software approved for interpreting adult chest radiographs achieved high performance on pediatric images once cardiomegaly and patients under 2 years of age were excluded from the analysis.

The researchers note that AI has been widely adopted for medical imaging on adult patients, but studies investigating the use of these technologies on children are more limited. However, such studies are critical, especially for pediatric chest radiography, because chest radiographs are often collected in daily clinical practice and are used to detect critical and emergent diseases.

Further, chest radiographs are even more important for children than adults because advanced imaging studies cannot be freely performed on pediatric patients, who are at risk for both pediatric-specific chest diseases and many of those that impact adults, according to researchers. Chest radiographs are also often interpreted in the emergency room by a non-specialist clinician, contributing to a clinical need for accurate AI software to serve as clinical decision support tools.

To start filling some of these research gaps, the authors of this study sought to evaluate whether AI-based lesion detection software that was developed and approved for adult chest radiographs could be used for pediatric chest radiographs. In doing so, they aimed to measure the clinical potential of an AI-based solution for pediatric chest radiographs and to identify the specific age groups for which the software needs further validation before being used in a clinical setting.

They began by collecting chest radiographs taken between March and May 2021 from patients under the age of 18. AI-based lesion detection software then assessed each radiograph for the presence of nodules, consolidation, fibrosis, atelectasis, cardiomegaly, pleural effusion, pneumothorax, and pneumoperitoneum. The AI’s results were then compared with those of a pediatric radiologist.

Of the 2,273 chest radiographs evaluated, 15.3 percent had positive lesion results according to the radiologist.

When the researchers included all eight types of detectable lesions, the AI assessed 433 radiographs as positive and 1,840 radiographs as negative. Of the 433 positives, the radiologist assessed 171 as negative, indicating that the AI had produced false-positive results. When cardiomegaly was excluded from the analysis, the AI-based software assessed 374 radiographs as positive and 1,899 as negative. Of the 374, the radiologist assessed 118 radiographs as negative, showing that false-positive results persisted, though at a lower rate.
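As a rough illustration of what those counts imply, the share of AI-positive radiographs that the radiologist confirmed can be computed directly. This is not the researchers’ analysis code; it simply restates the reported figures and treats the radiologist’s interpretation as ground truth, and the function name is illustrative.

```python
# Back-of-the-envelope check using the counts reported in the study.
# Not the researchers' code; the radiologist's read is treated as ground truth.

def positive_predictive_value(ai_positive: int, false_positive: int) -> float:
    """Fraction of AI-positive radiographs the radiologist also read as positive."""
    return (ai_positive - false_positive) / ai_positive

# All eight lesion types: 433 AI-positive reads, 171 contradicted by the radiologist.
print(f"All lesions:           {positive_predictive_value(433, 171):.1%}")  # ~60.5%

# Cardiomegaly excluded: 374 AI-positive reads, 118 contradicted.
print(f"Cardiomegaly excluded: {positive_predictive_value(374, 118):.1%}")  # ~68.4%
```

By this measure, roughly six in ten AI-positive reads were confirmed by the radiologist when all lesions were included, rising modestly once cardiomegaly was excluded.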

When the researchers performed a subset analysis comparing the ages of patients the AI diagnosed correctly and incorrectly, they found that patients with incorrect diagnoses were significantly younger than those with correct diagnoses. Patients 2 years or younger made up 81.5 percent of those with incorrect diagnoses, indicating that age is a significant factor influencing the AI’s performance.

When researchers excluded both cardiomegaly and patients under 2 from the analysis, the AI’s performance improved significantly, achieving an accuracy of 96.9 percent, compared with 89.5 percent when only cardiomegaly was excluded and 87.5 percent when all lesions and ages were included.
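The paper does not publish its analysis code, but the kind of subgroup comparison described above could be reproduced along these lines. Everything in the sketch below is hypothetical: the file name, column layout, and lesion labels are assumptions for illustration, not the study’s actual data or pipeline.

```python
# Hypothetical sketch of the subgroup analysis described above.
# File name, column layout, and labels are assumptions, not the study's data.
import pandas as pd

# One row per radiograph: patient age plus semicolon-separated lesion labels
# flagged by the AI and by the pediatric radiologist (empty string = negative read).
df = pd.read_csv("reads.csv").fillna("")

def is_positive(findings: str, excluded: frozenset = frozenset()) -> bool:
    """A read counts as positive if any flagged lesion falls outside the excluded set."""
    labels = {label for label in findings.split(";") if label}
    return bool(labels - excluded)

def accuracy(frame: pd.DataFrame, excluded: frozenset = frozenset()) -> float:
    """Agreement rate between the AI's positive/negative call and the radiologist's."""
    ai = frame["ai_findings"].apply(is_positive, excluded=excluded)
    rad = frame["radiologist_findings"].apply(is_positive, excluded=excluded)
    return (ai == rad).mean()

no_cardio = frozenset({"cardiomegaly"})
print("All lesions and ages:  ", accuracy(df))
print("Cardiomegaly excluded: ", accuracy(df, no_cardio))
print("Also excluding age <=2:", accuracy(df[df["age_years"] > 2], no_cardio))
```

Under a setup like this, each exclusion simply narrows either the set of lesions counted as a positive finding or the set of radiographs analyzed, which is how the 87.5, 89.5, and 96.9 percent figures relate to one another.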

These findings indicate that AI performance in older pediatric patients can be comparable to that achieved in adults. However, more research evaluating younger children and specific diseases is needed before AI lesion detection software can be developed and deployed clinically for pediatric chest radiographs, the researchers stated.
