Two X-ray images show a patient’s diseased lungs. Using an artificial intelligence program developed by Albert Hsiao, M.D., and his colleagues at UC San Diego Health system, the image on the right has been dotted with spots of color indicating where there may be lung damage or other signs of pneumonia. (Image courtesy of Albert Hsiao)
Albert Hsiao, M.D., and his colleagues at the University of California, San Diego (UCSD) health system had been working for 18 months on an artificial intelligence program designed to help doctors identify pneumonia on a chest X-ray. When the coronavirus hit the U.S., they decided to see what it could do.
The researchers quickly deployed the application, which dots X-ray images with spots of color where there may be lung damage or other signs of pneumonia. It has now been applied to more than 6,000 chest X-rays, and it’s providing some value in diagnosis, said Hsiao, director of UCSD’s augmented imaging and artificial intelligence data analytics laboratory.
His team is one of several around the country that have pushed AI programs developed in a calmer time into the COVID-19 crisis to perform tasks like deciding which patients face the greatest risk of complications and which can be safely channeled into lower-intensity care.
The machine-learning programs scroll through millions of pieces of data to detect patterns that may be hard for clinicians to discern. Yet few of the algorithms have been rigorously tested against standard procedures. So while they often appear helpful, rolling out the programs in the midst of a pandemic could be confusing to doctors or even dangerous for patients, some AI experts warn.
“AI is being used for things that are questionable right now,” said Eric Topol, M.D., director of the Scripps Research Translational Institute and author of several books on health IT.
Topol singled out a system created by Epic, a major vendor of electronic health record software, that predicts which coronavirus patients may become critically ill. Using the tool before it has been validated is “pandemic exceptionalism,” he said.
Epic said the company’s model had been validated with data from more than 16,000 hospitalized COVID-19 patients in 21 healthcare organizations. No research on the tool has been published, but, in any case, it was “developed to help clinicians make treatment decisions and is not a substitute for their judgment,” said James Hickman, a software developer on Epic’s cognitive computing team.
Others see the COVID-19 crisis as an opportunity to learn about the value of AI tools.
“My intuition is it’s a little bit of the good, bad and ugly,” said Eric Perakslis, Ph.D., a data science fellow at Duke University and former chief information officer at the FDA. “Research in this setting is important.”
Originally published by Ashley Gold, Kaiser Health News | May 22, 2020