During a presentation at Nvidia’s GTC Conference in San Jose, University of Chicago radiologist Paul Chang said the industry needs to “drill for gas and build roads” as part of the pathway for successful use of AI in healthcare.
Translation? AI technology needs to be able to plumb deep data sets (gas) and effectively integrate into the workflow (roads) to drive actual clinical adoption and transformation.
“To me a great machine deep learning algorithm is kind of like the best race car on the block,” he said. “But a car, no matter how fast, still needs gas and roads.”
Data availability – more specifically, the lack of wide-ranging annotated data sets – still functions as a major limiting factor for training and fine-tuning deep learning algorithms.
The Clara platform has added AI-assisted annotation, which helps automate the annotation and labeling process, increasing productivity for radiologists.
Through these AI-enabled tools, radiologists can more easily annotate organs in a 3D MRI or CT scan, instead of stepping through every single 2D slice of the organ.
Nvidia has pre-trained these models across 13 different organs and also allows for smart editing of organ boundaries, which can be used to adjust errors and improve accuracy.
“Can you imagine asking them to sit down for four hours and asking them to annotate that data? They’ll never be able to dedicate that amount of time,” said Kimberly Powell, the vice president of healthcare at Nvidia. “But if they can take 15 minutes and actually annotate a decent amount of data and then use that to start training their own algorithm, that’s powerful.”
The capability of letting users input their own annotated data and then adapt pre-trained models is known as transfer learning.
The idea of feeding internal, personalized data back into the algorithm helps overcome the high degree of variance across the healthcare system, where different hospitals have different demographics, instruments and operations.
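In rough terms, transfer learning means keeping a model's pre-trained layers frozen and training only a small new "head" on an institution's own data. The sketch below illustrates the idea in minimal NumPy; it is a conceptual toy, not Nvidia's Clara API, and the frozen weights and local dataset are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained, frozen feature extractor. In a real
# transfer-learning setup these weights come from a model trained on a
# large annotated corpus; here they are random for illustration only.
W_frozen = rng.normal(size=(64, 32)) / np.sqrt(64)

def extract_features(x):
    # The frozen "backbone": maps raw inputs into a learned feature space.
    return np.tanh(x @ W_frozen)

# A hospital's own small labeled dataset (synthetic placeholder data).
X_local = rng.normal(size=(200, 64))
y_local = (X_local[:, 0] + X_local[:, 1] > 0).astype(float)

# Fine-tuning: train only a lightweight new "head" on the local data,
# leaving the backbone untouched.
F = extract_features(X_local)
w_head, b_head, lr = np.zeros(32), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w_head + b_head)))  # sigmoid
    grad = p - y_local                                 # logistic-loss gradient
    w_head -= lr * F.T @ grad / len(y_local)
    b_head -= lr * grad.mean()

p = 1.0 / (1.0 + np.exp(-(F @ w_head + b_head)))
final_loss = -np.mean(y_local * np.log(p) + (1 - y_local) * np.log(1 - p))
accuracy = ((p > 0.5) == y_local.astype(bool)).mean()
```

Because only the head's handful of parameters are trained, even 15 minutes of local annotation can be enough to adapt a model to a hospital's own population and equipment.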
“If I’m a radiologist I want to know that the data that is going to be used for training is mine,” Powell said.
To ensure this technology works within individual healthcare systems, Nvidia has released a feature called Clara Deploy, which enables organizations to drop in algorithms that connect with existing imaging software.
One potential model for how this might work has been piloted in collaboration with Ohio State University's Wexner Medical Center.
Nvidia is working with the university to develop the technical infrastructure for a linked central repository of algorithms that can automatically be fed new radiological images.
“A lot of these algorithms – if they’re working on an operational problem – don’t need FDA approval, so they can go right from development to deployment and can be used in clinical practice no questions asked,” Powell said.
What is ultimately necessary, according to Powell, is for stakeholders to collaborate in proving out the clinical value of algorithms, building the technological capabilities to use them and figuring out the delivery system to get the technology into the clinical space.
She mentioned collaborations with the NIH to co-develop AI algorithm tools, as well as Nvidia's integration into GE Healthcare hardware, as two key examples of how the company has approached the problem.
“I think that’s where you’ve got to be thoughtful about the solutions you bring and the partners that you have to make this as non-invasive as possible into the clinical practice,” Powell said.
Picture: Getty Images, wigglestick