How AI is, or will soon be, relevant in radiation oncology
Artificial intelligence (AI) is impacting many aspects of health care, and radiation oncology is no exception. It has the potential to cut costs and streamline work flows ranging from image analysis to treatment plan formulation, but its specific place in clinical practice is still being debated.
In a session at the annual meeting of the American Society for Radiation Oncology, researchers discussed some of the ways that AI is or will soon be relevant to the clinic. The general consensus was that AI will serve as a tool to support clinicians rather than replace them, and that the most realistic near-term applications are narrow, well-defined tasks.
In his talk, Sanjay Aneja, MD, focused on practical applications of AI that are in the clinic or close to being ready. One example is image classification. “There has been recent evidence that suggests in a variety of different kind of scenarios, deep-learning models can be very good at image classification in automated ways,” said Dr. Aneja, who is a professor of radiology at Yale University, New Haven, Conn. He described one study that used AI to classify 14 different pathologies on chest x-ray images.
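For readers who want a concrete sense of what such a model involves, the sketch below shows a generic multi-label chest x-ray classifier in Python. The backbone, hyperparameters, and data handling are illustrative assumptions, not details of the study Dr. Aneja cited.

```python
# Illustrative sketch (not the cited study's actual model): a multi-label
# deep-learning classifier that predicts 14 chest x-ray pathologies at once.
import torch
import torch.nn as nn
from torchvision import models

NUM_PATHOLOGIES = 14  # e.g., the 14 labels used in public chest x-ray datasets

# Pretrained backbone with its classification head replaced by a 14-output layer.
model = models.densenet121(weights="IMAGENET1K_V1")
model.classifier = nn.Linear(model.classifier.in_features, NUM_PATHOLOGIES)

# One independent yes/no decision per pathology, so a sigmoid-based loss is used.
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """images: (B, 3, H, W) tensor; labels: (B, 14) multi-hot tensor."""
    optimizer.zero_grad()
    logits = model(images)
    loss = criterion(logits, labels.float())
    loss.backward()
    optimizer.step()
    return loss.item()
```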
Dr. Aneja described the open-source nnU-Net tool, which automatically configures itself and segments biomedical images for research or clinical purposes, including therapy planning support, intraoperative support, and tumor growth monitoring. The researchers who developed it also created a “recipe” to systematize configuration of nnU-Net, making it useful as an out-of-the-box tool for image segmentation.
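The “recipe” idea, deriving training settings from simple dataset properties rather than hand-tuning them, can be sketched roughly as follows. This is a simplified illustration of the self-configuration concept; the heuristics and function names are assumptions, not nnU-Net’s actual code or interface.

```python
# Simplified illustration of dataset-driven self-configuration for segmentation.
# The memory heuristic and defaults below are made up for the example.
import numpy as np

def fingerprint(shapes, spacings):
    """Summarize a segmentation dataset by median image shape and voxel spacing."""
    return {
        "median_shape": np.median(np.array(shapes), axis=0),
        "median_spacing": np.median(np.array(spacings), axis=0),
    }

def configure(fp, gpu_memory_gb=11):
    """Derive patch size, batch size, and target spacing from the fingerprint."""
    patch = fp["median_shape"].astype(int)
    # Shrink the patch until a crude memory estimate fits the assumed budget.
    while np.prod(patch) * 4 / 1e9 > gpu_memory_gb * 0.01:
        patch = np.maximum(patch // 2, 32)
    return {
        "patch_size": patch.tolist(),
        "batch_size": max(2, int(gpu_memory_gb // 4)),
        "target_spacing": fp["median_spacing"].tolist(),
    }

# Example: 3D CT volumes around 512 x 512 x 130 voxels at roughly 1 x 1 x 3 mm
fp = fingerprint([(512, 512, 120), (512, 512, 140)], [(1.0, 1.0, 3.0), (0.9, 0.9, 3.0)])
print(configure(fp))
```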
He predicted that AI will improve radiation oncology by assisting in the determination of disease extent, including microscopic areas of disease. It could also help plan treatment volume and monitor treatment response. “I think that these are the types of things that will be moving toward the clinic in the future; very specific applications and models trained on very specific scenarios that will help us answer a very important clinical question,” Dr. Aneja said.
He expects AI to contribute to auto-segmenting and clinical contouring, “but I will caution everyone that these algorithms have not been proven to be better than physician contours. They very frequently fail in the specific use cases when anatomy is distorted by, I don’t know, say a tumor. And so a lot of times, we don’t actually have the ability to just make it an automated process. I think it’ll be something that physicians will use to help them but not necessarily replace their contouring ability,” Dr. Aneja said.
Another application, potentially a more useful one, is adaptive radiation planning. “I think that AI auto-contouring will be very helpful in establishing contours in a situation in which a physician doing them would not be feasible. We need to have nimble and computationally efficient auto segmentation algorithms that will be able to be easily deployed at the linear accelerator,” he said.
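One way to read “computationally efficient ... easily deployed at the linear accelerator” is as a hard latency budget for auto-segmentation during an adaptive session. The sketch below shows how such a check might look; the budget, model file, and input shape are assumptions, not part of any vendor workflow.

```python
# Hedged sketch: verify that an auto-segmentation model meets a latency budget
# on the local hardware. Assumes the model has already been exported to ONNX.
import time
import numpy as np
import onnxruntime as ort

LATENCY_BUDGET_S = 30.0  # assumed time available during an adaptive session

def check_latency(model_path: str, volume: np.ndarray, n_runs: int = 5) -> bool:
    """Time inference on a CT-like volume a few times and compare to the budget."""
    session = ort.InferenceSession(model_path)
    input_name = session.get_inputs()[0].name
    times = []
    for _ in range(n_runs):
        start = time.perf_counter()
        session.run(None, {input_name: volume})
        times.append(time.perf_counter() - start)
    worst = max(times)
    print(f"worst-case inference: {worst:.1f} s (budget {LATENCY_BUDGET_S} s)")
    return worst <= LATENCY_BUDGET_S

# Example call (hypothetical model file; shape and dtype depend on the export):
# check_latency("autoseg_model.onnx", np.random.rand(1, 1, 128, 128, 128).astype(np.float32))
```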
AI in pathology and treatment selection
In another talk, Osama Mohamad, MD, discussed AI in pathology, specifically for treatment selection. He described research from his group that digitized pathology data from 5,500 patients drawn from five randomized clinical trials. They used AI on data from four of the trials to identify a prognostic biomarker for distant metastasis, then validated it on data from the remaining trial, which compared radiation versus radiation plus short-term hormone therapy in prostate cancer.
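The development-and-validation design he described, training on four trials and testing once on the held-out fifth, can be sketched as follows. The feature names, model choice, and binary outcome are simplifying assumptions made for the example.

```python
# Hedged sketch of a "train on four trials, validate on the fifth" design.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def develop_and_validate(df: pd.DataFrame, holdout_trial: str):
    """df has one row per patient: image-derived 'feat_*' columns, a 'trial'
    label, and a binary 'distant_metastasis' outcome (simplified from
    time-to-event data for illustration)."""
    train = df[df["trial"] != holdout_trial]
    test = df[df["trial"] == holdout_trial]

    feature_cols = [c for c in df.columns if c.startswith("feat_")]
    model = LogisticRegression(max_iter=1000)
    model.fit(train[feature_cols], train["distant_metastasis"])

    # The validation metric is reported only once, on the held-out trial.
    scores = model.predict_proba(test[feature_cols])[:, 1]
    return model, roc_auc_score(test["distant_metastasis"], scores)
```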
The results suggested that most patients should receive hormone therapy, but the AI suggested a more nuanced answer. “Patients who had AI biomarker negative do not see any benefit from adding 4 months of hormone therapy ... whereas patients who have biomarker positive have significant difference and improvement in distant metastasis at 10 years and 15 years. This means that we can save a significant proportion of patients from getting [androgen deprivation therapy], which is hormonal therapy and has very well-known side effects, because they simply will not benefit,” said Dr. Mohamad, who is an assistant professor of radiation oncology at the University of California, San Francisco.
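The subgroup comparison behind that statement, distant metastasis outcomes by treatment arm within each biomarker group, might be computed along these lines. The column names and data layout are assumptions for illustration, not the study’s analysis code.

```python
# Hedged sketch: within each biomarker subgroup, compare distant-metastasis-free
# estimates for radiation alone versus radiation plus short-term hormone therapy.
import pandas as pd
from lifelines import KaplanMeierFitter

def benefit_by_biomarker(df: pd.DataFrame) -> None:
    """Assumed columns: 'biomarker_pos' (bool), 'arm' ('RT' or 'RT+ADT'),
    'time_years' (follow-up), 'dm_event' (1 if distant metastasis occurred)."""
    for positive in (False, True):
        group = df[df["biomarker_pos"] == positive]
        print(f"biomarker {'positive' if positive else 'negative'}:")
        for arm in ("RT", "RT+ADT"):
            sub = group[group["arm"] == arm]
            kmf = KaplanMeierFitter()
            kmf.fit(sub["time_years"], event_observed=sub["dm_event"], label=arm)
            # 10-year distant-metastasis-free estimate for this arm and subgroup
            estimate = float(kmf.survival_function_at_times(10.0).iloc[0])
            print(f"  {arm}: {estimate:.2f}")
```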
That study relied on the ArteraAI prostate cancer test, which is available through a Clinical Laboratory Improvement Amendments–certified laboratory in Florida.
Another example of AI used to plan treatment is On-line Real-time Benchmarking Informatics Technology for Radiotherapy (ORBIT-RT), developed at the University of California, San Diego. It focuses on radiotherapy treatment plan quality control and has two main components: routines for creating clinically validated plans and a free radiotherapy plan quality control system.
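As a rough illustration of the quality control idea (not ORBIT-RT’s actual implementation), a plan check can be as simple as comparing a candidate plan’s dose-volume metrics against benchmark constraints; the metric names and limits below are examples, not clinical guidance.

```python
# Hedged sketch of a radiotherapy plan quality control check: flag any
# dose-volume metric that exceeds its benchmark constraint.
from typing import Dict, List

# Benchmark constraints (metric -> maximum allowed value); values are examples only.
CONSTRAINTS: Dict[str, float] = {
    "rectum_V70Gy_percent": 20.0,
    "bladder_V65Gy_percent": 50.0,
    "femoral_head_Dmax_Gy": 50.0,
}

def qc_report(plan_metrics: Dict[str, float]) -> List[str]:
    """Return a list of constraint violations for a candidate treatment plan."""
    violations = []
    for metric, limit in CONSTRAINTS.items():
        value = plan_metrics.get(metric)
        if value is None:
            violations.append(f"{metric}: missing from plan export")
        elif value > limit:
            violations.append(f"{metric}: {value:.1f} exceeds limit {limit:.1f}")
    return violations

# Example: a plan that slightly exceeds the rectum constraint
print(qc_report({"rectum_V70Gy_percent": 22.5,
                 "bladder_V65Gy_percent": 40.0,
                 "femoral_head_Dmax_Gy": 48.0}))
```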
No matter how impressive the technical advances may be, AI contributions won’t impact clinical practice if radiation oncologists, other physicians, and patients don’t accept AI. Dr. Aneja’s group surveyed patients about the areas of health care in which they would feel comfortable with AI playing an important role. Most said they were extremely uncomfortable when it came to cancer. “Now, does that mean that we can’t use AI in oncology? No, I think it just means that we have to be a little bit more nuanced in our approach and how we develop AI solutions for cancer patients,” Dr. Aneja said.
Physicians also show reluctance, according to Alejandro Berlin, MD, an affiliate scientist at Princess Margaret Cancer Centre in Toronto. He discussed research from his group on physician acceptance of machine learning: treatment plans for prostate cancer were generated by physicians and, in parallel, by machine learning. In a theoretical phase, physicians generally agreed that the machine learning plans were better, but in the phase of the study in which physicians chose which plan to implement in a real patient, acceptance of the machine learning-generated plans dropped by 20%.
This tendency to trust humans over machines is what Dr. Berlin called “automation bias,” and he called for a more collaborative approach to implement AI. “In some cases, [machine learning] is going to be good and sufficient. And in some cases, you will need the expertise of a human.”
Dr. Aneja, who also moderated the session, expressed a similar sentiment when summing up the day’s talks: “I do feel like it’s a disruptive technology ... but I think there will still be a need for us to have people who are trained in order to evaluate and make sure that these algorithms are working correctly and efficiently.”
Dr. Aneja, Dr. Mohamad, and Dr. Berlin have no relevant financial disclosures.
* This article was updated on Nov. 15, 2022.
FROM ASTRO 2022