Applying Artificial Intelligence to Address the Knowledge Gaps in Cancer Care

George Simon, Courtney D. DiNardo, Koichi Takahashi, Tina Cascone, Cynthia Powers, Rick Stevens, Joshua Allen, Mara B. Antonoff, Daniel Gomez, Pat Keane, Fernando Suarez Saiz, Quynh Nguyen, Emily Roarty, Sherry Pierce, Jianjun Zhang, Emily Hardeman Barnhill, Kate Lakhani, Kenna Shaw, Brett Smith, Stephen Swisher, Rob High, P. Andrew Futreal, John Heymach, Lynda Chin

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Background: Rapid advances in science challenge the timely adoption of evidence-based care in community settings. To bridge the gap between what is possible and what is practiced, we researched approaches to developing an artificial intelligence (AI) application that can provide real-time, patient-specific decision support. Materials and Methods: The Oncology Expert Advisor (OEA) was designed to simulate peer-to-peer consultation with three core functions: patient history summarization, treatment options recommendation, and management advisory. Machine-learning algorithms were trained to construct a dynamic summary of each patient's cancer history and to suggest approved therapy or investigative trial options. All patient data used were retrospectively accrued. Ground truth was established for approximately 1,000 unique patients. The full Medline database of more than 23 million published abstracts was used as the literature corpus. Results: OEA's accuracy in searching disparate sources within electronic medical records to extract complex clinical concepts from unstructured text documents varied, with F1 scores of 90%–96% for non-time-dependent concepts (e.g., diagnosis) and 63%–65% for time-dependent concepts (e.g., therapy history timeline). Based on the constructed patient profiles, OEA suggests approved therapy options linked to supporting evidence (99.9% recall; 88% precision) and screens for eligible clinical trials on ClinicalTrials.gov (97.9% recall; 96.9% precision). Conclusion: Our results demonstrated the technical feasibility of an AI-powered application that constructs longitudinal patient profiles in context and suggests evidence-based treatment and trial options. Our experience highlighted the necessity of collaboration across the clinical and AI domains and the need for clinical expertise throughout the process, from design to training to testing. Implications for Practice: Artificial intelligence (AI)-powered digital advisors such as the Oncology Expert Advisor have the potential to augment the capacity and update the knowledge base of practicing oncologists. By constructing dynamic patient profiles from disparate data sources and organizing and vetting a vast literature for relevance to a specific patient, such AI applications could empower oncologists to consider all therapy options based on the latest scientific evidence and help them spend less time on information “hunting and gathering” and more time with their patients. However, realizing this potential will require not only maturation of AI technology but also active participation and leadership by clinical experts.
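
The abstract reports extraction and matching performance as F1, recall, and precision against clinician-established ground truth, but does not spell out how those scores are computed. The short Python sketch below is illustrative only; it is not from the paper, and the example patient concepts are hypothetical. It shows the standard per-patient calculation of precision, recall, and F1 for a set of extracted clinical concepts compared with a gold-standard annotation.

# Minimal sketch (not from the paper): scoring extracted clinical concepts
# against clinician-annotated ground truth, using the precision/recall/F1
# definitions behind the figures reported in the abstract.

def score_extraction(predicted: set, gold: set) -> dict:
    """Compare a set of extracted concepts with the gold-standard set."""
    true_pos = len(predicted & gold)  # concepts that were found and are correct
    precision = true_pos / len(predicted) if predicted else 0.0
    recall = true_pos / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"precision": precision, "recall": recall, "f1": f1}

# Hypothetical example: concepts pulled from one patient's unstructured notes
# versus the clinician-established ground truth for that patient.
predicted = {"adenocarcinoma", "EGFR L858R", "stage IV", "erlotinib"}
gold = {"adenocarcinoma", "EGFR L858R", "stage IV", "carboplatin"}

print(score_extraction(predicted, gold))
# {'precision': 0.75, 'recall': 0.75, 'f1': 0.75}

In practice, such per-patient scores would be aggregated across the roughly 1,000 ground-truth patients described in the Materials and Methods, separately for time-dependent and non-time-dependent concepts.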

Original language: English (US)
Pages (from-to): 772-782
Number of pages: 11
Journal: Oncologist
Volume: 24
Issue number: 6
DOI: 10.1634/theoncologist.2018-0257
State: Published - Jun 1 2019


Keywords

  • Artificial intelligence application in medicine
  • Clinical decision support
  • Closing the cancer care gap
  • Democratization of evidence-based care
  • Virtual expert advisor

ASJC Scopus subject areas

  • Oncology
  • Cancer Research

Cite this

Simon, G., DiNardo, C. D., Takahashi, K., Cascone, T., Powers, C., Stevens, R., ... Chin, L. (2019). Applying Artificial Intelligence to Address the Knowledge Gaps in Cancer Care. Oncologist, 24(6), 772-782. https://doi.org/10.1634/theoncologist.2018-0257
