Enterprise AI Analysis
Responsible Artificial Intelligence governance in oncology
The use of Artificial Intelligence (AI) in healthcare is expanding rapidly, including in oncology. Although generic AI development and implementation frameworks exist in healthcare, no effective governance models have been reported in oncology. Our study reports on a Comprehensive Cancer Center's Responsible AI governance model for clinical, operations, and research programs. We report one-year results from our AI Governance Committee, covering the registration and monitoring of 26 AI models (including large language models) and 2 ambient AI pilots, as well as a review of 33 nomograms. Novel management tools for AI governance are shared, including an overall program model, a model information sheet, a risk assessment tool, and a lifecycle management tool. Two AI model case studies illustrate lessons learned and our "Express Pass" methodology for select models. Open research questions are explored. To the best of our knowledge, this is one of the first published reports on Responsible AI governance at scale in oncology.
Executive Impact at a Glance
Understand the immediate, quantifiable benefits and scope of this AI governance model's implementation in oncology.
Deep Analysis & Enterprise Applications
Select a topic to dive deeper, then explore the specific findings from the research, rebuilt as interactive, enterprise-focused modules.
Overall AI Program Structure
The AI Task Force (AITF) developed an overall AI program framework, identifying 4 main challenges: high-quality data, high-performance computing, AI talent capacity, and policies/procedures. The AITF identified 87 active projects (76% research, 17% clinical, 7% operations) across 9 domains, leading to the establishment of the AI Governance Committee (AIGC).
MSK Enterprise AI Program Structure
This structure is supported by the following layers: Adoption, Training, Innovation; AI Skills & Services; LLMs/AI Models; Computing; and Data Foundation.
AI Governance Committee - An "Embedded" Model
The AIGC is embedded within the enterprise digital governance structure, aligning with existing review processes to avoid duplication without stifling innovation. Its goals include promoting Responsible AI, developing guidelines, ensuring strategic alignment, and advising workgroups. Key collaborations include the IRB and Research Data Governance.
AIGC Goals & Activities
| Goals of Committee | Activities |
|---|---|
| Promote Responsible AI across clinical, operations, and research programs | Register and monitor AI models, including LLMs and ambient AI pilots |
| Develop guidelines for AI development and deployment | Review nomograms and assess model risk |
| Ensure strategic alignment with enterprise digital governance | Conduct lifecycle reviews at the iLEAP decision gates, including "Express Pass" reviews |
| Advise AI workgroups and development teams | Oversee post-go-live (G5) monitoring |
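The study names a model information sheet among its governance tools. As a minimal sketch of what a registry entry might capture when a model is registered with the AIGC (all field names below are illustrative assumptions, not the schema published by MSK):

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class ModelSource(Enum):
    """How the model entered the portfolio (the paths described in the iLEAP model)."""
    RESEARCH = "research"
    HOME_GROWN = "home-grown build"
    ACQUIRED = "acquired"


@dataclass
class ModelInformationSheet:
    """Illustrative registry entry; field names are assumptions, not MSK's schema."""
    name: str
    sponsor: str                      # engaged clinical or operational sponsor
    source: ModelSource
    intended_use: str                 # clinical, operations, or research
    risk_level: str                   # e.g., "low", "medium", "high"
    human_in_the_loop: bool
    current_gate: str                 # G1..G5 in the iLEAP lifecycle
    monitoring_plan: str              # post-go-live (G5) monitoring approach
    reviews: List[str] = field(default_factory=list)  # e.g., IRB, AIGC, regulatory status


# Example registration mirroring the breast-imaging case study later in this report.
breast_ai = ModelInformationSheet(
    name="FDA-approved breast cancer detection model",
    sponsor="Radiology Department",
    source=ModelSource.ACQUIRED,
    intended_use="clinical",
    risk_level="low",
    human_in_the_loop=True,
    current_gate="G5",
    monitoring_plan="Leverage Radiology's QA program",
    reviews=["FDA-cleared", "AIGC Express Pass"],
)
```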
MSK's AI Lifecycle Activity Monitoring (iLEAP)
The iLEAP model (Legal, Ethics, Adoption, Performance) guides AI models along research, home-grown build, and acquired paths. Research models are governed by the IRB; if a model is translated to clinical or operational use, AIGC review begins at gate G3. The lifecycle includes explicit decision gates (G1-G5) and an "Express Pass" for rapid deployment.
iLEAP - AI Model Lifecycle Management Operating Model
This simplified flow represents the development path. Research and Buy paths have different entry points.
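To make the gate flow concrete, here is a minimal sketch of the lifecycle as path-dependent entry points into the G1-G5 gate sequence. Only the G1-G5 labels and the fact that translated research models enter AIGC review at G3 come from the study; the gate descriptions and the Buy-path entry point are assumptions.

```python
from enum import IntEnum


class Gate(IntEnum):
    """iLEAP decision gates; only the G1-G5 labels come from the source text."""
    G1 = 1  # assumed: idea / intake
    G2 = 2  # assumed: development review
    G3 = 3  # pre-deployment review (stated entry point for translated research models)
    G4 = 4  # assumed: go-live decision
    G5 = 5  # post-go-live monitoring


# Assumed entry gates per acquisition path; the source only states that research
# models translated to clinical/operational use enter AIGC review at G3.
ENTRY_GATE = {
    "home-grown build": Gate.G1,
    "research (translated to clinic/operations)": Gate.G3,
    "acquired (buy)": Gate.G3,  # assumption: bought models skip the internal build gates
}


def remaining_gates(path: str) -> list[Gate]:
    """Gates a model on the given path still has to pass through."""
    start = ENTRY_GATE[path]
    return [g for g in Gate if g >= start]


if __name__ == "__main__":
    for path in ENTRY_GATE:
        print(path, "->", [g.name for g in remaining_gates(path)])
```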
Key Outputs & Criteria for iLEAP Gates
Each gate in the iLEAP model requires specific outputs from the development team and a corresponding review by the AIGC. The "Express Pass" enables faster deployment for models meeting criteria such as strategic alignment, an engaged sponsor, a small project scope, use of an existing core platform, and readiness of the deployment teams.
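Those Express Pass criteria translate naturally into a checklist. A hedged sketch follows; the criteria names come from this summary, while the all-criteria-must-hold rule and the class name are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class ExpressPassChecklist:
    """Criteria named in this summary; the eligibility logic is an illustrative assumption."""
    strategically_aligned: bool
    engaged_sponsor: bool
    small_project_scope: bool
    uses_existing_core_platform: bool
    deployment_teams_ready: bool

    def eligible(self) -> bool:
        # Assumption: every listed criterion must hold for an Express Pass review.
        return all(
            (
                self.strategically_aligned,
                self.engaged_sponsor,
                self.small_project_scope,
                self.uses_existing_core_platform,
                self.deployment_teams_ready,
            )
        )


# The FDA-approved breast imaging model in the case study below would qualify.
print(ExpressPassChecklist(True, True, True, True, True).eligible())  # True
```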
Case Study: Acquired FDA-Approved Breast AI Model
Radiology Department - Breast Cancer Detection AI
Challenge: Identify an FDA-approved AI model to assist radiologists in triaging mammograms for breast cancer detection.
Solution: A third-party vendor's image-based AI model was selected. Radiology requested an "Express Pass" review because the model is FDA-approved and integrates with the existing PACS.
AIGC Review: The AIGC reviewed the idea proposal, the risk assessment (scored "low"), and the post-go-live monitoring plan (leveraging Radiology's QA program). Keeping the interpreting radiologist in the loop was a key mitigation measure.
Outcome: Approved for live deployment with ongoing G5 monitoring. Review time: 2 weeks. Successfully deployed by MSK's Application Services.
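The study's risk assessment tool is not reproduced here. The sketch below shows how a risk tier could be derived from factors the two case studies mention (direct use with live patients, regulatory clearance, human-in-the-loop mitigation); the weights and thresholds are assumptions chosen so the toy examples mirror the reported "low" and "medium" ratings.

```python
def risk_tier(direct_patient_use: bool,
              regulatory_cleared: bool,
              human_in_the_loop: bool) -> str:
    """Toy risk scoring; factors and weights are illustrative, not MSK's actual tool."""
    score = 0
    score += 2 if direct_patient_use else 0    # direct use with live patients raises risk
    score += 0 if regulatory_cleared else 1    # lack of regulatory clearance raises risk
    score -= 2 if human_in_the_loop else 0     # human oversight mitigates risk
    if score <= 0:
        return "low"
    if score <= 2:
        return "medium"
    return "high"


# Mirrors the two case studies in this report:
print(risk_tier(direct_patient_use=False, regulatory_cleared=True,  human_in_the_loop=True))  # low
print(risk_tier(direct_patient_use=True,  regulatory_cleared=False, human_in_the_loop=True))  # medium
```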
Case Study: Internally Built Brain Metastasis Segmentation Model
Medical Physics Department - Brain Metastasis Segmentation
Challenge: Develop an in-house AI model for brain metastasis tumor segmentation using MRI images, building on existing AI development expertise.
Solution: The Medical Physics department requested an "Express Pass" review for their internally developed model, which would facilitate patient treatment planning.
AIGC Review: The AIGC reviewed the proposal, taking into account the department's strong QA program for software deployment. The risk assessment scored "medium" because the model is used directly in live patient care, a risk mitigated by human-in-the-loop oversight.
Outcome: Approved for live deployment with ongoing G5 monitoring. Review time: 2 weeks. The Medical Physics department will deploy the model.
Calculate Your Potential AI ROI
Estimate the efficiency gains and cost savings your enterprise could achieve with a well-governed AI implementation.
These are estimates based on industry averages and your inputs. Actual results may vary.
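For orientation, a back-of-the-envelope version of such a calculation might look like the sketch below; every input, rate, and the formula itself are illustrative assumptions, not figures from the study.

```python
def estimated_annual_roi(cases_per_year: int,
                         minutes_saved_per_case: float,
                         staff_cost_per_hour: float,
                         annual_governance_and_licensing_cost: float) -> dict:
    """Back-of-the-envelope ROI estimate; the inputs and formula are illustrative."""
    hours_saved = cases_per_year * minutes_saved_per_case / 60.0
    gross_savings = hours_saved * staff_cost_per_hour
    net_savings = gross_savings - annual_governance_and_licensing_cost
    roi_pct = 100.0 * net_savings / annual_governance_and_licensing_cost
    return {
        "hours_saved": round(hours_saved),
        "gross_savings": round(gross_savings),
        "net_savings": round(net_savings),
        "roi_percent": round(roi_pct, 1),
    }


# Hypothetical example: 20,000 cases/year, 3 minutes saved per case,
# $120/hour blended staff cost, $80,000/year governance + licensing.
print(estimated_annual_roi(20_000, 3, 120.0, 80_000.0))
```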
Our Phased Implementation Roadmap
Our proven process for integrating Responsible AI governance into your enterprise, tailored for oncology's unique needs.
Phase 1: Strategic Assessment & Framework Design
Comprehensive review of existing AI initiatives, ethical principles, and regulatory landscape. Design of custom governance framework and identification of key stakeholders.
Phase 2: Governance Model Development & Tooling
Development of the AI Governance Committee (AIGC) charter, iLEAP lifecycle model, risk assessment tools, and model registry. Integration with existing IT governance.
Phase 3: Pilot Implementation & "Express Pass" Rollout
Pilot AI models through the new governance process, refining workflows and criteria. Establish "Express Pass" for approved models to accelerate deployment.
Phase 4: Scaling & Continuous Monitoring
Expand governance to a broader portfolio of AI models. Implement algorithmovigilance for post-deployment monitoring, performance drift detection, and safety. Foster a culture of Responsible AI.
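Algorithmovigilance implies routine post-deployment checks for performance drift. A minimal sketch of one common approach, comparing a rolling performance metric against the baseline set at go-live, is shown below; the metric, window size, and tolerance are illustrative assumptions rather than the study's method.

```python
from collections import deque
from statistics import mean


class DriftMonitor:
    """Flags performance drift when a rolling metric falls below a tolerance band.

    The metric (e.g., weekly sensitivity or agreement with clinician review),
    window size, and tolerance are illustrative choices, not the paper's method.
    """

    def __init__(self, baseline: float, window: int = 12, tolerance: float = 0.05):
        self.baseline = baseline
        self.tolerance = tolerance
        self.values = deque(maxlen=window)

    def record(self, value: float) -> bool:
        """Record a new metric value; return True if drift should be escalated."""
        self.values.append(value)
        if len(self.values) < self.values.maxlen:
            return False  # not enough history yet
        return mean(self.values) < self.baseline - self.tolerance


monitor = DriftMonitor(baseline=0.92, tolerance=0.03)
weekly_sensitivity = [0.93, 0.91, 0.92, 0.90, 0.89, 0.88,
                      0.87, 0.86, 0.86, 0.85, 0.85, 0.84]
alerts = [monitor.record(v) for v in weekly_sensitivity]
print(alerts[-1])  # True: the 12-week rolling mean (0.88) is below baseline - tolerance (0.89)
```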
Ready to Implement Responsible AI?
Our experts are ready to guide your oncology center through the complexities of AI governance and ensure safe, effective, and ethical AI deployment.