Learning Lab: Intelligent or Just Artificial? – Th ...
Video Summary
NAHQ hosted a Learning Lab on the emerging quality challenges of “smart machines” in healthcare, presented by quality consultant Ken Rohde. He explained how clinical decision-making has evolved from individual mental models to written algorithms, to software-based decision support, and now to machine learning systems that continuously adapt based on feedback. While algorithms can improve repeatability and safety, Rohde warned that software-driven systems can fail in high-stakes ways, citing the Therac-25 radiation overdoses and the Boeing 737 MAX crashes as examples of design, testing, training, and reporting breakdowns.

Rohde emphasized that healthcare quality professionals don’t need to be software experts, but must provide oversight—especially through validation (ensuring a system meets user needs safely), incident reporting, and tools like FMEA and root cause analysis. He recommended asking whether vendors have structured design, integration/testing, and formal validation processes, even when products are “black boxes.” For machine learning, he highlighted risks such as poor data quality, model drift, and bias, and advised strengthening statistical literacy and partnerships with IT/biomed. In Q&A, he noted that automation may replace some routine tasks, but it increases the need for quality oversight.
Keywords
smart machines in healthcare
machine learning clinical decision support
healthcare quality oversight
software validation and verification
incident reporting systems
failure mode and effects analysis (FMEA)
root cause analysis (RCA)
model drift and algorithmic bias
Therac-25 and Boeing 737 MAX safety failures