The Quiet Miss: When a Chest X‑Ray Triage AI Overshadowed Clinical Judgment
A real-world example of automation bias in action
Context
A busy urban hospital deployed an AI triage tool to pre-screen chest X‑rays and flag urgent findings, aiming to reduce its reporting backlog and speed up turnaround. Radiologists were instructed to prioritize studies the system labeled 'high priority' while remaining responsible for final interpretations.
Situation
Within weeks, the AI system was labeling a large share of chest X‑rays 'no acute findings,' and radiologists began skimming or deferring full reads of those studies to manage workload. A senior radiologist, juggling a heavy shift and trusting the tool's high reported accuracy, accepted the 'no acute findings' label on several studies without detailed re-evaluation.
The bias in action
Automation bias appeared as the radiologist gave disproportionate weight to the AI's 'no acute findings' output and scrutinized those images less closely. Because the AI had been accurate in many prior cases, a reinforcement loop of trust formed; when it missed a subtle peripheral pulmonary nodule on a smoker's X‑ray, the clinician's reliance on the tool meant the miss went unnoticed. Junior staff were reluctant to challenge the senior radiologist's rapid sign-off, especially since the interface prominently displayed the AI result. The net effect was a systematic lowering of vigilance for AI-cleared cases rather than active cross-checking.
Outcome
One patient's early-stage lung cancer diagnosis was delayed by three months because the initial X‑ray was signed off as 'no acute findings.' Over the following six months the hospital recorded additional missed or delayed detections on AI-cleared studies, prompting an internal review. The triage tool did reduce median reporting time for flagged urgent cases, but it introduced a measurable increase in missed subtle pathology among the 'cleared' group.
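As a minimal sketch of how an internal review might quantify a "measurable increase" like this, the snippet below compares miss rates between AI-cleared and AI-flagged studies with a standard two-proportion z-test. All counts, and the split into cleared versus flagged groups, are invented for illustration; they are not figures from this case.

```python
# Hypothetical illustration: quantifying the miss-rate gap between
# AI-cleared and AI-flagged studies. All counts are invented.
from math import sqrt

def miss_rate(missed: int, total: int) -> float:
    """Fraction of studies in a group whose pathology was missed at initial read."""
    return missed / total

def two_proportion_z(m1: int, n1: int, m2: int, n2: int) -> float:
    """Two-proportion z-statistic for the difference in miss rates.

    Uses the pooled-proportion standard error, the usual form for
    testing whether two group rates differ.
    """
    p1, p2 = m1 / n1, m2 / n2
    pooled = (m1 + m2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Invented six-month counts: 18 misses in 3000 AI-cleared studies
# versus 4 misses in 2000 AI-flagged studies.
cleared_missed, cleared_total = 18, 3000
flagged_missed, flagged_total = 4, 2000

print(miss_rate(cleared_missed, cleared_total))   # 0.006
print(miss_rate(flagged_missed, flagged_total))   # 0.002
z = two_proportion_z(cleared_missed, cleared_total,
                     flagged_missed, flagged_total)
print(round(z, 2))
```

With these made-up numbers the cleared group's miss rate is three times the flagged group's, and the z-statistic exceeds the conventional 1.96 threshold, which is the kind of signal that would justify the internal review described above.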