An AI Stethoscope Found 3.5 Times More Heart Disease. Then 70% of Doctors Stopped Using It.

Published: February 27, 2026, 03:17 PM EST
5 min read
Source: Dev.to

The largest trial of AI‑assisted cardiac screening ever conducted just published its results in The Lancet. The technology worked. The doctors didn’t.

The TRICORDER trial (named with exactly the optimism you’d expect) gave AI‑enabled stethoscopes to 96 NHS GP practices across northwest London. The devices, made by California startup Eko Health, use machine learning to detect heart failure, atrial fibrillation, and valve disease from the sound of a heartbeat. Over 12 months, 972 clinicians examined 12,725 patients with them.

When doctors actually used the stethoscope, the results were striking:

  • 2.3× more likely to be diagnosed with heart failure.
  • 3.5× more likely to be diagnosed with atrial fibrillation.
  • Nearly 2× as likely to receive a valve disease diagnosis.

These are conditions that kill people when missed, and primary care misses them constantly.

However, the trial’s primary endpoint, the number that matters for regulatory purposes, was negative: no statistically significant improvement across the entire study population. The reason is simple. 70% of the GP practices that received the stethoscopes either stopped using them entirely or used them so rarely that the signal drowned in noise.
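The dilution is plain arithmetic. A back-of-envelope intention-to-treat calculation (all numbers below are hypothetical except the 2.3× per-protocol uplift reported for heart failure) shows how low adoption can bury a real effect:

```python
# Illustrative intention-to-treat dilution. Baseline rate and adoption
# share are assumptions for the sketch, not figures from the trial.
baseline_detection = 0.010   # assumed baseline diagnosis rate per patient
relative_effect = 2.3        # per-protocol uplift reported for heart failure
adoption = 0.30              # 70% of practices effectively stopped using it

# Per-protocol: detection rate when the device is actually used
with_device = baseline_detection * relative_effect

# Intention-to-treat: average over users and non-users alike
itt = adoption * with_device + (1 - adoption) * baseline_detection

print(f"per-protocol rate:   {with_device:.3%}")                 # 2.300%
print(f"ITT rate:            {itt:.3%}")                          # 1.390%
print(f"ITT relative effect: {itt / baseline_detection:.2f}x")    # 1.39x
```

Even a genuine 2.3× effect among users shrinks to roughly 1.4× across the whole population at 30% effective adoption — small enough for a pragmatic trial to read as a null result.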

The Gap Between Works and Used

The TRICORDER trial, led by Professor Nicholas Peters and Drs. Mihir Kelshiker and Patrik Bachtiger at Imperial College London, cost the National Institute for Health and Care Research about £1.2 million. Eko provided the devices, and Imperial ran the trial independently.

  • Cost: $379 per stethoscope, plus $120/year for the AI software.
  • Function: Amplifies heart sounds 40×, cancels ambient noise, and runs three detection algorithms simultaneously.
  • Workflow: A GP places it on a patient’s chest for 15 seconds; the AI flags abnormalities in real time.

The problem was everything around those 15 seconds. The device didn’t integrate with existing electronic health records, so results had to be entered separately. In clinics already running 10 minutes behind schedule by mid‑morning, an extra workflow step — even one that catches life‑threatening conditions — gets skipped.

“Some practices used the device less over time because it added extra steps to routine care and was not well integrated with existing electronic health record systems,” Kelshiker said. When surveyed, GPs identified EHR integration as the single change most likely to make them actually use the thing.

The False Positive Problem

The stethoscope also flagged too many healthy people. Two‑thirds of patients the AI identified as having suspected heart failure turned out not to have it after follow‑up blood tests and echocardiograms.

  • False-positive rate: 67% → every true diagnosis comes with two unnecessary referrals.
  • Consequences: anxiety, additional testing, and NHS resources spent confirming nothing is wrong.
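The two-referrals-per-diagnosis figure is just the ratio implied by a two-thirds false-positive share. A quick sketch with a hypothetical cohort of 300 flagged patients:

```python
# Ratio implied by the trial's two-thirds false-positive share for
# suspected heart failure. The cohort size is hypothetical.
flagged = 300                       # patients the AI flags (illustrative)
confirmed = flagged // 3            # one-third confirmed on follow-up testing
unnecessary = flagged - confirmed   # two-thirds cleared after bloods and echo

referrals_per_diagnosis = unnecessary / confirmed
print(referrals_per_diagnosis)      # 2.0 unnecessary referrals per confirmed case
```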

Critics were blunt. Futurism ran a headline calling the stethoscope a device that “only fails two‑thirds of the time.” The British Heart Foundation recommended limiting its use to symptomatic patients to avoid generating unnecessary fear in otherwise healthy people.

The Pattern

This is not unique to stethoscopes. The gap between “AI works in studies” and “AI works in practice” is the defining failure mode of healthcare AI.

  • AI radiology tools are deployed in 90% of healthcare organizations, yet only 19% report high success rates.
  • Epic’s sepsis prediction system has been widely criticized for alert fatigue — the same false‑positive‑driven abandonment that killed the TRICORDER trial’s primary endpoint.
  • A survey found 77% of health-IT leaders cite immature tools as a significant adoption barrier, and more than half say infrastructure and data governance matter more than the AI itself.

The pattern repeats: a research team demonstrates that an AI tool catches diseases earlier, a trial confirms the technology’s accuracy, and then the real world — with its 10‑minute appointment slots and disconnected record systems — ignores it.

What Dies in the Gap

The per‑protocol numbers from TRICORDER are not trivial:

  • Atrial fibrillation, caught 3.5× more often, causes roughly 25% of all strokes.
  • Heart failure, caught 2.3× more often, has a five-year mortality rate worse than most cancers.

These are conditions where early detection directly translates to survival, but survival statistics don’t change clinical behavior. Workflow integration does.

The Lancet’s accompanying editorial — titled “Implementation first” — argued that the field has it backwards. Efficacy trials should come after implementation science, not before. Proving an AI works is easy; proving a doctor will use it is the actual experiment.

Eko Health has sold over 700,000 stethoscopes worldwide, holds multiple FDA clearances, and secured CPT billing codes in 2024, meaning insurers can now reimburse for AI-assisted auscultation. The company raised $41 million in its latest round. None of that mattered when a London GP was running 20 minutes behind and the stethoscope required a separate login.

Bottom line: The TRICORDER trial proved that an AI stethoscope can find heart disease that doctors miss. It also proved that doctors will miss the stethoscope sitting on their desk.
