Attention is being turned to enhancing one of the most basic medical tools, the stethoscope. The question: can artificial intelligence (AI) improve our ability to hear and interpret the sounds produced by the human body? We have seen this concept applied in radiology, where the machine matches up favorably with the radiologist, and in cardiology, with computer-based arrhythmia detection and pattern reading of electrocardiograms. But can AI be applied to the basic skill of auscultation?
To examine this question, we need to go back two centuries to the "ear trumpet," a tube and earpiece first used by the French physician René Laennec in 1816, which revolutionized the medical examination. More than a century passed before this device evolved into the binaural instrument we are familiar with today. About a decade ago, the electronic stethoscope was introduced to provide amplification.
Amplification, however, did not improve accuracy. Studies have shown that physicians hear a heart murmur, when one is present, only 60 to 80 percent of the time. When making rounds with students, I would often hear a heart sound or murmur that was missed by those with better hearing. The reason was that experience had taught me a sound might be present, and if I focused and concentrated, I would detect it. After I pointed it out, the student would return to the patient and could then hear the sound or murmur.
Supplementing the electronic stethoscope with Bluetooth-enabled cloud-based technology provides amplification, visual display, transmission, storage, and interfaces with electronic records. Digitizing the data has opened the door for the application of artificial intelligence to the interpretation of auscultation.
Artificial intelligence using convolutional neural networks (CNNs) attempts to train a machine to perceive the world as humans do; facial recognition software is one example. The FDA has approved an artificial intelligence program that interfaces with an electronic stethoscope to detect whether or not a heart murmur is present. The algorithm, however, can only detect a murmur; it cannot determine its origin or severity. Here is where the human-AI interface can have an impact. Just as a clinician over-reads and approves or corrects the computer readout of an EKG, I can envision a skilled clinician reviewing the auscultation data, rendering an opinion, and directing appropriate evaluation.
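For readers curious about the mechanics, the pipeline behind such systems can be sketched in a few lines. This is an illustrative toy, not any vendor's actual algorithm: it converts a synthetic heart-sound signal into a spectrogram (the 2-D "image" a CNN would classify) and applies a single hand-set convolutional filter with a ReLU activation. A real murmur detector would learn many such filters from thousands of labeled recordings; the signal, kernel, and "murmur score" here are invented for demonstration.

```python
import numpy as np

def spectrogram(signal, frame=256, hop=128):
    """Short-time Fourier magnitude: the 2-D representation a CNN consumes."""
    frames = [signal[i:i + frame] * np.hanning(frame)
              for i in range(0, len(signal) - frame, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

def conv2d(image, kernel):
    """Minimal valid-mode 2-D convolution (one CNN feature map)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Synthetic 4-second "phonocardiogram" at 2 kHz: a gated 50 Hz tone
# standing in for S1/S2, plus a 300 Hz band standing in for a murmur.
fs = 2000
t = np.arange(0, 4, 1 / fs)
heart = np.sin(2 * np.pi * 50 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.9)
murmur = 0.3 * np.sin(2 * np.pi * 300 * t)
spec = spectrogram(heart + murmur)

# One hand-set vertical-edge kernel; a trained CNN would learn many.
kernel = np.array([[1.0, -1.0], [1.0, -1.0]])
feature_map = np.maximum(conv2d(spec, kernel), 0)  # ReLU activation
score = float(feature_map.mean())                  # crude "murmur score"
print(spec.shape, round(score, 3))
```

In a deployed system the final score would come from stacked learned layers rather than one fixed kernel, but the structure, audio in, spectrogram, convolution, score out, is the same, which is why the output is a probability of "murmur present" rather than a diagnosis of its cause.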
The enthusiasm for AI in clinical decision-making underscores the need for rigorous testing of these applications. Randomized clinical trials, similar to those done for new drugs and devices, which measure outcomes, need to be performed. Overenthusiastic claims that AI can outperform physicians could create a new specialty of medical fortune-tellers whose objective is not what is best for the patient but rather what is best for profitability.
While the allure of AI is exciting, its use needs to be measured in terms of its success in enhancing the skill of the clinician, not as a precursor to his or her replacement. When a clinically proven AI electronic stethoscope is put to use, it should represent an advance in medical practice because it validates the clinician’s findings and serves as a reminder that “It’s not what’s in your ears, but what’s between your ears” that counts.