VIN News Service photo
Artificial intelligence increasingly is helping veterinary professionals interpret radiographs, whether they are taken in specialty, emergency or general practice.
The world's two biggest owners of veterinary hospitals have started using artificial intelligence to read radiographs, indicating that adoption of the novel technology may be approaching a turning point. That doesn't necessarily mean radiologists will be out of work. The companies predict that AI is more likely to help them do their jobs than replace them, at least in the foreseeable future.
Mars Inc., which owns more than 2,500 veterinary practices worldwide, is employing AI to interpret radiographs at about 60 practices as part of a trial exercise, executives at its Antech veterinary diagnostics division told the VIN News Service. The McLean, Virginia-based group's AI was developed in-house by Antech and is being trialed at practices in Europe, including the United Kingdom, and in North America. Dates for a broader company-wide rollout have yet to be decided.
Bristol, England-based IVC Evidensia, meanwhile, which owns more than 2,300 practices in Europe and Canada, has progressed further: It has finished trialing an AI product developed by the American company SignalPET. The product already is present in dozens of IVC Evidensia's Canadian practices, and a U.K.-wide rollout is expected to be completed in about 12 months, closely followed by mainland Europe.
The software tools employ AI to read radiographs, commonly known as X-rays, and provide an interpretation within minutes. Users access the software by signing in to a website and can pay as little as US$10 per interpretation. Other companies offering the technology to veterinarians include Vetology and MetronMind, both based in California.
AI is used to interpret radiographs in human medicine, too, though mostly in academic settings due to higher regulatory hurdles than for veterinary medicine and pushback from some radiologists who fear the technology may not be ready for clinical use. At least one product has been released for autonomous use in humans: In March, an AI tool for reading chest X-rays was approved by European regulators.
Adoption appears to be occurring more rapidly in the veterinary domain for now. That rise has been stoked by a shortage of veterinary radiology specialists, particularly in academia, that has been recognized by professional associations including the American College of Veterinary Radiology.
AI tools are developed with a technique known as deep learning, which, in the case of radiology, trains them to identify abnormalities that may indicate disease. Training involves feeding the AI a multitude of radiographic images. The process is refined through reviews by radiologists and user feedback.
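As an illustration only (none of this code comes from Antech, SignalPET, or Vetology), the train-then-refine cycle described above can be sketched with a toy classifier. The sketch substitutes a single-layer logistic-regression model for a real deep network and invented 16-pixel "radiographs" for real images; every name and number in it is hypothetical.

```python
import math
import random

N_PIXELS = 16  # a real radiograph has millions of pixels; 16 keeps the toy readable

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-z))

def make_example(abnormal):
    # Toy "radiograph": abnormal images are brighter on average.
    base = 0.5 if abnormal else 0.3
    pixels = [random.gauss(base, 0.1) for _ in range(N_PIXELS)]
    return pixels, 1 if abnormal else 0

def predict(w, b, pixels):
    # Probability that the image is abnormal.
    return sigmoid(sum(wi * x for wi, x in zip(w, pixels)) + b)

def train(n_images=2000, epochs=10, lr=0.5):
    random.seed(0)
    data = [make_example(i % 2 == 0) for i in range(n_images)]
    w, b = [0.0] * N_PIXELS, 0.0
    for _ in range(epochs):
        for pixels, label in data:
            err = predict(w, b, pixels) - label  # gradient of the log loss
            for j in range(N_PIXELS):
                w[j] -= lr * err * pixels[j]
            b -= lr * err
    return w, b

def accuracy(w, b, n=500):
    # Fresh, unseen examples stand in for radiologist-reviewed feedback.
    hits = sum((predict(w, b, px) > 0.5) == (label == 1)
               for px, label in (make_example(i % 2 == 0) for i in range(n)))
    return hits / n

w, b = train()
print(f"held-out accuracy on the toy task: {accuracy(w, b):.2f}")
```

Real systems differ mainly in scale: convolutional networks in place of a single linear layer, millions of labeled clinical images in place of synthetic pixels, and the ongoing correction by radiologists that the companies describe in place of a fixed dataset.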
Mars said it drew on the knowledge of Antech's 123 radiologists to develop AI trained with a vast trove of images stored by VCA, the U.S. veterinary business it acquired in 2017.
"VCA data has been stored for almost 20 years, and we have in the order of 11 million or 12 million slides that we've trained the AI with," said Paul Fisher, senior vice president at Antech Imaging Services.
Fisher sees the technology as more of a friend to radiologists than a rival, at least in the near term. "I think it'll make them more efficient with assisted reads but I can't really see — certainly for a number of years — that it would replace a radiologist."
The adoption of AI technology might even create more demand for oversight by human beings with specialist radiology training, according to Dr. Alistair Cliff, deputy chief medical officer at IVC Evidensia.
"I'm very clear on this: I think that veterinary radiologists should be excited by this product," Cliff said. "They should be excited because what we are essentially doing is sparking a conversation about radiology in a far, far bigger group of animals than we currently do."
Academic research and anecdotal evidence indicate that veterinarians typically submit X-rays for a radiologist's opinion in about 5% to 10% of cases, perhaps fewer, Cliff said. AI technology, by offering general practitioners a cheap, fast analytical tool, could introduce more pet owners to radiology, he contends, potentially prompting some to seek out specialists to evaluate initial AI-based findings. "We see this being something that will do nothing but support the radiologist community and, dare I say, shine a light on them."
The question of accuracy
Before commencing its rollout, IVC Evidensia trialed the SignalPET product at 22 practices in the U.K. for up to 12 weeks, during which time it measured a number of performance factors including the accuracy of the tool, its impact on clinical care, and how satisfied veterinarians and pet owners were with its performance. IVC Evidensia's analysis indicated 95% accuracy — based on recognizing what is normal or abnormal in a radiograph — which happens to match the accuracy level claimed by product developer SignalPET.
As for the impact on clinical care, Cliff said use of the product at trial practices triggered an increase in the use of other diagnostic equipment, such as endoscopy and ultrasound. Pets ultimately spent less time in hospital, and repeat visits fell, indicating the AI prompted more targeted care, resulting in better clinical outcomes. Veterinarians were apparently impressed: 95% of the practitioners at the 22 trial practices said they approved of the product, while the remaining 5% said they hadn't used it long enough to be sure, Cliff said.
Users of AI radiology tools contacted last year by VIN News offered more mixed evaluations of their capabilities, ranging from enthusiastic praise to questioning their accuracy or applicability to certain conditions. Happy customers recounted occasions on which the software picked up incidental conditions that might otherwise have been overlooked. Others said the products were especially helpful for junior colleagues as a learning tool. Detractors, however, claimed the AI had produced readings that weren't specific enough or had identified lesions that didn't exist.
Manufacturers maintain that their accuracy claims are supported by academic research. In one of the more recent papers, published in January in Veterinary Radiology & Ultrasound, the journal of the ACVR, researchers at Tufts University assessed the accuracy of Vetology's product in 41 dogs with confirmed pleural effusion (a buildup of excess fluid around the lungs). They found the technology had an accuracy rate of 88.7%, and concluded that the technology "appears to be of value and warrants further investigation and testing."
Product developers acknowledge their offerings aren't perfect, though they stress that the quality of radiographs being shown to their AI in general-practice settings might not always be as high as the quality of radiographs presented by specialists or by academics conducting research. "Interpretations are only as good as what's offered to the AI to analyze," said Eric Goldman, Vetology's president.
Goldman said the quality of images uploaded to AI software by veterinary professionals will depend on various factors, such as positioning of the patient, lighting levels and the presence or absence of impediments.
"The other thing that I would say is that software doesn't know that an animal is vomiting, an animal has diarrhea, an animal is coughing," he said. "Software can look at a well-positioned, well-taken radiograph and tell whether it's a heart or lung issue, but it has to match the clinical signs and it has to match the DVM's training and experience. That's why, all things considered, we believe humans and AI go better together."
To that end, Vetology has developed a version of its AI tool that is tailored to radiologists, using technical language and concepts with which they are more familiar. It currently is being used by radiologists at the company's tele-radiology business to perform preliminary assessments. "We want to be inclusive with the technology and we want radiologists to be a part of it," Goldman said.
Mars executives note that the process used to develop AI tools isn't perfect, either. "It's trained by human beings, who can make mistakes," Antech's Fisher said. "So that's why we use a group of radiologists to train it every single day — though as everyone probably knows, radiologists don't always agree with each other."
Diane Wilson, director of scientific and academic affairs at Antech Imaging Services, notes that with 20 years' worth of VCA data, its AI can encounter a lot of inconsistency and still produce accurate readings. She added: "I think one of the biggest limitations is education of the end user. This is not a mechanical radiologist. It's a machine that conducts a diagnostic test. It's important that veterinarians who might use it understand what it can and can't do."
Similarly, IVC Evidensia's Cliff recommends veterinarians view AI radiology tools more as an assistant than an authority. "It forms part of the jigsaw puzzle that is a diagnosis," he said. "It's not a diagnosis in itself."
How sophisticated and autonomous AI products become remains an open question, not just in veterinary and human medicine, but in industries ranging from retailing to the arts.
Dr. Debra Baird, a veterinary radiologist based in Indiana and diagnostic imaging consultant at the Veterinary Information Network, an online community for the profession, doubts that AI will ever put people like her out of a job. "Listing radiographic findings or detecting abnormalities is one thing," she said, "but interpreting the findings and the importance of them is where I think AI will not be able to replace the human mind."