The world’s two largest veterinary hospital owners have started using artificial intelligence to read X-rays, a sign that adoption of the technology is nearing a turning point. That does not necessarily mean radiologists will be out of work: company executives anticipate that, at least for the foreseeable future, AI is more likely to help radiologists do their jobs than to replace them.
Mars Inc., which has more than 2,500 veterinary practices worldwide, is using AI to interpret X-rays from about 60 practices in a trial exercise, executives of its veterinary diagnostics division, Antech, told the VIN News Service. The McLean, Virginia-based group’s AI was developed in-house by Antech and is currently being tested in practices across Europe, including the UK, and North America. Dates for a broader company-wide rollout have yet to be decided.
IVC Evidensia, based in Bristol, England, which has more than 2,300 practices in Europe and Canada, is further along: it has finished testing an AI product developed by the American company SignalPET. The product is already in use at dozens of IVC Evidensia practices in Canada, and a UK-wide rollout is expected to be completed within approximately 12 months, followed closely by mainland Europe.
The software tools use AI to read radiographs, commonly known as X-rays, and provide an interpretation within minutes. Users access the software by logging into a website and can pay as little as $10 per interpretation. Other companies offering the technology to veterinarians include Vetology and MetronMind, both based in California.
AI is also being used to interpret X-rays in human medicine, although mostly in academia, owing to greater regulatory hurdles than in veterinary medicine and pushback from some radiologists who fear the technology is not ready for clinical use. At least one product has been launched for standalone use in humans: In March, an AI tool for reading chest X-rays was approved by European regulators.
Adoption appears to be happening faster in the veterinary field so far. Uptake has been fueled by a shortage of veterinary radiology specialists, particularly in academia, a shortfall recognized by professional associations, including the American College of Veterinary Radiology.
AI tools are developed with a technique known as deep learning, which, in the case of radiology, trains them to identify abnormalities that may indicate disease. The training consists of feeding the AI vast numbers of X-ray images. The process is refined through radiologist reviews and user feedback.
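The supervised training loop described here can be sketched in miniature. The toy below is purely illustrative: the synthetic 16-pixel "images", the planted lesion signal, and the simple logistic-regression classifier are all invented for the example. Real products use deep convolutional networks trained on millions of actual radiographs, but the basic idea — show the model labeled normal and abnormal images, nudge its weights toward fewer mistakes, repeat — is the same.

```python
import math
import random

random.seed(0)
DIM = 16  # a toy 16-pixel "image"; real radiographs have millions of pixels

def make_image(abnormal):
    """Generate a synthetic image; abnormal ones get a brighter region (a fake 'lesion')."""
    pixels = [random.gauss(0.0, 1.0) for _ in range(DIM)]
    if abnormal:
        for i in range(4):
            pixels[i] += 2.0  # planted signal the model must learn to detect
    return pixels

# Labeled training set: 0 = normal, 1 = abnormal
data = [(make_image(False), 0) for _ in range(200)] + \
       [(make_image(True), 1) for _ in range(200)]

w = [0.0] * DIM  # model weights, adjusted during training
b = 0.0
lr = 0.1         # learning rate

def predict(x):
    """Probability the image is abnormal (sigmoid of a weighted sum)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Training: for each labeled image, nudge the weights to reduce the error
for _ in range(100):
    for x, label in data:
        err = predict(x) - label
        for i in range(DIM):
            w[i] -= lr * err * x[i]
        b -= lr * err

correct = sum((predict(x) > 0.5) == bool(label) for x, label in data)
print(f"training accuracy: {correct / len(data):.2f}")
```

The refinement step the article mentions — radiologist reviews and user feedback — corresponds to correcting mislabeled training images and retraining, which is why the companies keep radiologists in the loop.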
Mars said it is drawing on the knowledge of Antech’s 123 radiologists to develop an AI trained on a vast trove of images stored by VCA, the U.S. veterinary company it acquired in 2017.
“VCA data has been stored for almost 20 years, and we have on the order of 11 million or 12 million images with which we trained the AI,” said Paul Fisher, senior vice president at Antech Imaging Services.
Fisher sees the technology more as a friend to radiologists than a rival, at least in the short term. “I think it will make them more effective with assisted readings, but I don’t really see – certainly for several years – that it would replace a radiologist.”
The adoption of AI technology could even create greater demand for oversight by humans with specialized training in radiology, according to Dr. Alistair Cliff, deputy chief medical officer at IVC Evidensia.
“I’m very clear on this: I think veterinary radiologists should be excited about this product,” Cliff said. “They should be excited because what we’re essentially doing is bringing radiology into the conversation for a much larger group of animals than we’re reaching right now.”
Academic research and anecdotal evidence indicate that veterinarians typically submit X-rays for a radiologist’s opinion in about 5% to 10% of cases, possibly fewer, Cliff said. By giving general practitioners a quick and inexpensive screening tool, AI technology could introduce more pet owners to radiology, he maintains, potentially prompting some to seek out specialists to assess the AI-generated initial findings. “We see this as something that will only support the radiology community and, dare I say, enlighten it.”
The issue of accuracy
Before starting its rollout, IVC Evidensia tested the SignalPET product in 22 UK practices for 12 weeks, during which it measured a number of performance factors, including the tool’s accuracy, its impact on clinical care, and how satisfied veterinarians and pet owners were with its performance. Analysis by IVC Evidensia indicated 95% accuracy – based on recognizing what is normal or abnormal in an X-ray – matching the level of accuracy claimed by the product’s developer, SignalPET.
As for the impact on clinical care, Cliff said use of the product in trial practices triggered an increase in the use of other diagnostic equipment, such as endoscopy and ultrasound. Pets ultimately spent less time in the hospital, and repeat visits decreased, indicating that the AI resulted in more targeted care and better clinical outcomes. Veterinarians apparently were impressed: 95% of practitioners across the 22 trial practices said they approved of the product, while the remaining 5% said they hadn’t used it long enough to be sure, Cliff said.
Users of AI radiology tools contacted by VIN News last year offered more mixed reviews of their capabilities, ranging from enthusiastic praise to questions about their accuracy or applicability to certain conditions. Satisfied customers recounted occasions when the software detected incidental conditions that might otherwise have been overlooked. Others said the products were particularly useful as a learning tool for junior colleagues. Critics, however, claimed the AI produced readings that weren’t specific enough or identified lesions that didn’t exist.
Manufacturers maintain that their accuracy claims are backed by academic research. In one of the most recent papers, published in January in Veterinary Radiology & Ultrasound, the journal of the ACVR, researchers at Tufts University evaluated the accuracy of Vetology’s product in 41 dogs with confirmed pleural effusion (a buildup of excess fluid around the lungs). They found it had an 88.7% accuracy rate and concluded that the technology “appears to have value and warrants further research and testing.”
Product developers acknowledge that their offerings aren’t perfect, though they point out that the quality of X-rays presented to their AI in general practice settings may not always be as high as that of X-rays submitted by specialists or by academics conducting research. “The interpretations are only as good as what is offered to the AI for analysis,” said Eric Goldman, president of Vetology.
Goldman said the quality of images uploaded to the AI software by veterinary professionals depends on a variety of factors, such as patient positioning, exposure levels and the presence or absence of obstructions.
“The other thing I would say is that the software doesn’t know that an animal is vomiting, has diarrhea or is coughing,” he said. “The software can look at a well-positioned, well-taken X-ray and tell if it’s a heart or lung problem, but that has to be matched with the clinical signs and the DVM’s training and experience. That’s why, all things considered, we believe humans and AI are better together.”
To that end, Vetology has developed a version of its AI tool tailored to radiologists, using technical language and concepts more familiar to them. It is currently used by radiologists in the company’s teleradiology business to perform preliminary assessments. “We don’t want to be exclusive with the technology; we want radiologists to be part of it,” Goldman said.
Mars executives note that the process used to develop AI tools isn’t perfect, either. “It’s trained by human beings, who can make mistakes,” Fisher of Antech said. “That’s why we use a group of radiologists to train it every day – even though, as probably everyone knows, radiologists don’t always agree with each other.”
Diane Wilson, director of scientific and academic affairs at Antech Imaging Services, notes that with 20 years of VCA data, its AI can encounter many inconsistencies and still produce accurate readings. She added: “I think one of the biggest limitations is the education of the end user. It’s not a mechanical radiologist. It’s a machine that performs a diagnostic test. It is important that veterinarians likely to use it understand what it can and cannot do.”
Similarly, IVC Evidensia’s Cliff recommends that vets consider AI radiology tools more of an assistant than an authority. “It’s part of the puzzle that is a diagnosis,” he said. “It’s not a diagnosis in itself.”
How sophisticated and autonomous AI products will become remains an open question, not only in veterinary and human medicine but in sectors ranging from retail to the arts.
Dr. Debra Baird, an Indiana-based veterinary radiologist and diagnostic imaging consultant for the Veterinary Information Network, an online community for the profession, doubts AI will ever put people like her out of work. “Listing X-ray results or detecting abnormalities is one thing,” she said, “but interpreting results and their significance is where I think AI won’t be able to replace the human mind.”