Beyond the Glass Slide: Emerging Technologies That Will Reshape Veterinary Pathology

AI in Veterinary Diagnostics · Part 6 of 6

When I think about what veterinary pathology could look like in twenty years, I don't picture a discipline that looks roughly the same as today with better software. I picture something fundamentally different — faster, more accessible, and capable of extracting diagnostic information from tissue in ways that simply aren't possible right now.

The glass slide has served us extraordinarily well. Hematoxylin and eosin staining is over a century old, and it remains one of the most information-rich diagnostic tools in medicine. But it has real limitations — it requires tissue processing that takes hours, it consumes tissue that is sometimes irreplaceable, and it depends on staining reagents and antibodies that don't always exist for the species we're working with.

The technologies in this post represent a different approach. Not better H&E — something genuinely new. And the pathologists who understand these technologies now, who are thinking about how to validate them and where they'll be most useful, will be the ones best positioned to lead when the time comes.

The workflow that hasn't changed — and why that matters

The histopathology workflow most veterinary clinicians are familiar with has been essentially unchanged for decades. Tissue is collected, fixed in formalin, processed, embedded in paraffin, sectioned, mounted on glass, and stained — most commonly with H&E — before a pathologist examines it. The process works well. It has an enormous evidence base. And it has limitations that are easy to overlook because they're so familiar.

Two emerging technology categories — virtual staining and label-free tissue imaging — take fundamentally different approaches to generating diagnostic tissue information. Neither is ready for routine veterinary use. Both are advancing quickly in human medicine research, and that trajectory has direct implications for what veterinary pathology will look like in the years ahead.

Virtual staining: generating special stains from H&E

Special stains — PAS for fungi and glycogen, Masson's trichrome for collagen, Perls' Prussian blue for iron — provide information that H&E alone can't. So does immunohistochemistry, which uses antibody-based labeling to detect specific proteins. Both require additional tissue, reagents, processing time, and in the case of IHC, validated antibodies that don't always exist for the species or antigen in question.

Virtual staining uses AI to computationally generate the appearance of a special stain or IHC label from a digitized H&E section — without performing the stain at all. The idea is that the morphologic information in an H&E image already encodes, to varying degrees, the features a special stain would highlight. A well-trained AI model can learn that relationship and synthesize a virtual stain from the H&E input.
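To make the idea concrete, here is a deliberately minimal sketch of "learning the stain mapping from paired examples." Real virtual-staining systems are deep image-to-image networks (pix2pix-style architectures in much of the published research) trained on registered whole-slide image pairs; everything below — the per-pixel 3×3 color transform, the synthetic "ground truth" matrix, the training parameters — is a toy stand-in chosen only to show the supervised-learning structure of the problem:

```python
# Toy illustration of virtual staining as supervised image-to-image
# learning. A 3x3 per-pixel color transform stands in for the deep
# network; a hand-picked matrix TRUE stands in for the chemistry of a
# real special stain. The model only ever sees (H&E pixel, stained
# pixel) pairs -- it must recover the mapping from examples.
import random

random.seed(0)

def apply_transform(W, pixel):
    """Map one RGB pixel (list of 3 floats) through a 3x3 matrix W."""
    return [sum(W[r][c] * pixel[c] for c in range(3)) for r in range(3)]

def train(pairs, lr=0.01, epochs=200):
    """Fit W by per-sample gradient descent on squared error."""
    W = [[random.uniform(-0.1, 0.1) for _ in range(3)] for _ in range(3)]
    for _ in range(epochs):
        for he, stain in pairs:
            pred = apply_transform(W, he)
            for r in range(3):
                err = pred[r] - stain[r]
                for c in range(3):
                    W[r][c] -= lr * err * he[c]
    return W

# Hypothetical "true" H&E -> stain relationship, hidden from the model.
TRUE = [[0.2, 0.5, 0.1], [0.6, 0.1, 0.2], [0.1, 0.3, 0.5]]

# Simulated paired training data: random "H&E pixels" and the stained
# appearance the hidden mapping would produce for each.
pairs = []
for _ in range(200):
    he = [random.random() for _ in range(3)]
    pairs.append((he, apply_transform(TRUE, he)))

W = train(pairs)
pixel = [0.8, 0.2, 0.5]
print("virtual stain:", apply_transform(W, pixel))
print("real stain:   ", apply_transform(TRUE, pixel))
```

The sketch also illustrates the limitation discussed below: the model can only reproduce relationships that are present in its input. If two tissues looked identical on H&E but stained differently, no amount of training on pairs like these could tell them apart.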

Context from human medicine

Research in human pathology has shown that AI-generated virtual trichrome stains from H&E sections can achieve accuracy sufficient for liver fibrosis staging when compared against expert assessment of true trichrome preparations. Similar work has been published for virtual IHC prediction in certain tumor types. These are controlled research findings — not clinical standards — and reproducibility across institutions and scanner platforms is still being worked out.

The implications for veterinary pathology are significant in two areas. The first is tissue conservation. Small biopsies — punch biopsies from exotic species, endoscopic samples, core biopsies — sometimes yield barely enough tissue for a single H&E section. Virtual staining could allow the diagnostic value of a special stain to be extracted without consuming any additional tissue.

The second is antibody availability. Many commercial IHC antibodies were developed for human or laboratory animal tissue. Cross-reactivity in other species is variable and must be established case by case. For exotic species, avian patients, reptiles, and fish — where reagent libraries are limited — virtual IHC prediction from H&E could provide diagnostic information that currently isn't available at all.

What virtual staining is not: A virtual stain is a computational prediction, not a chemical measurement. It can't detect an antigen that isn't reflected in morphology — it predicts what a stain would look like based on what the H&E shows. When morphology and stain result are discordant — which happens — the virtual stain will follow the morphology rather than reveal the discordance. That distinction matters for how these tools should be used.

Virtual IHC: estimating biomarker status from H&E

Virtual IHC prediction deserves its own attention because of its direct connection to clinical decisions. IHC biomarkers drive tumor classification, targeted therapy selection, and in some cases carry independent prognostic value. The ability to estimate biomarker status from H&E morphology — even imperfectly — has real implications for efficiency and access.

In veterinary oncology, where IHC panels are used routinely for round cell tumor classification, lymphoma immunophenotyping, and soft tissue sarcoma subtyping, virtual IHC prediction could reduce the number of ancillary tests needed in clear-cut cases and focus resources on the ambiguous ones. This is a tool that a pathologist with deep IHC experience is uniquely positioned to evaluate — because understanding what the virtual stain is and isn't telling you requires the same background that comes from working with the real stain at scale.

Label-free imaging: tissue without staining

Virtual staining still works from conventionally processed H&E tissue. Label-free imaging goes further — it generates diagnostic tissue images without any staining at all, using optical techniques that read the intrinsic properties of the tissue itself.

Several approaches are in active research development. Stimulated Raman histology (SRH) uses laser-based spectroscopy to generate images of fresh, unprocessed tissue that closely resemble H&E sections. Multiphoton microscopy exploits the intrinsic autofluorescence of structural tissue proteins to generate contrast without added labels. MUSE — microscopy with ultraviolet surface excitation — produces rapid surface images of fresh tissue with H&E-like contrast using relatively simple hardware. In each case, the images come from tissue that hasn't been fixed, processed, or stained — and they're available in minutes rather than hours or days.

The intraoperative margin opportunity: The most immediately compelling clinical application of label-free imaging is intraoperative surgical margin assessment. Right now, confirming clean margins requires sending the specimen for conventional histopathology — which takes days — or relying on gross assessment while the patient is still on the table. Label-free imaging could provide diagnostic-quality tissue images within minutes of excision. In human surgical oncology this capability is in active clinical evaluation. Its translation to veterinary surgery is a meaningful near-term prospect.

AI's role in label-free imaging isn't in generating the images — it's in making them interpretable. SRH and multiphoton images don't look exactly like H&E. The underlying structural information is there, but the contrast characteristics and artifact patterns are different. AI tools that translate label-free images into familiar H&E-like representations — or that learn to interpret them directly — are what make label-free imaging viable as a clinical tool rather than a research instrument.
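The "translate into a familiar representation" step can be sketched very simply. Clinical systems use learned, multi-channel transforms trained against real histology; the one-channel lookup below — with hand-picked eosin-like and hematoxylin-like colors and a toy scanline — is purely illustrative of the idea that label-free signal intensity gets re-rendered in H&E-like tones a pathologist already knows how to read:

```python
# Toy pseudo-H&E rendering of a label-free signal. Strong signal
# (nucleus-like) is mapped toward hematoxylin blue-purple; weak signal
# (stroma/cytoplasm-like) toward eosin pink. Colors and data are
# hypothetical placeholders, not derived from any real instrument.
EOSIN = (240, 160, 190)        # pink: cytoplasm/stroma-like tone
HEMATOXYLIN = (80, 50, 140)    # blue-purple: nucleus-like tone

def pseudo_he(intensity):
    """Map a normalized label-free intensity (0..1) to an RGB color by
    linear interpolation between the eosin and hematoxylin tones."""
    t = max(0.0, min(1.0, intensity))
    return tuple(round(e + t * (h - e)) for e, h in zip(EOSIN, HEMATOXYLIN))

# A toy 1-D "scanline": weak stromal signal, then a strong nuclear peak.
scanline = [0.1, 0.15, 0.2, 0.9, 0.95, 0.3]
colored = [pseudo_he(v) for v in scanline]
print(colored)
```

A fixed lookup like this is the simplest possible version; the learned approaches described above exist precisely because real label-free contrast doesn't reduce to a single intensity channel.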

The validation challenge in veterinary species

Both virtual staining and label-free imaging face the same challenge every AI diagnostic technology faces in veterinary medicine: the validation work has been done almost entirely in human tissue. The translation to veterinary species can't be assumed.

Tissue morphology differs across species. Staining characteristics vary. What constitutes a diagnostically significant feature in canine hepatic tissue may not map directly onto patterns a model learned from human liver biopsies. For exotic species and aquatic animals, the gap is even wider.

This isn't an argument against these technologies. It's an argument for the species-specific validation work that veterinary pathologists are uniquely positioned to lead — and that represents a genuine research opportunity for the field. The tools are coming. The question is whether veterinary pathology will be ready to receive them critically.

A realistic timeline

Most veterinary practices won't encounter these tools in the next two to three years. Some form of virtual staining assistance — for a limited set of validated stain types in companion animal species — could reach early adopters within five years. Label-free intraoperative imaging, which depends on specialized hardware and complex clinical integration, is more likely a ten-to-fifteen year prospect for routine veterinary surgical use, though academic and specialty centers may see it sooner.

What's certain is that the pathologists and diagnostic services that understand these technologies now will be the ones best positioned to implement them wisely when the time comes. The glass slide has had a remarkable run. What comes after it will require the same depth of expertise to interpret — and the same rigor to validate. That's not a challenge to the pathology profession. It's an invitation to lead.


This concludes the AI in Veterinary Diagnostics series. If you have questions about any of the topics covered, or would like to discuss how AI-assisted diagnostic tools are being developed at Vetopathy, we welcome the conversation. Reach us at esnook@vetopathy.com.

Next

The AI-Connected Clinic: A Diagnostic Ecosystem That Doesn’t Exist Yet — But Almost Does