DOD enlisting AR microscopes to spot cancer
The Defense Innovation Unit is partnering with Google Cloud to use machine learning and augmented reality to make cancer diagnoses faster and more accurate.
The Pentagon's Defense Innovation Unit will be working with Google Cloud on augmented reality microscopes (ARM) that use artificial intelligence to help military doctors with cancer detection.
The ARM is part of a new DIU project, called "Predictive Health," which aims to leverage artificial intelligence to transform military health care. The project's goal is to help improve the accuracy of diagnoses, reduce the amount of information physicians must process when diagnosing and treating cancer, lower overall health care costs and maximize readiness.
Google researchers first discussed ARM in a 2018 paper where they described how doctors using the ARM to examine tissue samples for diagnosis and staging would see an overlay of pathology-based cancer detection information. That additional insight would be generated by machine learning models trained on diagnoses from a team of pathologists who identified cancer in thousands of tissue images in de-identified public and private datasets.
The ARM has three main components: the augmented microscope, a set of trained deep learning algorithms, and a computer running software that continually captures the microscope images, runs the deep learning algorithms and displays enhanced results in the microscope in real-time.
“As in a traditional analog microscope, the user views the sample through the eyepiece,” a 2018 Google blog explained. “A machine learning algorithm projects its output back into the optical path in real-time. This digital projection is visually superimposed on the original (analog) image of the specimen to assist the viewer in localizing or quantifying features of interest.” At that time, the system updated quickly enough (10 frames/second) that users could move the slide or change magnification as they examined the tissue samples and still get the overlays.
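The capture-infer-overlay loop described above can be sketched in a few lines. This is an illustrative toy, not Google's implementation: the `capture_frame` and `run_model` stand-ins, the frame shape, and the red-tint overlay are all assumptions made for the sketch.

```python
import numpy as np

FRAME_SHAPE = (256, 256)

def capture_frame():
    """Stand-in for the microscope camera: returns a grayscale image
    with pixel values in [0, 1]. (Hypothetical; real frames come from
    a camera in the microscope's optical path.)"""
    return np.random.rand(*FRAME_SHAPE)

def run_model(frame):
    """Stand-in for the trained deep learning model: returns a
    per-pixel probability map of suspected tumor regions."""
    return np.clip(frame * 0.5, 0.0, 1.0)

def overlay(frame, heatmap, threshold=0.4):
    """Superimpose the model output on the analog image, tinting
    pixels whose tumor probability exceeds the threshold."""
    out = np.stack([frame, frame, frame], axis=-1)  # grayscale -> RGB
    mask = heatmap > threshold
    out[mask, 0] = 1.0  # highlight suspected regions in red
    return out

def process_one_frame():
    frame = capture_frame()
    heatmap = run_model(frame)
    return overlay(frame, heatmap)

# In the real system this loop runs continuously (~10 frames/second)
# so the overlay keeps up as the pathologist moves the slide;
# here we process a single frame.
composite = process_one_frame()
```

The same loop structure would serve the other output styles the researchers mention (contours, arrows, text), with only the `overlay` step changing.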
Besides detecting cancers, the Google researchers suggested the ARM could run many types of machine learning algorithms to solve object detection, quantification, or classification problems. The visual feedback might be modified to include text, arrows, contours, heat maps or animations.
With the Predictive Health project, researchers will have access to the Defense Department’s trove of medical data from which to build machine learning models. Dr. Niels Olson, the DIU chief medical officer and originator of the Predictive Health project, said DOD has a very diverse set of data, given its size and the variety of people receiving health care from DOD.
"If you think about it, the DOD, through retired and active duty service, is probably one of the largest health care systems in the world," Olson said. "The more data a tool has available to it, the more effective it is."
Google’s approach with DIU will leverage TensorFlow, an open-source framework to help deliver machine-learning models, as well as the Google Cloud Healthcare API for data ingestion and de-identification to maximize patient privacy.
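De-identification in the project is handled server-side by the Google Cloud Healthcare API, but the basic idea can be illustrated with a toy function. Everything below is hypothetical — the field names, the salt, and the pseudonymization scheme are assumptions for the sketch, not the API's actual behavior.

```python
import hashlib

def deidentify(record, secret_salt="example-salt"):
    """Toy de-identification: replace direct identifiers with a
    one-way pseudonym and drop free-text fields that could leak
    identity, keeping only the clinical fields a model would train on.
    (Field names are hypothetical.)"""
    clean = dict(record)
    # Collapse direct identifiers into a salted one-way hash.
    identifiers = clean.pop("patient_name", "") + clean.pop("ssn", "")
    digest = hashlib.sha256((secret_salt + identifiers).encode()).hexdigest()
    clean["pseudonym"] = digest[:16]
    # Drop free text rather than attempt to scrub it.
    clean.pop("notes", None)
    return clean

record = {
    "patient_name": "Jane Doe",
    "ssn": "000-00-0000",
    "diagnosis_code": "C50.9",
    "notes": "follow-up in 6 weeks",
}
clean = deidentify(record)
```

In the real pipeline, records cleaned this way would be ingested via the Healthcare API and used to train TensorFlow models without exposing patient identities.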
"The prototype of this technology that we're adopting will not replace the practitioner," said Nathanael Higgins, the support contractor managing the program for DIU. "It is an enabler -- it is not a cure-all. It is designed to enhance our people and their decision making."
"AI is obviously the pinnacle of that type of tool in terms of what it can do and how it can help people make decisions," Higgins said. "The intent here is to arm them with an additional tool so that they make confident decisions 100% of the time."
The initial rollout is planned for select Defense Health Agency treatment facilities and Department of Veterans Affairs hospitals in the United States.
This article first appeared on GCN, a Defense Systems partner site.