Microscopes Powered by Google’s AI Could Change Cancer Diagnostics
A DoD pilot program could help make artificial intelligence useful not just to researchers but to physicians.
New augmented-reality microscopes, powered by AI, could change how doctors detect cancer and begin to fulfill the long-standing promise of applying artificial intelligence to medical imagery. The Defense Innovation Unit recently awarded a contract to Google Cloud to deliver the AI models that will power a pilot program called Predictive Health, Defense One has learned.
Here’s how it will work: a doctor or researcher looking through a special microscope at, say, potentially cancerous tissue samples will see information projected onto the areas that deserve close scrutiny, as determined by an algorithm trained on vast Defense Department databases of cancer imagery.
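The article doesn’t detail Google’s implementation, but the core loop of such a system can be sketched: capture a frame from the microscope’s camera, run it through a trained model, and project the resulting highlights back into the eyepiece. In the minimal sketch below, the model file, input frame, and overlay step are all hypothetical stand-ins:

```python
# Hypothetical sketch of an AR-microscope inference loop. The model file,
# camera frame, and overlay step are illustrative stand-ins, not details
# of Google's actual system.
import numpy as np
import tensorflow as tf

# Assumed: a segmentation model, trained on de-identified pathology
# imagery, that outputs a per-pixel likelihood that tissue is anomalous.
model = tf.keras.models.load_model("pathology_segmentation_model")

def regions_of_interest(frame: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Return a binary mask marking areas that deserve close scrutiny."""
    # Normalize the RGB camera frame and add a batch dimension.
    batch = tf.expand_dims(tf.cast(frame, tf.float32) / 255.0, axis=0)
    heatmap = model.predict(batch)[0, ..., 0]  # (height, width) likelihoods
    return (heatmap > threshold).astype(np.uint8)

# In the device itself, the mask would be rendered as an outline in the
# eyepiece, overlaid on the live view of the slide.
```

The mask only flags where to look; the diagnosis itself remains the pathologist’s, as Daniels explains.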
“What it’s trying to do is help the pathologist, at that moment in time, synthesize data to make a better diagnosis. To be able to process a lot of information in a way that’s very difficult for them to do right now, or very time-consuming,” said Mike Daniels, the vice president for public sector at Google Cloud. The algorithm, working with the augmented-reality microscope, provides something like a second pair of eyes, one that’s better trained to spot certain anomalies, though not necessarily to diagnose them. That’s still the job of the human. “It’s almost like it’s looking at the same thing, literally at the same time at that point of care and then providing information context,” he said.
In a statement provided to Defense One, Google said the effort will use TensorFlow, its open-source AI software library, as well as the Google Cloud Healthcare API, to ingest large amounts of Defense Department medical imagery and strip it of patient-identifying information.
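Google’s statement doesn’t spell out the pipeline, but the Cloud Healthcare API does include a dataset-level de-identification operation that matches the description. The sketch below is an assumption-laden illustration, not the program’s actual configuration; the project, location, and dataset names are placeholders:

```python
# Minimal sketch of de-identifying a DICOM imagery dataset with the
# Google Cloud Healthcare API. Project, location, and dataset IDs are
# placeholders; the pilot's actual configuration is not public.
from googleapiclient import discovery

client = discovery.build("healthcare", "v1")

source = "projects/example-project/locations/us-central1/datasets/raw-imagery"
body = {
    "destinationDataset": (
        "projects/example-project/locations/us-central1/datasets/deidentified"
    ),
    "config": {
        # Keep only a minimal list of DICOM tags and redact any text
        # burned into the images themselves.
        "dicom": {"filterProfile": "MINIMAL_KEEP_LIST_PROFILE"},
        "image": {"textRedactionMode": "REDACT_ALL_TEXT"},
    },
}

# De-identification runs server-side as a long-running operation; poll
# the returned operation name to track completion.
operation = (
    client.projects()
    .locations()
    .datasets()
    .deidentify(sourceDataset=source, body=body)
    .execute()
)
print(operation["name"])
```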
Medical researchers have been talking about the promise of neural networks for spotting cellular anomalies, including cancer, since the early 1990s. The emergence of large medical datasets, cloud-computing capabilities, and new models has made such networks more useful for research and for predicting things like patient survival. But, as a group of Greek researchers observed in a highly cited 2015 paper, “In spite of the claims that these ML classification techniques can result in adequate and effective decision making, very few have actually penetrated the clinical practice.”
The hope is that combining AI and augmented reality will increase the accuracy and throughput of diagnosis, and thus make artificial intelligence relevant not just to researchers but to the doctors actually caring for patients. “There’s an enormous amount of information to synthesize for these pathologists as part and parcel of this disease diagnosis,” said Daniels. He put the contract size at seven figures.
That enormous trove of healthcare data, unique to the Defense Department, also presents a rare opportunity to train new machine-learning tools. While the pilot will be active at only a handful of Defense healthcare sites, the Defense Health System serves 9.6 million beneficiaries, which means a lot of data with which to improve the accuracy of the models.
Indeed, the active-duty and retired troops served by DoD’s healthcare system make it one of the largest in the world, DIU chief medical officer Dr. Niels Olson said in a Defense Department news release last week. “The more data a tool has available to it, the more effective it is. That’s kind of what makes DOD unique. We have a larger pool of information to draw from, so that you can select more diverse cases.”
And that’s all on top of the time and training that Google has already put toward improving models to assist with pathology. “We have spent literally millions upon millions of machine-learning hours in this space to perfect our vision of AI as it relates to it,” said Daniels.