Ethics vs. compliance in AI
Tech experts warn against treating ethics as just another box on a compliance checklist.
The Defense Department is focused on implementing its ethics principles for artificial intelligence, especially when it comes to health-related data. But tech experts warn against treating ethics as just another compliance checklist.
Jane Pinelis, who leads test, evaluation, and assessment for the DOD's Joint Artificial Intelligence Center, said preserving personal health information is one of the JAIC's biggest priorities.
"On the health side, one of the biggest things that we're concerned about is the preservation of personal health information," Pinelis said during an Oct. 22 Defense One NextGov event on AI. "On something else, we might be worried about equitability and bias, how do we train these models, what kind of data do we use in training them, and what does that mean about future applications."
The JAIC announced progress with its Predictive Health effort on Oct. 21, which aims to reduce the time it takes to diagnose cancer. The project produced an augmented reality microscope supported by AI algorithms to help detect metastatic breast cancer cells on digital images. The device is slated for use at Brooke Army Medical Center in San Antonio, Texas, and other military treatment facilities.
DOD adopted AI principles in February and has been working on a plan to implement them, an effort led by its head of ethics policy, Alka Patel. The JAIC's ethics team has been looking to expand its role as it evaluates existing policy, and launched a Responsible AI Champions pilot, a six-week intensive study course for personnel that could be expanded across the department.
Pinelis said that applies especially to personally identifiable information and the requirements around it: "The question is how can we tie it to some of the work that we're doing and can we automate a lot of it so that it's easier."
Project Salus, the JAIC's COVID-19 response effort, used predictive modeling to anticipate first responders' equipment needs. Pinelis said it helped establish a mindset of addressing ethics and testing from the start, particularly as privacy concerns grew once the team began working with "zip code level" data.
But when it comes to responsible technology use, Heather Roff, a senior research analyst at the Johns Hopkins University Applied Physics Laboratory, warned that viewing it as compliance could be detrimental.
"Ethics is not compliance and if we think about ethics as compliance then we are failing as moral agents," Roff said during the panel.
"If you think about ethics as compliance or compliance officers or inspectors general then you are actually getting the bare minimum of what ethics is. Ethics is about…how to think about doing things responsibility. What am I thinking about when I build it" and not "is the compliance officer watching."
Roff also said basic research and testing were integral to the ethics conversation.
"Basic research is just not funded to the levels it needs to be funded," Roff said. "We need more basic research and funding for things like testing and evaluation...because testing supports the principles."
This article first appeared on FCW, a Defense Systems partner site.