DARPA pursues small biometric sensors for human-machine interface
Biometric and neural data are helping teach machines to adapt to human variability, according to DARPA.
The key to unlocking the next generation of warfighting technology is not ensuring that humans understand machines, but rather developing machines that can understand us. To realize the full potential of artificial intelligence (AI) and autonomy, machines must be able to factor their operators' actions and reactions into their decisions, according to experts from DARPA, the Army Research Laboratory, and the Air Force Research Laboratory speaking on a panel at the 2017 Defense One Tech Summit.
“The ability to sense and understand the state and capabilities of the human is absolutely critical to the successful employment of highly automated systems and autonomous systems,” explained Dr. James Christensen, Portfolio Manager of the Air Force Research Laboratory’s 711th Human Performance Wing. “The speed of the decision cycle with these kinds of systems is going to be so fast that they have to be sensitive to and responsive to the states of the operators and their intent, just as much as the operator’s overt actions and the control inputs they are providing.”
To design algorithms that can comprehend a person’s state, experts must first pinpoint indicators of human mental and physical condition that a machine can learn to identify. This is where biometric sensor data and intensive study of neural pathways become crucial, according to Dr. Justin Brooks, a scientist at the Army Research Lab.
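As a purely illustrative sketch of that idea, a machine-readable estimate of operator state might begin as something like the Python below. The data fields, function name, and threshold values are invented for this example; they do not come from any DARPA or Army program, which would learn such mappings from real training data.

```python
from dataclasses import dataclass

@dataclass
class BiometricSample:
    """One reading from an operator's wearable sensors (all fields illustrative)."""
    heart_rate_bpm: float       # beats per minute
    hrv_rmssd_ms: float         # heart-rate variability (RMSSD, in milliseconds)
    blink_rate_per_min: float   # eye-blink frequency

def estimate_operator_state(sample: BiometricSample) -> str:
    """Map raw biometric indicators to a coarse operator-state label.

    A fielded system would learn these mappings from data such as the
    Human Variability Project's; the fixed thresholds here are placeholders.
    """
    # Elevated heart rate combined with suppressed HRV is a common stress signature.
    if sample.heart_rate_bpm > 110 and sample.hrv_rmssd_ms < 20:
        return "overloaded"
    # An unusually high blink rate is one commonly cited fatigue indicator.
    if sample.blink_rate_per_min > 30:
        return "fatigued"
    return "nominal"

print(estimate_operator_state(BiometricSample(120, 15, 12)))  # -> "overloaded"
```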
The research is primarily being conducted through the Army Research Lab’s Human Variability Project and DARPA’s Neural Engineering System Design (NESD) program.
With the Human Variability Project, the service research labs have “put a lot of effort into constructing a program that’s going to observe humans…with wearable monitors to gather tons of biometric data, to really drill down what the causes of [human] variability are,” explained Brooks. “We want to understand how [humans’] cognitive states, how their physical states, and how their emotional states comingle together [because] we ultimately want to build technology that can adapt to those variations.”
DARPA’s NESD program recently awarded $65 million to a group of six academic, nonprofit, and industry research teams. The teams are each pursuing a different angle of the problem, but with the collective goal of understanding which neurons indicate different human impulses and reactions, and how to target specific neurons in the brain to predict those impulses, according to a DARPA press release.
“The NESD program looks ahead to a future in which advanced neural devices offer improved fidelity, resolution, and precision sensory interface… NESD aims to enable rich two-way communication with the brain at a scale that will help deepen our understanding of that organ’s underlying biology, complexity, and function,” said Phillip Alvelda, the founding Program Manager of NESD.
One obstacle for the program, however, is that cybernetic brain implants, like other electronic devices, tend to produce heat that can damage brain tissue. For this reason, according to DARPA, the Brown University research team is experimenting with tiny, sub-millimeter-sized sensors, called neurograins, that can be placed on a person’s head. Another team is attempting to model the precise neurons that respond to different types of visual and tactile stimuli.
Once enough data has been gathered from both the Human Variability Project’s biometric sensor studies and NESD’s neural research, models of those human-state indicators can be designed and programmed into the next generation of weapons systems and equipment, according to Brooks.
The Air Force Research Lab is already exploring one such next-generation system: a flight system that automatically senses the pilot’s condition and adjusts itself accordingly. The system would use sensors built into the pilot’s flight suit to monitor key human variability indicators, and could adjust altitude and flight path before the pilot even registered distress, according to Christensen.
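A rough sketch of that control loop might look like the following. The StubAutopilot class, its method names, and the state labels are hypothetical placeholders standing in for the flight-control interface and sensor feed, not details of the AFRL design:

```python
class StubAutopilot:
    """Stand-in for a flight-control interface; a real system's API would differ."""
    def hold_altitude(self):
        print("autopilot: holding altitude")
    def simplify_route(self):
        print("autopilot: simplifying flight path")
    def release_to_pilot(self):
        print("autopilot: pilot in full control")

def adaptive_flight_loop(pilot_states, autopilot):
    """React to a stream of pilot-state labels (e.g. from flight-suit
    sensors) before the pilot consciously registers distress."""
    for state in pilot_states:
        if state == "overloaded":
            autopilot.hold_altitude()    # shed pilot workload immediately
            autopilot.simplify_route()
        else:
            autopilot.release_to_pilot()

# Simulated sensor feed: nominal -> overloaded -> nominal.
adaptive_flight_loop(["nominal", "overloaded", "nominal"], StubAutopilot())
```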
“What we would like to do ultimately is combine those bits of computational processing infrastructure that the human cannot do well…with what the human does do well,” explained Brooks. “The idea is actually to…use them in a sort of man-machine interface and exceed the capabilities of either system individually…so we are focused on developing adaptive technologies that aren’t limited by what we think the average human can do, but actually can adapt to people in real-time,” he said.
More research must be done before any of these next-generation human-machine systems can be fielded, but the panelists expressed “cautious optimism” that human-adaptive AI technology could one day become part of the future force.