DARPA takes autopilot to a new level
The Aircrew Labor In-Cockpit Automation System features a customizable, drop-in kit that would allow advanced automation to be easily added to existing aircraft.
The Defense Department's investment in artificial intelligence and battlefield autonomy starts in the lab, but it won't amount to much if it stays there.
The Defense Advanced Research Projects Agency's Aircrew Labor In-Cockpit Automation System (ALIAS) project is expected to complete its first fly-by-wire experiment in May or June, with a demo to follow later this fall.
The technology aims to improve flight safety and performance, reduce pilots' cognitive load and cut the number of onboard crew members. It features a customizable, drop-in, removable kit that would allow advanced automation to be easily added to existing aircraft. ALIAS' sense-and-avoid capabilities were tested in 2016 with a Cessna 172G aircraft approaching an unmanned aerial system from multiple angles. The project was a GCN DIG IT award finalist that year.
"This is all a part of operationalizing autonomy," Lt. Col. Philip Root, acting deputy director of the Tactical Technology Office, said. "Those words really mean something to me."
Fly-by-wire isn't the capability itself but the mechanism needed for autonomy, Root said. It will also be a first for Army aviation.
"We need the fly-by-wire to add a computer in the middle that helps and augments a human," he said. "Once we prove that works, now we can begin adding the autonomy flight controls -- operating in the background like a lane assist [feature in cars that helps] the human operator avoid a tree."
ALIAS expects to do its first zero-pilot test in early 2020 with an unmanned Black Hawk helicopter. If successful, the initiative would allow the military to make better use of pilots' time. It would also let the military get more use out of aircraft by flying them on low-risk missions while pilots rested.
A single-pilot test has been the most challenging to pull off, according to Root. There's no feedback loop for the pilot because the co-pilot is a relatively invisible, silent machine.
"You now have a co-pilot that's not there, and [pilots] don't know how to rely on someone who's not there," he said. "How does a machine have the same contextual understanding of when to talk, so to speak, and what information is relevant? And how does the human pilot know when to trust the autonomy?"
The situation can be likened to adaptive cruise control that causes a car to automatically slow down when approaching another moving vehicle but doesn't slow down when the driver thinks it should.
ALIAS is working on this issue, even though Root admitted it might not get it right. "We could absolutely do this wrong. We could have an autonomous co-pilot that's supposed to allow the human pilot to do other things, and in actuality we reduce the effectiveness because the guy or gal is so concerned that they never relinquish control."
But despite humans' natural distrust of machines, ALIAS aims to bridge that gap by having pilots train with the technology and become familiar with it, much as they would with a new iPhone.
"Pilots, operators, Marines trust those things that work," Root said.
"Trust is two parts: You have to believe the system can actually deliver, and you have to see it deliver routinely. Those two things are separate," he said. "We can provide [the] ability to trust a machine if we develop it from the ground up to foster that trust."
Whether that trust is earned remains to be seen.
More tests are still needed, Root said. "Let's talk in a year, and I'll tell you exactly how it worked out."
This article was first posted to FCW, a sibling site to Defense Systems.