Advanced sensors give soldiers best battlefield picture
High-tech sensors on the battlefield are cutting through the situational awareness limitations that confront vehicle-mounted soldiers. In a time when much of the military operates in chaotic urban settings – with their potential for roadside bombs, blocked fields of view and snipers – gaining an electronic edge in situational awareness has become more important than ever before.
Operating inside a vehicle, especially a heavily shielded one such as an M-1 Abrams, a Stryker, or the newer Mine Resistant Ambush Protected vehicle, has always involved trade-offs between protection and perception.
“Just imagine driving your car and saying that you’re going to black out your side windows and your back window,” said Army Lt. Col. Edward Stawowczyk, product manager of Forward Looking Infrared at Fort Belvoir, Va. “It is challenging when you’re riding inside of a closed-in vehicle and not having situational awareness of what’s going on right outside your vehicle,” he said.
Added factors such as noise and the maze-like dimensions of urban settings make the problem more apparent. Rural settings aren’t exempt – weather or the absence of natural light can also slow maneuverability there.
Sensors have a history of giving U.S. troops a strategic edge. Iraqi invaders in Kuwait during the Gulf War were shocked by the United States forces' night vision capabilities. U.S. tanks participating in the drive to Baghdad in 2003 were able to push through a sandstorm, although at a reduced pace. Even then, military researchers didn’t believe they had time to rest on their laurels, and the challenges of an asymmetric battlefield have only reinforced that view.
Sensors of today
The Army defines situational awareness as the ability to generate knowledge through the use of timely and accurate information that warfighters can use to make informed decisions on the battlefield.
Ask Stawowczyk about his favorite sensor and he’ll point to the Long Range Advanced Scout Surveillance System (LRAS3).
A long-range reconnaissance and surveillance sensor, LRAS3 has day and thermal night vision. Thermal vision works by capturing the heat differences between objects and using them to create an image. “It almost looks like you’re looking through a black-and-white TV set, almost that resolution, with green, instead of white,” Stawowczyk said. About 1,600 units are in use, he said.
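The principle is simple enough to sketch. The fragment below is a minimal illustration of the idea, not anything resembling the LRAS3 processing chain: it maps a grid of apparent temperatures to display intensities so that warmer objects stand out against a cooler background.

```python
import numpy as np

def thermal_to_image(temps_c: np.ndarray) -> np.ndarray:
    """Map a grid of apparent temperatures (deg C) to 8-bit display intensities.

    A toy illustration of thermal imaging: contrast comes from heat
    differences between objects, not from visible light.
    """
    t_min, t_max = temps_c.min(), temps_c.max()
    if t_max == t_min:
        return np.zeros_like(temps_c, dtype=np.uint8)  # flat scene, no contrast
    # Warmer objects render brighter; the actual display tints the result green.
    scaled = (temps_c - t_min) / (t_max - t_min)
    return (scaled * 255).astype(np.uint8)

# Example: a 30 C "person" against a 15 C background stands out clearly.
scene = np.full((4, 4), 15.0)
scene[1:3, 1:3] = 30.0
print(thermal_to_image(scene))
```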
The long-range part of LRAS3 is extremely important, he said. It’s a truism that whoever sees the enemy first kills first -- or makes an intelligent decision not to kill. On a battlefield where the enemy doesn’t obviously identify himself and mixes with civilians, that is vital.
In an article in the September/October 2005 Defense AT&L magazine, infantry scouts praised LRAS3. “Now, when we use it, we can tell the difference between a man planting a bomb and children playing several miles away on a moonless night,” one scout said.
Feedback from the field is uniformly positive, military sources say. “The question I get on LRAS3 is, ‘How do I get more,’” Stawowczyk said.
The system interfaces with the Force XXI Battle Command, Brigade-and-Below command and control system, whose most renowned component is the Blue Force Tracking satellite tracking system. Blue Force Tracking isn’t networked directly into a targeting system, but that should change in the next 18 months, Stawowczyk said. “In other words, we’ll be able to cue to shoot,” he said.
Meanwhile, LRAS3 isn’t the only high-tech targeting device on the battlefield. Acoustic sensors used to detect the locations of snipers have appeared in the last few years, most notably in the Army as a system named Boomerang. In July, the Army announced a $74 million procurement of 8,131 more units from the Cambridge, Mass.-based research and development company BBN Technologies.
Boomerang is essentially a clutch of microphones arranged like flower petals on a stick; the system detects the sound of a shot and notifies soldiers of an incoming projectile. A loudspeaker announces the projectile's general origin in clock terms – two o’clock, three o’clock and so forth.
“The speaker is really, really loud,” said Lt. Col. Terrence Howard, who oversees Boomerang as the product manager for robotics and unmanned sensors at the intelligence, electronic warfare and sensors program executive office at Fort Monmouth, N.J. Boomerang comes in mounted and dismounted versions.
The system’s sensitivity can be tweaked depending on the setting where soldiers are operating, Howard said. The system creates a bubble with the microphones in the middle listening for projectiles that penetrate its circumference. The bubble moves at the same velocity as a moving vehicle, but Howard declined to say whether it has a speed limit, citing operational concerns.
Soldiers can change the size of that bubble according to their environment. In a rural area, the bubble might be wider; in an urban setting it might become smaller. The system can also distinguish hostile fire from the vertical gunfire that trigger-happy locals might unleash in celebration, Howard said. “If you’re shooting straight up in the air, you’re not going to penetrate the bubble,” he said.
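The reporting logic can be sketched in a few lines. The thresholds and function names below are hypothetical, used only to illustrate how a bearing might be turned into a clock call and how near-vertical celebratory fire, or rounds that never penetrate the bubble, could be screened out.

```python
def clock_direction(azimuth_deg):
    """Convert a bearing relative to the vehicle's nose into clock terms."""
    hour = round((azimuth_deg % 360) / 30) or 12  # 0/360 degrees maps to 12 o'clock
    return f"{hour} o'clock"

def report_shot(azimuth_deg, elevation_deg, miss_distance_m, bubble_radius_m):
    """Decide whether a detected shot warrants an alert.

    Illustrative thresholds only: near-vertical fire (celebratory shooting)
    and rounds passing outside the bubble are ignored.
    """
    if elevation_deg > 70:                 # fired almost straight up
        return None
    if miss_distance_m > bubble_radius_m:  # never penetrated the bubble
        return None
    return f"Incoming, {clock_direction(azimuth_deg)}"

print(report_shot(azimuth_deg=85, elevation_deg=5,
                  miss_distance_m=12, bubble_radius_m=30))  # Incoming, 3 o'clock
print(report_shot(azimuth_deg=10, elevation_deg=85,
                  miss_distance_m=2, bubble_radius_m=30))   # None: celebratory fire
```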
Vehicle operators today use another system named Driver’s Vision Enhancement. Stawowczyk said that, compared with LRAS3, DVE is a lower-end thermal sensor. It gives a forward field of view of 40 degrees horizontally and 30 degrees vertically, split evenly above and below the horizontal plane, though drivers can pan and tilt the sensor. The display “looks a lot like a laptop display that folds down much like the sun visor in your car,” he said.
The DVE’s field of vision is restricted compared with ordinary human eyesight, which with peripheral vision can extend to 175 degrees, Stawowczyk said. “It’s restricted, yes, but what is total darkness?” he asked.
Driving a vehicle by looking through a sensor is challenging, but that doesn’t come close to the challenges faced by remote operators of unmanned aerial systems, Stawowczyk said. Military studies have shown that most UAS crashes are attributable to human factors, and UAS operators are far more isolated from their platform than vehicle operators are, he said. Even a driver encased in a steel-plated box will feel the vehicle banking one direction or another and feel the slope of the terrain.
“With the training that’s provided, I think that you overcome some of the differences between driving with the naked eye and driving with a sensor,” he added. However, it’s not easy.
“There’s always been some issues that someone is driving the vehicle with a night-vision sensor on the front and they drive the vehicle into a ditch,” said David Randall, deputy division director of ground combat systems in the Army's Night Vision and Electronic Sensors Directorate.
Sensors of tomorrow
A U.S. tank rumbling down a city street is a commanding sight. But for insurgents, it’s still a target, despite all the armor and sensors bolted onto its hull. The tank also has a weak spot – the commander’s upper body, sticking up through the hatch as he monitors the environment. Yet that’s still the best way to gain situational awareness.
As a result of that weak spot, the Army has efforts under way to develop a 360-degree sensor system that would envelop vehicles in continuous, electronically generated situational awareness.
One such system could be an extension of DVE. The Army is evaluating whether to place DVE-like sensors onto vehicles in 60-degree increments, Stawowczyk said. However, the possible system, named the Close Surveillance Support System (CS3), wouldn’t provide an unobstructed 360-degree view. Rather, users would toggle between different sensor views: while the driver concentrates on the forward view, the commander could toggle to look at the aft view.
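In practice, toggling could amount to little more than stepping around a ring of fixed views. The sketch below assumes six DVE-like sensors at 60-degree increments; the view names are invented for illustration.

```python
# Hypothetical ring of six fixed sensor views at 60-degree increments.
VIEWS = ["front", "front-right", "rear-right", "rear", "rear-left", "front-left"]

def next_view(current, step=1):
    """Cycle to an adjacent sensor view. The commander can step aft
    while the driver stays on the forward sensor."""
    i = VIEWS.index(current)
    return VIEWS[(i + step) % len(VIEWS)]

print(next_view("front"))           # front-right
print(next_view("front", step=3))   # rear
```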
Like every new technology, the system would require some trade-offs. It would take up more space on the vehicle and add to its energy requirements, because DVE is powered through the vehicle’s internal electrical system.
“These are real issues for the platform program manager,” Stawowczyk said, adding that limited user-testing is under way.
Meanwhile, the holy grail of sensor situational awareness would be total 360-degree vision – a capability Stawowczyk said would be more sophisticated than CS3 but might also take some time to field.
That’s the goal of an ongoing research effort for a system named the Distributed Aperture System (DAS). After five years of funding, from fiscal 2003 to fiscal 2007, program managers sent a prototype for two weeks of testing in June at the Maneuver Battle Labs at Fort Benning, Ga. As a funded R&D effort, the program ended in fiscal 2007, but “the hardware is still being used as a research development tool,” Randall said. The prototype consisted of multispectral sensors, each covering a 21-degree slice of the 360-degree circle, with about 3 degrees of overlap between adjacent units. The total field of view extended 90 degrees above the horizontal plane and 20 degrees below it.
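Those figures don’t state how many sensors ring the vehicle, but they imply a rough count: if each unit covers 21 degrees and shares about 3 degrees with a neighbor, closing the circle takes about 20 units, assuming uniform spacing. A quick back-of-the-envelope check:

```python
import math

FIELD_DEG = 360     # full horizontal coverage sought
SLICE_DEG = 21      # per-sensor horizontal field of view
OVERLAP_DEG = 3     # overlap shared with the neighboring unit

# Each unit contributes its slice minus the shared overlap of new coverage,
# so the quoted figures imply roughly this many sensors around the hull.
unique_per_sensor = SLICE_DEG - OVERLAP_DEG
print(math.ceil(FIELD_DEG / unique_per_sensor))   # -> 20
```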
“The real technical challenge was to come up with an affordable system that had a limited number of sensors – more sensors takes more money and takes up limited real estate on the vehicle,” Randall said.
Fusing pictures from multiple sensors into a seamless whole presents many difficulties, such as differing magnification rates and parallax. The latter is the phenomenon of background objects appearing to shift position relative to foreground objects when the same scene is viewed from sensors in different locations. (You can see it by closing one eye and then the other.)
Randall said engineers minimized the problem by allowing only a 3-degree overlap between sensors. Computer processing algorithms then correct for the distortion, not by forcing one sensor’s perspective onto another but by splitting the difference between the two.
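A toy version of that meet-in-the-middle correction is easy to sketch. The snippet below is a simplification that assumes the two seam strips are the same size and already roughly registered; it ramps the weighting from one sensor to the other across the overlap rather than imposing either perspective on the other.

```python
import numpy as np

def blend_overlap(left_strip: np.ndarray, right_strip: np.ndarray) -> np.ndarray:
    """Merge the overlapping columns seen by two adjacent sensors.

    Each pixel in the seam is a weighted average that ramps from the
    left sensor's view to the right sensor's view.
    """
    cols = left_strip.shape[1]
    weights = np.linspace(1.0, 0.0, cols)   # 1 at the left edge, 0 at the right
    return left_strip * weights + right_strip * (1.0 - weights)

# Example: a 3-degree overlap rendered as a 3-column strip from each sensor.
left = np.full((2, 3), 100.0)   # left sensor sees the seam brighter
right = np.full((2, 3), 60.0)   # right sensor sees it darker (parallax, exposure)
print(blend_overlap(left, right))   # columns ramp 100 -> 80 -> 60
```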
However, the distance at which parallax correction is performed depends on the situational awareness needs of the vehicle crew, said Dan Plemons, an engineering fellow with BAE Systems' sensor systems. BAE was the DAS prime contractor. “We recognized from early on that we couldn’t come up with the magic system that worked everywhere,” he said. BAE decided to target the system for close-in situational awareness, he added.
The amount of processing power it takes to fuse multiple sensor inputs into a continuous whole points to another technical challenge – removing latency from the display picture. Even the smallest lag between what the display shows and the centrifugal forces an operator feels when making a turn can cause nausea. Overall latency can’t be more than 80 milliseconds, Randall said. “If you have this sensation of a turn, but your camera hasn’t told you that you’ve done that yet, it’s like getting seasick,” Plemons said. It also leads to operator overcompensation on the vehicle controls.
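That ceiling is easiest to think of as a budget every processing stage draws against. The stage names and timings below are illustrative assumptions; only the 80-millisecond limit comes from Randall.

```python
# Hypothetical stage timings; only the 80 ms ceiling comes from the article.
LATENCY_BUDGET_MS = 80

stage_latency_ms = {
    "sensor integration": 16,          # roughly one frame at 60 Hz
    "image fusion": 25,
    "display refresh": 16,
    "transport and processing slack": 20,
}

total = sum(stage_latency_ms.values())
print(f"total {total} ms, budget {LATENCY_BUDGET_MS} ms, "
      f"{'OK' if total <= LATENCY_BUDGET_MS else 'over budget'}")
```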
However, the operational test in June at Fort Benning produced no seasick soldiers. Test drives pitting DAS against the DVE showed improved results during day and night conditions, Maneuver Battle Labs officials said.
Soldiers could particularly see DAS' benefit when backing up an armored vehicle, said John Bennett, future branch project officer. During combat, soldiers aren’t too concerned about running over objects behind them, but otherwise, procedure requires a soldier to get outside of the vehicle to direct the driver – or at a minimum, stick his head out of the hatch.
“You may have an insurgent mixed in with the crowd, just waiting to take a pop shot at you,” said Darrell Barden, future branch deputy chief.
One feature soldiers felt strongly was needed was a way to quickly dim the lighted display panels, battle lab officials said. Without dimming, popping the main hatch with the DAS system on was like uncapping a flashlight. They also said the controls needed a redesign. “It being a prototype, if you hit a bump in the road, took a sharp curve, it caused the controllers to move – made it difficult sometimes for the operators to control the system,” Barden said. Depth perception was also a problem at times.
Overall, however, the system enhanced situational awareness, Maneuver Battle Labs officials said, and Randall expects a DAS-like system to become operational in the near future.