Virtual Reality Comes to US Military's Mapping Agency
U.S. officials abroad may soon be able to use their smartphone cameras to help swiftly reconstruct a bomb scene for eyes wrapped in Oculus 3-D headgear back in Washington.
One day, a Foreign Service officer in Somalia might be able to use his smartphone camera to help swiftly reconstruct war devastation for eyes wrapped in Oculus 3-D headgear back in Washington.
Virtual reality is not all fun and games. It’s one tool America's spy mapping agency uses to offer analysts a real-world view of landscapes rocked by natural or manmade changes.
Conversations "have just started” between National Geospatial-Intelligence Agency immersion researchers and NGA’s mobile app developers to marry the two technologies, Joe Wileman, a researcher at the agency’s Springfield, Virginia, headquarters, told Nextgov.
Emerging-tech developers there, equipped with instruments like Oculus viewers and Xbox controllers, demonstrate the art of the possible to intelligence and military analysts. In the past, photographers have used a tripod-mounted scanner to capture the data for 3-D panoramic views of foreign territory, such as an outdoor park scene in Ottawa that a reporter was shown.
With a smartphone app, an individual strolling down a street somewhere could snap photographs, and those photographs would be uploaded to a generic building model at NGA.
“We can then take those images and then overlay them on top of the building, so that our model becomes now the actual building,” said an agency researcher who spoke on background.
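To make the idea concrete, here is a minimal sketch, in Python, of how uploaded street photos might be draped onto the faces of a generic box-shaped building model. The class names, the compass-heading matching rule and the sample data are illustrative assumptions for this article, not NGA's actual pipeline.

```python
from dataclasses import dataclass, field

@dataclass
class Photo:
    """A street photo uploaded from a smartphone (hypothetical example)."""
    filename: str
    heading_deg: float  # compass direction the camera was facing when it shot

@dataclass
class BuildingFace:
    """One side of a generic box-shaped building model."""
    name: str
    outward_heading_deg: float          # direction this face looks toward
    textures: list = field(default_factory=list)

def drape_photos(faces, photos):
    """Attach each photo to the face the camera was most likely pointed at,
    i.e. the face whose outward direction is closest to the opposite of the
    camera heading."""
    for photo in photos:
        target = (photo.heading_deg + 180.0) % 360.0
        best = min(
            faces,
            key=lambda f: abs((f.outward_heading_deg - target + 180) % 360 - 180),
        )
        best.textures.append(photo.filename)
    return faces

faces = [BuildingFace("north", 0), BuildingFace("east", 90),
         BuildingFace("south", 180), BuildingFace("west", 270)]
photos = [Photo("img_001.jpg", heading_deg=182.0),   # camera facing south -> north face
          Photo("img_002.jpg", heading_deg=85.0)]    # camera facing east -> west face

for face in drape_photos(faces, photos):
    print(face.name, face.textures)
```

A real system would use each photo's GPS position and full camera pose to warp the image onto the model's geometry; matching on compass heading alone is just the simplest stand-in for that step.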
For example, the next time there’s an earthquake in Nepal, an American embassy employee based there could immerse a Federal Emergency Management Agency decision-maker into the disaster zone, Wileman said.
Or perhaps, "we can't afford to send somebody to Mogadishu [who can] walk around and see what it really looks" like, but "we may have somebody in a consulate who is in there who can take the day and take photos" to transmit back, a researcher said.
Immersing analysts in a simulated theater makes it easier to answer questions like: How do I avoid areas of concern, or keep from being seen by an adversary?
Some mobile device manufacturers have begun offering consumer-grade 3-D-mapping smartphones. NGA might need more time to deploy similar gizmos, because of the Pentagon's unique data security constraints.
Earlier this month, Lenovo unveiled the first-ever smartphone equipped with Google's depth-sensing tech, called Tango. The $499 phone's sensors are reportedly designed to capture more than 250,000 measurements a second.
A professional tripod-mounted scanner goes for about $50,000, with smaller hand-held devices now selling for about $17,000, according to the Wisconsin State Journal.
The trick of the eye created by 3-D cameras is really just a collection of points.
Typically, a light-pulsing "lidar" scanner is used to measure the distance of objects in the surrounding area. A sensor on the tool gauges how long each light beam takes to bounce back. The measurement data is plotted on a grid with an x, y and z coordinate system and projected onto a photograph of the environment, creating a series of points called a 3-D point cloud.
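As a rough illustration of how those time-of-flight returns become coordinates, here is a minimal sketch in Python. The function name, the simulated returns and the simplified geometry (a scanner at the origin measuring azimuth and elevation angles) are assumptions made for illustration, not any vendor's software.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def return_to_point(time_of_flight_s, azimuth_deg, elevation_deg):
    """Convert one lidar return into an (x, y, z) point.

    The pulse travels out to the object and back, so the one-way
    distance is half of (speed of light x travel time).
    """
    distance = SPEED_OF_LIGHT * time_of_flight_s / 2.0
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return (x, y, z)

# A handful of simulated returns: (travel time in seconds, scan angles in degrees).
returns = [
    (8.0e-8, 0.0, 0.0),    # ~12 m straight ahead
    (8.0e-8, 45.0, 0.0),   # ~12 m, 45 degrees to the side
    (1.2e-7, 0.0, 30.0),   # ~18 m, angled upward
]

point_cloud = [return_to_point(t, az, el) for t, az, el in returns]
for x, y, z in point_cloud:
    print(f"x={x:7.2f} m  y={y:7.2f} m  z={z:7.2f} m")
```

Each return works out to a single point; a scanner repeats this measurement hundreds of thousands of times a second to build up the cloud.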
Already, some local law enforcement agencies have found that virtual reality is a huge time saver when documenting a crime scene.
Using a 3-D scanner in 2014, Wisconsin's Dane County Sheriff's Office documented a 2,200-square-foot home where a woman was murdered. It took 37 scans and about 10 hours to capture the house, the Wisconsin State Journal reported. A team of investigators collecting evidence the old-fashioned way, with a tape measure, pencil and paper, would have needed two to four days to get the job done.
Drones Can Map in 3-D, Too
There also is talk at NGA of flying a drone over, for example, a hurricane-ravaged community to survey the devastation and produce VR imagery for response planning. (FEMA would have to request NGA's services, as the intelligence community is barred from flying unmanned surveillance aircraft over the United States.)
Another area for experimentation is the use of "augmented reality"—think Google Glass—to position geospatial data on top of a soldier's field of view in the real world.
One could argue that a predecessor of augmented reality is the flight cockpit, NGA Deputy Director Sue Gordon said during an interview at a Feb. 25 Esri conference.
"It’s pretty fun for us to work with a technology that is so central to a geographic reference," but "it still has some distance to travel" in terms of being ready for operational use, she said.
Because AR would change the warfighter's perception during battle, more sensory and safety research needs to be done, Gordon told Nextgov. The brain is accustomed to reading maps, where distances are scaled far down from the corresponding distances on the ground.
"When you put augmented reality on your face, it typically makes everything one-to-one," she said. "We’re still learning how the human perceives that different view."