Special Ops wants to correct for parallax in optical systems
The command is holding a challenge to find an algorithm that can compensate, in near-real time, for different views from offset optical sensors.
The U.S. Special Operations Command is developing what it calls a “multi-spectral visual system,” which involves combining visual feeds from two sources. But that also creates a problem familiar to anyone who has ever closed one eye while focusing on an object, then switched eyes: the view changes depending on which eye you’re using.
It’s known as the parallax effect, and the command is holding a prize challenge to find an algorithm that can correct for parallax in near real time.
With the Algorithm for Real-Time Parallax Correction challenge, the command is looking for a system that can predict what the view would be for other, offset optical sensors being combined to create one view, in order to compensate for their different positions.
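The command has not published an algorithm, but the underlying geometry is standard stereo vision: for two offset pinhole cameras, a point's apparent position shifts between views by a disparity proportional to the sensor baseline and inversely proportional to the point's depth. The sketch below illustrates that relationship; the baseline, focal length, and depth values are illustrative assumptions, not parameters from the challenge.

```python
# Minimal sketch of parallax compensation for two horizontally offset
# pinhole cameras. All parameters (baseline, focal length, depth) are
# illustrative assumptions -- the challenge specifies no algorithm.

def disparity_px(baseline_m: float, focal_px: float, depth_m: float) -> float:
    """Horizontal pixel shift between the two sensors' views of a point
    at a given depth: d = f * B / Z (standard stereo geometry)."""
    return focal_px * baseline_m / depth_m

def reproject_x(x_px: float, baseline_m: float, focal_px: float,
                depth_m: float) -> float:
    """Predict where a pixel seen at x_px by the left sensor would appear
    in the right sensor's image, given an estimated depth."""
    return x_px - disparity_px(baseline_m, focal_px, depth_m)

if __name__ == "__main__":
    # A target 10 m away, sensors 6 cm apart, focal length of 800 px:
    print(round(disparity_px(0.06, 800.0, 10.0), 2))
    print(round(reproject_x(400.0, 0.06, 800.0, 10.0), 2))
```

Note how the shift shrinks with distance: nearby objects are displaced by many pixels while distant ones barely move, which is why a single fixed offset cannot align two feeds and a per-pixel, depth-aware correction is needed.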
The challenge will be held in two phases. The first invites written proposals that should include an algorithm, an explanation of how it works and any supporting data. Awardees from Phase 1 will be invited into Phase 2, which will involve on-site field tests of the algorithms.
This isn’t a big-money challenge, with a total of $50,000 in prizes available, which suggests that the technology likely already exists and the command is looking for the best available solution. But it would solve a potential problem for that visual system—as with human eyes, having more than one input can provide depth perception, but only if the two views can be reconciled with each other.
It also reflects ongoing efforts within the military to combine various forms of data from multiple inputs into a unified view.