Published: March 2019
At present, the JTAC builds up their situational awareness (SA) picture and coordinates air operations largely through voice communications over a radio link. The work carried out as part of AMS DE-RISC demonstrates how this voice-only approach can be upgraded to an enhanced solution that allows automated, digital exchange of information between the various actors in an operation. This digital exchange of data is faster and less error-prone, and allows a situational awareness picture to be presented to the JTAC through multiple interfaces, augmenting the awareness the JTAC has already built up.
The key to the JTAC solution is the interface between the JTAC and the elements that facilitate the efficient planning and execution of the close air support (CAS) mission. This interface is provided through two mechanisms: the first is the JTAC software running on a small, powerful and lightweight processing device known as an ‘end user device’ (EUD); the second is the augmented reality (AR) glasses developed by BAE Systems ES, which give the JTAC an intuitive interface between the digital, data-driven network and the real world. How both interfaces have been implemented is discussed in section 3 of this document; together they form the core of the JTAC solution developed as part of WP5.
AR Glasses: A binocular see-through display for augmented reality imagery, using a wide field-of-view (FoV) colour display for immersive viewing. The display is collimated so that imagery and symbology are focused at infinity, allowing easy fusion with the viewed scene and avoiding the eye strain of re-focusing between the outside world and the symbology. The glasses include head orientation and position tracking so that imagery can be rendered conformally to the real world. They are daylight readable, lightweight and comfortable for prolonged use.
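Conformal imagery means a symbol stays anchored to its real-world position as the wearer's head moves. As a minimal sketch of the idea (the actual BAE Systems tracking and rendering pipeline is not described here, and the function name and FoV figures below are illustrative assumptions), the tracked head pose can be used to compute the angular offset of a geolocated target from the display centre:

```python
import math


def conformal_offset(head_yaw, head_pitch, tgt_bearing, tgt_elevation,
                     fov_h=40.0, fov_v=30.0):
    """Angular offset of a target from the display centre, in degrees.

    All angles are in degrees; yaw and bearing are measured clockwise
    from north. Returns (dx, dy) offsets used to place the symbol, or
    None when the target falls outside the assumed field of view.
    """
    # Wrap the azimuth difference into [-180, 180) so a target just
    # west of north is a small negative offset, not ~+359 degrees.
    dx = (tgt_bearing - head_yaw + 180.0) % 360.0 - 180.0
    dy = tgt_elevation - head_pitch
    if abs(dx) > fov_h / 2.0 or abs(dy) > fov_v / 2.0:
        return None  # target not currently visible in the glasses
    return dx, dy
```

Re-running this each frame with the latest head pose is what keeps the symbology fused with the outside scene; the FoV values here are placeholders, not the specification of the BAE Systems glasses.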
JTAC Software / Web Features Service (WFS): Provides improved effectiveness through intuitive interfaces that allow a clear situational awareness picture to be built up. Automatic data population improves both accuracy and speed. The human-machine interface displays the current situational picture, with transparent, real-time digital interactions that reduce cognitive workload. The overall approach is task driven, providing a natural flow through the mission planning steps.
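A feature service of this kind is typically queried with an OGC-style WFS GetFeature request for the features inside an area of interest, which is how the SA picture can be populated automatically rather than by voice. The endpoint, layer name and output format below are illustrative assumptions, not details of the AMS DE-RISC implementation; the sketch simply builds such a request URL:

```python
from urllib.parse import urlencode


def wfs_getfeature_url(base_url, type_name, bbox, srs="EPSG:4326"):
    """Build a WFS 2.0 GetFeature URL for features inside a bounding box.

    bbox is (min_x, min_y, max_x, max_y) in the given spatial reference
    system. JSON output is a common server option, not a guarantee.
    """
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,  # hypothetical layer name supplied by caller
        "bbox": ",".join(str(v) for v in bbox) + "," + srs,
        "outputFormat": "application/json",
    }
    return base_url + "?" + urlencode(params)


# Hypothetical query: tracks layer over a 1-degree cell.
url = wfs_getfeature_url("https://example.local/wfs", "ops:tracks",
                         (-1.0, 51.0, 0.0, 52.0))
```

Fetching that URL on the EUD would return the features as structured data, which the JTAC software can render directly into the situational picture.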