An Interview With Red 6 CEO Daniel Robinson

Red 6 Aerospace CEO Dan Robinson. Source: Red 6 / Credit: Mike Killian

MS&T’s Andy Fawkes and Dim Jones had the opportunity to chat with Daniel Robinson, Co-Founder and CEO of augmented-reality company Red 6, a former RAF Tornado F3 pilot and the first non-US pilot to fly the Lockheed Martin F-22 Raptor operationally.

One of the recurrent themes of military flight training discussions has been the mix of live and synthetic training, in formal flying training and particularly in operational continuation training on the front line – the Live/Synthetic Balance (LSB). Many factors, including cost, environmental considerations, operational security and airspace and asset availability, militate for the use of synthetic training equipment (STE), from simple desktop devices to full-mission simulators (FMSs). The spiralling costs of flying 5th-generation aircraft have prompted some air forces operating, or expecting to operate, the F-35 to plan on an LSB as extreme as 10% live to 90% synthetic.

However, there is a growing conviction that there are some aspects of live flying which are essential to the operational effectiveness of aircrew – the ‘fear factor’, high G and physical stress and, crucially, the ability to make decisions under these conditions – which just cannot be replicated in a simulator. The aim, then, is to make this necessary live flying as cost-effective as possible; the problem is that it is extremely difficult, and certainly prohibitively expensive, to provide friendly and adversary aircraft in the numbers and with the capabilities required for realistic training – a shortfall in both ‘scale’ and ‘relevance’.

Simulators use virtual reality (VR) to replicate both the performance of one’s own aircraft and that of a simulated adversary – a virtual entity superimposed on a virtual world. Augmented reality (AR) superimposes a synthetic adversary, or friendly aircraft, on the real world. The advantages of this are obvious in terms of cost: there is no need for an expensive adversary aircraft and, if the technology can also be used to replicate friendly aircraft – wingmen, tanker aircraft, etc – there is no need for them either. This not only conserves the airframe life of the real aircraft that would otherwise fly those friendly roles, but also allows the virtual adversary to simulate any aircraft, thus providing peer or near-peer opposition which would not be possible in the real world. Moreover, in a visual fight against a live adversary, flight safety rules would prohibit some tactics or manoeuvres which might be most effective on the day, resulting in negative training; an AR adversary imposes no such limitations. A peripheral but nonetheless crucial consideration is that front-line pilots would no longer be degrading their limited live flying time, and using up valuable airframe life, by playing ‘Red Air’ instead of carrying out their primary roles.

The problem is that the image technology which allows pure virtual simulation of a visual fight does not work in the real world, due principally to two challenges: tracking and display. While a synthetic entity can be accurately presented in an aircraft’s real or emulated sensors (eg, radar), as in live/virtual/constructive (LVC) systems, it cannot be replicated in a visual engagement situation.

Dan Robinson and Red 6 believe that they are on the way to solving this problem with their Advanced Tactical Augmented Reality System (ATARS); what’s more, they have convinced some big players, such as Boeing, Lockheed Martin, BAE Systems and Korea Aerospace Industries (KAI), that the possibilities are worth exploring.

High-intensity aerial warfare in the future will demand constant training, embracing both scale and relevance. As Robinson observes: “The best training available for most pilots is Red Flag. We need to be getting that quality of training every time we get airborne”. He believes ATARS, an adaptation of LVC, is the way to do it.

The most demanding environment is a visual close-range engagement between high-performance aircraft, and realising this capability is where most of the focus has been directed thus far; however, this is by no means the only discipline in which AR could have utility. Nor is it the case that 5th-gen fighters and those of the next generation have such capable sensors and such long-range and reliable weapons that they will never get into a visual fight. Weapons or sensor failure, missile expenditure and enemy action such as jamming can all conspire to generate a close-range engagement, and all pilots need to train for it.

So How Does Red 6 ATARS Work?

There are three elements in the equation:

  • Generating the virtual entity and tracking it; 
  • Displaying it to the pilot; and 
  • Transferring data between aircraft or between aircraft and simulator as appropriate. 

Tracking in a virtual environment can be achieved by placing sensors outside the simulator, but that approach does not work outdoors. Then there are the challenges of tracking not in a 1G/2D static environment, but dynamically in 3D, at high G, at 400 knots and upside-down. Moreover, the AI which generates the entity needs to know when not to draw – ie, when the target would not be visible to the pilot due to obstructions.
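To give a flavour of that ‘when not to draw’ decision, the fragment below is a deliberately simplified visibility check of our own devising (the function name, field-of-view and masking angles are all invented, and this is not Red 6’s occlusion logic), but it captures the kind of test the system has to run many times a second:

```python
# A deliberately simplified "should we draw it?" check. Real occlusion logic
# would use head-tracked poses and precise canopy/airframe geometry; the
# field-of-view and masking angles here are invented for illustration.
import numpy as np

def should_render(target_pos, head_pos, head_forward, fov_deg=100.0,
                  airframe_mask_deg=120.0):
    """Return True if a virtual target should be drawn for this pilot."""
    to_target = np.asarray(target_pos, float) - np.asarray(head_pos, float)
    dist = np.linalg.norm(to_target)
    if dist < 1e-6:
        return False  # degenerate geometry: co-located points
    cos_angle = np.dot(to_target / dist, np.asarray(head_forward, float))
    off_boresight = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    in_fov = off_boresight <= fov_deg / 2.0            # inside the display FoV
    not_masked = off_boresight <= airframe_mask_deg    # not hidden by the airframe
    return in_fov and not_masked

# Target 2 km away and about 30 degrees off the nose: drawn.
print(should_render([2000.0, 1150.0, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]))
```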

These objectives are achieved by SLAM – Simultaneous Localisation and Mapping – through a mix of optical and inertial sensors. As regards display, even the best VR headsets don’t work outdoors, for reasons of brightness, resolution and field of view (FoV), the requirements of which militate for some sort of helmet-mounted display (HMD); however, existing HMDs are nowhere near the standard required for ATARS. Red 6 don’t build helmets; they provide optical solutions, and ATARS is designed to be helmet-agnostic.
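The blending of optical and inertial sensing can be sketched generically. The toy filter below is emphatically not Red 6’s SLAM pipeline (a real visual-inertial system estimates full 6-DoF pose and builds a map), but it shows the underlying idea of letting occasional optical fixes rein in the drift of a high-rate inertial estimate; the sample rates, sensor bias and blending gain are our own inventions:

```python
# A toy complementary filter blending high-rate inertial dead-reckoning with
# lower-rate optical position fixes. Real visual-inertial SLAM estimates full
# 6-DoF pose and builds a map; the rates, bias and gain here are invented.
import numpy as np

def fuse_step(position, velocity, accel, dt, optical_fix=None, gain=0.2):
    """Advance a position estimate by one IMU sample, correcting with a fix."""
    velocity = velocity + accel * dt          # inertial propagation (drifts)
    position = position + velocity * dt
    if optical_fix is not None:               # camera-derived correction
        position = (1.0 - gain) * position + gain * np.asarray(optical_fix, float)
    return position, velocity

pos, vel = np.zeros(3), np.array([206.0, 0.0, 0.0])   # ~400 kt along x
bias = np.array([0.3, 0.0, 0.0])                      # uncorrected accelerometer bias
for step in range(500):                               # 5 s of 100 Hz IMU data
    t = (step + 1) * 0.01
    fix = np.array([206.0 * t, 0.0, 0.0]) if step % 10 == 9 else None  # 10 Hz camera
    pos, vel = fuse_step(pos, vel, bias, 0.01, optical_fix=fix)
print(pos[0] - 206.0 * 5.0)   # residual error, versus ~3.75 m of uncorrected drift
```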

The threat entity can replicate anything you want – a Chinese Chengdu J-20, for instance, or a Russian Sukhoi Su-57 – and the image is rendered in the real aircraft, generated by an on-board computer, driven by AI algorithms and refined in the HMD to reduce latency. It can work independently in an aircraft which does not have its own sensors, or integrated into the host aircraft systems, be they real or emulated, so that it appears in the appropriate sensors, much as happens in LVC now. Other threats, such as SAMs, can also be injected, as can visual effects such as missile trails, tracer fire and explosions.
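The latency point can be illustrated with one common, generic technique: extrapolating an entity’s last known state forward by the expected display delay before it is drawn. The sketch below is not a description of the ATARS renderer, and its numbers are invented:

```python
# Illustration of a common latency-hiding technique: extrapolate the last
# known entity state forward by the expected render/display delay before
# drawing it. The figures are invented; this is not Red 6's renderer.
import numpy as np

def predict_position(position, velocity, acceleration, latency_s):
    """Extrapolate an entity's position forward by the display latency."""
    p, v, a = (np.asarray(x, float) for x in (position, velocity, acceleration))
    return p + v * latency_s + 0.5 * a * latency_s ** 2

# A 250 m/s target with 20 ms of end-to-end latency would otherwise be drawn
# about 5 m behind its true position.
print(predict_position([1000.0, 500.0, -300.0],   # last known position (m)
                       [250.0, 0.0, 0.0],         # velocity (m/s)
                       [0.0, 9.0, 0.0],           # ~1 g lateral turn (m/s^2)
                       0.020))
```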

Datalink is required for multi-ship operations, such that all participants see the target from their own perspectives; again, AI algorithms are used to create these differing aspects. A wingman can be live, synthetic or in a simulator, such that the lead aircraft can see either a synthetic wingman fighting a virtual adversary, or a real wingman fighting a synthetic target. In this way, it is possible to mount a 2v1 fight using free and engaged tactics and enabling decision-making as for real; a manual override will also allow the leader to cause the adversary to switch opponents, assuming that it was not about to do so itself. The only limitation is that peacetime separation rules would have to be observed between two real aircraft in the same engagement.
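The ‘own perspective’ idea reduces, in essence, to every participant converting one shared target state into its own viewing geometry. The sketch below uses deliberately simplified flat-Earth, heading-only maths and invented positions; it illustrates the principle, not anything Red 6 actually flies:

```python
# Sketch of "same target, different perspectives": one shared world-frame
# target state is converted locally into each aircraft's own viewing geometry.
# Flat-Earth, heading-only maths and invented positions: illustration only.
import numpy as np

def relative_view(target_pos, ownship_pos, ownship_heading_deg):
    """Return (range_m, bearing_deg) of the target as seen from one aircraft."""
    rel = np.asarray(target_pos, float) - np.asarray(ownship_pos, float)
    rng = float(np.linalg.norm(rel[:2]))
    bearing = (np.degrees(np.arctan2(rel[1], rel[0])) - ownship_heading_deg) % 360.0
    return rng, bearing

shared_target = [10_000.0, 5_000.0, 6_000.0]          # one adversary state on the link
lead = relative_view(shared_target, [0.0, 0.0, 5_000.0], 45.0)
wing = relative_view(shared_target, [0.0, 2_000.0, 5_000.0], 90.0)
print(f"lead: {lead[0]:.0f} m at {lead[1]:.0f} deg, wing: {wing[0]:.0f} m at {wing[1]:.0f} deg")
```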

The AI algorithms have been derived from DARPA work on programmes such as Air Combat Evolution (ACE). Displaying different target aspects from two live aircraft is network-enabled, and ‘packets’ of information are sent across the network, to be rendered locally in host aircraft, again to minimise latency.
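That ‘small packets, rendered locally’ pattern can also be sketched generically; the field layout below is our own invention, not the ATARS data format:

```python
# Hypothetical example of the "small packets, render locally" pattern: only a
# compact entity state crosses the network, and each receiver rebuilds the
# picture itself. The field layout is invented, not an ATARS format.
import struct
import time

STATE_FMT = "<Id3d3d"    # entity id, timestamp, position[3], velocity[3]

def pack_state(entity_id, timestamp, position, velocity):
    return struct.pack(STATE_FMT, entity_id, timestamp, *position, *velocity)

def unpack_state(payload):
    fields = struct.unpack(STATE_FMT, payload)
    return {"id": fields[0], "t": fields[1],
            "pos": fields[2:5], "vel": fields[5:8]}

packet = pack_state(7, time.time(), (10_000.0, 5_000.0, 6_000.0), (250.0, 0.0, 0.0))
print(len(packet), "bytes on the wire ->", unpack_state(packet))
```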

Although fast-jet close combat has been the primary focus of ATARS, principally because it is the hardest to achieve and arguably the most valuable, this does not mean that it does not have utility elsewhere, not least for enabling practice of friendly operations which would otherwise require the live participation of friendly assets – formation flying and air-to-air refuelling (AAR) are prime examples. There are also limitless potential applications for rotary-wing and multi-engine operations.

Nor is AR’s utility limited to the front line; there is opportunity to download advanced training events from operational types to much less expensive trainers. The collaboration with Boeing, Lockheed Martin, KAI and BAE Systems suggests that ATARS may be used in aircraft like the Boeing T-7A Red Hawk, the LM/KAI TA-50 and the BAES Hawk T2, all of which have emulated sensors. Indeed, initial integration work is being done on the T-38, and is shortly expected to start on the Hawk T2, variants of which are in wide use around the world alongside their T1 forebears.

Although I have mentioned collaboration with industry, the militaries of the respective nations have been fully involved from the outset, which suggests that they also see the potential. Indeed, Red 6 first approached the USAF in 2018, received a development grant from the Air Force Research Laboratory’s AFWERX technology accelerator programme in 2019, and in 2021 secured a 5-year, US$70m contract to develop AR for the USAF.

The RAF’s Deputy Commander Operations and its senior warfighter, Air Marshal Harvey Smyth, is also more than aware of the impact AR and ATARS could have on training and readiness.

The initial ATARS airborne trials were carried out using an aircraft which Robinson built himself, and involved creating a static virtual cube in mid-air, and flying around it in a real aircraft. This led to being able to ‘fly through’ the cube, and the immersive nature of this exercise was reflected in the pilots’ instinctive desire to protect the wings of the aircraft by avoiding the edges of the cube. Development of the system is now between ‘advanced prototype’ and minimum viable product (MVP) status.

Note from Dim Jones: I have experienced and witnessed over many years the increasing challenge of providing realistic and relevant operational training for front-line aircrew, from availability of friendly assets and adversary aircraft in representative numbers to airspace limitations and tactical security. I have also seen initiatives to enhance formal training courses (by downloading advanced events from the operational conversion units) founder for lack of available aircraft and suitably qualified instructors. Although relatively in its infancy, ATARS has the capacity to transform this situation. I would be lying if I said that I fully understood the technology, but I know a man who does, and I can certainly see the potential, as can others. I believe that AR is the way forward and, with the operational effectiveness of our future front lines in mind, I wish Red 6 every success in their endeavours.
