Adaptive, performance-focused, at the point of need. Enabled by big data analytics and cloud delivery. That’s how leaders in the community envision the future of military simulation and training. MS&T Editor Rick Adams summarizes their thoughts on what’s trending.
From centralized to decentralized. From push to pull. From content-centric to learner-centric. From human instruction to automation. From detailed acquisition specifications to performance outcomes.
Military training is in a transition driven by a new generation of learners, emerging ‘XR’ technologies, and the availability of enormous amounts of data begging to be automated, amalgamated and evaluated.
We asked leaders at several companies: What do you see as market trends in military simulation and training in the next 3-5 years? And what emerging technologies will transform the ways in which soldiers, sailors, airmen, and cyberwarriors are trained in the future? Here are highlights of what they told us.
“We see a continuation of the trend where customers are more focused on training outcomes than simply buying products or services,” said Tom Quelly, Business Development Analysis Director for Lockheed Martin Training and Logistics Solutions.
“They want faster training so less time is spent in training. Train at the point of need as opposed to always having to go to a schoolhouse. Push training to lower-level solutions. Maximize their budget and increase availability, which in turn allows more ‘hands-on’ time for students to practice and rehearse at their own pace, where it is most convenient for them.”
“Send me what I need … in real time,” is how Lenny Genna, President of L3 Link Training & Simulation, described the new learner attitude. “We see the next generation of simulators being able to drive from a small computational device or connection to the cloud. The footprint for wired computation will be reduced. Putting more in smaller devices that are more connected will allow the pilot to try scenarios multiple times in the simulator.”
“Potentially 100% of training can be in the simulator and not on live platforms, with scenarios generated more quickly,” Genna suggested. “That transformation will reduce costs.”
“If we are providing a higher number of devices, there must be some compromises in fidelity, tactile feel, or other elements of the solution to achieve that lower price point,” stated John Hayward, Senior Vice President and General Manager of TRU Simulation + Training’s Government Division. “How we make those compromises is critical to ensure the best transfer of learning while managing cost.”
The increasing emphasis on “personalised” learning, LM’s Quelly noted, necessitates an “adaptive curriculum that adapts to a student's current knowledge. Some students might go faster, some slower. Keep the students engaged at an optimal level for learning.”
Hayward said, “We see a training system as a series of learning events in which a student can demonstrate their knowledge and proficiency around a subject and ‘test out’ or be engaged with learning material when appropriate. We expect students will have opportunities to get hands-on with the system to learn.”
The enablers of individualised training include “advanced data analytics and associated artificial intelligence (AI) algorithms that have the potential to provide not only a more effective training experience but enable a new generation of training capabilities that are both self-paced and adaptive to the individual trainee,” commented Phil Perey, Head of Technology for CAE Defence & Security. “These capabilities also provide direct and objective insight into a broader continuous improvement process for how the training syllabus can be optimised, thus creating a true ‘closed-loop’ training system.”
“We see machine learning (ML) and focused AI as key enablers for customers to reap improvements from individual and collective training events,” L3 Link’s Genna agreed. “Embedded ML solutions analyse real-time performance data against objective performance criteria and then adjust to an individual’s or unit’s performance level – either increasing or decreasing speed and complexity. The computer can determine, with high degrees of accuracy, areas requiring additional training, intervention or acceleration better than a human instructor can. More importantly, it quickly brings areas of concern to light for the instructor to act on.”
Indeed, Genna predicted, AI “ultimately will reduce the human role in training and simulation.” He sees developing technologies “as removing several key training constraints – they will improve the way that human instructors engage in training. They will allow computers to automate those processes that are entirely objective. You can ingest that expertise into the computer and reduce the live instructor requirement, letting automated teaching and assessment objectify the event. This approach can also standardise the assessment as well as guarantee achievement of required proficiency targets.”
“We at Link will never advocate that human instructors won’t be part of training; however, there will be a reduction in live instructors required, and that will assist throughput in training,” he added.
Darren Shavers, Director of Business Development and Foreign Military Sales, Meggitt Training Systems, cautioned, “Since this new training concept will require a lot of computer power and connectivity, militaries are also worried about how to have all this info in the cloud protected from hackers. Previously stated intentions of keeping all this data in secure, encrypted locations won’t allow the capability to meet future point-of-need requirements.”
CAE’s Perey said head-mounted displays and other wearable technologies that blend real and computer-generated information – virtual reality (VR), augmented reality (AR), mixed reality (MR), etc., now referred to as XR capabilities – are steadily improving in resolution and field-of-view. “This will enable a suite of smaller, more portable and more deployable trainers to complement traditional training devices. In some cases, these will offload some of the training tasks currently delivered on more traditional training devices, freeing those for more advanced mission training.”
Does current, commercially available XR mimic high-fidelity, multi-million-dollar simulators? “No, it doesn't,” admitted LM’s Quelly. “But it really is viable for training things like part tasks where you're teaching the student familiarity with the cockpit. It depends on what you're training and then how far we can take it.”
Saab Group’s Jerker Johansson said the Swedish company expects “joint training will continue to increase in order to enable large-scale units with troops from different coalition nations. Systems that enable multi-domain and multi-national training will be further developed.” L3 Link’s Genna predicted “improvements in cloud computing and wireless delivery technologies like 5G will enable networked, cyber-safe, synthetic training environments on a broad scale which portend the ability to conduct large-force live, virtual, constructive (LVC) training events on a scale previously impossible.”
“What we believe will happen,” CAE’s Perey added, is “the integration of LVC training, as opposed to treating each as individual and isolated training techniques. What has been talked about for what seems like the past decade will become routine in the coming years.”
Lockheed’s Quelly said the rapid pace of immersive innovation is also transforming government acquisition strategies. “If you wrote requirements for what was possible two years ago in VR headsets, you'd already be out of date. They want it on demand, not as a procurement contract, but modelling and simulation as a service from the cloud.”
Training customers “want to be able to integrate new ideas, knowing that they come quickly and they come from all directions,” he offered, adding, “We see an increasing willingness to leverage alternate acquisition” such as the Other Transaction Authority (OTA) mechanism and other omnibus-type contracts. “We need to be more agile, we need to speed up the pace of innovation, and we need to constantly adapt.”
CAE Rise data analytics debrief. Image credit: CAE
Military Simulation & Training (MS&T) magazine asked:
1: What do you see as market trends/issues in military simulation & training in the next 3-5 years … and why? These might include evolutions of existing dynamics or game-changing emerging trends.
2: What emerging technologies will transform the ways in which soldiers, sailors, airmen, and cyberwarriors are trained in the future? What role(s) do you envision for artificial intelligence/big data/machine learning, blockchain, cloud computing, robotics/drones, and X Realities (AR, VR, MR, holographic, etc.), or other disruptive technologies?
CAE – Phil Perey, Head of Technology, CAE Defence & Security:
Let’s review some of the key challenges that defence forces face worldwide. Ongoing trends include significant – and in some cases growing – pilot and qualified instructor shortages, security concerns on the use of advanced sensors while conducting live training, and difficulties in fully exploiting the massive amounts of intelligence, surveillance, and reconnaissance (ISR) data for conflict advantage. Of course, militaries are continually faced with making decisions that have cost and budget implications, so they are always looking to enhance and optimize efficiency and effectiveness to achieve the desired readiness objectives.
Certain emerging technologies over the next three to five years promise to transform military simulation & training more significantly than anything in the past two decades – perhaps as significantly as the now-ubiquitous smartphone has changed our daily lives. CAE sees four primary technology vectors that will, individually and combined, contribute to the evolution of military training. They are:
1: Improvements in Head-Mounted Display technologies that combine portions of real and computer-generated (or virtual) information – more broadly referred to as XR capabilities. The combination of improved resolution and field-of-view will enable a suite of smaller, more portable and more deployable trainers to complement traditional training devices. In some cases, these smaller, portable and deployable training systems leveraging virtual, augmented or mixed-reality will offload some of the training tasks currently delivered on more traditional training devices (such as a full-mission simulator), thus freeing up those traditional training devices for more advanced mission training or further offloading of training tasks from the actual weapon system platform.
Compelling, realistic haptics remains the laggard on the path to fully virtualizing the training system. We are therefore likely to see earlier adoption of mixed-reality systems that use a combination of physical and virtual interfaces. For example, CAE has already fielded or is currently developing several XR-type capabilities, such as a rear-crew training solution for the UK Royal Navy’s Merlin Life Sustainment Program (MLSP). On this program, the mixed-reality system allows multiple rear-crew members to interact with the pilots and other rear-crew who are training at various positions in the rear cabin of an AW101 Merlin helicopter. The trainee touches and interacts with physical equipment in a full-scale mock-up of the rear cabin of the helicopter while the synthetic view out of the aircraft is overlaid in a head-mounted display worn by the trainees. This enables the rear-crew to train for the variety of missions typically performed in the AW101 – such as door gunnery, ramp operations, and hoist operations – but without the need for massive physical displays. CAE is also developing similar mixed-reality solutions for US Navy MH-60 Seahawk helicopter aircrews as well as for the comprehensive NH90 helicopter training system we are developing for the Qatar Emiri Air Force.
2: Advanced data analytics and associated artificial intelligence algorithms that have the potential to provide not only a more effective training experience for both pilots and instructors, but also enable a new generation of training capabilities that are both self-paced and adaptive to the individual trainee. These capabilities also provide direct and objective insight into a broader continuous improvement process for how the training syllabus can be optimized, thus creating a true “closed-loop” training system. CAE recently introduced CAE Rise (Real-time Insights and Standardized Evaluations) for the military market at I/ITSEC 2018. CAE Rise enables students and instructors to get an objective, real-time assessment of student performance using live data during training sessions. The performance reports can be stored in the student’s records, and data is analyzed across groups to assess curriculum and class-level performance. A civil version of CAE Rise is currently in use at AirAsia, and CAE is actively pursuing several defence programs that incorporate the CAE Rise data-driven training system.
3: More pervasive use of cloud computing, enabling point-of-need availability of simulation and up-to-date situational awareness of the battlefield for mission planning and mission rehearsal. Additionally, the connected nature of these systems will facilitate collective training through broader exploitation of distributed mission operations training. CAE is currently leveraging these cloud-connected capabilities in its aforementioned CAE Rise system, where all the context and assessment algorithms are cloud-based and streamed from the simulator to the cloud and back to the instructor during training. Separately, CAE is working with the US Joint Staff J7 (Joint Force Development), streaming up-to-date geographical datasets in the Open Geospatial Consortium Common Database (OGC CDB) format directly to the user’s standard web browser.
4: Defence forces have long used elements of live, virtual, and constructive training as part of their overall training continuum. Live training will always have a central role for militaries in developing combat capability and readiness. In recent years, there has been increasing use of virtual and constructive training, as simulation-based technology has allowed some training to be offloaded to the virtual environment and the cost/benefit analysis has made greater use of virtual training very compelling. What is needed, however, and what we believe will happen over the next three to five years, is a move toward the integration of live-virtual-constructive training, as opposed to treating each as an individual and isolated training technique. Militaries around the world are beginning to implement programs aimed at establishing the architectural standards and technologies that will enable integrated LVC training, so what has been talked about for what seems like the past decade will become more commonplace and routine in the coming years.
L3 Link – Lenny Genna, President, L3 Link Training & Simulation:
We see a few key enabling technologies playing a critical role in next-generation training and simulation solutions as the services look to change the paradigm of historical training methods to meet both readiness and proficiency challenges across the force structure. First, improvements in cloud computing and wireless delivery technologies like 5G will enable networked, cyber-safe, synthetic training environments on a broad scale, which portend the ability to conduct large-force live, virtual, constructive (LVC) training events on a scale previously impossible.
Advances in gaming engines, computational power, graphics processing and display technologies are rapidly changing how content is delivered to customers for their use in individual or large-force multi-player training operations. Ultimately, we see these combined technologies increasing the inventory of training devices not just at fixed training sites but at deployed locations as well. This growth in device seats will focus on lower-cost, lower-fidelity trainers – think virtual reality – co-located with higher-fidelity devices or at remote locations, but all linked into the same event. Link’s Blue Boxer Extended Reality™ (BBXR) is an excellent example of a combined technology approach that leverages these technologies to deliver high-value training at a significantly lower cost.
Further, we see machine learning (ML) and focused AI as key enablers for customers to reap improvements from individual and collective training events. Embedded ML solutions, like Link’s Adaptive Learning Engine™, analyze real-time performance data against objective performance criteria and then adjust to an individual’s or unit’s performance level – either increasing or decreasing speed and complexity. The computer can determine, with high degrees of accuracy, areas requiring additional training, intervention or acceleration better than a human instructor can. More importantly, it quickly brings areas of concern to light for the instructor to act on.
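As a rough illustration of the kind of loop described above – not Link’s Adaptive Learning Engine™ itself, whose internals are not public – the following minimal Python sketch scores recent attempts against an objective proficiency criterion and recommends raising difficulty, lowering it, or flagging the trainee for instructor attention. The task names, thresholds and flag cutoff are invented for illustration.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class TraineeState:
    """Rolling record of a trainee's scored attempts, keyed by training task."""
    scores: dict[str, list[float]] = field(default_factory=dict)

    def record(self, task: str, score: float) -> None:
        """Append a 0.0-1.0 score for one attempt at a task."""
        self.scores.setdefault(task, []).append(score)

def adjust_difficulty(state: TraineeState, task: str,
                      proficiency_target: float = 0.80,
                      window: int = 5) -> str:
    """Compare recent performance against an objective criterion and
    recommend raising or lowering scenario speed/complexity."""
    recent = state.scores.get(task, [])[-window:]
    if not recent:
        return "baseline"         # no data yet: start at baseline difficulty
    average = mean(recent)
    if average >= proficiency_target:
        return "increase"         # meets criterion: raise speed/complexity
    if average < 0.75 * proficiency_target:
        return "flag_instructor"  # far below criterion: surface for instructor action
    return "decrease"             # below criterion: slow down and simplify

# Example: a trainee struggling with a (hypothetical) task gets flagged.
state = TraineeState()
for s in (0.42, 0.50, 0.47):
    state.record("formation_rendezvous", s)
print(adjust_difficulty(state, "formation_rendezvous"))  # -> flag_instructor
```

The design point the vendors emphasize is that the objective comparison runs continuously and surfaces exceptions, leaving the judgment calls to the human instructor.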
The exciting rate of change necessitates reimagining our training approaches, and we continue to invest and innovate in ways that produce previously unattainable capability and value for our customers.
Advances in computing, AI, ML and delivery will change not only the way solutions are delivered, but also how they are developed. On the delivery side, AI, ML and big data will be combined to push more individualized training, modulating an individual’s training pace and thus optimizing knowledge and skill absorption. At the same time, the resultant data will produce unprecedented insights into the progress and readiness of individuals and their respective units.
In many ways, we see these technologies as removing several key training constraints – ultimately, they will improve the way that human instructors engage in training. They will allow computers to automate those processes that are entirely objective. One example is how expected flight profile standards, tolerances and deviations are measured for a pilot conducting a takeoff and formation rendezvous. A human instructor simply cannot assess with absolute accuracy deviations measured in feet or in time limits. Automated assessment tools can measure, display and grade the profile, freeing the instructor to act truly in the capacity of a teacher rather than an observer and data collector.
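To make the takeoff example concrete, here is a toy Python sketch of tolerance-band grading; every parameter name, target and tolerance is an illustrative placeholder, not a real flight standard or any vendor’s actual assessment logic.

```python
# Toy tolerance-band grading for a takeoff profile. All targets and
# tolerances below are illustrative placeholders.
TAKEOFF_TOLERANCES = {
    # parameter: (target_value, allowed_deviation)
    "rotation_speed_kts": (140.0, 5.0),
    "initial_climb_pitch_deg": (10.0, 2.0),
    "centerline_offset_ft": (0.0, 10.0),
}

def grade_takeoff(samples: dict[str, float]) -> dict[str, dict]:
    """Measure each recorded parameter against its tolerance band and
    return the per-parameter deviation and a pass/out-of-tolerance grade."""
    report = {}
    for param, (target, tolerance) in TAKEOFF_TOLERANCES.items():
        if param not in samples:
            report[param] = {"grade": "no data"}
            continue
        deviation = samples[param] - target
        report[param] = {
            "deviation": round(deviation, 1),
            "grade": "pass" if abs(deviation) <= tolerance else "out of tolerance",
        }
    return report

# Example: 12 ft off centerline exceeds the 10 ft band and is graded
# objectively, a deviation an instructor could not measure by eye.
print(grade_takeoff({"rotation_speed_kts": 143.0,
                     "initial_climb_pitch_deg": 9.2,
                     "centerline_offset_ft": 12.0}))
```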
AI is a growth area for our customers and ultimately will reduce the human role in training and simulation. Current syllabi leverage human-based instruction; in a pilot training program, for example, you can ingest that expertise into the computer and ultimately reduce the live instructor requirement, letting automated teaching and assessment objectify the event. This approach can also standardize the assessment as well as guarantee achievement of required proficiency targets.
We at Link will never advocate that human instructors won’t be part of training; however, there will be a reduction in the live instructors required, and that will assist throughput in training. Ultimately, I see a future where the student pilot will log into a training device and execute the event autonomously, with performance observed and monitored, and feedback provided. AI increasingly will be a key enabler. We see all of our customers moving in that direction.
And AR, VR and MR will evolve with advances in computing and graphics. We see the next generation of simulators being able to be driven from a small computational device or a connection to the cloud – able to say ‘send me what I need’ and receive it in real time. The footprint for wired computation will be reduced. Putting more in smaller, more connected devices ultimately will allow the pilot to try scenarios multiple times in the simulator. Potentially 100% of training can be in the simulator and not on live platforms, with scenarios generated more and more quickly. That transformation will reduce costs across multiple accounts, including acquisition, operations and maintenance, via longer airframe life, fewer live-fly events and thus lower maintenance requirements.
I think it is important to touch on the fact that many of our existing programs have started implementing key aspects of the technologies discussed. At Link, we are trying to pull the pieces together so that customers see the value of these new approaches while also taking advantage of previous investments. One specific area where I believe we were among the first movers is cloud-based training architectures. In 2016, we implemented a cloud-based real-time training platform using commercial cyber security, an open architecture, virtual machines, COTS hardware and a commercial gaming engine to push training to the point of need.
In November 2018, we unveiled our BBXR mixed-reality trainer at the Interservice/Industry Training, Simulation and Education Conference (I/ITSEC) tradeshow – it absolutely highlights advances in connectivity, graphics and display technologies. The feedback we received during the show was outstanding and, importantly, I think it is really informing customers about what is possible with these new devices. Our ALE™ completely automated performance assessment for the instructor, providing visual cues that helped the instructor know when and how to constructively engage the student. Further, students were able to see how they performed relative to their peers, and even during our demonstrations it became clear that healthy competition drives performance.
Another area I want to highlight is our Virtual ISR Training Application, or VISTA. Here, you see us automating scenario lay-downs, white-cell role-playing and C2 interaction. We demonstrated this during a joint NATO exercise, and it was powerful. We empowered analysts to produce more training value with fewer resources.
L3 Link BBXR Mixed Reality Trainer. Image Credit: L3 Link Training & Simulation
Meggitt Training Systems – Darren Shavers, Director of Business Development and Foreign Military Sales:
Most allied armed forces are determining how to prepare for the next conflict; however, training normally lags actuality in the current acquisition process. Regardless, military forces need to train more realistically for the battlespace to which they may deploy, and greater immersion can facilitate this mission. Although the goal of a Star Trek-like Holodeck experience (combining solid props and warfighters with any environmental setting) is not new, achieving it requires both technological advances and a willingness to reconsider decades-old requirements to meet emerging threats. This level of full immersion is not yet possible, so augmented reality represents the current state of the art. Procurement requirements will call for the ability to augment the reality of where our warfighters are and have them engage evolving characters in this AR-based world. For industry, that means addressing the challenge of convincingly augmenting our warfighters’ reality while still maintaining the required training fidelity. Think about multiplayer video games with learning CGI characters that look very real. We can evolve that video game to one in which our warfighters use their weapons and other tools instead of a joystick. Thus, the future involves morphing the video game into an augmented solution that our warfighters can join while retaining the form, fit and function of their training and devices.
Industry is slowly adapting goggles to a smaller form factor and a wider field of view, preserving peripheral vision while immersed in training. To meet the need to train anywhere and everywhere, known as point of need (PoN), armed forces are debating how to run their war games from the cloud, importing real data from robotics, drones and other devices. Since this new training concept will require a lot of computing power and connectivity, militaries are also worried about how to keep all this information in the cloud protected from hackers. Thus, cybersecurity has become a big issue. Previously stated intentions of keeping all this data in secure, encrypted locations won’t allow the capability to meet future PoN requirements.
Saab – Jerker Johansson:
It will become more common to incorporate support weapons into live training – weapons that historically have not been involved in live exercises.
We see that joint training will continue to increase in order to enable large-scale units with troops from different coalition nations. Systems that enable multi-domain and multi-national training will be further developed.
New technologies with better sensors will support more effective, time-saving training, as well as more distributed training for individuals and units. The analysis of performance will be automated.
TRU Simulation + Training – John Hayward, Senior Vice President and General Manager, Government Division:
In addition to traditional high-fidelity simulators like a full flight simulator, we are seeing many of our military customers include AR and VR solutions in their long-term simulation plans. As AR/VR is introduced into the services, we expect to see lessons learned leveraged across those services and to continue to push industry and the limits of XR technologies.
We are also seeing a desire for intelligent tutoring. This includes training systems that adapt to each student’s unique learning needs, such as traditional adaptive learning approaches and student biometric analysis to help gauge stress levels and drive repetition and proficiency in areas that are weak.
We are seeing our military customers wanting to push training to lower-level solutions, such as flight training devices. This allows the military customer to maximize their budget and increase availability, which in turn allows more “hands-on” time for students to practice and rehearse at their own pace, where it is most convenient for them. While we don’t see these lower-level devices replacing the higher-fidelity full flight simulators, their advantage is that students can take the knowledge from a full flight simulator training session and practice on the fly, immediately ahead of a specific mission, in a device designed for that task.
However, we don’t believe that the military will accept unnecessary risk to be the leader in this technology development. We expect to see the continued trend of the military pushing for lower-cost, higher-fidelity AR/VR solutions for all types of training, including high-risk activities. This will likely include aviation, infantry, armor, surface, sub-surface, cyber, and unmanned systems. We believe AR/VR can drive down our customers’ operational risks and avoid some of the mishaps we have seen in both the air and at sea that stemmed from service members encountering situations they had never previously faced.
In the near term, we believe these AR/VR solutions will not replace but rather augment the existing high-fidelity systems, providing more training to students and speeding knowledge transfer while retaining the high-fidelity solutions for certification. This may allow for fewer high-fidelity systems, which will lower the overall cost of the training solution while improving student throughput and proficiency.
As discussed above, we see X Realities (specifically AR/VR) as key technology for our customers to achieve their stated goals. Using AR/VR will allow us to deliver many hands-on learning opportunities for students that would otherwise be unavailable. This will give the current generation the exact experiences they seek to learn the job tasks they need to be successful. We believe that as cloud computing continues to improve and become more cyber-compliant, it can certainly be an element of that solution.
We expect AI to help with adaptive learning, as will big data and machine learning. Ultimately, we see a training system as a series of learning events in which a student can demonstrate their knowledge and proficiency around a subject and “test out,” or be engaged with learning material when appropriate. We expect students will have opportunities to get hands-on with the system to learn. If we are providing a higher number of devices, there must be some compromises in fidelity, tactile feel, or other elements of the solution to achieve that lower price point. How we make those compromises is critical to ensure the best transfer of learning while managing cost.
Through this combination of adaptive learning and the increased hands-on opportunities, we believe we can deliver material and experience to meet each student’s unique learning needs.
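As a minimal sketch of the “test out” routing Hayward describes – with hypothetical event names and thresholds, since TRU has not published its logic – a syllabus of learning events with proficiency gates might look like this:

```python
# Sketch of a syllabus as a series of learning events with "test out" gates.
# Event names and thresholds are hypothetical placeholders.
from typing import NamedTuple

class LearningEvent(NamedTuple):
    topic: str
    test_out_threshold: float  # pretest score needed to skip the material

def route_student(event: LearningEvent, pretest_score: float) -> str:
    """Let a student who demonstrates proficiency test out of an event;
    otherwise engage them with its learning material."""
    if pretest_score >= event.test_out_threshold:
        return f"'{event.topic}': test out, proceed to hands-on practice"
    return f"'{event.topic}': engage learning material, then reassess"

syllabus = [
    LearningEvent("cockpit familiarization", 0.85),
    LearningEvent("emergency procedures", 0.90),
]
for event in syllabus:
    print(route_student(event, pretest_score=0.88))
# -> tests out of cockpit familiarization; engages emergency-procedures material
```

The same gate could just as well consume adaptive-learning or biometric signals in place of a simple pretest score.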