MS&T Editor Andy Fawkes and the MS&T Editorial Team deep dive into key technologies which are driving the transformation of training.
There are numerous predictions of how the world might change over the next decade as technology advances. Many centre on AI, the Internet, human interfaces, energy supply, health and transport, but how might the military simulation and training world change? Predicting the future is fraught because human endeavour is often chaotic, and what seem like certain trends are consigned to history as others take over. However, as this is the start of a new decade, we asked industry experts for their views and have put together a guide to some of the hottest trends in S&T-related technology.
One does not need to be in the S&T community to observe that the take-up of technologies, particularly consumer-focused ones, is accelerating. Mobile devices, for example, reached levels of worldwide adoption in little more than a decade that took telephone landlines over a century. Further, technologies once thought of as science fiction can now be found in our homes, such as Amazon’s Alexa AI.
We are seeing both military customers and industry looking to this wider world and seeking the same level of innovation and agility. Sébastien Lozé, Industry Manager Simulations at Epic Games, told MS&T “there are three steps of technology innovation: Discovery (where many of these trends are today); Dissemination (where the target audience is reached); and Adoption”. The UK’s Defence Innovation Unit recently funded an Army VR training pilot project, and the USAF’s Small Business Innovation Research (SBIR) program is funding training-related projects. Companies such as CAE are putting their emphasis on “digital innovation”: CAE’s 2019 Annual Report highlights its Project Digital Intelligence innovation program aimed at “cutting edge digital technologies and systems such as big data, artificial intelligence, cloud computing, cybersecurity and augmented/virtual reality”.
Military S&T is not a greenfield site, however, with training programmes already contracted out. The challenge for the 2020s will be to incentivise existing players to innovate whilst encouraging and supporting new entrants in what can seem a complex and tough market to break into.
Military customers are seeking training pipelines that can adapt to changes in demand, are not tied to specific locations, and are more responsive to operational needs. Some term this accessibility and flexibility the “Netflixisation” of training.
Lozé told MS&T, “Thanks to advances in cloud services (Google Cloud Services, Microsoft Azure, AWS), 5G technology, real-time virtual environments such as Unreal, and more - training will no longer be locked to one physical location. Instead, advanced simulations will be mobile, and trainees will greatly benefit from the convenience of learning at their own pace, and in their own space (either personal places or workplaces).”
This unified accessible approach echoes that of the US Army’s Synthetic Training Environment (STE) which aims in the early 2020s to move to a “single STE that delivers a training (and mission rehearsal capability) service to the Point of Need”.
AI, Automation, Machine Learning, Big Data, Analytics et al are terms that regularly make the headlines and technology commentators anticipate major advances to be made in these fields over this decade. Inevitably there will be an impact on S&T. Advances in training data capture and analysis could improve our understanding of training outcomes and help tailor training to individuals.
For decades AI has been important to the S&T community to reduce the number of human role players but now it may help create simulations. Lozé said “today, content creation for simulation is heavily reliant on humans. GEOINT data editors build accurate datasets in a process that is nothing short of time consuming. AI can, and will, significantly reduce the time cost of manual preparation, freeing the GEOINT community to focus more on data analysis than on data maintenance. This is just one example of how content generation will be revolutionized by AI, and we can expect a swell of other improvements.”
As for the AI in simulations, Pete Morrison, Chief Commercial Officer, BISim, cautioned that “there is huge excitement around machine learning, but I think that the change will be gradual as there is a tremendous difference between one of Google's computers playing Go and a military commander making tactical decisions. I think it all comes down to whether the problem is bounded or not and how much data is available. It's all about the rule sets and then having the data in order to effectively teach the AI to make good decisions.”
MS&T has reported on the significant number of XR-based applications (AR-MR-VR) seen at I/ITSEC 2019. Simulation companies have been exploring the use of XR for some time, particularly since affordable products such as the Oculus Rift were released in 2013. But there have been challenges for VR. Morrison explained that for VR-based pilot training “we hit a brick wall in the lack of haptic feedback that is required for pilots to be able to interact with the cockpit.” Morrison did not see the haptic problem being solved fully in the 2020s as “... there’s no good technology for allowing an operator of any system to interact with the virtual world in a realistic way and part of the problem is that these systems are tremendously complex.”
MR (mixed reality), however, shows considerable promise. The Varjo MR HMD allows the trainee to see and interact with a physical cockpit while seamlessly seeing the virtual world outside the cockpit. Morrison believes that “over the next 10 years MR will absolutely crack it as it is a factor of processing power and resolutions will increase and eventually the scene in the headset will be good enough. It might take another few years, but MR will replace traditional dome-based flight simulation.”
Lozé is looking to other XR innovations such as the OpenXR initiative. “This will allow for flexible deployment to all methods of XR, allowing simulation creators to focus on the curriculum at hand, and not on the display of a pixel on one HMD or unique display. Choosing your method of XR will soon be as trivial as mono vs stereo vs 5.1 sound, and this will unlock all kinds of new opportunities in the next ten years.”
The British Army is looking beyond individual training to collective/team XR-based training. In 2019 it demonstrated that 30+ soldiers with XR HMDs could operate together in the same virtual world in a variety of training scenarios and in 2020 this XR-based system will be deployed to barracks as a mobile training service. In future such deployable and flexible systems could offer collective training opportunities not easily provided through more traditional fixed training systems.
The resolution of displays is ever improving. 8K televisions are on the market and XR displays are heading towards eye resolution. “Advancements in GPU capabilities are ushering in a new era of display techniques (ray tracing, PBR, photogrammetry, virtual humans), ensuring that simulations not only behave realistically but also with a high level of authenticity and believability,” Lozé told MS&T. With recruits experiencing high fidelity games and video content “the next generation of trainees demands a more immersive training experience.”
Although the term “digital twin” has been used for some time, it is gaining currency with the ever-increasing digitisation of products and activities. A digital twin-based approach offers the prospect of consistent, reusable, and available data in support of the procurement of military systems and platforms, and of all associated simulations through life, ranging from analysis through to training and mission planning. Indeed, keeping simulators up to date with their corresponding system or platform is an ongoing challenge, and a digital twin approach could in future help build understanding between the procurement and S&T communities.
Digital twins can be applied to hardware, but what about people? UK start-up AERALIS is developing a modular fighter jet trainer aircraft and looking at novel ways of delivering the syllabus. AERALIS Strategy Director Tim Davies explained that the company is looking to streamline and improve the complex and costly process of military flying training by incentivising the training process. They are proposing the “use of a ‘digital twin’ which represents the model warfighter that a student needs to emulate, and which the student then tries to match by working towards making their digital record of flying and combat performance equal to that of the twin,” he said. Based on gamification principles, the pilot can not only compare themselves with the digital twin, but their digital record can also be shared across the class so that students can learn from each other to improve their own performance. In addition, the training service provider will have anonymous data to compare not only student performance but also instructor performance and syllabus design. “For the first time a provider will have a quantitative basis on which to develop better training systems in the future,” Davies said.
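To make the comparison concrete, here is a minimal sketch of how a student’s digital record might be scored against an exemplar twin profile. It is purely illustrative: the metrics, weights and class names are hypothetical and do not reflect any published AERALIS design.

```python
# Hypothetical sketch: scoring a student's training record against an
# exemplar "digital twin" profile. Metric names and values are invented
# for illustration only.
from dataclasses import dataclass

@dataclass
class SortieRecord:
    landing_accuracy: float   # 0.0 - 1.0
    fuel_efficiency: float    # 0.0 - 1.0
    weapons_score: float      # 0.0 - 1.0

# The exemplar twin: the performance profile a student works towards.
EXEMPLAR = SortieRecord(landing_accuracy=0.95, fuel_efficiency=0.90, weapons_score=0.92)

def match_score(student: SortieRecord, twin: SortieRecord = EXEMPLAR) -> float:
    """Return 0-100: how closely the student's record matches the twin."""
    gaps = [
        abs(twin.landing_accuracy - student.landing_accuracy),
        abs(twin.fuel_efficiency - student.fuel_efficiency),
        abs(twin.weapons_score - student.weapons_score),
    ]
    return round(100 * (1 - sum(gaps) / len(gaps)), 1)

print(match_score(SortieRecord(0.80, 0.85, 0.70)))  # 86.0
```

A score of this kind could sit behind the gamified comparison Davies describes, with anonymised versions aggregated for instructors and syllabus designers.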
Another example of digital twins is the Brightline and Teslasuit demonstration at I/ITSEC 2019. Here a trainee can train and rehearse their tasks in a virtual world and then carry out the same tasks wearing a Teslasuit which provides tactile feedback to guide them towards the behaviour of the digital twin they created in the virtual world or that of an exemplar student.
Exemplar digital twins and digital student records, combined with the security and transparency of blockchain technology and links to HR and recruitment systems, could provide powerful insights to decision makers, from the students themselves through to the enterprise. However, crossing the many organisational barriers to make this a reality would be a very significant challenge, even over a 10-year period.
The military S&T community has been successful at developing interoperability standards, but can we expect change? Morrison believes “… that we are actually getting into the end of what I call the traditional interoperability era. HLA/DIS are good standards created for good reasons, widely adopted, and working. But the problem is that the rest of the world outside military simulation doesn't use HLA/DIS. They use web services and standardised means of communications like Google protocol buffers in order to deal with these very complex systems.” There is also the HLA/DIS skill issue. Morrison continued, “You can leverage the technologies that the broader industry is using, exploiting its skill base and not needing to train an HLA engineer. Over the next 10 years, everybody's applications are going to move to a much more web-enabled architecture based upon web services or micro services.”
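As a hedged illustration of the web-enabled architecture Morrison anticipates, the sketch below publishes an entity-state update as JSON over HTTP rather than as a DIS PDU or HLA object update; the endpoint URL and field names are invented for the example.

```python
# Minimal sketch of web-service interoperability: an entity-state update
# sent as JSON over HTTP instead of a DIS PDU or HLA object update.
# The URL and message schema are hypothetical examples.
import requests

entity_state = {
    "entity_id": "UK-ARMY-0042",
    "kind": "ground_vehicle",
    "position": {"lat": 51.5072, "lon": -0.1276, "alt_m": 35.0},
    "velocity_mps": {"x": 4.2, "y": 0.0, "z": 0.0},
    "timestamp": "2020-01-15T10:30:00Z",
}

# Any simulation, game engine or analytics service that speaks HTTP/JSON
# can consume this update without an HLA runtime or DIS-specific skills.
response = requests.post("https://sim-gateway.example/api/v1/entities",
                         json=entity_state, timeout=5)
response.raise_for_status()
```

The point is not the specific schema but the skills base: developers familiar with commodity web services can produce and consume such messages without HLA-specific training.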
A transformation in live training is also anticipated. In MS&T Issue 6/2019, MG Gervais said “we’ve been asked to accelerate the live training environment - force-on-force, force-on-target … and to find a replacement for our legacy laser-based engagement system.”
Morrison told MS&T, “I think live training will be significantly different in 10 years. You will see a new kind of weapon tracking and hidden kill sensing rather than lasers. There'll be a much smarter implementation as we will know everybody's position in great detail. We will know what their posture is, we will know where weapons are aiming.” Morrison also spoke about 5G and its greater bandwidth which “allows you to do a lot of processing really quickly to get much more realistic results … supported by micro drones mapping out the training areas in great detail.”
Morrison also believes the US Army’s developing Integrated Visual Augmentation System (IVAS) will impact live training. IVAS is billed as a fight-rehearse-train system with Microsoft-designed prototype AR goggles based on the HoloLens 2. With IVAS and other developments “I think live training will be significantly different in 10 years,” Morrison said.
Lozé sees changes in the way S&T is procured in this decade. He calls it “business model evolution” and continued, “it is an evolution of the business model, simulation as a service, a more modular structure of simulators relying on standards. This is paving the way for new entrants in the industry that come from the non-military domain with a different expertise. We are seeing this with HTX Labs, for example, which is delivering important elements of Pilot Training Next for the USAF. This type of new player is bringing a refreshing tone to the industry and the main prime contractors are displaying a surprisingly agile mindset by teaming up with this new blood. On top of the technology innovation, we can easily foresee a strong business model innovation wave coming.”
Gaming continues to be a significant influence on simulation. Intelligence firm Newzoo expect the global games market to expand by 7.9% to US$160.5 billion in 2020. Growth is expected in subscription models, paid downloadable content, gamer data exploitation and streaming gameplay. Also of interest to the S&T community will be the impact of gaming on individuals and wider societal culture.
Ray Tracing is a graphics technology that attempts to emulate the way light works in the real world, tracing the path of simulated light as it reflects off virtual objects according to their properties. It can produce extremely realistic 3D renderings but is computationally demanding, so the launch of Nvidia's RTX 20-series graphics cards, which support real-time ray tracing in games, was a significant milestone. Over the 2020s we can expect ray tracing to become more commonplace in games and simulations.
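The core of the technique can be sketched in a few lines: cast a ray and test what it intersects. The bare-bones ray-sphere test below shows only the “tracing” step described above; shading, reflections and GPU acceleration sit on top of it in a real renderer.

```python
# Bare-bones ray tracing step: does a ray from the eye hit a sphere?
# Production renderers add shading, reflection and GPU acceleration on top.
import math

def ray_hits_sphere(origin, direction, centre, radius):
    """Return the distance along the ray to the first hit, or None."""
    # Vector from the ray origin to the sphere centre
    oc = [o - c for o, c in zip(origin, centre)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return None                      # the ray misses the sphere
    return (-b - math.sqrt(discriminant)) / (2 * a)

# A ray fired along the z-axis towards a unit sphere five units away
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```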
Gamification is the application of game-design elements and game principles in non-game contexts such as training. It aims to leverage people's natural desires such as competition, social interaction, status, and learning.
AI advances are widely reported, but for S&T they offer the potential to reduce training requirements, personalise learning, support virtual mentoring and tutoring, enable advanced analytics, create training content, adapt training automatically to the skill level of the trainee, make simulation design and production faster and more cost-effective, and improve AI-driven simulation characters and effects.
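As a simple, hypothetical illustration of training that automatically adapts to the skill level of the trainee, the rule below adjusts scenario difficulty from recent scores; the thresholds and scales are invented for the example.

```python
# Illustrative adaptive-difficulty rule: raise or lower scenario difficulty
# from a trainee's recent scores. Thresholds and scales are invented.
def next_difficulty(current: int, recent_scores: list[float]) -> int:
    """Difficulty 1 (easiest) to 10 (hardest); scores 0-100."""
    if not recent_scores:
        return current
    average = sum(recent_scores) / len(recent_scores)
    if average >= 85:          # trainee is comfortable: stretch them
        return min(current + 1, 10)
    if average < 60:           # trainee is struggling: consolidate
        return max(current - 1, 1)
    return current             # performing in the target band: hold

print(next_difficulty(4, [88, 92, 85]))  # 5
print(next_difficulty(4, [55, 48, 62]))  # 3
```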
Streaming multimedia including music, video and gaming is now commonplace as Internet bandwidth and computing capabilities increase. The streaming or “Netflixisation” of training and education content is growing, over both dedicated and generic platforms such as Blackboard and YouTube respectively.
Edge Computing shifts the emphasis from wholly cloud computing to placing more computation and data storage at the location where it is needed. This reduces latency and can have security benefits. The Internet of Things is likely to drive Edge Computing.
Quantum Computing moves beyond the conventional 1s and 0s of classical computing to qubits that can exist in multiple states at once, promising an exponential rise in processing capability for certain problems. It may affect not only S&T technologies, for example AI capabilities, but also the training requirements of other domains such as cyber and long-term forecasting.
Analytics supports effective decision making through the capture, interpretation, and communication of meaningful patterns in data. As the ability to analyse data grows, for example through advances in big data and machine learning, analytics offers the prospect of better understanding training outcomes, tailoring training to the trainee, and improving enterprise-level understanding of the cost benefits of training.
Blockchains are distributed ledgers of records (or units of value) that are transparently shared by communities. The EU's 2017 Blockchain in Education report noted blockchains could support “the award of qualifications, licensing and accreditation, management of student records, intellectual property management and payments.” IBM is working towards a private and secure blockchain-based framework for providing every person, organization, and connected device a permanent identity or “Knowledge Score”.
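A modest sketch of the kind of pattern-finding meant here: aggregating training records and flagging trainees whose scores are trending downwards. The record format is a hypothetical example, not any particular system's schema.

```python
# Simple training-analytics sketch: flag trainees whose recent scores are
# trending downwards. The record format is a hypothetical example.
from collections import defaultdict

records = [
    {"trainee": "A", "week": 1, "score": 78}, {"trainee": "A", "week": 2, "score": 82},
    {"trainee": "A", "week": 3, "score": 85},
    {"trainee": "B", "week": 1, "score": 81}, {"trainee": "B", "week": 2, "score": 74},
    {"trainee": "B", "week": 3, "score": 66},
]

# Group each trainee's scores in week order
by_trainee = defaultdict(list)
for r in sorted(records, key=lambda r: r["week"]):
    by_trainee[r["trainee"]].append(r["score"])

# Flag anyone whose latest score is below their first
for trainee, scores in by_trainee.items():
    if scores[-1] - scores[0] < 0:
        print(f"Trainee {trainee}: scores falling ({scores[0]} to {scores[-1]}), review needed")
```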
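The record-keeping principle is easy to demonstrate: if each new credential record carries a hash of the previous one, tampering with an earlier record breaks the chain. The toy sketch below illustrates only that property; it is not the IBM framework or a production blockchain.

```python
# Toy hash-chained record book, illustrating why tampering with an earlier
# student record is detectable. Not a real blockchain implementation.
import hashlib
import json

def add_record(chain: list, payload: dict) -> None:
    previous_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"payload": payload, "prev": previous_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)

def chain_is_valid(chain: list) -> bool:
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"payload": block["payload"], "prev": block["prev"]}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != expected_prev or block["hash"] != recomputed:
            return False
    return True

ledger = []
add_record(ledger, {"student": "S1234", "qualification": "Basic Flying Training", "result": "pass"})
add_record(ledger, {"student": "S1234", "qualification": "Advanced Jet Training", "result": "pass"})
print(chain_is_valid(ledger))        # True
ledger[0]["payload"]["result"] = "fail"
print(chain_is_valid(ledger))        # False: tampering breaks the chain
```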
Digital Twins have no single definition as they tend to be defined through the perspective of the user community, e.g. manufacturing. Some definitions focus on the digital representation of a capability’s current state, others its lifecycle. If digital twins become more commonplace in wider defence then this could have significant advantages with S&T systems better linked to the actual systems and platforms.
Extended Reality (XR) encompasses augmented reality (AR), mixed reality (MR) and virtual reality (VR). The technologies have been under development for decades but miniaturization, increasing processing power and significant investments by consumer and ICT companies are accelerating progress. Thus far there has not been a dramatic take up in XR by general consumers but globally the enterprise market appears to be steadily embracing XR, especially in support of training.
Haptic Technology refers to any technology that can create an experience of touch by applying forces, vibrations, or motions to the user. Haptics can provide more presence and agency for users in virtual worlds, especially in VR applications.
Wearables are smart electronic devices that can be incorporated into clothing or worn on the body as implants or accessories, for example, an activity or heart rate tracker. Driven by the consumer and healthcare sectors and wider Internet of Things industry, wearables offer the prospect of improved measurement and effectiveness feedback from training.
Internet of Senses, a term coined by Ericsson, describes technologies through which humans can routinely interact with the Internet using all our senses: sight, smell, taste, touch, and hearing, going beyond today's visuals and audio.
Spatial Computing refers to the practice of using head and body physical actions as inputs for interactive digital media systems. With the merging of XR, haptic and other interactive technologies with the real and virtual worlds this term may take on more widespread use.
5G networks offer faster speeds, more reliable connections and lower latency on smartphones and other devices, with average download speeds of around 1 Gbps. 5G rollout is currently taking place across the world, with countries such as Switzerland reaching 90% of their population by the end of 2019, and by the end of the 2020s 5G is likely to be the norm. 5G may increase the geographic flexibility of training systems and reduce the weight and size of XR headsets, as processing can be done elsewhere.
APIs, or application programming interfaces, define how different software components interact and are intended to simplify the implementation and maintenance of software. The simulation interoperability standard HLA includes APIs, but some propose a wholly API-driven approach to interoperability, bringing the industry more in line with the wider IT world.
LVC is the integration of live, virtual, and constructive simulations, blending real people in an instrumented live environment with real people in virtual simulations, supported by simulated people and systems. A much richer and more cost-effective training environment can be established by exploiting the relative advantages of L, V, and C simulations. Improved LVC interoperability will enable more agile and routine LVC-based training.