Artificial intelligence (AI) is increasingly being applied in training programs across high-risk communities, including civil aviation, the military and other safety-critical industries. Halldale recently hosted the “AI in Action: Episode 1” podcast, which brought together industry experts to discuss the transformative role of AI in workforce learning. This report explores key insights from the discussion, which was moderated by Andy Fawkes, Halldale’s military simulation training correspondent.
According to Rick Adams, an aviation journalist with over 40 years of experience and author of a new book, “The Robot in the Simulator: Artificial Intelligence in Aviation Training,” it is important to point out that AI is not magic. “It is a great marketing term, but it is not artificial and it is not intelligent, at least not yet. It can be an extremely useful tool for crunching very large sets of information in a superfast time. I think of it as supercomputing on steroids in a training or learning context,” he said at the start of the discussion. “How useful it is depends on the quality and relevance of the data. It does have the promise of incorporating data that has not previously been available to instructors and providing insights beyond human observation.”
Cedric Paillard, an airline pilot and high-tech executive, is chief executive officer (CEO) of the Airline Pilot Club (APC), where he oversees the integration of AI into aviation recruitment and training. He also believes that AI is not magic. “I see AI as an intelligent system—not necessarily in the sense of independent thinking, but rather as a tool that enhances our own intelligence. In the context of pilot training and recruitment, AI helps personalise and optimise the support of pilots and instructors,” he said.
The main difference today compared to the past is the ability to analyze large datasets, which can vary in nature; for example, they may include written assessments from instructors on pilot records or detailed flight data. AI enables the integration and processing of this information within “intelligent systems,” allowing for more informed decision-making in training environments. By combining different data sources, AI can provide real-time feedback, tailor training to individual needs, and ensure alignment with industry standards such as evidence-based training (EBT) and competency-based training and assessment (CBTA), explained Paillard. “So, it is an ‘intelligent system’ in the way that it combines different types of data. There is nothing magic about it,” he said.
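The kind of aggregation Paillard describes can be illustrated with a minimal sketch. The pilot IDs, competencies, events, and the `training_profile` function below are all hypothetical, invented for illustration; a real system would draw on far richer, validated data sources.

```python
# Illustrative sketch (hypothetical data): combining two data sources --
# instructor assessments and flight-data events -- into one per-pilot
# view, the kind of aggregation an "intelligent system" performs.

from statistics import mean

# Hypothetical instructor assessments: (pilot_id, competency, grade 1-5)
assessments = [
    ("P001", "manual_flying", 4),
    ("P001", "communication", 3),
    ("P002", "manual_flying", 2),
]

# Hypothetical flight-data events: (pilot_id, event)
flight_events = [
    ("P001", "stable_approach"),
    ("P002", "unstable_approach"),
    ("P002", "unstable_approach"),
]

def training_profile(pilot_id):
    """Merge both sources into a single record for decision support."""
    grades = [g for p, _, g in assessments if p == pilot_id]
    unstable = sum(1 for p, e in flight_events
                   if p == pilot_id and e == "unstable_approach")
    return {
        "pilot": pilot_id,
        "avg_grade": mean(grades) if grades else None,
        "unstable_approaches": unstable,
        # Flag a pilot for tailored training when either source raises a concern
        "needs_focus": bool(grades and mean(grades) < 3) or unstable >= 2,
    }

print(training_profile("P002"))
```

The point is not the arithmetic but the join: neither source alone flags the weakness as clearly as the combined record does.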
As to whether training organizations are taking a strategic grip of their data in order to exploit AI, the starting point must always be the benefits to be reaped, according to Paillard. “The real value of AI—whether it is generative AI, machine learning, or any other technology under the AI umbrella—is its ability to address specific needs with precision,” he said. “Take, for example, Generation Z pilots. AI enables us to create personalised training experiences tailored to their learning preferences. Through microlearning, we can focus on the exact skills they need to develop, helping them progress efficiently—whether it is to become an airline pilot or to move from pilot to captain. This translates into significant efficiency gains, such as faster training and reduced repetitive simulations, which are well-recognized benefits.”
One of the biggest challenges, however, is trust in the adoption of AI, according to Paillard. “The key is demonstrating to instructors and pilots that they can trust AI. And this will not happen overnight—it requires proving that AI operates within established industry standards, ensuring data security and maintaining high levels of privacy,” he said.
Effective implementation of AI in training relies on structured and high-quality data, according to Colin Hillier, a former Royal Navy officer and CEO of Mission Decisions, a company providing AI solutions for human-machine teaming. “The biggest challenge is data access and structuring it properly,” he said. “Most of the time it is unstructured in PDFs, in text, in a document somewhere not labelled, not indexed. So, the latest crop of generative AI tools actually help us with that. In the past we used people to structure the data, and that is still the challenge.”
When it comes to AI tools, there is little need to spend much time on the models themselves, explained Hillier. “The models are powerful, but they are only as good as the data they are trained on. The majority of our work—about 80 to 90%—is focused on data extraction. When working with a large organisation, the key challenge is accessing that data. Where is it stored? Is it in a simulator, within documents, or embedded in another piece of software? Often, the data exists, but there is no straightforward way to retrieve it, especially if there is no application programming interface (API) to facilitate access,” he said. “The real challenge remains efficient data extraction—making sure AI can actually use the information that is available.”
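The structuring step Hillier describes can be sketched in miniature. The note format, trainee IDs, and `extract_records` helper below are hypothetical, invented for illustration; generative tools are aimed at text far messier than this, where a fixed pattern would not suffice.

```python
# Illustrative sketch (hypothetical data): turning unstructured instructor
# notes into labelled, indexed records -- the "data structuring" work that
# Hillier estimates at 80-90% of the effort. A simple pattern stands in
# here for what a generative model would do on messier text.

import re

raw_notes = """
2024-03-01 | Trainee A-17: good crosswind landing, minor flare issue.
2024-03-02 | Trainee B-09: missed checklist item during engine-out drill.
"""

PATTERN = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) \| Trainee (?P<trainee>[A-Z]-\d{2}): (?P<note>.+)"
)

def extract_records(text):
    """Parse free text into labelled records that downstream tools can index."""
    return [m.groupdict() for m in PATTERN.finditer(text)]

records = extract_records(raw_notes)
print(records[0])
# e.g. {'date': '2024-03-01', 'trainee': 'A-17', 'note': 'good crosswind landing, minor flare issue.'}
```

Once the notes carry labels like `date` and `trainee`, they can be indexed, queried, and joined with other sources, which is what makes them usable by AI at all.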
Despite AI’s transformative capabilities, human oversight remains essential to ensure reliability, fairness, and ethical considerations in training. Adams cautioned against the risks of AI-generated misinformation. “What made people nervous early on with ChatGPT was hearing about things like hallucinations, whereby if it did not know the answer, it would make something up. This is obviously very concerning in a safety-critical mission environment,” he affirmed. “Aviation agencies are working to establish governance frameworks for AI in the industry. Since forming a task force in 2018, the European Union Aviation Safety Agency (EASA) has released two versions of its AI roadmap, along with concept papers on machine learning and related technologies. The US Federal Aviation Administration (FAA) released its first conceptual AI roadmap last year, and it is actively working on creating a framework for the aviation community to follow.”
When it comes to training, most organizations hesitate to adopt new approaches unless they are explicitly authorized and provide training credits. Whether AI-driven training will be widely accepted depends on regulatory approval and integration into recognized training methodologies, emphasized Adams.
The point of AI implementation in training is its ability to support the instructors and the team that actually produces the training, not to replace them. “It is a question of actually providing the right set of data at the right time for making the right decision and aligning the training outcomes or the recruitment outcomes with the objective of the airline or the flight school, and that is where AI actually accelerates things for us,” he said.
Noting that recorded simulator data often lacks context in the form of instructor notes, scenario details, or audio recordings, Hillier highlighted that even when contextual information exists, the data is not labelled in a way that AI can use effectively. “People tend to be sceptical about data, and what we find is that, without clear planning about how the data will be used, it becomes limited in value,” he said. “This is a major challenge for AI companies. Just because data exists on the internet does not mean it is useful. The data must be validated to ensure it is meaningful and appropriate for its intended purpose.”
AI is set to reshape training methodologies across aviation, the military and safety-critical industries. However, as highlighted in the podcast, its true value lies not in replacing human expertise but in augmenting decision-making, personalizing learning, and optimizing training and hiring processes. Trust remains a barrier to widespread AI adoption in training, with concerns over data security, misinformation and regulatory approval still in play. The effectiveness of AI-driven solutions ultimately depends on high-quality data, strategic implementation, and – above all – human oversight.
[Editor’s note: Mario Pierobon is Halldale’s special correspondent based in Italy. We look forward to his future contributions to our civil aviation, safety-critical industries and military departments.]