VR or Live Training on the Flight Deck


Live OJT is currently the only way to train carrier flight deck crews. The US Navy would like to change that. MS&T’s Chuck Weirauch writes.

Amidst what has been termed controlled chaos, readying aircraft for launch, shooting them off the catapults and recovering them aboard are perhaps the most dangerous operations for any team of sailors afloat. Yet live on-the-job training conducted during actual mission operations is currently the only way for carrier flight deck personnel to learn.

A closer look at the flight deck personnel will show some with UI – Under Instruction – emblazoned on their helmets. There is an ongoing need to train new flight deck crew members, yet currently no place to learn the job other than OJT, according to US Navy Commander Hank Phillips, Military Director for Research and Engineering at the Naval Air Warfare Center Training Systems Division (NAWCTSD).

“Currently there is no other environment for learning aside from being on the flight deck with a person standing behind you holding you by the scruff of the neck,” Phillips pointed out. “There is no virtual representation of that environment where you must learn the comms, the direction the danger is coming from and what you are supposed to be anticipating next. This situation is begging for a virtual environment training place, because you have to understand where you are in three-dimensional space – and VR goggles are perfect for that. You have to be able to process the visual and auditory cues coming at you from all directions at once, and you have got to be able to work with people who are separated from you by short and long distances.”

Prototype Solution

NAWCTSD’s prototype solution to this training dilemma, the Flight Deck Crew Refresher Training Expansion Packs (FDCR TEP), will undergo its initial testing and a parallel training effectiveness study this May at Naval Air Station Oceana in Virginia Beach, VA. The process will run through September of this year before the new trainer is released to the Fleet, according to Tyson Griffin, head of the training command’s Advanced Modeling and Simulation Branch.

The FDCR TEP trainer is just one of several gaming-technology-based products under development at NAWCTSD, and the first to incorporate VR headset hardware and software. The new trainer employs the Unity game engine, along with expansion packs, the traditional way game developers extend their original products.

Shooting Aircraft

From the carrier’s control tower, the five members of the Primary Flight Control (PRI-FLY) team – the Air Boss, Mini Boss, Handler, Air Bos’n and Landing Signal Officer (LSO) – orchestrate aircraft movement, directing the dozens of helmet- and jersey-clad flight deck personnel below. Each helmet and jersey color indicates a deck crew member’s role, from the blue of aircraft handlers to the green of catapult and arresting gear crews, the purple of aviation fuelers and the red of ordnance handlers.

The on-deck “shooter” is a catapult officer who must first verify that all pre-flight checks are completed, with aircraft, deck team members and equipment in the proper position, before he gives the signal to launch the aircraft. The shooter must complete three to six months of on-the-job training before he can qualify for the position. The FDCR TEP is designed in part to replace a legacy projection-based shooter trainer.

According to Phillips, the FDCR TEP has now been expanded to incorporate VR headset-based training for the shooter. The shooter trainee is linked with the PRI-FLY team and the LSO for virtual networked team training, with the control tower personnel observing deck operations via a monitor.

Shooter’s VR

“The current shooter trainer is essentially able to train only one person at a time,” Griffin said. “And since it is a projection-based system, it has only a single point of view for the shooter. It does not give them full 360-degree situational awareness of their surroundings. With the new trainer, we can build a full virtual environment of the flight deck. We can drop that shooter into that flight deck environment with a fully immersive VR headset so that they can understand what is going on around them much better. We can also have multiple role players, whether they are synthetic role players or real live role players, in the environment with them. So it makes for much more of a crew-coordinated exercise as well.”

Once the initial FDCR TEP has completed its testing and parallel training effectiveness program, it will be delivered to the Fleet this year. The trainer also has the capacity, via expansion packs, to add other flight deck crew members to the networked virtual training environment. However, there are currently neither requirements nor funding to develop a training expansion pack for a catapult officer, arresting gear officers or aircraft handlers, for example. The FDCR TEP program is funded by the Office of Naval Research (ONR).

“We are talking about a series of flight deck crew refresher packages,” Griffin reported. “We have developed a framework within the Unity game engine that allows us to build out multiple roles within the flight deck crew itself with expansion packs. Expansion packs could be for the shooting officer, the air boss, the mini-boss, the LSO, and also could be for the weapons handlers, the fuelers – all the different color shirts that you would have on the flight deck. If we start with one or two flight deck crew positions at a time per expansion pack, as the idea takes hold we can build more and more expansion packs in the future until we cover every single one of those crew members that works on a flight deck. So now they could all train together in this virtual synthetic environment within the game engine.”
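Griffin’s description suggests a plug-in architecture: a core flight deck scene into which role-specific “packs” are installed, each played either by a human trainee or a synthetic role player. The sketch below is purely illustrative of that pattern, not NAWCTSD’s code; Unity development is normally done in C#, and every name here is hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

# Hypothetical sketch of the expansion-pack pattern Griffin describes:
# a core flight-deck scene plus pluggable crew-role packs.

@dataclass
class CrewRolePack:
    role: str            # e.g. "shooter", "air_boss", "lso"
    jersey_color: str    # color-coded deck role, e.g. "yellow"
    synthetic: bool      # True = AI role player, False = human trainee
    behavior: Callable[["DeckScene"], None] = lambda scene: None

@dataclass
class DeckScene:
    packs: Dict[str, CrewRolePack] = field(default_factory=dict)

    def install(self, pack: CrewRolePack) -> None:
        """Add a crew position to the shared synthetic environment."""
        self.packs[pack.role] = pack

    def tick(self) -> None:
        """Advance one frame: run each synthetic role player's behavior;
        human trainees are driven by their VR headsets instead."""
        for pack in self.packs.values():
            if pack.synthetic:
                pack.behavior(self)

# Start with one or two positions, then install more packs as built.
scene = DeckScene()
scene.install(CrewRolePack("shooter", "yellow", synthetic=False))
scene.install(CrewRolePack("air_boss", "yellow", synthetic=True))
scene.tick()
```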

A fully immersive trainer such as the FDCR TEP simply would not be possible today without the application of low-cost gaming technologies, Phillips emphasized.

“We are heavily involved in employing off-the-shelf gaming technologies to build learning environments,” he pointed out. “With this lower-cost technology now, we can go and actually put it in the Fleet without worrying that we can’t sustain it. For the shooter, it makes sense to put him into the VR. But for the guys that are operating the deck edge gear, it does not make sense to put them into the VR necessarily. They have to deal with different types of equipment where maybe they need peripheral vision, or need to talk to someone who is right next to them, so we utilize different types of technologies when it is appropriate to enable the types of training that you need to get across to that individual operator.”

Other Applications

Along with the FDCR TEP, according to Phillips, NAWCTSD is also advancing the use of VR technology in several other training applications. They include the Virtual Reality Advanced Precision Kill Weapon System (APKWS) Trainer prototype for the MH-60R Seahawk helicopter, scheduled for delivery to Helicopter Maritime Strike Squadron (HSM-40) at Naval Station Mayport, FL this April. The Orlando command is also evaluating a vendor-provided networkable VR simulation of the F/A-18 that incorporates the capability to interact with Multi-Function Display (MFD) bezels by capturing hand and finger position with cameras. This enables the pilot to “push” virtual buttons and navigate through MFD pages, to the degree that they are readable at current headset visual resolution, he reported.
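The core of such camera-based interaction is a simple proximity test: the tracker supplies a fingertip position in cockpit space, and each bezel button is modeled as a small volume around its 3D location. The following is a minimal, hypothetical sketch of that idea, not the vendor’s code; the button names, coordinates and threshold are all illustrative.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def pressed_button(fingertip, buttons, press_radius=0.012):
    """Return the first button whose center lies within ~1.2 cm of the
    camera-tracked fingertip, or None. The radius is a tunable
    threshold trading off false presses against missed presses."""
    for name, center in buttons.items():
        if distance(fingertip, center) <= press_radius:
            return name
    return None

# Example: two bezel buttons along the edge of an MFD, in metres.
mfd_buttons = {"OSB-1": (0.40, 0.10, 0.55), "OSB-2": (0.40, 0.07, 0.55)}
print(pressed_button((0.401, 0.099, 0.551), mfd_buttons))  # -> "OSB-1"
```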

In addition, a VR Testbed is being used to evaluate the training effectiveness of two gaming features with respect to performance and motivation. The product employed on the Testbed for the training effectiveness study is the game-based Periscope Operator Adaptive Trainer (POAT) for tactical training.

And there is broad and growing interest in leveraging the ‘realities’ for training. The 2018 Serious Games Showcase & Challenge (SGS&C) is featuring a new submission category focused on gaming products that employ VR, AR and mixed reality technologies. Dubbed the XR Technologies category, it replaces the Mobile category, which was discontinued in 2017.

New XR Category for SGS&C 2018

“We used to have the Mobile category,” explained Jennifer McNamara, Director of the Serious Games Showcase & Challenge and Vice President for Serious Games at BreakAway Ltd. “When we originally created it, we were encouraging people to build games for the mobile platform. What we found over the years is that almost all of our games were coming in with a mobile aspect, so we felt that we did not need a stand-alone category for that technology any more. We were looking for where we could inspire the next novel technology for new and innovative ways of applying learning, and it was clear to us that XR technology would be that new category.”

Part of the inspiration for the new category was that, for the first time, four of the 2017 SGS&C finalists were VR-based. One, Earthlight Arcade, won the 2017 People’s Choice award; the game allows the viewer to explore a space station in a virtual reality environment using either the HTC Vive or the Oculus Rift headset. Finalist RoboEngineers allows players to assemble robots and test them in a virtual play experience. Finalist Bionautica Trail employs a VR headset to let runners view one of the most scenic trails in the world; it is used as part of a cardiac rehabilitation program. Finalist medRoom Academy is a VR platform for medical training.

Yet another reason for establishing the XR Technologies category is that almost every entry in the Challenge now has some XR component integrated into the product, McNamara pointed out. However, although the technologies have a lot to offer, they are at what she calls the “shiny object” phase: new technologies that are not yet offering real applications for learning, but serve more as technology demonstrations.

“As the Showcase and Challenge, we have always seen ourselves as leading the (I/ITSEC) floor in useful and novel applications of gaming technologies, but also in telling how they really impact learning and education,” McNamara stated. “So our hope was that by including the technologies as a category in the 2018 Challenge, we will be able to move away from people selling technology demos and towards people bringing actual training and learning products to the floor.”

Since the SGS&C has a history of bringing in developers from outside the I/ITSEC community, its organizers also hope that opening up the XR category will help draw even more new participants into the arena. But while VR, AR and mixed reality offer much new potential for enhancing learning, users can experience cognitive issues with their use. That is why McNamara calls for more research into these technologies before their widespread application to learning and training.

“I think the biggest challenge for the community right now is figuring out the appropriate way to use these technologies and leverage them in blended solutions,” McNamara summed up. “Employing the XR technologies along with more traditional environments can help to prevent them from becoming too overwhelming for users.”

The entry submission period for the SGS&C 2018 competition opens August 1st and closes at the end of September. For more information about the event and how to submit an entry, go to www.sgschallenge.com. Information and descriptions about the 2017 competition winners and finalists can be found at the site as well. – Chuck Weirauch

Originally published in MS&T issue 2, 2018.
