Instructors are being untethered, and visual displays are becoming the new fashion statement. Rick Adams looks at the evolution of visualization, motion refinement, and instructor mobility.
When man first drew pictures of animals on the walls of a cave in Lascaux, in what is now south-west France, he was perhaps helping his Paleolithic comrades “visualize” the “target” for their next hunting “mission.”
Since that time, about 15,000 years ago, we’ve progressed from images projected on large wall screens (my dad used to use a bedsheet for home movies) to computerized monitors, to hand-held tablets and smartphones, and now to Dick Tracy-esque wristwatches and miniature optics mounted on eyeglass-like devices.
Now researchers are talking about projecting images onto special contact lenses or even directly onto the retina of the eye. And with the recent brain-to-brain computer-aided telepathy experiments, is the day far off when visualization is simply relaying virtual environments or situational data straight into your neural nodes?
With the advent of Google Glass and Oculus Rift, display technology appears to be moving quickly toward wearable devices, even as military training is just getting used to adopting the mobility benefits of iPads and iPhones. Hands-free wearable displays are on the verge of enabling a range of new capabilities for the warfighter, first responders, and training developers.
Reaching for the Glass Ring
Many of the companies I spoke with about forward-looking visual technologies acknowledged they are experimenting with Google Glass – and some with Oculus Rift, a virtual reality headset for gaming – but most are unwilling to go public yet with the details of their research.
LeeAnn Ridgeway, Vice President and General Manager of Simulation and Training Solutions, said Rockwell Collins is exploring linking their EP image generator and avionics technologies with Glass and Rift applications. “It’s pretty exciting stuff,” she said. She expects applications using the new visual hardware to enter the training market “pretty soon, possibly in the next 12 months.”
By the way, a new Rockwell Collins EP8000-plus image generator may be unveiled in December at the Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC).
Google Glass is a US$1,500 wearable computer. The name comes from the placement of the viewing optics on one side of a pair of faux eyeglasses or sunglasses. Since the screen is almost directly in front of the wearer’s eye, the impression is similar to a larger monitor several inches from the face, while still enabling the wearer to see the real world. With a touchpad on the side of the glass frame, users can control the device by “swiping through” an interface displayed on the screen. (Swiping training and practice is required to get the hang of it.)
Several military organizations are “going Glass.” One is the awkwardly acronymed BATMAN II (Battlefield Air Targeting Man-Aided kNowledge) program at the US Air Force Research Laboratory’s 711th Human Performance Wing, which is testing Google Glass for “augmented vision” applications for pararescue jumpers and terminal attack controllers. The pararescuers are special forces with emergency trauma training; the Glass, paired with a smartphone, might display a patient’s vital signs to help determine which casualties require the most urgent care. The Air Force said it is developing proprietary software to enhance the Android operating system that Google uses.
The Battelle Memorial Institute recently demonstrated a Glass app called Tactical Augmented Reality Applications (TARA) using a wireless interface and video recognition with chemical, biological, and radiological/nuclear sensors. Battelle says TARA would enable first responders to quickly assess if there are hazardous agents present. If there’s a hazard, the Google Glass will flash red.
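The alert logic itself is simple to picture. Here is a minimal Python sketch of the kind of threshold check such an app implies – the sensor names and limits are illustrative assumptions, since Battelle has not published TARA’s internals:

```python
# A minimal sketch of the alert logic a TARA-style app implies: poll CBRN
# sensor readings and flash red when any agent crosses a threshold.
# Sensor names and limits are illustrative assumptions, not Battelle's.
THRESHOLDS = {"chem_agent_ppm": 5.0, "bio_agent_index": 0.8, "gamma_dose_usv_h": 10.0}

def assess(readings):
    """Return the list of hazards detected in one set of sensor readings."""
    return [name for name, limit in THRESHOLDS.items()
            if readings.get(name, 0.0) > limit]

readings = {"chem_agent_ppm": 7.2, "bio_agent_index": 0.1, "gamma_dose_usv_h": 2.4}
hazards = assess(readings)
if hazards:
    print(f"FLASH RED: hazardous agents detected -> {hazards}")
```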
The video recognition could be paired with facial recognition software to alert soldiers or law enforcement officers when a “person of interest” enters the field of view. UK-based company Golden-I (an homage to the James Bond film GoldenEye, no doubt) has created a Glass-like headset for bobbies that handles facial recognition, scans license plates, and monitors vital signs.
The New York City police department is experimenting with Glass for everything from terror investigations to routine street patrols.
Defense Research and Development Canada has commissioned research into integrating “wearables,” such as Glass or body-mounted cameras, with smartphones, all connected to peer-to-peer networks.
Eyewear company Vuzix, together with display maker Six15 Technologies, has received a contract from the US Office of Naval Research to integrate “Glass-ish” optics into a standard pair of military goggles. The objective is to insert virtual characters and effects into a sailor’s field of view for an immersive, personalized training experience.
Google is no newcomer to military and government visualization applications. In 2004, they purchased satellite mapping company Keyhole – a name that evokes the Central Intelligence Agency’s original spy-in-the-sky operation – and morphed it into Google Earth. They have also acquired a raft of robotics and optics-focused companies such as Redwood Robotics, Autofuss, and Industrial Perception. In 2012 Google hired the former head of the US Defense Advanced Research Projects Agency (DARPA), Regina Dugan. And, of course, Google’s alleged collaboration with the controversial National Security Agency (NSA) electronic surveillance programs has been widely reported.
The Oculus Rift facemask-style display is providing armored personnel carrier (APC) drivers in Norway with a video game-like experience … while they drive the real APC. Four cameras mounted on the outside of the APC feed a seamless wraparound image to the Rift headset, enabling an unobstructed view of the terrain, obstacles, and threats.
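As a rough sketch of the wraparound idea – assuming simple side-by-side tiling of the four feeds rather than the warped, blended stitching a real system would need – the headset’s yaw just selects a slice of a 360-degree image strip:

```python
import numpy as np

# Tile four camera frames into a 360-degree strip and crop the slice the
# driver is facing. Plain concatenation is an illustrative simplification.
def wraparound_view(frames, yaw_deg, fov_deg=90.0):
    strip = np.concatenate(frames, axis=1)            # side-by-side 360 strip
    width = strip.shape[1]
    center = int((yaw_deg % 360.0) / 360.0 * width)   # column the driver faces
    half = int(fov_deg / 360.0 * width / 2)
    cols = [(center + i) % width for i in range(-half, half)]  # wrap the seam
    return strip[:, cols]

# Four synthetic 240x320 RGB frames stand in for the hull cameras.
frames = [np.full((240, 320, 3), shade, dtype=np.uint8) for shade in (60, 120, 180, 240)]
view = wraparound_view(frames, yaw_deg=200.0)
print(view.shape)  # (240, 320, 3): the viewport sent to the headset
```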
Oculus Rift could also be the platform of choice for displaying holographic content. A “cloud graphics” company known as OTOY is expected to introduce a “holographic capture and display system” in 2015. Imagine the training possibilities for that.
BYOT – Bring Your Own Tablet
In the here-and-now technology of traditional flight simulators, vendors have been focusing in recent months on the back end of the device – the area previously known as the instructor operator station (IOS), but which may come to be referred to as the instructor’s “command chair.”
Three of the major simulator manufacturers – CAE, FlightSafety International, and Frasca International – have transformed the IOS into a Star Trek-style command post, complete with both large touch-screens and smaller-format tablets through which the instructor can initiate training scenarios, modify the synthetic weather conditions, or pause the session and do a mini-debrief on the lesson just learned.
“The IOS is no longer fixed to the back wall. It’s movable, a self-contained unit. It’s a chair with displays,” Bruno Cacciola, CAE’s Director of Product Strategy and Marketing, told MS&T.
“We really looked at the evolution of the new full-flight simulator from all perspectives – the instructor’s point of view, the crew point of view, and the operational point of view. It’s all about elevating the training experience.”
The redesigned CAE IOS footprint features two large touch-screen monitors, but the real key is that instructors can bring their own Apple iPad or iPhone into the simulator and synchronize their personal device with the instructor station software.
“Mobile device access has become almost second nature,” said Cacciola. “The capability to use his or her own device enables more control for the instructor, including the capability to work beyond the simulator as well.” The iPad/iPhone can also serve as a third display monitor.
CAE’s enhanced IOS also features an automatic event capture system, much like a flight data recorder in an aircraft. Currently, about 20 types of events are “baked in,” according to Marc St-Hilaire, Vice President, Technology and Innovation for CAE. If a student exceeds the specified parameters for a profile – for example, too high or too low on the glidepath, or excessive bank too close to the ground – the exceedance is flagged, and the instructor has 3D photos available to show the crew where they erred.
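The flagging logic is easy to picture. Here is a minimal Python sketch of threshold-based exceedance capture – the parameter names and limits are my own illustrative assumptions, not CAE’s baked-in event definitions:

```python
from dataclasses import dataclass

# Hypothetical profile limits -- illustrative values, not CAE's parameters.
@dataclass
class ProfileLimits:
    max_glideslope_dev_dots: float = 1.0  # allowable glideslope deviation
    max_bank_deg_low_alt: float = 30.0    # max bank below the altitude floor
    low_alt_floor_ft: float = 1000.0      # "too close to the ground" threshold

def flag_exceedances(sample, limits):
    """Return the events flagged for one telemetry sample."""
    events = []
    if abs(sample["glideslope_dev_dots"]) > limits.max_glideslope_dev_dots:
        events.append("glideslope deviation")
    if (sample["altitude_agl_ft"] < limits.low_alt_floor_ft
            and abs(sample["bank_deg"]) > limits.max_bank_deg_low_alt):
        events.append("excessive bank near ground")
    return events

# One out-of-tolerance sample: low, off glideslope, and over-banked.
sample = {"glideslope_dev_dots": 1.4, "altitude_agl_ft": 600.0, "bank_deg": 35.0}
print(flag_exceedances(sample, ProfileLimits()))
```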
Cacciola explained that a portion of a training scenario can be debriefed immediately, rather than waiting for the formal debrief at the end of the session. “We wanted to bring a tool to the instructor to use in real time, a minute or two after something happens. It’s more effective.”
FlightSafety International’s John Van Maren, Vice President, Simulation, told us the independent movement of their new instructor chair concept “more readily positions the instructor to interact with the pilots.”
He said FlightSafety convened a working group of customers and company engineers. “The group put a lot of focus on the flow, and designed the new IOS to reduce the instructor’s workload.”
FlightSafety’s next-gen IOS is “a totally independent island” with “maximum flexibility to position monitors at different angles – it’s like setting a rear-view mirror,” Van Maren explained.
Overall, the instructor area is also more spacious, allowing easier egress. “The entire back end was redesigned from the customer perspective.” The design is also modular and can be quickly assembled onto different-sized motion bases.
Frasca International’s new IOS also accommodates tablets, currently Windows 8-based touch screens. President and CEO John Frasca said the IOS is “native” on the tablets, not just a remote connection. For example, an instructor can sit in the right seat of the cockpit and control the simulation session from there instead of “sitting in the back and being totally tied to monitors.”
Offline, even at home, an instructor can use the Windows tablet to generate training scenarios, including pre-loaded event triggers for weather, malfunctions and systems failures, “any number of parameters.” The scenario would then be available for use by other instructors as well.
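As a sketch of what such an offline-authored scenario might contain – the schema below is hypothetical, since Frasca’s actual format isn’t public – a scenario is essentially initial conditions plus a list of condition/event trigger pairs:

```python
import json

# A hypothetical scenario schema -- illustrative only, not Frasca's format.
scenario = {
    "name": "Coastal approach, deteriorating weather",
    "aircraft": "S-76",
    "initial_conditions": {"airport": "EGPD", "runway": "16", "fuel_lb": 1800},
    "weather": {"ceiling_ft": 800, "visibility_sm": 2.0, "wind": "150/18G25"},
    "event_triggers": [
        # Each trigger pairs a condition with a scripted event.
        {"when": {"altitude_agl_ft_below": 500}, "event": "reduce_visibility", "to_sm": 0.75},
        {"when": {"elapsed_min": 12}, "event": "malfunction", "system": "generator_1"},
    ],
}

# Saved to a file, the scenario can be shared with other instructors.
with open("coastal_approach.json", "w") as f:
    json.dump(scenario, f, indent=2)
```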
In Frasca’s IOS design, the instructor screens can be forward- or sideways-facing, and the tablet can be mounted on the arm of the chair.
In development for two years, the first deployment of Frasca’s new instructor chair is on a Sikorsky S-76 helicopter simulator delivered earlier this year to Bristow Helicopters in Scotland.
Fine-Tuning Motion
FlightSafety, which pioneered electric motion systems for flight simulation applications, has incorporated a new motion cueing system with new software algorithms on their latest FS1000 Level D simulator. “The idea is to use objective motion testing criteria to enhance cueing,” said Van Maren.
Compared to the standard physics-based legacy algorithm, he said, “The difference is shocking. How did the industry get by with what we had for so long?”
In legacy days, he explained, cueing was based on the subjective inputs of pilots. “The pilot might say it needs more responsiveness in lateral maneuvers, and the motion engineer would make adjustments.” Then another pilot flies the simulator and provides different, even conflicting feedback on how the motion cueing should feel. “There was no way of measuring performance.”
FlightSafety is now using a tool “to optimize cues to match aircraft data much more closely.” Van Maren said the nucleus of ideas emerged from the international working group that compiled the ICAO document, Manual of Criteria for the Qualification of Flight Simulation Training Devices, Volume I – Aeroplane (Doc 9625, 3rd Edition).
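What might “objective motion testing” look like in practice? Below is a minimal Python sketch under stated assumptions: single-axis acceleration traces and simple error metrics of my own choosing, not the actual criteria defined in Doc 9625.

```python
import numpy as np

# A toy comparison of a simulator's motion cue against recorded aircraft
# data. The signals and metrics are illustrative assumptions only.
def cueing_fidelity(aircraft_accel, sim_accel):
    """Return simple objective metrics for one axis of motion response."""
    error = sim_accel - aircraft_accel
    return {
        "rms_error_g": float(np.sqrt(np.mean(error ** 2))),
        "peak_error_g": float(np.max(np.abs(error))),
        # Correlation shows how well the cue's shape tracks the aircraft.
        "correlation": float(np.corrcoef(aircraft_accel, sim_accel)[0, 1]),
    }

# Synthetic example: a lateral gust response sampled at 100 Hz for 2 seconds.
t = np.linspace(0.0, 2.0, 200)
aircraft = 0.20 * np.exp(-t) * np.sin(6.0 * t)         # recorded flight-test trace
simulator = 0.18 * np.exp(-1.1 * t) * np.sin(6.0 * t)  # washed-out simulator cue
print(cueing_fidelity(aircraft, simulator))
```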
For helicopter simulation, CAE is using a new electric vibration system in their 3000 Series devices. St-Hilaire told MS&T the three-axis whole-cockpit vibration system is the “only one of its kind.” Aside from the improved pilot cueing, the electric motion and vibration save simulator operators as much as US$20,000-30,000 per year compared with hydraulic motion systems.
What’s Next?
St-Hilaire advises looking to the mystical “Cloud.” In Project Innovate, funded by the Canadian government, CAE is looking at two major themes: making interactive simulation more user-friendly, “to bring the user experience to a new level,” and enabling synthetic environment data – operational data, maintenance data, training data – to “live in the cloud.”
“Synthetics will live in the cloud, and users will subscribe to the content,” he predicted.
He noted that some of the simulation technologies we take almost for granted today – electric motion, LED projectors, graphics processors – were adopted relatively recently.
I’m reminded of an entrepreneur I sat next to on a flight several years ago. He had a special pair of glasses hooked up to his computer (this was before the days of wireless), and he could view the equivalent of a 17-inch monitor via the magic glasses. Meantime, his laptop screen was dark – so no one could look over his shoulder to see what he was working on.
I don’t know if he’s the one who invented Google Glass, or if he’ll be the one to sue Google, claiming they stole his invention. Either way, his wild-eyed idea is coming to pass.
I’m not sure I’m ready, though, for direct brain-to-brain communication, or more accurately “technology-aided telepathy.” Researchers from Harvard Medical School in Boston, Massachusetts were able to decode a message sent from a participant in India who was wearing an internet-linked electroencephalography (EEG) headset. A computer translated the sender’s “thought message” into binary code, and it was relayed via email, first to France, then to the US. The receiver perceived the signal as “phosphenes” – flashes of light in the peripheral vision, delivered in a sequence that allowed the message’s information to be decoded. In other words, technology interacting electromagnetically with the brain.
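To make the encoding concrete, here is a toy Python sketch of the text-to-binary-and-back chain, with each 1-bit standing in for a perceived flash. (The actual experiment used a custom cipher rather than ASCII, so this is purely illustrative.)

```python
# Toy encode/relay/decode chain: text to bits, bits back to text, with a
# 1-bit standing in for "flash" and a 0-bit for "no flash". Illustrative
# only -- the Harvard experiment used its own cipher, not ASCII.
def encode(message):
    return "".join(f"{ord(c):08b}" for c in message)

def decode(bits):
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

bits = encode("hola")  # the sender's "thought message" as a flash sequence
print(bits)            # 01101000011011110110110001100001
print(decode(bits))    # the receiver reconstructs: hola
```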
I have enough difficulty figuring out what I’m thinking; sending those thoughts to someone else, unfiltered, could be dangerous (for at least one of us).