When a military aircraft ejects ammunition, capsules, missiles, or auxiliary fuel tanks during flight, the risk of damage to the aircraft itself is sizable. The risk stems both from turbulence and from the dynamic characteristics of the ejected payload: the resulting wobble can cause the object to collide with the aircraft, damaging or even destroying it. To prevent such damage, aerospace engineers run computational fluid dynamics simulations to model the dropped object's trajectory. The model's predictions must then be verified during flight testing by imaging the objects as they fall. This process is normally time-consuming because typical systems require images to be processed after a flight is complete.

An image-based payload trajectory capture solution under development. When a dummy bomb is dropped, two cameras capture it as it falls, allowing real-time 3D trajectory tracking. Courtesy of IPEV.

By using two synchronized, high-speed machine vision cameras, along with advanced data transport technology, however, image processing can be performed in real time during flight. The synchronization ensures that the stereoscopic right and left images are taken at the same instant. "Having that capability is what allows [the system] to generate real-time 3D views," said JC Ramirez, vice president of engineering and business development at ADL Embedded Solutions. The San Diego-based company helped develop a system for the Brazilian Air Force Flight Test and Research Institute (Instituto de Pesquisas e Ensaios em Voo, or IPEV) to verify payload trajectories more efficiently.

Verifying payload trajectories

In addition to training flight test personnel, IPEV develops flight testing solutions. Several years ago, the institute began working to improve payload trajectory verification, with the goal of moving the model-checking process from the ground to a platform designed for airborne use. However, the institute ran into some challenges.
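The stereo principle behind the real-time 3D views can be sketched in a few lines. The following is an illustrative example only, assuming an idealized, rectified camera pair; the focal length, baseline, and pixel coordinates are made-up values, not parameters of the flight system:

```python
import numpy as np

# Illustrative sketch (not IPEV's actual algorithm): with two synchronized,
# rectified cameras, a point imaged at the same instant in the left and
# right frames can be triangulated from its horizontal disparity.

def triangulate(x_left, x_right, y, f_px, baseline_m, cx, cy):
    """Recover 3D coordinates (meters, camera frame) of a matched pixel
    pair from a rectified stereo rig.

    f_px       : focal length in pixels (assumed value)
    baseline_m : distance between the two camera centers
    cx, cy     : principal point (image center)
    """
    disparity = x_left - x_right          # pixels; must be > 0
    Z = f_px * baseline_m / disparity     # depth from disparity
    X = (x_left - cx) * Z / f_px
    Y = (y - cy) * Z / f_px
    return np.array([X, Y, Z])

# Hypothetical values: f = 2000 px, baseline = 0.5 m, 1920 x 1080 images
p = triangulate(x_left=1100.0, x_right=1060.0, y=600.0,
                f_px=2000.0, baseline_m=0.5, cx=960.0, cy=540.0)
print(p)  # depth Z = 2000 * 0.5 / 40 = 25 m
```

Because depth comes directly from the left-right disparity, the two exposures must be simultaneous: any timing skew between the cameras shifts the apparent disparity of a fast-falling object and corrupts the depth estimate, which is why shutter synchronization is central to the design.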
The large amount of data generated taxes data transport and image processing capabilities. Capturing high-resolution images at high speed is necessary for determining an accurate trajectory, and doing so generates gigabytes of data per second. According to Ramirez, ADL used CoaXPress (CXP) image capture boards with dual CXP-6 ports capable of handling 1.25 GB/s per camera. The company's engineers combined these boards with a sixth-generation Intel quad-core processor that, together, enabled the necessary data compression and storage.

An externally mounted dummy bomb is attached to a plane on the ground. Reference marks placed on the bomb will be used to track it as it falls. Courtesy of IPEV.

Operating in an airborne environment also presents a challenge. A plane in flight, along with everything attached to it, experiences stress as it vibrates and shakes during takeoff, landing, and maneuvering. ADL is familiar with this issue, Ramirez said, because the company designs and builds small-form-factor embedded systems for military and rugged industrial applications. The overall solution devised by ADL and IPEV consisted of two image processor units built around a small, rugged embedded computing system that complies with the MIL-STD-810E military standard for shock and vibration in airborne operation. Other components of the units included Euresys dual CXP-6 video capture cards connected to Mikrotron EoSens 4CXP cameras.

400 fps

Development of the solution started with the cameras, for which IPEV specified certain capabilities. The institute required the fastest real-time 3D video capture available, Ramirez said. During operation of the system, each camera captured 10-bit grayscale images at 400 fps and at 1920- × 1080-pixel resolution. A Euresys Inter-PC C2C-Link adapter ran between the two image processor units to synchronize the camera shutters. However, this image capture speed raised an issue related to data transport and processing.
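A quick back-of-the-envelope check makes the bandwidth problem concrete. Using the figures quoted above (10-bit grayscale, 1920 × 1080 pixels, 400 fps, two cameras), the raw pixel payload alone, before any protocol overhead, works out as follows:

```python
# Raw data rates implied by the camera settings described in the article.
# These are pixel-payload figures only; actual link load would also
# include CoaXPress protocol overhead.

BITS_PER_PIXEL = 10     # 10-bit grayscale
WIDTH, HEIGHT = 1920, 1080
FPS = 400
CAMERAS = 2

bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
gbit_per_s_per_camera = bits_per_frame * FPS / 1e9
gbyte_per_s_per_camera = gbit_per_s_per_camera / 8
total_gbit_per_s = gbit_per_s_per_camera * CAMERAS

print(f"{gbit_per_s_per_camera:.2f} Gbit/s per camera")  # 8.29
print(f"{gbyte_per_s_per_camera:.2f} GB/s per camera")   # 1.04
print(f"{total_gbit_per_s:.2f} Gbit/s total")            # 16.59
```

At roughly 1.04 GB/s per camera, the stream fits within the 1.25 GB/s of a dual CXP-6 connection, while the combined rate of both cameras is more than an order of magnitude beyond what gigabit Ethernet could carry.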
The transmission rate would be greater than 11 Gbit/s, requiring a connection much faster than gigabit Ethernet. In addition to storing the captured images, the system would need to locate reference marks, which are placed at many spots on each object to be ejected, such as a dummy bomb, during flight testing. Locating the reference marks must be done for every image, a substantial computational burden when the system is running at 400 fps. Because an ejected object can yaw, pitch, or roll as its center of mass falls through x, y, and z spatial coordinates, points on the object have six degrees of freedom of movement, all of which must be tracked.

Using fast data transport within the system made it possible to handle the amount of data produced, Ramirez said. The processing involved only still images, not video, which eased some of the storage burden. The result was an instrument pod that substantially improved testing performance. Originally, test engineers could perform only two test points per day because the plane had to land after each run before the data could be processed. With processing and trajectory determination performed while the plane was in flight, however, Ramirez said the engineers could execute two test points during each flight.
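One common way to recover all six degrees of freedom from tracked reference marks is to fit, for each frame, the rigid transform (rotation plus translation) that best maps the marks' known positions in the object's body frame onto their measured 3D positions. The sketch below uses the Kabsch algorithm for this fit; it is an assumed, illustrative approach, not IPEV's published method, and the mark coordinates are invented:

```python
import numpy as np

# Illustrative 6-DOF pose recovery (assumed technique, not IPEV's method):
# given reference-mark positions in the object's body frame and their
# triangulated 3D positions in one frame, find the least-squares rigid
# transform measured ~= R @ body + t via the Kabsch algorithm.

def rigid_pose(body_pts, measured_pts):
    """Return rotation R (3x3) and translation t (3,) mapping body to measured."""
    cb = body_pts.mean(axis=0)
    cm = measured_pts.mean(axis=0)
    H = (body_pts - cb).T @ (measured_pts - cm)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cm - R @ cb
    return R, t

# Invented mark positions (body frame, meters), then "measured" positions
# after a known 90-degree yaw and a 10 m drop in Z:
body = np.array([[1.0, 0, 0], [0, 0.3, 0], [0, 0, 0.3], [0.5, 0.1, 0.1]])
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
measured = body @ Rz.T + np.array([0.0, 0.0, -10.0])

R, t = rigid_pose(body, measured)
print(np.allclose(R, Rz), np.allclose(t, [0.0, 0.0, -10.0]))  # True True
```

The rotation encodes yaw, pitch, and roll; the translation gives the x, y, z position of the center of mass, so one such fit per frame, 400 times per second per camera pair, yields the full 6-DOF trajectory the test engineers need.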