Using Drones to Increase Situational Awareness
Lightweight drones that can be easily carried hold great promise for increasing a soldier's situational awareness beyond line of sight, especially in complex and dangerous areas. However, manual drone operation can distract soldiers from their core mission or from prudent monitoring of their immediate surroundings. To enable drones to fly autonomously in complex and GPS-denied environments, Draper has developed a platform-agnostic autonomy architecture and software package and demonstrated it in operationally relevant environments. More recently, through the CSIRP project, Draper has been working to enable teaming of autonomous robots and drones that carry multiple sensors and communicate with human operators and central command.
Draper's Scalable SW
Draper’s SW is designed to operate on small handheld drones but can be adapted to larger drones or other autonomous vehicles. The scalable and adaptable nature of the software enables the integration of advanced autonomy onto existing, otherwise manually controlled vehicles as well as future ones. Upgrading platforms with autonomy in this way makes a wide variety of vehicles significantly more capable and able to perform new missions independently.
Draper’s SW puts the autonomy function onboard the drone, even very small ones, reducing the cognitive burden on soldiers. Once given a target location, Draper’s SW enables the drone to determine independently how best to get there, planning obstacle-free routes in real time without any prior knowledge of the environment. While the system enables closed-loop, fully autonomous flight, a user interface provides the soldier with up-to-date information about the environment, the types of objects of interest observed during the drone’s flight, and the drone’s operational status.
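The real-time route planning described above can be illustrated with a simple grid-based A* search. This is a toy sketch, not Draper's actual planner, and the grid, start, and goal below are hypothetical; onboard systems plan over 3D maps built in flight, but the core idea of expanding the cheapest frontier node toward the goal is the same.

```python
import heapq

def plan_route(grid, start, goal):
    """A* search over a 2D occupancy grid (0 = free, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal avoiding
    obstacles, or None if no obstacle-free route exists.
    Illustrative sketch only, not Draper's planner.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Admissible heuristic: Manhattan distance to the goal.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    # Heap entries: (estimated total cost, cost so far, cell, path).
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                if nxt not in seen:
                    heapq.heappush(
                        frontier,
                        (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]),
                    )
    return None  # goal unreachable
```

Given a target cell, the planner finds a shortest obstacle-free route with no prior map beyond the occupancy grid itself; an onboard system would rebuild and replan as the sensed map changes.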
Target Recognition Capabilities
Draper has demonstrated agile obstacle avoidance, GPS-denied localization, and 3D mapping on a tiny cellphone-grade processor, using only an inertial measurement unit (IMU) and a single passive camera (stealthier than an active laser sensor). The Draper team also built in target recognition capabilities, which can be co-trained with mission-relevant synthetic data generated by realistic simulators.
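One way to see why the camera is needed alongside the IMU is to dead-reckon with the IMU alone: any constant accelerometer bias gets double-integrated into a position error that grows quadratically with time, which is why visual-inertial systems fuse camera observations to bound the drift. A minimal 1-D sketch (the bias and sample-rate values are hypothetical, and this is not Draper's navigation code):

```python
def dead_reckon(accels, dt):
    """Double-integrate accelerometer samples into a 1-D position.

    Illustrates IMU-only drift: a constant bias b produces a position
    error of roughly 0.5 * b * t**2 after time t.
    """
    v = p = 0.0
    for a in accels:
        v += a * dt  # integrate acceleration into velocity
        p += v * dt  # integrate velocity into position
    return p

# A stationary drone (true acceleration zero) seen through a biased IMU:
bias = 0.02   # m/s^2, hypothetical constant accelerometer bias
dt = 0.01     # 100 Hz IMU sample period
n = 1000      # 10 seconds of samples
drift = dead_reckon([bias] * n, dt)  # ~1 m of spurious position error
```

Even this tiny bias accumulates to roughly a meter of position error in ten seconds, so a drifting inertial estimate must be periodically corrected against an external reference such as camera imagery.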
Integrating Draper's SW with AI and Robotic Modeling and Simulation Tools
Draper is adding new features to its autonomy system, including onboard artificial intelligence (AI) solutions that infer depth for each pixel in an image, enabling operations in more complex 3D environments. Draper is also developing advanced robotic modeling and simulation tools for low-cost test and evaluation of new technologies in a variety of environments, to assess the impact and tradeoffs of incorporating new technology (e.g., new sensors or algorithms) into an autonomous system. Longer term, Draper aims to enable autonomous adaptive exploration of large environments, possibly by groups of cooperating drones.
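A toy example of the kind of tradeoff such simulation tools could quantify (the metric and all numbers here are hypothetical, not Draper's tooling): given a sensor's obstacle detection range, the vehicle's braking deceleration, and its reaction latency, the fastest speed at which a newly detected obstacle can still be avoided follows from requiring the detection range to cover the reaction distance plus the braking distance.

```python
import math

def max_safe_speed(detect_range, decel, latency):
    """Fastest speed v at which an obstacle first seen at detect_range
    can be avoided, requiring:

        v * latency + v**2 / (2 * decel) <= detect_range

    Solving the quadratic v**2 + 2*a*t*v - 2*a*R = 0 for v >= 0.
    Hypothetical trade-study metric for comparing candidate sensors.
    """
    a, t, R = decel, latency, detect_range
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * R)

# Compare two candidate depth sensors for the same airframe:
short_range = max_safe_speed(detect_range=5.0, decel=4.0, latency=0.1)
long_range = max_safe_speed(detect_range=15.0, decel=4.0, latency=0.1)
```

Running such a model across many simulated environments would let engineers attach a mission-level number (here, achievable safe speed) to a proposed sensor or algorithm change before flying any hardware.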