Human-Machine Teaming May Be the Future of Drones in the Defense Industry
Originally published to DroneLife
Next-Level Human-Machine Teaming
By: Dawn Zoldi (Colonel, USAF Ret.)
Since the beginning of warfare, combatants have sought ways to extend their reach while removing themselves further and further from the battlefield. Distance can mean the difference between life and death for a soldier, sailor, airman, marine or guardian.
Robots extend human reach and possibilities.
Designing For Success
Autonomous drones play a significant role in today’s conflicts. From a distance, drones provide intelligence, surveillance and reconnaissance (ISR) to warriors, relaying crucial information about enemy locations, whether over the hill or in a remote cave. They augment their human operators, delivering critical goods to dangerous locations (e.g., blood to the front lines), detecting dangers (e.g., hidden landmines) and rapidly resupplying units. In the right scenarios, drones can conduct life-saving missions.
These systems operate using clear, repeatable rules based on unambiguous sensed data. In accordance with current DoD policy, a human remotely guides them through the operational task life cycle. This ensures that commanders and operators can exercise appropriate levels of human judgment in any given mission. It also requires that military organizations responsibly design, develop, deploy and use these advanced capabilities.
Responsible design includes ensuring that such systems remain safe, secure and effective so as to avoid unintended consequences. This demands incorporating appropriate safeguards, with humans in the mix, to mitigate the risk of serious failures or incidents.
As a result, experts predict that the DoD’s biggest investments in research and development (R&D) will flow into improvements in human-machine teaming (HMT) and machine intelligence. But effective HMT is about much more than just the automated platform; it’s about the entire system.
The Whole and Its Parts
On the civil side, drone regulations distinguish between the uncrewed aircraft (UA) itself and the uncrewed aircraft system (UAS). The UA, according to these regs, “means an aircraft operated without the possibility of direct human intervention from within or on the aircraft.” A small UAS includes not only the UA itself, but “its associated elements (including communication links and the components that control the small unmanned aircraft) that are required for the safe and efficient operation…”
So, too, HMT is about much more than just partnering a person with the right machine. It’s about the components in and on that machine and how humans interact with them. An entirely new R&D discipline has arisen around the concept of systemic human-robot interaction (HRI). Its focus areas, relevant to military autonomous systems, include, among other things, how humans and machines communicate.
In machines, operating systems (OS) provide the key to this communication and to effective HMT. An OS manages all other applications and programs in a machine’s computer and enables applications to interact with the computer’s hardware through a designated application programming interface (API). The OS manages hardware resources (e.g., CPU and memory), runs applications to enable user interactions and provides a user interface, usually a graphical user interface (GUI), through which the user interacts with the computer.
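To make this concrete in a simplified, non-drone-specific way, the short Python sketch below shows an application reaching hardware resources only through OS-provided calls; the log file name is arbitrary.

```python
import os

# An application never touches hardware directly; it asks the OS.
# cpu_count() queries the kernel's view of available processors.
print(f"CPU cores the OS exposes: {os.cpu_count()}")

# Even file I/O is OS-mediated: open() and write() wrap the kernel's
# system calls, which in turn drive the storage hardware.
with open("mission.log", "w") as log:
    log.write("waypoint reached\n")

# Process IDs are another resource the OS manages on the app's behalf.
print(f"This application runs as OS process {os.getpid()}")
```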
For drones and robots, several other critical systems come into play, such as a core autopilot and an on-board companion computer that integrates a flight controller, CPU, video encoder, GPU, Neural Processing Unit (NPU) and electronic speed controllers (ESCs).
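For a concrete sense of how these pieces divide the work, here is a minimal sketch, assuming a Python companion-computer process talking to a MAVLink-speaking autopilot (PX4, which appears later in this article, uses MAVLink); the UDP endpoint is an example, not a fixed value.

```python
from pymavlink import mavutil  # pip install pymavlink

# A companion computer typically reaches the autopilot over a serial
# or UDP MAVLink link; this local endpoint is an example only.
autopilot = mavutil.mavlink_connection("udpin:0.0.0.0:14550")

# Wait for a heartbeat to learn the autopilot's system/component IDs.
autopilot.wait_heartbeat()
print(f"Autopilot online: system {autopilot.target_system}")

# The flight controller runs the hard real-time control loop; the
# companion computer consumes telemetry like this attitude message
# for higher-level work (vision, planning, neural inference).
msg = autopilot.recv_match(type="ATTITUDE", blocking=True)
print(msg.roll, msg.pitch, msg.yaw)
```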
To tackle current design challenges, the Defense Innovation Unit’s (DIU) Blue UAS Framework sources the type of safe, secure and reliable components that DoD requires for its drones. DIU on-ramps commercial off-the-shelf (COTS) systems through its Cleared List and, as part of its Foundry, engages with companies to modify their tech to meet DoD standards. As part of these efforts, DIU matches plug-and-play small drone components with systems on its Cleared List. Having a “Blue” component integrated into one’s system increases the chances of successful employment within DoD channels.
Two autonomous drone companies, XTEND, creator of the immersive XTEND Operating System (XOS), and ModalAI, a California-based startup that produces the VOXL® family of autopilots, have joined forces to take HMT to the next level.
A Winning Combination
Founded in 2018 by Aviv Shapira (CEO) and Rubi Liani (CTO), XTEND originally set out to develop a mixed-reality game with drones. Liani had previously founded Israel’s drone racing league, and Shapira brought significant augmented/virtual reality (AR/VR) experience to the table.
The co-founders discovered that the drone technology behind their gaming use case could also be applied to the defense industry. In short order, they secured a contract with the Israel Defense Forces to provide revolutionary human-guided autonomous machine systems that enable any operator to perform extremely accurate maneuvers and actions, in any environment, with minimal training.
In just five years, XTEND has grown to over 100 employees in offices in the U.S. and Singapore, in addition to its headquarters and an R&D center in Israel. It produces a family of unique human-guided drone systems, manufactured in the thousands: the Griffon counter-UAS drone (its first offering); the Wolverine, a multi-mission workhorse that can be outfitted with a claw or other tools; the related Wolverine ISR lightweight outdoor drone; and the XTENDER micro tactical ISR drone, which can operate in tight, GPS-denied spaces. The company’s keystone product, the XOS, powers all of its drones.
The XOS provides a unified core across all of these platforms, offering advanced capabilities that include:
- AR GFX SDK – Enables adding real-time augmented reality 3D graphics from various external data sources via the SKYLORD™ API/SDK
- Robust, distributed OS architecture – Allows integration with multiple aerial and non-aerial platforms with minimal configuration.
- Dynamic payload API – Enables adding physical payloads with variable configurations and data connectivity, transforming SKYLORD™ platforms into highly capable tool sets for variable situations (see the illustrative sketch after this list).
- ML-based dynamic sensor fusion – Proprietary machine-learning-based sensor fusion that allows SKYLORD™ drones to operate with great spatial accuracy in complex, dynamic environments under variable lighting conditions.
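To make the dynamic payload idea concrete, here is a minimal sketch of what a payload registry could look like. This is a hypothetical illustration: XTEND’s actual XOS payload API is not public, and every name below (PayloadDescriptor, PayloadRegistry, register, dispatch) is an assumption, not XOS code.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch only -- not XTEND's real API.

@dataclass
class PayloadDescriptor:
    """What a payload declares about itself when attached."""
    name: str              # e.g. "claw" or "eo-ir-camera"
    power_draw_watts: float
    data_channel: str      # transport the payload streams data over

class PayloadRegistry:
    """Tracks payloads an operator can attach at mission time."""

    def __init__(self) -> None:
        self._handlers: dict[str, Callable[[bytes], None]] = {}

    def register(self, desc: PayloadDescriptor,
                 on_data: Callable[[bytes], None]) -> None:
        # A real system would also validate the power budget,
        # mounting points and link bandwidth before accepting.
        self._handlers[desc.name] = on_data

    def dispatch(self, name: str, frame: bytes) -> None:
        # Route an incoming data frame to the payload's handler.
        self._handlers[name](frame)

# Usage: attach a claw payload and feed it a status frame.
registry = PayloadRegistry()
claw = PayloadDescriptor("claw", power_draw_watts=5.0, data_channel="uart1")
registry.register(claw, lambda frame: print("claw:", frame))
registry.dispatch("claw", b"gripper_closed")
```

The design point the bullet implies is that payloads declare their own configuration and data connectivity, so the platform can accept new hardware without changes to the core OS.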
XOS works alongside XTEND’s autonomous drones, handling the entire pipeline from the human operator’s mission decision through to final action. One operator can use all of these drones together and switch between them. For example, soldiers have used the Wolverine to carry an XTENDER drone to the door.
According to Shapira, “XTEND tries to enable humans by providing robots that can act autonomously in life threatening situations.” He continued, “These tools enrich users to complete more complicated tasks by combining human discretion and machine autonomy. In a military use case, you can send a drone to complete a task instead of a human. This saves lives.”
XTEND sought a computing platform powerful and secure enough to be worthy of their human-guided drone ecosystem of products. They discovered ModalAI, and the company’s VOXL 2 autopilot, through their connections with DIU.
ModalAI’s VOXL 2 offers, according to the company, more artificial intelligence (AI) computing capability than any other similar product globally. DIU partially funded the development of VOXL 2 to advance domestic autopilot capabilities as part of its Blue UAS Framework 2.0.
Weighing only 16 grams and powered by the Qualcomm® Flight RB5 5G platform, VOXL 2 integrates a PX4 real-time flight controller, a state-of-the-art CPU, video encoder, GPU and Neural Processing Unit (NPU), with ModalAI’s open VOXL SDK.
The VOXL SDK comes complete with the autonomous behaviors required to safely and reliably fly beyond visual line of sight (BVLOS) and avoid obstacles. The included mapping and planning software plots a route for a given desired trajectory, mapping and navigating around obstacles to achieve the best path. A collision-prevention setting establishes the minimum allowed approach distance. The SDK also provides robust support for 4G/5G-based BVLOS flight.
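As one hedged, concrete example: on a PX4 flight controller (which VOXL 2 integrates), the minimum allowed approach distance for collision prevention is the CP_DIST parameter, and a companion computer could set it over MAVLink as sketched below. The UDP endpoint and the two-meter value are assumptions, and whether the VOXL SDK wires this exact path is not confirmed here.

```python
from pymavlink import mavutil

# Connect to the PX4 autopilot over a local MAVLink UDP link
# (endpoint is an example; real systems vary).
master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()

# CP_DIST is PX4's collision-prevention parameter: the closest
# distance, in meters, the vehicle may approach an obstacle.
master.mav.param_set_send(
    master.target_system,
    master.target_component,
    b"CP_DIST",
    2.0,  # example: keep at least 2 m from obstacles
    mavutil.mavlink.MAV_PARAM_TYPE_REAL32,
)

# PX4 echoes the accepted value back in a PARAM_VALUE message.
ack = master.recv_match(type="PARAM_VALUE", blocking=True, timeout=5)
print(ack)
```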
Liani recalled, “At XTEND, we are focused on software. We didn’t know how to minimize drone hardware into a 16g autopilot. With Raspberry Pi we need to create the carrier board; with Jetson we need to build the drivers. VOXL gave us everything we needed, including the cameras, because ModalAI designed it to simply be popped into a drone. It was a perfect fit.”
Other considerations XTEND weighed in entering this partnership: VOXL 2 is NDAA-compliant; it has the right size, weight and power (SWaP); and its pre-integrated algorithms and GPS-denied capabilities, designed specifically for drones, could easily integrate into XTEND’s products. ModalAI’s support and communications efforts factored in as well; the company provides an active forum for its users.
Chad Sweet, CEO of ModalAI, explained, “Hardware is hard. At ModalAI, we’ve already done the difficult work for you. We’ve thoughtfully engineered autonomous capabilities into a small package which is ready to integrate with your software. This can cut your development time in half.”
Elevating Performance
Now that XTEND’s XOS runs on VOXL 2, it can handle an unprecedented amount of onboard processing. This enables localization, GPS-denied navigation and swarming with XTEND’s drones. Conversely, because XOS is compatible with VOXL, other OEMs already in the ModalAI ecosystem can add XOS to their fleets.
This combination of best-in-class components not only reduces training time for operators, it also reduces the barrier to entry for any organization that wants to employ autonomous drones in dangerous environments…and take human-machine teaming to greater heights.
Liani said, “The interaction between the human and the machine is very important. With XTEND’s XOS and ModalAI’s VOXL 2, you can put the human in the loop and let the machine do the dangerous work. In terms of productivity, the resultant equation is 1+1=3.”