AISUM Autonomous Drone

The Artificial Intelligence for Small Unit Maneuver (AISUM) Challenge seeks to develop algorithms that improve the maneuverability and reconnaissance abilities of drones operating in confined spaces. AISUM has provided several example scenarios the drones may have to perform. The drones must demonstrate navigation, mapping, collision avoidance, object recognition, color detection, and friend/foe determination. Each drone comes with standard hardware and can carry 0.5 pounds of additional sensor/computational payload. Because the drones will be working in confined, interior spaces, GPS signals will not be available for navigation and mapping.

Artificial Intelligence for Small Unit Maneuver (AISUM) combines Naval Expeditionary Warfare and Special Operations Forces (SOF) tactical maneuver elements with Robotic Autonomous Systems (RAS) to create a low-risk human-machine maneuver element that gains, maintains, and extends access in complex, contested, and congested areas, providing a decisive advantage and precision application of effects.

Over the past 20 years, the SOF community has perfected the combination of its tactical maneuver elements and small-arms precision fire with overhead unmanned and remotely piloted airborne Intelligence, Surveillance, and Reconnaissance (ISR) and aerial-delivered strikes to precisely find, fix, and finish networked threats. However, these non-state actors mostly operated in a two-dimensional battlespace of rural terrain and urban sprawl, with little to no access to Electronic Warfare (EW) tools.

Though the Special Operations community has mastered the two-dimensional threat environment over the past two decades, in today's emerging threat environments adversaries contest all domains with advanced technologies and exploit the complex and congested three-dimensional battlespace. This reduces the effectiveness of SOF tactical maneuver elements, ISR, and precision fires: in dense urban clutter, continuous airborne ISR feeds and key communications and navigation bands in the electromagnetic spectrum cannot be maintained, leaving our tactical maneuver elements challenged.

For example, in a mission to infiltrate a subterranean tunnel system in a contested and congested threat environment and obtain information or an asset, AISUM could be used to gain intelligence and minimize human risk. Conceptually, satellite imagery would provide the initial data used to generate a three-dimensional terrain map of the area of operations, allowing mission planners, operators, and drones to plan, train, and rehearse within the simulated environment. These activities allow the team to develop and optimize specific tactics, techniques, and procedures.

General Overview

This Prize Challenge reaches out to the defense, academic, and industrial communities to advance research and development in the field of artificial intelligence for small unit maneuver. The objective of the Prize Challenge is to develop an algorithm that enhances the maneuver and reconnaissance capabilities of autonomous drones within defined scenarios, using the Government Furnished Property (GFP) drones, sensors, and onboard processing.

Example Scenario 1:

  1. Maneuver: Begin mission at entrance to building. Navigate through the building using open passages and any route to exit through open rear door.
  2. Recon: Identify and count the number of key predefined objects and simple properties (e.g., color) within the internal space (TBD: humans, object of interest); a crude color-naming sketch follows this list.
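
Color is the one "simple property" the scenario names. As a purely illustrative sketch (the GFP detection stack is not specified here), the helper below names the dominant color inside a detector-supplied bounding box using OpenCV's HSV conversion; the color table, box format, and the mean-hue heuristic are all assumptions.

```python
import cv2
import numpy as np

# Hypothetical color table: hue centers on OpenCV's 0-179 hue scale.
COLOR_HUES = {"red": 0, "yellow": 30, "green": 60, "cyan": 90, "blue": 120}

def dominant_color(frame_bgr: np.ndarray, box) -> str:
    """Name the dominant hue inside a detector-supplied (x, y, w, h) box.
    Mean hue is crude (red wraps around 0) but suffices for a sketch."""
    x, y, w, h = box
    patch = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    mean_hue = float(np.mean(patch[..., 0]))
    return min(COLOR_HUES, key=lambda c: min(abs(COLOR_HUES[c] - mean_hue),
                                             180 - abs(COLOR_HUES[c] - mean_hue)))
```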


Example Scenario 2:

  1. Maneuver: Begin mission at entrance to building. Navigate through the building, around limited obstacles in order to maximize coverage, and then exit through open rear door.
  2. Mapping: Collect data on the internal space to produce, post hoc, a 3D model of the internal environment; a minimal point-cloud export sketch follows this list.
  3. Recon: Identify the number and location of key predefined objects within the internal space (TBD: humans, object of interest).
  4. Recon: Identify numbers of civilians vs combatants (threat / no-threat) within internal space. (TBD: identifying characteristics)
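
The mapping task above asks only for a post-hoc 3D model and leaves the method open. As one hedged illustration, the sketch below merges per-frame 3D points, which a participant's own pipeline is assumed to have already expressed in a common world frame, into an ASCII PLY file that Blender (the preferred Phase II viewer) imports directly; the `frames` variable is a synthetic stand-in, not a GFP interface.

```python
import numpy as np

def write_ply(points: np.ndarray, path: str) -> None:
    """Write an Nx3 array of world-frame XYZ points as ASCII PLY,
    a format Blender imports directly."""
    header = (
        "ply\nformat ascii 1.0\n"
        f"element vertex {len(points)}\n"
        "property float x\nproperty float y\nproperty float z\n"
        "end_header\n"
    )
    with open(path, "w") as f:
        f.write(header)
        np.savetxt(f, points, fmt="%.4f")

# Hypothetical usage: `frames` stands in for per-frame point sets that a
# participant's pipeline has already transformed into one world frame.
frames = [np.random.rand(100, 3), np.random.rand(100, 3)]
write_ply(np.vstack(frames), "internal_space.ply")
```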


The AISUM Prize Challenge is broken into three phases: technical white paper submission, a virtual simulation environment, and a live scenario at a military training site.

Phase I will be conducted in two stages: initial submission of a White Paper Concept, followed by an invitation to provide a Virtual Presentation of the White Paper Concept. The White Paper Concept shall describe the details of the technical approach in response to the problem statement.
White Paper Concepts should be developed around the following conditions:

  • GFP will consist of the stretch X drone, onboard processing, a frontal camera, and optical avoidance sensors
  • Operation in a non-GPS environment and in an autonomous flight mode
  • Additional hardware payload proposals may be considered


The Government will select up to 25 White Paper Concepts for invitation to a Virtual Presentation of the proposed concept to a panel of judges. The Virtual Presentation shall provide a summary of the White Paper Concept and include detail on how participants intend to accomplish the Prize Challenge objective. Up to 10 winners will be selected to participate in Phase II.

Phase II participants shall develop specific algorithms that will be used to compete in virtual scenarios. Participants' algorithms will be evaluated within a Government-provided virtual map. Participants shall provide the following items to the Government:

  • Unreal level log file (the log file provides drone positions during the maneuver and identifies the objects the drone detected). It is located at: LinuxNoEditor/AISUM/Binaries/Linux/Gamelog.txt or WindowsNoEditor/AISUM/Binaries/Win64/Gamelog.txt
  • A participant-developed log file in accordance with the provided Government template, containing detections of the Objects of Interest, classifications/identifications, etc. The log file shall provide drone positions during the maneuver, identify collisions, and record the specific time the drone detects Objects of Interest and Humans (including identification of combatants or non-combatants). A placeholder logging sketch follows this list.
  • A 3D environment map of any format, preferably viewable in Blender (https://www.blender.org). (If it cannot be opened in Blender, include the name of the preferred viewing software.)
  • A 2D floorplan of the mapped space, which includes icons/text for the location of Objects of Interest and Humans.
  • A video screen capture of the run, showing either video from the drone or a god's-eye-view video of the run. The screen capture shall show, in real time, the text being written to the participant log file; for example, "printing drone position of x to log file" or "detected chair at x time". A .txt file containing the total number of uniquely identified objects of interest (e.g., "we identified 16 unique chairs") must also be provided when participant log files are sent to the Government.
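
Because the Government log template is distributed separately and not reproduced in this posting, the following is only a placeholder sketch of the timestamped, per-event logging the deliverable implies. The file name, field names, and event types are assumptions; the real schema must follow the provided template.

```python
import json
import time

def log_event(fh, event: str, **fields) -> None:
    """Append one timestamped JSON-lines record. The real schema must follow
    the Government-provided template, which this placeholder does not reproduce."""
    fh.write(json.dumps({"t": time.time(), "event": event, **fields}) + "\n")

# Hypothetical calls a participant pipeline might make during a run:
with open("participant_log.jsonl", "a") as fh:
    log_event(fh, "position", x=1.2, y=0.4, z=-1.5)
    log_event(fh, "collision", obstacle="wall")
    log_event(fh, "detection", cls="object_of_interest", label="chair", unique_id=16)
    log_event(fh, "detection", cls="human", threat=True)
```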


Participants shall provide a Phase II White Paper response to the following questions:

  • Summary of the participant's technical approach, including a description of the methods used, algorithms, novel aspects, etc.
  • Description of the aspects of the participant’s solution that went as expected and what performed well during the participant’s recorded run.
  • Description of the aspects of the participant’s solution that went worse than expected or did not perform as well.
  • Rationale for any inconsistency between the provided Unreal level log file and the participant-developed log file.
  • If selected for Phase III, how do you expect performance issues in Phase II to affect performance in the Phase III real-life scenario? Specifically, what are your plans to address and overcome these challenges?


The purpose of the Phase II White Paper response is to provide a narrative supporting participants' solutions and the log files delivered from the virtual runs. See additional Phase II White Paper submission instructions in the "How to Enter" section below.
GFP Includes:

  • Microsoft Research AirSim simulation environment for autonomous systems, built on Unreal Engine 4 (a minimal AirSim client sketch follows this list)
  • The map will be an open environment containing a multi-story, multi-room building
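
For orientation, below is a minimal sketch of the connect/fly/sense loop a Phase II solution builds on, written against the published AirSim Python client. The camera name "0" is AirSim's default front camera; the exact vehicle and camera configuration shipped with the GFP binaries may differ, so treat this as an assumption-laden starting point rather than the challenge interface.

```python
import airsim

# Connect to a running AirSim/Unreal instance (here, the GFP challenge binaries).
client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

client.takeoffAsync().join()

# One perception step: an uncompressed RGB frame from the front camera plus the
# current estimated pose, the two inputs a mapping/detection pipeline consumes.
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)
])
pos = client.getMultirotorState().kinematics_estimated.position
print(len(responses[0].image_data_uint8), pos.x_val, pos.y_val, pos.z_val)

client.landAsync().join()
client.armDisarm(False)
client.enableApiControl(False)
```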


Up to 10 winners will be selected to participate in Phase III.

Phase III participants will use their developed algorithms with the provided drone and compete in a real-life scenario within a hospital building at Muscatatuck Urban Training Center (MUTC). During the real-life scenario, participants shall launch the drone from a Government-defined entryway into a limited accessible area within the first floor of the building, which may differ from the available virtual representation. The drone shall autonomously enter the building and collect the following information:

  • Search and build a 2D floorplan of the accessible interior space (an occupancy-grid sketch follows this list).
  • Detect and count the number of humans within the accessible space.
  • Classify the humans within the accessible space as threats (holding brooms), or as civilians (empty hands).
  • Detect and count the number of Objects of Interest which may represent a threat to the team. The exact definition of the Object of Interest will be provided just prior to mission execution.
  • Place the detected humans (threat/no-threat) and Objects of Interest in the correct locations on the 2D floorplan.
  • Produce the 2D floorplan and object detection data in a usable format within 10 minutes of the UAS exiting the building.
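
The floorplan method is likewise left open. One common approach, sketched below purely as an illustration, is a 2D occupancy grid into which range returns and detections are projected; the grid size, resolution, and helper names are assumptions, not challenge requirements.

```python
import numpy as np

# Assumed dimensions: a 20 m x 20 m floor at 0.1 m resolution, with the grid
# origin at the Government-defined entryway. All values are illustrative.
RES = 0.1
grid = np.zeros((200, 200), dtype=np.uint8)   # 0 = unknown, 1 = free, 2 = occupied
annotations = []                              # (row, col, label) overlaid on export

def to_cell(x_m: float, y_m: float):
    """Convert world-frame metres (relative to the entryway) to grid indices."""
    return int(round(y_m / RES)), int(round(x_m / RES))

def mark(x_m: float, y_m: float, value: int) -> None:
    """Record free/occupied evidence from the navigation stack."""
    r, c = to_cell(x_m, y_m)
    if 0 <= r < grid.shape[0] and 0 <= c < grid.shape[1]:
        grid[r, c] = value

def annotate(x_m: float, y_m: float, label: str) -> None:
    """Pin a detection (human or Object of Interest) to the floorplan."""
    annotations.append((*to_cell(x_m, y_m), label))

mark(3.0, 4.5, 2)                        # e.g. a wall return from the depth sensor
annotate(6.2, 1.8, "human/threat")       # e.g. a classified combatant
annotate(2.4, 7.0, "object_of_interest")
np.save("floorplan_grid.npy", grid)      # raster to render with icons/text later
```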


Each participant shall provide the following items to the Government at the conclusion of each participant’s Phase III scenario run:

  • A participant-developed log file in accordance with the provided Government template, containing detections of the Objects of Interest, classifications/identifications, etc. The log file shall provide drone positions during the maneuver, identify collisions, and record the specific time the drone detects Objects of Interest and Humans (including identification of combatants or non-combatants). The log file shall contain clear information on the total number of humans detected, the total number of Objects of Interest, and the total number of Combatants and Civilians identified.
  • A 2D floorplan of the mapped space, which includes icons/text for the location of Objects of Interest and Humans.

The Phase III scenario will be conducted under the following conditions:

  • The UAS will be launched from the defined entryway.
  • The UAS is limited to an accessible area within the first floor of the building, which may be less than the available virtual representation.
  • The UAS will enter the building autonomously and assist operators by collecting the information listed above.
  • GFP will consist of the stretch X drone, onboard processing, a frontal camera, and optical avoidance sensors.
  • Operation in a non-GPS environment and in an autonomous flight mode.


All winners will be determined at the end of this phase.

Key Dates

The Government estimates the following timeline for the completion of the prize challenge.

Phase I

  • Technical White Paper submission open for 50 calendar days
    • Opens Dec. 10, 2020
    • Closes Jan. 29, 2021
  • NSWC Crane Project Talks: Dec. 17, 2020 (A virtual panel was available to answer questions related to the challenge.gov posting and the Prize Challenge from 1300-1500 ET)
    • NSWC Crane Project Talks Presentation can be found here
    • The recording of this event can be found here
  • Evaluation period: 10-14 calendar days
  • Virtual presentations: 14 calendar days
  • Complete internal documentation/obtain approval after final evaluation: 7-10 days


Phase II

  • Virtual Invitational Test Day on Tuesday, June 15th
  • Official Virtual Run for Record on Tuesday, July 13th
  • Phase II White Paper submission closes at 0800 ET, Monday, July 19th
  • Begin internal review of results on Monday, July 19th
  • Complete internal documentation/obtain approval after final evaluation: 7-12 days
  • Phase II set to end by July 31st with winners and advancing participants announced


Phase III

  • Live event at Muscatatuck Urban Training Center (MUTC) on Monday, October 18th – Friday, October 22nd
  • Complete internal documentation/obtain approval after final evaluation
  • Phase III set to end by November 5th with winners announced


Questions & Answers
Official responses to questions that have been received will be uploaded to beta.SAM.gov via Special Notice N0016420SNB14. View this beta.SAM posting here.


Twenty (20) participants were invited to present their Phase I White Paper concepts to the Artificial Intelligence for Small Unit Maneuver (AISUM) Prize Challenge judging panel. Invited participant names are listed below in alphabetical order:

  1. AMTech One
  2. ASEC, Inc.
  3. Blue Wave AI Labs, LLC
  4. Codex Laboratories LLC
  5. Draper
  6. EpiSys Science, Inc.
  7. Exyn Technologies
  8. Heron Systems Inc.
  9. Indiana University-Bloomington
  10. InfoDao LLC
  11. IUPUI
  12. Physical Optics Corporation/Mercury
  13. Raytheon BBN Technologies
  14. RIT/NUAIR, Inc.
  15. Shield AI, Inc.
  16. Skyline Nav AI Inc.
  17. Sonalysts, Inc.
  18. SSCI, Inc.
  19. Trueface
  20. TurbineOne, LLC
