Distributed Real-time Autonomously Guided Operations eNgine
In combat, speed, maneuverability, and accuracy win
By Lt. Cmdr. Rollie J. Wicks - November 3, 2016
Growing up in rural South Dakota, I learned how to operate and fix machinery at a young age. I was always curious about the computers on agriculture equipment. As a high school junior, I participated in a Science, Technology, Engineering, and Mathematics (STEM) program hosted at South Dakota State University and the Earth Resources Observation and Science (EROS) Data Center. During this STEM program, I was exposed to Geographic Information Systems (GIS) for the first time and this experience sparked my interest in pursuing higher education in a data science field.

I went on to pursue a Bachelor of Science at Texas A&M University and then a Master of Science at Naval Postgraduate School in data science and space systems related studies.

As a millennial and a Navy officer, I continue to be driven by using computers in new and interesting ways to solve difficult problems in the U.S. Intelligence Community (IC) and Department of Defense (DoD). Over the past seven years, I have formed a team of government and industry experts and designed the Distributed Real-time Autonomously Guided Operations eNgine (DRAGON).

The DRAGON software is an agent-based, multi-purpose Artificial Intelligence (AI) engine. Though DRAGON is far from the first AI capability in the U.S. IC and DoD, it brings new perspectives to challenging AI research problems and is developing next-generation technologies.

DRAGON is being implemented for intelligence and defense test use cases spanning from content-analysis office work to Marine Corps squad-level littoral Reconnaissance, Surveillance, and Target Acquisition (RSTA) operations. The current suite of DRAGON capabilities is made up of hundreds of open, reusable, and modular components that enable advanced human-to-machine communications, data sharing, and data-to-decision services with semantic technologies. These semantic technologies provide the means for the AI to read or visualize, and then understand, text, graphics, and physical objects from both information products and the physical geographic environment. In total, twelve core semantic systems are being developed under the DRAGON project.

Developments in human-to-machine communications are allowing for more natural interaction. The use of a keyboard and a mouse is only a temporary means by which humans communicate with computers today.

For the Marine Corps version of DRAGON, gesture recognition technologies are being researched to enable military operators to communicate tasks through the hand and arm signals found in the Marine Corps field manual. When military operators use a keyboard, a mouse, a joystick, etc., their reaction time can be impeded by unnatural interfaces and processes. Millennials, the newest generation of military members, want intelligent and more natural human-to-machine communications that reduce their cognitive burden and the time spent communicating orders and information while increasing the speed and effectiveness of military operations.
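To make the pattern concrete, the sketch below shows how a recognized hand-and-arm signal could be routed directly to a machine task. The signal names, task names, and dispatch logic here are illustrative assumptions for this article, not DRAGON's actual gesture interface.

```python
# Hypothetical sketch: routing recognized hand-and-arm signals to machine
# tasks. The recognizer output, task names, and actions are assumptions
# for illustration, not DRAGON APIs.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Task:
    name: str
    action: Callable[[], None]

def halt() -> None:
    print("UAV: holding position")

def rally() -> None:
    print("UAV: returning to rally point")

def move_forward() -> None:
    print("UAV: advancing along planned route")

# Each recognized gesture resolves directly to a task -- no keyboard or mouse.
GESTURE_TASKS = {
    "HALT": Task("hold", halt),
    "RALLY": Task("rally", rally),
    "FORWARD": Task("advance", move_forward),
}

def dispatch(gesture: str) -> None:
    """Route a classified gesture to its task, ignoring unknown signals."""
    task = GESTURE_TASKS.get(gesture)
    if task is None:
        print(f"Unrecognized signal: {gesture!r}; awaiting operator")
        return
    task.action()

if __name__ == "__main__":
    for signal in ["FORWARD", "HALT", "WAVE"]:  # simulated recognizer output
        dispatch(signal)
```

The point of the pattern is that the recognizer's output resolves directly to an order, with no unnatural interface in the loop.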

In the Marine Corps, this type of communication is known as Manned Unmanned Teaming (MUMT) with “human on the loop” command and control. Emerging operational concepts such as MUMT will continue to shape how humans interact with computers. Reducing the time it takes humans and computers to accurately communicate orders will ultimately enable human operators to make more key decisions in less time.

A key goal of the DRAGON software is to offload routine “human in the loop” tasks to computers so humans can focus on the key and special tasks that require direct human involvement. The DRAGON team has created software development and semantic technologies that enable software engineers and user experience designers to capture how humans, such as infantry Marines, work. The software development team interviews an organization's experts and observes the processes of its human workforce. The interviews and observations collect information on how people work, the terminology they use, and the relationships between organizational goals, people, schedules, data, and systems.

Programming these processes into computers enables the computers to perform more of the work functions without human intervention. The addition of AI allows computers to deal with situations where explicit programming is insufficient; this process emulates how humans confront new situations and make decisions. Unlike humans, computers don’t need to eat, rest, or take breaks. As a result of this flexibility and persistence, DRAGON has demonstrated the ability to increase the speed of operations in IC and DoD test cases by one to three orders of magnitude.
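A minimal sketch of that offloading pattern follows, under the assumption of a scripted set of routine handlers and a confidence-scored classifier, both invented here for illustration.

```python
# Hypothetical sketch of the "offload routine work" pattern: scripted
# handlers cover known situations; anything the classifier is unsure about
# is escalated to a human operator. Handler names and the confidence
# threshold are illustrative assumptions, not DRAGON internals.

ROUTINE_HANDLERS = {
    "file_report": lambda item: f"report filed for {item}",
    "tag_imagery": lambda item: f"imagery tagged: {item}",
}

def classify(task_type: str) -> float:
    """Stand-in for a learned classifier's confidence that a task is routine."""
    return 0.95 if task_type in ROUTINE_HANDLERS else 0.20

def process(task_type: str, item: str, threshold: float = 0.8) -> str:
    if classify(task_type) >= threshold:
        return ROUTINE_HANDLERS[task_type](item)  # machine handles it
    return f"ESCALATED to human operator: {task_type} / {item}"  # novel case

if __name__ == "__main__":
    print(process("file_report", "patrol summary 14"))
    print(process("assess_threat", "unknown contact"))
```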

Another design feature of DRAGON is to assist the human operator with “human on the loop” command and control. Data sharing and the data-to-decision processes are rapidly evolving with AI. DRAGON shares data through an information bus where stovepipe or isolated systems can be connected and share information. This framework can be hosted in a common information technology room, on remotely piloted vehicles, or on autonomous systems. The information bus can connect dozens to hundreds of systems such as information analysis, navigation control, sensor control, targeting, and weapons.

The connected systems can both publish and subscribe to information without human intervention. Consequently, DRAGON is not dependent on humans to move information to the right place at the right time. Using a computer the size of a credit card, DRAGON can move information more than one thousand times faster than humans can. In reality, more human intervention results in slower information transfer.
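As an illustration of the publish/subscribe pattern described above, here is a minimal in-process information bus. A fielded bus would ride on distributed middleware rather than a single Python object; the topic names and payloads are invented for the sketch.

```python
# Minimal in-process publish/subscribe bus illustrating the pattern above.
# Topics and payloads are assumptions for illustration only.

from collections import defaultdict
from typing import Any, Callable

class InformationBus:
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        # Every subscriber receives the message with no human in the path.
        for handler in self._subscribers[topic]:
            handler(message)

if __name__ == "__main__":
    bus = InformationBus()
    # A targeting system and a navigation system both consume sensor tracks.
    bus.subscribe("sensor/track", lambda m: print(f"targeting received {m}"))
    bus.subscribe("sensor/track", lambda m: print(f"navigation received {m}"))
    bus.publish("sensor/track", {"id": 42, "lat": 57.6, "lon": -7.1})
```

The design choice that matters here is decoupling: publishers do not know or care which systems consume their data, so stovepipe systems can be connected without rewriting either side.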

Within DRAGON, information from multiple systems and sensors can be organized, interpreted, and then “fused” autonomously. While computers continue to operate at speeds thousands of times faster than humans, the algorithms analyzing the information are advancing to achieve nearly human expert level accuracy. DRAGON can analyze more than one billion records in less than a second and then package and accurately present the results to the operators as decision points.
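The fusion step can be pictured as correlating reports of the same contact from several sensors and collapsing them into one track for the operator. The field names and the correlation rule (a shared contact ID) below are assumptions for illustration, not DRAGON's actual fusion algorithm.

```python
# Illustrative fusion step: group sensor reports by contact and average them
# into one track that can be surfaced as a decision point. Field names and
# the grouping rule are assumptions for this sketch.

from collections import defaultdict
from statistics import mean

reports = [
    {"contact": "V-07", "sensor": "radar",  "lat": 57.61, "lon": -7.12},
    {"contact": "V-07", "sensor": "eo_cam", "lat": 57.60, "lon": -7.10},
    {"contact": "V-09", "sensor": "radar",  "lat": 57.80, "lon": -7.40},
]

def fuse(reports: list[dict]) -> list[dict]:
    by_contact = defaultdict(list)
    for r in reports:
        by_contact[r["contact"]].append(r)
    fused = []
    for contact, group in by_contact.items():
        fused.append({
            "contact": contact,
            "lat": mean(r["lat"] for r in group),
            "lon": mean(r["lon"] for r in group),
            "sensors": sorted({r["sensor"] for r in group}),
        })
    return fused

if __name__ == "__main__":
    for track in fuse(reports):
        print(track)  # one fused track per contact, ready for the operator
```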

Before the development of DRAGON and other similar AI technologies, humans were making decisions over long periods of time. In the context of military operations, delaying a decision could place friendly forces under greater and unnecessary risk or allow an adversary to gain an advantage. With DRAGON, humans can make informed decisions in a matter of seconds.

In DoD lab tests, the DRAGON software has demonstrated the ability to present a Marine with critical command and control decisions from multiple systems in a matter of seconds.

Specifically, DRAGON has demonstrated the ability to use less than five percent of the Marine’s time to make command and control decisions during a mock RSTA operation. In combat, speed, maneuverability, and accuracy win.

Like the blitzkrieg tactics introduced in World War II, DRAGON’s autonomous data sharing and semi-autonomous data-to-decision processes have the potential to disorganize enemy forces by enabling friendly commanders to increase the tempo of military operations beyond the reaction times of the enemy. Consequently, AI technologies will change the way commanders gather intelligence, make decisions, and command and control forces.

Computer vision components of the DRAGON framework were demonstrated during Unmanned Warrior 2016, which allowed the U.S. Navy, UK Royal Navy, and Allies to test unmanned systems and AI technologies in tactically representative environments with coalition forces.

The DRAGON computer vision components were used for autonomous Combat Identification of vessels and were hosted as software-defined payloads on two U.S. Navy RQ-21A Integrator Unmanned Aerial Vehicles (UAVs), which were deployed from shore facilities and operated along the northwest coastline of Scotland. The computer vision components accurately detected and reported more than twice as many vessels at sea as the human-only missions did.
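At its core, a payload like this runs a loop of a simple shape: pull a frame, run the detector, and publish any vessel detections without waiting on an operator. The detector stub and report format below are assumptions for illustration; a real payload would run a trained vision model against the UAV's video feed.

```python
# Minimal sketch of a software-defined combat-identification payload loop.
# detect_vessels is a stand-in for a trained detector; frames and report
# formats are invented for this illustration.

def detect_vessels(frame: dict) -> list[dict]:
    """Stand-in for a trained vessel detector; returns labeled detections."""
    return frame.get("vessels", [])

def payload_loop(frames: list[dict], publish) -> None:
    for frame in frames:
        for det in detect_vessels(frame):
            publish("combat_id/vessel", det)  # autonomous report, no operator

if __name__ == "__main__":
    frames = [
        {"t": 0, "vessels": [{"class": "fishing", "lat": 57.6, "lon": -7.1}]},
        {"t": 1, "vessels": []},
    ]
    payload_loop(frames, lambda topic, msg: print(topic, msg))
```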

Lt. Cmdr. Rollie Wicks is a member of the FY16 SECNAV Naval Innovation Advisory Council (NIAC), a dynamic forum for advisors to conduct research, advance problem-solving projects, and advise the Secretary of the Navy on innovation opportunities within the DON. The DON Office of Strategy and Innovation coordinates support and oversight of the NIAC. For more information, please contact DON_Innovation@navy.mil.

Join DON Innovation on https://www.facebook.com/NavalInnovation or @DON_Innovation or visit the SECNAV/DON Innovation website at http://www.secnav.navy.mil/innovation/Pages/Home.aspx

Official Photo, U.S. Navy Lt. Cmdr. Rollie J. Wicks