Unmanned Mission Command, Pt. 1

By Tim McGeehan

The following two-part series discusses the command and control of future autonomous systems. Part 1 describes how we have arrived at the current tendency towards detailed control. Part 2 proposes how to refocus on mission command.

Introduction

In recent years, the U.S. Navy’s unmanned vehicles have achieved a number of game-changing “firsts.” The X-47B Unmanned Combat Air System (UCAS) executed the first carrier launch and recovery in 2013, the first combined manned/unmanned carrier operations in 2014, and the first aerial refueling in 2015.1 In 2014, the Office of Naval Research demonstrated the first swarm capability for Unmanned Surface Vehicles (USV).2 In 2015, USS NORTH DAKOTA (SSN 784) performed the first launch and recovery of an Unmanned Underwater Vehicle (UUV) from a submarine during an operational mission.3 While these successes may represent the vanguard of a revolution in military technology, the larger revolution in military affairs will only be possible with the optimization of the command and control concepts associated with these systems. Regardless of domain (air, surface, or undersea), Navy leaders must fully embrace mission command to realize the full power of these capabilities.

Unmanned History

“Unmanned” systems are not necessarily new. The U.S. Navy’s long history includes the employment of a variety of such platforms. For example, in 1919, Coast Battleship #4 (formerly USS IOWA (BB-1)) became the first radio-controlled target ship to be used in a fleet exercise.4 During World War II, Navy Lieutenant Joe Kennedy (John F. Kennedy’s older brother) was killed participating in an early unmanned aircraft program called PROJECT ANVIL; he was to parachute from his bomb-laden aircraft before it was guided by radio control into a German target.5 In 1946, F6F Hellcat fighters were modified for remote operation and employed to collect data during the OPERATION CROSSROADS atomic bomb tests at Bikini.6 These Hellcat “drones” could be controlled by another aircraft acting as the “queen,” flying up to 30 miles away. They were even launched from the deck of an aircraft carrier, almost 70 years before the X-47B performed that feat.

A Hellcat drone takes flight. Original caption: PILOTLESS HELLCAT (above), catapulted from USS Shangri-La, is clear of the carrier’s bow and climbs rapidly. Drones like this one will fly through the atomic cloud. (All Hands Magazine June 1946 issue)

However, the Navy’s achievements over the last few years were groundbreaking because the platforms were autonomous (i.e., controlled by machine, not remotely operated by a person). The current discussion of autonomy frequently revolves around ethics and accountability. Is it ethical to imbue these machines with the authority to use lethal force? If the machine is not under direct human control but rather deciding for itself, who is responsible for its decisions and actions when it faces dilemmas? Much has been written about these topics, but there is a related and less discussed question: what shift in mindset will be required for Navy leaders to employ these systems to their full potential?

Command, Control, and Unmanned Systems

According to Naval Doctrine Publication 6 – Naval Command and Control (NDP 6), “a commander commands by deciding what must be done and exercising leadership to inspire subordinates toward a common goal; he controls by monitoring and influencing the action required to accomplish what must be done.”7 These enduring concepts have new implications in the realm of unmanned systems. For example, while a commander can assign tasks to any subordinate (human or machine), “inspiring subordinates” has varying levels of applicability depending on whether his units consist of “remotely piloted” aircraft (where the subordinates are actual human pilots) or autonomous systems (where the “pilot” is an algorithm controlling a machine). “Command” also includes establishing intent; distributing guidance on the allocation of roles, responsibilities, and resources; and defining constraints on actions.8 On one hand, this could be straightforward with autonomous systems, as this guidance could be translated into a series of rules and parameters that define the mission and rules of engagement. One would simply upload the mission and deploy the vehicle, which would go out and execute, possibly reporting in for updates but mostly operating on its own and solving problems along the way. On the other hand, in the absence of instructions that cover every possibility, an autonomous system is only as good as the internal algorithms that control it. Even as machine learning drastically improves and advanced algorithms are developed from extensive “training data,” an autonomous system may not respond to novel and ambiguous situations with the same judgment as a human. Indeed, one can imagine a catastrophic military counterpart to the 2010 stock market “flash crash,” in which high-frequency trading algorithms designed to act on pre-arranged criteria did not understand context, misread the situation, and briefly erased $1 trillion in market value.9
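To make the idea concrete, the guidance in a mission-type order could in principle be encoded as machine-readable parameters before launch. The sketch below is purely illustrative; the class and field names are invented for this example and do not correspond to any fielded Navy system.

```python
# Illustrative sketch only: commander's guidance expressed as parameters an
# autonomous vehicle could act on. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class MissionOrder:
    intent: str                  # commander's intent, stated in plain language
    objectives: list[str]        # what must be accomplished (the "what")
    constraints: list[str]       # restraints and constraints bounding the "how"
    operating_area: tuple[float, float, float, float]  # lat/lon bounding box
    weapons_release_authorized: bool = False            # example ROE parameter
    report_interval_minutes: int = 60                   # check in when comms allow

# Upload-and-deploy: the order is written once, then the vehicle executes on it.
order = MissionOrder(
    intent="Maintain surveillance of the assigned strait and report contacts of interest.",
    objectives=["Survey assigned area", "Classify surface contacts", "Report hostile indicators"],
    constraints=["Remain outside territorial waters", "Observe EMCON restrictions near shipping lanes"],
    operating_area=(25.0, 56.0, 26.5, 57.5),
)
print(order.intent)
```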

“Control” includes the conduits and feedback from subordinates to their commander that allow them to determine if events are on track or to adjust instructions as necessary. This is reasonably straightforward for a remotely piloted aircraft with a constant data link between platform and operator, such as the ScanEagle or MQ-8 Fire Scout unmanned aerial systems. However, a fully autonomous system may not be in positive communication. Even if it is ostensibly intended to remain in communication, feedback to the commander could be limited or non-existent due to emissions control (EMCON) posture or a contested electromagnetic (EM) spectrum. 
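The practical difference shows up in what the platform does when that feedback path disappears. The following sketch, again hypothetical rather than drawn from any real system, shows an autonomous vehicle falling back on its pre-loaded objectives rather than idling when the link to its commander is unavailable.

```python
# Illustrative sketch only: selecting the next action when the data link may be
# degraded by EMCON posture or a contested EM spectrum. Names are hypothetical.
from typing import Optional

def next_action(link_up: bool, new_tasking: Optional[str], preloaded_objectives: list[str]) -> str:
    if link_up and new_tasking:
        # Detailed control is possible: accept the operator's re-tasking.
        return new_tasking
    if preloaded_objectives:
        # No feedback path: keep executing on the commander's pre-loaded intent.
        return preloaded_objectives[0]
    return "Loiter and attempt to re-establish communications"

# Example: the link is lost, so the vehicle presses on with its assigned mission.
print(next_action(False, None, ["Survey assigned area", "Classify surface contacts"]))
```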

Mission Command and Unmanned Systems

In recent years, there has been a renewed focus across the Joint Force on the concept of “mission command.” Mission command is defined as “the conduct of military operations through decentralized execution based upon mission-type orders,” and it lends itself well to the employment of autonomous systems.10 Joint doctrine states:

“Mission command is built on subordinate leaders at all echelons who exercise disciplined initiative and act aggressively and independently to accomplish the mission. Mission-type orders focus on the purpose of the operation rather than details of how to perform assigned tasks. Commanders delegate decisions to subordinates wherever possible, which minimizes detailed control and empowers subordinates’ initiative to make decisions based on the commander’s guidance rather than constant communications.”11

Mission command for an autonomous system would require commanders to clearly confer their intent, objectives, constraints, and restraints in succinct instructions, and then rely on the “initiative” of said system. While this decentralized arrangement is more flexible and better suited to deal with ambiguity, it opens the door to unexpected or emergent behavior in the autonomous system. (Then again, emergent behavior is not confined to algorithms, as humans may perform in unexpected ways too.) 

In addition to passing feedback and information up the chain of command to build a shared understanding of the situation, mission command also emphasizes horizontal information flow across echelons, between subordinates. Because it relies on subordinates knowing the intent and mission requirements, mission command is much less vulnerable to disruption than detailed methods of command and control.

However, some commanders today do not fully embrace mission command with human subordinates, much less feel comfortable extending that trust to autonomous systems. They issue explicit instructions to subordinates in a highly centralized arrangement, where volumes of information flow up and detailed orders flow down the chain of command. This may be acceptable in deliberate situations where time is not a major concern, where procedural compliance is emphasized, or where there can be no ambiguity or margin for error. Examples of unmanned systems suited to this arrangement include a bomb disposal robot or a remotely piloted aircraft that requires constant intervention and re-tasking, perhaps to rapidly reposition the platform for a better look at an emerging situation or to better discriminate between friend and foe. However, detailed control does not “function well when the vertical flow of information is disrupted.”12 Furthermore, when it comes to autonomous systems, such detailed control undermines much of the purpose of having an autonomous system in the first place.

A fundamental task of the commander is to recognize which situations call for detailed control or mission command and act appropriately. Unfortunately, the experience gained by many commanders over the last decade has introduced a bias towards detailed control, which will hamstring the potential capabilities of autonomous systems if this tendency is not overcome.

Current Practice

The American military has enjoyed major advantages in recent conflicts due to global connectivity and continuous communications. However, this has redefined expectations, and higher echelons increasingly rely on detailed control (for manned forces, let alone unmanned ones). Senior commanders (or their staffs) may levy demands to feed a seemingly insatiable thirst for information. This has led to friction between echelons of command, and in some cases it comes at the expense of the decision-making capability of the unit in the field. Subordinate staff watch officers may spend more time answering requests for information and “feeding the beast” of higher headquarters than they spend overseeing their own operations.

It is understandable why this situation exists today. The senior commander (with whom responsibility ultimately resides) expects to be kept well informed. To be fair, in some cases a senior commander located at a fusion center far from the front may have access to multiple streams of information, providing a better overall view of what is going on than the commander actually on the ground has. In other cases, today’s 24-hour news cycle and zero tolerance for mistakes have led senior commanders to succumb to the temptation to second-guess their subordinates and micromanage their units in the field. A compounding factor in today’s interconnected world may be “Fear of Missing Out” (FoMO), described by psychologists as apprehension or anxiety stemming from the availability of volumes of information about what others are doing (think social media). It leads to a strong, almost compulsive desire to stay continually connected.13

Whatever the reason, this is not a new phenomenon. Understanding previous episodes when leadership has “tightened the reins” and the subsequent impacts is key to developing a path forward to fully leverage the potential of autonomous systems.

Veering Off Course

The recent drift away from mission command toward detailed control echoes the impact of earlier advances in command and control technology. For example, speaking of his service with the U.S. Asiatic Squadron and the introduction of the telegraph before the turn of the 20th century, Rear Admiral Caspar Goodrich lamented, “Before the submarine cable was laid, one was really somebody out there, but afterwards one simply became a damned errand boy at the end of a telegraph wire.”14

Later, wireless telegraphy proved a mixed blessing for commanders at sea. Interestingly, the contrasting points of view clearly described how it would enable micromanagement; the difference of opinion was whether this was good or bad. This was illustrated by two 1908 newspaper articles regarding the introduction of wireless in the Royal Navy. One article extolled its virtues, describing how the First Sea Lord in London could direct all fleet activities “as if they were maneuvering beneath his office windows.”15 The other described how those same naval officers feared “armchair control… by means of wireless.”16 In century-old text that could have been drawn from today’s press, the latter article quoted a Royal Navy officer:

“The paramount necessity in the next naval war will be rapidity of thought and of execution…The innovation is causing more than a little misgiving among naval officers afloat. So far as it will facilitate the interchange of information and the sending of important news, the erection of the [wireless] station is welcomed, but there is a strong fear that advantage will be taken of it to interfere with the independent action of fleet commanders in the event of war.”

Military historian Martin van Creveld related a more recent lesson of technology-enabled micromanagement from the U.S. Army. This time the technology in question was the helicopter, whose widespread use by multiple echelons of command during the Vietnam War drove the shift away from mission command to detailed control:

“A hapless company commander engaged in a firefight on the ground was subjected to direct observation by the battalion commander circling above, who was in turn supervised by the brigade commander circling a thousand or so feet higher up, who in his turn was monitored by the division commander in the next highest chopper, who might even be so unlucky as to have his own performance watched by the Field Force (corps) commander. With each of these commanders asking the men on the ground to tune in his frequency and explain the situation, a heavy demand for information was generated that could and did interfere with the troops’ ability to operate effectively.”17

However, not all historical shifts toward detailed control are due to technology; some are cultural. For example, leadership had encroached so much on the authority of commanders in the days leading up to World War II that Admiral Ernest J. King issued a message to the fleet with the subject line “Exercise of Command – Excess of Detail in Orders and Instructions” to voice his concern. He wrote that the:

“almost standard practice – of flag officers and other group commanders to issue orders and instructions in which their subordinates are told how as well as what to do to such an extent and in such detail that the Custom of the service has virtually become the antithesis of that essential element of command – initiative of the subordinate.”18

Admiral King attributed this trend to several cultural factors, including the anxiety of seniors that any mistake by a subordinate would be attributed to the senior and thereby jeopardize promotion, the activities of staffs infringing on lower echelon functions, and the habit and expectation of detailed instructions among junior and senior alike. He went on to say that they were preparing for war, when there would be neither time nor opportunity for this method of control, and that it was conditioning subordinate commanders to rely on explicit guidance and depriving them of the chance to learn how to exercise initiative. Now, over 70 years later, as the Navy moves forward with autonomous systems, this technology-enabled and culture-driven drift toward detailed control is again becoming an Achilles heel.

Read Part 2 here.

Tim McGeehan is a U.S. Navy Officer currently serving in Washington. 

The ideas presented are those of the author alone and do not reflect the views of the Department of the Navy or Department of Defense.

References

[1] Northrop Grumman, X-47B Capabilities, 2015, http://www.northropgrumman.com/Capabilities/x47bucas/Pages/default.aspx

[2] David Smalley, The Future Is Now: Navy’s Autonomous Swarmboats Can Overwhelm Adversaries, ONR Press Release, October 5, 2014, http://www.onr.navy.mil/en/Media-Center/Press-Releases/2014/autonomous-swarm-boat-unmanned-caracas.aspx

[3] Associated Press, Submarine launches undersea drone in a 1st for Navy, Military Times, July 20, 2015, http://www.militarytimes.com/story/military/tech/2015/07/20/submarine-launches-undersea-drone-in-a-1st-for-navy/30442323/

[4] Naval History and Heritage Command, Iowa II (BB-1), July 22, 2015, http://www.history.navy.mil/research/histories/ship-histories/danfs/i/iowa-ii.html

[5] Trevor Jeremy, LT Joe Kennedy, Norfolk and Suffolk Aviation Museum, 2015, http://www.aviationmuseum.net/JoeKennedy.htm

[6] Puppet Planes, All Hands, June 1946, http://www.navy.mil/ah_online/archpdf/ah194606.pdf, pp. 2-5

[7] Naval Doctrine Publication 6: Naval Command and Control, 1995, http://www.dtic.mil/dtic/tr/fulltext/u2/a304321.pdf, p. 6

[8] David Alberts and Richard Hayes, Understanding Command and Control, 2006, http://www.dodccrp.org/files/Alberts_UC2.pdf, p. 58

[9] Ben Rooney, Trading program sparked May ‘flash crash’, October 1, 2010, CNN, http://money.cnn.com/2010/10/01/markets/SEC_CFTC_flash_crash/

[10] DoD Dictionary of Military and Associated Terms, March 2017, http://www.dtic.mil/doctrine/new_pubs/jp1_02.pdf

[11] Joint Publication 3-0, Joint Operations, http://www.dtic.mil/doctrine/new_pubs/jp3_0.pdf

[12] Ibid.

[13] Andrew Przybylski, Kou Murayama, Cody DeHaan, and Valerie Gladwell, Motivational, emotional, and behavioral correlates of fear of missing out, Computers in Human Behavior, Vol 29 (4), July 2013, http://www.sciencedirect.com/science/article/pii/S0747563213000800

[14] Michael Palmer, Command at Sea: Naval Command and Control since the Sixteenth Century, 2005, p. 215

[15] W. T. Stead, Wireless Wonders at the Admiralty, Dawson Daily News, September 13, 1908, https://news.google.com/newspapers?nid=41&dat=19080913&id=y8cjAAAAIBAJ&sjid=KCcDAAAAIBAJ&pg=3703,1570909&hl=en

[16] Fleet Commanders Fear Armchair Control During War by Means of Wireless, Boston Evening Transcript, May 2, 1908, https://news.google.com/newspapers?nid=2249&dat=19080502&id=N3Y-AAAAIBAJ&sjid=nVkMAAAAIBAJ&pg=470,293709&hl=en

[17] Martin van Creveld, Command in War, 1985, pp. 256-257.

[18] CINCLANT Serial (053), Exercise of Command – Excess of Detail in Orders and Instructions, January 21, 1941

Featured Image: An X-47B drone prepares to take off. (U.S. Navy photo)

3 thoughts on “Unmanned Mission Command, Pt. 1”

  1. Eliminating detailed command as a norm is long overdue in the Navy.

    Hard to do when troops can be greeted on the beach by reporters and the president indulges in monitoring spec ops via live stream.

    To actually make this happen, the review process for those in command needs to re-emphasize the development of leaders as a priority, rather than merely not screwing up, which perpetuates our risk-averse environment.

    This is tough at a time when collisions and widespread graft are consuming leadership’s attention. But it is worth noting that leadership also failed to prevent either outcome.

    Junior officers will not become good senior officers without time off the leash. This will involve making errors, and hopefully extracting themselves from predicaments. Being on the leash means never committing errors, which is not the best way to learn and, more important, a lousy way to season men for independent command.

    But if you cannot do it with your men, you will never do it with a machine.

  2. Valuable and interesting points made here about managing machines.

    For all the current commercial and military focus on maritime leadership, with formal leadership and teamwork courses required for advancement and certification, these courses nearly always miss (as you suggest) an important component of modern maritime supervision: machine management.

    Whether autonomous, semi-autonomous, adaptive, or simply highly automated, machines play a large and increasing role in our fleets. Shipboard leadership involves knowing how best to apportion tasking between humans and machines or between different types of machines to optimize productivity or mission impact; how to train and provide feedback to machine operators and even to the machines themselves; how to modify machines or their functions to better accomplish the mission or keep operating during casualties or battle damage; how to collaborate usefully with machine technicians and engineers; what the machine’s true capabilities & limitations are – including “off-label” applications; how to recognize impending machine malfunction and fend it off; how to recognize and optimize a machine’s “quirks,” etc.

    These management concepts have existed as long as machines have, but take on increased importance on ships of decreasing crew size, and where AI plays an increasing role. We need to recognize the Hal in the room – perhaps it’s better to say, “We’re all Scotty now” – and teach maritime leaders how to manage machines.

  3. Control of autonomous systems is a crucial area of discussion for future warfighting, but I am afraid the author is a bit light on his literature review, pulling the argument back several steps from where we already are.

    First, I love the reference to Coast Battleship No. 4 as a radio-controlled target; a truly little-known bit of Navy history. Missed, though, was the TDR-1, a purpose-built optionally-manned aircraft, which showed the limits of early guided weapons. [http://www.navalaviationmuseum.org/attractions/aircraft-exhibits/item/?item=tdr]
    As a further aside for those interested, I recently read the book “Warriors and Wizards” by Martin J. Bollinger, about German glide bombs in WWII. Bottom line: remotely piloting weapons hurt efficiency so much that the 1940s efforts were largely a waste compared with cheaper traditional options.

    Skipping ahead a few decades, though, the US Navy fielded a wide variety of autonomous weapons–Sidewinder, Harpoon, Mk 48, etc.–many of which have proven their effectiveness in combat. Fire-and-forget weapons are heavily biased, but they are still logic-driven machines, and operators need to know that logic to effectively employ them.

    Pushing forward into the future, I don’t expect a categorical shift, but I do expect that the split between manual skills and underlying systems knowledge will continue to shift towards understanding the autopilot vice flying by the seat of the pants.

    The last time there was a “revolution in military affairs”, we lost decades of hard-won knowledge chasing the illusory promise of easy victory. Let’s not make the same mistake by inventing problems; just find out what we’re already doing and improve on that.
