Leading the Blind: Teaching UCAV to See

In “A Scandal in Bohemia”, Sherlock Holmes laments, “You [Watson] see, but you do not observe. The distinction is clear.” Such is the current lament of America’s fleet of UCAVs, UGVs, and other assorted U_Vs: they have neither concept nor recognition of the world around them. To pass from remote drones living on the edges of combat to automated systems at the front, drones must cross the Rubicon of recognition.

To See

Still can't see a thing. Help!

The UCAV is the best place to start, as the skies are the cleanest canvas upon which drones could cast their prying eyes. As with any surveillance system, the best ones are multi-faceted. Humans use their five senses and a good portion of deduction. Touch is a bit too close for a UCAV, smell and hearing would be both useless and uncomfortable at high speed, and taste would be awkward. Without that creative deductive spark, drones will need a bit more than a Mk 1 Eyeball. Good examples of how a drone might literally “see” beyond a basic radar picture are the layered optics of the ENVG (Enhanced Night Vision Goggle) and the Artillery Rocket Launch Spotter (RLS).

Operators of typical optical systems switch between different modes to understand a picture. The USN Mk 38 Mod 2 25mm Bushmaster has a camera system with an electro-optical sensor (EOS), forward-looking infrared (FLIR), and a laser range-finder. While a Mod 2 operator switches between the EOS and FLIR, the ENVG combines both modes to create an NVG that is difficult to blind. For a drone, digital combination isn’t even necessary; a computer can take in every input at once. Optical systems can also be mounted at multiple locations on the UCAV to help build a 3-D composite of the contact being viewed. Using an array of EOS and FLIR systems simultaneously could allow drones to “see” targets in more varied and specific aspects than the human eye.
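
To make the point concrete, here is a minimal sketch of that kind of simultaneous ingestion; the sensor arrangement, array sizes, and blending weights are my own invention for illustration, not any fielded system.

```python
# Illustrative sketch only: fusing simultaneous EOS and FLIR frames into one
# composite picture. Sensor counts, frame shapes, and weights are hypothetical.
import numpy as np

def fuse_frames(eos_frames, flir_frames, eos_weight=0.5, flir_weight=0.5):
    """Average co-registered frames from multiple mounting points, then blend
    the EOS and FLIR composites into a single 2-D 'interest' map."""
    eos_composite = np.mean(np.stack(eos_frames), axis=0)
    flir_composite = np.mean(np.stack(flir_frames), axis=0)
    return eos_weight * eos_composite + flir_weight * flir_composite

# Example: two EOS cameras and two FLIR sensors, each producing a 480x640 frame.
eos = [np.random.rand(480, 640) for _ in range(2)]
flir = [np.random.rand(480, 640) for _ in range(2)]
interest_map = fuse_frames(eos, flir)
print(interest_map.shape)  # (480, 640)
```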

For the deployment of these sensors, the RLS is a good example of how sensors can “pass” targets to one another. In the RLS, after target data is collected by audio and IR sensors, flagged threats are passed to the higher-grade FLIR for further designation and a potential fire-control solution. A UCAV outfitted with multiple camera systems could, in coordination with radar, pass detected targets that meet certain parameters “up” to better sensors. Targets detected in wide-angle optical scans (such as stealth aircraft visible only to optics) could be passed “down” to radar for further scrutiny along that bearing. A UCAV must be given a suite of sensors intended not merely to serve a remote human operator, but for the specific use of the UCAV itself, taking advantage of a computer’s ability to process every input at once.
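
A rough sketch of that cueing logic, with thresholds and contact fields invented purely for illustration, might look like this:

```python
# Illustrative cueing sketch: hand contacts between sensors based on simple
# thresholds. All thresholds and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Contact:
    bearing_deg: float
    range_km: float
    radar_return: float        # normalized 0..1
    optical_confidence: float  # normalized 0..1

def assign_sensor(contact: Contact) -> str:
    """Decide which sensor should scrutinize a contact next."""
    if contact.optical_confidence > 0.7 and contact.range_km < 20:
        return "narrow-field FLIR"           # pass "up" for fine designation
    if contact.radar_return < 0.2 and contact.optical_confidence > 0.4:
        return "radar (focused on bearing)"  # pass "down" a dim radar target
    return "wide-angle optical scan"         # keep searching

print(assign_sensor(Contact(bearing_deg=45.0, range_km=15, radar_return=0.1,
                            optical_confidence=0.8)))
```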

And Observe

In-game models for real-life comparison.

However, this vast suite of ISR equipment still leaves a UCAV high and dry when it comes to target identification. An officer once suggested to me that, “for a computer to identify an air target, it has to have an infinite number of pictures of every angle and possibility.” With 3-D rendered models of the aircraft of interest, a UCAV could have that infinite supply of pictures, with varying weapons loads and angles of light. If a UCAV can identify an aircraft’s course and speed, it can narrow the range of comparison to other aircraft or missiles by orienting the contact’s shape, and all comparative models, along that true-motion axis. Whereas facial-recognition software must build its models from front-on photographs, we already have the specifications of most, if not all, aircraft in the world. Just as typing “Leading” into a search bar eliminates every return without that word, a UCAV could eliminate all fighter aircraft when looking at a Boeing 747. 3-D modeled comparisons, sharpened by target-angle perspective, could identify an airborne contact from any angle.
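
As a toy illustration of that elimination step, consider pruning a model library by observed speed and size before any silhouette comparison is even attempted; the library entries and limits below are made up for the example.

```python
# Illustrative elimination sketch: prune a model library by observed kinematics
# before silhouette comparison. Entries and performance limits are hypothetical.
MODEL_LIBRARY = {
    "airliner_747": {"max_speed_kts": 570, "wingspan_m": 68},
    "fighter_generic": {"max_speed_kts": 1300, "wingspan_m": 14},
    "cruise_missile_generic": {"max_speed_kts": 550, "wingspan_m": 3},
}

def prune_candidates(observed_speed_kts, estimated_span_m, span_tolerance_m=5):
    """Keep only models whose performance and size could match the contact."""
    return [
        name for name, spec in MODEL_LIBRARY.items()
        if observed_speed_kts <= spec["max_speed_kts"]
        and abs(spec["wingspan_m"] - estimated_span_m) <= span_tolerance_m
    ]

# A large, slow contact with a ~65 m span eliminates fighters and missiles.
print(prune_candidates(observed_speed_kts=480, estimated_span_m=65))
```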

A UCAV also need not positively identify every single airborne target. It could be loaded with a set of parameters as well as a database limited to the aircraft of concern in the operating area. AEGIS flags threats by speed, trajectory, and other factors; so too could a UCAV gauge its interest in a contact based on target angle and speed relative to the Carrier Strike Group (CSG). Loading every conceivable aircraft into an onboard database makes no more sense than training a pilot to recognize the make and model of every commercial aircraft on the planet. A scope of “non-military” parameters could be loaded into a UCAV along with the specific models of regional aircraft-of-interest. The end-around of strapping external weapons to commercial aircraft, or of using those aircraft themselves as weapons, could be defeated by the previously noted course/speed parameters, as well as a database of weapons models.
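
A minimal sketch of such parameter-based flagging, with thresholds and weights invented solely for illustration (not AEGIS doctrine or any real ruleset), might look like this:

```python
# Illustrative interest-scoring sketch in the spirit of flagging by speed and
# closing geometry. Every threshold and weight here is hypothetical.
def interest_level(speed_kts, closing_velocity_kts, range_to_csg_nm, in_database):
    score = 0
    if speed_kts > 600:             # unusually fast for commercial traffic
        score += 2
    if closing_velocity_kts > 300:  # pointed at the strike group and closing
        score += 2
    if range_to_csg_nm < 50:        # inside the zone we care about
        score += 1
    if not in_database:             # not a known regional aircraft-of-interest
        score += 1
    return "investigate" if score >= 3 else "monitor"

print(interest_level(speed_kts=450, closing_velocity_kts=420,
                     range_to_csg_nm=40, in_database=False))
```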

Breaking Open the Black Box

The musings of an intrigued amateur will not solve these problems; our purpose here is to break open the black box of drone operations and start thinking about our next step. We take for granted the remote connections that allow our unmanned operations abroad, but they leave a hideously soft underbelly through which our drones can be compromised, destroyed, or surveilled at the slightest resistance. Success isn’t as simple as building the airframe and programming it to fly. For a truly successful UCAV, autonomy must be a central goal. A whole bevy of internal processes must be mastered, in particular the ability of the UCAV to conceive of and understand the world around it. The more we parse out the problem, the more ideas we may provide to those who can execute them. I’m often told that “if they could do this, they would have done it”… but there’s always a first time.

Matt Hipple is a surface warfare officer in the U.S. Navy.  The opinions and views expressed in this post are his alone and are presented in his personal capacity.  They do not necessarily represent the views of U.S. Department of Defense or the U.S. Navy.

Video Game AI and the Future UCAV Top Gun

My brother in flight school should be glad we played so much Ace Combat 4.
Alright, Roomba, now start sweeping for enemy units.

A Roomba is useful because it can sweep up regular messes without constant intervention, not because it can exit and enter its docking station independently. Although the Navy’s new X-47B Unmanned Combat Air Vehicle (UCAV) has, by landing on a carrier, executed an astounding feat even for humans, this ability only means our weapons have matured past their typical one-way trips. The real challenge will be getting a UCAV to defend units while sweeping up the enemy without remote guidance (i.e. autonomously). The answer is as close as the games running on your Xbox console.

Player One: Insert Coin

Simulated fighters are UCAVs having an out-of-body experience.

Considering the challenge of how an air-to-air UCAV might be programmed, recall that multiple generations of America’s youth have already fought untold legions of advanced UCAVs. Developers have created artificial “intelligences” designed to combat a human opponent in operational and tactical scenarios with imperfect information; video games have paved the way for unmanned tactical computers.

A loose application of video-game intelligence (VGI) would work because VGI is designed to operate in the same constrained informational environment in which a real-life UCAV would operate. Good (i.e. fun) video-game AI exists under the same fog-of-war constraints as its human opponents; the same radar, visual cues, and alerts are provided to computer and human players alike, and the tools to lift that veil are the same for both. Often, difficulty levels in video games are based not just on the durability and damage of an enemy, but on the governors programmers install on a VGI to keep the competition fair for the human opponent. This is especially evident in real-time strategy (RTS) games, where the light-speed, all-encompassing force management and resource calculations of a VGI can more often than not overwhelm the subtler, but slower, finesse of the human mind within the confines of the game. Those who wonder when humans will go to war using autonomous computers fail to see the virtual test-bed in which we already have, billions of times.
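
For the curious, here is a toy sketch of those two ideas, a shared sensor filter and a difficulty “governor”; every number and field name is invented for illustration, not taken from any game engine.

```python
# Illustrative sketch: both players see the world only through the same sensor
# filter, and difficulty comes from limiting how often the AI may act.
import random

def sensor_filter(true_contacts, sensor_range):
    """Both human and AI players receive only contacts inside sensor range."""
    return [c for c in true_contacts if c["range"] <= sensor_range]

class GovernedAI:
    def __init__(self, actions_per_second):
        self.interval = 1.0 / actions_per_second  # the difficulty knob
        self.next_action_time = 0.0

    def act(self, sim_time, visible_contacts):
        if sim_time < self.next_action_time or not visible_contacts:
            return None
        self.next_action_time = sim_time + self.interval
        return min(visible_contacts, key=lambda c: c["range"])  # engage nearest

world = [{"id": i, "range": random.uniform(5, 120)} for i in range(6)]
ai = GovernedAI(actions_per_second=2)  # a "hard" setting would raise this number
print(ai.act(sim_time=0.0, visible_contacts=sensor_filter(world, sensor_range=60)))
```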

This Ain’t Galaga

No extra lives, and forget about memorizing the level’s flight patterns.

The uninitiated must understand how VGI has progressed by leaps and bounds from the pre-programmed paths of games such as the early-1980s arcade shooter Galaga; computer opponents now hunt, take cover, maneuver evasively, and change tactics based on opportunities or sudden states of peril. The 2000s Half-Life and Halo game series were especially lauded for their revolutions in AI, creating opponents that seemed rational and adapted to a player’s tactics. For the particular case of UCAV air-to-air engagements, combat flight simulators are nearly innumerable, from Fighter Pilot on the Commodore 64 in 1984 to the Ace Combat series. Computers have been executing pursuit curves, displacement rolls, and defensive spirals against their human opponents since before I was born.

However, despite its utility, VGI is still augmented with many “illusions” of intelligence, mere pre-planned responses (PPR); the real prize is a true problem-solving VGI to drive a UCAV. That requires special programming and far more processing power. In a real UCAV, such a VGI would be installed in a suite far more advanced than a single Core i7 or an Xbox. For a learning, adapting, problem-solving tactical computer, the DARPA SyNAPSE program offers new possibilities, especially when short-term analog reasoning is coordinated with messier evolutionary algorithms. Eventually, as different programs learn and succeed, they can be downloaded to replace the lesser adaptations on other UCAVs.
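
To illustrate the evolutionary half of that idea, here is a toy selection loop; the fitness function, parameters, and fleet “download” are stand-ins of my own, not SyNAPSE or any real program.

```python
# Illustrative evolutionary-selection sketch: score candidate "tactics"
# parameter sets, keep the best, mutate copies, and push the winner to the
# rest of the fleet. A toy loop only; all names and numbers are hypothetical.
import random

def score(tactic):
    # Stand-in fitness; a real score would come from simulated engagements.
    return -abs(tactic["aggressiveness"] - 0.7) - abs(tactic["standoff_nm"] - 20) / 100

def evolve(population, generations=50):
    for _ in range(generations):
        population.sort(key=score, reverse=True)
        parents = population[: len(population) // 2]        # keep the best half
        children = [
            {k: v + random.gauss(0, 0.05 if k == "aggressiveness" else 1.0)
             for k, v in random.choice(parents).items()}     # mutate a parent
            for _ in range(len(population) - len(parents))
        ]
        population = parents + children
    return max(population, key=score)

fleet = [{"aggressiveness": random.random(), "standoff_nm": random.uniform(5, 60)}
         for _ in range(20)]
best = evolve(fleet)
fleet_loadout = [dict(best) for _ in range(8)]  # "download" the winner fleet-wide
print(best)
```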

I’ve Got the Need, The Need For Speed

Unlike Maverick, drones will never have to go through motorcycle safety training.

When pilots assert that they are more intuitive than computer programs, they are right; this is, however, like saying the amateur huntsman with an AR-15 is less trained than an Austrian arquebusier. The advantage is not in the quality of tactical thought, but in the problem-solving rate of fire and the speed of physical action. A VGI executing air-to-air tactics in a UCAV can run the OODA loop across the whole of its inputs far faster than the human mind, even if humans, with their creativity and intuition, may be faster at solving particular, focused problems. Even with the new advanced HUD systems in their helmets, human beings cannot integrate input from all sensors at a single instant in time (let alone control other drones). Human pilots are also limited in their physical ability to maneuver. G-suits exist because our 4th- and 5th-generation fighters have abilities far in excess of what the human body can withstand; aircraft tactical performance is artificially lowered to prevent the death or severe injury of the pilot inside.

Pinball Wizard: I Can’t See!

VGI doesn’t have a problem with the how; it’s the who that will be the greatest challenge when the lessons of VGI are integrated into a UCAV. In a video game, the VGI is blessed with instant recognition: its enemy is automatically identified when units are revealed, and their type is provided instantly to both human and VGI. A UCAV unable to differentiate between radar contacts or identify units via its sensors is at a disadvantage to its human comrades and enemies. Humans still dominate at integrating immediate, qualitative analysis with ISR inside the skull’s own OODA loop. Even during the landing sequence, the UCAV cheated, in a way, by being fed positional data from the carrier.

We’ve passed the tutorial level of unmanned warfare: we’ve created unmanned platforms capable of navigating the skies and a vast array of programs designed to work tactical problems against human opponents. But before we pat ourselves on the back, we need to integrate those capabilities effectively into an independent platform.

Matt Hipple is a surface warfare officer in the U.S. Navy.  The opinions and views expressed in this post are his alone and are presented in his personal capacity.  They do not necessarily represent the views of U.S. Department of Defense or the U.S. Navy.

Defeating Floating IEDs with USVs

By CDR Jeremy Thompson, USN

This concept proposal explores a technology solution to the problem of risk to first responders when identifying, neutralizing, and exploiting “surface-floating” maritime improvised explosive devices (SF/MIEDs).

Does the Navy need a maritime equivalent of the Talon Counter-IED robot?

When considering the proliferation of technology for use against land-based improvised explosive devices (IEDs), it may be puzzling to many observers why remote IED Defeat (IEDD) technologies, particularly robots, have yet to fully cross over into the maritime domain. Although some unmanned underwater vehicle programs designed for limpet mine-like object detection on ships are in development, much less attention has been given to countering SF/MIEDs. In general, the purpose of MIEDs is to destroy, incapacitate, harass, divert, or distract targets such as ships, maritime critical infrastructure and key resources (CI/KR), and personnel. MIEDs may also present obstacles (real or perceived) with the purpose of area denial or egress denial. As a subset of the MIED family, the “surface-floating” MIED operates on the water’s surface in environments such as harbors, the littorals, the riparian, and the open ocean. It may be either free floating or self-propelled, with remote control (manual or pre-programmed) or with no control (moves with the current). It is a tempting low-tech, low-cost option for an adversary.

Thankfully, SF/MIED incidents have been rare in recent times, the last significant use occurring during the Vietnam War. Nonetheless, they highlight a capability gap: a human must still unnecessarily expose himself or herself to the object. One material solution may be to develop an IED Defeat Unmanned Surface Vessel (USV) around a design philosophy based on the IEDD robots used in land warfare. Protection of high-value units and CI/KR would be its primary missions, along with counter-area denial. Its most likely operating environments would be CI/KR-dense areas such as harbors and seaports, as well as rivers, where constricted waterspace leaves shipping little room to maneuver around SF/MIED threats. A key element of the design philosophy for an IEDD USV would be meeting the expectations of the customer: the first responder. Military explosive ordnance disposal (EOD) units and civilian bomb squads are far more likely to accept a platform whose console and other human-interface features are nearly identical in look, placement, feel, and responsiveness to the most popular robots they already operate, such as the TALON by QinetiQ and the PackBot by iRobot.

A functional hierarchy could be drawn around major tasks: reacquisition of a suspected surface-floating IED, identification/classification, threat removal, neutralization, and recovery of the IED for exploitation. Modularized payload packages to execute these tasks might include a towing package; an attachments package (e.g., hooks, magnets); a neutralization-tool package, with both precision and general-disruption EOD tools; an explosives, chemical, and radiological detection package; and an electronic countermeasures package.
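
As a sketch of how that task-to-package mapping might be organized in software, consider the following; the mapping itself is a hypothetical example, not a program of record.

```python
# Illustrative sketch of mapping the functional hierarchy to modular payload
# packages. Task and package names follow the paragraph above; the pairing of
# tasks to packages is a hypothetical example.
PAYLOAD_PACKAGES = {
    "towing": ["tow line", "quick-release winch"],
    "attachments": ["hooks", "magnets"],
    "neutralization": ["precision disruptor", "general disruption charge"],
    "detection": ["explosives sniffer", "chemical sensor", "radiological sensor"],
    "countermeasures": ["RF jammer"],
}

TASK_TO_PACKAGES = {
    "reacquire": ["detection"],
    "identify/classify": ["detection"],
    "threat removal": ["towing", "attachments"],
    "neutralize": ["neutralization", "countermeasures"],
    "recover for exploitation": ["attachments", "towing"],
}

def loadout_for(tasks):
    """Return the minimal set of packages needed for a mission's task list."""
    packages = {pkg for task in tasks for pkg in TASK_TO_PACKAGES[task]}
    return {pkg: PAYLOAD_PACKAGES[pkg] for pkg in sorted(packages)}

print(loadout_for(["identify/classify", "neutralize"]))
```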

Numerous trade-offs between weight, power, stability, and the complexity of modular packages would need to be considered and tested; however, variants like a “high-low” combination of a complex and a simple USV working together may minimize some of the trade-off risk. If an IEDD USV were to be developed, key recommendations include:

  • Official liaison within NAVSEA (U.S. Naval Sea Systems Command) between PMS-406 (Unmanned Maritime Systems) and PMS-408 (EOD/CREW program) to ensure the transfer of USV expertise between PMS divisions.
  • A DOTMLPF assessment to determine whether limpet mines or surface-floating IEDs are more likely and more dangerous to U.S. assets and personnel given the uncertainty of future naval operations.
  • Including civilian bomb squads early in the design and development process to increase the potential for demand and cross-over with the law-enforcement sector, and therefore reduce long-term program costs.

Current UUV programs under development include the Hull UUV Localization System (HULS) and Hovering Autonomous Underwater Vehicle (HAUV).

This article was re-posted by permission from, and appeared in its original form at NavalDrones.com.

What’s at Stake in the Remote Aviation Culture Debate

It has been written that it is difficult to become sentimental about . . . the new type of seaman—the man of the engine and boiler rooms. This idea is born of the belief that he deals with material things and takes no part in the glorious possibilities of war or in the victories that are won from storms. This theory is absolutely false . . . for there is music as well as the embodiment of power about the mechanisms that drive the great ships of today.

—Capt Frank Bennett, USN
The Steam Navy of the United States, 1897

Hunting for a wingman

From our flyboy friends in the U.S. Air Force comes the article “The Swarm, the Cloud, and the Importance of Getting There First” in the July/Aug issue of the Air & Space Power Journal (including the lead-in excerpt). In it, friend-of-CIMSEC Maj David Blair and his partner Capt Nick Helms, both manned-aircraft and drone pilots, address their vision for the future of the aviation warfare concept of operations and the cultural sea changes that must take place to accommodate it. Needless to say, such a vision is also relevant to the future of naval aviation. So if you’ve got some beach-reading time ahead of you, dig in. The link above includes the full article:

This article advocates an aviation future of manned–remotely piloted synergy in which automation amplifies rather than replaces the role of aviators in aviation. In this vision, aviators are judged solely by their effects on the battlefield. Amidst this new standard of decentralized execution is the “swarm,” a flock of highly sophisticated unmanned combat aerial vehicles that serve as “loyal wingmen” for manned strike aircraft. Here, every striker is a formation flexibly primed to concentrate effects at the most decisive times and locations. This future also includes the “cloud,” a mass of persistent remotely piloted aircraft (RPA) that provide vertical dominance through wholesale fire support from airspace cleared by the swarm. Fusion amplifies the human capacity for judgment by delegating routine tasks to automation and “demanding” versatile effects in response to fog and friction rather than “commanding” inputs.

The challenge is not technological but cultural. To realize this future, we first must accept remote aviation as a legitimate part of the Air Force story, and then we must look to deep streams of airpower thought in order to understand it. First, Gen Henry “Hap” Arnold teaches us air-mindedness—to fully leverage a technology, we must develop both humans and hardware. Second, Gen Elwood Quesada describes an aviator’s relationship with technology—the discussion is never “human versus machine”; rather, it concerns the relationship between humans and machines. Instead of a cybernetic view in which automation reduces the role of humans in the world, we argue for a capabilities-based perspective that uses automation to empower aviators to better control the battlespace. Third, Col John Boyd reminds us that identities are always in flux in response to changing technical possibilities.

Thus, the F-22 and the RPA are more akin than we realize since both embrace the power of advanced processors and networked data links. An Airman’s view of RPA futures enables manned–remotely piloted fusion, and both traditional and remote aviators must build that future together as equals. The friendly lives saved and enemy lives taken by RPAs in the air campaigns of the last decade merit this acceptance. 

Dave also recommends the article “Why Drones Work: The Case for Washington’s Weapon of Choice” by Daniel Byman.