In “A Scandal in Bohemia”, Sherlock Holmes laments, “You [Watson] see, but you do not observe. The distinction is clear.” Such is the current lament of America’s fleet of UCAVs, UGVs, and other assorted U_Vs: they have neither concept nor recognition of the world around them. To pass from remote drones living on the edges of combat to automated systems at the front, drones must cross the Rubicon of recognition.
To See
The UCAV is the best place to start, as the skies are the cleanest canvas upon which drones could cast their prying eyes. As with any surveillance system, the best ones are multi-faceted. Humans use their five senses and a good portion of deduction. Touch is a bit too close for a UCAV, smell and hearing would be both useless and uncomfortable at high speed, and taste would be awkward. Without that creative deductive spark, drones will need a bit more than a Mk 1 Eyeball and a basic radar picture. Good examples of how a drone might literally “see” are the layered optics of the ENVG (Enhanced Night Vision Goggle) or the sensors of the RLS (Artillery Rocket Launch Spotter).
Operators of typical optical systems switch between different modes to understand a picture. The USN’s Mk38 Mod 2 25mm Bushmaster has a camera system with an Electro-Optical System (EOS), Forward Looking Infrared (FLIR), and a laser range-finder. While a Mod 2 operator switches between the EOS and FLIR, in the ENVG both modes are combined to create an NVG that is difficult to blind. For a drone, a single combined display isn’t even necessary; a computer can process every input at once. Optical systems can also be mounted at multiple locations on the UCAV to aid in building a 3-D composite of the contact being viewed. Using an array of both EOS and FLIR systems simultaneously could allow drones to “see” targets in more varied and specific aspects than the human eye can.
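To make that “every input at once” point concrete, here is a minimal sketch in Python of fusing co-registered EO and FLIR frames into a single multi-channel image. The frame sizes and random “capture” are hypothetical stand-ins for illustration, not any fielded system’s interface:

```python
import numpy as np

def fuse_frames(eo_frame: np.ndarray, ir_frame: np.ndarray) -> np.ndarray:
    """Stack co-registered EO and FLIR frames into one multi-channel image."""
    if eo_frame.shape != ir_frame.shape:
        raise ValueError("frames must be co-registered to the same geometry")
    # Normalize each sensor to [0, 1] so neither channel dominates.
    eo = eo_frame.astype(float) / max(float(eo_frame.max()), 1.0)
    ir = ir_frame.astype(float) / max(float(ir_frame.max()), 1.0)
    # The last axis carries one channel per sensor: a detector reading this
    # stack "sees" both spectra in a single pass, with no mode switching.
    return np.stack([eo, ir], axis=-1)

# Hypothetical capture: two 480x640 frames from co-boresighted cameras.
eo = np.random.randint(0, 256, (480, 640))
ir = np.random.randint(0, 256, (480, 640))
print(fuse_frames(eo, ir).shape)  # (480, 640, 2)
```

Where a human operator must toggle between views, the machine simply consumes the stack whole.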
For the deployment of these sensors, the RLS is a good example of how sensors can “pass” targets to one another. In the RLS, after target data is collected by the audio and IR sensors, flagged threats are passed to the higher-grade FLIR for further designation and a potential fire-control solution. A UCAV outfitted with multiple camera systems could, in coordination with radar, pass detected targets that fall within certain parameters “up” to better sensors. Targets viewed only in wide-angle optical scans (such as a stealth aircraft that radar has missed) can be passed “down” to radar for further scrutiny along that bearing. A UCAV must be given a suite of sensors that serves not merely a remote human operator, but the specific utility of the UCAV itself, taking advantage of a computer’s ability to exploit every input at once.
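A rough sketch of that “pass up / pass down” logic follows; the Contact fields and the 0.5 threshold are invented here for illustration, not drawn from the RLS itself:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    bearing_deg: float
    elevation_deg: float
    seen_by_radar: bool
    seen_by_optics: bool
    threat_score: float  # first-pass score from the wide-angle detector, 0..1

def cue_sensors(contacts, threat_threshold=0.5):
    """Route each contact to the next sensor in the chain, RLS-style."""
    taskings = []
    for c in contacts:
        if c.threat_score >= threat_threshold:
            # Pass "up": slew the narrow-field FLIR for designation and a
            # potential fire-control solution.
            taskings.append(("flir_slew", c.bearing_deg, c.elevation_deg))
        elif c.seen_by_optics and not c.seen_by_radar:
            # Pass "down": an optics-only contact (a possible low-observable
            # aircraft) earns a focused radar dwell along that bearing.
            taskings.append(("radar_dwell", c.bearing_deg, c.elevation_deg))
    return taskings

contacts = [Contact(45.0, 3.0, True, True, 0.9),
            Contact(210.0, 1.5, False, True, 0.2)]
print(cue_sensors(contacts))
# [('flir_slew', 45.0, 3.0), ('radar_dwell', 210.0, 1.5)]
```

Note the second rule: disagreement between sensors is itself a cue, which is exactly the behavior a remote human operator provides today.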
And Observe
However, this vast suite of ISR equipment still leaves a UCAV high-and-dry when it comes to target identification. Another officer suggested to me that, “for a computer to identify an air target, it has to have an infinite number of pictures of every angle and possibility.” With 3-D rendered models of the aircraft of interest, a UCAV could have that infinite supply of pictures, with varying sets of weapons and angles of light. If a UCAV can identify an aircraft’s course and speed, it can shrink that range of comparison to other aircraft or missiles by orienting the contact’s shape, and every comparative model, along the contact’s true motion axis. Whereas programs like facial-recognition software must build their models from front-on pictures, we already have the specifications of most if not all global aircraft. Just as, when searching the internet for this article, typing “Leading” into the search bar eliminates all returns without that word, a UCAV could eliminate all fighter aircraft when looking at a Boeing 747. 3-D modeled comparisons, sharpened by target-angle perspective, could identify an airborne contact from any aspect.
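As a sketch of how course and speed could collapse that comparison space, consider scoring an observed silhouette against a library pre-rendered from 3-D models. The 10-degree aspect buckets, the IoU threshold, and the library layout below are all assumptions for illustration:

```python
import numpy as np

def aspect_angle(contact_course_deg: float, bearing_to_contact_deg: float) -> float:
    """Angle between the contact's motion axis and our line of sight.
    Course and speed pin down how the airframe is oriented, so only one
    rendered view per candidate model needs checking."""
    return (contact_course_deg - bearing_to_contact_deg) % 360

def identify(observed_mask, library, aspect_deg, min_iou=0.6):
    """Score the observed silhouette against each model's pre-rendered
    silhouette at the same aspect, discarding poor matches outright."""
    bucket = int(aspect_deg // 10) * 10  # library rendered in 10-degree steps
    scores = {}
    for name, views in library.items():
        template = views.get(bucket)
        if template is None:
            continue  # no rendering at this aspect; cannot compare
        inter = np.logical_and(observed_mask, template).sum()
        union = np.logical_or(observed_mask, template).sum()
        iou = inter / union if union else 0.0
        if iou >= min_iou:  # a 747 never survives comparison to a fighter
            scores[name] = iou
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

The orientation step is what turns “an infinite number of pictures” into a single lookup: the motion axis tells the computer which rendering to pull before any pixel is compared.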
A UCAV also need not positively identify every single airborne target. It could be loaded with a set of parameters as well as a database limited to the aircraft of concern in the operating area. AEGIS flags threats by speed, trajectory, and other factors; so too could a UCAV gauge its interest in a contact based on target angle and speed relative to the Carrier Strike Group (CSG). Loading every conceivable aircraft into an onboard database makes no more sense than training a pilot to recognize the make and model of every commercial aircraft on the planet. A scope of parameters for “non-military” could be loaded into a UCAV along with specific models of regional aircraft-of-interest. The end-around of strapping external weapons to commercial aircraft, or of using those aircraft themselves as weapons, could be defeated by the previously noted course/speed parameters, along with a database of weapon models.
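In code, that AEGIS-style gauging might reduce to a simple kinematic gate. Every field name and threshold below is a placeholder for illustration, not doctrine:

```python
from dataclasses import dataclass

@dataclass
class Track:
    speed_kts: float
    course_deg: float
    bearing_from_csg_deg: float   # true bearing from the CSG to the contact
    matches_civil_profile: bool   # fits the "non-military" parameter scope

def interest_level(t: Track) -> str:
    """Grade a contact by kinematics relative to the CSG rather than by
    exhaustively identifying every airframe."""
    # A contact flying the reciprocal of its bearing from the CSG is inbound.
    inbound_course = (t.bearing_from_csg_deg + 180) % 360
    offset = abs(t.course_deg - inbound_course) % 360
    offset = min(offset, 360 - offset)
    closing = offset < 30  # within 30 degrees of a direct inbound run
    if closing and t.speed_kts > 450:
        return "flag"     # fast and inbound: cue the identification chain
    if closing and not t.matches_civil_profile:
        return "monitor"  # inbound and outside the civil parameter scope
    return "ignore"       # neither inbound nor anomalous

print(interest_level(Track(520.0, 270.0, 90.0, False)))  # flag
```

A contact that never trips the gate never needs the expensive model comparison at all, which is the point: identification effort should follow interest, not precede it.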
Breaking Open the Black Box
The musings of an intrigued amateur will not solve these problems; our purpose here is to break open the black box of drone operations and start thinking about our next step. We take for granted the remote connections that enable our unmanned operations abroad, leaving a hideously soft underbelly: at the slightest resistance, our drones can be compromised, destroyed, or surveilled. Success isn’t as simple as building the airframe and programming it to fly. For a truly successful UCAV, autonomy must be a central goal. A whole bevy of internal processes must be mastered, in particular the UCAV’s ability to conceive of and understand the world around it. The more we parse out the problem, the more ideas we may provide to those who can execute them. I’m often told that “if they could do this, they would have done it”… but there’s always a first time.
Matt Hipple is a surface warfare officer in the U.S. Navy. The opinions and views expressed in this post are his alone and are presented in his personal capacity. They do not necessarily represent the views of the U.S. Department of Defense or the U.S. Navy.
An example of excellence in forward thinking, well done! This man will go far, and so too will his “all-seeing UCAVs”; I hope he becomes part of the project.
You’ve highlighted one of the shortfalls with the US Navy approach to unmanned ISR so far. There seems to be an assumption that UAVs with a sensor will be immediately able to pass data to the shooter in order to enable kinetic action. This could be the case against a high-end enemy that can be quickly and definitively identified through technical means. What has happened over the last set of wars, however, is that the “enemy” can be hard to identify… and it often takes a lot of time and data to enable serious human-in-the-loop analysis to get usable targeting data. There are probably ways to use technology to minimize the need for eyes-on, real-time human involvement in collection… but putting a camera and a radar on a UAV does not automate the targeting process.
Absolutely! Perhaps I watered down my point with the bit about integrated ISR devices, but the ability to read what those cameras are showing is the most important part of this puzzle.