
Proficiency Versus Effectiveness: What Readiness Is Not

Redefining Readiness Topic Week

By Jesse Schmitt

As I write this in late April, America’s National Football League (NFL) Draft season is in full swing. Around the country, hundreds of athletes showcase their athletic prowess in tests and combines, thousands of journalists, scouts, and coaches scrutinize every metric available, and millions of football fans greedily consume every rumor, analysis, and mock draft they can find in the “content desert” that is the football offseason. And every year, a certain class of player emerges. These athletes are genetic freaks, even among their impressive peer group. They run faster, leap higher, and throw farther than everyone else. They emerge from the NFL combines and pro days to mass media acclaim, with the eye-popping numbers to back it up. Among these standouts, a given NFL team will find a “favorite,” and the team’s ownership, coaches, or staff will invariably fall in love with that player’s potential, draft them in an early round, and sign them to a multi-million dollar, multi-year contract. Just as invariably, year after year, many teams’ favorites on draft day will be disappointments when the NFL season kicks off.

That is not a foregone conclusion, and a number of hyped early-round choices go on to successful NFL careers. But many more of them flame out; their athletic gifts do not provide them the same advantage in their professional careers as they had at the college level. It turns out there is more to being a star linebacker or running back than simply running fast and jumping far. Of course, no NFL scout can resist claiming they predicted such an outcome. “Yeah, his numbers were good,” they say knowingly, “but it didn’t show up on the game tape.”

Now, imagine these athletes never had the chance to play in a live game. There is no game tape to provide context to their incredible testing numbers. Imagine a world in which players were only ever tested, clocked, ranked, and drafted but never actually took the field to play against another team for a trophy. Each team has no schedule, but must be ready to play a game at any time. Could they truly project their ability to win by listing how strong and fast each player is?

The set for the 2010 NFL Draft at Radio City Music Hall in New York City. Photo by Marianne O’Leary – originally posted to Flickr as “NFL Draft 2010 Set at Radio City Music Hall”, CC BY 2.0, via Wikimedia Commons. NFL, NFL Draft, NFL teams, and other logos are registered trademarks of the National Football League and the respective teams.

The U.S. Department of Defense is like an NFL franchise that, for lack of any game tape, relies solely on the athletic testing numbers to determine how ready their players are for the next game. Internal metrics of success, as the only numbers available, take on an outsized importance in predicting battlefield success. But anyone with even a small amount of military (or athletic) experience understands that there is more to winning than simply being the most proficient person on the field. A military, like a football team, has to be effective, as well.

In common usage, “proficient” and “effective” are nearly synonymous, but the distinction between them is critical to understanding how the evaluation of military forces must change. “Proficiency” is competence or skill at a task, whereas “effectiveness” is the quality of successfully producing a desired or intended result. The reason they are so easily conflated is that in many cases, they are the same: when a young private finds his weapon is jammed, his proficiency at clearing his rifle should yield the effective result of getting back into the fight.

To continue the athlete analogy, proficiency is testing well at the combine; effectiveness is catching touchdowns and making tackles in a game. There is a connection between the two, but a loose one, depending on the position and individual player. 

Warfighting is not a task where proficiency necessarily correlates directly to effectiveness. Furthermore, there is no established metric for measuring battlefield proficiency. The U.S. has not fought an industrialized peer competitor since 1945, and even if it had, such proficiency would be only tangentially relevant in the hyper-connected information competition space of 2021 and beyond. 

The U.S. military’s current fixation on “readiness” emphasizes proficiency over effectiveness. The term has its roots in Congressional hearings: the Pentagon, ever vigilant of Congressional budgetary oversight, justifies budget requests and expenditures with metrics such as the number of exercises performed, deployments made, and man-hours spent maintaining and operating systems.

Moving down the chain, tactical units report similar metrics to the Combatant Commands, who are always hungry for data. Each echelon reports their own measures of performance because the appetite for budgetary justification is omnipresent and unforgiving. These are not selfish or incompetent actions on anyone’s part—it is simply the way that incentives have been structured. 

What is not being asked are the more important, but difficult to measure, questions: Are the actions being taken actually effective? Is the force actually capable of successfully producing the desired outcome, which is victory in a full-scale armed conflict? For that matter, is the goal of a modern military solely to win a declared conflict, or is it to compete effectively with global and regional adversaries before overt hostilities? Time will ultimately tell whether “readiness” actually reflects the ability to succeed, but there is plenty that could be done now to determine the answer before relying on hindsight.

Generals Brown and Berger explained this same problem in their recent War on the Rocks op-ed “Redefine Readiness or Lose.” These veterans of high-level DoD decision-making charged that “readiness” could not simply be synonymous with “availability,” as that definition stymied the incentives for long-term transformational action in favor of sustaining current systems and strategies to maintain a fictional sense of omnipresent readiness. Put slightly more irreverently, the driving incentive for too much of the DoD is presenting green (i.e., “ready”) metrics on PowerPoint slides to the boss, who can report green metrics to their boss, and so on and so forth, all the way to the Congressional Armed Services Committees. It is important to reiterate: these are not the results of incompetent or cowardly individuals; they are simply the result of the systemic incentives as they exist today.

This critique is not new, nor is it particularly challenged by most military leaders today. What remains, though, is a deafening silence about what the solution must be. The fact that change is needed is widely accepted; what that change should be is undetermined. The challenge, as always, is capturing a subjective measurement in appropriately objective metrics for data collection. How can one tell what a “ready” unit is? Up until now, the U.S. has relied far too much on measurements of proficiency. Units report their training activity continuously, and the most “ready” unit is the one that most recently completed a standard list of operations and tasks. The U.S. military is an athlete that runs an incredible 40-yard sprint, jumps through the gym’s roof, and throws the ball a mile and a half, but has never analyzed the game film to see how effective it really is.

The concept of U.S. military “game film” is problematic for two reasons: first, all clear-thinking Americans should hope and pray never to actually play the “game” for which the military trains. Military leaders should leverage “practice film,” to keep the analogy intact: how well does the military play in a setting as close to the game as possible?

Second is that the most recent “scrimmage” film the U.S. has, from Vietnam, Iraq, and Afghanistan, paints an ugly picture of its readiness to handle asymmetric challenges. The U.S. showed up to the game with almost every imaginable advantage—the most athletic (read: proficient) players, high-tech uniforms, a full roster, coolers full of Gatorade, air support, and scores of expert coaches to lead them. The opponents arrived with fewer players, some outdated but simple equipment, and home field advantage. And to be frank, they won, because both teams had prepared to play “football” and the U.S. team had not expected to find itself in a soccer match. The U.S. military, in the process of measuring its proficiency and determining itself “ready” for conflict, had diminished its ability to adapt once it understood the reality of the game. That those failures were both military and political in nature proves that sometimes the front office drafts a player who just is not a good fit for the game, regardless of the numbers on draft day.

There have been positive recent developments towards getting more practice film, including a significant push for more wargaming across the naval services, and numerous large scale force-on-force exercises at Marine Corps Air-Ground Combat Center, Twentynine Palms. These efforts are laudable and should be reinforced, but they are not solutions in and of themselves. Wargames and exercises are laboratories, where experiments can be run with unproven pieces of gear and newly-developed tactics. What they must become is an actual testing ground, wherein a force proves its readiness in a subjective manner. “Ready for what?” is the obvious rebuttal, and one of the questions posed in the aforementioned generals’ op-ed. The answer should be “problem solving.”

This is the precise point where the concept of effectiveness must be most divorced from the concept of proficiency. The problems faced by deployed military forces around the globe today do not have a proven solution. There is no handbook or checklist that describes how a decision maker should counter malign narratives and compete with undeclared special operations forces. A modern military unit, more than at any other time in history, must be able to find solutions to unexpected problems. Proficiency is an element of that—a robust patrolling effort will be doomed from the start if none of the squad leaders can navigate on a map—but planners and leaders who can recognize the effectiveness of a potential action are the truest measure of readiness. A simple example might be a staff that can assess whether patrolling through the town is useful or whether it will simply feed a hostile propaganda campaign. Measuring a unit’s ability to solve unknown problems is how the force should move forward in marrying proficiency with effectiveness.

This is, of course, far easier said than done. The current model of performing a list of tasks to a certain standard prior to deployment is nearly antithetical to evaluating effectiveness (though not proficiency, which is not useless, just incomplete on its own). The DoD has honed the current readiness model through decades of conflict and painful lessons in Iraq and Afghanistan; slaughtering that sacred cow is a tall order. Instead of outright replacement, priorities must be realigned to allow for testing and evaluation of decision-making and problem-solving skills in a tactical setting. That will entail hard decisions about which areas of proficiency are irrelevant. Time is the ultimate constrained resource, so deliberate thought must be put into which proficiency metrics are absolutely essential. But if the metrics that determine readiness change, and leaders communicate that change appropriately, then the demand signal for objective proficiency metrics will fall on its own. This challenge is one of risk acceptance and messaging at all levels of command, but it is the cultural shift called for by Generals Berger and Brown.

Finally, there is the difficult task of testing problem-solving skills. One answer to the question of how to evaluate problem solving is deceptively simple: wargaming, in certain forms, is perhaps the best means of measuring the ability to solve problems. For relatively low costs in manpower and materials, wargames allow commanders and staffs to test their ability to move through the Observe-Orient-Decide-Act loop. The game might be intricately detailed to stress a staff’s ability to find and retain critical information, or it might be as simple as a verbal brief. The point is that the staff must demonstrate the ability to analyze and synthesize new information from incomplete or uncertain information environments. They must study, understand, and enhance their ability to think through new problems, rather than solve old ones more and more proficiently. It is the equivalent of quizzing a quarterback on what he would do if the defense aligned in an unexpected formation. His ability to throw accurately is rendered meaningless if he cannot understand the environment and make the right decision about whom to throw to.

If anything defines the competition continuum, it is uncertainty. This applies across tactical, operational, and strategic echelons, which can be addressed by corresponding changes to the war game structure. Tactical units can maneuver forces on a map, with a dedicated red cell providing a thinking, adaptive enemy. Strategic war games could range from discussing diagrams on white boards all the way to advanced simulation software emulating the actions and opinions of entire national populations. 

The question of “how” can be addressed in a number of ways, depending on the specific nature of the unit or headquarters being tested. The point is that for a unit to be ready in any modern and meaningful sense, its problem-solving ability must be tested and evaluated. This will be subjective, to a certain extent, and will require a corresponding cultural shift in risk acceptance and understanding by commanders. Moreover, the specific training and evaluation events still need to be created, though many elements within the DoD have already set a firm foundation for such a campaign. But until effectiveness is valued as much as proficiency, the U.S. military is at risk of finding itself woefully unprepared when the game kicks off.

Captain Jesse Schmitt is the Assistant Intelligence Officer for the 31st Marine Expeditionary Unit in Okinawa, Japan. He earned Bachelor’s Degrees in Political Science and Economics from the University of Florida, and is currently completing a Master’s Degree in International Relations. He firmly believes that if the U.S. ever wins both the Men’s and Women’s World Cup, the rest of the world should have to call the game “soccer” until the next tournament.

Featured image: GULF OF ALASKA (May 7, 2021) – F/A-18F Super Hornets, assigned to the “Black Knights” of Strike Fighter Squadron (VFA) 154, and the “Tomcatters” of Strike Fighter Squadron (VFA) 31 are secured to the flight deck of the aircraft carrier USS Theodore Roosevelt (CVN 71) May 7, 2021, in support of flight operations above the Joint Pacific Alaska Range Complex and Gulf of Alaska during Exercise Northern Edge 2021 (NE21). (U.S. Navy photo by Mass Communication Specialist 3rd Class Brandon Richardson)

Moving Toward A Holistic, Rigorous, Analytical Readiness Framework

Redefining Readiness Topic Week 

By Connor S. McLemore, Shaun Doheney, Philip Fahringer, and Dr. Sam Savage

On April 24th, 1980, eight American helicopters heavily laden with special forces and their equipment launched from the aircraft carrier USS Nimitz operating in the Arabian Sea. They flew northeast into the Iranian desert to rendezvous with refueling aircraft in order to attempt a rescue of 52 hostages taken in 1979 from the American Embassy in Teheran. The operation, Eagle Claw, ended in disaster: a dust cloud kicked up by aircraft propellers and helicopter rotor blades caused one of the helicopters to collide with a refueling aircraft and explode, killing eight U.S. personnel and wounding several others. Yet the mission had already been aborted prior to the collision. During the flight to the refueling site, three helicopters suffered equipment failures, leaving just five able to continue the mission. Mission go/no-go criteria required at least six helicopters to continue, and the order from the president to abort the mission was passed. The tragic collision occurred when aircraft were attempting to transfer fuel in order to depart Iran after the mission was already cancelled.

Helicopter capability to support Eagle Claw was quantifiable based on historical helicopter failure data, and yet prior to the mission, it was not quantified. The Holloway Report, which detailed the results of the investigation into the mission’s failure, laid bare how the number of helicopters sent was a major contributing factor to the early mission abort, and recommended that more helicopters should have been sent. Using basic probability theory and known helicopter failure rates, there was an estimated 32 percent probability that fewer than the six required helicopters would be ready at the refueling site. Former President Jimmy Carter said in 2015, when asked if he wished he had done anything differently as president, “I wish I’d sent one more helicopter to get the hostages, and we would have rescued them, and I would have been re-elected.” Yet over 40 years later, the same underlying military readiness shortfalls that prioritized availability over capability for those helicopters remain largely unfixed.
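For readers who want to see the arithmetic, the estimate follows from a simple binomial calculation. The sketch below assumes, as the illustrative example later in this article does, that each helicopter is independently mission-capable about 75 percent of the time; the figures are notional rather than the Holloway Report’s actual failure data.

    from math import comb

    p_up = 0.75        # assumed probability a single helicopter is mission-capable
    n_sent = 8         # helicopters launched from USS Nimitz
    n_required = 6     # go/no-go threshold at the refueling site

    # Probability that fewer than six of the eight arrive mission-capable,
    # assuming independent failures (a simplifying assumption).
    p_abort = sum(comb(n_sent, k) * p_up**k * (1 - p_up)**(n_sent - k)
                  for k in range(n_required))
    print(f"Chance of falling below {n_required} helicopters: {p_abort:.0%}")
    # Prints roughly 32%, matching the estimate cited above.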

April 24, 1980 – Three RH-53 Sea Stallion helicopters are lined up on the flight deck of the nuclear-powered aircraft carrier USS NIMITZ (CVN-68) in preparation for Operation Eagle Claw. (Photo via Wikimedia Commons)

General Charles Brown, the chief of staff of the Air Force, and General David Berger, the commandant of the U.S. Marine Corps, have called for urgent changes to the military’s existing readiness framework. In their recent War on the Rocks article, they describe the need for a better analytical framework to help the joint force balance the operational costs of existing forces against the investment costs of modernizing and replacing those forces. The two service chiefs point out, “Our current readiness model strongly biases spending on legacy capabilities for yesterday’s missions, at the expense of building readiness in the arena of great-power competition and investing in modern capabilities for the missions of both today and tomorrow.” In order to address that problem, they call for “a framework for readiness” and “a more precise understanding of risk — to what, for how long, and probability.” Our team at Probability Management, a 501(c)(3) nonprofit dedicated to improving communication of uncertainty and risk, wholeheartedly agrees.

Achieving the service chiefs’ vision of a better analytical framework will require changes to both the qualitative and quantitative underpinnings of the existing readiness system. To improve the quantitative parts, we recommend the implementation of a supporting data framework that is capable of informing probability-based capabilities assessments by making the Defense Department’s readiness data more flexible, visible, sharable, and usable.

Diagnosing the Problem

The readiness system contributes to unquantified capabilities of combinations of military assets, zero-risk mindsets in combatant commanders, and requirements that are excessive. These problems must be addressed in any future readiness system. It is unreasonable to expect service chiefs to push back on requirements from combatant commanders if discussions around the capabilities of combinations of military assets are purely subjective. We make no claim as to what acceptable capability thresholds should be; however, we must point out that even if the service chiefs and combatant commanders were in complete agreement on a threshold, without a way to quantify the probability of achieving it, requirements will probably remain excessive. Requirements supported with too many or not enough forces lead to imbalanced risks and costs. Why pay to achieve a 99.9 percent chance of success when a 90 percent chance of success is adequate? Conversely, why risk a 10 percent chance of failure when a 99.9 percent chance of success is required? The existing readiness system does not support thinking in these terms.

Fixing the limitations of the existing readiness system is not purely a data challenge. However, too many problems with the existing military readiness system are a direct result of the ways in which Defense Department data is being collected, stored, and communicated. We do not advocate for the military readiness system to remove subjectivity from all readiness calculations – the readiness system should always endeavor to support the service chiefs in assessing what is meaningful, not just against what is measurable. However, subjectivity supported by systematic estimates of probability is likely to outrun subjectivity alone. The existing readiness system is simply not capable of providing military leaders with timely, fully-informed systemic probabilistic estimates of mission capabilities.

A principal problem with the existing system is that metrics associated with readiness requirements are routinely measured as “ready or not.” Once a unit meets a defined level of performance, the unit is declared “ready.” However, even when units are “ready,” it is often vague as to what they are ready for and when. Additionally, the percentages used in current readiness metrics cannot easily be aggregated by mathematically defensible means. For example, a notional requirement could be that at least 80 percent of unit systems must be ready. If a unit with 90 percent of its systems ready reports as “ready” and pairs with a dissimilar unit that has 70 percent of its systems ready and reports “unready,” the combined capability of the two units to support a given mission at an uncertain future time is unclear. Fundamentally, the existing military readiness system cannot be used to quantitatively predict probabilities of mission success at uncertain future times for portfolios of dissimilar assets.
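A notional simulation shows why such point percentages resist simple aggregation. In the sketch below, every number is invented for illustration: two versions of the second unit both average 70 percent of systems ready, but one suffers independent failures while the other suffers correlated failures, such as a shared parts shortage. The probability that the combined force can field a required number of systems differs markedly between the two cases even though the reported percentages are identical.

    import numpy as np

    rng = np.random.default_rng(0)
    futures = 1000      # simulated possible futures
    need = 14           # notional requirement: at least 14 of the 20 combined systems up

    # Unit A: 10 systems, each independently up 90 percent of the time.
    unit_a = rng.random((futures, 10)) < 0.90

    # Unit B, case 1: 10 systems, each independently up 70 percent of the time.
    b_independent = rng.random((futures, 10)) < 0.70

    # Unit B, case 2: the same 70 percent average, but failures cluster, e.g., a
    # shared parts shortage degrades the whole unit at once (0.3*0.30 + 0.7*0.87 ~ 0.70).
    shortage = rng.random((futures, 1)) < 0.30
    b_correlated = np.where(shortage,
                            rng.random((futures, 10)) < 0.30,
                            rng.random((futures, 10)) < 0.87)

    for label, unit_b in (("independent failures", b_independent),
                          ("correlated failures", b_correlated)):
        combined_up = unit_a.sum(axis=1) + unit_b.sum(axis=1)
        print(f"{label}: P(at least {need} systems up) ~ {(combined_up >= need).mean():.2f}")

Single “ready or not” percentages discard exactly the distributional information that drives the difference.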

When joining readiness levels of dissimilar units, the lack of a mathematically defensible readiness framework results in important issues being distorted or lost, leaving little coherence for understanding the capabilities of the joint force at the operational or even strategic levels. Because joint force capabilities required by combatant commanders are not being credibly quantified, and because the service chiefs, who are tasked with providing ready and capable military forces to combatant commanders, pay to support those requirements, the combatant commanders have little incentive to ask for only as many forces as they need. Additionally, combatant commanders are responsible for requesting capabilities that cover their missions now, not several years in the future. Why would they risk failure by not requesting enough forces, especially considering they aren’t paying for the forces they receive? Combatant commanders are simply doing their jobs when they prioritize getting more forces in their theater now over future capabilities. Yet this is clearly a problem because marginal improvements to fulfill near-term requirements may be coming at enormous cost to important future capabilities.

DoD is not hindered by a lack of data. Rather, it is hindered by an inability to see and make use of the data it collects in its many vast, opaque, stovepiped databases. Today, most military readiness data arrives in the form of historical records and subject matter expert opinions. An ideal future military readiness data framework should be able to better use that existing data while also enabling the continuous and increasingly automated collection and sharing of additional data sources that are authoritative, timely, clean, and contain useful information. Yet it is simplistic to think it will be possible to efficiently apply advanced data science techniques such as simulation, regression, machine learning, and artificial intelligence to data that is not easily visible, sharable, and usable. The lack of an appropriate data framework prevents a readiness framework from being informed by timely, visible, usable data. Fundamentally, there is no format today that allows for the efficient sharing of datasets in DoD, and much time is being wasted figuring out what data is available, whether it contains useful information, and then transforming it manually so that it can be used for analysis.

A New Framework

A real gap lies in the ability of DoD to quantitatively measure what makes military units combat effective (vice combat available) and the associated costs of those capabilities. We propose the military adopt a uniform data framework that could be used within and across military services and systems to better quantify readiness predictions both now and in the future, such as providing timely estimates of the probability that helicopters will not be capable of performing a given mission, and to better communicate risks, such as the probability of an important mission failing because not enough helicopters are being sent. Such a data framework would allow for better sharing and employment of existing data, resulting in better quantitative metrics and better cost and risk tradeoff discussions among decision-makers. It would also allow decision-makers to look forward in time, with capability outcomes generated continuously from large military datasets. This would allow data sources and models to evolve over time, resulting in steadily improving probability-based capability predictions that would go a long way toward supporting the outcomes Generals Brown and Berger propose. A new approach can allow planners, commanders, and decision-makers to speak the same language to communicate, “How ready are units for what?”

Probability Management has long advocated for improvements to the quantitative underpinnings of the military readiness framework, and detailed technical explanations and example use cases for the data framework we recommend can be found in our published technical articles. In our work, we describe the underlying problems of the existing framework, and describe necessary steps if the joint force is to adopt a “holistic, rigorous, analytical framework to assess readiness properly,” as the service chiefs rightly demand.

In straightforward terms, the data framework is best explained as the standardized representation of the readiness of military assets in the form of columns of data with statistical dependence between columns preserved. Unlike data in the existing framework, these columns of data can then be straightforwardly rolled up to probabilistically estimate the capabilities of groups of dissimilar assets operating in uncertain environments. This data framework can be used to improve military readiness reporting systems broadly by conveying the probability of achieving specified levels of availability and capability for specified missions now and at uncertain future times. We believe the limits of the existing system do not result from the limitations of math, but rather from the limits of the data structures employed in readiness calculations. In contrast, our framework supports simple and straightforward arithmetic, while transparently carrying along probabilistic information that may be extracted when required.

The basic approach, democratized and standardized by Probability Management, does not forecast the future with a single number, for example, “on average a helicopter is operable 75 percent of the time,” but instead models many possible futures for each helicopter using cross-platform data standards. In terms of Operation Eagle Claw, consider eight columns of data with 1,000 rows each, with each column of data representing one helicopter. Every row represents a different possible future, with a one if the aircraft is up in that future and a zero if it is down. Each column will have on average 750 ones and 250 zeros. The total number of operable aircraft out of the eight is represented in a ninth column summing the original eight columns row by row. When the original Eagle Claw assumptions are entered into this framework, about 320 of the thousand elements in the ninth column (32 percent) would show fewer than the six helicopters required for the mission, indicating a 32 percent chance of failure. These sorts of row-by-row calculations, known as vector operations, are trivial in virtually any software platform today, so the open standard framework could be used to enable chance-informed decisions across any other existing or future readiness software platforms.
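A minimal sketch of that roll-up is below, with randomly generated columns standing in for authoritative SIP data; the 75 percent availability figure is the notional one used above.

    import numpy as np

    rng = np.random.default_rng(1980)
    futures, helicopters = 1000, 8
    p_up = 0.75               # notional per-helicopter availability

    # Eight columns (one per helicopter), 1,000 rows (one per possible future):
    # a 1 means that aircraft is up in that future, a 0 means it is down.
    sips = (rng.random((futures, helicopters)) < p_up).astype(int)

    # The "ninth column": a row-by-row sum giving operable aircraft in each future.
    operable = sips.sum(axis=1)

    # Fraction of futures with fewer than the six helicopters the mission required.
    print("Estimated chance of an abort:", (operable < 6).mean())   # roughly 0.32

In practice the columns would come from shared, auditable SIP libraries rather than a random-number generator, but the row-by-row arithmetic is exactly the same.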

When the readiness of an asset is represented as a column of thousands of “ready or nots,” then units can be combined in a row-by-row sum to provide a column of assets available in each of 1,000 futures. This approach, long used in stove-piped simulations, becomes a framework by simply storing simulated results in a database and making them sharable.

For example, suppose the DoD is choosing between two aircraft systems, A and B. They are both operational 75 percent of the time, but A is 5 percent cheaper than B. We might pick A to save money. But suppose that system A goes down at random 25 percent of the time, while system B is guaranteed to be operational for 7.5 flight hours and then requires 2.5 hours of maintenance. Traditional readiness metrics based on averages can’t detect the difference between A and B, except on cost. But for missions requiring less than 7.5 flight hours, system B is vastly superior, because you can arrange to have 100 percent of the fleet in action. The added predictability may be well worth the 5 percent cost premium. For missions over 7.5 hours, system B is worthless, as no aircraft will be ready. With system A, however, a few planes will survive the long mission. So again, we should be asking “how ready for what,” where “how ready” may be interpreted as “what are the chances?” Chance-informed capability decisions would allow the Defense Department to quantify cost today versus the chance of adverse events tomorrow.
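A rough sketch of that comparison is below. The per-hour failure rate for system A, the fleet size, and the two mission lengths are invented purely for illustration; the point is only that identical availability averages can hide very different mission outcomes.

    import numpy as np

    rng = np.random.default_rng(7)
    futures, fleet = 1000, 12      # notional fleet size and simulated futures

    def fraction_completing(mission_hours, system):
        """Rough sketch: average fraction of sorties that finish the mission."""
        if system == "B":
            # B is deterministic: 7.5 flight hours up, then 2.5 hours of maintenance.
            # Launches can be timed to the cycle, so every aircraft completes a
            # mission that fits inside the window and none completes a longer one.
            return 1.0 if mission_hours <= 7.5 else 0.0
        # A goes down at random. Assume, purely for illustration, a 3 percent
        # chance of failure per flight hour for each aircraft.
        p_complete = (1 - 0.03) ** mission_hours
        return (rng.random((futures, fleet)) < p_complete).mean()

    for hours in (6, 9):
        print(f"{hours}-hour mission: system A ~{fraction_completing(hours, 'A'):.0%} "
              f"of sorties complete, system B ~{fraction_completing(hours, 'B'):.0%}")

For the six-hour mission, system B’s predictability lets the entire fleet participate; for the nine-hour mission, system B contributes nothing while system A still completes most sorties.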

Conclusion

The adoption of this new data framework for military readiness would go a long way toward achieving the quantitative underpinnings necessary to support the service chiefs’ vision and it can be used to fix the fundamental problem they call out: the “gold-plating” of existing force requirements at the expense of future capability. Additionally, the framework we propose is merely a data standard, not requiring any particular software implementation, is not proprietary, is available at no cost to the government, and does not require the wholesale elimination of the existing military readiness system; it can expand upon the existing system and be implemented incrementally. It is not designed to eliminate subjectivity from a commander’s readiness calculations, nor should it. A more structured readiness approach that explicitly acknowledges uncertainty is complementary to subjective estimates. Commanders will still need to make decisions subjectively based on myriad factors, including their own risk tolerance.

Combining the predicted risks of a portfolio of dissimilar assets occurs commonly in the commercial sector. Our approach has long been widely used in applications in financial engineering, insurance, and many other industries. It has been applied to portfolios of oil exploration projects at a global energy firm and portfolios of risk mitigations at a large utility. We are confident that our approach is both straightforward to understand and simple to use without specialized software or mathematical training. For example, using the same data framework we propose the military adopts for its readiness system, Probability Management taught modern portfolio theory, a complicated subject that involves evaluating the predicted returns of financial portfolios, to West Oakland Middle School eighth graders in 2017. The students quickly understood the framework and employed it effectively, and we are confident that military personnel will be able to easily employ it in the area of military readiness.

Our proposed data framework is already adopted by commercial organizations in sectors as diverse as healthcare, energy, and defense to quantitatively support decisions and mitigate risks. Lockheed Martin, Pacific Gas & Electric, and Kaiser Permanente are incorporating the framework to better assess the likelihood of critical outcomes in terms of probabilities of success and failure based on historical performance and future predictions.

The data framework we propose is generic and is easily tailored to new use cases and industries, including an improved military readiness framework. The expectation is that if our approach were applied in a military readiness context, it would support a better analytical framework for the joint force and allow better assessments. This would help better balance the operational costs of existing forces with the investment costs to modernize and replace those forces. Through this framework, military leaders would have a more practical understanding of the tradeoffs within military readiness and could better manage the challenges of today and tomorrow.

Mr. Connor McLemore is a principal operations research analyst for CANA Advisors and the Chair of National Security Applications at ProbabilityManagement.org. He has over 12 years of experience in scoping, performing, and implementing analytic solutions. He holds Master’s degrees from the Naval Postgraduate School in Monterey, California, and the Naval War College in Newport, Rhode Island, and is a former naval officer and graduate of the United States Navy Fighter Weapons School (TOPGUN) with numerous operational deployments during 20 years of service.

Shaun Doheney is a Senior Data and Analytics Strategy Consultant for a large global company with experience as a Chief Analytics Officer for an Inc. 5000 company. He is a retired Marine Corps Lieutenant Colonel who has conducted, participated in, or led a whole host of analyses and evaluations across major Department of Defense decision support processes. He is the Chair for the Military Operations Research Society’s Readiness Working Group and the Chair of Resources and Readiness Applications at ProbabilityManagement.org. 

Mr. Philip Fahringer is a Fellow and Strategic Modeling Engineer for Lockheed Martin Aeronautics with 35 years of combined military and defense applied research in analytics and decision support. He holds a Master’s degree in Operations Analysis from the Naval Postgraduate School in Monterey, California, and a Master’s degree in Strategic Studies from the Army War College in Carlisle, Pennsylvania, and he is a former naval officer with numerous operational deployments and strategic planning assignments during 20 years of service.

Dr. Sam L. Savage is Executive Director of ProbabilityManagement.org, a 501(c)(3) nonprofit devoted to the communication and calculation of uncertainty. The organization has received funding from Chevron, Lockheed Martin, General Electric, PG&E, Wells Fargo and others, and Harry Markowitz, Nobel Laureate in Economics, was a founding board member. Dr. Savage is the author of The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty (John Wiley & Sons, 2009, 2012), an Adjunct Professor in Civil and Environmental Engineering at Stanford University, and a Fellow of Cambridge University’s Judge Business School. He is the inventor of the Stochastic Information Packet (SIP), a standardized, auditable data array for conveying uncertainty. Dr. Savage received his Ph.D. in computational complexity from Yale University.

Featured Image: NORTH PACIFIC OCEAN (May 3, 2021) – U.S. Marine Corps MV-22 Ospreys, assigned to Marine Medium Tiltrotor Squadron 164 (Reinforced), 15th Marine Expeditionary Unit, prepare to take off from the amphibious assault ship USS Makin Island (LHD 8) in support of Northern Edge 2021. (U.S. Navy photo by Mass Communication Specialist 2nd Class Jeremy Laramore)

The U.S. Military In Competition: Supporting Effort One

By Ryan Ratcliffe

“Ultimate excellence lies…in defeating the enemy without ever fighting.” –Sun Tzu

“America is back,” proclaimed President Joe Biden in his first address to the Department of State. “Diplomacy is back at the center of our foreign policy,” he continued, declaring that the United States would repair its alliances and reengage the world to confront global challenges. Speaking to the department that had recently published its comprehensive assessment of relations with China and called for a revival of U.S. foreign policy, the President’s words were likely well received. 

Designed to serve as a modern-day “long telegram,” The Elements of the China Challenge reveals the Chinese Communist Party’s aim to revise world order, highlighting many of the Party’s malign and coercive activities designed to achieve its subversive ambitions. The authors of this seminal work, the Secretary of State’s Policy Planning Staff, conclude by calling for the United States to refashion its foreign policy around the principles of freedom and to reserve the use of military force for when all other measures have failed. 

Diplomacy will certainly be fundamental in addressing the China challenge, but returning it to the forefront of U.S. foreign policy will not occur overnight. Decades of systemic decline have reduced U.S. diplomatic capacity and left the military as the more prolific instrument of national power. Therefore, until the Department of State completes the transformation it requires and returns to primacy, the Department of Defense will need to shoulder some of the burden in answering the China challenge. One element of that burden is countering actions in the gray zone.

The gray zone is the “contested arena somewhere between routine statecraft and open warfare,” and it includes actions taken to achieve relative geopolitical gains without triggering escalation. So, despite the U.S. military’s preeminence as a joint fighting force (or, more likely, as a testament to it), state actors like the Chinese Communist Party are pursuing and achieving success in the gray zone. These revisionist actors purposefully avoid armed conflict in an effort to evade the lethal arm of U.S. foreign policy, challenging traditional assumptions of strategic deterrence. Left unchecked, these gray zone victories will gradually amass into a direct threat to U.S. national security; this makes it well within the military’s purview to orient on the gray zone. 

To help counter the gray zone strategy the Chinese Communist Party regularly uses, the U.S. military should consider non-traditional approaches to competition. Such creative military support would buy U.S. diplomacy time to rebuild its capacity for securing freedom, while still avoiding armed conflict. This article offers a framework for achieving this non-traditional support from the U.S. military: First, the United States should leverage its military manpower to legally support non-military competition activities. Second, the U.S. military should establish a forward-deployed headquarters to command and control the military’s involvement in competition. Finally, the senior military officer in this competition-focused headquarters should be charged with synchronizing operations in the information environment.

Providing Military Support to Competition

Although there are many definitions, competition is generally viewed as fundamental to international relations. To effectively compete, though, one must develop a thorough understanding of the competition. Therefore, it is important to recognize that the pinnacle of Sun Tzu’s “win without fighting” maxim is not determining the fate of a battle before it is fought; the true measure of excellence is to never even do battle because one’s aims were wholly achieved without conflict.

Recognizing that its competitors seek to achieve their revisionist aims below the threshold of armed conflict, the U.S. military must adapt its deterrence strategies to counter illegal gray zone operations. Deterrence by denial, defined as elevating the probability of operational failure, likely has a greater chance of succeeding in the gray zone than deterrence by punishment, such as taking retaliatory action. To effect deterrence by denial in the gray zone, the U.S. military should enable its interagency and international partners to counter the illicit activity that often underpins gray zone strategies.

Countering malign behavior has many faces: U.S. allies upholding their own sovereignty, partner nations protecting their economic livelihood, and the U.S. government imposing economic sanctions are all examples of ways to achieve competition’s strategic ends. In spite of their critical role in opposing illicit gray zone activities, though, the agencies that confront malign behavior often lack the manpower to reach the point of taking action. Manning shortfalls, exacerbated by the expansive size of the Indo-Pacific, leave the U.S. military’s interagency and international partners unable to muster adequate back-end support for their efforts, such as maintaining robust intelligence pictures or gathering actionable evidence of malign activity.

The U.S. Department of Defense, on the other hand, is more adequately manned for the task of competing in the Indo-Pacific. For example, U.S. Indo-Pacific Command alone has approximately 375,000 military and civilian personnel assigned to its area of responsibility. In contrast, the U.S. Department of State employs only 70,000 individuals throughout the world, and the New Zealand Defence Force is just over 15,000 strong. So, while the U.S. military’s interagency and international partners lack the manpower to identify and track gray zone activity, nearly two decades of asymmetric conflict has left the U.S. military manned, trained, and equipped to provide precisely the type of back-end intelligence and logistical support its partners require. Any manpower that the U.S. military contributes to these partners is likely to have outsized impacts on their ability to compete and counter gray zone actions.

It should be clarified that supporting competition cannot detract from the U.S. military’s lethality, a unique capability it alone provides. Therefore, like any effective strategy, efforts to counter gray zone operations should be focused on the most important and attainable objectives. Careful consideration should be given to preserving the “means” of the U.S. military to protect national security: its combat power. However, supporting competition and retaining combat power are not mutually exclusive. For example, many of the skills required to perform maritime governance, such as fusing multi-source intelligence data or finding and fixing maritime vessels, directly translate to combat operations. Activities supporting competition can and should be used to build operator proficiency, while simultaneously contributing to deterrence by demonstrating credible military power in the region.

A Competition-Focused Command

To synchronize the U.S. military’s support to competition and maintain awareness of the collective risk acceptance, U.S. Indo-Pacific Command should consider establishing a competition-focused headquarters west of the International Date Line. This headquarters would provide the unity of command required to develop the foreign, joint, and diplomatic partnerships needed to compete with a state actor seeking to undermine regional stability. 

Some will view a headquarters west of the International Date Line as incurring too much risk of escalation, but the opposite is more likely true. The U.S. military will be unable to deter illicit gray zone activities without accepting some risk, as a willingness to accept risk demonstrates commitment to one’s adversary. Accepting greater risk does not mean the United States should condone or take wanton action, though, as disparate or conflicting actions could elevate the risk of unintentional or inadvertent escalation. Instead, risk acceptance should be carefully considered for the confluence of the U.S. military’s efforts to support competition. Establishing a forward headquarters, then, directly addresses the issue of risk: it would demonstrate commitment to the United States’ competitors while maintaining a complete picture of the military’s support to competition, thereby decreasing the risk of escalation.

The ideal structure for this organization would be a Combined Joint Interagency Task Force (CJIATF) led by a senior civilian, with a 3-star general or flag officer serving as deputy, and possessing a cabinet of experienced foreign policy advisors and liaison officers. To ensure a “forward mindset”, the CJIATF should be positioned in Guam, Australia, or Palau, and staffed by a mix of deployed and permanent personnel. Such a construct will provide responsive command and control for regional competition, increasing the U.S. military’s ability to support competition while freeing other units to maintain their focus on conventional mission sets and traditional strategic deterrence.

There are many factors that make a CJIATF the ideal choice to serve as the nexus of military support to competition. First, joint task forces provide a “single mission focus” with the ability to “integrate forces…in a dynamic and challenging political environment.” They also foster “de facto interdependence” through an environment of mutual reliance on their members’ contributions. The “combined” and “interagency” modifiers emphasize the need to include a wide variety of partners within this joint task force—partners that should range from U.S. government agencies to a host of international liaison officers.

To effectively navigate the complex requirements of providing military support to external entities, the CJIATF should employ a blend of traditional and novel concepts. For example, the “by, with, and through” approach translates well to competition. Although more commonly associated with armed conflict, this approach offers an outline for orchestrating interagency and international cooperation: The CJIATF strives for actions performed by its partners, with the support of U.S. forces, and through a legal and diplomatic framework. Such an approach would allow the U.S. military to support a wide range of partner actions, while still abiding by international law and applicable U.S. Code. 

Competition demands exactly the kind of interagency and international cooperation that a joint task force promotes. As the stakes of competition continue to rise with tensions following suit, detailed integration between the military, government agencies, and partner nations remains the best hope for diminishing the risk of conflict. Avoiding conflict while preserving the rules-based international order should be considered success in today’s competition, and a CJIATF is uniquely capable of supporting this objective.

Information-Related Synchronization

In addition to enabling partner actions, the CJIATF will need to grapple with the fact that all competition efforts are inexorably tied to the information environment. Instant communication and weaponized social media offer ample opportunity for competitive exploitation, and they present considerable risk of inadvertent escalation. Additionally, misinformation and deep fakes now offer low-cost means to wreak societal havoc with ambiguous attribution. 

Despite some documented success, the United States has struggled to develop a unified understanding of information operations. Difficulty preventing exploitation, such as Russian interference in U.S. elections or Chinese industrial espionage, creates a perception of information vulnerability—a setback the United States must rectify. To fix this, consideration of information operations needs to move from garrison staff, academia, and think tanks to the operational realm. The role of every warfighter has changed: You are no longer simply “doing land operations, you’re doing an information operation in the land domain.” 

The connection between competition and information makes the CJIATF’s responsibilities inherently information-centric. Therefore, in addition to enabling its partners to compete, the CJIATF must also synchronize information operations in support of competition. This focus on gaining advantage in the battle for information dominance makes the CJIATF’s senior military officer a logical choice to serve as the first Joint Force Information Component Commander (JFICC).

The JFICC is a novel approach to the battle-tested functional component commander construct. Commanders designate a subordinate functional component commander to establish unity of command and unity of effort for operations in a specific domain (i.e., there are normally separate component commanders for operations on land, at sea, in the air, etc.). This type of domain-specific synchronization is precisely what the United States needs to overcome its previous setbacks in the information environment. Additionally, operations can have significant effects in the information environment regardless of the domain they are executed in, and the inverse is also true: Actions in the information environment can have real-world impacts in physical domains.

To manage this cross-domain interaction, the JFICC needs to synchronize the projected effects of all efforts in the region. This robust requirement, combined with the move towards all-domain operations, means the JFICC must be able to deploy their information-related capabilities as a demonstrable force, in line with the forces operating in the other domains. They will then need the authority and capacity to mass this force in support of decisive action. Designating the CJIATF commander as the JFICC directly addresses this requirement by placing “the responsibility and authority for execution of the decisive [information] tasks under a commander vice under a staff officer.”

Although it is non-doctrinal for a joint force commander to serve as a functional component commander, there is some precedent for this: A Special Operations Joint Task Force commander may also serve as the Joint Force Special Operations Component Commander. Comparing special operations forces with information-related capabilities turns out to be rather effective, as neither “own” physical battlespace, neither historically have a dedicated service component, and both often operate below the level of armed conflict.

Any future JFICC will likely be faced with an extraordinarily complex all-domain problem, and will be able to directly impact national strategic objectives. Experience is the best teacher, and the U.S. military should capitalize on the opportunity to advance its ability to operate in the information environment by employing the first JFICC below the threshold of armed conflict. Doing so will provide valuable insight into the dyad of operating in the physical domain and information environment simultaneously—a fundamental challenge of all-domain operations. 

Competition is Not a Nine-to-Five Job

To realize its strategic aims, the United States must make a concerted effort to align its means and ways to achieve its ends, building upon its strengths while improving its deficiencies. Supporting interagency and international partners, with command and control residing in a new headquarters led by a novel functional component commander, presents formidable challenges, but it is precisely the economy-of-force option the United States needs to better compete now. This structure aligns with strategic guidance from the highest levels of the U.S. Government, and it could eventually serve as an example for other theaters to emulate, such as U.S. Africa Command.

Finally, a subtle but crucial aspect of implementing this construct is that it acknowledges the gray zone operating environment for what it is: a contest of wills without violence. Left unchecked, these gray zone strategies could lead to a Sun Tzu-esque defeat of the rules-based international order before Clausewitzian force is ever used. With this recognition, the U.S. military must shift its approach to competition from a “nine-to-five” business model and begin applying effort commensurate with traditional military operations.

Utilizing the U.S. military to support competition will likely be difficult and often uncomfortable. Additionally, establishing another forward-deployed headquarters will require both personal sacrifice and significant monetary commitment, and appointing a new functional component commander will challenge traditional military thought. There is, however, something much worse than these changes: failure to levy the existing force against today’s challenges with the hope that tomorrow’s force will be more capable of achieving success. Hope is not a viable course of action when national security is at stake.

As put forth in The Elements of the China Challenge, the United States needs to mobilize the global forces who subscribe to the existing world order and are willing to defend it, keeping one aim in mind: securing freedom. The forward mindset and combined, interagency nature of this supporting approach will do exactly that: ensure the United States’ efforts align with and support the strategic interests of its partners in the Indo-Pacific. This paradigm shift could be the rallying cry that unites a constellation of allies and partners to form the whole-of-society approach this contest will require. The time for action is now; the world order we know depends on it.

 Major Ryan “Bevo” Ratcliffe is an EA-6B Electronic Countermeasures Officer and a Forward Air Controller in the United States Marine Corps, currently assigned to I MEF Information Group. He will begin pursuing a Master of International Public Policy from The Johns Hopkins University School of Advanced International Studies in August 2021.  

The views expressed are those of the author and do not reflect the official position of the United States Marine Corps, Department of the Navy, or Department of Defense.

Featured image: A JH-7 fighter bomber attached to a naval aviation brigade under the PLA Northern Theater Command taxies on the runway with drogue parachute during a flight training exercise in early January, 2021. (eng.chinamil.com.cn/Photo by Duan Yanbing)

Software-Defined Tactics and Great Power Competition

By LT Sean Lavelle, USN

There are two components to military competency: understanding and proficiency. To execute a task, like driving a ship, one must first understand the fundamentals and theory—the rules of navigation, how the weather impacts performance, how a ship’s various controls impact its movement. Understanding is stable and military personnel forget the fundamentals slowly. Learning those fundamentals, though, does not eliminate the need to practice. Failing to practice tasks like maneuvering the ship in congested waters or evaluating potential contacts of interest will quickly degrade operational proficiency.

In the coming decades, human understanding of warfighting concepts will still be paramount to battlefield success. Realistic initial training and high-end force-on-force exercises will be critical to building that understanding. However, warfighters cross-trained as software developers will make it far easier to retain proficiency without as much rote, expensive practice. Their parent units will train them to make basic applications, and they will use these skills to translate their hard-won combat understanding into a permanent proficiency available to anyone with the most recent software update.

These applications, called software-defined tactics, will alert tacticians to risk and opportunity on the battlefield, ensuring they can consistently hit the enemy’s weak points while minimizing their own vulnerabilities. They will speed force-wide learning by orders of magnitude, create uniformly high-performing units, and increase scalability of conventional forces.

Vignette

Imagine an F-35 section leader commanding two F-35 fighters tasked to patrol near enemy airspace and kill any enemy aircraft that approach. As the F-35s establish a combat air patrol in the assigned area, the jets’ sensors indicate there are two flights of adversary aircraft approaching the formation, one from off the nose to the north and the other off the right wing, from the east. Each of these flights consists of four bandits that are individually overmatched by the advanced F-35s. Safety is to the south.

These F-35s have enough missiles within the section to reliably kill four enemies, but they are facing eight. Since the northern group of bandits is slightly closer, the section leader decides to move north and kill them first. The section’s volley of missiles all achieve solid hits, and there are now four fewer enemy aircraft to threaten the larger campaign.

Now out of missiles, the section turns south to head back home. That is when the section leader realizes the mistake. As the F-35s flowed northward, they traveled farther from safety while the eastern group of bandits continued to close, cutting off the path home.

The only options at this point are to try to go around the bandits or straight through them. A path around them would run the fighters out of fuel, so the section leader goes straight at the four enemy aircraft, hoping that the bandits, having seen their friends shot down, will break off in fear.

The gambit fails, however, and the remaining enemy aircraft close with the F-35s and shoot them down. What should have been an easy victory ended in a costly trade, and in a war where the enemy can build simple aircraft faster than America can build complex F-35s, even a 2:1 exchange ratio favors them strategically.

This could have gone differently.

Persistent and Available Tactical Lessons 

Somebody in the F-35 fleet had likely made a mistake similar to this example during a training evolution long before the fateful dogfight. They might have even taken a few days out of their schedule to write a thoughtful lessons-learned paper about it. This writing is critically important. It communicates to other pilots the fundamental knowledge required to succeed in combat. However, success in combat demands not just understanding, but proficiency as well. An infantryman who has not fired a rifle in a few years likely still understands how to shoot, but their lack of practice means they will struggle at first.

Under a software-defined tactics regime, in addition to writing a paper, the pilot could have written software that would have alerted future pilots about the impending danger. While those pilots would still need to understand the risk, ever-watching software would alert them to risks in real-time so that a lack of recent practice would not be fatal. A quick software update to the F-35 fleet would have dramatically and permanently reduced the odds of anyone ever making that mistake again.

The program would not have had to be complex. It could have run securely, receiving data from the underlying mission system without transmitting data back to the aircraft’s mission computers. This one-way data pipe would have eliminated the potential for ad-hoc software to accidentally hamper the safety of the aircraft.
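As a rough illustration of that receive-only pattern, a tactics application might consume a broadcast feed from the mission system without ever opening a transmit path. The port number, message format, and field names in the sketch below are invented for illustration; they do not describe any actual aircraft interface.

    import json
    import socket

    RECEIVE_PORT = 49152  # hypothetical port the mission system broadcasts on

    def listen_for_tracks(handle_update):
        """Consume mission-system track updates forever. The socket is only
        ever read from; there is no code path that transmits anything back."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", RECEIVE_PORT))
        while True:
            payload, _sender = sock.recvfrom(65535)
            # Hypothetical message shape: {"tracks": [...], "missiles_remaining": 4}
            update = json.loads(payload)
            handle_update(update)

Because the application only ever reads from the socket, a fault in the tactics code has no route back into the mission computer, which is exactly the isolation the one-way pipe is meant to provide.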

The F-35’s mission computer in our example already had eight hostile tracks displayed, and it knew how many missiles were loaded in the weapons bay. If that data were pushed to a software-defined tactics application, the coder-pilot could have written a program that executed the following steps:

  1. Determine how many targets can be attacked, given the missiles onboard.
  2. If there are enough missiles to attack them all, recommend attacking them all. If there are more hostile tracks than the missiles (or a predefined missile-to-target ratio) can cover, run the following logic to determine which targets to prioritize.
  3. Determine all the possible ordered combinations of targets. There are 1,680 combinations in the original example—a small number for a computer.
  4. For each combination, simulate the engagement and determine whether an untargeted aircraft could cut off the escape toward home. Store the resulting margin of safety as a distance.
  5. If a cutoff is effective in a given iteration, reject that combination of targets and test the next one.
  6. Recommend the combination of targets to the flight commander with the widest clear path home. Alert the flight commander if there is no course of action with a clear path home.

This small program would have instantly told the pilot to engage the eastern targets, and that engaging the targets to the north would have allowed the eastern targets to cut off the F-35s’ route to safety. Following this recommendation would have allowed the F-35s to maintain a 4:0 kill ratio and live to fight another day.
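A rough sketch of those six steps might look like the following Python. The flat-plane geometry, the notional positions and speeds, and the assumption that the section ends up near its targets when the last missile hits are all simplifications invented for illustration; this is the shape of the logic, not software anyone has fielded.

    import itertools
    import math

    def centroid(points):
        xs, ys = zip(*points)
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    def escape_margin(start, home, threats, own_speed, threat_speed):
        """Smallest margin by which the section beats every threat to each
        sampled point on its straight-line route home (positive = clear path)."""
        margin = math.inf
        for i in range(1, 21):  # sample 20 waypoints along the escape route
            frac = i / 20
            waypoint = (start[0] + frac * (home[0] - start[0]),
                        start[1] + frac * (home[1] - start[1]))
            own_time = math.dist(start, waypoint) / own_speed
            for threat in threats:
                threat_time = math.dist(threat, waypoint) / threat_speed
                margin = min(margin, threat_time - own_time)
        return margin

    def recommend_targets(tracks, missiles, own_pos, home, own_speed, threat_speed):
        """Steps 1-6: pick the target combination that leaves the widest clear
        path home, or return None if no combination leaves one."""
        shots = min(missiles, len(tracks))
        if shots == len(tracks):
            return tuple(range(len(tracks))), math.inf  # enough missiles for everyone
        best_combo, best_margin = None, -math.inf
        # Ordered combinations, matching the 1,680 figure for 8 tracks and 4 missiles.
        for combo in itertools.permutations(range(len(tracks)), shots):
            targeted = [tracks[i] for i in combo]
            untargeted = [tracks[i] for i in range(len(tracks)) if i not in combo]
            # Crude engagement model: the section ends up near its targets
            # when the last missile hits, then turns for home.
            margin = escape_margin(centroid(targeted), home, untargeted,
                                   own_speed, threat_speed)
            if margin > best_margin:
                best_combo, best_margin = combo, margin
        if best_margin <= 0:
            return None, best_margin  # alert: no course of action with a clear path home
        return best_combo, best_margin

    # Toy version of the vignette: four bandits north, four east, safety south.
    north = [(0, 60), (5, 62), (-5, 64), (2, 66)]
    east = [(60, 0), (62, 5), (64, -5), (66, 2)]
    combo, margin = recommend_targets(north + east, missiles=4, own_pos=(0, 0),
                                      home=(0, -150), own_speed=9.0, threat_speed=9.0)
    # combo -> the indices of the four eastern bandits: engaging the northern group
    # scores a negative margin because the eastern group can cut off the route home.

Run against this toy version of the vignette, the sketch scores the northern option poorly because the eastern bandits can reach the southbound route first, and it recommends engaging the eastern group instead, mirroring the outcome described above.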

A simple version of this program could have been written by two people in a single day—16 man-hours—if they had the right tools. Completing tactical testing in a simulator and ensuring the software’s reliability would take another 40-80 man-hours. 

Alternatively, writing a compelling paper about the situation would take a bit less time: around 20-40 man-hours. However, a force of 1,000 pilots spending 30 minutes each to read the paper would require 500 man-hours. Totaling these numbers results in 96 man-hours on the high end for software-defined tactics versus 520 man-hours on the low end for writing and reading. While both are necessary, writing software is much more efficient than writing papers.

To truly train the force not to make this mistake without software-defined tactics, every pilot would need to spend around five hours—a typical brief, simulator, and debrief length—in training events that stressed the scenario. That yields an additional 10,000 man-hours, given one student and one instructor for each training event. At that point, all of the training effort might reduce instances of the mistake by about 75%.

To maintain that level of performance, aircrew would need to practice this scenario in simulators once every six months. That is another 10,000 man-hours every six months, or more than 100,000 man-hours over five years to maintain proficiency in this skill across the force.

Software-defined tactics applications do not need ongoing practice to maintain currency. They do need to be updated periodically to account for tactical changes and to improve them, though. Budgeting 100 man-hours per year is reasonable for an application of this size. That is 500 man-hours over five years.

Pen-and-paper updates require 100,000 man-hours for a 75% reduction in a mistake. Software-driven updates require 596 man-hours for a nearly 100% reduction. It is not close.
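The arithmetic behind that comparison is simple enough to lay out explicitly. The per-item figures below are the estimates from the preceding paragraphs, not new data.

    # Recomputing the man-hour estimates from the paragraphs above.
    pilots = 1000

    # Software-defined tactics path
    write_app = 16            # two people, one day
    test_app = 80             # high-end simulator testing and reliability work
    maintain_app = 100 * 5    # roughly 100 man-hours per year over five years
    software_total = write_app + test_app + maintain_app
    print(software_total)     # 596 (96 up front plus 500 sustainment)

    # Paper-and-training path
    write_paper = 20                            # low-end estimate
    read_paper = pilots // 2                    # 30 minutes per pilot = 500
    initial_training = pilots * 5 * 2           # 5-hour event, student plus instructor
    refresher_training = initial_training * 10  # every six months for five years
    paper_total = write_paper + read_paper + initial_training + refresher_training
    print(paper_total)        # 110,520, the "more than 100,000" figure in the text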

When a software developer accidentally creates a bug, they code a test that will alert them if anyone else ever makes that same mistake in the future. In this way, a whole development team learns and gets more reliable with every mistake they make. Software-defined tactics offer that same power to military units.

Software-Defined Tactics in Action

While the F-35 example is hypothetical, software-defined tactics are not. The Navy’s P-8 community has been leveraging a software-defined tactics platform for the last four years to great effect. The P-8 is a naval aircraft primarily designed to hunt enemy submarines. Localization—the process by which a submarine-hunting asset goes from initial detection to an accurate estimate of the target’s position, course, and speed—is among the most challenging parts of prosecuting adversary submarines.
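To make the localization problem concrete, the toy sketch below fits a constant-velocity track to bearing-only measurements using a textbook pseudolinear least-squares method. It illustrates the kind of estimate a tactician is trying to produce; it is not the P-8 community’s actual tactics or software, and all the numbers are invented.

    import numpy as np

    def localize(times, sensor_positions, bearings):
        """Estimate target state (x0, y0, vx, vy) from bearings measured
        clockwise from north, i.e. tan(b) = (tx - sx) / (ty - sy).
        Each bearing gives one linear constraint on the state:
        cos(b)*x0 - sin(b)*y0 + t*cos(b)*vx - t*sin(b)*vy = cos(b)*sx - sin(b)*sy
        """
        t = np.asarray(times, dtype=float)
        s = np.asarray(sensor_positions, dtype=float)
        b = np.asarray(bearings, dtype=float)
        cos_b, sin_b = np.cos(b), np.sin(b)
        A = np.column_stack([cos_b, -sin_b, t * cos_b, -t * sin_b])
        rhs = cos_b * s[:, 0] - sin_b * s[:, 1]
        state, *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return state  # x0, y0, vx, vy at t = 0

    # Own-ship flies east, then turns north (a maneuver is needed for
    # bearings-only observability); the target moves northeast at constant speed.
    times = np.arange(0.0, 20.0, 2.0)
    own_x = np.where(times < 10, 4.0 * times, 40.0)
    own_y = np.where(times < 10, 0.0, 4.0 * (times - 10.0))
    sensors = np.column_stack([own_x, own_y])
    target_x = 50.0 + 1.0 * times
    target_y = 30.0 + 1.0 * times
    bearings = np.arctan2(target_x - own_x, target_y - own_y)
    print(localize(times, sensors, bearings))  # approximately [50., 30., 1., 1.]

In practice the inputs are noisy, intermittent, and drawn from multiple sensors, which is precisely why the task takes so long to master and why software that watches the problem continuously is valuable.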

On the P-8, the tactical coordinator decides on and implements the tactics the aircraft will use to localize a submarine. It takes about 18 months in a first squadron to qualify as a tactical coordinator and demonstrate reliable proficiency in this task. Those months include thousands of hours of study, hundreds of hours in the aircraft and simulator, and dozens of hours defending their knowledge in front of more experienced tacticians.

The data the P-8 community collects shows a clear and massive disparity in performance between inexperienced and experienced personnel. There is another massive disparity between experienced tacticians selected to be instructors because of demonstrated talent and those who were not. In other words, both experience and innate talent have large impacts on performance in submarine localization.

The community’s software-defined tactics platform has made it so that a junior tactician (inexperienced and possibly untalented) with six months in the platform performs exactly as well as an instructor (experienced and talented) with 18 months. It does this largely by reducing tactician mistakes—alerting them to the opportunities the tactical situation presents and dissuading them from poor tactical responses.

This makes the P-8 force extremely scalable in wartime. In World War II, America beat Japan because it was able to quickly and continually train high-quality personnel. It took nine months to train a basic fighter pilot in 1942. In 2021, it takes two or three years to go from initial flight training to arrival at a fleet squadron. Reducing time to train with software-defined tactics will restore that rapid scalability to America’s modern forces.

The P-8 community has had similar results across many tactical scenarios. It does this today with very little integration into the P-8’s mission system. Soon, its user-built applications will be integrated with a one-way data pipe from the aircraft’s mission system that will enable the full software-defined tactics paradigm. A team called the Software Support Activity at the Naval Air Systems Command will manage the security of this system and provide infrastructure support. Another team of P-8 operators at the Maritime Patrol and Reconnaissance Weapons School will develop applications based on warfighter needs.

Technical Implementation

Implementing this paradigm across the U.S. military will yield a highly capable force that can learn orders of magnitude faster than its adversaries. Making the required technical changes will be inexpensive.

On the P-8, implementing a secure computing environment with one-way data flow was always part of the acquisition plan. That should be the case for all future platform acquisitions. All it requires is an open operating system and a small amount of computing resources reserved for software-defined tactics applications.

Converting legacy platforms will be slightly more difficult. If a platform has no containerized computing environment, though, it is possible to add one. The Air Force recently deployed Kubernetes—an open-source system for running securely containerized applications—on a U-2. Feeding mission-system data to this environment and allowing operators to build applications with it will enable software-defined tactics.

If it is possible to securely implement this on the U-2, an aircraft that first flew in 1955, then any platform in the U.S. arsenal can be modified to accept software-defined tactics applications.

Human Implementation

From a technical standpoint, implementing this paradigm is trivial. From the human perspective, it is a bit harder. However, investing in operational forces’ technical capabilities without the corresponding human capabilities will result in a force that operates in the way industry believes it should, rather than the way warfighters know it should. A tight feedback loop between the battlefield reality and the algorithms that help dominate that battlefield is essential. Multi-year development cycles will not keep up.

As a first step, communities should work to identify the personnel they already have in their ranks with some ability to develop software. About a quarter of Naval Academy graduates enter the service each year with majors that require programming competency. These officers are a largely untapped resource.

The next step is to provide these individuals with training and tools to make software. An 80-hour, two-week course customized to the individual’s talent level is generally enough to get a new contributor to a productive level on the P-8’s team. A single application pays for this investment many times over. Tools available on the military’s unclassified and secret networks like DI2E and the Navy’s Black Pearl enable good practices for small-scale software development.

Finally, this cadre of tactician-programmers should be detailed to warfare development centers and weapons schools during their non-operational tours. Writing code and staying current with bleeding-edge tactical issues should be their primary job once there. Given the significant contribution this group will make to readiness, this duty should be rewarded at promotion boards to maintain technical competence in senior ranks.

A shortcut to doing this could be to rely on contractors to develop software-defined tactics. To maximize the odds of success, organizations should ensure that these contractors 1) are co-located with experienced operators, 2) are led by a tactician with software-development experience, 3) can deploy software quickly, 4) have at least a few tactically-current, uniformed team members, and 5) are funded operationally vice project-based so they can switch projects quickly as warfighters identify new problems. 

The Stakes

Great power competition is here. China’s economy is now larger than America’s on a purchasing power parity basis. America no longer has the manufacturing capacity advantage that led to victory in World War II, nor the ability to train highly specialized warfighters rapidly. To maintain its military dominance in the 21st century, America must leverage the incredible talent already resident in its armed forces.

When somebody in an autocratic society makes a mistake, they hide that mistake since punishment can be severe. The natural openness that comes from living in a democratic society means that American military personnel are able to talk about mistakes they have made, reason about how to stop them from happening again, and then implement solutions. The U.S. military must give its people the tools required to implement better, faster, and more permanent solutions. 

Software-defined tactics will yield a lasting advantage for American military forces by leveraging the comparative advantages of western societies: openness and a focus on investing in human capital. There is no time to waste.

LT Sean Lavelle is an active-duty naval flight officer who instructs tactics in the MQ-4C and P-8A. He leads the iLoc Software Development Team at the Maritime Patrol and Reconnaissance Weapons School and holds degrees from the U.S. Naval Academy and Johns Hopkins University. The views stated here are his own and are not reflective of the official position of the U.S. Navy or Department of Defense.

Featured image: A P-8A Poseidon conducts flyovers above the Enterprise Carrier Strike Group during exercise Bold Alligator 2012. (U.S. Navy photo by Mass Communication Specialist 3rd Class Daniel J. Meshel/Released)