By Dmitry Filipoff
Capt. Arthur “Trip” Barber (ret.) served in the Navy for 41 years, ultimately capping his career in the civilian Senior Executive Service as the Navy’s Chief Analyst for the last 12 of those years. In this wide-ranging and candid conversation, Barber discusses major challenges with ongoing force structure assessments, how he helped lead the Navy’s Assessment Division (N81) in shaping the budget, and what it will take for the Navy to make some of its most ambitious warfighting concepts a reality.
The Navy is at an inflection point as it conducts force structure assessments to determine what the future fleet may look like, including a move toward an integrated force that more closely involves the Marines. How do you view the challenges of force structure assessment and fleet design in this modern age of rapid change?
Force structure assessments should be based on some form of analysis of steady-state and wartime capability requirements. That analysis should use specific types of units to meet these requirements, a specific set of military objectives to be achieved, and a concept of operations for employing those units against specific threats. If the design of the force is stable and the characteristics and employment concepts of its units are mature, this is a pretty straightforward task. That is not where the Navy is today.
When major changes in unit characteristics and force employment concepts are underway but have not yet stabilized and the threat is evolving rapidly, the task of doing a force structure assessment that produces specific provable numbers as the “requirement” for each type of unit becomes an unbounded problem. That is the situation the Navy is in today. It does not help that the whole process has become politicized and that the Navy is operating with a force structure requirement number that is unaffordable with current ship types, but is being told that the number must be achieved anyway, and soon. Force structure assessments are actually a set of about ten separate assessments of the requirements for that many different specific types of ships, so the aggregate number of 355 or whatever is misleading. If a requirement for 62 submarines is part of the 355 and the Navy has 42, but has 20 extra ships of some other type above the requirement for that type, the aggregate fleet number may be 355 but it is not the right fleet. That nuance is lost in the political process.
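To make the per-type arithmetic concrete, here is a minimal sketch in Python. All figures except the 62-required/42-owned submarine example quoted above are invented for illustration; they are not actual requirement numbers.

```python
# Hypothetical illustration: per-type requirements vs. inventory.
# Only the SSN figures (62 required, 42 owned) come from the interview;
# every other number is invented so that both totals sum to 355.
requirement = {"SSN": 62, "DDG": 104, "CVN": 12, "SSBN": 12, "Amphib": 38,
               "SSC/FFG": 52, "Logistics": 32, "Other": 43}
inventory   = {"SSN": 42, "DDG": 124, "CVN": 12, "SSBN": 12, "Amphib": 38,
               "SSC/FFG": 52, "Logistics": 32, "Other": 43}

print(sum(requirement.values()), sum(inventory.values()))  # 355 355

# The aggregate matches, but the per-type comparison tells the real story:
shortfalls = {ship_type: requirement[ship_type] - inventory[ship_type]
              for ship_type in requirement
              if inventory[ship_type] < requirement[ship_type]}
print(shortfalls)  # {'SSN': 20} -- 355 ships, but not the right fleet
```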
Until the Navy decides on the new types of units, and the concepts of operations for those units, that will best meet our projected warfare challenges, including the newest challenge of operating as an “integrated” force with the Marines as they implement their new Commandant’s brilliant and disruptive force-planning guidance, it would be nice if the Navy could stop throwing force structure assessment numbers around. It would be ideal, although probably politically unrealistic, to take a pause in such assessments while a whole new design for the future fleet is worked out in detail. This could and should have been done over the last three years since the last force structure assessment, but it was not, because the amount of change required was simply too disruptive for the institutional Navy to handle at that time. I think it is underway now, finally, but it will take a couple of years to settle out to the point where a force structure assessment is meaningful.
The Navy is looking to move toward a Distributed Maritime Operations concept for how it employs its forces. What do you make of this warfighting concept, and what will it take for the Navy to get there?
Operating as an aggregated and concentrated force against an adversary like China with a massive long-range precision reconnaissance-strike system is not a survivable concept. Distributing the force is essential, but the distributed units need to be able to support each other and concentrate their effects when and where necessary. This requires a significant degree of connectivity and data movement between units at beyond-line-of-sight ranges, in an environment where the adversary is attacking that networking capability as their principal line of military effort. If a distributed force is not connected, it will be defeated in detail.
The concept for executing this connectivity is the “naval tactical grid” or whatever the equivalent joint term of the moment may be. It means everybody can communicate what they need to, when they need to. This aspiration began with Vice Admiral Art Cebrowski’s network-centric warfare vision of the late 1990s and the FORCEnet concept that grew out of it, but it has not made the implementation progress that it should have. Engineering this without ripping out every radio on every unit and starting over is really hard technically, and expensive. Nobody wants to own and pay this bill. Enforcing conformance of all systems and units to common and truly interoperable data and communications protocols is very difficult organizationally. But until we do both, distributed operations will not work very well. Achieving this connectivity vision is as disruptive a change as changing the design of the units that make up the fleet, and this is another change that I do not think is moving at a pace appropriate to the strategic situation we are in.
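To illustrate what “common and truly interoperable data protocols” means in practice, here is a minimal hypothetical sketch in Python: a single track-report schema that every unit’s systems would have to emit and parse identically for a naval tactical grid to work. The message fields, names, and wire format are invented assumptions for this sketch, not any actual Navy or joint standard.

```python
# Hypothetical sketch of a common message schema; all fields are invented.
from dataclasses import dataclass, asdict
import json

@dataclass
class TrackReport:
    track_id: str        # unique track identifier shared across the force
    lat_deg: float       # latitude, decimal degrees
    lon_deg: float       # longitude, decimal degrees
    course_deg: float    # true course, degrees
    speed_kts: float     # speed over ground, knots
    classification: str  # e.g. "hostile", "neutral", "unknown"
    timestamp_utc: str   # ISO 8601, so every consumer parses time identically

def encode(report: TrackReport) -> bytes:
    """Serialize to the one wire format every unit agrees on."""
    return json.dumps(asdict(report)).encode("utf-8")

def decode(payload: bytes) -> TrackReport:
    """Any receiving unit reconstructs exactly the same report."""
    return TrackReport(**json.loads(payload.decode("utf-8")))

# If even one platform reports speed in m/s or omits the timestamp,
# the fused pictures across the distributed force silently diverge.
report = TrackReport("T-0042", 21.5, 158.1, 270.0, 18.0, "unknown",
                     "2020-01-25T04:00:00Z")
assert decode(encode(report)) == report
```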
You served as the Navy’s Chief Analyst for 12 years at the Assessment Division, OPNAV N81. During that time you sought to change the relationship N81 had with other organizations, namely moving it away from being a so-called “honest broker” and toward being an analytics service provider. What exactly does this change mean and what effect did it have on those relationships?
This operating model that I built for the Navy’s analytic enterprise during my 12 years was dismantled after I retired, so my comments on how it used to work are not too relevant anymore. The Navy’s analytic resources are no longer concentrated in N81; they have been distributed back across many sponsors who are now largely on their own to conceive and execute whatever analysis they think they need, subject to some review of the proposed topics by the Navy Analytic Office. This is essentially a return to the situation of the 1990s. No other Service today operates its analytic enterprise in this manner. N81 is no longer a “service provider” of analysis to other sponsors, although it remains the only Navy line organization with professionally-trained operations analysts who know how to conduct and supervise analytic projects, and the only organization that does not own the programs that it analyzes.
The freedom from program ownership lets N81 be a “dispassionate” analyst of capabilities. I never liked the phrase “honest broker” to characterize this, because it implies that all others are “dishonest,” which is simply not true. But a dispassionate provider may not choose to analyze issues that are not going to drive major areas of force capability, and the answers provided may not be those that the sponsor with the issue wants to hear. Both of these factors made the N81 service provider model a bit unpopular.
In recent years the OPNAV staff saw some restructuring that sought to give the strategists a greater ability to provide inputs into developing the budget, whereas before the strategists were often viewed as relatively weak influencers when it came to developing budget builds. How do you see the evolving relationship between strategy inputs and the POM process?
“Strategy” is the purview of the Chairman of the Joint Chiefs and the Secretary of Defense, not the individual services. What the services call “strategy” and own is the concepts of force employment that they plan to use in execution of the strategy. The Navy’s strategic expression that counts is the budget it builds that funds and delivers specific forces and capabilities. That determines what the Navy can do, and more importantly what it cannot do. Navy strategists have generally been unable to articulate timely guidance that made strategic choices about what to not do in order to focus the available resources on the most critical things that must be done. When resources are constrained, you cannot have one without the other. Their guidance said everything was important, few things could be cut, and it usually came out late, so far into the POM-development timeline that funding decisions had already been made. Time, tides, and POMs wait for no man.
Recent CNO guidance documents from the Navy strategists have been better than I have seen for several decades, but they still cannot seem to bring themselves to identify the full scale of things to not do in order to maximize the strategic contribution of the Navy within actually available resources. In the end, if strategists cannot make such hard calls on the relentless POM timeline, then programmers must. It has typically been the job of N81 to use results from analysis to help the programmers make these calls; that is why I found working in N81 so professionally rewarding and why having it within N8 was so important to the Navy. Unfortunately, the current political imperatives of sustaining and even increasing force structure size without a corresponding top line increase have limited the effect of strategic focus and analysis on what actually gets funded in the POM.
There is debate on how well wargaming has been employed to inform decision-making, particularly with respect to analytic rigor and ownership. How could wargaming evolve to have greater analytic clout within DoD?
Wargaming can have a distinct and important role in shaping the Navy’s direction, particularly in times of disruptive and rapid technological change, such as we are now in. If wargames are structured in a disciplined manner with realistic assumptions about event timing, logistic feasibility, threat behavior, and technology; are focused on the most critical issues; and have the right participants, they can be very useful. They can provide insights about which courses of action, technologies, and concepts of employment are promising, and which are unlikely to work in a specific scenario. Wargames are human-in-the-loop activities whose outcomes change with each set of humans that conducts them; they are not rigorous, repeatable analysis. Their best role is to shape the assumptions that are used by quantitative analysis before that analysis is applied against new operational problems. This analysis should then ideally be tested through operational experimentation that puts hardware (real or virtual) in the loop to see if things work out as predicted.
This “virtuous cycle” of wargame-analyze-experiment is the ideal way to design a force, but each step of this cycle is today owned and operated by a different organization, and it is fairly rare that the full cycle is actually executed in a coherent, end-to-end manner on a specific operational challenge. It takes time, focus, and patience to achieve this end-to-end coherence, and with organizational leadership rotations being what they are in the U.S. military, this is hard to do. There are a few areas where this happens fairly well, such as in air and missile defense, but the Joint and Navy future-force planning processes could and should do this more broadly and better.
You once described your role as N81B as not necessarily being focused on the specific analytic techniques being employed, but on being the expert on what is worth studying. Among the many demands for studies and analysis across the Navy, how does one determine what is truly worth studying?
Uniquely within the Navy, the N81 staff knows how to structure an issue into an analytic problem, how to select the appropriate techniques to apply to it, and how to either do the work themselves or find the right outside provider to apply those techniques. My job (and that of the admirals who were my bosses in N81) was really to have as many interactions with senior leaders as possible and pay close attention to the bigger picture: what Navy leadership was trying to achieve, what the key problems were in getting there, where they needed analysis (whether they knew it or not) to help them understand the value or risk of the alternatives before them, and what analysis had been done before on that issue.
Senior leaders often do not know how to break a problem down into specific issues where analytic techniques can be applied, nor what issues have been studied before. That is really not their job anyway; that is what they expect a “Chief Analyst” to be able to do. I saw it as my job to recognize and make that connection, then bring back to the N81 staff the key new issues to go work their analytic skills on. Fairly regularly I initiated analytic projects in anticipation of issues that I saw coming months in advance. My experience of being deeply involved in 26 POM cycles in the Pentagon gave me pretty good intuition about what issues were likely to be coming up.
The issues worth studying are the ones that will potentially have a significant impact on warfighting outcomes or on the cost of buying, operating, or maintaining the force, and that are susceptible to analysis. Many issues are interesting, some are important, and some of the important ones are not really amenable to analytic techniques. It was N81’s job when I was there, and it is now the Navy Analytic Office’s job, to recognize which is which.
Arthur H. (Trip) Barber is a retired Navy Senior Executive Service civilian, a retired Navy Surface Warfare Captain, and an engineering graduate of MIT and the Naval Postgraduate School. He was the Navy’s chief analyst of future force structure and capability requirements on the OPNAV staff as a civilian from 2002 to 2014.
Dmitry Filipoff is CIMSEC’s Director of Online Content. Contact him at Content@cimsec.org.
Featured Image: PACIFIC OCEAN (Jan. 25, 2020) The Theodore Roosevelt Carrier Strike Group transits in formation, Jan. 25, 2020. The Theodore Roosevelt Carrier Strike Group is on a scheduled deployment to the Indo-Pacific. (U.S. Navy photo by Mass Communication Specialist 2nd Class Anthony Rivera/Released)
Reference: “If a distributed force is not connected, it will be defeated in detail.”
One might observe that a distributed force that is communicating offers the enemy the opportunity to exploit RF emissions to detect, locate, identify/classify, track, and attack elements of the force.
The vulnerability of the DMO concept to enemy RF exploitation appears to be a critical factor not well addressed in the analyses that underpin the concept.
Your point is well taken.
Non-broadcast communications, such as via satellites, exist. Potentially, airborne-mediated point-to-point links could also be used, should that technology be available. All have vulnerabilities, of course, but reducing RF emissions so a force is not easily detected is possible.
Correct, but that does not mean that an unconnected distributed force is the way to go; it simply means that there are even greater challenges to the DMO concept than are explored here.
DMO might be a goal to work towards, but getting it to work with current ships through adding on technological layers is likely to fail because current fleets are simply not designed with the concept in mind.
What are the Captain’s thoughts on the apparent lack of determination to support the USMC via NSFS? I say ‘apparent’ because of the guns on the LCS and the FFGx. Ships of their size should have 5-inch guns. Similarly, the lack of operational guns on the DDG-1000 is disgraceful.
Excellent interview!
“N81 is no longer a “service provider” of analysis to other sponsors, although they remain the only Navy line organization with professionally-trained operations analysts who know how to conduct and supervise analytic projects, and the only organization that does not own the programs that it analyzes.”
There are many trained OR/SA professionals outside of N81 who regularly oversee large projects, notably capability-based assessments (CBAs) and analyses of alternatives (AoAs).
I suspect the establishment of Navy Analytic Office (NAO) reflects Navy leadership’s dissatisfaction with the “centralized” N81-focused system. Not sure how well that has worked out.
“the only organization that does not own the programs that it analyzes”
This is the important line – analysts within the same unit or direct chain of command as those to whom they are reporting their analysis are more open to bias, even if it’s not their intent to be biased.
Analytics are most reliable when they are done by an independent agency, but that can also draw the ire of people in power when the analysts call out specific units or projects. Recently the Navy has run into a lot of problems by pushing projects that have failed or under-performed, and one might reasonably wonder where in the process an unbiased analyst was asked, “How likely is this investment of money and development hours to result in fulfillment of the project’s goals?”