
Design, Decide, Forget: Why the Navy Needs a Lessons-Learned Center for Shipbuilding

By Marcus Jones

In March 2025 testimony before the House Armed Services Committee’s Seapower and Projection Forces Subcommittee, Ronald O’Rourke, naval analyst for the Congressional Research Service since 1984, sharpened an excellent recommendation he has advanced for more than a decade: the U.S. Navy should establish a dedicated institutional mechanism for systematically capturing, analyzing, and transmitting lessons learned from its shipbuilding programs.1

Although the U.S. Navy has accumulated an extraordinary body of experience in ship design and construction over more than two centuries, it continues to make avoidable mistakes in major acquisition programs such as proceeding into construction with incomplete designs, integrating immature technologies, projecting unrealistic cost and schedule estimates, and eroding accountability structures once a program becomes politically or industrially “too big to fail.” These errors are not unique to the Navy, but they are particularly consequential in the context of shipbuilding, where program timelines are long, platforms are few and expensive, and consequences are measured in strategic as well as fiscal terms.

O’Rourke’s solution is a “lessons-learned center” for naval shipbuilding: a dedicated, continuous, and institutionalized effort to capture knowledge from past programs, distill it into accessible form, and ensure it informs future design, acquisition, and oversight decisions. The value of such an entity, he argues, would lie in its ability to prevent repeated mistakes, reduce waste, improve program outcomes, and help sustain the Navy’s long-term force design and industrial base goals. It addresses key features of the Navy’s acquisition environment: the discontinuous and generational nature of major shipbuilding programs; the structural fragmentation of knowledge across commands, contractors, and government agencies; and the absence of an educational or doctrinal home for critical institutional memory.

Unlike weapons or aircraft programs, which may see dozens or hundreds of iterations within a single career, major ship classes are often designed and constructed once every 20 or 30 years. The effect of this long cycle time is that most individuals involved in a new class of ships – whether program managers, naval architects, flag officers, or congressional staffers – may have had no direct role in the last one. What should be institutional memory therefore becomes diffuse personal recollection, vulnerable to retirement, reassignment, or obsolescence. Moreover, the knowledge necessary to understand past program outcomes is distributed across a complex web of organizations: Program Executive Offices, NAVSEA and its affiliated labs and centers, shipyards and primes and sub-tier contractors, OPNAV resource sponsors, the Office of the Secretary of Defense, and various congressional committees and watchdogs. Each retains only partial and often incompatible records, and there is little incentive or mechanism for aggregating these into a unified analytic understanding. While internal program reviews, GAO reports, and RAND studies may document lessons after the fact, there has never been an entity within the Navy tasked with curating, synthesizing, or teaching these insights.

Interestingly, O’Rourke does not propose a narrowly bureaucratic mechanism but envisions a range of possible instantiations, from a structured repository of documents to a more active, curriculum- and wargame-integrated enterprise. But what matters in his framing is not form but function: the institutionalization of a reflective capacity for learning from experience and applying that learning prospectively in ways that materially improve outcomes.

Such a capability, if properly implemented, would amount to a kind of strategic memory for the Navy, one able to withstand changes in leadership, budget, and political context, while enabling the service to treat shipbuilding not as a sequence of isolated procurements but as a continuous and evolving system of practice. It is not, therefore, a technocratic fix for acquisition inefficiencies, but a cultural transformation within the Navy’s approach to its own history of design, development, and production. It holds out the prospect that the Navy would not only save money and avoid failure, but reaffirm its preferred identity as a thinking, adaptive, and strategically serious organization. It is this deeper institutional value – far beyond process improvement – that makes O’Rourke’s proposal for a naval shipbuilding lessons-learned center important and long overdue.

Joint Lessons on Lessons Learned

The idea has modest precedent and ample justification. One of the most robust models of institutional learning in the defense sector is the U.S. Army’s Center for Army Lessons Learned (CALL), established in 1985 in response to the operational shortcomings revealed during Operation Urgent Fury in Grenada. CALL’s mission was to systematically collect, analyze, and disseminate operational and tactical lessons. Over time, it became fully integrated into Army doctrine and planning, fielding collection teams, producing analytic bulletins, and shaping professional military education. But of particular relevance to the Navy’s shipbuilding enterprise is a less widely known but equally instructive initiative: the Center for Army Acquisition Lessons Learned (CAALL), housed within the Army Materiel Systems Analysis Activity.2

Established following the 2010 Army Acquisition Review, which cited the absence of a centralized mechanism for analyzing acquisition successes and failures, CAALL provides an authoritative source for acquisition-specific lessons across the Army’s program offices. It operates a web-enabled Acquisition Lessons Learned Portal (ALLP) through which project teams submit concise, structured, and searchable lessons, each tagged by acquisition phase, milestone, cost and schedule impact, and functional category.
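The structured, metadata-tagged lesson record the ALLP description implies can be sketched minimally as follows. The field names and filter function here are illustrative assumptions, not the portal’s actual schema:

```python
from dataclasses import dataclass

# Illustrative only: these fields approximate the kinds of metadata the
# ALLP description mentions (acquisition phase, milestone, cost and
# schedule impact, functional category); they are not the real schema.
@dataclass
class Lesson:
    title: str
    phase: str              # e.g. "EMD", "Production"
    milestone: str          # e.g. "Milestone B"
    cost_impact: bool
    schedule_impact: bool
    category: str           # e.g. "Test & Evaluation"

def find(lessons, **criteria):
    """Return every lesson whose tags match all supplied criteria."""
    return [les for les in lessons
            if all(getattr(les, key) == val for key, val in criteria.items())]

# A toy repository of two structured lessons.
repo = [
    Lesson("Duplicated contractor/government testing", "EMD",
           "Milestone B", True, True, "Test & Evaluation"),
    Lesson("EVM baseline set before design stabilized", "Production",
           "Milestone C", True, False, "Program Management"),
]

hits = find(repo, phase="EMD", category="Test & Evaluation")
```

Because every lesson carries the same tags, simple filters of this kind support both “just-in-time” retrieval for managers entering a given phase and trend analysis across the whole repository.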

These are not vague observations but lessons distilled from real program experience and embedded in metadata-rich formats that support both searchability and trend analysis. CAALL analysts conduct deep-dive studies of recurring issues, such as documentation burden, Earned Value Management failures, or test duplication, and prepare “just-in-time” lesson packages for project managers entering specific acquisition phases. The Center also engages in outreach, publishes bulletins, curates spotlight topic zones, and supports internal Army decision-making with synthesized data on the top five systemic challenges facing Army programs. It demonstrates that institutional learning is within reach but requires structured data, a deliberate submission pipeline, professional analytical support, and educational integration. It also shows how lessons can be transformed from static reflections into dynamic inputs for decision support, policy revision, and curriculum development. Most importantly, CAALL demonstrates that such a capability can be sustained over time, through leadership endorsement, modest staffing, and the aggressive use of digital tools.

A shipbuilding-focused counterpart – scaled appropriately to the Navy’s size, resourced modestly, and empowered to draw insight from both current and historical programs – would not need to reinvent the wheel. It would only need to learn how others have made their institutions learn.

Other models further underscore the feasibility and necessity of such a capability. The Joint Lessons Learned Program (JLLP) applies a five-phase process – discovery, validation, resolution, evaluation, and dissemination – to lessons arising from joint exercises, operations, and experiments. Its information system, JLLIS, acts as a system of record for tracking, archiving, and analyzing lessons that affect force development and joint capability planning.3

A more technical and directly relevant precedent is found in NASA’s Lessons Learned Information System (LLIS).4 NASA’s LLIS arose from the hard-won awareness, following the Challenger and Columbia disasters, that high-stakes engineering efforts demand not only risk management tools but a durable culture of reflection and improvement. NASA’s system integrates lessons into program planning and design reviews and allows for long-term traceability of decisions and failures. The agency’s approach, emphasizing root cause analysis, organizational memory, and education, aligns with the intended mission of a Naval Shipbuilding Lessons-Learned Center (NSLLC): to translate the history of naval shipbuilding experience into anticipatory guidance for future programs. Like NASA, the Navy deals with one-off, bespoke, high-cost platforms with life cycles spanning decades. The discipline required to learn systematically from such endeavors is the same.

Even in the commercial sector, complex system integrators such as Boeing, Airbus, and multinational energy firms have turned to lessons-learned systems, both formal and ad hoc, to analyze catastrophic failures and to course-correct future programs. The Construction Industry Institute’s lessons-learned repositories, used by engineering and construction firms to improve execution of large-scale infrastructure projects, are yet another model for post-project analysis and feedback. These efforts are often grounded in shared technical taxonomies, design decision trees, and “causal maps” that allow construction organizations to relate performance outcomes to earlier architectural or managerial choices. The Navy’s shipbuilding community, which is distinguished by even greater system and technological complexities and similar exposure to path-dependent design choices, lacks such a coherent and systematized mechanism. An NSLLC would hold out the promise of that capability.

Of course, these precedents cannot simply be imitated wholesale, but they offer essential lessons in form, function, and value. Each succeeds not by relying on passive documentation and informal processes, but by embedding structured learning into the decision cycles and professional cultures of their organizations. What an NSLLC must do is adapt this logic to the particularities of U.S. naval shipbuilding: its long timelines, institutional fragmentation, industrial dependencies, and strategic visibility. It must provide an analytic and educational platform that helps naval leaders and engineers reason more effectively about cost, capability, risk, and design. It must produce continuity across ship classes and across generations of acquisition professionals. And it must do so not as a retrospective archive alone, but as a living resource embedded in professional education, program governance, and future planning.

Over the past several decades, the U.S. Navy has been the subject of repeated and increasingly urgent calls to establish a formal mechanism for doing just that, all of which have, time and again, failed to take root. While the service has often acknowledged the recurrence of major programmatic mistakes – most notably in high-profile acquisition efforts such as the Littoral Combat Ship, the Zumwalt-class destroyer, and the Ford-class aircraft carrier – it has not developed a durable, institutionalized capacity for engineering and acquisition-oriented organizational learning. This failure has not gone unremarked. A lineage of initiatives, proposals, and critiques – some internal, some external, some aspirational, others postmortem – has identified the absence of such a capacity as a root contributor to the Navy’s persistent shipbuilding troubles.

Perhaps the most compelling of these efforts is a 2022 MIT thesis by naval engineer Elliot Collins, which deserves attention not only for its technical sophistication but for its diagnosis of a deep institutional shortcoming.5 Collins, a Navy officer serving in the DDG(X) design program, observed firsthand what he describes as a structural absence of organizational memory in Navy ship design and acquisition. His thesis, written under the auspices of MIT’s Naval Construction and Engineering program, proposes the creation of a Navy Design Notebook System (NDNS): a digital, structured, and lifecycle-aware framework for recording and organizing design decisions, assumptions, lessons, and engineering rationale across a ship’s development. Drawing inspiration from both Toyota’s engineering notebook practice and the best traditions of systems engineering, Collins lays out a clear taxonomy and architecture for capturing knowledge in real time and rendering it useful across multiple programs and decades. Crucially, the NDNS is not just a data storage concept, but a model for how design reasoning can be institutionalized so that the lessons of one generation are accessible and intelligible to the next.

The significance of Collins’s proposal lies in the lineage of failed or underdeveloped efforts that it implicitly seeks to redeem. As far back as the 1970s, the Navy undertook an informal initiative known as the REEF POINTS series, pamphlet-style reflections on acquisition experience intended to help incoming program officers.6 But the REEF POINTS effort lacked formal backing, consistent authorship, or archival permanence, and it quickly faded as personnel rotated out and no office assumed responsibility for sustaining it. Later assessments, including a 1993 Department of Defense Inspector General report, found that the Navy lacked a centralized system for capturing acquisition lessons learned, and more critically, that it made little practical use of the systems it did possess. Data were gathered, but not applied; observations made, but not preserved; patterns noted, but not internalized.7 The diagnosis repeated itself in a 2002 analytical review commissioned by the Army’s War College, which found that across the Department of Defense, lessons-learned programs often failed not for lack of insight but for lack of organizational stewardship, cultural support, and procedural integration.8

Why, then, despite these longstanding recognitions, has the Navy failed to institutionalize a lasting lessons-learned capability in its shipbuilding enterprise? The reasons are multiple and reflect a misalignment between the operational culture of the Navy and the administrative and engineering demands of ship design. Unlike the tactical communities of naval aviation or undersea warfare – where debriefing, checklist revisions, and iterative training are ingrained – the acquisition enterprise lacks a comparable feedback loop. Moreover, the Navy’s engineering education pathways, from undergraduate technical training to postgraduate systems curricula, have not systematically incorporated acquisition case studies or design failures into their pedagogy. There is no consistent mechanism to bring shipbuilding experience into the classroom, the wargame, or the design studio. Lessons remain tacit, siloed, and anecdotal.

That the Navy has lacked such a capacity for so long is a failure of imagination and institutional design, but it is not an irremediable one. The architecture of such a capability already exists in other domains, from NASA to the Army to the commercial nuclear sector. The Navy does not need to invent a solution from whole cloth; it needs to adapt proven models to its own technical and cultural context. What is required is not another ad hoc study or retrospective review, but the establishment of a permanent Naval Shipbuilding Lessons-Learned Center, a durable institutional home where technical memory, engineering reasoning, and acquisition insight can be collected, structured, and applied. The central question, then, is not whether such a center is needed, but what it should consist of, how it should function, and where it should reside.

The Devil in the Details

To be more than a bureaucratic corrective or another forgotten archive, a shipbuilding lessons-learned program must do more than catalog what has gone wrong in previous programs or indulge in generalities about process improvement; it must fulfill a set of core functions as intellectually rigorous as the failures it seeks to prevent. The first and most essential function is to identify and preserve actual lessons: not loose observations or platitudes, but knowledge with clear causal content, derived from real program experience, and supported by traceable evidence.

To qualify as such, a lesson must demonstrate causal specificity: what precisely caused the outcome it describes, and why. It must be replicable or at least transferable across contexts, suggesting how it might inform other ship types or acquisition models. It must be traceable to primary sources – engineering drawings, test data, milestone reviews – so that its logic can be reconstructed and its authority verified. It must be actionable, capable of informing future decisions, whether at the level of design margin, contract structure, or policy architecture. And ideally, it should possess counterfactual depth: the ability to show not only what happened, but what might have happened differently under other choices.
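The five criteria above can function as an intake checklist. The following is a hypothetical sketch of how a center’s intake pipeline might screen candidate submissions against them; the criteria names come from the text, but the screening function itself is an assumption, not a proposed NSLLC design:

```python
# Hypothetical intake screen for candidate lessons, using the five
# qualifying criteria named in the text. Structure is illustrative only.
CRITERIA = (
    "causal_specificity",    # states what caused the outcome, and why
    "transferability",       # applicable beyond the originating program
    "traceability",          # linked to primary sources (drawings, test data)
    "actionability",         # can inform a concrete future decision
    "counterfactual_depth",  # shows what other choices might have yielded
)

def screen(candidate: dict) -> tuple:
    """Accept a candidate lesson only if every criterion is satisfied;
    otherwise report which criteria are still missing."""
    missing = [c for c in CRITERIA if not candidate.get(c)]
    return (not missing, missing)

# A submission that is well-sourced in intent but not yet traceable
# and lacks counterfactual analysis would be returned for revision.
ok, gaps = screen({
    "causal_specificity": True,
    "transferability": True,
    "traceability": False,    # no primary-source links attached yet
    "actionability": True,
    "counterfactual_depth": False,
})
```

The value of an explicit gate like this is less the automation than the discipline: a submission that cannot name its cause, its sources, or its counterfactual is an observation, not yet a lesson.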

When filtered through this lens, the lessons that matter and that a center must preserve fall broadly into five categories. First are design integration lessons, insights into how complex systems interact within the hull, and how early design assumptions or immature technologies can generate cascading failures, as in the DDG-1000’s power system or the Ford-class’s EMALS launch mechanism. Second are construction and manufacturing lessons, which speak to the translation of design into physical product: the timing of block assembly, the thresholds at which digital coordination outperforms paper-based workflows, the effects of workforce experience on productivity. Third are program management and acquisition lessons (perhaps the most politically fraught) concerning contract type selection, milestone pacing, and the dangers of concurrency. Fourth are industrial base and supply chain lessons, which trace how changes in the broader defense industrial ecosystem—supplier attrition, workforce bottlenecks, fragility in the materials base—constrain program execution in ways the Navy and its private shipbuilders often fail to anticipate. And finally, there are historical, strategic, and doctrinal lessons, which reveal how misalignments between strategic ambition and industrial reality (fleet design concepts that outpace build capacity, for instance) can derail even well-managed programs.

Still, it is not enough just to identify them; lessons must be preserved and organized within a structure that allows them to be used. Here, the Navy can draw on models such as that proposed by Collins in his thesis: a digital, lifecycle-aware knowledge framework that tags and stores design decisions, assumptions, and lessons in a manner that makes them accessible not only to current program staff but to future generations. Such a system would form the backbone of the NSLLC’s information architecture: structured, searchable, phase-referenced, and durable. It would allow engineers working on SSN(X) to understand not just that the Virginia-class succeeded or stumbled in certain areas, but why, under what constraints, and according to which tradeoffs. It would enable program sponsors to distinguish between lessons that were context-specific and those that reflect deeper structural patterns.

The most critical function of the NSLLC, however, is not archival but pedagogical. Lessons, to be meaningful, must be taught as part of a living curriculum, and not simply as dry memoranda or summary slides. The center must work directly with educational institutions to embed lessons into the professional formation of officers, policy officials, engineers, and acquisition professionals. This means developing decision-forcing cases that place students in the shoes of historical program leaders, confronting them with the actual dilemmas and constraints those leaders faced. It means designing wargames and exercises that test tradeoffs in acquisition, industrial surge, and fleet composition. It means seeding capstone projects, research initiatives, and faculty development efforts with questions drawn from real program history. And it means, above all, creating a culture in which experience is not simply remembered but used as a guide to reasoning, as a check against institutional hubris or forgetfulness, and as a source of comparative advantage in a strategic environment where time and resources are finite.

Finally, the Center must function diagnostically on behalf of Navy decision-makers, as a resource for the review of future program plans, bringing to bear its corpus of structured knowledge to identify early warning signs of known failure modes, or to highlight opportunities for constructive borrowing across ship classes. This is not a matter of punitive oversight, but of anticipatory guidance: bringing past reasoning to bear on present decisions in a way that deepens accountability and reduces risk.

What this amounts to is a knowledge institution, not in the narrow academic sense but in the most operationally vital sense of the term. The NSLLC would exist to ensure that the U.S. Navy no longer builds its ships without memory. It would translate past pain into future prudence, and costly failure into usable foresight. And it would mark, at last, the point at which naval shipbuilding began to behave not just as a procurement function, but as a learning system worthy of the stakes it bears.

The Way Ahead

What would such a center look like in practice? If the value of a Naval Shipbuilding Lessons-Learned Center lies in the integrity and usability of its knowledge, then its organizational structure must be equally deliberate. It should not replicate the diffuse and stovepiped landscape of existing program oversight offices, but rather bridge engineering, acquisition, policy, and education communities. And in keeping with the realities of today’s defense fiscal environment, it must be lean, digitally enabled, and architected from the start to minimize overhead.

The NSLLC should be organized as a small, hybrid analytical and educational unit with as lean a cadre of affiliated personnel as circumstances permit: naval engineers with experience in major design and production programs; acquisition professionals familiar with contracting and program management dynamics; historians of technology and naval policy who can trace institutional lineages and doctrinal consequences; and digital knowledge architects to manage its structured repository and analytic tools. Core activities would be augmented by short-term fellows – rotating billets for officers, civilians, or academics on sabbatical or detail – who would conduct targeted case studies, contribute to curriculum development, or lead diagnostic reviews of current programs. Rather than attempt to recreate or replace existing program data flows, the Center should connect to them, drawing from NAVSEA, PEO Ships and Submarines, CRS, GAO, and DoD IG reports but synthesizing across them to produce pedagogically and analytically coherent insights.

To reduce cost and footprint, the Center must leverage digital tools aggressively. A cloud-based digital architecture, modeled in part on the NDNS framework, would form the heart of the operation: a searchable, metadata-tagged, phase-referenced archive of lessons that supports analysis, instruction, and red-teaming of future programs. Visualization tools like interactive timelines, decision trees, and traceability matrices should be prioritized over staff-intensive publishing or editorial operations. Whenever possible, the Center’s materials should be reusable across formats: a single case study might underpin a midshipman seminar, an acquisition wargame, and a policy memo to ASN(RDA). In this sense, the Center is less a physical institute than a virtual and modular capability: one that enables reflection, instruction, and anticipatory decision support wherever shipbuilding is debated or taught.

As to its location, the author will admit to a conflict of interest, being a longtime member of the U.S. Naval Academy faculty. It may, therefore, sound parochial to suggest that the NSLLC be housed at Annapolis. That said, there are good reasons, symbolic and practical, why the Naval Academy may be a fitting institutional home. The Academy is the Navy’s enduring schoolhouse, the place where generations of officers are introduced not just to the fleet, but to the long arc of naval experience. It offers a rare confluence of technical education, historical reflection, and leadership formation.

Moreover, it sits proximate to the Washington-area institutions with which the NSLLC would regularly interact – NAVSEA, the Navy labs and warfare centers, OPNAV and the Secretariat organization, and the various acquisition and oversight bodies headquartered in the capital region. Perhaps most importantly, the Academy is a place not just of training, but of memory. To locate the Center there would signal that lessons are not just compliance artifacts or after-action musings, but a core component of professional identity. It would allow the Center’s work to be integrated directly into engineering coursework, capstone design, fleet seminars, and acquisition electives. And it would give midshipmen, from the beginning of their careers, access to a body of knowledge that has existed until now only in fragments.

But what matters is not the administrative chain but the Center’s function: to make memory usable, to make learning permanent, and to help the Navy move from a culture of crisis improvisation to one of cumulative, adaptive competence. Wherever it is housed, a Naval Shipbuilding Lessons-Learned Center should embody the values it seeks to cultivate: frugality, clarity, and strategic discipline. And in doing so, it may just help the Navy build not only better ships, but a better institution.

Dr. Marcus Jones is an associate professor in the history department at the United States Naval Academy.

Endnotes

1. O’Rourke, R., 2025. “Statement before the Armed Services Committee Seapower and Projection Forces Subcommittee, U.S. House of Representatives, Hearing on ‘The State of U.S. Shipbuilding,’” 11 March 2025 (Congressional Research Service Report 7-5700), pp. 1-3.

2. Iracki, J., 2014. “Army Acquisition Lessons Learned.” Defense AT&L (September–October 2014), pp. 36-40.

3. Thomas, J.T. and Schultz, D.L., 2015. “Lessons about Lessons: Growing the Joint Lessons Learned Program.” Joint Forces Quarterly 79, pp. 113-120.

4. Ganopol, A., Oglietti, M., Ambrosino, A., Patt, F., Scott, A., Hong, L. and Feldman, G., 2017. “Lessons Learned: An Effective Approach to Avoid Repeating the Same Old Mistakes.” Journal of Aerospace Information Systems 14(9), pp. 483-492; also Miller, S.B., 2005. “Lessons Learned or Lessons Noted: Knowledge Management in NASA.” In ASTD 2005 Research-to-Practice Conference Proceedings, p. 140.

5. Collins, E.J., 2022. “A Method for Organized Institutional Learning in the Navy Shipbuilding Community” (Doctoral dissertation, Massachusetts Institute of Technology).

6. Wellborn, R.M., Jr., 1976. “Formulation and Use of Lessons Learned in NAVSEASYSCOM Acquisition Programs” (Project Report, Defense Systems Management College).

7. Reed, D.E., Gimble, T.F., Koloshey, J.L., Ward, E.J. and Alejandro, J.K., 1993. “Acquisition-Type Lessons-Learned Programs Within the Military Departments” (Report No. IG-DOD-93173).

8. Snider, K.F., Barrett, F.J. and Tenkasi, R., 2002. “Considerations in Acquisition Lessons-Learned System Design.” Acquisition Review Quarterly 9(1), pp. 67-84.

Featured Image: The USS Harvey C. Barnum Jr. under construction at Bath Iron Works in July 2023. (Photo via Bath Iron Works)

Upgrading the Mindset: Modernizing Sea Service Culture for Trust in Artificial Intelligence

By Scott A. Humr

Winning on the future battlefield will undoubtedly require an organizational culture that promotes human trust in artificially intelligent systems. Research within and outside of the U.S. military has already shown that organizational culture has an impact on technology acceptance, let alone trust. However, Dawn Meyerriecks, Deputy Director for CIA technology development, remarked in a November 2020 report by the Congressional Research Service that senior leaders may be unwilling “to accept AI-generated analysis.” The Deputy Director goes on to state that “the defense establishment’s risk-averse culture may pose greater challenges to future competitiveness than the pace of adversary technology development.” More emphatically, Dr. Adam Grant, a Wharton professor and well-known author, called the Department of Defense’s culture “a threat to national security.” In light of those remarks, the Commandant of the Marine Corps, General David H. Berger, stated at a gathering of the National Defense Industrial Association that, “The same way a squad leader trusts his or her Marine, they have to trust his or her machine.” The points of view in these quotes raise an important question: do Service cultures influence how their military personnel trust AI systems?

While much has been written about the need for explainable AI (XAI) and the need for increasing trust between the operator and AI tools, the research literature is sparse on how military organizational culture influences the trust personnel place in AI-imbued technologies. If culture holds sway over how service personnel may employ AI within a military context, culture then becomes an antecedent for developing trust and subsequent use of AI technologies. As the Marine Corps’s latest publication on competing states, “culture will have an impact on many aspects of competition, including decision making and how information is perceived.” If true, military personnel will view information provided by AI agents through the lens of their Service cultures as well.

Our naval culture must appropriately adapt to the changing realities of the new Cognitive Age. The Sea Services must therefore evolve their Service cultures to promote the types of behaviors and attitudes that fully leverage the benefits of these advanced applications. To compete effectively with AI technologies over the next decade, the Sea Services must first understand their organizational cultures, implement necessary cultural changes, and promote double-loop learning to support beneficial cultural adaptations.

Technology and Culture Nexus

Understanding the latest AI applications and naval culture requires an environment where experienced personnel and technologies are brought together through experimentation to better understand trust in AI systems. Fortunately, the Sea Services’ preeminent education and research institution, the Naval Postgraduate School (NPS), provides the perfect link between experienced educators and students who come together to advance future naval concepts. The large population of experienced mid-grade naval service officers at NPS provides an ideal place to help understand Sea Service culture while exploring the benefits and limitations of AI systems.

Not surprisingly, NPS student research has investigated trust in AI, technology acceptance, and culture. Previous NPS research has explored trust through interactive Machine Learning (iML) in virtual environments for understanding Navy cultural and individual barriers to technology adoption. These and other studies have brought important insights on the intersection of people and technologies.

One important aspect of this intersection is culture and how it is measured. For instance, the Competing Values Framework (CVF) has helped researchers understand organizational culture. Paired with additional survey instruments such as E-Trust or the Technology Acceptance Model (TAM), researchers can better understand whether particular cultures trust technologies more than others. The CVF is measured across six organizational dimensions that are summarized by structure and focus. The structure axis ranges from control to flexibility, while the focus axis ranges from people to organization (see Figure 1).

Figure 1 – The Competing Values Framework – culture, leadership, value from Cameron, Kim S., Robert E. Quinn, Jeff DeGraff, and Anjan V. Thakor. Competing Values Leadership, Edward Elgar Publishing, 2014.
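In practice, the CVF is commonly operationalized through survey instruments such as the Organizational Culture Assessment Instrument (OCAI), in which a respondent divides 100 points among the four culture types on each of six dimensions; averaging each type's scores across the dimensions yields a culture profile. A minimal scoring sketch under those assumptions (the dimension names and sample response below are illustrative, not official survey items):

```python
# CVF/OCAI-style scoring sketch: a respondent divides 100 points among
# the four culture types on each of six dimensions; the profile is the
# mean score per type across dimensions. Illustrative, not official items.

CULTURE_TYPES = ["clan", "adhocracy", "market", "hierarchy"]
DIMENSIONS = [
    "dominant characteristics", "leadership", "management of employees",
    "organizational glue", "strategic emphases", "criteria of success",
]

def culture_profile(responses):
    """responses: {dimension: {culture_type: points}}, each dimension sums to 100."""
    return {
        ctype: sum(responses[dim][ctype] for dim in DIMENSIONS) / len(DIMENSIONS)
        for ctype in CULTURE_TYPES
    }

# One hypothetical respondent who leans strongly hierarchical.
respondent = {dim: {"clan": 15, "adhocracy": 10, "market": 25, "hierarchy": 50}
              for dim in DIMENSIONS}
profile = culture_profile(respondent)
dominant = max(profile, key=profile.get)
print(profile, dominant)  # hierarchy dominates at 50.0
```

Aggregating such profiles across many respondents is how studies like the NPS research described below characterize a command's dominant culture type.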

Most organizational cultures contain some measure of each of the four culture types in the CVF. The adhocracy quadrant, for instance, is characterized by innovation, flexibility, and speed of solutions. On this point, one NPS student researcher found that Marine Corps organizational culture was predominantly hierarchical, and that the Marine officers in the study preferred that the Marine Corps move from a hierarchical culture toward an adhocracy culture. While the study population was by no means representative of the entire Marine Corps, it generates useful initial hypotheses and points to the need for further research into whether hierarchical cultures impede trust in AI technologies. Closing this gap is important for assessing how a culture may need to adapt, but actually changing deeply rooted cultures requires significant introspection and the willingness to change.

The ABCs: Adaptations for a Beneficial Culture

“Culture eats strategy for breakfast,” quipped the revered management guru Peter Drucker, and for good reason. Strategies that seek to adopt new technologies that may replace or augment human capabilities must also address culture. Cultural adaptations that require significant changes to behaviors and other deeply entrenched processes will not come easily; modifying culture requires sustained leadership and participation at all levels. Fortunately, organizational theorists have provided ways of understanding culture. One well-known theorist, Edgar Schein, offers a framework for assessing organizational culture at three levels: artifacts, espoused values, and underlying assumptions.

The Schein model provides another important level of analysis for investigating military organizational culture. In the Schein model, artifacts within militaries include elements such as dress, formations, doctrine, and other visible attributes. Espoused values are the vision statements, slogans, and codified core values of an organization. Underlying assumptions are the unconscious and unspoken beliefs that undergird the culture. Implementing cultural change without addressing underlying assumptions is equivalent to rearranging the deck chairs on the Titanic. What underlying cultural assumptions, then, could prevent the Sea Services from effectively trusting AI applications?

One of the oldest and most ubiquitous underlying assumptions about how militaries function is hierarchy. While hierarchy serves beneficial functions for militaries, it may overly inhibit how personnel embrace new technologies and decisions recommended by AI systems. Information, intelligence, and orders within militaries largely flow along well-defined lines of communication and nodes in the hierarchy. One meta-analytic review of culture and innovation found that hierarchical cultures, as defined by the CVF, tightly control information distribution. Organizational researchers Christopher McDermott and Gregory Stock stated, “An organization whose culture is characterized by flexibility and spontaneity will most likely be able to deal with uncertainty better than one characterized by control and stability.” While hierarchical structures can help reduce ambiguity and promote stability, they can also be detrimental to innovation. Not surprisingly, NPS student researchers found in 2018 that the hierarchical culture in one Navy command had a restraining effect on innovation and technology adoption.

Adhocracy cultures, as defined by the CVF, are by contrast characterized by innovation and a higher tolerance for risk taking. AI applications could also upend the well-defined Military Decision-Making Process (MDMP), a classic manifestation of codified processes that rests on underlying cultural assumptions about how major decisions are planned and executed. The Sea Services should therefore reevaluate and update their underlying assumptions about decision-making processes to better incorporate insights from AI.

In fact, exploring and promoting other forms of organizational design could help empower personnel to innovate and leverage AI systems more effectively. The late systems-thinking researcher Donella Meadows aptly stated, “The original purpose of a hierarchy is always to help its originating subsystems do their jobs better.” Recognizing the benefits, and more importantly the limits, of hierarchy will help leaders shape Sea Service culture to develop trustworthy AI systems. Ensuring change goes beyond a temporary fix, however, requires continually updating the organization's underlying assumptions. That takes double-loop learning.

Double-loop Learning

Double-loop learning is by no means a new concept. First conceptualized by Chris Argyris and Donald Schön in 1974, double-loop learning is the process of updating one's underlying assumptions. Many organizations can survive on single-loop learning alone, but they will not thrive. Unquestioned organizational wisdom can perpetuate poor solutions: cookie-cutter answers that fail to address new problems eventually stop working. Rather than question the underlying assumptions that support them, organizations often double down on tried-and-true methods only to fail again, neglecting deeper introspection.
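The distinction can be made concrete with a toy model (entirely illustrative, not drawn from Argyris and Schön): single-loop learning adjusts an action against a fixed assumption, while double-loop learning also questions the assumption itself after repeated failure. Here a "method" whose ceiling caps performance stands in for an outdated underlying assumption:

```python
# Toy model of single- vs. double-loop learning (illustrative numbers only).
# A unit's output is its effort, capped by the method in use. Single-loop
# learning tunes effort under the assumed method; double-loop learning also
# revises the method (the underlying assumption) after repeated failure.

METHOD_CAP = {"legacy": 60, "revised": 100}  # max output each method allows

def run(double_loop: bool, target=90, steps=20):
    method, effort, failures = "legacy", 10.0, 0
    for _ in range(steps):
        output = min(effort, METHOD_CAP[method])
        if output >= target:
            return output, method
        effort += 0.5 * (target - output)   # single loop: adjust the action
        failures += 1
        if double_loop and failures >= 5:   # double loop: after repeated
            method = "revised"              # failure, revise the assumption
            failures = 0
    return min(effort, METHOD_CAP[method]), method

print(run(double_loop=False))  # plateaus at the legacy cap: (60, 'legacy')
print(run(double_loop=True))   # revises the method, reaches (100, 'revised')
```

The single-loop run keeps raising effort yet plateaus at the legacy method's ceiling; only the double-loop run, by revising its governing assumption, reaches the target. The same dynamic is what the organizational failures described below look like in miniature.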

Such failures should instead prompt a pause that allows uninhibited, candid feedback to surface from the deckplate all the way up the chain of command. This feedback, however, is rare and typically muted, so it seldom reaches the people who need to hear it most. The problem is exacerbated by endemic personnel rotation policies and feedback delays that rarely hold the original decision makers accountable for their actions (or inactions).

Implementing and trusting AI systems will require double-loop learning to change the underlying cultural assumptions that inhibit progress. This can be accomplished in several ways that cut against the normative behaviors of entrenched cultures. Generals, admirals, and Senior Executive Service (SES) leaders should create their own focus groups of diverse junior officers, enlisted personnel, and civilians to solicit unfiltered feedback on programs, technologies, and, most importantly, the cultural inhibitors that hold back AI adoption and trust. Membership and units could be anonymized to shield junior personnel from reprisal while promoting the unfiltered candor senior leaders need to hear in order to change underlying cultural assumptions. Direct feedback from the operators using AI technologies would also bypass the layers of bureaucracy that slow the flow of criticism back to leadership.

Why is this time different?

Arguably, the naval services have a record of adapting to shifts in technology and pursuing the innovations needed to win future wars. Innovators of their day, such as Admiral William Sims with his advanced naval gunnery techniques and the Marine Corps with the amphibious landing capabilities it developed in the long shadow of the Gallipoli campaign, loom large in current Service cultural histories. However, many technologies of the last century were evolutionary improvements to already-accepted technologies and tactics. AI is fundamentally different: like electricity, which changed many aspects of society, it could fundamentally disrupt how we approach war.

In the early 20th century, the change from steam to electricity did not immediately transform manufacturing processes or significantly improve productivity. Factory layouts built around steam engines and systems of belts were not reconfigured when machines were individually equipped with electric motors, so many of the benefits of electricity went unrealized for decades. Similarly, Sea Service culture will need to make a step change to take full advantage of AI technologies. If not, the Services will likely experience a “productivity paradox” in which large investments in AI fail to deliver the efficiencies promised.

Today’s militaries are sociotechnical systems, and underlying assumptions are their cultural operating system. Plugging an AI application into a culture that is not adapted to use it, and does not trust it, is the equivalent of trying to install an Android application on a Windows operating system: it will not work, or at best will not work as intended. We must therefore investigate how naval service cultures may need to adapt if we want to fully embrace the many advantages these technologies may provide.

Conclusion

In a 2017 Chatham House report titled “Artificial Intelligence and the Future of Warfare,” Professor Missy Cummings stated, “There are many reasons for the lack of success in bringing these technologies to maturity, including cost and unforeseen technical issues, but equally problematic are organizational and cultural barriers.” Echoing this point, the former Director of the Joint Artificial Intelligence Center (JAIC), Marine Lieutenant General Michael Groen, stated that “culture,” not the technology, is the obstacle to developing the AI-supported Joint All-Domain Command and Control (JADC2) system. Yet AI/ML technologies have the potential to provide a cognitive edge that increases the speed, quality, and effectiveness of decision-making. Trusting the outputs of AI will undoubtedly require significant changes to certain aspects of our collective naval cultures. The Sea Services must take stock of their organizational cultures and make the necessary cultural adaptations while fostering double-loop learning to promote trust in AI systems.

Today, the naval services have a rare opportunity to reap the benefits of double-loop learning. Through the COVID-19 pandemic, the Sea Services showed that they can adapt responsively and effectively to dynamic circumstances while fulfilling their assigned missions. They developed more efficient ways to leverage technology, allowing greater flexibility across the force through remote work and education. If, however, the Services simply return to the status quo after the pandemic, they will have failed to update many of their outdated underlying assumptions.

If we cannot change the culture in light of the last three years, the prospects for promoting trust in AI are poor. We cannot squander this moment. Let it not be said of this generation of Sailors and Marines that we misused a valuable opportunity to make a step change in our culture for a better approach to future warfighting.

Scott Humr is an active-duty Lieutenant Colonel in the United States Marine Corps. He is currently a PhD candidate at the Naval Postgraduate School as part of the Commandant’s PhD-Technical Program. His research interests include trust in AI, sociotechnical systems, and decision-making in human-machine teams. 

Featured Image: An F-35C Lightning aircraft, assigned to Strike Fighter Squadron (VFA) 125, prepares to launch from the flight deck of the aircraft carrier USS George H. W. Bush (CVN 77) during flight operations. (U.S. Navy photo by Mass Communication Specialist 3rd Class Brandon Roberson)