By Harrison Schramm
Navy leadership should scrutinize the somewhat disjointed approach the military writ large, and the Navy in particular, has adopted toward AI. As a statistician, I see that success in large inference machines – including AI – requires three things: technology, people, and culture. Technology is the easiest and, paradoxically, the least important component. Because of the hierarchical nature of military career paths, the people currently serving are poised to be the leaders of this revolution. There is ample talent in the ranks, provided it is effectively cultivated and employed.
Focus on culture. A successful AI culture is a departure from Navy norms. Military culture is built around assured success, but compared to aviation or weaponeering, AI requires far more failure before reaching success. It is estimated that upwards of 85 percent of AI projects fail. Resist the temptation to think better performers, technology, or business processes will change this number. Instead, accept that this failure rate is a fundamental feature of developing AI. With this in mind, what should the Navy do to create this culture?
First and foremost, understand the difference between technical failure and management failure. Technical failures are, if not good, at a minimum necessary. They are a precondition to success, and some even use the phrase "fail forward." Second, resist the temptation to create one AI tool to solve them all. Start small, on issues that carry relatively low risk. Safety data is a perfect opportunity for this type of small start.
Finally, ensure that Sailors – those currently wearing the uniform – are exposed to the process and shortcomings of AI in the course of their duties. Formal education programs – such as NPS – are necessary, but not sufficient. The CO of the first AI-enabled warship is someone who is serving today, not someone we will recruit later. Should this CO find themselves in the midst of a shooting conflict where AI is an enabling tool, they will be commanding from the CIC. The engineers and technicians who developed the technology will not. Warfighters will ultimately be responsible for the operational outcomes of these tools – both positive and negative.
The ability to harness and use AI is critical, but Navy culture must change to realize its potential. Military members must remain the masters – never the servants – of technology.
Harrison Schramm is a retired Navy commander and a professional statistician. During his time in the Navy, he flew the H-46 and MH-60S. Ashore he worked in Operations Research, teaching at the Naval Postgraduate School and working at OPNAV N81. He is the immediate past President of the Analytics Society of INFORMS.
Featured Image: A Fire Controlman monitors a radar console for air and surface contacts in the combat information center aboard the forward-deployed Arleigh Burke-class guided-missile destroyer USS Donald Cook (DDG-75). (U.S. Navy photo)