This monograph was originally published by the Army War College under the title Lying To Ourselves: Dishonesty in the Army Profession and is republished with permission. Read it in its original form here. It should be noted that the themes and dynamics described are not limited to the specific military service being examined.
Read Part One here.
By Dr. Leonard Wong and Dr. Stephen J. Gerras
Downrange
One might expect ethical boundaries to be more plainly delineated in a combat environment—the stakes are higher, and the mission is more clearly focused. Discussions with officers, however, revealed that many of the same issues found in the garrison environment also emerge in combat. For example, a senior officer described how the combat mission can lead to putting the right “spin” on reports: “We got so focused on getting bodies to combat that we overlooked a lot of issues like weight control, alcohol, or PT.” Not surprisingly, directed training is also often sidestepped in theater. One captain spoke of trying to complete mandatory Sexual Harassment/Assault Response and Prevention (SHARP) training:
“We needed to get SHARP training done and reported to higher headquarters, so we called the platoons and told them to gather the boys around the radio and we said, ‘Don’t touch girls.’ That was our quarterly SHARP training.”
But stretching the truth downrange often extends beyond compliance with mandatory training. A major described how Green 2 sensitive item reports were submitted early every morning. Although the reports routinely claimed 100 percent accountability, it was obvious the checks could not have been conducted to standard, since nobody ever knocked on doors to verify weapon serial numbers. Another officer related how supply accountability in a combat zone could be manipulated by misrepresenting the truth:
“We found ways to beat the system. You show up in country and you get a layout and immediately what do you do? You do a shortage annex for everything. So that way the Army—with an infinite budget in country—would replenish your product [even though] the unit never really lost the equipment in the beginning.”
Discussions with senior officers revealed other examples of bending the truth. One colonel stated that, “The cost of investigating a lost widget isn’t worth the cost of the item; they write it off and later say it was lost to the Pakistanis.” Another colonel stated:
“We were required to inspect 150 polling sites in Iraq (which nobody could possibly ever do) and fill out an elaborate spreadsheet. The spreadsheet was to get validation for higher that you did what they told you to. We gave them what they wanted.”
One frequently cited example of deception at the senior level concerned readiness assessments of partner forces. It was not uncommon for readiness ratings to vary in conjunction with deployment cycles. In other words, the commanders’ assessments were based not so much on the counterpart unit’s capabilities as on the American unit’s stage of deployment. As one colonel explained:
“I show up and [the readiness assessments] go yellow or green to red. I’m ready to leave – they go from yellow to green. We went through the reports with the CG every ninety days. Everyone wanted to believe what they wanted to believe.”
One recurring requirement for junior leaders in Afghanistan and Iraq was the storyboard—a PowerPoint narrative describing unit events and occurrences. One senior officer pointed out, however, that:
“Every contact with the enemy required a storyboard. People did not report enemy contact because they knew the storyboard was useless and they didn’t want to go through the hassle.”
A captain gave his perspective and his eventual approach to providing incomplete and inaccurate storyboards to higher headquarters:
“I understand there is a higher reporting requirement of which I reported verbally, and I did a proper debrief—I wrote it down and then I sent it to them. [But now] I have to combine a bunch of pictures onto a PowerPoint slide. Now I’m doing this storyboard because there’s an IED, because a donkey fell off the mountain, because some dude’s dog came in and I had to shoot it on the COP and now this dude is mad. It became an absolute burden. So what ended up happening was [that] after about the first couple of months, you’re saving your storyboards, and as soon as you had an incident that [was] somewhat similar to what you already had, it became a cut and paste gig. And the quality of the information that you are giving them wasn’t painting the picture for higher as to what was going on. And you can say, “Yes, Lieutenant, you should have done better.” You’re absolutely right. But when I only had 4 hours between this mission and the next, what’s better – spending 15 minutes to make this beautiful storyboard or planning my next operation?”
The attitude of “I don’t need to tell anyone what happened” was also found in other areas where it was perceived that the reporting requirements were too onerous. For example, one officer discussed his unit’s failure to ask permission to respond to indirect fire (IDF):
“Counterfire became a big issue in terms of [the] ability to counterfire when you were receiving IDF. Some companies in our battalion were returning fire without an accurate grid. They got shot at so they shot back. Of course, they were out in the middle of nowhere with a low chance of collateral damage. [But] people in our battalion knew, and just didn’t say anything. I’m not sure how high up people knew, but it was accepted. That was the norm. We’ll just not say anything about it.”
Another area that reflected the malleability of ethical standards was the distribution of cash through the Commander’s Emergency Response Program (CERP). As one senior officer noted, “CERP is not tracked in detail and everyone knows it.” Another colonel observed:
“CERP money is an area where we probably fudge. We gave company commanders a lot of money that we powered down to people who weren’t trained. We probably submitted reports that weren’t accurate.”
Ethical Fading
At the outset of this monograph, it was brashly declared that most U.S. Army officers routinely lie. It would not be surprising if many uniformed readers raised a skeptical eyebrow at that claim. Indeed, it would not be unusual for nearly all military readers to maintain a self-identity that takes offense at notions of dishonesty or deception. Ironically, though, many of the same people who flinched at that initial accusation of deceit probably yawned with each new example of untruthfulness offered in the preceding pages. “White” lies and “innocent” mistruths have become so commonplace in the U.S. Army that there is often no ethical angst, no deep soul-searching, and no righteous outrage when examples of routine dishonesty are encountered. Mutually agreed-upon deception exists in the Army because many decisions to lie, cheat, or steal are simply no longer viewed as ethical choices.
Behavioral ethics experts point out that people often fail to recognize the moral components of an ethical decision because of ethical fading. Ethical fading occurs when the “moral colors of an ethical decision fade into bleached hues that are void of moral implications.”13 Ethical fading allows us to convince ourselves that considerations of right or wrong are not applicable to decisions that in any other circumstances would be ethical dilemmas. This is not so much because we lack a moral foundation or adequate ethics training, but because psychological processes and influencing factors subtly neutralize the “ethics” from an ethical dilemma. Ethical fading allows Army officers to transform morally wrong behavior into socially acceptable conduct by dimming the glare and guilt of the ethical spotlight.
One factor that encourages ethical fading in the Army is the use of euphemisms and obscure phrases to disguise the ethical principles involved in decisions.14 Phrases such as checking the box and giving them what they want abound and focus attention on the Army’s annoying administrative demands rather than dwelling on the implications of dishonesty in official reports. Indeed, many officers even go as far as to insist that lying to the system can better be described as prioritizing, accepting prudent risk, or simply good leadership.
A more recent and significant development concerning ethical fading is the exponential growth in the number of occasions on which an officer is obliged to confirm or verify compliance with requirements. When it comes to requirements for units and individuals, the Army resembles a compulsive hoarder. It is excessively permissive in allowing the creation of new requirements, but it is also amazingly reluctant to discard old demands. The result is a rapid accumulation of directives passed down, data calls sent out, and new requirements generated by the Army. Importantly, the Army relies on leaders to enforce compliance with the increasing number of requirements and to certify the accuracy of the expanding number of reports sent upward.
The first time that officers sign an OER support form authenticating a counseling session that never happened or check a box saying, “I have read the above requirements” when they really only glanced at the 1,800-word IA acceptable use policy, they might feel a twinge of ethical concern. After repeated exposure to the burgeoning demands and the associated need to put their honor on the line, however, officers become ethically numb. Eventually, their signature and word become tools to maneuver through the Army bureaucracy rather than symbols of integrity and honesty.15 This desensitization dilutes the seriousness of an officer’s word and allows what should be an ethical decision to fade into just another way the Army does business. To make matters worse, technological advances and the cumulative effects of time have left today’s officers with far more information to corroborate than their predecessors faced.
Ethical fading is also influenced by the psychological distance from an individual to the actual point of dishonesty or deception. Lying, cheating, and stealing become easier to choose when there are more steps between an officer and the dishonest act—the greater the distance, the greater the chance for ethical fading.16 Thus, most officers would be extremely uncomfortable telling their rater face-to-face that their unit completed ARFORGEN pre-deployment NBC training when they, in fact, did not. Those same officers, however, would probably be more comfortable conveying the same mistruth via a block checked on the ARFORGEN checklist. Likewise, a digital, instead of handwritten, signature on a sponsorship form attesting that an officer was briefed on the sponsorship program prior to PCSing—when they were not—broadens the separation between the officer and the dishonest act. Even the Army’s ubiquitous PowerPoint charts provide briefers the ability to focus on intricate color-coded metrics and thus distance themselves from the inaccurate or ambiguous information the metrics may be conveying.
The psychological distance between a person and the consequences of a dishonest act can also influence ethical fading. A moral decision can lose its ethical overtones if the eventual repercussions of such a choice are either unknown or minimized. For example, the explanation of an officer concerning inaccurate storyboards is illustrative of the common perception that much of the information submitted upward disappears into the ether of the Army bureaucracy:
“Where do the story boards go? They’re going to [a] magic storyboard heaven somewhere where there are billions of storyboards that are collected or logged somehow? After doing hundreds of storyboards, I honestly can’t tell you where any of them go. I send them to my battalion level element who does something with them who then sends them to some other element who eventually puts them on a screen in front of somebody who then prints them out and shreds them? I don’t know.”
Dismissing any potential damage that may result from a misleading or incomplete storyboard allows leaders to view the requirement as yet another petty bureaucratic obligation void of any ethical considerations.
Making Excuses
With ethical fading serving to bolster the self-deception that problematic moral decisions are ethics-neutral, any remaining ethical doubts can be overcome by justifications and rationalizations. While discussions with officers revealed a wide assortment of justifications for unethical behavior, one rationalization appears to underlie all the others—that dishonesty is often necessary because the directed task, the data requested, or the reporting requirement is unreasonable or “dumb.” When a demand is perceived as an irritation or annoyance, a person’s less-than-honest response almost becomes a compensatory act against the injustice.17 Officers convince themselves that instead of being unethical, they are really restoring a sense of balance and sanity to the Army. For example, one officer spoke of the distinction he made between useful and useless required reports:
“You can [ask] anybody in this room—the purpose of sending a SALTA or declaring a TIC, CASEVAC—not a MEDEVAC nine lines—we definitely know why we do that stuff and why we’re reporting. And people jump. They’re timely. They’re accurate. . .But some of this stuff is: You need this for why? Show me in the reports guide that we use or wherever [that] this is actually a required report. Because right now it seems like you’re just wasting a unit leader’s time.”
Another officer rationalized how ethical standards should be loosened for requirements perceived as unimportant:
“If it’s a green tab leader that’s asking me for information—the battalion commander, brigade commander, or something the division commander is going to see—then I would sit down and do it. That would be accurate reporting. If it was something that was going into a staff and wasn’t going to drive a critical decision the battalion made in terms of training or something I need to accomplish for a METL task . . . what goes up, goes up. Is it probably a little off? Yeah, there’s a margin of error.”
Finally, one officer, in euphemistic terms, summarized the Army’s tolerance for deception on seemingly meaningless requirements:
“I don’t think it’s that anyone expects you to lie. But I think there is an expectation of—I think the word is—equivocation…I don’t want to say it’s accepted, because that doesn’t sound good or it doesn’t sound right. But I think some expectation of equivocation is accepted on dumb things.”
Two other rationalizations are often used as justifications for dishonesty—mission accomplishment and supporting the troops. With these rationalizations, the use of deceit or submitting inaccurate information is viewed as an altruistic gesture carried out to benefit a unit or its soldiers. Officers reported that they sometimes needed to act as Robin Hood—going outside the ethical boundaries to assist others. As one officer nobly put it:
“I’m just going to “check this box” . . . and if I’m 70% accurate—that’s good enough to 1) keep my guys out of trouble and 2) keep my boss out of trouble so we can keep doing good things for the country.”
One captain recalled an instance where an IED injured a platoon leader and his replacement during a relief in place. The incident required an assessment of possible traumatic brain injury for both lieutenants. The captain explained:
“I falsified the [traumatic brain injury] report that changed a distance from the IED strike [to where] one person was standing. So that way someone didn’t come back down and stick a finger in my CO’s chest and say, “You need to evac that lieutenant right now!” Because in the middle of [a] RIP, that’s not going to happen. If I do that, I’m going to put my boys in bags because they don’t have any leadership. That ain’t happening. I owe the parents of this country more than that.”
Another officer rationalized how funds were deceptively obtained in theater on behalf of the troops:
“It’s odd that in situations that I’ve been in, it’s never been blatant self-interest. It’s never been, “I’m going to get this money so I can buy myself two couches for my office while I’m in Afghanistan.” [Instead], it’s always like—for us, it was hard as hell to get water heaters. For some reason we could not get hot showers for our soldiers. It wasn’t CERP money, but we had to finagle God-knows-how-many organizations to finally get these things and we had to say we’re using this for this, when in fact it was so our guys could have hot showers when they get back off patrol. The truth of the matter is that, at the level that we’re at, a lot of times we gotta get it done and we’re going to find a way to do it.”
Another officer accurately described how the rationalization process softens the sting of dishonesty:
“You feel more comfortable if it’s not for us—if it’s for what we think is the greater good. Like [lying about] all the 350-1 requirements prior to going on block leave. I want my soldiers to go on leave . . . It’s not for me. It’s for the greater good. [But] that doesn’t mean it’s right.”
Rationalizing allows officers to maintain their self-image as a person of integrity despite acts of dishonesty.
Read Part Three.
Leonard Wong is a research professor in the Strategic Studies Institute at the U.S. Army War College. He focuses on the human and organizational dimensions of the military. He is a retired Army officer whose career includes teaching leadership at West Point and serving as an analyst for the Chief of Staff of the Army. His research has led him to locations such as Afghanistan, Iraq, Kosovo, Bosnia, and Vietnam. He has testified before Congress. Dr. Wong’s work has been highlighted in news media such as The New York Times, The Wall Street Journal, The Washington Post, New Yorker, CNN, NPR, PBS, and 60 Minutes. Dr. Wong is a professional engineer and holds a B.S. from the U.S. Military Academy and an M.S. and Ph.D. from Texas Tech University.
Stephen J. Gerras is a Professor of Behavioral Sciences in the Department of Command, Leadership, and Management at the U.S. Army War College. He served in the Army for over 25 years, including commanding a light infantry company and a transportation battalion, teaching leadership at West Point, and serving as the Chief of Operations and Agreements for the Office of Defense Cooperation in Ankara, Turkey. Colonel (Ret.) Gerras holds a B.S. from the U.S. Military Academy and an M.S. and Ph.D. in industrial and organizational psychology from Penn State University.
Endnotes
13. Ann Tenbrunsel and David M. Messick, “Ethical Fading: The Role of Self-Deception in Unethical Behavior,” Social Justice Research, Vol. 17, No. 2, June 2004, p. 224. Also see Max H. Bazerman and Ann E. Tenbrunsel, Blind Spots, Princeton, NJ: Princeton University Press, 2011, pp. 30-31.
14. Tenbrunsel and Messick refer to such phrases as language euphemisms, p. 226.
15. See Albert Bandura, “Selective Moral Disengagement in the Exercise of Moral Agency,” Journal of Moral Education, Vol. 31, No. 2, 2002, pp. 101-119, for how repeated exposure leads to moral disengagement.
16. Dan Ariely, The (Honest) Truth about Dishonesty, New York: HarperCollins, 2012, p. 59.
17. Ariely, pp. 177-178.
Featured Image: PHILIPPINE SEA (Sept. 24, 2020) Sailors from Naval Beach Unit 7 man a line alongside Sailors from the amphibious transport dock ship USS New Orleans (LPD 18) during an underway replenishment. (U.S. Navy photo by Mass Communication Specialist 2nd Class Kelby Sanders)