by Joel P. Gleason
Temptation of Details
Mission command philosophy is an intent-based leadership style that encourages commanders to eschew the overly controlling methodology of detailed command. The principles of mission command philosophy are all meant to free commanders from small decisions by creating independent agents on the battlefield who operate on intent alone.
Yet executing intent-based leadership often conflicts with a military culture that demands details. Perhaps even unnecessary details. Any organizational leader using an intent-based philosophy will suffer a certain cognitive dissonance as they encounter detail-focused processes. Add in the volume of data available in a large organization, and the temptation of details often distracts leaders from truly executing mission command.
This temptation towards interesting, available, and overwhelming details contributes to a modern problem with information overload that derails more than just military commanders. Researchers in humanitarian assistance and disaster relief (HADR) fields have an information-overload problem too. These fields draw insights from decades of history and a myriad of other sources. Urgent relief, like intent-based leadership, often demands a less detailed approach.
HADR Researcher’s Approach
Rapid Rural Appraisal (RRA) was a practice developed by Dr. Robert Chambers to address the problem of information overload in HADR. The academic descendant of RRA, Rapid Assessment Procedures (RAP), is one of the best qualitative approaches in the field. Chambers recognized that a “long-and-dirty” approach to the problems HADR organizations hoped to solve was leaving too many unaided. Too little, too late, as they say. At the same time, a “quick-and-dirty” approach was also unhelpful, even potentially harmful, because its biases ran a high risk of producing inaccurate assumptions.
This is why many HADR agencies, including UNICEF and USAID, frequently employ RAP as a middle-ground methodology. RAP offers research tools, qualitative approaches, and inquiry methods that limit bias with rapid results. Since its conception in 1981, countless lives have been improved, or even saved, through RAP-driven interventions.
Many of the innovative methods within RAP help HADR teams break out of limited thinking. Some methods are field-specific; others have more general applicability. First among the innovations that Chambers proposed, the paradoxical and complementary concepts of “optimal ignorance” and “proportionate accuracy” are the most distinctive when compared with the tools already available to military decision makers.
Optimal and Appropriate
“Optimal ignorance,” a deliberately contrarian term, “refers to the importance of knowing what it is not worth knowing. It requires great courage to implement. It is far, far easier to demand more and more information than it is to abstain from demanding it.” In other words, seeking optimal ignorance requires deliberately refusing to waste energy or time on information that distracts from the primary inquiry.
Chambers also challenged adopters of optimal ignorance to determine what level of “proportionate accuracy” is necessary. Chambers observed that “much of the data collected has a degree of accuracy which is unnecessary. Orders of magnitude, and directions of change, are often all that is needed or that will be used.” In the academic evolution from RRA to RAP, this concept became “appropriate imprecision.” Appropriate imprecision, like proportionate accuracy, is a mechanism for seeking only the measure of detail required for action. It is the preferred term because it indicates the optimally ignorant level of precision that researchers and decision makers should seek.
Neither of these ideas is intended as an excuse for guesswork. Practicing optimal ignorance and appropriate imprecision requires a deliberate plan of inquiry in which practitioners determine ahead of time what is unnecessary to know and deliberately waste no energy or time on processing that information. All information is filtered, even if unintentionally, and these concepts make that filtering deliberate. The decision maker who executes the Commander’s Role in the Operations Process using optimal ignorance will gain more clarity in less time and make better decisions.
Triage of Information
Commanders are conducting a form of information triage when they use optimal ignorance and appropriate imprecision to deliberately manage and avoid information overload. It is also possible to identify topics for optimal ignorance in the middle of the decision process. At times, information triage can be like medical mass-casualty triage because leaders must identify, and set aside, the “dire” and “expectant” cases if the other information is to survive.
If doctors try to treat everyone in a mass-casualty event, they will fail to save the lives that can be saved. Likewise, in information triage, some questions are not worth pursuing. There may be no harm in pursuing a curiosity at the right time. However, no doctor treats a cold in the aftermath of a firefight, nor can they focus all resources on a hopeless case amid a flood of treatable victims. Triage is a hard decision regarding patients. It is simply a necessary decision about information. Optimal ignorance shapes the best decision that can be made at the moment. Like a trauma team, leaders need to learn information triage.
Questions from senior decision makers tend to generate tasks for the supporting staff. Pursuing a curiosity in a time-constrained environment can sidetrack work already occurring within the team. If that ongoing effort represents disciplined initiative, asking unnecessary questions carries an opportunity cost: the questioner becomes an obstacle to their own intent.
Leaders can triage information when they deliberately ask “questions with a decision making purpose” (Qw/DMP). When the organization is overstretched, leaders maintain optimal ignorance by holding questions that, even if answered, will not influence or change a decision they need to make. Leaders who want to avoid staff burnout should also consider: the utility of knowing; the information feasibility (what is required to find an answer?); and the anticipated unnecessary details from their inquiry.
As a Garrison Commander, my organization had such diverse responsibilities that I began to write “Qw/DMP” at the top of every fourth page of my notebook. When I later confessed this to my deputy, he told me he had noticed a change from discovery questioning to decision-based inquiry around the time I started the practice. Some leaders, especially those with an insatiable curiosity, may need to use Qw/DMP to free their staff from “accidental science projects.” This is not an admonition to never ask questions, just to consider the impact or find a better inquiry method.
My Qw/DMP filter also required that I trust the experts to follow my intent. Seeking optimal ignorance can free leaders to admit that they do not know everything, especially if they are leading an enterprise-based organization where no single individual can master every specialty employed on their teams. Detailed command and the accompanying detail demand tend to imply that leaders have difficulty extending trust to subordinates, no matter how much more expertise that subordinate has on the matter at hand.
It takes a confident leader to admit total ignorance in front of subordinates. Chambers implied that it takes some form of courage to say that we do not need certain answers. Other scholars have highlighted that it takes a quality intellect to admit that there is more than one answer, that the answer is dependent on a heightened sensitivity for antecedent conditions, that it is unknowable, or that we do not need to know. Specifically, leaders do not need to know the details in order to trust a competent, cohesive team.
Extension of Trust
Trusting teams without examining all of their information is only half of optimal ignorance. In addition to limiting the inquiries they pursue, leaders seeking an optimal information level must also deliberately select (and deselect) which reports they receive.
As a G4, an officer once fretted over my schedule upon realizing that he would not have time to tour me around the other three barracks in his footprint. When I asked him what he felt I needed to see, he said, “Well sir, there’s mold there too.”
He was not seeking to prove anything unbelievable or request any unusual support. The Army’s detail-demand culture had this leader playing show and tell to gain trust. Not every situation will allow it, but in this instance it took less than a minute to relay that if he said mold was there, I believed him without the tour.
Sometimes optimal ignorance looks like trust.
Leaders who ask what kind of mold is growing in a barracks ought to have some expertise in mold typology, or they are simply stalling in real ignorance. I have a fair resume of expertise and experience, but that does not prevent me from confessing total ignorance in a new category. Developing an optimal ignorance filter allows leaders to skip unnecessary confirmation steps and get their teams the help they need even if decision makers have not personally investigated. This may not always save leaders time, but it almost always releases time back to the team.
Optimal ignorance can accelerate quality decisions by eliminating distractions, but it should not be an excuse for making unstudied decisions on a “panic azimuth.” I did not skip out on the mold tour because I did not care about it. I trusted that a subordinate was more informed and had more direct contact with the problem than I could ever have. If I had been simply saying, “good enough,” that would not have been good enough.
Optimal ignorance and appropriate imprecision are not the same as taking a metaphorical panic azimuth. A panic azimuth, in land navigation training, allows hopelessly lost trainees to travel in a single direction that will intersect a terrain feature and lead them to safety. Using a panic azimuth does not successfully complete the orienteering course; it simply gets the participant out of the situation. Ignoring or avoiding necessary information can put leaders on an information panic azimuth.
Leaders must comprehend the difference between information triage, conducted through optimal ignorance and appropriate imprecision, and a shortcut out of the situation built on suboptimal guesswork. Appropriate and optimal are the key words. If leaders manage information in a manner that actually prioritizes the urgent over the important, they have probably taken a panic azimuth. Screening out noise through a developed filter is likely to be appropriate. Indiscriminate deletion of email or blanket cancellation of meetings is unlikely to reflect an optimal method. Optimal ignorance means selecting the trusted source of expertise in a crisis and setting your azimuth from that advice.
Optimal Echelon for Optimal Ignorance
“Influence scales, control doesn’t.” That insight from author Todd Henry truly resonates with me. If a leader can control five people, they might be able to control ten. However, a leader who can influence five people can potentially influence a million. Experienced leaders know that the larger or more complex the organization, the more they should be seeking to influence, rather than control, their formation. Decision makers may see optimal ignorance and appropriate imprecision as filters best employed at echelons defined by influence.
In truth, optimal ignorance is rarely the right technique for platoon leaders, company commanders, or even battalion staff. Tactical commanders may not always find it appropriate to be imprecise. However, leaders who make decisions that guide an organization sufficiently large, or with diverse enough responsibilities, that one person cannot plausibly know every detail, should consider adding optimal ignorance and appropriate imprecision to their repertoire of information filters. These tools may apply to decisions in small, yet complex, nodes of the Army, or Army-wide Enterprise organizations.
Application of Ignorance
Mission command philosophy envisions competent, cohesive teams with mutual trust utilizing venues for shared understanding and operating on clear commander’s intent that allows disciplined initiative through mission orders. In a truly competent and cohesive team, optimal ignorance represents a prudent risk for decision makers. Part of the team’s mutual trust may be an acknowledgement that a commander’s decisions are already based on information filtered by their team. The question is whether that information filter is random or whether a deliberate triage is in place to improve decisions.
Optimal ignorance is a contradictory concept born of frustration in a field where timely measurements and assessments are difficult, if not impossible. Both optimal ignorance and appropriate imprecision bear a moderate risk of looking like haste. However, decisions formed in the middle of an information overload are often poor and shortsighted. Overwhelming circumstances mean that decisions have a higher risk of actually occurring in haste. Leaders should consider deliberately, rather than accidentally, filtering information. Consider developing optimal ignorance.
LTC(P) Joel P. Gleason is the Ordnance Branch Chief for Enlisted Personnel Management Division, US Army Human Resources Command. He is a former Garrison Commander and a graduate of the School of Advanced Military Studies.
The views expressed in this article are those of the author and do not reflect the official policy or position of the U.S. Army Human Resources Command, Department of the Army, or the U.S. Government.