Jeff Hulett

The Missouri Fallacy: When the “show-me” approach leads to disaster

Updated: Jul 29


Missouri is known as the Show-Me State. When someone says, “When it comes to this situation, I’m from the show-me state,” they likely mean they want hard proof or evidence supporting some statement or belief. They are behaving skeptically and demanding rigor in reasoning. It is a strong statement in support of thoughtfulness and factfulness in the decision process.


Missouri is called the Show-Me State due to a statement made by U.S. Congressman Willard Duncan Vandiver in 1899, emphasizing the need for proof or evidence.


However, this seemingly positive call for reasoning rigor can crowd out creativity and deep understanding. It can also lead to disaster.


Examples of the Missouri Fallacy:


The light we cannot see


The title of Anthony Doerr's Pulitzer Prize-winning book "All the Light We Cannot See" (also adapted for the screen) carries both a literal and a metaphorical meaning. Literally, it refers to the wavelengths of the electromagnetic spectrum that human eyes cannot detect: some light cannot be seen, even though we know it is there. Metaphorically, it speaks to the idea that goodness can still exist in the world even during times of darkness and destruction. While we cannot always see the goodness, we believe it is there.


The Missouri fallacy show-me position is – “If we only see darkness, the goodness is not there.”


Understanding all the possibilities


In physics, neuroscience, and information theory, there is a difference between what an observer sees a system do over time and what the system could do but has not yet been observed. This is the difference between the “empirical” distribution of the system's states and its “maximum entropy” distribution. It would be like rolling two dice 10 times and observing only totals of 7, 9, 4, and 3 across those 10 rolls.


The Missouri fallacy show-me person would say, “Well, I have never seen a 12 rolled, so we will rule that possibility out” – even though there is a one-in-36, or about 2.78%, chance of rolling a 12.
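To make the dice illustration concrete, here is a minimal Python sketch. The 10-roll sample size comes from the example above; the fixed random seed is an arbitrary assumption added only so the illustration is repeatable. It shows how a small empirical sample can easily miss an outcome, such as a 12, that the full set of possibilities says can happen.

```python
import random
from collections import Counter

random.seed(42)  # arbitrary seed so the sketch is repeatable

# Empirical distribution: what we actually observe in 10 rolls of two dice
rolls = [random.randint(1, 6) + random.randint(1, 6) for _ in range(10)]
empirical = Counter(rolls)

# Full distribution: every total from 2 to 12 over all 36 equally likely combinations
theoretical = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

print("Observed in 10 rolls:", dict(empirical))
print("P(12) in theory:", theoretical[12] / 36)  # 1/36, about 0.0278
print("Was a 12 observed?", 12 in empirical)     # frequently False in small samples
```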


The difference between the unfamiliar and the improbable


Thomas Schelling, the late Harvard economist and game theorist, wrote the foreword to Roberta Wohlstetter's book “Pearl Harbor: Warning and Decision.” In it, he captured the idea that America's lack of preparation for the Pearl Harbor attack resulted from decision-makers neglecting to hedge against the choice the Japanese actually made; they acted only upon the more obvious Japanese moves. Schelling's insight highlights the complexity of strategic surprise and the need to consider multiple scenarios when assessing threats. Schelling said:


“There is a tendency in our planning to confuse the unfamiliar with the improbable.”


America’s failure to anticipate a direct attack left us woefully unprepared; 2,403 people died in the Pearl Harbor attack. Yet a direct attack was not without precedent. Only about two years before Pearl Harbor, the British actively prepared for the Nazi air assault known as the Battle of Britain. That preparation included decentralizing the locations of Royal Air Force planes. Because the RAF planes were spread across many locations, Britain was able to absorb the vicious Nazi attacks and preserve its air assets to fight back successfully. Fast-forward to late 1941: the knowledge of this decentralized strategic approach was presumably available to American Pacific-theater decision-makers. The approach was unfamiliar to the American mindset, but given the very recent British experience and the alignment of Japanese and Nazi tactics, a direct attack was certainly not improbable.


So, if the Americans had found a way to learn from the experience of our British allies and decentralize the Pacific fleet's locations, the attack on Pearl Harbor would have been far less impactful – or perhaps would not have happened at all.


The Missouri fallacy show-me position is – “Let’s act upon the familiar – the Japanese would only try to sabotage the Pacific fleet.” So the Pacific fleet was kept concentrated, which better protected against sabotage – but left the ships sitting ducks for a direct Japanese attack.


Only managing what we can measure


In business, because we measure so much, there are many Missouri Fallacy examples. This example relates to employee performance management. A common employer practice is to use employee productivity measures to drive raise, promotion, and bonus decisions. But what can be measured at an individual level is only a subset of all that drives individual performance or relates to the performance of the whole. In some cases, the narrow “manage what we can measure” approach can drive bad behaviors. In the professional services arena, potential bad behaviors related to compensation and promotion may include:


a) billing and hour-reporting exaggerations,

b) staff billing at a high rate while sacrificing personal health or skills-training investment,

c) decision-maker over-reliance on the measures considered for promotion or compensation changes, and

d) decision-maker under-reliance on soft data and judgment to recognize the full set of employee contributions.


Companies generally have policies to control these potential bad behaviors, but enforcing those policies is notoriously difficult. The point is that using certain measures as targets may create perverse incentives contrary to long-term company health. It is up to company leadership to stay vigilant and actively manage the risks associated with these sorts of perverse incentives; it is a slippery slope and hard to control. In the case of compensation and promotion decisions, using measures as hard targets is questionable precisely because of that perverse-incentive risk.


The Missouri fallacy show-me position is – “Let’s make employee decisions based only on the data we can see; it is quicker and protects the decision-maker from criticism.” Decision-makers taking this position are less willing to make the more accurate, but harder-to-judge and easier-to-criticize, decisions that would better serve the company and its employees.


In summary, the Missouri Fallacy is where we focus on known and verifiable data and underweight or ignore valid intuition or valid but less visible or verifiable indicators. Four examples demonstrated how the Missouri Fallacy gets us into trouble. There are certainly more.


So, what do we do about it!? Essential for overcoming the Missouri Fallacy challenge is having a consistent, repeatable decision process. A wonderful decision process example is provided by the philosopher, statistician, and reverend Thomas Bayes. Bayes lived over 200 years ago and developed a process, with the help of Richard Price and Pierre-Simon Laplace, that became known as Bayesian Inference. Bayesian Inference is a belief-updating method based on priors and likelihoods, leading to an accurate and probabilistic understanding of our ever-changing world. Also, modern decision technology, such as Definitive Choice, helps to implement a strong belief-updating process. Bayesian math is not difficult, and with the help of Definitive Choice, implementing a rigorous belief-updating process is math-free. Please check out this article for more information on Bayesian Inference:
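As a rough illustration of the prior-and-likelihood arithmetic described above – not the method behind Definitive Choice or the linked article – here is a minimal Python sketch of one Bayesian belief update. The hypothesis, the 50/50 prior, and the likelihood values are illustrative assumptions chosen only to show the mechanics.

```python
def update(prior: float, p_evidence_given_h: float, p_evidence_given_not_h: float) -> float:
    """Return P(H | evidence) using Bayes' rule: prior times likelihood, renormalized."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Start 50/50 on a belief, then observe evidence that is three times more
# likely if the belief is true (0.6) than if it is false (0.2).
belief = 0.5
belief = update(belief, p_evidence_given_h=0.6, p_evidence_given_not_h=0.2)
print(round(belief, 3))  # 0.75 -- the belief strengthens, but stays probabilistic
```

Repeating the update as new evidence arrives is the "consistent, repeatable decision process" in miniature: each prior is yesterday's posterior.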



Free access to the Definitive Choice app is provided with each purchase of the book:


 


