Pearl Harbor: Perpetual happy hour, misunderestimation and the mother of all biases

Authored by:  Daniel T. Murphy

Introduction

Through the later months of 1941, while Japanese diplomats pretended to negotiate with the United States, Japan’s leaders planned for war.  Japan’s grand strategy was to seize the resource-rich Dutch East Indies, Singapore, and Malaya before its oil supplies ran out in December 1941.  Japan would also seize the Philippines to protect the sea lanes between the new southern resource area and the Japanese home islands, and it would destroy the U.S. Pacific Fleet at Pearl Harbor to prevent any U.S. counterattack until the new territories had been secured.

Japan’s Pearl Harbor attack succeeded largely because of superb planning and execution, reinforced by intelligence denial and deception.  However, U.S. intelligence helped the Japanese by failing to provide policymakers with sufficient indications and warning of attack.  The U.S. intelligence failure was multi-dimensional, with cognitive, organizational, cultural, and analytical components.

Cognitive Bias

More than anything else, December 7, 1941, was proof that “the mother of all biases is mindset.”[1] Morgan Jones argues that biases easily lead us astray because the mind doesn’t rigorously test the logic of every new piece of information it receives.[2]  According to Sam Wang, consumers of information are prone to selectively accept information that reinforces what they already believe.[3]  In 1941, the war in Europe was having a positive impact on the U.S. economy.  The American people believed that “happy days were here again.”  According to John Hughes-Wilson, the Army command in Hawaii was a “perpetual happy hour.”  American forces in Hawaii demonstrated “a dangerous lack of awareness of the possibility of a surprise enemy attack” and had “the ingrained habits of peacetime.”[4]  While there was plenty of evidence that conflict was approaching, the American mindset was fixed on a future of peace and prosperity, and U.S. policymakers shared that mindset.  According to Richard Betts, surprise attacks usually succeed because political leaders are unwilling to believe the evidence.[5]  The American people and American policymakers believed that, if war did come, it would not touch U.S. soil.

As a result, crucial indications of Japanese intent were partially or wholly ignored.  One example was the message from Tokyo directing Japanese diplomatic posts in London, Hong Kong, Singapore, and elsewhere to destroy not only their codes and ciphers but their code machines as well.  Analysts remained confident that war would not happen unless they first heard the promised “Winds Code” message broadcast from Tokyo to the Japanese embassies.

In some respects, the U.S. cognitive bias in the days leading up to the Pearl Harbor attack was similar to the cognitive bias of the 1980s, when CIA analysts were convinced that the Soviet regime was stable.  Jack Davis called this phenomenon “tribal think.”[6]  It is interesting to consider how analytical techniques of today, like Argument Mapping or the Delphi Method, could have helped policymakers expand their critical thinking about the “what ifs” of Japanese intent in the days leading up to December 7, 1941.

Organizational Breakdowns

Our intelligence failure on December 7 was also caused by organizational breakdowns, from the executive level in Washington to the ground level in Hawaii.  As conflict became more probable, the chief executive failed to establish an integrated national intelligence organization that could have properly gathered, analyzed, and disseminated indications of attack.  Government agencies worked in stovepipes.  Betts emphasized the lack of coordination between the State Department, War Department, and Navy.  For example, Ambassador Joseph Grew was told by the Peruvian ambassador that the war would begin with an attack on Pearl Harbor,[7] just one of many data points that was likely never shared with Navy commanders in Hawaii.

Hughes-Wilson and Betts both pointed to the barriers caused by bifurcation of command.  The Navy’s OP-20-G was focused on reading Japanese naval traffic, code-named Orange, and the Army’s SIS group studied Japanese diplomatic traffic, code-named Purple.  Although the two groups were physically located only one block apart in Washington, they did not trust each other, did not share information, and did not connect the dots on indications and warnings of Japanese intent.  In fact, because analysts were showing only “some of the intelligence to some of the decision-makers some of the time,”[8] they contributed to the “ambiguity of evidence”[9] that arises when there is an excess of raw data.  As a result, it was left to the policymakers to connect the dots, and they did not.

On the ground in Hawaii, information sharing between Admiral Kimmel and General Short and their respective staffs was informal at best, and there was no coordination with the FBI’s Chief of Station in Honolulu.  In fact, in the days leading up to the attack, nobody realized that no U.S. agency in Hawaii was listening to Japanese phone lines.[10]  Instead of the “lunacy of the Navy briefing on odd days, and the Army briefing on even days,”[11] imagine if the two services had engaged in Red Team Analysis or Adversarial Collaboration.

Cultural Bias

According to Roger George and James Bruce, it is at the consumer level where there is the greatest potential for analytical error through policy or policymaker bias.[12]  Cultural bias at the consumer level was a significant cause of the intelligence failure on December 7.  In the words of Hughes-Wilson, the U.S. had a “total underestimation of the Japanese as potential enemies.”[13]  U.S. policymakers did not understand Japanese culture.  They underestimated the pride and fierce independence of the Japanese people.  They did not appreciate that Japan had clawed its way out of an economic depression and back onto the world economic stage through a hyper-nationalistic and hyper-militaristic mindset that would never allow it to return to a posture of subservience to the West.  Most importantly, the U.S. wrongly assumed that Japan would choose the path of diplomacy and negotiation that any Western country would have chosen.  If U.S. policymakers, through critical thinking and structured analytic techniques, had been able to shed their cultural bias, they would have known that war was Japan’s only real option.

Lack of Analytical Sophistication

Our intelligence failure on December 7 was also an analytical failure.  If U.S. policymakers had been able to understand that war was Japan’s only real option, structured analysis might have helped them see the absolute necessity for Japan to buy time by destroying the U.S. Pacific Fleet.  Hughes-Wilson argues that if the FBI had compared the Abwehr questionnaire carried by Tricycle (Dusko Popov) with the Navy’s message intercepts, the significance would probably have struck them “like a thunderbolt.”[14]  The lack of structured analytical techniques (most had simply not been invented in 1941) prevented the intelligence community from connecting the data points and developing useful hypotheses about Japanese intent.  For example, what if analysts had had the modern tools and techniques to collaboratively evaluate: (a) the Royal Navy’s successful configuration of torpedoes for use in shallow water against the Italian battle fleet at Taranto in 1940; (b) the Abwehr questionnaire that asked “What is the progress of the dredger at the entrance to East and South East Loch?”; and (c) the Peruvian ambassador’s assertion that Pearl Harbor would be attacked first?  Would analysts have altered their mindset that Pearl Harbor’s hills and shallow anchorage made torpedo attack a technical impossibility?

Conclusion

The U.S. intelligence failure in December 1941 was multi-dimensional.  U.S. intelligence services failed to provide sufficient indications and warning of the Japanese attack that killed more than 2,000 Americans and crippled the U.S. Pacific Fleet.  It was, as Hughes-Wilson argued, “the ultimate intelligence blunder.”[15]  The good news is that we learned, and continue to learn, from that blunder.  And it is one of the reasons why today the concepts of collaboration, critical thinking, and structured analytical techniques are commonplace (and growing) in the intelligence community.

Copyright 2012:  Daniel T. Murphy


[1] M. Jones, The Thinker’s Toolkit (New York, NY: Three Rivers Press, 1998) 31.

[2] M. Jones, 27.

[3] S. Wang and S. Aamodt, “Your Brain Lies to You,” New York Times, June 27, 2008.

[4] J. Hughes-Wilson, Military Intelligence Blunders and Cover-Ups (New York: Carroll & Graf, 2004) 79.

[5] R. Betts, Surprise Attack: Lessons for Defense Planning (Washington, DC: Brookings Institution, 1982) 4.

[6] J. Davis, “Why Bad Things Happen to Good Analysts,” 164.

[7] R. Betts, Surprise Attack, 42.

[8] Hughes-Wilson, 67.

[9] R. Betts, “Analysis, War, and Decision: Why Intelligence Failures Are Inevitable,” World Politics, Vol. 31, No. 1 (October 1978), 69.

[10] Hughes-Wilson, 98.

[11] Hughes-Wilson, 100.

[12] J. Bruce and R. George, Intelligence Analysis: The Emergence of a Discipline (Washington, DC: Georgetown University Press, 2008) 12.

[13] Hughes-Wilson, 62.

[14] Hughes-Wilson, 69.

[15] Hughes-Wilson, 101.
