Annika Kulovesi
Obviously they should have seen it coming
Updated: Aug 19, 2019
In this essay series, I will write down my own thoughts on Eliezer Yudkowsky’s essays in the Rationality: AI to Zombies series from the point of view of a historian. My reason for writing these is primarily to organize my own thoughts regarding the use of rationality as a mental tool in my own profession, and as such, I do not presume to even attempt to appeal to a very wide audience. However, if you are reading this disclaimer and find my essays insightful or entertaining, more power to you, and I implore you to go and read the original essays, if you have not already.
The reason why historical actors tend to appear to us as either Masterminds or Imbeciles can be attributed both to hindsight bias and to the fact that people – historians especially – are very keen on constructing coherent narratives of the past. While it is considered essential to weigh only what people themselves knew at the time of any particular source, the historian usually has the existing narrative in mind, from start to finish. We know what’s coming next, and thus we sometimes need to remind ourselves that the people at the time did not. It is notoriously difficult to predict the future, or even the consequences of your own actions; there are simply too many factors to consider. Even if, in hindsight, some particular factor stands out above all else, that is often only because it was the straw that broke the camel’s back.
In his essay concerning Hindsight Bias, Yudkowsky uses the Challenger disaster as an example, reminding us that preventing the disaster ‘would have required, not attending to the problem with the O-rings, but attending to every warning sign which seemed as severe as the O-ring problem, without benefit of hindsight. It could have been done, but it would have required a general policy much more expensive than just fixing the O-Rings.’
Resulting from hindsight bias, we tend to think that successful people were successful in their endeavours because they could plan their course meticulously. Meanwhile, those who failed ought to have been able to predict that one crucial thing, and in failing to do so, they appear to have been idiots. Humans are not well equipped to rigorously separate forward and backward messages, so even mindful historians can fall prey to allowing forward messages to be contaminated by backward ones.
Examples of this kind of thinking are especially rife in political history.
Another thing that causes bafflement in students of history at every level is the assumption that most other people likely share your interpretation of a message’s contents. This is compounded by the historian’s perspective: we usually know what a message was supposed to say, because we can see the consequences of its misinterpretation.
In ”Illusion of Transparency: Why No One Understands You”¹, Yudkowsky recounts a Second World War example used in a heuristics study by Keysar and Barr to illustrate an over-confident interpretation:
“… two days before Germany’s attack on Poland, Chamberlain sent a letter intended to make it clear that Britain would fight if any invasion occurred. The letter, phrased in polite diplomatese, was heard by Hitler as conciliatory—and the tanks rolled.”
It is an instinctive reaction to tear at one’s figurative beard at the stupidity of both parties involved – how could Chamberlain have left any room for interpretation, and what possessed Hitler to think that, in the absence of a direct threat, Britain would stall military action? However, Chamberlain’s style was to be very cautious and mild-mannered in his communication, and it had never resulted in a war before. Similarly, Hitler may have decided to act regardless of the word choices in Chamberlain’s message. We may never know, but knowing how the war ended and what it cost, this exchange makes both men appear as Imbeciles.
Hindsight Bias is one of those mechanisms of the mind that historians are well aware of and actively work to counteract, yet submit to all too often. Be it hubris, attachment to one’s own narrative, or plain laziness of meta-cognition, we all make this mistake sometimes. Still, it should be considered a required professional skill to be able to walk one’s thinking backwards and separate one’s own knowledge from the information that actually motivated a particular source.
¹ Yudkowsky, Eliezer. Rationality: From AI to Zombies. Berkeley: MIRI, 2015. 34–36. The study he refers to in the essay: Boaz Keysar and Dale J. Barr, ”Self-Anchoring in Conversation: Why Language Users Do Not Do What They ‘Should’”, in Heuristics and Biases: The Psychology of Intuitive Judgment, ed. Thomas Gilovich, Dale Griffin, and Daniel Kahneman (New York: Cambridge University Press, 2002), 150–166, doi:10.2277/0521796792.