Unsurprisingly, much of the media commentary around the Senate inquiry into the London Whale episode has taken the view that JP Morgan executives were engaged in some kind of conspiracy to cover up the events. Another possibility, however, is that the series of errors and failures to escalate reported by the Senate was a function of a series of cognitive biases and limitations. We will draw on the work of the psychologist Daniel Kahneman in considering this question.
First, let’s look at the VAR measurement issue, a source of much discussion. On January 16th 2012, the CIO's positions in the Credit Portfolio breached not just the CIO's VAR limit but the bank's. The CIO's chief risk officer (CRO), Irvin Goldman, posed the question: "How can we assess the cause of the breach, how it will be resolved and by when?" (Senate Permanent Subcommittee on Investigations report into the losses).
This was a very difficult question to answer, and it seems that avoidance took place: the answer that apparently came back, "to switch to the new VAR model as soon as possible," clearly answered a different question, "When will the [VAR] model be fixed?" The easier question was thus apparently substituted for the harder one. The VAR model had been faulted by various risk managers, and it was known that a new model was being developed to replace it. Knowing that may have allowed the convenient notion to take shape in people's minds that no action or further discussion was needed until the model was fixed. Meanwhile the breach continued and the loss widened. Whatever questions the improvement of the VAR model could answer, solving the portfolio's problems was not among them. Daniel Kahneman, who has written extensively on these types of issues, calls this the substitution effect, explaining that "when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution" ("Thinking, Fast and Slow" (TFS), p. 12).
Second, the sheer size of the JP Morgan balance sheet may have anchored CIO and other leaders in numbers of an unhelpfully large magnitude. Could the knowledge that the balance sheet was over $2 trillion in value have had an unconscious, or even conscious, effect in minimizing concern around numbers that were so much smaller? Certainly the $2 trillion figure was well known to senior executives at JP Morgan, and was referred to several times in the Senate cross-examinations. Does not a VAR of $67 million, or even $132 million, seem relatively trivial in comparison to such a balance sheet? After all, $132 million is less than 0.01 percent of $2 trillion. Could this anchoring (obviously falsely) have cooled reactions to the breach of the VAR limit, with the unfortunate effect of reducing the apparent urgency to act? It's possible, right?
Third, Kahneman writes that we "are prone to overestimate how much we understand about the world" (TFS, p. 14). Certainly thinking, falsely, that one is in control and has a full understanding of events will tend to limit one's readiness to sound the alarm. One of the JP Morgan executives, Ina Drew, when questioned by the Senate about whether the credit portfolio was really acting as a hedge, referred to the difficulty of assessing the question because of the size and complexity of the balance sheet. With the advantage of time and hindsight, that may be a relatively easy admission for her to make, but is it likely Drew would have made it to regulators at the time? It seems possible that JPMC's executives were not even aware of the complexities, nor of how far they were from being able to manage the issue.
Fourth is the propensity of people to make assumptions about future performance based on thin data. Kahneman's favorite equation, he tells us (TFS, p. 176), is "success equals talent plus luck; great success equals a little more talent and a lot more luck." The CIO trading team had some very strong trading years prior to 2012. The expectation among senior management may well have been for that level of performance to continue. In fact, such an eventuality was unlikely; far more likely was that performance would regress to the mean over time. The part played by luck is generally greater than is allowed for when assessing the performance of any player, be it in golf or trading. Thin data no doubt also plagued the assessment of the risk modeling team's future performance. Here we have the PhD fallacy, whereby a person with a PhD is endowed with powers far beyond his narrow field of academic endeavor: does academic achievement necessarily translate into the ability to produce error-free VAR models with minimal support and time? We have found out that it does not. Perhaps that should not surprise us.
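Regression to the mean in a luck-heavy metric can be sketched with a small simulation. This is purely illustrative and has nothing to do with the CIO's actual figures: it ranks hypothetical traders on one year of results, where each result is stable skill plus random luck, and shows that the top group's next-year average falls back toward the mean.

```python
import random

random.seed(0)

# Hypothetical model (an assumption for illustration, not from the article):
# observed performance in any one year = talent (stable) + luck (random noise),
# echoing Kahneman's "success equals talent plus luck".
n = 10_000
talent = [random.gauss(0, 1) for _ in range(n)]
year1 = [t + random.gauss(0, 1) for t in talent]
year2 = [t + random.gauss(0, 1) for t in talent]

# Select the top 5% of performers based on year-1 results alone.
cutoff = sorted(year1, reverse=True)[n // 20]
top = [i for i in range(n) if year1[i] > cutoff]

avg1 = sum(year1[i] for i in top) / len(top)
avg2 = sum(year2[i] for i in top) / len(top)

# The same traders' year-2 average sits well below their year-1 average,
# because the luck component does not repeat even though talent does.
print(f"year 1 average of top group: {avg1:.2f}")
print(f"year 2 average of top group: {avg2:.2f}")
```

The top group still outperforms the overall mean in year two (talent persists), but by much less than their first-year numbers suggested, which is exactly the pattern thin data invites us to overlook.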
One is left after this brief survey with the feeling that Iksil was largely on his own in all this. He too had likely underestimated the complexity of the positions he had got into until it was too late. Iksil acknowledged that the portfolio had grown into an out-of-control monster in a call to a colleague on March 16th 2012, transcribed in the Senate report: "there's nothing that can be done, there's no hope". Having a better sense of the limits of human understanding and of our cognitive behaviors would certainly help leaders avoid a recurrence of incidents such as the London Whale.

Andrew Waxman writes on operational risk in capital markets and financial services. Andrew is a consultant in IBM's US financial risk services and compliance group. The views expressed here are his own.