A few questions jumped out at me while reading the New York Times article “Risk Mismanagement”, which I came across on the EconTalk page that contains NNT’s interview. The main ones I want to explore in detail are these:
- Why do people find it so hard to accept complexity?
- Which biases are acting to blind us to the uncertainty inherent in complex systems?
- Why are we drawn to rely on error-prone models, often ending up precisely wrong rather than vaguely right?
- How can we get “on the right side of bias” in dealing with complex systems?
In essence, this is the same idea that forms the basis for this blog: how we can better manage uncertainty by understanding why we find it so hard to accept.
I believe that a focus on what prevents us from engaging usefully with complex systems fits well with what Nassim Nicholas Taleb has been saying in recent interviews: while he does not frame the issue as one of “expert error”, much of what he talks about implicitly requires finding ways around the “foolishness”, or “propensity to be fooled”, of people engaged in the financial system (although I accept that not everyone will view complexity the way he does!). Perhaps the solution set can be divided in two: 1) structural shifts that would increase robustness by limiting (i.e. governing) already-identified fragility-increasing behaviours, and 2) bias-correcting policies that would realign incentives and “nudge” behaviours away from current or future errors that might individually or cumulatively compromise the system.
On the structural side, we need guidelines that recognise that the global financial system is complex and are designed to make it more robust, just as a plane has to be designed for seriously rough storms and other contingencies besides. This could involve (as NNT has suggested) limiting both the size of institutions and the leverage they can build up, so that a small number of highly leveraged players cannot pull down the whole system when something goes wrong (i.e. limiting the damage from ‘individual’ errors like LTCM). Other examples could include mandated slack in the system, pre-prepared plans for liquidity provision, and so on.
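To make the leverage point concrete, here is a minimal back-of-the-envelope sketch in Python. The leverage ratios and shock size are made-up numbers of my own, not figures from NNT or the article, and the arithmetic deliberately ignores margin calls, fire sales and other second-order effects:

```python
# Toy illustration: how leverage amplifies a given asset-price shock
# into an equity loss. All numbers below are hypothetical.

def equity_loss_fraction(leverage: float, asset_shock: float) -> float:
    """Fraction of equity lost when asset values fall by `asset_shock`.

    With assets = leverage * equity, an asset decline of s destroys
    leverage * s of equity (ignoring second-order effects).
    """
    return leverage * asset_shock

shock = 0.04  # a 4% fall in asset values
for leverage in (2, 10, 25):  # 25x is roughly LTCM territory
    loss = equity_loss_fraction(leverage, shock)
    wiped_out = " (insolvent)" if loss >= 1 else ""
    print(f"{leverage:>2}x leverage, {shock:.0%} shock -> "
          f"{loss:.0%} of equity lost{wiped_out}")
```

The point of the sketch is simply that the same modest shock that barely dents a 2x-levered balance sheet wipes out a 25x-levered one, which is why a cap on leverage caps the damage any single player can do.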
But it would also be useful to reduce “pilot error” among those who work within the system, for two reasons: a) to prevent the sum of decisions across the system from producing structural changes that increase fragility (i.e. preventing re-optimization through creeping changes), and b) to further hedge against LTCM-like errors which, while perhaps not systemically dangerous under the right structural framework, are still undesirable (except, perhaps, as regular lessons). Here we need to ensure that incentive structures remain systemically sound, and to understand how decision-making processes, cultural norms, and the influence of biases and emotions on behaviour across the buy-side, the sell-side and regulators all affect robustness.
I now believe that these two issues are intrinsically linked: to design adequate responses, both structural and bias-correcting, we need to understand how people regard complexity and why they find it so hard to accept the uncertainty it creates. This is almost the first step, and it leads to the four questions above.
As an aside, my boss asked me a very similar question today, prompted in turn by a great conversation with Darko Lovric last night. She wanted to know what mental process people go through when they have to give up a core element of their everyday worldview, such as belief in a deity. When people shift their outlook on the world in a fundamental way, how does the brain cope? What replaces the lost viewpoint? Is there a mourning process?
Perhaps understanding the psychological and neurological processes involved in letting go of a central viewpoint will help us understand what people need to go through in moving from thinking deterministically to thinking in terms of scenarios, uncertainty or complexity. Darko, I’d love your thoughts on this last point in particular.