Risk Management
By Jay Lipeles | 25 June 2010
To the Editor, Mechanical Engineering magazine:
I read with alarm the cover story in your current issue, "Risk-Informed Decision Making". The basic message, that risk can be managed, is wrong and extremely dangerous. Mechanical Engineering does a great disservice to its readership and to the engineering community at large by promoting such nonsense.
To cite a few examples: In the investigation that followed the Challenger disaster, NASA testified about the risk assessment it had made prior to the event. The assessment was wrong. Seven astronauts died, and it cost the nation many hundreds of millions of dollars and several years to recover. To my mind, NASA never has recovered; witness your article. Bad thinking remains.
The error in NASA's assessment (and in related risk-management theories) is that inherent in the theory, and usually unstated (as in your article), is the assumption that the situation is ergodic. NASA's assessment was based on (among other things) test data taken at room temperature. The probability of failure of the O-rings was based on that data, and it was incorrect [[because, as is usually the case in real life, the data was incomplete: normxxx]]. The situation was non-ergodic. The theory was inapplicable. The analysis was wrong, and disaster followed.
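The incomplete-data point can be made concrete. The sketch below is mine, not the letter's: the logistic failure curve and every number in it are invented purely for illustration. It shows how an estimate built only from room-temperature tests says nothing about a cold regime the sample never visited:

```python
import math
import random

random.seed(0)

# Hypothetical logistic model of per-test O-ring failure probability as a
# function of temperature. All numbers are invented for illustration.
def true_failure_prob(temp_f: float) -> float:
    return 1.0 / (1.0 + math.exp(0.15 * (temp_f - 50.0)))

# The "test programme": every trial is run near room temperature (65-80 F),
# so the sample contains no information about cold-weather behavior.
test_temps = [random.uniform(65.0, 80.0) for _ in range(200)]
failures = sum(random.random() < true_failure_prob(t) for t in test_temps)

print(f"failure rate estimated from warm tests: {failures / len(test_temps):.3f}")
print(f"true failure probability at 31 F:       {true_failure_prob(31.0):.3f}")
```

No amount of statistical care applied to that warm-weather sample recovers the cold-weather risk; the data simply does not contain it.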
Long Term Capital Management (LTCM) lost $4.6 billion in 1998 and subsequently failed. Among the reasons for its failure was that it assessed its risk with an analysis similar to the one in your article, which assumed that the situation was ergodic. It was not. When Russia defaulted on its debt, the mistake was revealed and LTCM went down. [[This error was systematically repeated in the models of the market created by the so-called 'quants', which so famously failed in the summer of 2007. They made the mistake of assuming that a couple of decades of recent market behavior (during a rather anomalous period in the market's history, one might add) was a sufficient sample of market behavior.: normxxx]]
Now we are struggling through the worst recession since the Great Depression. It was initiated by the collapse of the subprime market. It goes without saying that there were a number of contributing causes. Among them was the bundling of mortgages into securities backed up by 'risk analyses' that predicted the [combined] risk to be very low. [[They committed a common error; like LTCM, they assumed that the various risks were independent of each other, i.e., that the probability of any large number of defaults occurring at the same time was close to nil. In fact, so long as the 'players' are the same (i.e., the general public, in the case of MBBs), the system is prone to highly correlated excursions, driven by 'herding behavior': group 'panic' buying or selling.: normxxx]] The analyses assumed that the situation was ergodic. Whoops! In all of history, was there ever a more costly mistake?
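The independence error described in the annotation above is easy to demonstrate. Here is a minimal one-factor simulation (again my sketch: the pool size, default probability, and correlation are all invented) showing how a shared 'herd' factor turns a negligible tail risk into a routine event:

```python
import random
from math import sqrt
from statistics import NormalDist

random.seed(1)

N_LOANS = 500       # loans in the pool (illustrative)
P_DEFAULT = 0.05    # marginal default probability of each loan (invented)
THRESHOLD = 75      # "catastrophic" outcome: 15% of the pool defaults
TRIALS = 2_000

CUTOFF = NormalDist().inv_cdf(P_DEFAULT)

def prob_catastrophe(rho: float) -> float:
    """Fraction of trials in which defaults reach THRESHOLD. Each loan
    defaults when a mix of a shared factor (weight rho) and its own
    idiosyncratic noise falls below CUTOFF; the marginal default
    probability stays at P_DEFAULT either way."""
    bad = 0
    for _ in range(TRIALS):
        common = random.gauss(0.0, 1.0)  # the shared "herd" factor
        defaults = sum(
            sqrt(rho) * common + sqrt(1.0 - rho) * random.gauss(0.0, 1.0) < CUTOFF
            for _ in range(N_LOANS)
        )
        if defaults >= THRESHOLD:
            bad += 1
    return bad / TRIALS

print("P(15% of pool defaults), independent loans:", prob_catastrophe(0.0))
print("P(15% of pool defaults), correlated loans: ", prob_catastrophe(0.3))
```

With rho = 0 the default count is binomial with mean 25, and the 75-default event is effectively impossible; with even moderate correlation it occurs in a noticeable percentage of trials. That is exactly the kind of excursion the bundled-mortgage analyses ruled out.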
Inherent in all analyses of this type is the assumption that the situation is ergodic. It almost never is. What the authors would have us believe is that a careful, thorough 'risk analysis' can be helpful in reducing risk. Nonsense! What such an analysis does is provide a sense that risk is being addressed when it is not. It contributes to a false sense of security.
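Since 'ergodic' is doing a lot of work here, a concrete non-ergodic example may help. Roughly, a situation is ergodic when time averages and ensemble averages coincide, so that statistics gathered from the sample you have represent the process as a whole. The simulation below (my sketch; the 1.5x/0.6x gamble and all numbers are invented) shows a repeated multiplicative bet whose ensemble average grows while almost every individual trajectory is ruined:

```python
import random
import statistics

random.seed(2)

# Each round multiplies wealth by 1.5 or 0.6 with equal probability.
# Ensemble-average factor per round: (1.5 + 0.6) / 2 = 1.05 (growth).
# Time-average factor per round: sqrt(1.5 * 0.6) ~= 0.949 (decay).
ROUNDS, PLAYERS = 200, 10_000

final = []
for _ in range(PLAYERS):
    wealth = 1.0
    for _ in range(ROUNDS):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    final.append(wealth)

ruined = sum(w < 1.0 for w in final) / PLAYERS
print("sample mean of final wealth:", statistics.mean(final))    # a few lucky runs dominate
print("median final wealth:        ", statistics.median(final))  # the typical path is wiped out
print("fraction ending below start:", ruined)
```

A risk analysis built on the ensemble average would call this a profitable game, yet nearly every participant who actually plays it through time goes broke. That is the gap between an ergodic assumption and a non-ergodic reality.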
The most important quality an engineer brings to the game is his judgment. Greater reliance on [[so-called 'formal' (mathematical) models of: normxxx]] 'risk management' really implies (and demands) reduced reliance on judgment (engineering, financial, or whatever). It is a very bad strategy, based as it is on an assumption known to be wrong! It is an extremely dangerous strategy. We should instead be relying more heavily on, and developing, the people who possess good judgment.