Statistical and applied probabilistic knowledge is the core of knowledge; statistics is what tells you if something is true, false, or merely anecdotal; it is the "logic of science"; it is the instrument of risk-taking; it is the applied tools of epistemology; you can't be a modern intellectual and not think probabilistically—but... let's not be suckers. The problem is much more complicated than it seems to the casual, mechanistic user who picked it up in graduate school.
New product development projects are highly risky technical undertakings. Organizations frequently seek to manage the risk involved using standard risk management procedures, knowing that a company that better manages risks is less vulnerable. Nevertheless, NPD projects continue to fail to meet expectations for delivery time, budget, and outcomes. In this paper, we explore reasons why, despite employing self-evidently correct risk management procedures, adversities occurred in 19 major information systems projects. Project managers focused on the familiar, the measurable, the favorable, the noncommittal, and the controllable while excluding other risks that significantly affected their project performance. We have characterized this tendency as a series of five lures that leave projects vulnerable to risks.
Philippe Vallat's insight:
Quote: All too often, risk management rests upon what can easily be counted: what we call the "lure of the measurable."
Albert Einstein once wrote on a blackboard: “Not everything that counts can be counted, and not everything that can be counted counts.”...
“Conditional complexity” (more commonly called cyclomatic complexity) is a metric used to measure the complexity of software. It counts the number of linearly independent paths through a program function; a higher value means higher maintenance and testing costs.
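The metric can be approximated directly from source code by counting decision points: each branch construct adds one independent path. The sketch below is a minimal, simplified illustration in Python using the standard `ast` module (the function name `cyclomatic_complexity` and the set of counted node types are this example's assumptions, not a reference implementation of McCabe's metric).

```python
import ast

# Node types treated as decision points; each adds one independent path.
# (Simplification: a BoolOp like `a and b and c` is counted once, though
# strictly each extra operand adds another branch.)
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp, ast.BoolOp)

def cyclomatic_complexity(source: str) -> int:
    """Approximate cyclomatic complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))
    return 1 + decisions

code = '''
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    else:
        return "positive"
'''
print(cyclomatic_complexity(code))  # three independent paths through classify
```

A straight-line function with no branches scores 1; every `if`/`elif`, loop, or exception handler raises the score and, with it, the number of test cases needed to cover the function.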
To make the systems we depend upon more resilient ideally we would want more redundancy within critical systems and weaker coupling between them. Localization and de-complexification of basic needs (food, water, waste etc) would provide some societal resilience if systems resilience was lost. We would have more buffering at all levels, that is, larger inventories throughout society.
All this is the very opposite of the direction of economic forces.
Whether we like it or not, uncertainty generates discomfort, worry, fear, even anxiety: emotions that are "managerially incorrect" in a world where some still firmly believe that so-called rational decision-making is the pinnacle of the intelligence of homo economicus.
The fourth session was therefore devoted specifically to uncertainty and to the difference between uncertainty and risk. Let us return to these principles with their sometimes strange names... Risk concerns a fu...
Philippe Vallat's insight:
Risk concerns a future whose distribution of possible states is not known a priori, but can be estimated on the basis of a certain number of draws.
Uncertainty, by contrast, corresponds to a future whose distribution of states is not only unknown but impossible to know. This uncertainty is said to be "objective": it stems not from any incompetence of the observer but from the very nature of the phenomenon itself.
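The operational half of this distinction, that under risk the distribution can be estimated from repeated draws, can be sketched in a few lines. The example below (the die, the sample size, and the function name `empirical_distribution` are illustrative assumptions) estimates a fair die's face probabilities from samples; with enough draws the empirical frequencies converge toward 1/6. Under Knightian uncertainty no such convergence is available, because no stable distribution exists to converge to.

```python
import random
from collections import Counter

# "Risk": the distribution of future states is unknown to the observer
# but estimable from repeated draws of the same process.
random.seed(0)  # fixed seed for reproducibility

def empirical_distribution(n_draws: int) -> dict:
    """Estimate a fair die's face probabilities from n_draws samples."""
    draws = [random.randint(1, 6) for _ in range(n_draws)]
    counts = Counter(draws)
    return {face: counts[face] / n_draws for face in range(1, 7)}

est = empirical_distribution(100_000)
# Every estimated frequency lands close to the true value 1/6.
print(max(abs(p - 1 / 6) for p in est.values()))
```

The law of large numbers is precisely what separates the two regimes: it licenses this estimation procedure for risk, and offers nothing at all for objective uncertainty.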