05 March 2020

John D Sterman - Collected Quotes

"All dynamics arise from the interaction of just two types of feedback loops, positive (or self-reinforcing) and negative (or self-correcting) loops. Positive loops tend to reinforce or amplify whatever is happening in the system […] Negative loops counteract and oppose change." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Bounded rationality simultaneously constrains the complexity of our cognitive maps and our ability to use them to anticipate the system dynamics. Mental models in which the world is seen as a sequence of events and in which feedback, nonlinearity, time delays, and multiple consequences are lacking lead to poor performance when these elements of dynamic complexity are present. Dysfunction in complex systems can arise from the misperception of the feedback structure of the environment. But rich mental models that capture these sources of complexity cannot be used reliably to understand the dynamics. Dysfunction in complex systems can arise from faulty mental simulation-the misperception of feedback dynamics. These two different bounds on rationality must both be overcome for effective learning to occur. Perfect mental models without a simulation capability yield little insight; a calculus for reliable inferences about dynamics yields systematically erroneous results when applied to simplistic models." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Even if our cognitive maps of causal structure were perfect, learning, especially double-loop learning, would still be difficult. To use a mental model to design a new strategy or organization we must make inferences about the consequences of decision rules that have never been tried and for which we have no data. To do so requires intuitive solution of high-order nonlinear differential equations, a task far exceeding human cognitive capabilities in all but the simplest systems."  (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Faced with the overwhelming complexity of the real world, time pressure, and limited cognitive capabilities, we are forced to fall back on rote procedures, habits, rules of thumb, and simple mental models to make decisions. Though we sometimes strive to make the best decisions we can, bounded rationality means we often systematically fall short, limiting our ability to learn from experience." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"For any given population of susceptibles, there is some critical combination of contact frequency, infectivity, and disease duration just great enough for the positive loop to dominate the negative loops. That threshold is known as the tipping point. Below the tipping point, the system is stable: if the disease is introduced into the community, there may be a few new cases, but on average, people will recover faster than new cases are generated. Negative feedback dominates and the population is resistant to an epidemic. Past the tipping point, the positive loop dominates .The system is unstable and once a disease arrives, it can spread like wildfire that is, by positive feedback-limited only by the depletion of the susceptible population." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Just as dynamics arise from feedback, so too all learning depends on feedback. We make decisions that alter the real world; we gather information feedback about the real world, and using the new information we revise our understanding of the world and the decisions we make to bring our perception of the state of the system closer to our goals."( John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"Much of the art of system dynamics modeling is discovering and representing the feedback processes, which, along with stock and flow structures, time delays, and nonlinearities, determine the dynamics of a system. […] the most complex behaviors usually arise from the interactions (feedbacks) among the components of the system, not from the complexity of the components themselves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Of course, the information systems governing the feedback we receive can change as we learn. They are part of the feedback structure of our systems. Through our mental models we define constructs such as GDP or scientific research, create metrics for these ideas, and design information systems to evaluate and report them. These then condition the perceptions we form. Changes in our mental models are constrained by what we previously chose to define, measure, and attend to. Seeing is believing and believing is seeing. They feed back on one another."  (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The existence of the tipping point means it is theoretically possible to completely eradicate a disease. Eradication does not require a perfect vaccine and universal immunization but only the weaker condition that the reproduction rate of the disease fall and remain below one so that new cases arise at a lower rate than old cases are resolved." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The mental models people use to guide their decisions are dynamically deficient. […] people generally adopt an event-based, open-loop view of causality, ignore feedback processes, fail to appreciate time delays between action and response and in the reporting of information, do not understand stocks and flows and are insensitive to nonlinearities that may alter the strengths of different feedback loops as a system evolves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The robustness of the misperceptions of feedback and the poor performance they cause are due to two basic and related deficiencies in our mental model. First, our cognitive maps of the causal structure of systems are vastly simplified compared to the complexity of the systems themselves. Second, we are unable to infer correctly the dynamics of all but the simplest causal maps. Both are direct consequences of bounded rationality, that is, the many limitations of attention, memory, recall, information processing capability, and time that constrain human decision making."(John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000) 

"The self-reinforcing feedback between expectations and perceptions has been repeatedly demonstrated […]. Sometimes the positive feedback assists learning by sharpening our ability to perceive features of the environment, as when an experienced naturalist identifies a bird in a distant bush where the novice sees only a tangled thicket. Often, however, the mutual feedback of expectations and perception blinds us to the anomalies that might challenge our mental models and lead to deep insight." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"To learn we must use the limited and imperfect information available to us to un￾derstand the effects of our own decisions, so we can adjust our decisions to align the state of the system with our goals (single-loop learning) and so we can revise our mental models and redesign the system itself (double-loop learning)." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"To avoid policy resistance and find high leverage policies requires us to expand the boundaries of our mental models so that we become aware of and understand the implications of the feedbacks created by the decisions we make. That is, we must learn about the structure and dynamics of the increasingly complex systems in which we are embedded." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"Unfortunately, people are poor intuitive scientists, generally failing to reason in accordance with the principles of scientific method. For example, people do not generate sufficient alternative explanations or consider enough rival hypotheses. People generally do not adequately control for confounding variables when they explore a novel environment. People’s judgments are strongly affected by the frame in which the information is presented, even when the objective information is unchanged. People suffer from overconfidence in their judgments (underestimating uncertainty), wishful thinking (assessing desired outcomes as more likely than undesired outcomes), and the illusion of control (believing one can predict or influence the outcome of random events). People violate basic rules of probability, do not understand basic statistical concepts such as regression to the mean, and do not update beliefs according to Bayes’ rule. Memory is distorted by hindsight, the availability and salience of examples, and the desirability of outcomes. And so on."  (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Deep change in mental models, or double-loop learning, arises when evidence not only alters our decisions within the context of existing frames, but also feeds back to alter our mental models. As our mental models change, we change the structure of our systems, creating different decision rules and new strategies. The same information, interpreted by a different model, now yields a different decision. Systems thinking is an iterative learning process in which we replace a reductionist, narrow, short-run, static view of the world with a holistic, broad, long-term, dynamic view, reinventing our policies and institutions accordingly." (John D Sterman, "Learning in and about complex systems", Systems Thinking Vol. 3 2003)

"Eliciting and mapping the participant's mental models, while necessary, is far from sufficient [...] the result of the elicitation and mapping process is never more than a set of causal attributions, initial hypotheses about the structure of a system, which must then be tested. Simulation is the only practical way to test these models. The complexity of the cognitive maps produced in an elicitation workshop vastly exceeds our capacity to understand their implications. Qualitative maps are simply too ambiguous and too difficult to simulate mentally to provide much useful information on the adequacy of the model structure or guidance about the future development of the system or the effects of policies." (John D Sterman, "Learning in and about complex systems", Systems Thinking Vol. 3 2003)
