21 August 2022

Peter Bevelin - Collected Quotes

"Changes in size or time influences form, function and behavior. If something of a certain size is made bigger or smaller, it may not work the same way. Some things get better and others get worse." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger",  2003)

"Every action has consequences. Both intended and unintended. No matter how carefully we plan, we can't anticipate everything. Often we fail to consider what other events are likely to occur as a result of some action. […] By solving one problem, we generate another one and sometimes create an even worse one." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"It is hard to predict something when we don't (or can't) foresee or understand how an entire system works, what key variables are involved, their attributes, how they influence one another and their impact. Even if we know the key variables, their values may be impossible to estimate. They may also change over time and be dependent on context. It may also be impossible to estimate how they will interact as a whole." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Many systems fail because they focus on the machines, not the people that use them. […] Humans are involved in designing, execution and follow-up. Excluding ignorance and insufficient knowledge, given the complexity of human and non-human factors interacting, there is a multitude of ways in which things can go wrong." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Often we try to get too much information, including misinformation, or information of no use to explain or predict. We also focus on details and what's irrelevant or unknowable and overlook the obvious truths. Dealing with what's important forces us to prioritize. There are often just a few actions that produce most of what we are trying to achieve. There are only a few decisions of real importance." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003) 

"Optimization of one variable may cause the whole system to work less efficiently. Why? The performance of most systems is constrained by the performance of its weakest link. A variable that limits the system from achieving its goal or optimum performance. […] When trying to improve the performance of a system, first find out the system's key contraint(s)- which may be physical (capacity, material, the market) or non-physical (policies, rules, measurements) -and its cause and effect relationship with the system. Maybe the constraint is based on faulty assumptions that can be corrected. Then try to "strengthen" or change the weakest link. Watch out for other effects - wanted or unwanted - that pop up as a consequence. Always consider the effects on the whole system." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Predictions about the future are often just projections of past curves and present trends. This is natural since our predictions about the future are made in the present. We therefore assume the future will be much like the present. But the future can't be known until it arrives. It is contingent on events we can't see." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"'Regression to the mean' […] says that, in any series of events where chance is involved, very good or bad performances, high or low scores, extreme events, etc. tend on the average, to be followed by more average performance or less extreme events. If we do extremely well, we're likely to do worse the next time, while if we do poorly, we're likely to do better the next time. But regression to the mean is not a natural law. Merely a statistical tendency. And it may take a long time before it happens." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Science works by elimination. To avoid drowning in low-information observations or experiments, scientists think in advance about what the most important and conclusive experiments would be: What are we trying to achieve or prove, and how can we reach these ends? What can't happen? This way, they narrow down the possibilities." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Some systems are more prone to accidents than others because of the number of components, their connections and interactions. The more variables we add to a system, and the more they interact, the more complicated we make it and the more opportunity the system has to fail. Improving certain parts in highly interconnected systems may do little to eliminate future problems. There is always the possibility of multiple simultaneous failures and the more complicated the system, the harder it is to predict all possible failures." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Try to optimize the whole and not a system's individual parts. Think through what other variables may change when we alter a factor in a system. Trace out the short and long-term consequences in numbers and effects of a proposed action to see if the net result agrees with our ultimate goal."  (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)
