
09 July 2023

On Inference (2000-2009)

"Even if our cognitive maps of causal structure were perfect, learning, especially double-loop learning, would still be difficult. To use a mental model to design a new strategy or organization we must make inferences about the consequences of decision rules that have never been tried and for which we have no data. To do so requires intuitive solution of high-order nonlinear differential equations, a task far exceeding human cognitive capabilities in all but the simplest systems." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The difficulty facing us when we have to make inferences is two-fold. First, we may build entirely the wrong mental model from the information we read or hear. […] The second difficulty facing us is that we may well build a reasonably correct initial representation of a problem, but this representation may be impoverished in some way because we have no idea what inferences are relevant […]" (S Ian Robertson, "Problem Solving", 2001)

"Ignorance of relevant risks and miscommunication of those risks are two aspects of innumeracy. A third aspect of innumeracy concerns the problem of drawing incorrect inferences from statistics. This third type of innumeracy occurs when inferences go wrong because they are clouded by certain risk representations. Such clouded thinking becomes possible only once the risks have been communicated." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Information needs representation. The idea that it is possible to communicate information in a 'pure' form is fiction. Successful risk communication requires intuitively clear representations. Playing with representations can help us not only to understand numbers (describe phenomena) but also to draw conclusions from numbers (make inferences). There is no single best representation, because what is needed always depends on the minds that are doing the communicating." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Natural frequencies facilitate inferences made on the basis of numerical information. The representation does part of the reasoning, taking care of the multiplication the mind would have to perform if given probabilities. In this sense, insight can come from outside the mind." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Overcoming innumeracy is like completing a three-step program to statistical literacy. The first step is to defeat the illusion of certainty. The second step is to learn about the actual risks of relevant events and actions. The third step is to communicate the risks in an understandable way and to draw inferences without falling prey to clouded thinking. The general point is this: Innumeracy does not simply reside in our minds but in the representations of risk that we choose." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"When natural frequencies are transformed into conditional probabilities, the base rate information is taken out (this is called normalization). The benefit of this normalization is that the resulting values fall within the uniform range of 0 and 1. The cost, however, is that when drawing inferences from probabilities (as opposed to natural frequencies), one has to put the base rates back in by multiplying the conditional probabilities by their respective base rates." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Bayesian inference is a controversial approach because it inherently embraces a subjective notion of probability. In general, Bayesian methods provide no guarantees on long run performance." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"The Bayesian approach is based on the following postulates: (B1) Probability describes degree of belief, not limiting frequency. As such, we can make probability statements about lots of things, not just data which are subject to random variation. […] (B2) We can make probability statements about parameters, even though they are fixed constants. (B3) We make inferences about a parameter θ by producing a probability distribution for θ. Inferences, such as point estimates and interval estimates, may then be extracted from this distribution." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Statistical inference, or 'learning' as it is called in computer science, is the process of using data to infer the distribution that generated the data." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"A mental model is conceived […] as a knowledge structure possessing slots that can be filled not only with empirically gained information but also with ‘default assumptions’ resulting from prior experience. These default assumptions can be substituted by updated information so that inferences based on the model can be corrected without abandoning the model as a whole. Information is assimilated to the slots of a mental model in the form of ‘frames’ which are understood here as ‘chunks’ of knowledge with a well-defined meaning anchored in a given body of shared knowledge." (Jürgen Renn, "Before the Riemann Tensor: The Emergence of Einstein’s Double Strategy", "The Universe of General Relativity" Ed. by A.J. Kox & Jean Eisenstaedt, 2005)

"Statistics is the branch of mathematics that uses observations and measurements called data to analyze, summarize, make inferences, and draw conclusions based on the data gathered." (Allan G Bluman, "Probability Demystified", 2005)

"The basic idea of going from an estimate to an inference is simple. Drawing the conclusion with confidence, and measuring the level of confidence, is where the hard work of professional statistics comes in." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"The dual meaning of the word significant brings into focus the distinction between drawing a mathematical inference and practical inference from statistical results." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"In specific cases, we think by applying mental rules, which are similar to rules in computer programs. In most of the cases, however, we reason by constructing, inspecting, and manipulating mental models. These models and the processes that manipulate them are the basis of our competence to reason. In general, it is believed that humans have the competence to perform such inferences error-free. Errors do occur, however, because reasoning performance is limited by capacities of the cognitive system, misunderstanding of the premises, ambiguity of problems, and motivational factors. Moreover, background knowledge can significantly influence our reasoning performance. This influence can either be facilitation or an impedance of the reasoning process." (Carsten Held et al, "Mental Models and the Mind", 2006)

"Mental models abide by the principle of parsimony: They represent only possibilities compatible with the premises, and they represent clauses in the premises only when they hold in a possibility. Fully explicit models represent clauses when they do not hold too. The advantage of mental models over fully explicit models is that they contain less information, and so they are easier to work with. But they can lead reasoners astray. The occurrence of these systematic and compelling fallacies is shocking. The model theory predicts them, and they are a 'litmus' test for mental models, because no other current theory predicts them. They have so far resisted explanation by theories of reasoning based on formal rules of inference, because these theories rely on valid rules." (Philip N Johnson-Laird," Mental Models, Sentential Reasoning, and Illusory Inferences", [in "Mental Models and the Mind"], 2006)

"In mathematics, the first principles are called axioms, and the rules are referred to as deduction/inference rules. A proof is a series of steps based on the (adopted) axioms and deduction rules which reaches a desired conclusion. Every step in a proof can be checked for correctness by examining it to ensure that it is logically sound." (Cristian S Calude et al, "Proving and Programming", 2007)

"[…] statistics is the key discipline for predicting the future or for making inferences about the unknown, or for producing convenient summaries of data." (David J Hand, "Statistics: A Very Short Introduction", 2008)

"Case-based reasoning is a paradigm in machine learning whose idea is that a new problem can be solved by noticing its similarity to a set of problems previously solved. Case-based reasoning regards the inference of some proper conclusions related to a new situation by the analysis of similar cases from a memory of previous cases. Very often similarity between two objects is expressed on a graded scale and this justifies application of fuzzy sets in this context." (Salvatore Greco et al, "Granular Computing and Data Mining for Ordered Data: The Dominance-Based Rough Set Approach", 2009)

"Causal inference is different, because a change in the system is contemplated - an intervention. Descriptive statistics tell you about the data that you happen to have. Causal models claim to tell you what will happen to some of the numbers if you intervene to change other numbers." (David A. Freedman, "Statistical Models: Theory and Practice", 2009)

"Traditional statistics is strong in devising ways of describing data and inferring distributional parameters from sample. Causal inference requires two additional ingredients: a science-friendly language for articulating causal knowledge, and a mathematical machinery for processing that knowledge, combining it with data and drawing new causal conclusions about a phenomenon." (Judea Pearl, "Causal inference in statistics: An overview", Statistics Surveys 3, 2009)

On Inferences (1975-1999)

"[Fuzzy logic is] a logic whose distinguishing features are (1) fuzzy truth-values expressed in linguistic terms, e. g., true, very true, more or less true, or somewhat true, false, nor very true and not very false, etc.; (2) imprecise truth tables; and (3) rules of inference whose validity is relative to a context rather than exact." (Lotfi A. Zadeh, "Fuzzy logic and approximate reasoning", 1975)

"Pencil and paper for construction of distributions, scatter diagrams, and run-charts to compare small groups and to detect trends are more efficient methods of estimation than statistical inference that depends on variances and standard errors, as the simple techniques preserve the information in the original data." (W Edwards Deming, "On Probability as Basis for Action", American Statistician, Volume 29, Number 4, November 1975)

"The treatment of the economy as a single system, to be controlled toward a consistent goal, allowed the efficient systematization of enormous information material, its deep analysis for valid decision-making. It is interesting that many inferences remain valid even in cases when this consistent goal could not be formulated, either for the reason that it was not quite clear or for the reason that it was made up of multiple goals, each of which to be taken into account." (Leonid V Kantorovich, "Mathematics in Economics: Achievements, Difficulties, Perspectives", [Nobel lecture] 1975)

"The advantage of semantic networks over standard logic is that some selected set of the possible inferences can be made in a specialized and efficient way. If these correspond to the inferences that people make naturally, then the system will be able to do a more natural sort of reasoning than can be easily achieved using formal logical deduction." (Avron Barr, Natural Language Understanding, AI Magazine Vol. 1 (1), 1980)

"Deduction is typically distinguished from induction by the fact that only for the former is the truth of an inference guaranteed by the truth of the premises on which it is based. The fact that an inference is a valid deduction, however, is no guarantee that it is of the slightest interest." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"Our approach assumes that the central problem of induction is to specify processing constraints that will ensure that the inferences drawn by a cognitive system will tend to be plausible and relevant to the system's goals. Which inductions should be characterized as plausible can be determined only with reference to the current knowledge of the system. Induction is thus highly context dependent, being guided by prior knowledge activated in particular situations that confront the system as it seeks to achieve its goals. The study of induction, then, is the study of how knowledge is modified through its use." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"Models are often used to decide issues in situations marked by uncertainty. However statistical differences from data depend on assumptions about the process which generated these data. If the assumptions do not hold, the inferences may not be reliable either. This limitation is often ignored by applied workers who fail to identify crucial assumptions or subject them to any kind of empirical testing. In such circumstances, using statistical procedures may only compound the uncertainty." (David A Greedman & William C Navidi, "Regression Models for Adjusting the 1980 Census", Statistical Science Vol. 1 (1), 1986)

"It is difficult to distinguish deduction from what in other circumstances is called problem-solving. And concept learning, inference, and reasoning by analogy are all instances of inductive reasoning. (Detectives typically induce, rather than deduce.) None of these things can be done separately from each other, or from anything else. They are pseudo-categories." (Frank Smith, "To Think: In Language, Learning and Education", 1990)

"[…] semantic nets fail to be distinctive in the way they (1) represent propositions, (2) cluster information for access, (3) handle property inheritance, and (4) handle general inference; in other words, they lack distinctive representational properties (i.e., 1) and distinctive computational properties (i.e., 2-4). Certain propagation mechanisms, notably 'spreading activation', 'intersection search', or 'inference propagation' have sometimes been regarded as earmarks of semantic nets, but since most extant semantic nets lack such mechanisms, they cannot be considered criterial in current usage." (Lenhart K Schubert, "Semantic Nets are in the Eye of the Beholder", 1990)

"This absolutist view of mathematical knowledge is based on two types of assumptions: those of mathematics, concerning the assumption of axioms and definitions, and those of logic concerning the assumption of axioms, rules of inference and the formal language and its syntax. These are local or micro-assumptions. There is also the possibility of global or macro-assumptions, such as whether logical deduction suffices to establish all mathematical truths." (Paul Ernest, "The Philosophy of Mathematics Education", 1991)

"The essential idea of semantic networks is that the graph-theoretic structure of relations and abstractions can be used for inference as well as understanding. […] A semantic network is a discrete structure as is any linguistic description. Representation of the continuous 'outside world' with such a structure is necessarily incomplete, and requires decisions as to which information is kept and which is lost." (Fritz Lehman, "Semantic Networks",  Computers & Mathematics with Applications Vol. 23 (2-5), 1992)

"Virtually all mathematical theorems are assertions about the existence or nonexistence of certain entities. For example, theorems assert the existence of a solution to a differential equation, a root of a polynomial, or the nonexistence of an algorithm for the Halting Problem. A platonist is one who believes that these objects enjoy a real existence in some mystical realm beyond space and time. To such a person, a mathematician is like an explorer who discovers already existing things. On the other hand, a formalist is one who feels we construct these objects by our rules of logical inference, and that until we actually produce a chain of reasoning leading to one of these objects they have no meaningful existence, at all." (John L Casti, "Reality Rules: Picturing the world in mathematics" Vol. II, 1992)

"When the distributions of two or more groups of univariate data are skewed, it is common to have the spread increase monotonically with location. This behavior is monotone spread. Strictly speaking, monotone spread includes the case where the spread decreases monotonically with location, but such a decrease is much less common for raw data. Monotone spread, as with skewness, adds to the difficulty of data analysis. For example, it means that we cannot fit just location estimates to produce homogeneous residuals; we must fit spread estimates as well. Furthermore, the distributions cannot be compared by a number of standard methods of probabilistic inference that are based on an assumption of equal spreads; the standard t-test is one example. Fortunately, remedies for skewness can cure monotone spread as well." (William S Cleveland, "Visualizing Data", 1993)

"The science of statistics may be described as exploring, analyzing and summarizing data; designing or choosing appropriate ways of collecting data and extracting information from them; and communicating that information. Statistics also involves constructing and testing models for describing chance phenomena. These models can be used as a basis for making inferences and drawing conclusions and, finally, perhaps for making decisions." (Fergus Daly et al, "Elements of Statistics", 1995)

"Fuzzy systems are excellent tools for representing heuristic, commonsense rules. Fuzzy inference methods apply these rules to data and infer a solution. Neural networks are very efficient at learning heuristics from data. They are 'good problem solvers' when past data are available. Both fuzzy systems and neural networks are universal approximators in a sense, that is, for a given continuous objective function there will be a fuzzy system and a neural network which approximate it to any degree of accuracy." (Nikola K Kasabov, "Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering", 1996)

"Theories rarely arise as patient inferences forced by accumulated facts. Theories are mental constructs potentiated by complex external prods (including, in idealized cases, a commanding push from empirical reality)." (Stephen J Gould, "Leonardo's Mountain of Clams and the Diet of Worms", 1998)

"Whereas formal systems apply inference rules to logical variables, neural networks apply evolutive principles to numerical variables. Instead of calculating a solution, the network settles into a condition that satisfies the constraints imposed on it." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Let us regard a proof of an assertion as a purely mechanical procedure using precise rules of inference starting with a few unassailable axioms. This means that an algorithm can be devised for testing the validity of an alleged proof simply by checking the successive steps of the argument; the rules of inference constitute an algorithm for generating all the statements that can be deduced in a finite number of steps from the axioms." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"[…] philosophical theories are structured by conceptual metaphors that constrain which inferences can be drawn within that philosophical theory. The (typically unconscious) conceptual metaphors that are constitutive of a philosophical theory have the causal effect of constraining how you can reason within that philosophical framework." (George Lakoff, "Philosophy in the Flesh: The Embodied Mind and its Challenge to Western Thought", 1999)

On Inference (2010-2019)

"A second approach to statistical inference is estimation, which focuses on finding the best point estimate of the population parameter that’s of greatest interest; it also gives an interval estimate of that parameter, to signal how close our point estimate is likely to be to the population value." (Geoff Cumming, "Understanding the New Statistics", 2012)

"Regression analysis, like all forms of statistical inference, is designed to offer us insights into the world around us. We seek patterns that will hold true for the larger population. However, our results are valid only for a population that is similar to the sample on which the analysis has been done." (Charles Wheelan, "Naked Statistics: Stripping the Dread from the Data", 2012)

"Statistical inference is the drawing of conclusions about the world (more specifically: about some population) from our sample data." (Geoff Cumming, "Understanding the New Statistics", 2012)

"The four questions of data analysis are the questions of description, probability, inference, and homogeneity. [...] Descriptive statistics are built on the assumption that we can use a single value to characterize a single property for a single universe. […] Probability theory is focused on what happens to samples drawn from a known universe. If the data happen to come from different sources, then there are multiple universes with different probability models.  [...] Statistical inference assumes that you have a sample that is known to have come from one universe." (Donald J Wheeler," Myths About Data Analysis", International Lean & Six Sigma Conference, 2012)

"Mental models represent possibilities, and the theory of mental models postulates three systems of mental processes underlying inference: (0) the construction of an intentional representation of a premise’s meaning – a process guided by a parser; (1) the building of an initial mental model from the intension, and the drawing of a conclusion based on heuristics and the model; and (2) on some occasions, the search for alternative models, such as a counterexample in which the conclusion is false. System 0 is linguistic, and it may be autonomous. System 1 is rapid and prone to systematic errors, because it makes no use of a working memory for intermediate results. System 2 has access to working memory, and so it can carry out recursive processes, such as the construction of alternative models." (Sangeet Khemlania & P.N. Johnson-Laird, "The processes of inference", Argument and Computation, 2012)

"The true foundations of mathematics do not lie in axioms, definitions, and logical inference, which are the foundational elements of formal mathematics. The true foundations of mathematics lie in the minds of mathematicians as they interact with and try to make sense of their world - in their ideas, their intuitions, and their aesthetic sensibility." (William Byers, "Deep Thinking: What Mathematics Can Teach Us About the Mind", 2015)

"Again, classical statistics only summarizes data, so it does not provide even a language for asking [a counterfactual] question. Causal inference provides a notation and, more importantly, offers a solution. As with predicting the effect of interventions [...], in many cases we can emulate human retrospective thinking with an algorithm that takes what we know about the observed world and produces an answer about the counterfactual world." (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)

On Inferences (1950-1974)

"The study of inductive inference belongs to the theory of probability, since observational facts can make a theory only probable but will never make it absolutely certain." (Hans Reichenbach, "The Rise of Scientific Philosophy", 1951)

"The technical analysis of any large collection of data is a task for a highly trained and expensive man who knows the mathematical theory of statistics inside and out. Otherwise the outcome is likely to be a collection of drawings - quartered pies, cute little battleships, and tapering rows of sturdy soldiers in diversified uniforms - interesting enough in the colored Sunday supplement, but hardly the sort of thing from which to draw reliable inferences." (Eric T Bell, "Mathematics: Queen and Servant of Science", 1951)

"Statistics is the name for that science and art which deals with uncertain inferences - which uses numbers to find out something about nature and experience." (Warren Weaver, 1952)

"From the outset it was clear that the two kinds of reasoning have different tasks. From the outset. they appeared very different: demonstrative reasoning as definite, final, 'machinelike'; and plausible reasoning as vague, provisional, specifically 'human'. Now we may see the difference a little more distinctly. In opposition to demonstrative inference, plausible inference leaves indeterminate a highly relevant point: the 'strength' or the 'weight' of the conclusion. This weight may depend not only on clarified grounds such as those expressed in the premises, hut also on unclarified unexpressed grounds somewhere on the background of the person who draws the conclusion. A person has a background, a machine has not. Indeed, you can build a machine to draw demonstrative conclusions for you, but I think you can never build a machine that will draw plausible inferences." (George Pólya, "Mathematics and Plausible Reasoning", 1954)

"The result of the mathematician's creative work is demonstrative reasoning, a proof; but the proof is discovered by plausible reasoning, by guessing. If the learning of mathematics reflects to any degree the invention of mathematics, it must have a place for guessing, for plausible inference." (George Pólya, "Induction and Analogy in Mathematics", 1954)

"The heart of all major discoveries in the physical sciences is the discovery of novel methods of representation and so of fresh techniques by which inferences can be drawn - and drawn in ways which fit the phenomena under investigation." (Stephen Toulmin, "The Philosophy of Science", 1957)

"The first [principle], is that a mathematical theory can only he developed axiomatically in a fruitful way when the student has already acquired some familiarity with the corresponding material - a familiarity gained by working long enough with it on a kind of experimental, or semiexperimental basis, i.e. with constant appeal to intuition. The other principle [...]  is that when logical inference is introduced in some mathematical question, it should always he presented with absolute honesty - that is, without trying to hide gaps or flaws in the argument; any other way, in my opinion, is worse than giving no proof at all." (Jean Dieudonné, "Thinking in School Mathematics", 1961)

"Statistics is that branch of mathematics which deals with the accumulation and analysis of quantitative data. There are three principal subdivisions in the field of statistics but these overlap, more often than not, in actual practice. First, inference from samples to population by means of probability is called statistical inference. Second, descriptive statistics is defined as the characterization and summarization of a given set of data without direct reference to inference. And finally, sampling statistics deals with methods of obtaining samples for statistical inference." (David B MacNeil, "Modern Mathematics for the Practical Man", 1963)

"A mathematical proof, as usually written down, is a sequence of expressions in the state space. But we may also think of the proof as consisting of the sequence of justifications of consecutive proof steps - i.e., the references to axioms, previously-proved theorems, and rules of inference that legitimize the writing down of the proof steps. From this point of view, the proof is a sequence of actions (applications of rules of inference) that, operating initially on the axioms, transform them into the desired theorem." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)

"Inductive inference is the only process known to us by which essential new knowledge comes into the world." (Sir Ronald A Fisher, "The Design of Experiments", 1971)

"Probability theory, for us, is not so much a part of mathematics as a part of logic, inductive logic, really. It provides a consistent framework for reasoning about statements whose correctness or incorrectness cannot be deduced from the hypothesis. The information available is sufficient only to make the inferences 'plausible' to a greater or lesser extent.". (Ralph Baierlein, "Atoms and Information Theory: An Introduction to Statistical Mechanics", 1971)

"The statistician cannot excuse himself from the duty of getting his head clear on the principles of scientific inference, but equally no other thinking man can avoid a like obligation." (Sir Ronald A Fisher, "The Design of Experiments", 1971)

"[...] we will adopt the broad view and will take 'probability', to be a quantitative relation, between a hypothesis and an inference, corresponding to the degree of rational belief in the correctness of the inference, given the hypothesis. The hypothesis is the information we possess, or assume for the sake of argument. The inference is a statement that, to a greater or lesser extent, is justified by the hypothesis. Thus 'the probability' of an inference, given a hypothesis, is the degree of rational belief in the correctness of the inference, given the hypothesis." (Ralph Baierlein, "Atoms and Information Theory: An Introduction to Statistical Mechanics", 1971)

"An analogy is a relationship between two entities, processes, or what you will, which allows inferences to be made about one of the things, usually that about which we know least, on the basis of what we know about the other. […] The art of using analogy is to balance up what we know of the likenesses against the unlikenesses between two things, and then on the basis of this balance make an inference as to what is called the neutral analogy, that about which we do not know." (Rom Harré," The Philosophies of Science", 1972)

"The process [of statistical analysis] usually begins by the postulating of a model worthy to be tentatively entertained. The data analyst will have arrived at this tentative model in cooperation with the scientific investigator. They will choose it 'So that, in the light of the then available knowledge, it best takes account of relevant phenomena in the simplest way possible. it will usually contain unknown parameters. Given the data the analyst can now make statistical inferences about the parameters conditional on the correctness of this first tentative model. These inferences form part of the conditional analysis. If the model is correct, they provide all there is to know about the problem under study, given the data." (George E P Box & George C Tjao, "Bayesian Inference in Statistical Analysis", 1973)

"[…] it is not enough to say: 'There's error in the data and therefore the study must be terribly dubious'. A good critic and data analyst must do more: he or she must also show how the error in the measurement or the analysis affects the inferences made on the basis of that data and analysis." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

On Inferences (1900-1949)

"I may as well say at once that I do not distinguish between inference and deduction. What is called induction appears to me to be either disguised deduction or a mere method of making plausible guesses." (Bertrand Russell, "Principles of Mathematics", 1903)

"A theorem […] is an inference obtained by constructing a diagram according to a general precept, and after modifying it as ingenuity may dictate, observing in it certain relations, and showing that they must subsist in every case, retranslating the proposition into general terms." (Charles S Peirce, "New Elements" ["Kaina stoiceia"], 1904)

"The type of reasoning found in mathematics seems thus not only available but essentially interwoven with every inference in non-mathematical reasoning, being always used in one of its two steps ; facility in making the other step, the more difficult one, must be attained through other than purely mathematical training." (Jacob W A Young, "The Teaching of Mathematics", 1907)

"It is experience which has given us our first real knowledge of Nature and her laws. It is experience, in the shape of observation and experiment, which has given us the raw material out of which hypothesis and inference have slowly elaborated that richer conception of the material world which constitutes perhaps the chief, and certainly the most characteristic, glory of the modern mind." (Arthur J Balfour, "The Foundations of Belief", 1912)

"The ends to be attained [in mathematical teaching] are the knowledge of a body of geometrical truths to be used. In the discovery of new truths, the power to draw correct inferences from given premises, the power to use algebraic processes as a means of finding results in practical problems, and the awakening of interest In the science of mathematics." (J Craig, "A Course of Study for the Preparation of Rural School Teachers", 1912)

"Whenever possible, substitute constructions out of known entities for inferences to unknown entities." (Bertrand Russell, 1924)

"Hypothesis, however, is an inference based on knowledge which is insufficient to prove its high probability." (Frederick L Barry, "The Scientific Habit of Thought", 1927) 

"The development of mathematics toward greater precision has led, as is well known, to the formalization of large tracts of it, so that one can prove any theorem using nothing but a few mechanical rules. [...] One might therefore conjecture that these axioms and rules of inference are sufficient to decide any mathematical question that can at all be formally expressed in these systems. It will be shown below that this is not the case, that on the contrary there are in the two systems mentioned relatively simple problems in the theory of integers that cannot be decided on the basis of the axioms." (Kurt Gödel, "On Formally Undecidable Propositions of Principia Mathematica and Related Systems", 1931)

"An inference, if it is to have scientific value, must constitute a prediction concerning future data. If the inference is to be made purely with the help of the distribution theory of statistics, the experiments that constitute evidence for the inference must arise from a state of statistical control; until that state is reached, there is no universe, normal or otherwise, and the statistician’s calculations by themselves are an illusion if not a delusion. The fact is that when distribution theory is not applicable for lack of control, any inference, statistical or otherwise, is little better than a conjecture. The state of statistical control is therefore the goal of all experimentation." (William E Deming, "Statistical Method from the Viewpoint of Quality Control", 1939)

"An observation, strictly, is only a sensation. Nobody means that we should reject everything but sensations. But as soon as we go beyond sensations we are making inferences." (Sir Harold Jeffreys, "Theory of Probability", 1939)

"Inference by analogy appears to be the most common kind of conclusion, and it is possibly the most essential kind. It yields more or less plausible conjectures which may or may not be confirmed by experience and stricter reasoning." (George Pólya, "How to Solve It", 1945)

"If the chance of error alone were the sole basis for evaluating methods of inference, we would never reach a decision, but would merely keep increasing the sample size indefinitely." (C West Churchman, "Theory of Experimental Inference", 1948)

On Inferences (-1899)

"Analysis is the obtaining of the thing sought by assuming it and so reasoning up to an admitted truth; synthesis is the obtaining of the thing sought by reasoning up to the inference and proof of it." (Eudoxus, cca. 4th century BC)

"Every stage of science has its train of practical applications and systematic inferences, arising both from the demands of convenience and curiosity, and from the pleasure which, as we have already said, ingenious and active-minded men feel in exercising the process of deduction." (William Whewell, "The Philosophy of the Inductive Sciences Founded Upon Their History", 1840)

"There is in every step of an arithmetical or algebraical calculation a real induction, a real inference from facts to facts, and what disguises the induction is simply its comprehensive nature, and the consequent extreme generality of its language." (John S Mill, "A System of Logic, Ratiocinative and Inductive", 1843)

"A mere inference or theory must give way to a truth revealed; but a scientific truth must be maintained, however contradictory it may appear to the most cherished doctrines of religion." (David Brewster, "More Worlds Than One: The Creed of the Philosopher and the Hope of the Christian", 1856)

"Truths are known to us in two ways: some are known directly, and of themselves; some through the medium of other truths. The former are the subject of Intuition, or Consciousness; the latter, of Inference; the latter of Inference. The truths known by Intuition are the original premises, from which all others are inferred." (John S Mill, "A System of Logic, Ratiocinative and Inductive", 1858)

"And first, it is necessary to distinguish from true inductions, certain operations which are often improperly called by that name. A true induction is a process of inference - it proceeds from the known to the unknown; and whenever any operation contains no inference, it is not  really an induction. And yet most of the examples given in the common  works on logic, as examples of induction, are of this character." (George R Drysdale, "Logic and Utility: The tests of truth and falsehood, and of right and wrong", 1866)

"It must be the ground of all reasoning and inference that what is true of one thing will be true of its equivalent, and that under carefully ascertained conditions Nature repeats herself." (William S Jevons, "The Principles of Science: A Treatise on Logic and Scientific Method", 1874)

"Economic science is but the working of common sense aided by appliances of organized analysis and general reasoning, which facilitate the task of collecting, arranging, and drawing inferences from particular facts. Though its scope is always limited, though its work without the aid of common sense is vain, yet it enables common sense to go further in difficult problems than would otherwise be possible." (Alfred Marshall, "Principles of Economics", 1890)

25 July 2021

Larry A Wasserman - Collected Quotes

"A smaller model with fewer covariates has two advantages: it might give better predictions than a big model and it is more parsimonious (simpler). Generally, as you add more variables to a regression, the bias of the predictions decreases and the variance increases. Too few covariates yields high bias; this called underfitting. Too many covariates yields high variance; this called overfitting. Good predictions result from achieving a good balance between bias and variance. […] fiding a good model involves trading of fit and complexity." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Bayesian inference is a controversial approach because it inherently embraces a subjective notion of probability. In general, Bayesian methods provide no guarantees on long run performance." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Bayesian inference is appealing when prior information is available since Bayes’ theorem is a natural way to combine prior information with data. Some people find Bayesian inference psychologically appealing because it allows us to make probability statements about parameters. […] In parametric models, with large samples, Bayesian and frequentist methods give approximately the same inferences. In general, they need not agree." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Inequalities are useful for bounding quantities that might otherwise be hard to compute." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Probability is a mathematical language for quantifying uncertainty." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Statistical inference, or 'learning' as it is called in computer science, is the process of using data to infer the distribution that generated the data." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"[…] studying methods for parametric models is useful for two reasons. First, there are some cases where background knowledge suggests that a parametric model provides a reasonable approximation. […] Second, the inferential concepts for parametric models provide background for understanding certain nonparametric methods." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"The Bayesian approach is based on the following postulates: (B1) Probability describes degree of belief, not limiting frequency. As such, we can make probability statements about lots of things, not just data which are subject to random variation. […] (B2) We can make probability statements about parameters, even though they are fixed constants. (B3) We make inferences about a parameter θ by producing a probability distribution for θ. Inferences, such as point estimates and interval estimates, may then be extracted from this distribution." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"The frequentist point of view is based on the following postulates: (F1) Probability refers to limiting relative frequencies. Probabilities are objective properties of the real world. (F2) Parameters are i xed, unknown constants. Because they are not fluctuating, no useful probability statements can be made about parameters. (F3) Statistical procedures should be designed to have well-defined long run frequency properties." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"The important thing is to understand that frequentist and Bayesian methods are answering different questions. To combine prior beliefs with data in a principled way, use Bayesian inference. To construct procedures with guaranteed long run performance, such as confidence intervals, use frequentist methods. Generally, Bayesian methods run into problems when the parameter space is high dimensional." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004) 

"The most important aspect of probability theory concerns the behavior of sequences of random variables." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"There is a tendency to use hypothesis testing methods even when they are not appropriate. Often, estimation and confidence intervals are better tools. Use hypothesis testing only when you want to test a well-defined hypothesis." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Things are changing. Statisticians now recognize that computer scientists are making novel contributions while computer scientists now recognize the generality of statistical theory and methodology. Clever data mining algorithms are more scalable than statisticians ever thought possible. Formal statistical theory is more pervasive than computer scientists had realized." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Undirected graphs are an alternative to directed graphs for representing independence relations. Since both directed and undirected graphs are used in practice, it is a good idea to be facile with both. The main difference between the two is that the rules for reading independence relations from the graph are different." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

01 June 2021

On Syllogism I

"The Syllogism consists of propositions, propositions consist of words, words are symbols of notions. Therefore if the notions themselves (which is the root of the matter) are confused and over-hastily abstracted from the facts, there can be no firmness in the superstructure. Our only hope therefore lies in a true induction." (Francis Bacon, The New Organon, 1620)

"[…] mathematics is not, never was, and never will be, anything more than a particular kind of language, a sort of shorthand of thought and reasoning. The purpose of it is to cut across the complicated meanderings of long trains of reasoning with a bold rapidity that is unknown to the mediaeval slowness of the syllogisms expressed in our words." (Charles Nordmann, "Einstein and the Universe", 1922)

"Knowledge is ours only if, at the moment of need, it offers itself to the mind without syllogisms or demonstrations for which there is no time." (André Maurois, "Un Art de Vivre" ["The Art of Living"], 1939)

"A serious threat to the very life of science is implied in the assertion that mathematics is nothing but a system of conclusions drawn from definitions and postulates that must be consistent but otherwise may be created by the free will of the mathematician. If this description were accurate, mathematics could not attract any intelligent person. It would be a game with definitions, rules and syllogisms, without motivation or goal." (Richard Courant & Herbert Robbins, "What Is Mathematics?", 1941)

"The construction of hypotheses is a creative act of inspiration, intuition, invention; its essence is the vision of something new in familiar material. The process must be discussed in psychological, not logical, categories; studied in autobiographies and biographies, not treatises on scientific method; and promoted by maxim and example, not syllogism or theorem." (Milton Friedman, "Essays in Positive Economics", 1953)

"[…] the distinction between rigorous thinking and more vague ‘imaginings’; even in mathematics itself, all is not a question of rigor, but rather, at the start, of reasoned intuition and imagination, and, also, repeated guessing. After all, most thinking is a synthesis or juxtaposition of advances along a line of syllogisms - perhaps in a continuous and persistent ‘forward'’ movement, with searching, so to speak ‘sideways’, in directions which are not necessarily present from the very beginning and which I describe as ‘sending out exploratory patrols’ and trying alternative routes." (Stanislaw M Ulam, "Adventures of a Mathematician", 1976)

"Since mental models can take many forms and serve many purposes, their contents are very varied. They can contain nothing but tokens that represent individuals and identities between them, as in the sorts of models that are required for syllogistic reasoning. They can represent spatial relations between entities, and the temporal or causal relations between events. A rich imaginary model of the world can be used to compute the projective relations required for an image. Models have a content and form that fits them to their purpose, whether it be to explain, to predict, or to control." (Philip Johnson-Laird, "Mental models: Toward a cognitive science of language, inference, and consciousness", 1983)

"Whenever I have talked about mental models, audiences have readily grasped that a layout of concrete objects can be represented by an internal spatial array, that a syllogism can be represented by a model of individuals and identities between them, and that a physical process can be represented by a three-dimensional dynamic model. Many people, however, have been puzzled by the representation of abstract discourse; they cannot understand how terms denoting abstract entities, properties or relations can be similarly encoded, and therefore they argue that these terms can have only 'verbal' or propositional representations." (Philip Johnson-Laird, "Mental Models: Towards a Cognitive Science of Language, Inference and Consciousness", 1983)

"Formal logic and the logical syllogism encapsulate connectedness in reasoning." (Marshall McLuhan & Eric McLuhan, "Laws of Media: The New Science", 1988)

"Metaphorizing is a manner of thinking, not a property of thinking. It is a capacity of thought, not its quality. It represents a mental operation by which a previously existing entity is described in the characteristics of another one on the basis of some similarity or by reasoning. When we say that something is (like) something else, we have already performed a mental operation. This operation includes elements such as comparison, paralleling and shaping of the new image by ignoring its less satisfactory traits in order that this image obtains an aesthetic value. By this process, for an instant we invent a device, which serves as the pole vault for the comparison’s jump. Once the jump is made the pole vault is removed. This device could be a lightning-speed logical syllogism, or a momentary created term, which successfully merges the traits of the compared objects." (Ivan Mladenov, "Conceptualizing Metaphors: On Charles Peirce’s marginalia", 2006)

04 February 2021

On Deduction (1800-1849)

"One very reprehensible mode of theory-making consists, after honest deductions from a few facts have been made, in torturing other facts to suit the end proposed, in omitting some, and in making use of any authority that may lend assistance to the object desired; while all those which militate against it are carefully put on one side or doubted." (Henry De la Beche, "Sections and Views, Illustrative of Geological Phaenomena", 1830)

"Facts [...] are not truths; they are not conclusions; they are not even premises, but in the nature and parts of premises. The truth depends on, and is only arrived at, by a legitimate deduction from all the facts which are truly material." (Samuel T Coleridge, "The Table Talk and Omniana of Samuel Taylor Coleridge", 1831)

"The deduction of effect from cause is often blocked by some insuperable extrinsic obstacle: the true causes may be quite unknown." (Carl von Clausewitz, "On War", 1832)

"Physical astronomy is the science which compares and identifies the laws of motion observed on earth with the motions that take place in the heavens; and which traces, by an uninterrupted chain of deduction from the great principle that governs the universe, the revolutions and rotations of the planets, and the oscillations of the fluids at their surfaces; and which estimates the changes the system has hitherto undergone, or may hereafter experience - changes which require millions of years for their accomplishment." (Mary Somerville, "The Connection of the Physical Sciences", 1834)

"Every stage of science has its train of practical applications and systematic inferences, arising both from the demands of convenience and curiosity, and from the pleasure which, as we have already said, ingenious and active-minded men feel in exercising the process of deduction." (William Whewell, "The Philosophy of the Inductive Sciences Founded Upon Their History", 1840)

"These sciences, Geometry, Theoretical Arithmetic and Algebra, have no principles besides definitions and axioms, and no process of proof but deduction; this process, however, assuming a most remarkable character; and exhibiting a combination of simplicity and complexity, of rigour and generality, quite unparalleled in other subjects." (William Whewell, "The Philosophy of the Inductive Sciences", 1840)

