"[Fuzzy logic is] a logic whose distinguishing features are (1) fuzzy truth-values expressed in linguistic terms, e. g., true, very true, more or less true, or somewhat true, false, nor very true and not very false, etc.; (2) imprecise truth tables; and (3) rules of inference whose validity is relative to a context rather than exact." (Lotfi A. Zadeh, "Fuzzy logic and approximate reasoning", 1975)
"Pencil and paper for construction of distributions, scatter diagrams, and run-charts to compare small groups and to detect trends are more efficient methods of estimation than statistical inference that depends on variances and standard errors, as the simple techniques preserve the information in the original data." (W Edwards Deming, "On Probability as Basis for Action", American Statistician, Volume 29, Number 4, November 1975)
"The treatment of the economy as a single system, to be controlled toward a consistent goal, allowed the efficient systematization of enormous information material, its deep analysis for valid decision-making. It is interesting that many inferences remain valid even in cases when this consistent goal could not be formulated, either for the reason that it was not quite clear or for the reason that it was made up of multiple goals, each of which to be taken into account." (Leonid V Kantorovich, "Mathematics in Economics: Achievements, Difficulties, Perspectives", [Nobel lecture] 1975)
"The advantage of semantic networks over standard logic is that some selected set of the possible inferences can be made in a specialized and efficient way. If these correspond to the inferences that people make naturally, then the system will be able to do a more natural sort of reasoning than can be easily achieved using formal logical deduction." (Avron Barr, Natural Language Understanding, AI Magazine Vol. 1 (1), 1980)
"Models are often used to decide issues in situations marked by uncertainty. However statistical differences from data depend on assumptions about the process which generated these data. If the assumptions do not hold, the inferences may not be reliable either. This limitation is often ignored by applied workers who fail to identify crucial assumptions or subject them to any kind of empirical testing. In such circumstances, using statistical procedures may only compound the uncertainty." (David A Greedman & William C Navidi, "Regression Models for Adjusting the 1980 Census", Statistical Science Vol. 1 (1), 1986)
"It is difficult to distinguish deduction from what in other circumstances is called problem-solving. And concept learning, inference, and reasoning by analogy are all instances of inductive reasoning. (Detectives typically induce, rather than deduce.) None of these things can be done separately from each other, or from anything else. They are pseudo-categories." (Frank Smith, "To Think: In Language, Learning and Education", 1990)
"[…] semantic nets fail to be distinctive in the way they (1) represent propositions, (2) cluster information for access, (3) handle property inheritance, and (4) handle general inference; in other words, they lack distinctive representational properties (i.e., 1) and distinctive computational properties (i.e., 2-4). Certain propagation mechanisms, notably 'spreading activation', 'intersection search', or 'inference propagation' have sometimes been regarded as earmarks of semantic nets, but since most extant semantic nets lack such mechanisms, they cannot be considered criterial in current usage." (Lenhart K Schubert, "Semantic Nets are in the Eye of the Beholder", 1990)
"This absolutist view of mathematical knowledge is based on two types of assumptions: those of mathematics, concerning the assumption of axioms and definitions, and those of logic concerning the assumption of axioms, rules of inference and the formal language and its syntax. These are local or micro-assumptions. There is also the possibility of global or macro-assumptions, such as whether logical deduction suffices to establish all mathematical truths." (Paul Ernest, "The Philosophy of Mathematics Education", 1991)
"The essential idea of semantic networks is that the graph-theoretic structure of relations and abstractions can be used for inference as well as understanding. […] A semantic network is a discrete structure as is any linguistic description. Representation of the continuous 'outside world' with such a structure is necessarily incomplete, and requires decisions as to which information is kept and which is lost." (Fritz Lehman, "Semantic Networks", Computers & Mathematics with Applications Vol. 23 (2-5), 1992)
"Virtually all mathematical theorems are assertions about the existence or nonexistence of certain entities. For example, theorems assert the existence of a solution to a differential equation, a root of a polynomial, or the nonexistence of an algorithm for the Halting Problem. A platonist is one who believes that these objects enjoy a real existence in some mystical realm beyond space and time. To such a person, a mathematician is like an explorer who discovers already existing things. On the other hand, a formalist is one who feels we construct these objects by our rules of logical inference, and that until we actually produce a chain of reasoning leading to one of these objects they have no meaningful existence, at all."
"When the distributions of two or more groups of univariate data are skewed, it is common to have the spread increase monotonically with location. This behavior is monotone spread. Strictly speaking, monotone spread includes the case where the spread decreases monotonically with location, but such a decrease is much less common for raw data. Monotone spread, as with skewness, adds to the difficulty of data analysis. For example, it means that we cannot fit just location estimates to produce homogeneous residuals; we must fit spread estimates as well. Furthermore, the distributions cannot be compared by a number of standard methods of probabilistic inference that are based on an assumption of equal spreads; the standard t-test is one example. Fortunately, remedies for skewness can cure monotone spread as well." (William S Cleveland, "Visualizing Data", 1993)
"The science of statistics may be described as exploring, analyzing and summarizing data; designing or choosing appropriate ways of collecting data and extracting information from them; and communicating that information. Statistics also involves constructing and testing models for describing chance phenomena. These models can be used as a basis for making inferences and drawing conclusions and, finally, perhaps for making decisions." (Fergus Daly et al, "Elements of Statistics", 1995)
"Fuzzy systems are excellent tools for representing heuristic, commonsense rules. Fuzzy inference methods apply these rules to data and infer a solution. Neural networks are very efficient at learning heuristics from data. They are 'good problem solvers' when past data are available. Both fuzzy systems and neural networks are universal approximators in a sense, that is, for a given continuous objective function there will be a fuzzy system and a neural network which approximate it to any degree of accuracy." (Nikola K Kasabov, "Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering", 1996)
"Theories rarely arise as patient inferences forced by accumulated facts. Theories are mental constructs potentiated by complex external prods (including, in idealized cases, a commanding push from empirical reality)." (Stephen J Gould, "Leonardo's Mountain of Clams and the Diet of Worms", 1998)
"Whereas formal systems apply inference rules to logical variables, neural networks apply evolutive principles to numerical variables. Instead of calculating a solution, the network settles into a condition that satisfies the constraints imposed on it." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)
"Let us regard a proof of an assertion as a purely mechanical procedure using precise rules of inference starting with a few unassailable axioms. This means that an algorithm can be devised for testing the validity of an alleged proof simply by checking the successive steps of the argument; the rules of inference constitute an algorithm for generating all the statements that can be deduced in a finite number of steps from the axioms." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)
"[…] philosophical theories are structured by conceptual metaphors that constrain which inferences can be drawn within that philosophical theory. The (typically unconscious) conceptual metaphors that are constitutive of a philosophical theory have the causal effect of constraining how you can reason within that philosophical framework." (George Lakoff, "Philosophy in the Flesh: The Embodied Mind and its Challenge to Western Thought", 1999)