21 April 2021

On Measurement (1970-1979)

"A mature science, with respect to the matter of errors in variables, is not one that measures its variables without error, for this is impossible. It is, rather, a science which properly manages its errors, controlling their magnitudes and correctly calculating their implications for substantive conclusions." (Otis D Duncan, "Introduction to Structural Equation Models", 1975)

"I find it more difficult, but also much more fun, to get the right answer by indirect reasoning and before all the evidence is in. It’s what a theoretician does in science. But the conclusions drawn in this way are obviously more risky than those drawn by direct measurement, and most scientists withhold judgment until there is more direct evidence available. The principal function of such detective work - apart from entertaining the theoretician - is probably to so annoy and enrage the observationalists that they are forced, in a fury of disbelief, to perform the critical measurements." (Carl Sagan, "The Cosmic Connection: An Extraterrestrial Perspective", 1975)

"Thinking in words, consciousness is behavior, experiment is measurement." (Celia Green, "The Decline and Fall of Science", 1976)

"Crude measurement usually yields misleading, even erroneous conclusions no matter how sophisticated a technique is used." (Henry T Reynolds, "Analysis of Nominal Data", 1977) 

"Numbers are the product of counting. Quantities are the product of measurement. This means that numbers can conceivably be accurate because there is a discontinuity between each integer and the next. Between two and three there is a jump. In the case of quantity there is no such jump, and because jump is missing in the world of quantity it is impossible for any quantity to be exact. You can have exactly three tomatoes. You can never have exactly three gallons of water. Always quantity is approximate." (Gregory Bateson, "Number is Different from Quantity", CoEvolution Quarterly, 1978)

"The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979)
