24 February 2024

On Numbers: Binary Numbers

"[The information of a message can] be defined as the 'minimum number of binary decisions which enable the receiver to construct the message, on the basis of the data already available to him.' These data comprise both the convention regarding the symbols and the language used, and the knowledge available at the moment when the message started." (Dennis Gabor, "Optical transmission" in Information Theory : Papers Read at a Symposium on Information Theory, 1952)

"[...] there can be such a thing as a simple probabilistic system. For example, consider the tossing of a penny. Here is a perfectly simple system, but one which is notoriously unpredictable. It maybe described in terms of a binary decision process, with a built-in even probability between the two possible outcomes." (Stafford Beer, "Cybernetics and Management", 1959)

"Bivalence trades accuracy for simplicity. Binary outcomes of yes and no, white and black, true and false simplify math and computer processing. You can work with strings of 0s and 1s more easily than you can work with fractions. But bivalence requires some force fitting and rounding off [...] Bivalence holds at cube corners. Multivalence holds everywhere else." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Fuzziness has a formal name in science: multivalence. The opposite of fuzziness is bivalence or two-valuedness, two ways to answer each question, true or false, 1 or 0. Fuzziness means multivalence. It means three or more options, perhaps an infinite spectrum of options, instead of just two extremes. It means analog instead of binary, infinite shades of gray between black and white." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Somewhere in the process wishful thinking seems to take over. Scientists start to believe they do math when they do science. This holds to greatest degree in an advanced science like physics or at the theoretical frontier of any science where the claims come as math claims. The first victim is truth. What was inaccurate or fuzzy truth all along gets bumped up a letter grade to the all-or-none status of binary logic. Most scientists draw the line at giving up the tentative status of science. They will concede that it can all go otherwise in the next experiment. But most have crossed the bivalent line by this point and believe that in the next experiment a statement or hypothesis or theory may jump from TRUE to FALSE, from 1 to 0." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"The binary logic of modern computers often falls short when describing the vagueness of the real world. Fuzzy logic offers more graceful alternatives." (Bart Kosko & Satoru Isaka, "Fuzzy Logic,” Scientific American Vol. 269, 1993)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)

"Why should anyone know or care about binary numbers? One reason is that working with numbers in an unfamiliar base is an example of quantitative reasoning that might even improve understanding of how numbers work in good old base ten. Beyond that, it’s also important because the number of bits is usually related in some way to how much space, time or complexity is involved. And fundamentally, computers are worth understanding, and binary is central to their operation." (Brian W Kernighan, "Understanding the Digital World", 2017)
