BOOLEAN ON MATH WORLD
Rebecca Ruston
Los Angeles, CA
June 15, 2009
Summary/Abstract:
Boolean algebra, as developed in 1854 by George Boole in his book An Investigation of the Laws of Thought,[1] is a variant of ordinary elementary algebra differing in its values, operations, and laws. Instead of the usual algebra of numbers, Boolean algebra is the algebra of the truth values 0 and 1, or equivalently of subsets of a given set. The operations are usually taken to be conjunction ∧, disjunction ∨, and negation ¬, with constants 0 and 1. The laws are definable as those equations that hold for all values of their variables, for example x∨(y∧x) = x. Applications include mathematical logic, digital logic, computer programming, set theory, and statistics.
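Because a Boolean law is an equation that holds under every assignment of 0 and 1 to its variables, the example above can be checked mechanically. The Python sketch below is only an illustration; the helper name is_law is an invented convenience, not a standard function.

    from itertools import product

    # A Boolean law holds for every assignment of 0/1 to its variables,
    # so a law over two variables can be verified by checking all four cases.
    def is_law(lhs, rhs, num_vars):
        return all(lhs(*vals) == rhs(*vals) for vals in product((0, 1), repeat=num_vars))

    # The example law x OR (y AND x) = x:
    print(is_law(lambda x, y: x | (y & x),
                 lambda x, y: x,
                 num_vars=2))  # prints True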
In the 1930s, while studying switching circuits, Claude Shannon observed that one could also apply the rules of Boole's algebra in this setting, and he introduced switching algebra as a way to analyze and design circuits by algebraic means in terms of logic gates. Shannon already had at his disposal the abstract mathematical apparatus, thus he cast his switching algebra as the two-element Boolean algebra. In circuit engineering settings today, there is little need to consider other Boolean algebras, thus "switching algebra" and "Boolean algebra" are often used interchangeably.
Logic sentences that can be expressed in classical propositional calculus have an equivalent expression in Boolean algebra. Thus, Boolean logic is sometimes used to denote propositional calculus performed in this way. Boolean algebra is not sufficient to capture logic formulas using quantifiers, like those from first order logic. Although the development of mathematical logic did not follow Boole's program, the connection between his algebra and logic was later put on firm ground in the setting of algebraic logic, which also studies the algebraic systems of many other logics.[3] The problem of determining whether the variables of a given Boolean (propositional) formula can be assigned in such a way as to make the formula evaluate to true is called the Boolean satisfiability problem (SAT), and is of importance to theoretical computer science, being the first problem shown to be NP-complete. The closely related model of computation known as a Boolean circuit relates time complexity (of an algorithm) to circuit complexity.
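As a rough sketch of what SAT asks (not an efficient solver; avoiding the exhaustive search below is exactly what makes the problem hard), a small formula can be checked for satisfiability by trying every assignment. The function name is_satisfiable and the example formula are illustrative choices.

    from itertools import product

    # Brute-force SAT check: try every 0/1 assignment to the variables.
    # The search is exponential in the number of variables.
    def is_satisfiable(formula, num_vars):
        return any(formula(*vals) for vals in product((0, 1), repeat=num_vars))

    # Example: (x OR y) AND (NOT x OR z) AND (NOT y) is satisfied by x=1, y=0, z=1.
    f = lambda x, y, z: (x or y) and ((not x) or z) and (not y)
    print(is_satisfiable(f, num_vars=3))  # prints True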
INTRODUCTION TO BOOLEAN
Boolean algebra was developed in the mid-1800s by George Boole, an English mathematician. It was found to be extremely useful for designing digital circuits, and it is still heavily used by electrical engineers and computer scientists. The techniques can model a logical system with a single equation. The equation can then be simplified and/or manipulated into new forms. The same techniques developed for circuit designers adapt very well to ladder logic programming.
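For example, an output described by X = (A AND B) OR (A AND NOT B) simplifies to X = A. The short check below, a sketch using Python only for demonstration, confirms the two forms agree on every combination of inputs.

    from itertools import product

    # The output X = (A AND B) OR (A AND NOT B) simplifies algebraically to X = A.
    # Exhaustively comparing both forms over all inputs confirms the simplification.
    original   = lambda a, b: (a and b) or (a and not b)
    simplified = lambda a, b: a

    print(all(bool(original(a, b)) == bool(simplified(a, b))
              for a, b in product((0, 1), repeat=2)))  # prints True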
Boolean equations consist of variables and operations and look very similar to normal algebraic equations. The three basic operators are AND, OR, and NOT; more complex operators include exclusive or (EOR, also written XOR), not and (NAND), and not or (NOR). Small truth tables for these functions are shown in the figure Boolean Operations with Truth Tables and Gates. Each operator is shown in a simple equation with the variables A and B being used to calculate a value for X. Truth tables are a simple (but bulky) method for showing all of the possible input combinations that will turn an output on or off; the short script below reproduces them.
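Since the referenced figure is not reproduced here, the tables can also be generated with a few lines of Python. This is a sketch assuming the standard definitions of the operators; the layout is just one reasonable choice.

    from itertools import product

    # Print compact truth tables for AND, OR, EOR (exclusive or), NAND, and NOR.
    ops = {
        "AND":  lambda a, b: a & b,
        "OR":   lambda a, b: a | b,
        "EOR":  lambda a, b: a ^ b,
        "NAND": lambda a, b: 1 - (a & b),
        "NOR":  lambda a, b: 1 - (a | b),
    }

    print("A B " + " ".join(f"{name:>4}" for name in ops))
    for a, b in product((0, 1), repeat=2):
        print(f"{a} {b} " + " ".join(f"{fn(a, b):>4}" for fn in ops.values()))

    # NOT takes a single input: NOT 0 = 1, NOT 1 = 0.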
It should be clearly understood that Boolean numbers are not the same as binary numbers. Whereas Boolean numbers represent an entirely different system of mathematics from real numbers, binary is nothing more than an alternative notation for real numbers.
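A two-line contrast makes the distinction concrete: binary addition carries into a new digit, while Boolean OR does not.

    # Binary arithmetic: 1 + 1 = 10 (two), because addition carries.
    print(bin(1 + 1))   # prints '0b10'
    # Boolean algebra: 1 OR 1 = 1, because OR only reports whether any input is true.
    print(1 | 1)        # prints 1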