
Probability Rules

Gerolamo Cardano (1501-1576) wrote the first recorded definition of probability. Mlodinow (2009) paraphrased a modernized definition of probability from Cardano's work, Book on Games of Chance: Suppose a random process has many equally likely outcomes, some favorable (that is, winning), some unfavorable (losing). Then the probability of obtaining a favorable outcome is equal to the proportion of outcomes that are favorable (p. 50). Thus, probability is represented by a proportion.
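
As a concrete illustration of this definition (using dice, the games Cardano himself studied): a fair six-sided die has six equally likely outcomes, so if the favorable outcomes are rolling a $5$ or a $6$, the probability of a favorable outcome is the proportion $2/6 = 1/3$.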

Because probabilities are represented by proportions, writing down a given probability requires a number system that can express values between $0$ and $1$, and our system arose by combining favorable characteristics of various earlier number systems. The Arabs conquered many lands during the Dark Ages, interacting with many peoples and adopting elements of their numerical systems. The number system we use today has its origins in India, where the position of a digit carried meaning (place value), and Hindu mathematicians contributed the numerals along with an arithmetic and an algebra. With such a number system in place, whose decimal values allow numbers between $0$ and $1$, the notation needed to discuss probability theory could develop.

Gerolamo Cardano successfully calculated theoretical probabilities for tosses of one die, and even of two or three dice. However, he also attempted to generalize some probability rules to situations where they did not apply. Although he wasn't always correct in his calculations, he provided a starting point for other probabilists. One famous problem that Cardano attempted, but failed to solve, is the Problem of Points. In 1654, the Chevalier de Méré (1607-1684) posed the same problem to Blaise Pascal. Pascal consulted with Pierre de Fermat about the problem, and they both found ways to solve it. Their correspondence led them to discover more probability rules and helped popularize the discussion of probability problems in their time. The Problem of Points is still commonly discussed and modified today.
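
To give the flavor of the problem in a standard modern retelling (not the original wording): two players stake equal amounts on a fair game in which the first to win a set number of rounds takes the whole stake, and play is interrupted when one player needs $1$ more win and the other needs $2$. At most two more rounds would settle the match, and if both rounds were hypothetically played out, there are four equally likely outcomes, of which the first player wins the match in three. A fair division therefore awards that player $3/4$ of the stake and the opponent $1/4$, an enumeration of equally likely cases broadly in the spirit of Fermat's approach.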

Jacob Bernoulli (1655-1705) was a mathematician who came from a family of mathematicians and scientists. He compiled and addressed many new probability problems and ideas in his work Ars Conjectandi, published posthumously in 1713. Ars Conjectandi was split into four parts. Part I was a set of notes on Christiaan Huygens's Reasoning on Games of Chance, which included common probability prompts and some of their solutions. Part II discussed using permutations and combinations to answer probabilistic questions; this was more of a summary, because combinations were already widely used in his time to calculate probabilities, including with the aid of Pascal's triangle. Part III posed and suggested solutions to questions about games of chance; Stigler (2014) indicates that Bernoulli often used a strategy of reducing difficult problems to similar games of chance involving equally likely cases. Part IV was more philosophical in that it sought to apply probability principles to other areas of life, such as moral and economic issues.
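
A small example of the kind of combinatorial calculation treated in Part II (an illustration, not Bernoulli's own example): the probability of getting exactly two heads in four tosses of a fair coin is the number of favorable arrangements, $\binom{4}{2} = 6$, divided by the $2^4 = 16$ equally likely sequences, giving $6/16 = 3/8$; the value $\binom{4}{2}$ can be read directly from the corresponding row of Pascal's triangle.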

Cover of Ars Conjectandi

Thomas Bayes (1702-1761) was a clergyman who is famous for a common probability rule dealing with what was called inverse probability. This was a highly controversial subject, which Bayes studied but did not publish because of the backlash he expected his work would generate. It was commonly understood that one could use the occurrence of an event to modify the probability of the next random experiment, what is now called conditional probability; inverse probability, in contrast, uses knowledge of a later event to modify the probability of an earlier random event. After Bayes' work was published posthumously in 1763, people used this idea to explore problems such as those in epidemiology, for example knowing that someone developed a certain condition and using that knowledge to find the probability that they had previously had a particular disease or treatment.
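
In modern notation (not Bayes' own formulation), the rule that now bears his name expresses this reversal: for an earlier event $A$ (say, having had a particular disease or treatment) and a later event $B$ (developing the condition), $P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$, so the probability of the earlier event is revised according to how likely the later event would be if the earlier one had occurred.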

Another famous mathematician, Augustus De Morgan (1806-1871), published some rules of probability in The Penny Cyclopedia in 1841. He began by defining probability and then explained the addition rule for finding the probability of a union of mutually exclusive events, as well as the rules for calculating the probability of a union or an intersection in general.
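
In the modern notation these rules are usually given in, the addition rule for mutually exclusive events is $P(A \cup B) = P(A) + P(B)$, the general addition rule is $P(A \cup B) = P(A) + P(B) - P(A \cap B)$, and the multiplication rule for an intersection is $P(A \cap B) = P(A)\,P(B \mid A)$, which reduces to $P(A)\,P(B)$ when the events are independent.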