The History of Probability

Probability, chance, and randomness have been around since ancient times. They could be found in fortune telling, games of chance, philosophy, law, insurance, and errors of prediction in astronomy and medicine (Hald, 1990). In about 1200 B.C., an ancient game was played with four astragali (heel bones of hooved animals). Players would grind the bones into rough, cube-like shapes and carve small depressions of various numbers into their faces. This is where the pips, the dots on the dice we use today, came from. A certain throw was given high value, while other outcomes had smaller probabilities.





In the early centuries A.D., the Romans really enjoyed dice. Emperor Claudius is said to have had a table installed in his carriage so he could play dice while being driven around. He also wrote a book titled How to Win at Dice. Why, then, did no one invent probability earlier? Mostly because people did not believe in randomness: anything that happened was the result of God, or the gods, making it happen. In addition, people of this time did not have accurate notation or symbols to calculate such things. There is also the idea that because the dice were not fair, no one noticed how often a given number should turn up.


There is evidence that in the late fifteenth century and the early sixteenth century, mathematicians started to experiment with the idea of probability.

Girolamo Cardano
Could have been the father of probability.
Girolamo Cardano is actually the first person known to have worked out the mathematics of probability. However, his work was not published until long after it was written, and in the meantime the famous letters discussed below were exchanged and probability was born. Cardano was an Italian professor of mathematics and medicine, as well as an avid gambler; he gambled daily. He felt that if he wasn't going to win money, he could at least be doing something worthwhile, such as learning. This idea led him to investigate the chances of pulling aces out of a deck of cards and of rolling sevens with two dice. He was the first to realize that there was the same chance of rolling a 1, 3, or 5 as of rolling a 2, 4, or 6. People would then bet accordingly if the dice were fair; if the dice were not fair, the bets would need to be adjusted to account for that. He was also the first to hit on the idea of counting the number of favorable cases (successes) and comparing them to the total number of cases, and he wanted to assign a number from 0 to 1 to the probability of an outcome (Jardine, 2000).
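In modern notation, Cardano's counting idea is the classical definition of probability. A quick sketch, using the fair-die observation above:

\[
P(E) = \frac{\text{number of favorable cases}}{\text{total number of cases}}, \qquad
P(\text{odd roll}) = \frac{|\{1,3,5\}|}{6} = \frac{3}{6} = \frac{1}{2} = P(\text{even roll}).
\]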
The late publication of his work is said to be the reason Cardano isn't considered the father of probability.
Now on to the fathers of probability.
Pierre de Fermat
Blaise Pascal
In 1494, the Problem of Points, also known as the Unfinished Game, appeared in print: Luca Pacioli was the first to write about it in a mathematical context, in his Summa de Arithmetica. The problem is as follows:
Known as the Unfinished Game problem, the puzzle asked how the pot should be divided when a game of dice has to be abandoned before it has been completed. The challenge is to find a division that is fair according to how many rounds each player has won by that stage. (Devlin, 2010, p.580)
Then, in 1654, Antoine Gombaud, the Chevalier de Mere, came across this problem and knew just the man to solve it: the mathematical prodigy Blaise Pascal. Pascal, in turn, sent a letter to Pierre de Fermat asking for help with the Unfinished Game. The correspondence between these two mathematical geniuses produced the solution to the problem of points and marked the beginning of probability theory.
Pascal wrote, "I was not able to tell you my entire thoughts regarding the problem of the points by the last post, and at the same time, I have a certain reluctance at doing it for fear lest this admirable harmony, which obtains between us and which is so dear to me should begin to flag, for I am afraid that we may have different opinions on this subject. I wish to lay my whole reasoning before you, and to have you do me the favor to set me straight if I am in error or to endorse me if I am correct. I ask you this in all faith and sincerity for I am not certain even that you will be on my side." (Devlin, 2010, p. 580)

Both men solved the problem in different ways, and Fermat's approach ended up being the more straightforward one. They came to the conclusion that the player leading the game should get three-fourths of the pot and the trailing player should receive one-fourth. This discovery overturned the belief that mathematics could not be used to reason about the outcome of a future event, and it opened the way to probability theory, risk management, actuarial science, and even the insurance industry (Devlin, 2010).
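A minimal sketch of the reasoning, assuming for illustration that play stops when the leader needs one more win and the trailing player needs two, with each remaining round a fair fifty-fifty contest:

\[
P(\text{leader wins}) = \tfrac{1}{2} + \tfrac{1}{2}\cdot\tfrac{1}{2} = \tfrac{3}{4},
\qquad
P(\text{trailing player wins}) = \tfrac{1}{2}\cdot\tfrac{1}{2} = \tfrac{1}{4},
\]

which matches the three-fourths and one-fourth shares of the pot.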

Daniel Bernoulli
Pierre-Simon Laplace
Jakob Bernoulli
Fun fact: Jakob Bernoulli is the uncle of Daniel Bernoulli.
Mathematicians started applying the concepts of probability to old and new problems. Daniel Bernoulli learned of the famous St. Petersburg paradox from his brother Nikolaus. The problem states:
The problem arose from a game of chance in which one player tosses a coin and a second player agrees to pay a sum of money if heads comes up on the first toss, double the money if heads appears on the second toss, four times as much if on the third throw, eight times as much if on the fourth toss, and so on. The paradox arose concerning how much should be paid in before the game to make it fair to both players. (Lightner, 1991, p. 628)
Bernoulli came to the conclusion that an infinite amount of money would be needed to make the game fair. He is also credited with showing how calculus can be applied to probability.
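A short sketch of the arithmetic behind that conclusion, taking the first-toss payment as one unit, so that heads first appearing on toss k (probability 1/2^k) pays 2^{k-1} units:

\[
E = \sum_{k=1}^{\infty} \frac{1}{2^{k}} \cdot 2^{k-1} = \sum_{k=1}^{\infty} \frac{1}{2} = \infty.
\]

Because the expected payout is unbounded, no finite entry fee can make the game fair to both players.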

Following in the footsteps of Pascal and Fermat was their countryman Pierre-Simon Laplace, a French astronomer and mathematician dubbed the father of modern probability. Though he cared far more for the heavens than for mathematics, he developed the theory of continuous probability. He is credited with formalizing the formula that Cardano had started: the probability of an event is equal to the number of successes, or favorable outcomes, divided by the total number of possible outcomes. The central limit theorem can also be attributed to Laplace (Debnath & Basu, 2014).

Jakob Bernoulli, a Swiss mathematician, discovered the law of large numbers. The law of large numbers builds on this same formula and states that if we calculate the proportion of successes over a large number of trials, that proportion will be an accurate representation of the true, or theoretical, probability (Jardine, 2000). Testing Bernoulli's theory were three very bored men. The first was Count Buffon, who tossed a coin 4040 times and got heads 2048 times. The next was a South African mathematician named John Kerrich, who tossed a coin 10,000 times while interned in a prison camp during World War II; heads appeared 5067 times. The final man, who probably needed some friends, was the English statistician Karl Pearson. He tossed a coin 24,000 times, and it landed on heads 12,012 times. Together they showed that the larger the number of trials, the closer the observed proportion comes to the true probability. Bernoulli knew what he was talking about.
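Working out the proportions from those three experiments makes Bernoulli's point concrete:

\[
\frac{2048}{4040} \approx 0.5069, \qquad \frac{5067}{10{,}000} \approx 0.5067, \qquad \frac{12{,}012}{24{,}000} \approx 0.5005,
\]

each creeping closer to the theoretical value of 0.5 as the number of tosses grows.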


Karl Pearson
As time continued, probability ushered in the field of statistics. Many mathematicians used probability to create new distributions and new ways to interpret data. In 1864, probability was connected to genetics and hybridization. Probability and biology then joined forces in 1894, thanks to the greatest coin flipper of all time, Karl Pearson, and created biometrics. He was also the first to use the standard deviation as a measure of spread in probability (Debnath & Basu, 2014).
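For reference, the standard deviation Pearson named is written in modern notation as

\[
\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \mu\right)^2},
\]

the square root of the average squared distance of the observations x_1, ..., x_N from their mean \mu.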
Now let's move on to the math behind all this history. We'll also look at some applications of probability.


References