To understand probability distributions, it is important to understand variables, random variables, and some notation.
- A variable is a symbol (A, B, x, y, etc.) that can take on any of a specified set of values.
- When the value of a variable is the outcome of a statistical experiment, that variable is a random variable.
Generally, statisticians use a capital letter to represent a random variable and a lower-case letter to represent one of its values. For example,
- X represents the random variable X.
- P(X) represents the probability of X.
- P(X = x) refers to the probability that the random variable X is equal to a particular value, denoted by x. As an example, P(X = 1) refers to the probability that the random variable X is equal to 1.
Probability Distributions
An example will make clear the relationship between random variables and probability distributions. Suppose you flip a coin two times. This simple statistical experiment can have four possible outcomes: HH, HT, TH, and TT. Now, let the variable X represent the number of heads that result from this experiment. The variable X can take on the values 0, 1, or 2. In this example, X is a random variable because its value is determined by the outcome of a statistical experiment.
A probability distribution is a table or an equation that links each outcome of a statistical experiment with its probability of occurrence. Consider the coin flip experiment described above. The table below, which associates each outcome with its probability, is an example of a probability distribution.
Number of heads | Probability |
---|---|
0 | 0.25 |
1 | 0.50 |
2 | 0.25 |
The above table represents the probability distribution of the random variable X.
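One way to check this table is to enumerate the sample space directly. Below is a minimal Python sketch (the helper name heads_distribution is ours, not part of any library) that builds the distribution by counting heads in each equally likely outcome.

```python
from itertools import product
from fractions import Fraction

def heads_distribution(n_flips):
    """Exact distribution of the number of heads in n_flips flips of a fair coin,
    built by enumerating the equally likely outcomes."""
    outcomes = list(product("HT", repeat=n_flips))          # HH, HT, TH, TT for n_flips = 2
    dist = {}
    for outcome in outcomes:
        x = outcome.count("H")                              # value taken by the random variable X
        dist[x] = dist.get(x, 0) + Fraction(1, len(outcomes))
    return dist

print(heads_distribution(2))
# {2: Fraction(1, 4), 1: Fraction(1, 2), 0: Fraction(1, 4)}
```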
Cumulative Probability Distributions
A cumulative probability refers to the probability that the value of a random variable falls within a specified range.
Let us return to the coin flip experiment. If we flip a coin two times, we might ask: What is the probability that the coin flips would result in one or fewer heads? The answer would be a cumulative probability. It would be the probability that the coin flip experiment results in zero heads plus the probability that the experiment results in one head.
P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.25 + 0.50 = 0.75
Like a probability distribution, a cumulative probability distribution can be represented by a table or an equation. In the table below, the cumulative probability refers to the probability that the random variable X is less than or equal to x.
Number of heads: x | Probability: P(X = x) | Cumulative Probability: P(X ≤ x) |
---|---|---|
0 | 0.25 | 0.25 |
1 | 0.50 | 0.75 |
2 | 0.25 | 1.00 |
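The cumulative column can be reproduced by summing the probabilities up to each value of x. This sketch assumes the heads_distribution helper defined in the earlier example.

```python
dist = heads_distribution(2)

running_total = 0
for x in sorted(dist):
    running_total += dist[x]          # add P(X = x) to the running sum
    print(f"P(X <= {x}) = {running_total}")
# P(X <= 0) = 1/4
# P(X <= 1) = 3/4
# P(X <= 2) = 1
```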
Uniform Probability Distribution
The simplest probability distribution occurs when all of the values of a random variable occur with equal probability. This probability distribution is called the uniform distribution.
P(X = xk) = 1/k, where k is the number of distinct values that X can assume and xk is any one of those values.
Example 1
Suppose a die is tossed. What is the probability that the die will land on 6?
Solution: When a die is tossed, there are 6 possible outcomes, represented by S = { 1, 2, 3, 4, 5, 6 }. Each possible outcome is a value of the random variable X, and each outcome is equally likely to occur. Thus, we have a uniform distribution. Therefore, P(X = 6) = 1/6.
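A minimal sketch of the same calculation, assuming a fair six-sided die so that the uniform formula applies (the names sample_space and p are ours):

```python
from fractions import Fraction

sample_space = [1, 2, 3, 4, 5, 6]

# Uniform distribution: each of the k = 6 values has probability 1/k
p = {x: Fraction(1, len(sample_space)) for x in sample_space}

print(p[6])   # 1/6, i.e. P(X = 6) = 1/6
```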
Example 2
Suppose we repeat the die-tossing experiment described in Example 1. This time, we ask: What is the probability that the die will land on a number that is smaller than 5?
Solution: When a die is tossed, there are 6 possible outcomes represented by: S = { 1, 2, 3, 4, 5, 6 }. Each possible outcome is equally likely to occur. Thus, we have a uniform distribution.
This problem involves a cumulative probability. The probability that the die will land on a number smaller than 5 is equal to:
P( X < x =" 1)" x =" 2)" x =" 3)" x =" 4)" 6 =" 2/3