Posts

Showing posts with the label Probability

Basic Probability

The probability of a sample point is a measure of the likelihood that the sample point will occur.

Probability of a Sample Point

By convention, statisticians have agreed on the following rules. The probability of any sample point can range from 0 to 1. The sum of probabilities of all sample points in a sample space is equal to 1.

Example 1

Suppose we conduct a simple statistical experiment. We flip a coin one time. The coin flip can have one of two outcomes - heads or tails. Together, these outcomes represent the sample space of our experiment. Individually, each outcome represents a sample point in the sample space. What is the probability of each sample point?

Solution: The sum of probabilities of all the sample points must equal 1. And the probability of getting a head is equal to the probability of getting a tail. Therefore, the probability of each sample point (heads or tails) must be equal to 1/2.

Example 2

Let's repeat the experiment ...
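As a small sketch of the two rules stated above (not part of the original post), the snippet below lists the sample space for a single coin flip with the equal 1/2 probabilities from Example 1 and checks that each probability lies between 0 and 1 and that they sum to 1.

# Minimal sketch: sample space for one coin flip, with the equal
# probabilities from Example 1 assigned to each sample point.
sample_space = {"heads": 0.5, "tails": 0.5}

# Rule 1: each sample point's probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in sample_space.values())

# Rule 2: the probabilities of all sample points sum to 1.
assert abs(sum(sample_space.values()) - 1) < 1e-9

print(sample_space)  # {'heads': 0.5, 'tails': 0.5}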

Probability Distributions

To understand probability distributions, it is important to understand variables, random variables, and some notation. A variable is a symbol (A, B, x, y, etc.) that can take on any of a specified set of values. When the value of a variable is the outcome of a statistical experiment, that variable is a random variable. Generally, statisticians use a capital letter to represent a random variable and a lower-case letter to represent one of its values. For example, X represents the random variable X. P(X) represents the probability of X. P(X = x) refers to the probability that the random variable X is equal to a particular value, denoted by x. As an example, P(X = 1) refers to the probability that the random variable X is equal to 1.

Probability Distributions

An example will make clear the relationship between random variables and probability distributions. Suppose you flip a coin two times. This simple statistical experiment can have...
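The excerpt is cut off just as the two-flip example begins. As a hedged sketch of the kind of probability distribution it is building toward, the snippet below assumes the random variable X counts the number of heads in two flips (a common choice for this example, not stated in the visible text) and tabulates P(X = x) by enumerating the four equally likely outcomes.

from itertools import product

# Enumerate the four equally likely outcomes of two coin flips and let X
# be the number of heads in each outcome (an assumed definition; the
# original excerpt is truncated before this point).
outcomes = list(product(["H", "T"], repeat=2))
dist = {}
for outcome in outcomes:
    x = outcome.count("H")
    dist[x] = dist.get(x, 0) + 1 / len(outcomes)

print(dist)  # {2: 0.25, 1: 0.5, 0: 0.25}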

Rules of Probability

Often, we want to compute the probability of an event from the known probabilities of other events. This lesson covers some important rules that simplify those computations.

Definitions and Notation

Before discussing the rules of probability, we state the following definitions: Two events are mutually exclusive or disjoint if they cannot occur at the same time. The probability that Event A occurs, given that Event B has occurred, is called a conditional probability. The conditional probability of Event A, given Event B, is denoted by the symbol P(A|B). The complement of an event is the event not occurring. The probability that Event A will not occur is denoted by P(A'). The probability that Events A and B both occur is the probability of the intersection of A and B. The probability of the intersection of Events A and B is denoted by P(A ∩ B). If Events A and B are mutually exclusive, P(A ∩ B) = 0. The probability tha...
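To make the notation above concrete, here is a small illustrative sketch. The probabilities are made-up numbers, not taken from the post; the complement rule P(A') = 1 - P(A) and the standard definition of conditional probability, P(A|B) = P(A ∩ B) / P(B), are used to compute the derived quantities.

# Illustrative sketch only: the probabilities below are invented for the
# example and are not from the original post.
p_a = 0.30          # P(A)
p_b = 0.40          # P(B)
p_a_and_b = 0.12    # P(A ∩ B)

# Complement rule: P(A') = 1 - P(A).
p_not_a = 1 - p_a

# Conditional probability (standard definition): P(A|B) = P(A ∩ B) / P(B).
p_a_given_b = p_a_and_b / p_b

print(p_not_a)      # 0.7
print(p_a_given_b)  # 0.3 (with these numbers, A and B happen to be independent)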

Random Variables

When the numerical value of a variable is determined by a chance event, that variable is called a random variable.

Discrete vs. Continuous Random Variables

Random variables can be discrete or continuous.

Discrete. Discrete random variables take on integer values, usually the result of counting. Suppose, for example, that we flip a coin and count the number of heads. The number of heads results from a random process - flipping a coin. And the number of heads is represented by an integer value - a number between 0 and plus infinity. Therefore, the number of heads is a discrete random variable.

Continuous. Continuous random variables, in contrast, can take on any value within a range of values. For example, suppose we flip a coin many times and compute the average number of heads per flip. The average number of heads per flip results from a random process - flipping a...
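A short simulation sketch of the contrast described above (not from the original post): the count of heads is an integer-valued, discrete random variable, while the average number of heads per flip can take fractional values in a range.

import random

# Illustrative simulation sketch (not part of the original post).
random.seed(0)
flips = [random.choice([0, 1]) for _ in range(1000)]  # 1 = heads, 0 = tails

# Discrete random variable: the number of heads is an integer count.
num_heads = sum(flips)

# Continuous-valued quantity: the average number of heads per flip can take
# any value between 0 and 1.
avg_heads = num_heads / len(flips)

print(num_heads, avg_heads)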

Poisson Distribution

A Poisson experiment is a statistical experiment that has the following properties: The experiment results in outcomes that can be classified as successes or failures. The average number of successes (μ) that occurs in a specified region is known. The probability that a success will occur is proportional to the size of the region. The probability that a success will occur in an extremely small region is virtually zero. Note that the specified region could take many forms. For instance, it could be a length, an area, a volume, a period of time, etc.

Notation

The following notation is helpful when we talk about the Poisson distribution.
e: A constant equal to approximately 2.71828. (Actually, e is the base of the natural logarithm system.)
μ: The mean number of successes that occur in a specified region.
x: The actual number of successes that occur in a specified region.
P(x; μ): The Poisson probability that exactly x ...
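The excerpt breaks off at the definition of P(x; μ). The standard Poisson probability formula is P(x; μ) = e^(-μ) μ^x / x!, and the short sketch below computes it directly using the notation defined above. The example values μ = 2 and x = 3 are arbitrary illustrations, not taken from the post.

import math

def poisson_pmf(x, mu):
    """Poisson probability of exactly x successes when the mean is mu:
    P(x; mu) = exp(-mu) * mu**x / x!"""
    return math.exp(-mu) * mu**x / math.factorial(x)

# Arbitrary illustrative values (not from the original post).
print(poisson_pmf(3, 2))  # ≈ 0.1804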

Binomial Distribution

The binomial distribution gives the discrete probability distribution of obtaining exactly x successes out of n Bernoulli trials (where the result of each Bernoulli trial is true with probability p and false with probability q = 1 - p). The binomial distribution is therefore given by P(x; n, p) = C(n, x) p^x q^(n - x), where C(n, x) = n! / [x! (n - x)!] is a binomial coefficient. The binomial distribution is implemented in Mathematica as BinomialDistribution[n, p]. The probability of obtaining more successes than the x observed can be expressed in terms of the beta function and the incomplete beta function. The characteristic function for the binomial distribution is φ(t) = (q + p e^(it))^n (Papoulis 1984, p. 154). The moment-generating function for the distribution is M(t) = (q + p e^t)^n. The mean is ...
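As a sketch of the probability formula above (a Python parallel to the Mathematica BinomialDistribution[n, p] mentioned in the excerpt, not part of the original post), the snippet below evaluates C(n, x) p^x (1 - p)^(n - x) for arbitrary illustrative values.

import math

def binomial_pmf(x, n, p):
    """Probability of exactly x successes in n Bernoulli trials, each
    succeeding with probability p: C(n, x) * p**x * (1 - p)**(n - x)."""
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

# Arbitrary illustrative values (not from the original post).
print(binomial_pmf(3, 10, 0.5))  # ≈ 0.1172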