Skewness

Skewness is a measure of the degree of asymmetry of a distribution. If the left tail (tail at small end of the distribution) is more pronounced than the right tail (tail at the large end of the distribution), the function is said to have negative skewness. If the reverse is true, it has positive skewness. If the two are equal, it has zero skewness.

Several types of skewness are defined, the terminology and notation of which are unfortunately rather confusing. "The" skewness of a distribution is defined to be

\[
\gamma_1 = \frac{\mu_3}{\mu_2^{3/2}},
\tag{1}
\]

where \(\mu_2\) and \(\mu_3\) are the second and third central moments of the distribution.
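For a concrete data set, equation (1) can be estimated from the sample central moments. The following is a minimal Python sketch; the small right-skewed sample and the helper name moment_skewness are assumptions for illustration, not part of the definition above.

import numpy as np

def moment_skewness(x):
    """Moment skewness gamma_1 = mu_3 / mu_2^(3/2), using sample central moments."""
    x = np.asarray(x, dtype=float)
    mu2 = np.mean((x - x.mean()) ** 2)   # second central moment (variance)
    mu3 = np.mean((x - x.mean()) ** 3)   # third central moment
    return mu3 / mu2 ** 1.5

# A sample with a long right tail gives a positive skewness (about 1.8 here).
sample = [1, 2, 2, 3, 3, 3, 4, 4, 10]
print(moment_skewness(sample))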
Positively Skewed Distribution: if the value of the arithmetic mean is greater than the mode, the distribution is called positively skewed.

Negatively Skewed Distribution: if the value of the mode is greater than the arithmetic mean, the distribution is called negatively skewed.
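A quick numerical illustration, assuming the same small right-skewed sample used above (the data are purely for demonstration):

from statistics import mean, mode

data = [1, 2, 2, 3, 3, 3, 4, 4, 10]   # long right tail
print(mean(data))   # about 3.56
print(mode(data))   # 3  -> mean > mode, so the sample is positively skewed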

Several forms of skewness are also defined. The momental skewness is defined by

\[
\alpha^{(m)} = \tfrac{1}{2}\gamma_1.
\tag{2}
\]

The Pearson mode skewness is defined by

\[
\frac{\text{mean} - \text{mode}}{\sigma}.
\tag{3}
\]

Pearson's skewness coefficients are defined by

\[
\frac{3(\text{mean} - \text{mode})}{\sigma}
\tag{4}
\]

and

\[
\frac{3(\text{mean} - \text{median})}{\sigma}.
\tag{5}
\]
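A short Python sketch of equations (3)-(5), again assuming the illustrative sample from above; the mode-based measures only make sense when the data have a clear single mode:

from statistics import mean, median, mode, pstdev

data = [1, 2, 2, 3, 3, 3, 4, 4, 10]
sigma = pstdev(data)                                    # population standard deviation

mode_skew   = (mean(data) - mode(data)) / sigma         # eq. (3), Pearson mode skewness
first_coef  = 3 * (mean(data) - mode(data)) / sigma     # eq. (4)
second_coef = 3 * (mean(data) - median(data)) / sigma   # eq. (5)
print(mode_skew, first_coef, second_coef)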

The Bowley skewness (also known as quartile skewness coefficient) is defined by

\[
\frac{(Q_3 - Q_2) - (Q_2 - Q_1)}{Q_3 - Q_1} = \frac{Q_1 - 2Q_2 + Q_3}{Q_3 - Q_1},
\tag{6}
\]

where \(Q_1\), \(Q_2\), and \(Q_3\) denote the first, second (median), and third quartiles. In terms of the central moments, the momental skewness defined in (2) can also be written

\[
\alpha^{(m)} = \tfrac{1}{2}\gamma_1 = \frac{\mu_3}{2\mu_2^{3/2}}.
\]
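A sketch of the Bowley (quartile) skewness in equation (6), assuming the quartiles are estimated with NumPy's default linear-interpolation percentile rule:

import numpy as np

data = [1, 2, 2, 3, 3, 3, 4, 4, 10]
q1, q2, q3 = np.percentile(data, [25, 50, 75])   # quartiles: 2, 3, 4 for this sample
bowley = (q1 - 2 * q2 + q3) / (q3 - q1)
print(bowley)   # 0 here: the quartiles are symmetric even though the moment skewness is positive,
                # since the Bowley measure ignores the tails beyond Q_1 and Q_3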
