# Mardia’s multivariate normality test

Mardia’s multivariate normality test is a statistical test used to assess whether a sample of multivariate data comes from a multivariate normally distributed population. The test is named after the statistician Kanti Mardia, who introduced it in 1970.
To understand Mardia’s test, it first helps to recall what it means for data to be normally distributed. In a normal distribution, values are spread symmetrically around the mean: most observations fall close to the mean, and observations become increasingly rare the further they lie from it. Plotted as a histogram, such values form the familiar bell-shaped curve, with most values clustered around the middle and fewer values in either tail.
Now, let’s consider a simple example. Suppose we have a sample of 10 people and want to check whether their heights are plausibly normally distributed. We can measure each person’s height and plot the values. If the heights roughly follow a bell-shaped curve, that is evidence consistent with normality, although a sample of 10 is far too small for a visual check to be conclusive.
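For a single variable like height, this visual check can be backed by a formal univariate normality test. A minimal sketch using SciPy’s Shapiro–Wilk test (a standard univariate test; the height values below are made up for illustration):

```python
# Univariate normality check on 10 hypothetical heights (cm).
import numpy as np
from scipy import stats

heights = np.array([162.0, 168.5, 171.2, 174.0, 175.5,
                    176.8, 178.3, 180.1, 183.4, 190.2])

stat, p_value = stats.shapiro(heights)
# A large p-value means we fail to reject normality; with only
# n = 10 observations the test has little power either way.
print(f"W = {stat:.3f}, p = {p_value:.3f}")
```

Note that a univariate test like this handles one variable at a time; it cannot tell us whether several variables are *jointly* normal, which is where Mardia’s test comes in.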
However, things get more complicated with multivariate data, where each observation has more than one variable, such as height and weight. To assess whether such data come from a multivariate normal distribution, we can use Mardia’s test.
Mardia’s test works by computing two summary statistics of the sample: multivariate skewness and multivariate kurtosis. Both are built from the sample mean vector and the sample covariance matrix (the matrix containing the variances and covariances of the variables). For each pair of observations, one computes a Mahalanobis-type quantity that measures their joint distance from the mean relative to the covariance structure; multivariate skewness is an average of the cubes of these quantities over all pairs, and multivariate kurtosis is an average of the squared quantities for each observation on its own.
If the data come from a multivariate normal population, the skewness statistic (suitably scaled) approximately follows a chi-squared distribution, and the kurtosis statistic is approximately normal with mean p(p + 2), where p is the number of variables. If either statistic deviates significantly from its expected value under normality, we reject the hypothesis that the population is multivariate normal.
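In practice, Mardia’s test reduces to computing multivariate skewness and kurtosis from the sample mean and covariance matrix and comparing them against their asymptotic null distributions. A minimal NumPy/SciPy sketch (function and variable names are my own, not from any standard library):

```python
# Minimal sketch of Mardia's multivariate normality test.
import numpy as np
from scipy import stats

def mardia_test(X):
    """Return Mardia's skewness/kurtosis statistics and asymptotic p-values.

    X is an (n, p) array: n observations of p variables.
    """
    n, p = X.shape
    centered = X - X.mean(axis=0)
    # Biased (divide-by-n) covariance estimate, as in Mardia's formulation.
    S_inv = np.linalg.inv(np.cov(X, rowvar=False, bias=True))
    # D[i, j] = (x_i - mean)' S^{-1} (x_j - mean): Mahalanobis cross products.
    D = centered @ S_inv @ centered.T

    b1p = (D ** 3).sum() / n**2        # multivariate skewness
    b2p = (np.diag(D) ** 2).sum() / n  # multivariate kurtosis

    # Under normality: n*b1p/6 ~ chi^2 with p(p+1)(p+2)/6 degrees of freedom.
    df = p * (p + 1) * (p + 2) / 6.0
    p_skew = stats.chi2.sf(n * b1p / 6.0, df)
    # Under normality: b2p ~ Normal(p(p+2), 8p(p+2)/n).
    z_kurt = (b2p - p * (p + 2)) / np.sqrt(8.0 * p * (p + 2) / n)
    p_kurt = 2 * stats.norm.sf(abs(z_kurt))
    return b1p, p_skew, b2p, p_kurt

# Demo on simulated data that really is multivariate normal.
rng = np.random.default_rng(42)
sample = rng.standard_normal((200, 2))
print(mardia_test(sample))
```

Calling `mardia_test` on a few hundred draws from a multivariate normal distribution should usually give large p-values for both statistics; these p-values rely on asymptotic approximations, so they are rough for small samples.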
To illustrate this, let’s consider another example. Suppose we have a sample of 10 people and want to determine whether their heights and weights jointly follow a bivariate normal distribution. We measure the height and weight of each person, compute the sample mean vector and covariance matrix, and from them Mardia’s skewness and kurtosis statistics.
If both statistics are close to the values expected under normality (equivalently, if both p-values are large), we fail to reject the hypothesis that heights and weights are bivariate normal. If either statistic is significantly larger or smaller than expected, we conclude that the data are inconsistent with multivariate normality.
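As a concrete sketch, the following computes Mardia’s skewness and kurtosis inline on made-up height/weight pairs (the data values are hypothetical, chosen only for illustration):

```python
# Hypothetical height (cm) / weight (kg) pairs for 10 people.
import numpy as np
from scipy import stats

X = np.array([
    [162.0, 55.1], [168.5, 61.3], [171.2, 64.8], [174.0, 70.2],
    [175.5, 72.5], [176.8, 74.0], [178.3, 76.9], [180.1, 79.5],
    [183.4, 84.2], [190.2, 92.0],
])
n, p = X.shape
centered = X - X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False, bias=True))
D = centered @ S_inv @ centered.T          # Mahalanobis cross products

b1p = (D ** 3).sum() / n**2                # multivariate skewness
b2p = (np.diag(D) ** 2).sum() / n          # multivariate kurtosis

df = p * (p + 1) * (p + 2) / 6.0
p_skew = stats.chi2.sf(n * b1p / 6.0, df)
z_kurt = (b2p - p * (p + 2)) / np.sqrt(8.0 * p * (p + 2) / n)
p_kurt = 2 * stats.norm.sf(abs(z_kurt))

# With only n = 10 observations these asymptotic p-values are rough at best.
print(f"skewness b1p = {b1p:.3f}, p = {p_skew:.3f}")
print(f"kurtosis b2p = {b2p:.3f}, p = {p_kurt:.3f}")
```

Large p-values here would mean we fail to reject multivariate normality; with a sample this small, the test has very little power, so this is illustrative only.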
In summary, Mardia’s multivariate normality test is a statistical test used to determine whether a sample of multivariate data comes from a multivariate normally distributed population. The test computes the sample’s multivariate skewness and kurtosis and compares them with the values expected under normality; if both are consistent with those expectations, we fail to reject the hypothesis that the population is multivariate normal.