Maximum Likelihood Estimation:
Maximum likelihood estimation (MLE) is a statistical technique for estimating the parameters of a model. The idea is to choose the parameter values under which the observed data would have been most probable.
To understand MLE, let’s consider two examples.
Example 1: Estimating the success rate of a coin
Suppose we have a coin whose success rate (the probability of obtaining heads) is unknown, and we want to estimate it using MLE. To do this, we toss the coin 10 times and obtain the following results: HTHTHTHTHT.
Under the assumption that the coin is fair, each toss independently comes up heads or tails with probability 1/2, so the probability of obtaining this particular sequence (HTHTHTHTHT) is (1/2)^10 = 1/1024.
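This product of per-toss probabilities can be checked directly; a minimal sketch in Python:

```python
# Each of the 10 independent tosses has probability 1/2 under a fair coin,
# so the probability of the exact sequence HTHTHTHTHT is (1/2)^10.
prob = (1 / 2) ** 10
print(prob)  # 0.0009765625, i.e. 1/1024
```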
Now, suppose we want to estimate the success rate of this coin using MLE. To do this, we consider two models:
Model 1: The coin is fair (i.e., the probability of obtaining heads is 0.5).
Model 2: The coin is biased (i.e., the probability of obtaining heads is not 0.5).
For each model, we calculate the likelihood of obtaining the sequence HTHTHTHTHT, which contains 5 heads and 5 tails. For Model 1, the likelihood is (1/2)^10 = 1/1024. For Model 2, the likelihood is p^5 * (1-p)^5, where p is the probability of obtaining heads.
Maximizing p^5 * (1-p)^5 over p gives the maximum-likelihood estimate p = 5/10 = 0.5. Since this estimate coincides with Model 1, we conclude that the data are consistent with a fair coin.
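The maximization above can be sketched with a simple grid search over candidate values of p; the names `likelihood` and `candidates` below are illustrative, not from the text:

```python
# Observed tosses: HTHTHTHTHT -> 5 heads, 5 tails out of 10 tosses.
heads, tosses = 5, 10

def likelihood(p):
    """Likelihood of a sequence with 5 heads and 5 tails under heads-probability p."""
    return p**heads * (1 - p)**(tosses - heads)

# Evaluate the likelihood on a grid of candidate values of p in (0, 1).
candidates = [i / 100 for i in range(1, 100)]
p_hat = max(candidates, key=likelihood)

print(p_hat)            # 0.5: the MLE coincides with the fair-coin model
print(likelihood(0.5))  # (1/2)**10, i.e. 1/1024
```

In a real application one would maximize the log-likelihood analytically or with a numerical optimizer rather than a grid, but the grid makes the "pick the parameter with the highest likelihood" idea explicit.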
Example 2: Estimating the mean of a normal distribution
Suppose we have a population with a normal distribution with unknown mean and variance. We want to estimate the mean of this population using MLE. To do this, we sample 10 observations from the population and obtain the following values: 2, 3, 4, 5, 6, 7, 8, 9, 10, 11.
The likelihood of this sample under a normal model is the product of the normal density evaluated at each observation. With the variance fixed at 2.25, this likelihood is largest when the candidate mean minimizes the sum of squared deviations from the data.
Now, suppose we want to estimate the mean of this population using MLE. To do this, we consider two models:
Model 1: The population has a normal distribution with mean 5.5 and variance 2.25.
Model 2: The population has a normal distribution with mean 6.0 and variance 2.25.
For each model, we calculate the likelihood of obtaining the sample (2, 3, 4, 5, 6, 7, 8, 9, 10, 11). With the variance fixed, the model whose mean gives the smaller sum of squared deviations has the higher likelihood: the sum is 92.5 for Model 1 (mean 5.5) and 85 for Model 2 (mean 6.0).
Since the likelihood of Model 2 is higher, 6.0 is the better of the two candidate means. Maximizing the likelihood over all possible means gives the sample mean, 6.5, as the maximum-likelihood estimate of the population mean.
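The comparison above can be reproduced by evaluating the normal log-likelihood at each candidate mean; this is a sketch, and the helper name `log_likelihood` is illustrative, not from the text:

```python
import math

data = [2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
var = 2.25  # variance fixed at the value shared by both candidate models

def log_likelihood(mu):
    """Log-likelihood of the sample under a Normal(mu, var) model."""
    n = len(data)
    ss = sum((x - mu)**2 for x in data)
    return -n / 2 * math.log(2 * math.pi * var) - ss / (2 * var)

print(log_likelihood(5.5))  # Model 1
print(log_likelihood(6.0))  # Model 2: higher, since 6.0 is closer to the sample mean

# Maximizing over all candidate means gives the sample mean.
mu_hat = sum(data) / len(data)
print(mu_hat)  # 6.5
```

Working with the log-likelihood is standard practice: it turns the product of densities into a sum and avoids numerical underflow, while having its maximum at the same parameter value.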
In summary, maximum likelihood estimation is a statistical technique for estimating the parameters of a model by choosing the parameter values under which the observed data would have been most probable. It is used in a variety of applications, including estimating the success rate of a coin and estimating the mean of a normal distribution.