Jeffreys’s prior:
Jeffreys’s prior is a method of assigning prior probabilities to the possible values of a parameter in a statistical model. It is based on the idea that the prior should not depend on how the parameter happens to be expressed: the same prior should result whichever parameterization we use to describe the model, including, in particular, different units of measurement.
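For reference (the formula is not stated above, and the notation $I(\theta)$ for the Fisher information is introduced here), the standard one-parameter definition is

$$ \pi_J(\theta) \;\propto\; \sqrt{I(\theta)}, \qquad I(\theta) \;=\; -\,\mathbb{E}_{\theta}\!\left[\frac{\partial^{2}}{\partial\theta^{2}}\log f(X\mid\theta)\right]. $$

Under a smooth reparameterization $\phi = h(\theta)$ this rule gives $\pi_J(\phi) = \pi_J(\theta)\,\lvert d\theta/d\phi\rvert$, which is exactly the ordinary change-of-variables formula for densities; that is the precise sense in which the prior is invariant.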
For example, suppose we are trying to estimate the mean of a normal distribution. We could choose a uniform prior over a range of plausible values, but what that prior looks like depends on how the parameter is expressed. If we measure the mean in inches, the range might run from 0 to 10 inches; measured in centimeters, the same range runs from 0 to 25.4 centimeters. More importantly, under a nonlinear change of parameterization, for instance working with the variance rather than the standard deviation, a prior that was uniform in one parameterization is no longer uniform in the other, so “uniform” is not by itself a parameterization-free expression of ignorance (the sketch below illustrates this).
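As a small numerical sketch of that last point (an assumed example, not taken from the text; the interval endpoints are arbitrary), drawing the standard deviation from a uniform prior and re-expressing it as a variance gives a density that is far from uniform:

```python
import numpy as np

# Sketch (assumed example): a prior that is uniform on the standard deviation
# sigma is not uniform on the variance v = sigma**2.
rng = np.random.default_rng(0)
sigma = rng.uniform(0.1, 10.0, size=1_000_000)  # "uniform" prior on sigma
v = sigma**2                                    # the same prior, expressed as a variance

# Probability mass the induced prior puts on two intervals of equal width.
for lo, hi in [(1.0, 2.0), (90.0, 91.0)]:
    p = np.mean((v >= lo) & (v < hi))
    print(f"P({lo} <= v < {hi}) ~ {p:.4f}")

# The two numbers differ by roughly a factor of eight: by change of variables
# the induced density of v is proportional to 1/(2*sqrt(v)), not constant.
```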
To avoid this problem, Jeffreys’s prior assigns the density by a rule that transforms consistently under any such change: the density is taken proportional to the square root of the Fisher information of the model. For the normal distribution this gives a flat prior for the mean and, for the standard deviation, a density proportional to the reciprocal of the standard deviation. The probability assigned to any given event is then the same regardless of the parameterization or the units of measurement used to describe it.
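This reciprocal-of-the-standard-deviation claim can be checked directly from the definition given earlier (writing $\sigma$ for the standard deviation and treating the mean $\mu$ as known):

$$ \log f(x\mid\sigma) = -\log\sigma - \tfrac{1}{2}\log(2\pi) - \frac{(x-\mu)^{2}}{2\sigma^{2}}, \qquad \frac{\partial^{2}}{\partial\sigma^{2}}\log f = \frac{1}{\sigma^{2}} - \frac{3(x-\mu)^{2}}{\sigma^{4}}, $$

and since $\mathbb{E}\bigl[(X-\mu)^{2}\bigr] = \sigma^{2}$,

$$ I(\sigma) = -\,\mathbb{E}\!\left[\frac{1}{\sigma^{2}} - \frac{3(X-\mu)^{2}}{\sigma^{4}}\right] = \frac{2}{\sigma^{2}}, \qquad \pi_J(\sigma) \propto \sqrt{I(\sigma)} \propto \frac{1}{\sigma}. $$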
Another example of Jeffreys’s prior is in estimating the precision (the inverse variance) of a normal distribution. Here the prior density is proportional to the reciprocal of the precision itself, and this is exactly what the prior on the standard deviation becomes when re-expressed in terms of the precision, so the two parameterizations assign the same probability to corresponding events.
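This consistency can be checked in two ways that must (and do) agree (writing $\tau$ for the precision): transform the $1/\sigma$ prior to $\tau = 1/\sigma^{2}$, or compute $\sqrt{I(\tau)}$ directly in the $\tau$ parameterization:

$$ \sigma = \tau^{-1/2}, \qquad \left|\frac{d\sigma}{d\tau}\right| = \tfrac{1}{2}\,\tau^{-3/2}, \qquad \pi_J(\tau) = \pi_J(\sigma)\left|\frac{d\sigma}{d\tau}\right| \propto \tau^{1/2}\cdot\tau^{-3/2} = \frac{1}{\tau}; $$

$$ \text{directly:}\quad \log f(x\mid\tau) = \tfrac{1}{2}\log\tau - \tfrac{1}{2}\log(2\pi) - \frac{\tau\,(x-\mu)^{2}}{2}, \qquad I(\tau) = \frac{1}{2\tau^{2}}, \qquad \sqrt{I(\tau)} \propto \frac{1}{\tau}. $$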
In both of these examples, Jeffreys’s prior assigns prior probabilities in a consistent and objective way, without the need for arbitrary choices about the parameterization, the range of plausible values, or the units of measurement. This can help to reduce bias and improve the accuracy of statistical estimates.