Minimum Average Variance Estimation (MAVE) Method

  • Reduces dataset dimensionality by forming optimal linear combinations of original variables.
  • Chooses coefficients to minimize the average variance of the new variables.
  • Optimization is performed under the constraint that the coefficients sum to 1.

Minimum average variance estimation (MAVE) is a method for reducing the dimensionality of a dataset by finding the linear combination of the original variables that produces the minimum average variance in the transformed data.

MAVE seeks an optimal linear transformation of the original variables so that the transformed set has the smallest possible average variance. Variance measures how spread out values are around their mean. MAVE determines the coefficients of the linear combination by solving a constrained optimization problem that minimizes this average variance, subject to the condition that the coefficients sum to 1.

For example, the dataset {1, 2, 3} has a mean of 2, so its (population) variance is computed as:

\frac{(1-2)^2 + (2-2)^2 + (3-2)^2}{3} = \frac{1 + 0 + 1}{3} = \frac{2}{3}
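As a quick check, here is a minimal sketch of that calculation in Python with NumPy (population variance, i.e. dividing by the number of observations):

    import numpy as np

    data = np.array([1.0, 2.0, 3.0])
    mean = data.mean()                       # 2.0
    variance = ((data - mean) ** 2).mean()   # (1 + 0 + 1) / 3 ≈ 0.667
    print(mean, variance)                    # matches np.var(data)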

Given three variables X1, X2, and X3, MAVE finds coefficients a1, a2, and a3 that minimize the average variance of the combination:

\text{MAVE}(X_1, X_2, X_3) = a_1 X_1 + a_2 X_2 + a_3 X_3

The coefficients are determined by solving the optimization problem subject to the constraint that the sum of the coefficients equals 1.
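As a minimal sketch of one way to solve this, rather than a definitive implementation: for a single linear combination, minimizing the variance a'Sa of the combined variable subject to the coefficients summing to 1 has the closed-form solution a = S^{-1}1 / (1'S^{-1}1), where S is the covariance matrix of the variables. The function name mave_coefficients below is illustrative, not part of any MAVE library.

    import numpy as np

    def mave_coefficients(X: np.ndarray) -> np.ndarray:
        """Coefficients minimizing the variance of X @ a, with the coefficients summing to 1.

        X has one row per observation and one column per variable (X1, X2, X3, ...).
        """
        S = np.cov(X, rowvar=False)   # sample covariance matrix of the variables
        ones = np.ones(S.shape[0])
        w = np.linalg.solve(S, ones)  # proportional to S^{-1} * 1
        return w / w.sum()            # rescale so the coefficients sum to 1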

For a dataset of stock returns over time, MAVE can find a linear combination of those returns that yields the minimum average variance of the transformed variables, which can aid in constructing a low-risk, diversified portfolio.
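For instance, applying the sketch above to simulated daily returns (random placeholders, not real market data) shows how the coefficients would be obtained; mave_coefficients is the illustrative function defined earlier:

    import numpy as np

    rng = np.random.default_rng(0)
    returns = rng.normal(loc=0.001, scale=[0.01, 0.02, 0.015], size=(250, 3))

    a = mave_coefficients(returns)            # coefficients sum to 1
    combined = returns @ a                    # the low-variance combination
    print("coefficients:", a)
    print("combined variance:", combined.var(ddof=1))
    print("individual variances:", returns.var(axis=0, ddof=1))

Because each single-variable choice (all weight on one stock) also satisfies the sum-to-1 constraint, the combined variance is no larger than the variance of any individual return series, which is the diversification effect the example describes.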

  • Variance
  • Dimensionality reduction
  • Linear combination
  • Optimization