Nadaraya-Watson estimator
The Nadaraya-Watson estimator is a nonparametric method for estimating the regression function in a supervised learning problem. It is named after the Georgian mathematician Èlizbar Nadaraya and the Australian statistician Geoffrey Watson, who proposed the method independently in 1964.
The basic idea behind the Nadaraya-Watson estimator is to estimate the regression function by taking a weighted average of the target variables for the training examples that are close to the point where the regression function is being evaluated. The weights for each training example are determined using a kernel function, which assigns higher weights to examples that are closer to the point of evaluation and lower weights to examples that are further away.
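More formally, given training pairs (x1, y1), ..., (xn, yn) and a kernel k, the estimate at a point x is the weighted average

y_hat(x) = (k(x - x1) * y1 + ... + k(x - xn) * yn) / (k(x - x1) + ... + k(x - xn))

so every training example contributes, but nearby examples dominate.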
Here is a simple example to illustrate how the Nadaraya-Watson estimator works. Suppose we have a set of training examples that consist of pairs of input and target variables (x, y). For example, we might have the following training set:
(x1, y1) = (1, 2)
(x2, y2) = (2, 3)
(x3, y3) = (3, 5)
(x4, y4) = (4, 7)
Suppose we want to use the Nadaraya-Watson estimator to estimate the value of the regression function at x = 2.5. To do this, we first need to choose a kernel function. A common choice for the kernel function is the Gaussian kernel, which is defined as follows:
k(x) = exp(-x^2 / (2 * h^2))
where h is a bandwidth parameter that controls the width of the kernel. A larger value of h will result in a wider kernel, which will give more weight to training examples that are further away from the point of evaluation. A smaller value of h will result in a narrower kernel, which will give more weight to training examples that are closer to the point of evaluation.
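To make the bandwidth effect concrete, here is a minimal Python sketch (the helper name gaussian_kernel is our own choice, not a standard library function):

```python
import math

def gaussian_kernel(u, h):
    # Gaussian kernel k(u) = exp(-u^2 / (2 * h^2)), where h is the bandwidth.
    return math.exp(-u**2 / (2 * h**2))

# The same distance (1.5) receives very different weights depending on h:
for h in (0.5, 1.0, 2.0):
    print(h, round(gaussian_kernel(1.5, h), 4))
# 0.5 0.0111  (narrow kernel: distant points are nearly ignored)
# 1.0 0.3247
# 2.0 0.7548  (wide kernel: distant points still carry weight)
```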
To use the Nadaraya-Watson estimator, we first compute the weights for each training example using the kernel function. For example, if we set h = 1, the weights for each training example would be as follows:
k(x1 - 2.5) = exp(-(1 - 2.5)^2 / (2 * 1^2)) = exp(-1.125) ≈ 0.3247
k(x2 - 2.5) = exp(-(2 - 2.5)^2 / (2 * 1^2)) = exp(-0.125) ≈ 0.8825
k(x3 - 2.5) = exp(-(3 - 2.5)^2 / (2 * 1^2)) = exp(-0.125) ≈ 0.8825
k(x4 - 2.5) = exp(-(4 - 2.5)^2 / (2 * 1^2)) = exp(-1.125) ≈ 0.3247
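These numbers are easy to check in code; a quick self-contained sketch:

```python
import math

xs = [1, 2, 3, 4]
x0, h = 2.5, 1.0
weights = [math.exp(-(xi - x0)**2 / (2 * h**2)) for xi in xs]
print([round(w, 4) for w in weights])
# [0.3247, 0.8825, 0.8825, 0.3247]
```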
Then, we can estimate the value of the regression function at x = 2.5 by taking a weighted average of the target variables using these weights. In this case, the estimate would be:
y_hat = (0.3247 * 2 + 0.8825 * 3 + 0.8825 * 5 + 0.3247 * 7) / (0.3247 + 0.8825 + 0.8825 + 0.3247) ≈ 4.1345
As expected for an evaluation point midway between x2 and x3, the estimate falls between y2 = 3 and y3 = 5, pulled slightly upward by the outer points.
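Putting the pieces together, here is a minimal sketch of the full estimator in Python (the function name nadaraya_watson is our own choice, not a library API):

```python
import math

def nadaraya_watson(x0, xs, ys, h=1.0):
    # Kernel-weighted average of the targets ys, with a Gaussian kernel
    # of bandwidth h centered at the evaluation point x0.
    weights = [math.exp(-(xi - x0)**2 / (2 * h**2)) for xi in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

xs = [1, 2, 3, 4]
ys = [2, 3, 5, 7]
print(round(nadaraya_watson(2.5, xs, ys), 4))  # 4.1345, matching the hand computation
```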
The Nadaraya-Watson estimator is a simple and effective method for estimating the regression function, particularly when the data is noisy or when the functional form of the regression function is unknown, since it makes no parametric assumptions about that form. In practice, its quality hinges on the choice of the bandwidth h: too large a bandwidth oversmooths the data and blurs real structure, while too small a bandwidth undersmooths and chases noise.
In conclusion, the Nadaraya-Watson estimator is a nonparametric method that estimates the regression function at a point by taking a kernel-weighted average of the training targets, with weights that decrease with distance from the evaluation point. Its simplicity, together with having only a single bandwidth parameter to tune, makes it a natural first choice for nonparametric regression.