The nearest neighbor idea, and local learning in general, is not limited to classification; many of the ideas are in fact more easily illustrated in the regression setting. Consider the following one-dimensional regression problems (examples from http://www.autonlab.org/tutorials/mbl08.pdf):

[Figures: example 0, example 1, example 2]


Clearly, linear models do not capture these data well, as examples 0 through 2 illustrate. We can add more features, such as higher-order polynomial terms, or we can use a local approach, like nearest neighbor:

In one dimension, each training point produces a horizontal segment that extends from halfway to the point on its left to halfway to the point on its right; these midpoints are where the prediction changes. In higher dimensions, if we use the standard Euclidean distance (aka the L2 norm) \|\mathbf{x}-\mathbf{x}_i\|_2, the regions in which the closest training instance is the same are polygonal. This tessellation of the input space into nearest-neighbor cells is known in geometry as a Voronoi diagram. In 2D, it is easy to visualize (try the applet below, which allows you to move the points around!).

[Applet: interactive Voronoi diagram]
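To make the one-dimensional construction above concrete, here is a minimal sketch of 1-nearest-neighbor regression; the training values are made up for illustration, not taken from the slide examples:

```python
import numpy as np

def nn1_predict(x_train, y_train, x_query):
    """1-nearest-neighbor regression: each query inherits the target
    of its closest training point."""
    # |x_query_j - x_train_i| for all pairs; argmin over training points
    idx = np.argmin(np.abs(x_query[:, None] - x_train[None, :]), axis=1)
    return y_train[idx]

# Toy 1D training set (made-up values)
x_train = np.array([0.0, 1.0, 3.0, 4.0])
y_train = np.array([1.0, 2.0, 0.5, 3.0])

# The prediction is piecewise constant and changes exactly at the
# midpoints between consecutive training points: 0.5, 2.0, 3.5.
x_query = np.linspace(-1.0, 5.0, 13)
print(np.column_stack([x_query, nn1_predict(x_train, y_train, x_query)]))
```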


More generally, we can use an arbitrary distance function to define a nearest neighbor. Some popular choices besides \|\mathbf{x}-\mathbf{x}_i\|_2 include other norms:

* the L1 (Manhattan) norm \|\mathbf{x}-\mathbf{x}_i\|_1 = \sum_j |x_j - x_{i,j}|,
* the L-infinity (max) norm \|\mathbf{x}-\mathbf{x}_i\|_\infty = \max_j |x_j - x_{i,j}|,
* and, in general, the Lp norm \|\mathbf{x}-\mathbf{x}_i\|_p = \left(\sum_j |x_j - x_{i,j}|^p\right)^{1/p}.
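As a quick illustration that the choice of norm matters, here is a small sketch (the 2D points are made up for this example) showing that different norms can disagree about which training point is nearest:

```python
import numpy as np

def nearest_neighbor(X_train, x, p=2):
    """Index of the training point nearest to x under the Lp norm."""
    dists = np.linalg.norm(X_train - x, ord=p, axis=1)
    return int(np.argmin(dists))

X_train = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 1.0]])  # made-up 2D points
x = np.array([1.0, 1.0])

for p in (1, 2, np.inf):  # Manhattan, Euclidean, and max norms
    print(p, nearest_neighbor(X_train, x, p=p))
```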

Here are all the learned models on our examples (1-nearest-neighbor (1NN), 9-nearest-neighbor (9NN), kernel regression (KR), and locally weighted regression (LWR)):
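For reference, here are minimal sketches of the three kinds of local model named above: k-nearest-neighbor averaging, kernel regression (here in its standard Nadaraya-Watson form), and locally weighted linear regression. The Gaussian kernel, the bandwidth value, and the toy sine data are illustrative assumptions, not choices from the original examples:

```python
import numpy as np

def knn_regress(x_train, y_train, x, k):
    """k-nearest-neighbor regression: average the targets of the k closest points."""
    idx = np.argsort(np.abs(x_train - x))[:k]
    return y_train[idx].mean()

def kernel_regress(x_train, y_train, x, bandwidth):
    """Kernel regression (Nadaraya-Watson): a weighted average of all targets,
    with weights that decay smoothly with distance from the query."""
    w = np.exp(-0.5 * ((x_train - x) / bandwidth) ** 2)
    return np.sum(w * y_train) / np.sum(w)

def lwr_regress(x_train, y_train, x, bandwidth):
    """Locally weighted regression: fit a weighted least-squares line
    around the query and evaluate it at the query."""
    w = np.exp(-0.5 * ((x_train - x) / bandwidth) ** 2)
    X = np.column_stack([np.ones_like(x_train), x_train])  # intercept and slope
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y_train))
    return beta[0] + beta[1] * x

x_train = np.linspace(0.0, 4.0, 9)  # toy 1D data
y_train = np.sin(x_train)
for f, kw in [(knn_regress, {"k": 1}), (knn_regress, {"k": 9}),
              (kernel_regress, {"bandwidth": 0.5}), (lwr_regress, {"bandwidth": 0.5})]:
    print(f.__name__, kw, f(x_train, y_train, 1.3, **kw))
```

Note how the design choices line up with the figures: 1NN is piecewise constant and jagged, 9NN smooths by averaging more neighbors, KR smooths further by weighting every point, and LWR additionally captures local slope.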

