Monday, November 29, 2021

On Learning in High Dimension Always Amounts to Extrapolation

Recommendable! This is a new paper from Facebook AI Research, co-authored by Yann LeCun. I am not sure what the ramifications of its conclusions are, but it may turn out to be quite relevant.

From the abstract:
"The notion of interpolation and extrapolation is fundamental in various fields from deep learning to function approximation. Interpolation occurs for a sample x whenever this sample falls inside or on the boundary of the given dataset's convex hull. Extrapolation occurs when x falls outside of that convex hull. One fundamental (mis)conception is that state-of-the-art algorithms work so well because of their ability to correctly interpolate training data. A second (mis)conception is that interpolation happens throughout tasks and datasets, in fact, many intuitions and theories rely on that assumption. We empirically and theoretically argue against those two points and demonstrate that on any high-dimensional (>100) dataset, interpolation almost surely never happens. Those results challenge the validity of our current interpolation/extrapolation definition as an indicator of generalization performances."

[2110.09485] Learning in High Dimension Always Amounts to Extrapolation (open access)
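To make the paper's definition of interpolation concrete, here is a minimal sketch (not from the paper's own code) of the convex-hull membership test it is based on: a new sample x counts as "interpolation" only if it lies inside or on the boundary of the convex hull of the training set. Membership can be checked as a linear-programming feasibility problem: find weights λ ≥ 0 with Σλ = 1 and Xᵀλ = x. The helper name `in_convex_hull`, the Gaussian toy data, and the dimensions tried are my own choices for illustration, assuming NumPy and SciPy are available.

```python
import numpy as np
from scipy.optimize import linprog


def in_convex_hull(X, x):
    """Return True if point x lies in the convex hull of the rows of X."""
    n, d = X.shape
    # Equality constraints: X^T lambda = x  and  sum(lambda) = 1
    A_eq = np.vstack([X.T, np.ones((1, n))])   # shape (d + 1, n)
    b_eq = np.concatenate([x, [1.0]])          # shape (d + 1,)
    # Any feasible lambda suffices, so the objective is all zeros.
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    return res.success


# Toy illustration (my own setup, not the paper's experiments): with a fixed
# number of training points, new samples almost never land inside the hull
# once the dimension grows into the hundreds.
rng = np.random.default_rng(0)
for d in (2, 10, 100, 500):
    X = rng.standard_normal((1000, d))         # 1000 Gaussian "training" points
    hits = sum(in_convex_hull(X, rng.standard_normal(d)) for _ in range(50))
    print(f"d={d:4d}: {hits}/50 new samples fall inside the training hull")
```

On such a toy run the hull test succeeds often in low dimension and essentially never at d=100 or above, which is the intuition behind the paper's claim that learning in high dimension always amounts to extrapolation in the convex-hull sense.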
