Wednesday, October 27, 2021

Notes on "Learning in High Dimension Always Amounts to Extrapolation"

Highly recommended! This recent paper from Facebook AI Research, co-authored by Yann LeCun, raises serious questions about machine learning and its prediction capabilities.

From the abstract: 
"The notion of interpolation and extrapolation is fundamental in various fields from deep learning to function approximation. ... One fundamental (mis)conception is that state-of-the-art algorithms work so well because of their ability to correctly interpolate training data. A second (mis)conception is that interpolation happens throughout tasks and datasets, in fact, many intuitions and theories rely on that assumption. We ... demonstrate that on any high-dimensional (>100) dataset, interpolation almost surely never happens. Those results challenge the validity of our current interpolation/extrapolation definition as an indicator of generalization performances."

[2110.09485] Learning in High Dimension Always Amounts to Extrapolation
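The paper's definition of interpolation is geometric: a new sample interpolates if it falls inside the convex hull of the training set. Its central claim, that this almost never happens in high dimensions, is easy to check numerically. Below is a small sketch of my own (not code from the paper): convex-hull membership is a linear-programming feasibility problem, solved here with SciPy's `linprog`. The dimensions, sample counts, and Gaussian data are illustrative choices, not the paper's experimental setup.

```python
import numpy as np
from scipy.optimize import linprog

def in_convex_hull(point, points):
    """Check if `point` is a convex combination of rows of `points`.

    Feasibility LP: find lambda >= 0 with sum(lambda) = 1
    and points.T @ lambda = point.
    """
    n = len(points)
    A_eq = np.vstack([points.T, np.ones(n)])
    b_eq = np.concatenate([point, [1.0]])
    res = linprog(np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n)
    return res.status == 0  # 0 means a feasible solution was found

rng = np.random.default_rng(0)
results = {}
for d in (2, 100):
    # Training and test points drawn from the same distribution.
    train = rng.standard_normal((1000, d))
    test = rng.standard_normal((100, d))
    frac = np.mean([in_convex_hull(x, train) for x in test])
    results[d] = frac
    print(f"d={d}: fraction of test points inside the hull = {frac:.2f}")
```

In 2 dimensions, most test points land inside the hull of 1000 training samples; in 100 dimensions, essentially none do, even though train and test come from the identical distribution. By the paper's definition, the high-dimensional model is always extrapolating.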
