Saturday, May 02, 2020

A foolproof way to shrink deep learning models

This looks like a promising line of research in AI and machine learning. Models have grown tremendously in size and complexity, and bigger models have usually meant better results. However, several research papers suggest that models do not need to be this large: they can often be compressed substantially with little or no loss of accuracy.

A foolproof way to shrink deep learning models | MIT News: MIT researchers have proposed a technique for shrinking deep learning models that they say is simpler and produces more accurate results than state-of-the-art methods. It works by retraining the smaller, pruned model at its faster, initial learning rate.
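The idea described above is often called magnitude pruning with learning-rate rewinding: remove the smallest-magnitude weights, then retrain the surviving weights using the same high initial learning rate (and schedule) from the original training run, rather than fine-tuning at the final, small learning rate. The sketch below illustrates this on a toy linear-regression model; the decaying schedule, sizes, and pruning fraction are illustrative assumptions, not the MIT authors' exact setup.

```python
import numpy as np

def magnitude_prune(weights, fraction):
    """Zero out the smallest-magnitude fraction of weights; return pruned weights and mask."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * fraction)
    if k == 0:
        return weights, np.ones_like(weights, dtype=bool)
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Toy data: 10 features, but only the first 5 actually matter,
# so a 50%-pruned model can in principle match the dense one.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
true_w = np.zeros(10)
true_w[:5] = [1.0, -2.0, 0.5, 1.5, -1.0]
y = X @ true_w

def train(w, mask, lr0, steps):
    """Gradient descent on mean squared error with a decaying learning rate.

    Rewinding means calling this again after pruning with the SAME lr0
    (the initial learning rate), not the small rate training ended with.
    """
    for t in range(steps):
        lr = lr0 / (1 + 0.01 * t)            # hypothetical decay schedule
        grad = 2 * X.T @ (X @ w - y) / len(X)
        w = (w - lr * grad) * mask            # pruned weights stay at zero
    return w

w = rng.normal(size=10)
w = train(w, np.ones(10, dtype=bool), lr0=0.1, steps=200)  # 1. dense training
w_pruned, mask = magnitude_prune(w, fraction=0.5)          # 2. prune 50% by magnitude
w_rewound = train(w_pruned, mask, lr0=0.1, steps=200)      # 3. retrain at initial lr
```

After step 3 the pruned model, retrained from the rewound learning rate, fits the data as well as the dense one, because the surviving weights can readjust with the same optimization dynamics the full model had early in training.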
