Sunday, May 10, 2020

AI and Efficiency

OpenAI is attempting to define something like a Moore's law for AI & machine learning!



"We’re releasing an analysis showing that since 2012 the amount of compute needed to train a neural net to the same performance on ImageNet classification has been decreasing by a factor of 2 every 16 months ...

Within translation, the Transformer surpassed seq2seq performance on English to French translation on WMT’14 with 61x less training compute 3 years later.

We estimate AlphaZero took 8x less compute to get to AlphaGoZero level performance 1 year later. ..."
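As a rough sanity check on the headline trend, a halving every 16 months implies a cumulative efficiency gain of 2^(t/16) after t months. A minimal sketch (the function name and 7-year window are my own illustration, not from the OpenAI analysis):

```python
# Rough arithmetic check on the quoted trend: if the compute needed to
# reach a fixed benchmark halves every 16 months, the cumulative
# efficiency gain over t months is 2 ** (t / 16).
def efficiency_gain(months, doubling_period=16):
    """Multiplicative reduction in training compute after `months`."""
    return 2 ** (months / doubling_period)

# Seven years (2012 to 2019) at this rate:
print(round(efficiency_gain(7 * 12)))  # roughly 38x
```

This back-of-the-envelope figure is in the same ballpark as the gains OpenAI reports over that period.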

