We are getting closer to a brain on a chip! And what a gigantic chip it is! It probably doesn't fit in a typical smartphone yet! 😄
This could be a game changer!
"... for the first time ever, the ability to train models with up to 20 billion parameters on a single CS-2 system – a feat not possible on any other single device. By enabling a single CS-2 to train these models, Cerebras reduces the system engineering time necessary to run [train] large natural language processing (NLP) models from months to minutes. It also eliminates one of the most painful aspects of NLP — namely the partitioning of the model across hundreds or thousands of small graphics processing units (GPU)."
Cerebras Systems Sets Record for Largest AI Models Ever Trained on a Single Device
Single CS-2 system trains multi-billion parameter NLP models including GPT-3XL 1.3 billion, GPT-J 6 billion, GPT-3 13 billion and GPT-NeoX 20 billion; provides simple setup, faster training and enables switching between models with a few keystrokes.
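To see why these model sizes are notable for a single device, here is a rough back-of-envelope estimate (my own, using the common mixed-precision Adam rule of thumb of roughly 16 bytes of training state per parameter, not an official Cerebras figure):

```python
# Rough, hypothetical estimate of training-state memory for the quoted model sizes.
# Assumes mixed-precision Adam: ~2 bytes fp16 weights + ~2 bytes fp16 gradients
# + ~12 bytes fp32 optimizer state per parameter (a common rule of thumb).
BYTES_PER_PARAM = 16

models = {
    "GPT-3XL":   1.3e9,
    "GPT-J":     6e9,
    "GPT-3 13B": 13e9,
    "GPT-NeoX":  20e9,
}

for name, n_params in models.items():
    gib = n_params * BYTES_PER_PARAM / 2**30
    print(f"{name:>10}: ~{gib:,.0f} GiB of training state")

# The 20B-parameter model lands at roughly 300 GiB of training state, several
# times the memory of a typical 40-80 GB GPU, which is why multi-GPU
# partitioning is normally unavoidable.
```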