Very impressive!
"Generalist AI introduced GEN-0, a class of embodied foundation models trained directly on physical interaction data that demonstrates predictable scaling laws similar to those in large language models.
The company trained GEN-0 on over 270,000 hours of real-world manipulation data — orders of magnitude more than existing robotics datasets — and observed a phase transition at 7 billion parameters where smaller models exhibited ossification (inability to absorb new information) while larger models continued to improve.
The models use a training approach that enables simultaneous thinking and acting by processing asynchronous streams of sensing and action tokens, and work across different robot configurations including six-, seven-, and 16+-degree-of-freedom semi-humanoid robots.
The research demonstrates that pretraining data follows a power-law scaling relationship with downstream task performance, allowing researchers to predict how much data is needed to reach specific performance levels." (Data Points newsletter)
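The "power-law scaling relationship" claim can be made concrete with a small sketch: fit a power law error(D) = a * D^b in log space, then invert it to estimate how much data reaches a target performance level. The numbers below are made up for illustration and are not GEN-0 measurements.

```python
import numpy as np

# Hypothetical (hours of manipulation data, downstream error) pairs --
# invented for illustration, not actual GEN-0 results.
hours = np.array([1e3, 1e4, 1e5, 2.7e5])
error = np.array([0.50, 0.32, 0.20, 0.16])

# Fit in log space: log(error) = b * log(hours) + log(a),
# so np.polyfit returns the (negative) exponent b and intercept log(a).
b, log_a = np.polyfit(np.log(hours), np.log(error), 1)
a = np.exp(log_a)

def predict_error(d):
    """Predicted downstream error after pretraining on d hours of data."""
    return a * d ** b

def hours_for_error(target):
    """Invert the power law: data hours needed to reach a target error."""
    return (target / a) ** (1.0 / b)
```

This is the basic mechanics behind "predict how much data is needed": because the relationship is linear in log-log space, it can be extrapolated beyond the measured range, with the usual caveat that extrapolation assumes the power law continues to hold.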
"... One core feature is Harmonic Reasoning, in which the models are trained to simultaneously think and act seamlessly. ...
Surpassing the Intelligence Threshold – in an unprecedented high-data regime for robotics, we observe a phase transition at 7B where smaller models exhibit ossification, while larger ones continue to improve. We’ve since scaled GEN-0 to 10B+ model sizes, and observe fast adaptation to new tasks with increasingly less post-training.
Scaling Laws – GEN-0 models exhibit strong scaling laws, in which more pretraining data and compute consistently (and predictably) improve downstream post-training performance of the model across many tasks. ..."
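The "simultaneously think and act" idea of Harmonic Reasoning can be sketched with two concurrent loops: a sensing stream that keeps publishing observation tokens while an acting loop consumes the freshest observation at its own rate, never blocking on the other. This is purely an assumed toy illustration of asynchronous sense-act streams, not Generalist's actual architecture.

```python
import asyncio

latest_obs = None  # freshest observation token shared between the two loops

async def sense(n_ticks=5):
    """Simulated sensor stream: publish a new observation token each tick."""
    global latest_obs
    for t in range(n_ticks):
        latest_obs = f"obs-{t}"
        await asyncio.sleep(0.01)

async def act(actions, n_ticks=5):
    """Action loop: emit an action based on the freshest observation,
    without waiting for the sensing stream to finish."""
    for _ in range(n_ticks):
        actions.append(f"act-on-{latest_obs}")
        await asyncio.sleep(0.015)

async def main():
    actions = []
    # Both loops run concurrently: sensing never pauses for acting and vice versa.
    await asyncio.gather(sense(), act(actions))
    return actions

actions = asyncio.run(main())
```

The design point the sketch shows: because the loops tick at different rates, each action uses whatever observation is current at that moment, rather than a strict observe-then-act turn-taking cycle.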