No reason to panic or to become hysterical!
Remember garbage in, garbage out? Well, this seems to be a case of sensitive information in, sensitive information out.
The OpenAI GPT-2 language model was trained on 40 GB of Internet text. Meanwhile, GPT-3 has been released, trained on 45 TB of compressed plain text!
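To make the "sensitive information in, sensitive information out" point concrete, here is a minimal sketch of how memorized training text can resurface. It assumes the Hugging Face transformers package and the public "gpt2" checkpoint; the prompt and the email regex are purely illustrative, not part of any published attack recipe.

```python
# Sketch: sample freely from GPT-2 and scan the generations for strings that
# look like personal data (here, email addresses). Assumes `pip install
# transformers torch`; the prompt and regex are illustrative placeholders.
import re
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Contact us at"  # hypothetical prompt nudging the model toward contact details
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    do_sample=True,            # free-running sampling rather than greedy decoding
    max_new_tokens=64,
    top_k=40,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,
)

email_pattern = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
for seq in outputs:
    text = tokenizer.decode(seq, skip_special_tokens=True)
    hits = email_pattern.findall(text)
    if hits:
        print("Possible memorized contact info:", hits)
```

Any hits would of course need to be checked against the training data before being called memorization, but the sketch shows how little machinery is needed to go fishing for it.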
This can be fixed, and it should not be very difficult!