Sunday, May 09, 2021

MathBERT: A Pre-Trained Model for Mathematical Formula Understanding

Since its arrival in late 2018, BERT has been everywhere! I wish I could understand mathematical formulas better! 😄

"Large-scale pre-trained models like BERT, have obtained a great success in various Natural Language Processing (NLP) tasks, while it is still a challenge to adapt them to the math-related tasks. Current pre-trained models neglect the structural features and the semantic correspondence between formula and its context. To address these issues, we propose a novel pre-trained model, namely \textbf{MathBERT}, which is jointly trained with mathematical formulas and their corresponding contexts. In addition, in order to further capture the semantic-level structural features of formulas, a new pre-training task is designed to predict the masked formula substructures extracted from the Operator Tree (OPT), which is the semantic structural representation of formulas. ... To the best of our knowledge, MathBERT is the first pre-trained model for mathematical formula understanding."

[2105.00377] MathBERT: A Pre-Trained Model for Mathematical Formula Understanding (https://arxiv.org/abs/2105.00377)
