TechRxiv

DIBERT: Dependency Injected Bidirectional Encoder Representations from Transformers

Preprint posted on 2021-10-18, 21:54, authored by Abdul Wahab and Rafet Sifa

In this paper, we propose a new model named DIBERT, which stands for Dependency Injected Bidirectional Encoder Representations from Transformers. DIBERT is a variation of BERT with an additional third pre-training objective, Parent Prediction (PP), alongside Masked Language Modeling (MLM) and Next Sentence Prediction (NSP). PP injects the syntactic structure of a dependency tree during pre-training, so that DIBERT produces syntax-aware generic representations. We use the WikiText-103 benchmark dataset to pre-train both BERT-Base and DIBERT. After fine-tuning, we observe that DIBERT outperforms BERT-Base on several downstream tasks, including Semantic Similarity, Natural Language Inference, and Sentiment Analysis.
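To make the idea of a Parent Prediction objective concrete, the following is a minimal sketch of one possible way such a head could sit on top of BERT-style token representations: for every token, score which other position in the sequence is its dependency parent and apply a cross-entropy loss against parents taken from a dependency parse. The names (PPHead, parent_ids), the dot-product scorer, and the way the loss is combined with MLM and NSP are illustrative assumptions; the exact formulation used in DIBERT is described in the preprint itself, not here.

import torch
import torch.nn as nn

class PPHead(nn.Module):
    """Scores, for every token, which position in the sequence is its dependency parent."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.query = nn.Linear(hidden_size, hidden_size)  # child (dependent) representation
        self.key = nn.Linear(hidden_size, hidden_size)    # candidate-parent representation

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden) from the transformer encoder
        q = self.query(hidden_states)                      # (B, T, H)
        k = self.key(hidden_states)                        # (B, T, H)
        # logits[b, i, j] = score that token j is the parent of token i
        return torch.einsum("bih,bjh->bij", q, k)

def parent_prediction_loss(logits: torch.Tensor, parent_ids: torch.Tensor) -> torch.Tensor:
    # parent_ids: (B, T) index of each token's parent in the sequence,
    # with -100 for positions to ignore (special tokens, padding, roots).
    B, T, _ = logits.shape
    return nn.functional.cross_entropy(
        logits.reshape(B * T, T), parent_ids.reshape(B * T), ignore_index=-100
    )

if __name__ == "__main__":
    B, T, H = 2, 8, 32
    hidden = torch.randn(B, T, H)              # stand-in for encoder output
    parents = torch.randint(0, T, (B, T))      # stand-in for parsed parent indices
    pp_logits = PPHead(H)(hidden)
    pp_loss = parent_prediction_loss(pp_logits, parents)
    # During pre-training this term would be added to the usual objectives, e.g.
    # total_loss = mlm_loss + nsp_loss + pp_loss
    print(pp_loss.item())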

History

Email Address of Submitting Author: abdul.wahab@rwth-aachen.de
ORCID of Submitting Author: 0000-0001-7813-3805
Submitting Author's Institution: RWTH Aachen
Submitting Author's Country: Germany