Structure-aware sentence encoder in Bert-based siamese network

Peng, Qiwei, Weir, David and Weeds, Julie (2021) Structure-aware sentence encoder in Bert-based siamese network. In: Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021), Online, 6 August 2021. Association for Computational Linguistics, Bangkok, Thailand, pp. 57-63. ISBN 9781954085725

PDF - Published Version (295 kB)
Available under License Creative Commons Attribution-NonCommercial-ShareAlike.

Abstract

Recently, impressive performance on various natural language understanding tasks has been achieved by explicitly incorporating syntactic and semantic information into pre-trained models, such as BERT and RoBERTa. However, this approach depends on problem-specific fine-tuning, and as widely noted, BERT-like models exhibit weak performance and are inefficient when applied to unsupervised similarity comparison tasks. Sentence-BERT (SBERT) has been proposed as a general-purpose sentence embedding method, suited to both similarity comparison and downstream tasks. In this work, we show that by incorporating structural information into SBERT, the resulting model outperforms SBERT and previous general sentence encoders on unsupervised semantic textual similarity (STS) datasets and transfer classification tasks.
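
For context, the unsupervised similarity comparison the abstract refers to works by encoding each sentence independently with a shared (Siamese) BERT encoder and comparing the resulting fixed-size embeddings with cosine similarity, so no pairwise fine-tuning is needed at comparison time. The sketch below illustrates only this SBERT-style baseline, not the paper's structure-aware extension; the sentence-transformers package and the checkpoint name are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of SBERT-style unsupervised similarity comparison.
    # Assumes the sentence-transformers package; the checkpoint name is
    # illustrative and is not the model trained in the paper.
    from sentence_transformers import SentenceTransformer, util

    # Siamese BERT encoder with mean pooling over token embeddings.
    model = SentenceTransformer("bert-base-nli-mean-tokens")

    sentences = [
        "A man is playing a guitar.",
        "Someone is strumming a guitar.",
    ]

    # Each sentence is encoded independently by the shared encoder, so
    # similarity reduces to one cosine between two fixed-size vectors.
    embeddings = model.encode(sentences, convert_to_tensor=True)
    score = util.cos_sim(embeddings[0], embeddings[1]).item()
    print(f"cosine similarity: {score:.3f}")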

Item Type: Conference Proceedings
Schools and Departments: School of Engineering and Informatics > Informatics
SWORD Depositor: Mx Elements Account
Depositing User: Mx Elements Account
Date Deposited: 12 Oct 2021 08:18
Last Modified: 30 Nov 2021 16:47
URI: http://sro.sussex.ac.uk/id/eprint/102252

