University of Sussex

Structure-aware sentence encoder in Bert-based siamese network

conference contribution
posted on 2023-06-10, 01:22 authored by Qiwei Peng, David Weir, Julie Weeds
Recently, impressive performance on various natural language understanding tasks has been achieved by explicitly incorporating syntax and semantic information into pre-trained models, such as BERT and RoBERTa. However, this approach depends on problem-specific fine-tuning, and as widely noted, BERT-like models exhibit weak performance, and are inefficient, when applied to unsupervised similarity comparison tasks. Sentence-BERT (SBERT) has been proposed as a general-purpose sentence embedding method, suited to both similarity comparison and downstream tasks. In this work, we show that by incorporating structural information into SBERT, the resulting model outperforms SBERT and previous general sentence encoders on unsupervised semantic textual similarity (STS) datasets and transfer classification tasks.
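For context, the unsupervised similarity comparison setting described above can be illustrated with a plain SBERT-style baseline: each sentence is encoded once into a fixed vector by a siamese BERT encoder, and sentence pairs are scored with cosine similarity, without any task-specific fine-tuning. The sketch below uses the sentence-transformers library; the checkpoint name is only an illustrative choice and this shows the SBERT baseline setup, not the structure-aware encoder proposed in the paper.

# Minimal sketch of unsupervised STS scoring with an SBERT-style encoder.
# Assumes the sentence-transformers package; the checkpoint name below is an
# illustrative example, not the model used in the paper.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # example pre-trained SBERT checkpoint

pairs = [
    ("A man is playing a guitar.", "Someone plays a guitar."),
    ("A man is playing a guitar.", "A chef is cooking pasta."),
]

for s1, s2 in pairs:
    # Each sentence is encoded independently (siamese setup), then the two
    # embeddings are compared with cosine similarity -- no pairwise fine-tuning.
    emb1 = model.encode(s1, convert_to_tensor=True)
    emb2 = model.encode(s2, convert_to_tensor=True)
    score = util.cos_sim(emb1, emb2).item()
    print(f"{score:.3f}  {s1!r} vs {s2!r}")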

History

Publication status

  • Published

File Version

  • Published version

Journal

Proceedings of the 6th Workshop on Representation Learning for NLP

Publisher

Association for Computational Linguistics

Page range

57-63

Event name

6th Workshop on Representation Learning for NLP (RepL4NLP-2021)

Event location

Online

Event type

conference

Event date

August 6, 2021

Place of publication

Bangkok, Thailand

ISBN

9781954085725

Department affiliated with

  • Informatics Publications

Full text available

  • Yes

Peer reviewed?

  • Yes

Legacy Posted Date

2021-10-12

First Open Access (FOA) Date

2021-10-12

First Compliant Deposit (FCD) Date

2021-10-12
