
Towards unsupervised compositional entailment with multi-graph embedding models

conference contribution
posted on 2023-11-29, 17:23 authored by Lorenzo Scott Bertolini, Julie Weeds, David Weir
Compositionality and inference are essential features of human language and should hence be simultaneously accessible to a model of meaning. Despite being theory-grounded, distributional models can only be directly tested on compositionality, usually through similarity judgements, while testing for inference requires external resources. Recent work has shown that knowledge graph embedding (KGE) architectures can be used to train distributional models capable of learning syntax-aware compositional representations by training on syntactic graphs. We propose to expand such work with Multi-Graph embedding (MuG) models, a new family of models that learn from both syntactic graphs and knowledge graphs. Using a compositional entailment task, we show how MuGs can simultaneously handle syntax-aware composition and inference while remaining competitive distributional models with respect to lexical and compositional similarity.
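
To make the core idea of the abstract concrete, the sketch below shows one way a single KGE-style embedding space could be trained on triples drawn from both a syntactic graph (dependency relations such as "nsubj" and "dobj") and a knowledge graph (a "hypernym" relation). This is a minimal illustration using a TransE-style score and a toy triple set of our own choosing, not the authors' MuG implementation, architecture, or data.

```python
# Minimal sketch (assumptions, not the paper's method): one embedding space
# trained on mixed syntactic + knowledge-graph triples with a TransE-style
# translation score.  Toy triples and relation names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# (head, relation, tail): "nsubj"/"dobj" from parses, "hypernym" from a KG.
triples = [
    ("dog",   "nsubj",    "bark"),
    ("cat",   "nsubj",    "sleep"),
    ("chase", "dobj",     "cat"),
    ("dog",   "hypernym", "animal"),
    ("cat",   "hypernym", "animal"),
]

entities  = sorted({x for h, _, t in triples for x in (h, t)})
relations = sorted({r for _, r, _ in triples})
e_idx = {e: i for i, e in enumerate(entities)}
r_idx = {r: i for i, r in enumerate(relations)}

dim = 16
E = rng.normal(scale=0.1, size=(len(entities), dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(len(relations), dim))  # relation embeddings

def score(h, r, t):
    """Squared translation distance: smaller means a more plausible triple."""
    d = E[e_idx[h]] + R[r_idx[r]] - E[e_idx[t]]
    return float(d @ d)

# A few epochs of margin-based SGD with randomly corrupted tails as negatives.
lr, margin = 0.05, 1.0
for _ in range(50):
    for h, r, t in triples:
        t_neg = entities[rng.integers(len(entities))]
        pos, neg = score(h, r, t), score(h, r, t_neg)
        if pos + margin > neg:  # margin violated: pull positive, push negative
            g_pos = E[e_idx[h]] + R[r_idx[r]] - E[e_idx[t]]
            g_neg = E[e_idx[h]] + R[r_idx[r]] - E[e_idx[t_neg]]
            E[e_idx[h]]     -= lr * (g_pos - g_neg)
            R[r_idx[r]]     -= lr * (g_pos - g_neg)
            E[e_idx[t]]     += lr * g_pos
            E[e_idx[t_neg]] -= lr * g_neg

# A plausible KG triple should now score lower (better) than an implausible one.
print(score("dog", "hypernym", "animal"), score("dog", "hypernym", "bark"))
```

Because syntactic and knowledge-graph relations share one entity space in this sketch, the same word vectors are shaped by both composition-relevant structure (parses) and inference-relevant structure (e.g. hypernymy), which is the general intuition the abstract describes.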

History

Publication status

  • Published

File Version

  • Published version

Journal

15th International Conference on Computational Semantics (IWCS) Proceedings

Publisher

Association for Computational Linguistics

Page range

50-61

Event name

15th International Conference on Computational Semantics (IWCS)

Event location

Nancy, France

Event type

conference

Event date

20th to 23rd June 2023

Department affiliated with

  • Informatics Publications

Institution

University of Sussex

Full text available

  • Yes

Peer reviewed?

  • Yes

Legacy Posted Date

2023-05-04

First Compliant Deposit (FCD) Date

2023-05-04
