2023.iwcs-1.5.pdf (4.36 MB)
Towards unsupervised compositional entailment with multi-graph embedding models
conference contribution
posted on 2023-11-29, 17:23, authored by Lorenzo Scott Bertolini, Julie Weeds, David Weir

Abstract
Compositionality and inference are essential features of human language, and should hence be simultaneously accessible to a model of meaning. Despite being theory-grounded, distributional models can only be directly tested on compositionality, usually through similarity judgements, while testing for inference requires external resources. Recent work has shown that knowledge graph embedding (KGE) architectures can be used to train distributional models capable of learning syntax-aware compositional representations, by training on syntactic graphs. We propose to expand such work with Multi-Graph embedding (MuG) models, a new set of models learning from syntactic and knowledge graphs. Using a compositional entailment task, we show how MuGs can simultaneously handle syntax-aware composition and inference, and remain competitive distributional models with respect to lexical and compositional similarity.
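The abstract does not specify which KGE architecture the MuG models build on, but the core idea of scoring triples from both syntactic graphs and knowledge graphs with a single embedding model can be illustrated with a minimal, hedged sketch. The snippet below uses a TransE-style distance score (a standard KGE technique, named here as an illustrative stand-in, not the paper's method); the vocabulary, relation labels, and dimensions are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabularies: nodes are words, relations mix dependency labels
# (syntactic graph) and semantic links (knowledge graph). All names and
# sizes are illustrative, not taken from the paper.
nodes = {w: i for i, w in enumerate(["dog", "animal", "barks"])}
relations = {r: i for i, r in enumerate(["nsubj", "hypernym"])}

dim = 8
E = rng.normal(size=(len(nodes), dim))      # node embeddings
R = rng.normal(size=(len(relations), dim))  # relation embeddings

def score(head: str, rel: str, tail: str) -> float:
    """TransE-style plausibility: smaller distance = more plausible triple."""
    h, r, t = E[nodes[head]], R[relations[rel]], E[nodes[tail]]
    return float(np.linalg.norm(h + r - t))

# A syntactic triple ("barks", "nsubj", "dog") and a knowledge-graph
# triple ("dog", "hypernym", "animal") pass through the same scoring
# function -- the property that lets one model consume both graph types.
print(score("barks", "nsubj", "dog"))
print(score("dog", "hypernym", "animal"))
```

In a trained model, the embeddings would be optimized so that attested triples from either graph score lower (closer) than corrupted ones; here the random initialization only demonstrates the shared scoring interface.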
History
Publication status
- Published
File Version
- Published version
Journal
15th International Conference on Computational Semantics (IWCS) Proceedings
Publisher
Association for Computational Linguistics
Publisher URL
Page range
50-61
Event name
15th International Conference on Computational Semantics (IWCS)
Event location
Nancy, France
Event type
conference
Event date
20th to 23rd June 2023
Department affiliated with
- Informatics Publications
Institution
University of Sussex
Full text available
- Yes
Peer reviewed?
- Yes
Legacy Posted Date
2023-05-04
First Compliant Deposit (FCD) Date
2023-05-04
Usage metrics
Categories
No categories selected
Keywords
Licence