University of Sussex

Matrix Logarithm Parametrizations for Neural Network Covariance Models

journal contribution
posted on 2023-06-08, 00:22 authored by Peter Williams
Neural networks are commonly used to model conditional probability distributions. The idea is to represent distributional parameters as functions of conditioning events, where the function is determined by the architecture and weights of the network. An issue to be resolved is the link between distributional parameters and network outputs. The latter are unconstrained real numbers whereas distributional parameters may be required to lie in proper subsets, or be mutually constrained, e.g. by the positive definiteness requirement for a covariance matrix. The paper explores the matrix-logarithm parametrization of covariance matrices for multivariate normal distributions. From a Bayesian point of view the choice of parametrization is linked to the choice of prior. This is treated by investigating the invariance of predictive distributions, for the chosen parametrization, with respect to an important class of priors.
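
As a rough sketch of the parametrization described above (illustrative only, not code from the paper): the network is assumed to emit d(d+1)/2 unconstrained real outputs, which are packed into a symmetric matrix S; the covariance is then recovered as Sigma = exp(S), which is symmetric positive definite for any real symmetric S. The function name and packing convention below are hypothetical.

    import numpy as np

    def outputs_to_covariance(z, d):
        """Map d*(d+1)/2 unconstrained outputs z to a d x d covariance matrix."""
        S = np.zeros((d, d))
        iu = np.triu_indices(d)            # upper triangle, including the diagonal
        S[iu] = z                          # fill upper triangle with network outputs
        S = S + S.T - np.diag(np.diag(S))  # symmetrize
        w, V = np.linalg.eigh(S)           # eigendecomposition of the symmetric S
        return (V * np.exp(w)) @ V.T       # matrix exponential: Sigma = V diag(e^w) V^T

    # Example: three unconstrained outputs parametrize a 2 x 2 covariance matrix
    z = np.array([0.3, -0.5, 0.1])
    Sigma = outputs_to_covariance(z, d=2)
    assert np.all(np.linalg.eigvalsh(Sigma) > 0)   # positive definite by construction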

History

Publication status

  • Published

Journal

Neural Networks

ISSN

0893-6080

Publisher

Elsevier

Issue

2

Volume

12

Page range

299-308

Department affiliated with

  • Informatics Publications

Full text available

  • No

Peer reviewed?

  • Yes

Legacy Posted Date

2012-02-06
