File(s) not publicly available
Matrix Logarithm Parametrizations for Neural Network Covariance Models
journal contribution
posted on 2023-06-08, 00:22, authored by Peter Williams

Neural networks are commonly used to model conditional probability distributions. The idea is to represent distributional parameters as functions of conditioning events, where the function is determined by the architecture and weights of the network. An issue to be resolved is the link between distributional parameters and network outputs. The latter are unconstrained real numbers whereas distributional parameters may be required to lie in proper subsets, or be mutually constrained, e.g. by the positive definiteness requirement for a covariance matrix. The paper explores the matrix-logarithm parametrization of covariance matrices for multivariate normal distributions. From a Bayesian point of view the choice of parametrization is linked to the choice of prior. This is treated by investigating the invariance of predictive distributions, for the chosen parametrization, with respect to an important class of priors.
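The parametrization described in the abstract can be illustrated with a minimal sketch (not the paper's own code). Assuming NumPy, the idea is that d(d+1)/2 unconstrained network outputs fill a symmetric matrix A, interpreted as the matrix logarithm of the covariance; the matrix exponential of any symmetric A is automatically symmetric positive definite. Since A is symmetric, the exponential can be taken through an eigendecomposition. The function name below is illustrative.

```python
import numpy as np

def log_param_to_covariance(theta, d):
    """Map d*(d+1)/2 unconstrained reals to a d x d SPD covariance.

    theta fills the upper triangle of a symmetric matrix A (the
    matrix logarithm of the covariance); Sigma = exp(A) is then
    guaranteed symmetric positive definite for any real theta.
    """
    A = np.zeros((d, d))
    A[np.triu_indices(d)] = theta            # fill upper triangle
    A = A + A.T - np.diag(np.diag(A))        # symmetrize
    w, V = np.linalg.eigh(A)                 # A = V diag(w) V^T
    return (V * np.exp(w)) @ V.T             # exp(A) = V diag(e^w) V^T

# Any real-valued network output vector yields a valid covariance:
theta = np.random.default_rng(0).normal(size=6)   # d = 3 -> 6 parameters
Sigma = log_param_to_covariance(theta, 3)
```

The inverse map (taking the matrix logarithm of a given covariance) is obtained the same way, replacing `np.exp(w)` with `np.log(w)`, which is why the parametrization is unconstrained in both directions.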
History
Publication status
- Published
Journal
Neural Networks
ISSN
0893-6080
Publisher
Elsevier
External DOI
Issue
2
Volume
12
Page range
299-308
ISBN
0893-6080
Department affiliated with
- Informatics Publications
Full text available
- No
Peer reviewed?
- Yes
Legacy Posted Date
2012-02-06