
## Bayesian machine learning via category theory

Authors/contributors

- Culbertson, Jared (Author)
- Sturtz, Kirk (Author)

Title

Bayesian machine learning via category theory

Abstract

From the Bayesian perspective, the category of conditional probabilities (a variant of the Kleisli category of the Giry monad, whose objects are measurable spaces and whose arrows are Markov kernels) provides a natural framework for conceptualizing and analyzing many aspects of machine learning. Using categorical methods, we construct models for parametric and nonparametric Bayesian reasoning on function spaces, thus providing a basis for the supervised learning problem. In particular, stochastic processes are arrows into these function spaces that serve as prior probabilities. The resulting inference maps can often be constructed analytically in this symmetric monoidal weakly closed category. We also show how to view general stochastic processes using functor categories and demonstrate the Kalman filter as an archetype for the hidden Markov model.
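The abstract's central structure, composition of Markov kernels in a Kleisli category, can be illustrated concretely in the finite (discrete) case. The sketch below is not from the paper; it assumes a hypothetical representation of a kernel X → P(Y) as a dict mapping each point of X to a probability distribution (itself a dict), and `compose` implements Kleisli composition, which here reduces to the Chapman-Kolmogorov equation:

```python
def compose(g, f):
    """Kleisli-compose finite Markov kernels f: X -> P(Y), g: Y -> P(Z).

    Returns the kernel X -> P(Z) given by summing over the
    intermediate space Y (Chapman-Kolmogorov).
    """
    out = {}
    for x, dist_y in f.items():
        dist_z = {}
        for y, p in dist_y.items():
            for z, q in g[y].items():
                dist_z[z] = dist_z.get(z, 0.0) + p * q
        out[x] = dist_z
    return out

# A fair-coin kernel followed by a noisy observation kernel.
f = {"x0": {"y0": 0.5, "y1": 0.5}}
g = {"y0": {"z0": 1.0}, "y1": {"z0": 0.5, "z1": 0.5}}
h = compose(g, f)
# h["x0"] == {"z0": 0.75, "z1": 0.25}
```

Ordinary measurable functions embed into this category as kernels whose distributions are Dirac measures (a single outcome with probability 1), which is the unit of the monad the abstract refers to.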

Publication

arXiv:1312.1445 [math]

Date

2013-12-05

Accessed

2019-11-22T17:32:35Z

Extra

arXiv: 1312.1445

Notes

Comment: 74 pages, comments welcome

Citation

Culbertson, J., & Sturtz, K. (2013). Bayesian machine learning via category theory. *arXiv:1312.1445 [math]*. Retrieved from http://arxiv.org/abs/1312.1445

Tags

MACHINE LEARNING, PROBABILITY & STATISTICS
