Lavoisier S.A.S.
14 rue de Provigny
94236 Cachan cedex
FRANCE

Opening hours: 08:30-12:30 / 13:30-17:30
Tel.: +33 (0)1 47 40 67 00
Fax: +33 (0)1 47 40 67 02


Canonical URL: www.lavoisier.fr/livre/informatique/variational-bayesian-learning-theory/descriptif_4187845
Short URL or permalink: www.lavoisier.fr/livre/notice.asp?ouvrage=4187845

Variational Bayesian Learning Theory

Language: English

Authors: Shinichi Nakajima, Kazuho Watanabe, Masashi Sugiyama

This introduction to the theory of variational Bayesian learning summarizes recent developments and suggests practical applications.
Variational Bayesian learning is one of the most popular methods in machine learning. Designed for researchers and graduate students in machine learning, this book summarizes recent developments in the non-asymptotic and asymptotic theory of variational Bayesian learning and suggests how this theory can be applied in practice. The authors begin by developing a basic framework with a focus on conjugacy, which enables the reader to derive tractable algorithms. Next, the book summarizes non-asymptotic theory, which, although limited in application to bilinear models, precisely describes the behavior of the variational Bayesian solution and reveals its sparsity-inducing mechanism. Finally, the text summarizes asymptotic theory, which reveals phase-transition phenomena depending on the prior setting, thus providing guidance on how to set hyperparameters for particular purposes. Detailed derivations allow readers to follow along without prior knowledge of the mathematical techniques specific to Bayesian learning.
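The role of conjugacy described above can be illustrated with the textbook mean-field example (a generic sketch, not material from this book): inferring the mean and precision of a Gaussian under a Normal prior on the mean and a Gamma prior on the precision. Because each prior is conjugate, both factors of the factorized posterior q(mu)q(tau) stay in closed form and coordinate-ascent updates are tractable. All variable names and hyperparameter defaults below are illustrative choices.

```python
import random

def vb_gaussian(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    """Coordinate-ascent VB for a Gaussian with unknown mean and precision.

    Factorized posterior q(mu) q(tau); conjugacy keeps q(mu) Normal and
    q(tau) Gamma, so each update is a closed-form moment computation.
    """
    n = len(x)
    s1 = sum(x)                      # sum of data
    s2 = sum(v * v for v in x)       # sum of squares
    e_tau = a0 / b0                  # initial guess for E[tau]
    for _ in range(iters):
        # Update q(mu) = N(mu_n, 1/lam_n), holding q(tau) fixed.
        mu_n = (lam0 * mu0 + s1) / (lam0 + n)
        lam_n = (lam0 + n) * e_tau
        # Update q(tau) = Gamma(a_n, b_n), holding q(mu) fixed.
        a_n = a0 + (n + 1) / 2.0
        e_sq = s2 - 2 * mu_n * s1 + n * (mu_n ** 2 + 1 / lam_n)
        e_prior = lam0 * ((mu_n - mu0) ** 2 + 1 / lam_n)
        b_n = b0 + 0.5 * (e_sq + e_prior)
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(500)]
mu_n, lam_n, a_n, b_n = vb_gaussian(data)
print(mu_n)        # posterior mean of mu (should sit near the true mean 2.0)
print(a_n / b_n)   # E[tau] (should sit near the true precision 1.0)
```

The updates are mutually coupled (q(mu) depends on E[tau] and vice versa), which is why the algorithm iterates; with conjugate priors each iteration is just a few arithmetic operations, which is the tractability the framework in this book builds on.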
Contents:
1. Bayesian learning
2. Variational Bayesian learning
3. VB algorithm for multi-linear models
4. VB algorithm for latent variable models
5. VB algorithm under no conjugacy
6. Global VB solution of fully observed matrix factorization
7. Model-induced regularization and sparsity-inducing mechanism
8. Performance analysis of VB matrix factorization
9. Global solver for matrix factorization
10. Global solver for low-rank subspace clustering
11. Efficient solver for sparse additive matrix factorization
12. MAP and partially Bayesian learning
13. Asymptotic Bayesian learning theory
14. Asymptotic VB theory of reduced rank regression
15. Asymptotic VB theory of mixture models
16. Asymptotic VB theory of other latent variable models
17. Unified theory
Shinichi Nakajima is a senior researcher at Technische Universität Berlin. His research interests include the theory and applications of machine learning, and he has published papers at numerous conferences and in journals such as the Journal of Machine Learning Research, the Machine Learning Journal, Neural Computation, and IEEE Transactions on Signal Processing. He currently serves as an area chair for NIPS and an action editor for Digital Signal Processing.
Kazuho Watanabe is a lecturer at Toyohashi University of Technology. His research interests include statistical machine learning and information theory, and he has published papers at numerous conferences and in journals such as the Journal of Machine Learning Research, the Machine Learning Journal, IEEE Transactions on Information Theory, and IEEE Transactions on Neural Networks and Learning Systems.
Masashi Sugiyama is Director of the RIKEN Center for Advanced Intelligence Project and Professor of Complexity Science and Engineering at the University of Tokyo. His research interests include the theory, algorithms, and applications of machine learning. He has written several books on machine learning, including Density Ratio Estimation in Machine Learning (Cambridge, 2012). He served as program co-chair and general co-chair of the NIPS conference in 2015 and 2016, respectively, and received the Japan Academy Medal in 2017.

Publication date:

558 pages

15.6 × 23.5 cm

Available from the publisher (supply lead time: 14 days).

Indicative price: €156.12

