krasserm/bayesian-machine-learning
Notebooks related to Bayesian methods for machine learning

repo name: krasserm/bayesian-machine-learning
repo link: https://github.com/krasserm/bayesian-machine-learning
language: Jupyter Notebook
size (curr.): 22450 kB
stars (curr.): 751
created: 2018-03-19
license: Apache License 2.0
Bayesian machine learning notebooks
This repository is a collection of notebooks about Bayesian machine learning. The following links display the notebooks via nbviewer to ensure proper rendering of formulas.

Latent variable models - part 1: Gaussian mixture models and the EM algorithm. Introduction to the expectation-maximization (EM) algorithm and its application to Gaussian mixture models. Example implementation with plain NumPy/SciPy and scikit-learn for comparison.
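As a taste of the comparison part, a minimal sketch using scikit-learn's `GaussianMixture`, which runs the EM algorithm internally (E-step: compute responsibilities; M-step: re-estimate means, covariances, and weights). The data here is synthetic and purely illustrative:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D data from two well-separated Gaussians (illustrative only).
X = np.concatenate([rng.normal(-3.0, 0.5, 500),
                    rng.normal(2.0, 0.8, 500)]).reshape(-1, 1)

# GaussianMixture fits via EM: alternating responsibility computation
# and parameter re-estimation until the log-likelihood converges.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

print(np.sort(gmm.means_.ravel()))  # component means, close to -3 and 2
print(gmm.weights_)                 # mixing coefficients, close to 0.5 each
```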

Latent variable models - part 2: Stochastic variational inference and variational autoencoders. Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation with TensorFlow 2.x.
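A core ingredient of stochastic variational inference in VAEs is the reparameterization trick: a sample from q(z|x) = N(mu, sigma^2) is written as z = mu + sigma * eps with eps ~ N(0, 1), so gradients can flow through the sampling step. A minimal NumPy sketch of the trick and the analytic KL term of the ELBO (the notebook itself uses TensorFlow 2.x; the parameter values below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical variational parameters of q(z|x) = N(mu, sigma^2)
# for a single latent dimension.
mu, log_var = 0.5, -1.0
sigma = np.exp(0.5 * log_var)

# Reparameterization trick: draw eps ~ N(0, 1) and transform it
# deterministically, making the sample differentiable w.r.t. mu, log_var.
eps = rng.standard_normal(10000)
z = mu + sigma * eps

# Analytic KL(q(z|x) || N(0, 1)) term of the ELBO for a diagonal Gaussian.
kl = 0.5 * (np.exp(log_var) + mu ** 2 - 1.0 - log_var)

print(z.mean(), z.std())  # close to mu and sigma
print(kl)                 # non-negative KL divergence
```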

Variational inference in Bayesian neural networks. Demonstrates how to implement and train a Bayesian neural network using a variational inference approach. Example implementation with Keras.
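Once a Bayesian neural network is trained variationally, predictions average over weight samples drawn from the approximate posterior. A tiny NumPy sketch of that predictive step for a hypothetical 1-input "network" y = w * x + b with a mean-field Gaussian posterior (all parameter values are made up; the notebook's actual training uses Keras):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean-field variational posterior over the two weights
# of y = w * x + b (means and standard deviations are made up).
w_mu, w_sigma = 1.2, 0.3
b_mu, b_sigma = -0.4, 0.2

x = 2.0
# Monte Carlo prediction: sample weights from q, push each sample
# through the network, then summarize. The spread of the outputs
# reflects the remaining uncertainty about the weights.
w = rng.normal(w_mu, w_sigma, 5000)
b = rng.normal(b_mu, b_sigma, 5000)
samples = w * x + b

print(samples.mean())  # close to w_mu * x + b_mu = 2.0
print(samples.std())   # close to sqrt((x * w_sigma)**2 + b_sigma**2) ~ 0.632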

Bayesian regression with linear basis function models. Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn for comparison.
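A minimal sketch of the from-scratch part: the conjugate Gaussian posterior over the weights of a linear basis function model, computed with plain NumPy on illustrative synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy observations of the line y = -0.3 + 0.5 x (illustrative data);
# alpha is the prior precision, beta the (assumed known) noise precision.
alpha, beta = 2.0, 25.0
x = rng.uniform(-1, 1, 50)
t = -0.3 + 0.5 * x + rng.normal(0, beta ** -0.5, 50)

# Design matrix for linear basis functions phi(x) = (1, x).
Phi = np.column_stack([np.ones_like(x), x])

# Conjugate Gaussian posterior over the weights:
#   S_N^{-1} = alpha I + beta Phi^T Phi,   m_N = beta S_N Phi^T t
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t

print(m_N)  # posterior mean, close to the true weights (-0.3, 0.5)
```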

Gaussian processes. Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with the libraries scikit-learn and GPy.
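A minimal plain-NumPy sketch of GP regression with a squared-exponential kernel, in the spirit of the notebook; the data (noise-free samples of sin) and hyperparameters are illustrative only:

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, sigma_f=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D points."""
    sqdist = (X1.reshape(-1, 1) - X2.reshape(1, -1)) ** 2
    return sigma_f ** 2 * np.exp(-0.5 * sqdist / length_scale ** 2)

# Noise-free observations of sin(x) (illustrative only).
X_train = np.arange(-4.0, 5.0)  # -4, -3, ..., 4
y_train = np.sin(X_train)
X_test = np.array([0.5])

# Standard GP regression posterior:
#   mu_* = K_*^T K^{-1} y,   cov_* = K_** - K_*^T K^{-1} K_*
K = rbf_kernel(X_train, X_train) + 1e-8 * np.eye(len(X_train))  # jitter
K_s = rbf_kernel(X_train, X_test)
K_ss = rbf_kernel(X_test, X_test)

K_inv = np.linalg.inv(K)
mu = K_s.T @ K_inv @ y_train
cov = K_ss - K_s.T @ K_inv @ K_s

print(mu[0])  # posterior mean at x = 0.5, close to sin(0.5)
```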

Bayesian optimization. Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with the libraries scikit-optimize and GPyOpt. Hyperparameter tuning as application example.
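Bayesian optimization picks the next evaluation point by maximizing an acquisition function over the surrogate model's posterior. A minimal NumPy/SciPy sketch of the expected improvement acquisition (for minimization); the posterior means and standard deviations below are made up:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best, xi=0.01):
    """Expected improvement (minimization) given the surrogate's posterior
    mean mu and standard deviation sigma at candidate points."""
    sigma = np.maximum(sigma, 1e-12)     # avoid division by zero
    imp = y_best - mu - xi               # predicted improvement over best
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

# Hypothetical posterior at three candidates: equal means but different
# uncertainty -- EI favors the most uncertain candidate (exploration).
mu = np.array([0.0, 0.0, 0.0])
sigma = np.array([0.1, 0.5, 1.0])
ei = expected_improvement(mu, sigma, y_best=0.0)
print(np.argmax(ei))  # -> 2: largest uncertainty wins when means are equal
```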

Deep feature consistent variational autoencoder. Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example implementation with Keras.

Conditional generation via Bayesian optimization in latent space. Describes an approach for conditionally generating outputs with desired properties by performing Bayesian optimization in the latent space of a variational autoencoder. Example application implemented with Keras and GPyOpt.