Bayesian Methods for Machine Learning Course Project, Skoltech 2018
We replicate the results of the recent paper Concrete Dropout by Gal et al. and extend them to new experiments. Our goals are to:
- Understand and discuss the model implementation
- Reproduce the experiments on MNIST, a computer vision task, and reinforcement learning
- Try different RL environments
- Evaluate the algorithm's performance on NLP tasks
- Implement Concrete Dropout for recurrent layers (a minimal sketch of the core mechanism follows this list)
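The idea the project replicates can be summarized in a single layer that learns its own dropout probability through the Concrete (Gumbel-Softmax style) relaxation of the Bernoulli mask. The snippet below is a minimal, illustrative PyTorch sketch; the framework choice, the class name `ConcreteDropout`, and all hyperparameter values are our assumptions, not code taken from this repository or from the original paper's implementation.

```python
import torch
import torch.nn as nn


class ConcreteDropout(nn.Module):
    """Wraps a layer and learns its dropout probability via the Concrete relaxation."""

    def __init__(self, layer, weight_reg=1e-6, dropout_reg=1e-5,
                 init_p=0.1, temperature=0.1):
        super().__init__()
        self.layer = layer
        self.weight_reg = weight_reg
        self.dropout_reg = dropout_reg
        self.temperature = temperature
        # Parameterize p through a logit so it stays in (0, 1).
        init_p = torch.tensor(float(init_p))
        self.p_logit = nn.Parameter(torch.log(init_p / (1.0 - init_p)))

    def forward(self, x):
        eps = 1e-7
        p = torch.sigmoid(self.p_logit)

        # Sample a relaxed (continuous) dropout mask from the Concrete distribution,
        # so gradients flow back into the dropout probability p.
        u = torch.rand_like(x)
        drop_prob = torch.sigmoid(
            (torch.log(p + eps) - torch.log(1.0 - p + eps)
             + torch.log(u + eps) - torch.log(1.0 - u + eps)) / self.temperature)
        x = x * (1.0 - drop_prob) / (1.0 - p)  # soft mask plus inverse-keep-rate rescaling

        out = self.layer(x)

        # KL-derived regularizer: L2 on the wrapped layer's parameters scaled by 1/(1-p),
        # plus the negative entropy of the Bernoulli(p) dropout distribution,
        # scaled by the input dimensionality.
        weights_l2 = sum((w ** 2).sum() for w in self.layer.parameters())
        neg_entropy = p * torch.log(p + eps) + (1.0 - p) * torch.log(1.0 - p + eps)
        reg = (self.weight_reg * weights_l2 / (1.0 - p)
               + self.dropout_reg * x.shape[-1] * neg_entropy)
        return out, reg
```

In training, the returned regularization term is added to the task loss, so both the weights and the dropout probability are optimized jointly, e.g. `out, reg = cd_layer(batch); loss = criterion(out, target) + reg`.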
To reproduce our experiments, clone the repository:
git clone https://github.com/Alfo5123/ConcreteDropout.git
Papers:
- Concrete Dropout
- Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
- Categorical Reparameterization with Gumbel-Softmax
- Variational Dropout and the Local Reparameterization Trick
- The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables
- Improving PILCO with Bayesian Neural Network Dynamics Models
- Recurrent Neural Network Regularization
Repositories:
Blogs: