Bayesian Deep Learning with 10% of the Weights


Deep learning is growing in popularity and use, but it has two problems: neural networks have millions of parameters, and they provide no uncertainty estimates. In this talk, we solve both problems with one simple trick: Bayesian deep learning. We show how to prune 90% of the parameters while maintaining performance. As a bonus, we get uncertainty over our predictions, which is useful for critical applications.
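To make the idea concrete, below is a minimal NumPy sketch of the two ingredients the abstract mentions, under the standard mean-field / Bayes-by-Backprop view of weight uncertainty: prune weights whose posterior has a low signal-to-noise ratio, and get predictive uncertainty by sampling weights. This is only an illustrative assumption of the approach, not the talk's actual implementation (see the linked slides and repo); all names (mu, sigma, the 90% pruning level) are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trained mean-field posterior over one linear layer's weights,
# q(w_ij) = N(mu_ij, sigma_ij^2). In practice mu and sigma come from training
# with a variational objective (e.g. Bayes by Backprop).
n_in, n_out = 100, 10
mu = rng.normal(0.0, 0.1, size=(n_in, n_out))
sigma = rng.uniform(0.01, 0.2, size=(n_in, n_out))

# Prune 90% of the weights: keep only the 10% with the highest
# signal-to-noise ratio |mu| / sigma, zero out the rest.
snr = np.abs(mu) / sigma
threshold = np.quantile(snr, 0.9)
mask = snr >= threshold
print(f"weights kept: {mask.mean():.1%}")

# Predictive uncertainty: sample weights from the (pruned) posterior and
# summarize the outputs over Monte Carlo samples.
x = rng.normal(size=(1, n_in))        # one dummy input
n_samples = 100
outputs = []
for _ in range(n_samples):
    w = rng.normal(mu, sigma) * mask  # sample weights, apply the prune mask
    outputs.append(x @ w)
outputs = np.array(outputs)
pred_mean = outputs.mean(axis=0)      # the prediction
pred_std = outputs.std(axis=0)        # the uncertainty over that prediction
```

The pruning criterion here (posterior signal-to-noise ratio) is the one proposed in the "Weight Uncertainty in Neural Networks" line of work that the talk builds on; the exact criterion and sparsity level used in the talk are in the slides.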



Talk slides: https://github.com/RobRomijnders/weight_uncertainty/blob/master/docs/presentation/versions/final_pydata18_bayes_nn_rob_romijnders_1.pdf
