Bayesian Deep Learning with 10% of the Weights


Deep learning continues to grow in popularity and use, but it has two problems: neural networks have millions of parameters, and they provide no uncertainty estimates for their predictions. In this talk, we address both problems with one simple trick: Bayesian deep learning. We show how to prune 90% of the parameters while maintaining performance. As a bonus, we obtain uncertainty over our predictions, which is useful for safety-critical applications.
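The core idea can be sketched in a few lines. The sketch below is an illustration, not the talk's implementation: it assumes a mean-field Gaussian posterior over a layer's weights (each weight with a learned mean and standard deviation) and prunes the 90% of weights with the lowest posterior signal-to-noise ratio, i.e. the weights the posterior is most uncertain about. The shapes and distributions are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean-field Gaussian posterior over a 64x64 weight matrix:
# each weight w_ij ~ N(mu_ij, sigma_ij^2). Illustrative values only.
mu = rng.normal(0.0, 1.0, size=(64, 64))
sigma = rng.uniform(0.1, 2.0, size=(64, 64))

# Posterior signal-to-noise ratio: weights with low |mu| / sigma are
# dominated by noise and contribute little to predictions.
snr = np.abs(mu) / sigma

# Prune 90% of the weights: keep only the top 10% by SNR.
threshold = np.quantile(snr, 0.90)
mask = snr >= threshold
pruned_mu = np.where(mask, mu, 0.0)

sparsity = 1.0 - mask.mean()
print(f"fraction pruned: {sparsity:.2f}")
```

Because the posterior standard deviations also survive pruning, the remaining weights can still be sampled at test time to obtain a distribution over predictions rather than a single point estimate.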


Talk slides:
