Chapter 6
Using the Standard Toolbox for Bayesian Deep Learning
As we saw in previous chapters, vanilla NNs tend to make overconfident predictions and often produce poor uncertainty estimates; some aren’t capable of producing uncertainty estimates at all. By contrast, probabilistic architectures offer principled means of obtaining high-quality uncertainty estimates; however, they have a number of limitations when it comes to scalability and adaptability.
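To make the contrast concrete, here is a minimal NumPy sketch (not taken from the book's code; the layer size, noise level, and all numbers are illustrative assumptions). A deterministic network commits to a single weight setting and a single softmax output, while a Bayesian-style prediction averages the softmax over sampled weight sets, which lets disagreement between samples surface as uncertainty:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

x = np.array([2.0, -1.0])         # a single 2-feature input (illustrative)
W_mean = rng.normal(size=(2, 3))  # mean of a 2->3 linear layer's weights
W_std = 0.5                       # assumed spread over the weights

# Deterministic network: one forward pass with a single weight setting.
point_probs = softmax(x @ W_mean)

# Bayesian-style prediction: average the softmax over sampled weight sets.
samples = [softmax(x @ (W_mean + W_std * rng.normal(size=W_mean.shape)))
           for _ in range(1000)]
mc_probs = np.mean(samples, axis=0)

def entropy(p):
    return -(p * np.log(p)).sum()

print("point prediction:", point_probs, "entropy:", entropy(point_probs))
print("MC prediction:   ", mc_probs, "entropy:", entropy(mc_probs))
```

The deterministic pass reports a confidence even when the weights are highly uncertain; averaging over weight samples typically yields a flatter, better-calibrated predictive distribution. This Monte Carlo averaging over weights is the common thread behind the probabilistic methods discussed in this chapter.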
While both PBP and BBB can be implemented in popular ML frameworks (as our earlier TensorFlow examples showed), their implementations are complex. As we saw in the last chapter, implementing even a simple network isn’t straightforward. This means that adapting them to new architectures is awkward and time-consuming (particularly for PBP, although it is possible – see Fully Bayesian Recurrent Neural Networks for Safe Reinforcement Learning). For simple tasks, such as the examples from Chapter 5, Principled Approaches for...