--------------------------------------------------------------------------------
An Infinite-Feature Extension for Bayesian ReLU Nets That Fixes Their Asymptotic Overconfidence
Agustinus Kristiadi, Matthias Hein, and Philipp Hennig
Advances in Neural Information Processing Systems (NeurIPS) 2021
--------------------------------------------------------------------------------
► Paper:
► Code:
A Bayesian treatment can mitigate overconfidence in ReLU nets around the training data. But far away from them, ReLU Bayesian neural networks (BNNs) can still underestimate uncertainty and thus be asymptotically overconfident. This issue arises because the output variance of a BNN with finitely many features is quadratic in the distance from the data region. Meanwhile, Bayesian linear models with ReLU features converge, in the infinite-width limit, to a particular Gaussian process (GP) whose variance grows cubically in the distance from the data region, so that no asymptotic overconfidence can occur.
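
The quadratic-versus-cubic contrast can be checked numerically. Below is a minimal NumPy sketch (not the paper's code) of a 1D toy setup: a Bayesian linear model with finitely many fixed ReLU features, whose prior output variance grows quadratically far from the data, versus the GP obtained in the infinite-ReLU-feature limit over [0, inf), whose marginal variance k(x, x) = x^3 / 3 grows cubically. The kink placement, the prior scaling, and all names below are illustrative assumptions, not the paper's construction.

# Sketch: growth rate of the prior predictive variance far from the data.
# (i) finite ReLU features -> quadratic growth; (ii) infinite-feature GP
# limit (cubic-spline kernel) -> cubic growth. Toy 1D setup, assumed names.
import numpy as np

rng = np.random.default_rng(0)

M = 500                            # number of ReLU features in the finite model
kinks = rng.uniform(0.0, 1.0, M)   # ReLU kink locations inside the "data region" [0, 1]
prior_var = 1.0 / M                # prior variance of each feature weight


def finite_relu_variance(x):
    """Prior variance of f(x) = sum_i w_i ReLU(x - c_i), with w_i ~ N(0, prior_var)."""
    phi = np.maximum(x - kinks, 0.0)      # ReLU feature vector phi(x)
    return prior_var * np.sum(phi ** 2)   # Var[f(x)] = prior_var * ||phi(x)||^2


def cubic_spline_gp_variance(x):
    """Marginal variance k(x, x) = x^3 / 3 of the infinite-ReLU-feature GP limit on [0, inf)."""
    return x ** 3 / 3.0


# Evaluate far outside the data region [0, 1] and estimate the growth exponents
# from a log-log fit: roughly 2 for the finite model, exactly 3 for the GP limit.
xs = np.array([10.0, 100.0, 1000.0])
v_finite = np.array([finite_relu_variance(x) for x in xs])
v_gp = np.array([cubic_spline_gp_variance(x) for x in xs])

slope_finite = np.polyfit(np.log(xs), np.log(v_finite), 1)[0]
slope_gp = np.polyfit(np.log(xs), np.log(v_gp), 1)[0]
print(f"finite-feature variance growth exponent      ~ {slope_finite:.2f}  (quadratic)")
print(f"infinite-feature GP variance growth exponent ~ {slope_gp:.2f}  (cubic)")

The extra power of the distance in the GP limit is what rules out asymptotic overconfidence: each active ReLU feature grows linearly with the distance, and in the limit the number of active features grows with it as well.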