Uncertainty estimation is a key issue for the application of deep neural network methods in science and engineering. In this talk, we introduce and benchmark a novel algorithm that quantifies epistemic uncertainty via Monte Carlo sampling from a tempered posterior distribution. It leverages a cyclic, adaptive temperature, as well as a stochastic Metropolis-Hastings step, to draw efficiently from the posterior. Empirically, we demonstrate successful uncertainty quantification and computational efficiency in a one-dimensional regression setting, with additional focus on the detection of out-of-distribution data. Based on PAC-Bayes theory, we analyse contraction rates and credibility of the posterior distribution.
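To illustrate the core idea of sampling from a tempered posterior with a Metropolis-Hastings step and a cyclic temperature, here is a minimal toy sketch in Python. It is not the authors' algorithm: the data, the cosine-shaped temperature schedule, the proposal step size, and the use of a full-batch loss (rather than the stochastic mini-batch variant from the talk) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D regression data (illustrative; not the benchmark from the talk).
x = np.linspace(-1.0, 1.0, 50)
y = 1.5 * x + 0.1 * rng.normal(size=x.size)

def loss(w):
    """Squared-error loss of a linear model y = w * x."""
    return np.sum((y - w * x) ** 2)

def cyclic_temperature(step, period=100, t_min=0.05, t_max=1.0):
    """Cosine-shaped cyclic schedule between t_min and t_max.

    An assumed stand-in for the adaptive schedule described in the talk.
    """
    phase = (step % period) / period
    return t_min + 0.5 * (t_max - t_min) * (1.0 + np.cos(2.0 * np.pi * phase))

def tempered_mh(n_steps=2000, step_size=0.05):
    """Metropolis-Hastings targeting the tempered posterior exp(-loss/T)."""
    w = 0.0
    current_loss = loss(w)
    samples = []
    for step in range(n_steps):
        temp = cyclic_temperature(step)
        proposal = w + step_size * rng.normal()
        proposal_loss = loss(proposal)
        # Accept with probability min(1, exp(-(L' - L) / T)).
        if np.log(rng.random()) < -(proposal_loss - current_loss) / temp:
            w, current_loss = proposal, proposal_loss
        samples.append(w)
    return np.array(samples)

samples = tempered_mh()
posterior = samples[len(samples) // 2:]  # discard burn-in
# The posterior mean should lie near the true slope 1.5, and the spread
# of the samples gives a (toy) epistemic uncertainty estimate.
print(posterior.mean(), posterior.std())
```

The cyclic temperature alternates exploration (high T, flatter target) with exploitation (low T, concentrated target), so the chain can escape local modes while still concentrating samples near the posterior bulk.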
The talk is based on joint works with Sebastian Bieringer, Gregor Kasieczka and Maximilian F. Steffen.
Personal Website of Mathias Trabs
The talk can also be joined online via our Zoom meeting.
Meeting room opens: November 14, 2022, 4:30 pm (Vienna time)
Meeting ID: 639 0252 8585
Password: 506520