Probabilistic Integration (NIPS 2015)

Montréal, Friday, December 11, 2015

This workshop will be hosted by NIPS, a large annual machine learning and computational neuroscience conference.

Integration is the central numerical operation required for Bayesian machine learning (in the form of marginalization and conditioning). Sampling algorithms still abound in this area, although it has long been known that Monte Carlo methods are fundamentally sub-optimal: their estimation error decays only as \(O(n^{-1/2})\) in the number \(n\) of function evaluations, regardless of how smooth the integrand is. The challenges for the development of better-performing integration methods are mostly algorithmic, and recent algorithms have begun to outperform MCMC and its siblings, in wall-clock time, on realistic problems from machine learning.
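
As a concrete illustration of this rate gap, here is a minimal sketch of our own (not workshop material) comparing the same sample-mean estimator with random nodes (plain Monte Carlo) and with deterministic midpoint nodes on a smooth one-dimensional integrand; the former's error decays as \(O(n^{-1/2})\), the latter's as \(O(n^{-2})\):

```python
# Minimal sketch (ours, not workshop code): the same sample-mean estimator
# with random versus deterministic (midpoint) evaluation nodes.
import numpy as np

f = np.exp                # smooth integrand on [0, 1]
truth = np.e - 1.0        # exact value of the integral

rng = np.random.default_rng(0)
for n in [10, 100, 1000, 10000]:
    mc = f(rng.uniform(size=n)).mean()        # Monte Carlo: error ~ n^(-1/2)
    mid = f((np.arange(n) + 0.5) / n).mean()  # midpoint rule: error ~ n^(-2)
    print(f"n={n:5d}  MC error: {abs(mc - truth):.1e}  "
          f"midpoint error: {abs(mid - truth):.1e}")
```

Only the placement of the \(n\) evaluation nodes differs between the two estimators, which is precisely the kind of design choice addressed by the questions below.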

The workshop will review the existing, by now quite strong, theoretical case against the use of random numbers for integration, discuss recent algorithmic developments and the relationships between conceptual approaches, and highlight central research challenges going forward.

Among the questions to be addressed by the workshop are:

  • How fast can a practical integral estimate on a deterministic function converge (polynomially, super-polynomially, not just “better than the Monte Carlo rate of \(O(n^{-1/2})\)”)?

  • How are these rates related, precisely, to prior assumptions about the integrand, and to the design rules of the integrator?

  • To what degree can the source code of an integration problem be parsed to choose informative priors?

  • Are random numbers necessary or helpful for efficient multivariate integration, or are they a conceptual crutch that causes inefficiencies?

  • What are the practical challenges in the design of efficient multivariate integration methods that use such prior information? (A minimal univariate sketch of one such method follows this list.)
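
As one hedged illustration of these questions (our sketch, not workshop material): Bayesian quadrature places a Gaussian-process prior on the integrand, so that the integral itself becomes a Gaussian random variable whose posterior mean is a weighted sum of function evaluations. The sketch below assumes a squared-exponential kernel with a hand-picked length scale \(\ell = 0.2\), a uniform measure on [0, 1], and a fixed deterministic design; all function names are ours.

```python
# Hedged sketch of Bayesian quadrature (not a reference implementation):
# a GP prior on the integrand induces a Gaussian posterior on its integral.
import numpy as np
from scipy.special import erf

def kernel(a, b, ell=0.2):
    """Squared-exponential kernel matrix between node sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def kernel_mean(x, ell=0.2):
    """Closed-form z_i = integral over [0,1] of k(t, x_i) dt (uniform measure)."""
    s = np.sqrt(2.0) * ell
    return ell * np.sqrt(np.pi / 2.0) * (erf((1.0 - x) / s) - erf(-x / s))

f = np.exp                                 # smooth test integrand on [0, 1]
x = np.linspace(0.0, 1.0, 12)              # deterministic design, no randomness
K = kernel(x, x) + 1e-8 * np.eye(len(x))   # jitter for numerical stability
weights = np.linalg.solve(K, kernel_mean(x))
estimate = weights @ f(x)                  # posterior mean of the integral
print(f"BQ estimate: {estimate:.6f}  exact: {np.e - 1.0:.6f}")
```

Note that the quadrature weights depend only on the kernel and the node locations, not on the observed function values: this is where prior assumptions about the integrand enter the design of the integrator.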

The workshop builds upon the growing field of probabilistic numerics, of which probabilistic integration is a core component.

The workshop is organized by Mike Osborne and Philipp Hennig.

Schedule

The workshop will be held on Friday, 11 December in room 512a.

Accepted Papers