
SCHEDULE

Lecturers:

BS: Bernhard Schölkopf

NL: Neil Lawrence

MG: Mark Girolami

FPC: Fernando Perez-Cruz

RV: Robert Vanderbei

ZG: Zoubin Ghahramani

PO: Peter Orbanz

GL: Gabor Lugosi

JC: John Cunningham

DG: Dilan Gorur

 

Titles and abstracts of courses

 

Kernel Methods

Bernhard Schölkopf, Max Planck Institute Tübingen

The course will cover some basic ideas of learning theory, elements of the theory of reproducing kernel Hilbert spaces, and some machine learning algorithms that build on these foundations.
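For readers who want a concrete anchor before the lectures, the sketch below implements kernel ridge regression with a Gaussian (RBF) kernel, one of the simplest algorithms in this family; the toy data, bandwidth, and regularization strength are illustrative assumptions, not course material.

import numpy as np

def rbf_kernel(X, Z, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 * bandwidth^2))
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

# Toy 1-D regression data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

# Kernel ridge regression: alpha = (K + lambda * I)^{-1} y, and by the
# representer theorem the predictor is f(x) = sum_i alpha_i k(x_i, x).
lam = 0.1
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

X_test = np.linspace(-3, 3, 5)[:, None]
print(rbf_kernel(X_test, X) @ alpha)  # predictions at the test points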

 

Concentration inequalities in machine learning

Gabor Lugosi, Universitat Pompeu Fabra

In the analysis of machine learning algorithms one often faces complicated functions of many independent random variables. In such situations, concentration inequalities, which quantify the size of typical deviations of such functions from their expected value, offer an elegant and versatile tool. The theory of concentration inequalities has seen spectacular advances in the last few decades. These inequalities have proved useful not only in machine learning but also in a wide variety of areas, including combinatorics, graph theory, analysis of algorithms, information theory, and geometry, to name just a few. This course offers an introduction to the theory and a summary of some of the most useful results, with a sample of illustrations of their use in learning theory.
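As a worked illustration (an assumption-laden sketch, not part of the course): the simplest such result, Hoeffding's inequality, states that for independent X_1, ..., X_n taking values in [0, 1], P(|mean - expectation| >= t) <= 2*exp(-2*n*t^2). The snippet below compares this bound with the empirical deviation probability for uniform random variables.

import numpy as np

# Hoeffding: for X_1..X_n i.i.d. in [0, 1],
#   P(|X_bar - E[X]| >= t) <= 2 * exp(-2 * n * t**2)
n, t, trials = 100, 0.1, 20_000
rng = np.random.default_rng(1)

samples = rng.uniform(0.0, 1.0, size=(trials, n))  # E[X] = 0.5
deviations = np.abs(samples.mean(axis=1) - 0.5)
empirical = (deviations >= t).mean()
bound = 2 * np.exp(-2 * n * t ** 2)

print(f"empirical P(|mean - 0.5| >= {t}): {empirical:.5f}")
print(f"Hoeffding bound:                 {bound:.5f}")

The bound is far from tight here (the uniform distribution is much better behaved than the worst case), which is typical: concentration inequalities trade sharpness for generality.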

 

Diffusions and Geodesic Flows on Manifolds: The Differential Geometry of Markov Chain Monte Carlo

Mark Girolami, University College London

Markov Chain Monte Carlo methods provide the most comprehensive set of simulation-based tools for inference over many classes of statistical models. The complexity of many applications presents an enormous challenge for sampling methods, motivating continual innovation in theory, methodology, and associated algorithms. In this series of lectures we will consider one recent advance in MCMC methodology, which exploits mathematical ideas from differential geometry, classical nonlinear dynamics, and diffusions constrained on manifolds, in an attempt to provide the tools required to attack some of the most challenging sampling problems presented to statisticians. A step-by-step presentation of the material will ensure that students grasp the fundamental concepts and are able to develop the theory and methodology further after the lectures.
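As a point of reference for the diffusion idea (a minimal sketch with an illustrative Gaussian target and step size, not the geometric methods developed in the lectures), the following implements the Metropolis-adjusted Langevin algorithm, whose proposal is a discretized Langevin diffusion corrected by an accept/reject step.

import numpy as np

def log_pi(x):
    # Log-density of the target up to a constant: standard 2-D Gaussian
    return -0.5 * np.dot(x, x)

def grad_log_pi(x):
    return -x

def mala(n_steps=5000, eps=0.5, dim=2, seed=2):
    # Propose a discretized Langevin step, then accept/reject
    # to correct the discretization error.
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    samples = []
    for _ in range(n_steps):
        mean_x = x + 0.5 * eps ** 2 * grad_log_pi(x)
        prop = mean_x + eps * rng.standard_normal(dim)
        mean_p = prop + 0.5 * eps ** 2 * grad_log_pi(prop)
        # log q(x | prop) - log q(prop | x) for the Gaussian proposal
        log_q_ratio = (np.sum((prop - mean_x) ** 2)
                       - np.sum((x - mean_p) ** 2)) / (2 * eps ** 2)
        if np.log(rng.uniform()) < log_pi(prop) - log_pi(x) + log_q_ratio:
            x = prop
        samples.append(x)
    return np.array(samples)

draws = mala()
print(draws.mean(axis=0), draws.var(axis=0))  # approximately 0 and 1

The geometric methods of the lectures can be viewed as replacing the fixed step size eps by a position-dependent metric, adapting the proposal to the local curvature of the target.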

 

Optimization: Theory and Algorithms

Robert Vanderbei, Princeton University

The course will cover linear, convex, and parametric optimization.  In each of these areas, the role of duality will be emphasized as it informs the design of efficient algorithms and provides a rigorous basis for determining optimality. Various versions of the Simplex Method for linear programming will be presented.  The dangers of degeneracy and ways to avoid it will be explained.  Also, both the worst-case and average-case efficiency of the algorithms will be described.  Finally, an efficient algorithm for parametrically solving multi-objective optimization problems will be presented, analyzed, and proposed as a new algorithm for sparse regression.
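To make the duality theme concrete (a small sketch using a standard textbook LP, with SciPy chosen here purely for illustration), the snippet below solves a linear program and its dual and checks that their optimal values agree, as strong duality guarantees.

import numpy as np
from scipy.optimize import linprog

# Primal:  max 3*x1 + 5*x2  s.t.  x1 <= 4,  2*x2 <= 12,  3*x1 + 2*x2 <= 18,  x >= 0
c = np.array([3.0, 5.0])
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [3.0, 2.0]])
b = np.array([4.0, 12.0, 18.0])

primal = linprog(-c, A_ub=A, b_ub=b)   # linprog minimizes, so negate c
# Dual:  min b^T y  s.t.  A^T y >= c,  y >= 0
dual = linprog(b, A_ub=-A.T, b_ub=-c)

print("primal optimum:", -primal.fun)  # 36.0 at x = (2, 6)
print("dual optimum:  ", dual.fun)     # 36.0, matching by strong duality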

 

Bayesian Modelling

Zoubin Ghahramani, University of Cambridge

 

Graphical Models

Zoubin Ghahramani, University of Cambridge

 

Applications of Bayesian Modelling

Zoubin Ghahramani, University of Cambridge

 

Introduction to Bayesian Nonparametrics

Peter Orbanz, University of Cambridge

 

Advanced Bayesian Nonparametrics

Peter Orbanz, University of Cambridge

 

Gaussian Processes

John Cunningham, University of Cambridge

 

Dirichlet Process Practical

Dilan Görür, Yahoo! Labs

 

Gaussian Process Practical

Dilan Görür, Yahoo! Labs

 

Tutorials

 

Probabilistic decision-making, data analysis, and discovery in astronomy

David Hogg, New York University

Astronomy is a prime user community for machine learning and probabilistic modeling. There are very large, public data sets (mostly but not entirely digital imaging), there are simple but effective models of many of the most important phenomena (stars, quasars, and galaxies), and there are very good models of telescopes, cameras, and detectors. I will show in detail some examples of problems we were able to solve in astrophysics by bringing probabilistic inference and decision theory to astronomy. I will discuss why many "supervised" methods are not nearly as useful in astronomy as those that involve generative modeling. I hope to leave the audience with real research problems, the solutions to which would be (a) achievable with contemporary machine-learning methods, and at the same time (b) very exciting within the astrophysics community.
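As a minimal illustration of the generative-modeling viewpoint (an illustrative sketch, not taken from the tutorial itself): if each measurement y_i is modeled as drawn from a Gaussian centered on a straight line with known per-point uncertainty sigma_i, then maximizing the likelihood is exactly weighted least squares, solvable in closed form.

import numpy as np

# Generative model: y_i ~ Normal(m * x_i + b, sigma_i^2) with known per-point
# uncertainties. Maximizing the Gaussian likelihood is chi-squared minimization.
rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 10, 20))
sigma = rng.uniform(0.1, 0.5, x.size)
y = 2.0 * x + 1.0 + sigma * rng.standard_normal(x.size)

A = np.column_stack([np.ones_like(x), x])  # design matrix for (b, m)
Cinv = np.diag(1.0 / sigma ** 2)           # inverse noise covariance
theta = np.linalg.solve(A.T @ Cinv @ A, A.T @ Cinv @ y)
print("intercept, slope:", theta)          # close to (1.0, 2.0)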
