
Learning Faster from Easy Data
In online learning, algorithms are typically analysed in the worst case. However, not all data are created equal. We aim to design efficient algorithms that adapt automatically to the difficulty of the data.
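The worst-case baseline in the experts setting is the exponential-weights (Hedge) algorithm; adaptive methods tune its learning rate to the data. A minimal sketch with a fixed learning rate (the two-expert loss sequence and the value of eta below are illustrative assumptions):

```python
import math

def hedge(loss_rows, eta):
    """Exponential weights (Hedge): play the weighted average of K experts,
    then downweight each expert exponentially in its loss."""
    K = len(loss_rows[0])
    w = [1.0 / K] * K
    total = 0.0
    for losses in loss_rows:
        total += sum(wi * li for wi, li in zip(w, losses))  # algorithm's loss
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, losses)]
        s = sum(w)
        w = [wi / s for wi in w]
    return total

# Illustrative data: two experts with alternating 0/1 losses over 100 rounds.
rows = [[0.0, 1.0], [1.0, 0.0]] * 50
alg = hedge(rows, eta=0.5)
best = 50.0  # each expert's total loss on this sequence
```

On such hard (alternating) data the regret is unavoidable; adaptive learning-rate tuning aims to shrink it automatically when the data are easy, e.g. when one expert is consistently best.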

Online Learning Beyond Experts
We investigate trade-offs between predictive performance and computational efficiency by considering online learning of combinatorial and matrix concept classes.
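One generic route to efficiency for combinatorial classes is Follow the Perturbed Leader, which reduces online learning to repeated calls of an offline optimizer. A hedged sketch (the size-2 subset class, the exponential perturbations, and the loss sequence are illustrative assumptions):

```python
import itertools
import random

def fpl(actions, loss_rows, eta, rng):
    """Follow the Perturbed Leader: each round, perturb the cumulative
    per-component losses and hand them to an offline minimizer
    (brute force below) to pick the next combinatorial action."""
    d = len(loss_rows[0])
    cum = [0.0] * d
    total = 0.0
    for losses in loss_rows:
        pert = [c - rng.expovariate(eta) for c in cum]   # fresh noise each round
        act = min(actions, key=lambda a: sum(p * ai for p, ai in zip(pert, a)))
        total += sum(li * ai for li, ai in zip(losses, act))
        cum = [c + li for c, li in zip(cum, losses)]
    return total

# Toy class: all size-2 subsets of 4 components, as 0/1 incidence vectors.
actions = [tuple(int(i in s) for i in range(4))
           for s in itertools.combinations(range(4), 2)]
rows = [[0.0, 0.0, 1.0, 1.0]] * 100   # components 0 and 1 are always free
alg = fpl(actions, rows, eta=1.0, rng=random.Random(0))
best = 0.0                            # the subset {0, 1} never pays anything
```

For real classes (paths, matchings, permutations) the brute-force minimum is replaced by the efficient offline solver for that class; that substitution is the point of the reduction.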

Minimax Square Loss Prediction
The squared Euclidean distance is one of the most widely used loss functions in machine learning. We study prediction games with square loss using a pure minimax approach. The goal is to obtain efficient algorithms for natural tasks.
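To make "pure minimax" concrete: in the toy case of binary outcomes and regret against the best constant prediction (the binary restriction is an assumption for illustration; the settings studied are more general), the exact game value is computable by backward induction over the number of ones seen so far:

```python
def minimax_value(T):
    """Exact value of the T-round square-loss regret game with outcomes
    in {0, 1}, by backward induction over (rounds played, number of ones)."""
    # Terminal value: minus the loss of the best constant prediction k/T.
    V = [-k * (T - k) / T for k in range(T + 1)]
    for t in range(T - 1, -1, -1):
        newV = []
        for k in range(t + 1):
            A, B = V[k], V[k + 1]  # continuation values after outcomes 0 and 1
            p = min(1.0, max(0.0, (1.0 + B - A) / 2.0))  # equalizing prediction
            newV.append(max(p ** 2 + A, (1.0 - p) ** 2 + B))
        V = newV
    return V[0]
```

For one round the value is 1/4 (predict 1/2, outcome 0 or 1); each extra round adds less in this toy game.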

Online Finance
We consider worst-case approaches to sequential investment. We obtain interesting guarantees that can be achieved efficiently.
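A standard comparator in this line of work is the best constant rebalanced portfolio in hindsight. A small sketch on Cover's classic toy market (the two-asset market below is an illustrative assumption) shows why it is stronger than any buy-and-hold strategy:

```python
def crp_wealth(b, price_relatives):
    """Wealth of a constant rebalanced portfolio: each day, re-split wealth
    over the assets in proportions b, then apply the day's price relatives."""
    wealth = 1.0
    for x in price_relatives:
        wealth *= sum(bi * xi for bi, xi in zip(b, x))
    return wealth

# Cover's toy market: cash, plus a stock that alternately doubles and halves.
market = [(1.0, 2.0), (1.0, 0.5)] * 10
cash = crp_wealth((1.0, 0.0), market)    # holding cash: wealth stays 1
stock = crp_wealth((0.0, 1.0), market)   # buy-and-hold: ends where it started
half = crp_wealth((0.5, 0.5), market)    # rebalancing compounds a gain each pair of days
```

Worst-case guarantees then bound the investor's wealth ratio against the best such portfolio chosen in hindsight.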

Switching
Patterns in data often change over time. We take a worst-case approach to designing adaptive algorithms that deal with such non-stationarity.
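A classic worst-case tool here is the Fixed-Share update of Herbster and Warmuth: after the exponential-weights step, a fraction of the weight is redistributed uniformly, so the algorithm can track a changing best expert. A sketch of the simple share-to-uniform variant (the losses, eta, and alpha below are illustrative assumptions):

```python
import math

def fixed_share(loss_rows, eta, alpha):
    """Exponential weights plus a Fixed-Share step: after each update,
    a fraction alpha of the total weight is redistributed uniformly,
    so weight can re-grow quickly on a newly good expert."""
    K = len(loss_rows[0])
    w = [1.0 / K] * K
    total = 0.0
    for losses in loss_rows:
        total += sum(wi * li for wi, li in zip(w, losses))
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, losses)]
        s = sum(w)
        w = [(1.0 - alpha) * wi / s + alpha / K for wi in w]
    return total

# Illustrative losses: expert 0 is best for 100 rounds, then expert 1.
rows = [[0.0, 1.0]] * 100 + [[1.0, 0.0]] * 100
static = fixed_share(rows, eta=2.0, alpha=0.0)    # alpha = 0 is plain Hedge
tracking = fixed_share(rows, eta=2.0, alpha=0.05)
```

Plain Hedge concentrates all weight on expert 0 and takes almost the entire second half to recover after the switch; the shared weight floor lets the tracking version switch within a few rounds.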

Nonuniform Regret Analysis
We study ways to incorporate prior expectations about the environment in the form of data-dependent luckiness guarantees.
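One standard way such prior expectations enter (stated here for the finite experts setting with losses in [0, 1], as an illustrative special case): running exponential weights with learning rate $\eta$ and prior weights $\pi$ gives, for every expert $i$,

```latex
R_{T,i} \;=\; \sum_{t=1}^{T} \hat{\ell}_t \;-\; \sum_{t=1}^{T} \ell_{t,i}
\;\le\; \frac{\ln(1/\pi_i)}{\eta} \;+\; \frac{\eta T}{8},
```

where $\hat{\ell}_t$ is the algorithm's mixture loss. Experts we expect to do well receive large prior weight $\pi_i$ and hence a small regret bound: a guarantee that is strong when we are lucky and the prior guess is right, at a controlled price in the worst case.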

Robust Statistics
We investigate ways to make reliable inferences with the help of a statistical model, but without assuming that the data are generated from this model.
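As a toy illustration of the gap between model-based and robust inference (a generic textbook contamination example, not this project's specific methods): under a Gaussian model the sample mean is the optimal location estimate, but a few gross outliers ruin it, while the median is barely affected.

```python
import random

rng = random.Random(1)
# Mostly draws from the assumed model (standard normal), plus gross outliers.
sample = [rng.gauss(0.0, 1.0) for _ in range(99)] + [1000.0] * 5

mean = sum(sample) / len(sample)            # model-optimal, but fragile
median = sorted(sample)[len(sample) // 2]   # crude but robust alternative
```

Five bad points out of 104 drag the mean far from the true location 0, while the median stays close to it.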

Algorithmic Information Theory
We investigate interesting variations of the notion of Kolmogorov complexity.
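Kolmogorov complexity itself is uncomputable, but a standard computable stand-in is the length of a compressed encoding; the choice of compressor and the inputs below are illustrative assumptions.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of a zlib-compressed encoding: a crude, computable upper-bound
    proxy for Kolmogorov complexity (which is itself uncomputable)."""
    return len(zlib.compress(data, 9))

rng = random.Random(0)
regular = b"ab" * 5000                                   # highly regular string
noisy = bytes(rng.getrandbits(8) for _ in range(10000))  # looks incompressible

k_regular = compressed_size(regular)
k_noisy = compressed_size(noisy)
```

The regular string compresses to a tiny description, while the pseudo-random one does not shrink at all, mirroring the low/high Kolmogorov complexity of the two inputs.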

Open Problems
Collection of open problems.