Advanced Topics in Machine Learning
Summer semester 2015
The seminar reviews the latest developments in machine learning, covering topics and papers from the last few NIPS, ICML and AISTATS conferences. It is aimed at PhD students, but outstanding Master's students are also welcome to join. Topics are presented by teams of 2-3 students. Each presentation should take approximately 60-90 minutes and should also review the background that is needed to understand the presented topic. PhD students should present once a year (every second semester).
|Time|Wednesdays, 16:00 – 18:00|
|Location|S2|02, room A102|
|Lecturers|Prof. Dr. techn. Gerhard Neumann, Prof. Jan Peters, Ph.D., Prof. Stefan Roth, Ph.D.|
- Reproducing Kernel Hilbert Spaces: Papers by A. Gretton, K. Fukumizu, L. Song
- Deep Neural Networks: Work by Y. Bengio, G. Hinton, etc.; could cover multiple sub-topics
- Structured Prediction: Work by H. Daumé III, D. Bagnell
- (Stochastic) Variational Inference: Work from D. Blei
- Sampling for Bayesian Learning: Work on slice sampling and papers by M. Welling (Fisher scoring)
- Recent advances for GPs and Local Regression Methods: Work by P. Hennig and M. Titsias
- Recent work on Bandits, Linear Bandits: G. Neu, O. Maillard, R. Munos
- Recent work on Decision and Regression Trees
- Recent work on Boosting
- Sampling by Optimization: Papers by G. Papandreou, T. Jaakkola, T. Minka, M. Welling
- Bounds in Reinforcement Learning: Work by Cs. Szepesvári, R. Munos
- New Developments in Stochastic Gradient Descent: Papers by F. Bach
- Online Learning
This class aims to develop Master's-level students into prospective Ph.D. students in machine learning. It therefore studies current work from the top machine learning conferences (NIPS, ICML and AISTATS) and requires students to reimplement the presented methods.
Prerequisite: prior completion of Statistical Machine Learning (formerly Machine Learning: Statistical Approaches I).