Machine Learning
Semester: SS 2015
Type: Lecture
Lecturer:
Credits: V3 + Ü1 (6 ECTS credits)
Links: https://www3.elearning.rwth-aachen.de/ss15/15ss-29840/Dashboard.aspx
Type | Date | Room |
---|---|---|
Lecture | Thu, 14:15-15:45 | UMIC 025 |
Lecture/Exercise | Thu, 14:15-15:45 | UMIC 025 |
Lecture Description
The goal of Machine Learning is to develop techniques that enable a machine to "learn" how to perform certain tasks from experience.
The important part is the learning from experience: rather than encoding the knowledge ourselves, we let the machine learn it from training data. The tools for this are statistical learning and probabilistic inference techniques, which are used in many real-world applications. This lecture teaches the fundamental machine learning know-how that underlies such capabilities. In addition, we show current research developments and how they are applied to solve real-world tasks.
Example questions that can be addressed with the techniques from the lecture include (a small worked sketch follows the list):
- Is this email important or spam?
- What is the likelihood that this credit card transaction is fraudulent?
- Does this image contain a face?
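To give a flavor of how such questions are approached, here is a minimal Matlab sketch of a two-class Bayes classifier with Gaussian class-conditional densities estimated by maximum likelihood, in the spirit of the first lectures. The feature, data values, and class priors are invented for illustration and are not from the course materials:

```matlab
% Minimal sketch (not from the course materials): two-class Bayes decision
% with Gaussian class-conditional densities estimated by maximum likelihood.
spam = [4.2 5.1 4.8 5.5 4.9];   % toy feature values, e.g. #links per mail
ham  = [1.0 1.5 0.8 1.2 1.7];
priorSpam = 0.4;  priorHam = 0.6;   % assumed class priors

% ML estimates of mean and variance per class (var(x,1) = biased/ML variance)
muS = mean(spam);  s2S = var(spam, 1);
muH = mean(ham);   s2H = var(ham, 1);

% Gaussian density, written out to avoid toolbox dependencies
gauss = @(t, mu, s2) exp(-(t - mu).^2 ./ (2*s2)) ./ sqrt(2*pi*s2);

x = 3.0;   % new observation to classify
pSpam = gauss(x, muS, s2S) * priorSpam;   % p(x|spam) p(spam)
pHam  = gauss(x, muH, s2H) * priorHam;    % p(x|ham)  p(ham)

% Bayes decision rule: pick the class with the larger posterior
if pSpam > pHam
    disp('classified as spam')
else
    disp('classified as ham')
end
```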
Exercises
The class is accompanied by exercises that will allow you to gain hands-on experience with the algorithms introduced in the lecture.
There will be both pen & paper exercises and practical programming exercises based on Matlab (roughly one exercise sheet every two weeks). Please submit your hand-written solutions the day before the exercise class to the submission box at room 129, UMIC. Please submit your source-code solutions through the L2P system.
We ask you to work in teams of 2-3 students.
Literature
For the most part, the lecture will follow the book by Bishop; additional topics are covered in the book by Duda & Hart:
- Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006
- R.O. Duda, P.E. Hart, D.G. Stork, Pattern Classification, 2nd Edition, Wiley-Interscience, 2000

Wherever research papers are necessary for a deeper understanding, we will make them available on this page.
Additional Resources
- C.E. Rasmussen, C.K.I. Williams, Gaussian Processes for Machine Learning, MIT Press, 2006 (available online)
- D.J. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003 (also available online)
- Lecture videos provided by the Fachschaft
Matlab Resources
- Matlab Online Reference Documentation
- Getting started with Matlab
- Techniques for improving performance (a small vectorization example follows this list)
- A useful Matlab Quick-reference card (in German).
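As a taste of the performance techniques linked above, here is an illustrative comparison of a loop against the equivalent vectorized expression; both compute the same sum of squares, and the timings will vary by machine:

```matlab
% Illustrative only: in Matlab, vectorized expressions are usually much
% faster than element-by-element loops.
n = 1e6;  x = rand(1, n);

tic                      % loop version
s = 0;
for i = 1:n
    s = s + x(i)^2;
end
tLoop = toc;

tic                      % vectorized version of the same computation
sVec = sum(x.^2);
tVec = toc;

fprintf('loop: %.3fs   vectorized: %.3fs\n', tLoop, tVec);
```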
Title | Content |
---|---|
Introduction | Introduction, Probability Theory, Bayes Decision Theory, Minimizing Expected Loss |
Exercise 0 | Intro Matlab |
Prob. Density Estimation I | Parametric Methods, Gaussian Distribution, Maximum Likelihood |
Prob. Density Estimation II | Bayesian Learning, Nonparametric Methods, Histograms, Kernel Density Estimation |
Prob. Density Estimation III | Mixture of Gaussians, k-Means Clustering, EM-Clustering, EM Algorithm |
Linear Discriminant Functions I | Linear Discriminant Functions, Least-squares Classification, Generalized Linear Models |
Exercise 1 | Probability Density, GMM, EM |
Linear Discriminant Functions II | Fisher Linear Discriminants, Logistic Regression, Iteratively Reweighted Least Squares |
Statistical Learning Theory | Statistical Learning Theory, VC Dimension, Structural Risk Minimization |
Linear SVMs | Linear SVMs, Soft-margin classifiers, nonlinear basis functions |
Non-Linear SVMs | Kernel trick, Mercer's condition, Nonlinear SVMs, Support Vector Data Description |
Exercise 2 | Linear Classifiers, Fisher Linear Discriminants, VC Dimension |
Ensemble Methods and Boosting | Model Combination, Bagging, Boosting, AdaBoost, Sequential Additive Minimization |
AdaBoost | AdaBoost, Exponential error, Applications, Decision Trees, CART framework, ID3, C4.5 |
Randomized Trees | Randomized Decision Trees, Random Forests, Extremely Randomized Trees, Ferns |
Graphical Models I | Intro to Graphical Models, Bayesian Networks, Conditional Independence, Bayes Ball algorithm |
Exercise 3 | AdaBoost, Decision Trees, Random Forests |
Graphical Models II | Markov Random Fields, Converting directed to undirected models |
Graphical Models III | Exact Inference, Message Passing, Belief Propagation, Factor Graphs, Sum-product algorithm |
Applying MRFs | Junction Tree Algorithm, Loopy BP, Applications of MRFs |
Exercise 4 | Bayes Nets, Inference, Belief Propagation |
Solving MRFs with Graph Cuts | Solving MRFs with Graph Cuts, s-t Mincut, Alpha Expansion |
Exercise 5 | MRFs, Graph Cuts |
Repetition | Repetition |
Test Exam | Test Exam |
Test Exam | Test Exam |
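To give a flavor of the programming exercises, here is a minimal sketch of the EM algorithm for a two-component 1D Gaussian mixture, in the spirit of "Prob. Density Estimation III" and Exercise 1. The toy data, initialization, and iteration count are invented for illustration and do not come from the exercise sheets:

```matlab
% Minimal sketch: EM for a 1D mixture of two Gaussians on toy data.
rng(0);                                      % fix the seed for reproducibility
x = [randn(1,100)-2, 0.5*randn(1,100)+3];    % toy data from two Gaussians
N = numel(x);  K = 2;

mu = [-1 1];  s2 = [1 1];  pik = [0.5 0.5];  % initial means, variances, weights
gauss = @(t, mu, s2) exp(-(t - mu).^2 ./ (2*s2)) ./ sqrt(2*pi*s2);

for it = 1:50
    % E-step: responsibilities gamma(k,n) proportional to pi_k * N(x_n|mu_k,s2_k)
    g = zeros(K, N);
    for k = 1:K
        g(k,:) = pik(k) * gauss(x, mu(k), s2(k));
    end
    g = bsxfun(@rdivide, g, sum(g, 1));      % normalize over components

    % M-step: re-estimate parameters from the responsibility-weighted data
    Nk = sum(g, 2)';                         % effective number of points per component
    for k = 1:K
        mu(k) = (g(k,:) * x') / Nk(k);
        s2(k) = (g(k,:) * ((x - mu(k)).^2)') / Nk(k);
    end
    pik = Nk / N;
end

fprintf('mu = [%.2f %.2f], var = [%.2f %.2f], pi = [%.2f %.2f]\n', mu, s2, pik);
```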