
Mathematical Foundations of Data Analysis II

Organizer: Dr. Boqiang Huang

Time Schedule:

Room Changed!!!
Lectures:        Tuesday 16-17:30 (Hörsaal 2.03, Raum 203), Thursday 14-15:30 (Cohn-Vossen Raum 313)
Exercises:       Thursday 16-17:30 (Cohn-Vossen Raum 313)


This is Part II of the lecture series "Mathematical Foundations of Data Analysis". Part I was given in WS 2018/2019.

The series as a whole aims to give a comprehensive introduction to state-of-the-art data analysis methods, together with their mathematical motivation, theory, and algorithmic realization in MATLAB. In Part I, we studied deterministic data analysis methods; in Part II, we study statistical data analysis methods (including statistical learning).

In Part II, we focus mainly on the mathematical explanation of multi-channel data decomposition/representation in terms of principal component analysis (PCA) and independent component analysis (ICA), on typical regression methods based on linear or nonlinear models, and on typical classification/clustering methods, where the support vector machine (SVM) will be discussed in particular. Moreover, the concepts of supervised and unsupervised learning will be explained in detail. Time permitting, the ideas behind well-known machine learning methods, e.g. the backpropagation (BP) neural network, convolutional neural network (CNN), recursive neural network (RNN), residual neural network, etc., will also be investigated.
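To illustrate the multi-channel decomposition theme, the following sketch computes a PCA of a toy three-channel signal via the singular value decomposition. Note that the course itself uses MATLAB; this is a Python/NumPy sketch, and the toy mixing matrix and all variable names are illustrative assumptions, not course material.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy multi-channel data: 200 samples of 3 correlated channels,
# built by mixing independent sources with an arbitrary matrix.
mixing = np.array([[2.0, 0.0, 0.0],
                   [1.0, 1.0, 0.0],
                   [0.0, 0.5, 0.2]])
X = rng.standard_normal((200, 3)) @ mixing

Xc = X - X.mean(axis=0)                # center each channel
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = Xc @ Vt[:2].T                 # projection onto the first 2 principal components
explained = S**2 / np.sum(S**2)        # fraction of total variance per component
```

The rows of `Vt` are the principal directions (eigenvectors of the sample covariance), and the squared singular values are proportional to the variance captured by each component, which is the usual link between PCA and the SVD of the centered data matrix.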

The course will be given in English and is designed mainly for Master's students.

Main References:

1. G. James, D. Witten, T. Hastie, R. Tibshirani, An introduction to statistical learning: with applications in R, Springer, 2013.
2. T. Hastie, R. Tibshirani, J. Friedman, The elements of statistical learning: data mining, inference, and prediction, Springer Series in Statistics, 2016.
3. I. Goodfellow, Y. Bengio, A. Courville, Deep Learning, MIT Press, 2016.
4. A. Hyvaerinen, J. Karhunen, E. Oja, Independent Component Analysis, New York: John Wiley & Sons Inc., 2001.

Further Reading:

5. A. Antoniou, W.-S. Lu, Practical optimization: algorithms and engineering applications, Springer, 2007.
6. Y. LeCun, Y. Bengio, G. Hinton, Deep learning, Nature, vol. 521, pp. 436-444, 2015.
7. Y. LeCun, Y. Bengio, Convolutional networks for images, speech, and time-series, The Handbook of Brain Theory and Neural Networks, vol. 3361, 1995.
8. C. Goller, A. Kuechler, Learning task-dependent distributed representations by backpropagation through structure. IEEE Int. Conf. on Neural Networks, 1996.
9. K. He, X. Zhang, S. Ren, J. Sun, Deep Residual Learning for Image Recognition, IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), Las Vegas, USA, 2016.


Week01    2019.04.02    Intro
Week02    2019.04.09    Pro&Info    2019.04.11    Num    Paper: IMCRA    Exercise01
Week03    2019.04.16    StaMod      Exercise02
Week04    2019.04.23    Exercise03
Week05    2019.04.30    Exercise04
Week06    2019.05.07    Exercise05  (updated on 2019.05.22)
Week07    2019.05.14 -- 2019.05.16    Canceled
Week08    2019.05.21    Exercise06
Week09    2019.05.30    Project01    DataSet
Week12    2019.06.18    Exercise07
Week15    2019.07.09 - 2019.07.12    OralExamPlan
Week16    2019.07.17    Project02    DataSet