I will give an introduction to modern machine learning methods, such
as Neural Networks / Deep Learning, Decision Trees / Random Forests,
Support Vector Machines and Gaussian Mixture Models. The aim of this
course is not to cover these topics comprehensively or in great
theoretical detail, but to show how machine learning methods can be used
in astrophysical research. Furthermore, we will focus on Feature
Importance and Probability Calibration.

Topics that will be covered:

- Introduction to Machine Learning and General Concepts
- Regression and Classification Problems
- Support Vector Machines
- Decision Trees and Random Forests
- Feature Importance
- Parameter Space Exploration and Optimization
- Probability Calibration
- Deep Learning and Convolutional Neural Networks
- Recurrent Neural Networks
- Encoder-Decoder Networks and Representation Learning
- Generative Adversarial Networks

Lecture: Mondays 2 c.t. (14:15) @ Zoom virtual lecture room

Tutorial: Mondays 10:00 Zoom

**Introduction** (Nov 2)

As the lecture was announced only late (sorry), we will start with a short
meeting in which I will show how AI is currently used in Astronomy.
The first regular lecture will then take place the week after (Nov 9).

Slides

**Lecture 1** (Nov 9)

topics:

- Introduction and Concepts
- Supervised vs. unsupervised learning
- Bad data and bad algorithms
- Testing and validating

Slides

Exercise Sheet 1

Jupyter Notebook

Jupyter Notebook - Solution

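
As a taste of the testing-and-validating topic, here is a minimal scikit-learn sketch of a hold-out split; the synthetic data and the logistic-regression model are illustrative choices, not material from the lecture:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# Synthetic, linearly separable toy data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Hold out 25% of the data for testing
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

Evaluating on the held-out set, rather than the training set, gives an honest estimate of generalisation performance.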

**Lecture 2** (Nov 16)

topics:

- Linear Algebra Recap
- Probability Theory
- Training Models
- Gradient Methods

Slides

Exercise Sheet 2

Jupyter Notebook
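
The training-models and gradient-methods topics can be illustrated with a bare-bones batch gradient descent for linear regression; the data, learning rate, and iteration count below are arbitrary illustrative choices:

```python
import numpy as np

# Toy data: y = 3 x + 2 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=100)

Xb = np.c_[np.ones(len(X)), X]   # prepend a bias column
theta = np.zeros(2)              # parameters [intercept, slope]
lr = 0.1                         # learning rate

for _ in range(1000):
    # Gradient of the mean squared error with respect to theta
    grad = (2.0 / len(Xb)) * Xb.T @ (Xb @ theta - y)
    theta -= lr * grad

print(theta)   # should be close to [2, 3]
```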

**Lecture 3** (Nov 23)

topics:

- Stochastic Methods
- Classification
- Support Vector Machines

Slides

Exercise Sheet 3

Jupyter Notebook
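
A minimal support-vector-machine sketch with scikit-learn; the synthetic blobs, the kernel, and `C` are illustrative choices:

```python
from sklearn.svm import SVC
from sklearn.datasets import make_blobs

# Two well-separated synthetic clusters
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

# Linear-kernel SVM; C controls the margin/violation trade-off
clf = SVC(kernel="linear", C=1.0).fit(X, y)

print("number of support vectors:", len(clf.support_vectors_))
print("train accuracy:", clf.score(X, y))
```

Only the support vectors (the points on or inside the margin) determine the decision boundary.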

**Lecture 4** (Nov 30)

topics:

- Decision Trees
- Ensemble Methods
- Random Forests

Slides

Exercise Sheet 4
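
A minimal random-forest sketch, which also previews the feature-importance topic; the Iris dataset and the hyperparameters are illustrative choices:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

# An ensemble of 100 decision trees
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Impurity-based importances, normalised to sum to 1
for name, imp in zip(load_iris().feature_names, forest.feature_importances_):
    print(f"{name}: {imp:.3f}")
```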

**Lecture 5** (Dec 7)

topics:

- Dimensionality Reduction
- Manifold Learning
- Principal Component Analysis
- t-SNE

Slides

Exercise Sheet 5 and Solution

Jupyter Notebook
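
A minimal principal-component-analysis sketch with scikit-learn; the digits dataset is an illustrative choice:

```python
from sklearn.decomposition import PCA
from sklearn.datasets import load_digits

# 1797 images of handwritten digits, flattened to 64 features each
X, _ = load_digits(return_X_y=True)

# Project from 64 dimensions down to the 2 directions of largest variance
pca = PCA(n_components=2).fit(X)
X2 = pca.transform(X)

print(X2.shape)                        # (1797, 2)
print(pca.explained_variance_ratio_)  # variance captured by each component
```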

**Lecture 6** (Dec 14)

topics:

- Clustering
- K-Means
- Gaussian Mixture Models
- Example: GMMs for galaxy morphology

Slides

Exercise Sheet 6 and Solution

Jupyter Notebook
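
A minimal Gaussian-mixture sketch; the synthetic two-cluster data and all parameters are illustrative choices:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Two synthetic Gaussian clusters, centred at (-3, -3) and (+3, +3)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-3, 1, size=(200, 2)),
               rng.normal(+3, 1, size=(200, 2))])

# Fit a 2-component GMM via expectation-maximisation
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

print(gmm.means_)          # the two fitted cluster centres
labels = gmm.predict(X)    # hard cluster assignment per point
probs = gmm.predict_proba(X)  # soft (probabilistic) assignment
```

Unlike K-Means, a GMM gives soft assignments and models each cluster's covariance.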

**Lecture 7** (Jan 11)

topics:

- Artificial Neural Networks
- Signal propagation and weights
- Backpropagation
- Coding a neural network

Slides

Exercise Sheet 7

Jupyter Notebook

Jupyter Notebook Solution
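
The coding-a-neural-network topic can be sketched in plain NumPy: a tiny two-layer network trained with backpropagation on XOR. The architecture, learning rate, and iteration count are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# One hidden layer with 8 sigmoid units
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)          # hidden activations
    return h, sigmoid(h @ W2 + b2)    # network output

_, out0 = forward(X)
mse_before = ((out0 - y) ** 2).mean()

lr = 1.0
for _ in range(5000):
    h, out = forward(X)
    # Backpropagate the squared-error loss through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

_, out = forward(X)
mse_after = ((out - y) ** 2).mean()
print(f"MSE before: {mse_before:.3f}, after: {mse_after:.3f}")
```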

**Lecture 8** (Jan 18)

topics:

- Initialising weights
- Activation functions
- Regularisation and dropout
- Convolutional Neural Networks
- Inception, Xception, Residual Networks, SE blocks

Slides

Exercise Sheet 8

Jupyter Notebook

Jupyter Notebook Solution

**Lecture 9** (Jan 25)

topics:

- Batch Normalisation
- Recurrent Neural Networks
- Long Short-Term Memory
- Gated Recurrent Unit

Slides

Exercise Sheet 9

Jupyter Notebook

Jupyter Notebook Solution

**Lecture 10** (Feb 1)

topics:

- Representations
- Autoencoders
- Unsupervised Pretraining
- Encoder-Decoder Networks / U-Nets + Astro Examples

Slides

**Anaconda**

We will use Jupyter Notebooks for most, if not all, of the exercises.
For this I recommend installing an Anaconda distribution, which includes
not only Python and Jupyter but also almost all of the libraries we will
need. We will use Python 3, so please choose the corresponding Anaconda
distribution. You can download it from here:

**scikit-learn**

We will use scikit-learn for most ML problems (except for Neural
Networks). You can install it through pip:

```shell
pip install scikit-learn
```
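
A quick way to check that the installation worked (illustrative):

```python
# Import the package and print its version; any recent release is fine.
import sklearn
print(sklearn.__version__)
```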

**TensorFlow**

Later in the course we will use TensorFlow for Deep Learning. You can
learn more about it here: