Mathematics of Information

Prof. Dr. Helmut Bölcskei

Lecture: Thursday, 9:15-12:00, ETZ E6
The first lecture takes place on Thursday 28 Feb 2019, 9:15-12:00.
Discussion session: Monday, 13:15-15:00, ML F 38
The first discussion session takes place on Monday 4 Mar 2019, 13:15-15:00.
Instructor: Prof. Dr. Helmut Bölcskei
Teaching assistant: Verner Vlačić
Office hours: Monday, 15:15-16:15, ETF E 117
Lecture notes: Detailed lecture and exercise notes, as well as problem sets with documented solutions, will be made available as we go along.
Credits: 8 ECTS credits

News

We will post important announcements, links, and other information here in the course of the semester, so please check back often! There will not be a lecture in the first week of the semester. The first lecture will take place on Thursday 28 Feb, 9:15-12:00, and the first discussion session will take place on Monday 4 Mar, 13:15-15:00.



Course Info

The class focuses on fundamental aspects of mathematical information science: Frame theory, sampling theory, sparsity, compressed sensing, uncertainty relations, spectrum-blind sampling, dimensionality reduction and sketching, randomized algorithms for large-scale sparse FFTs, inverse problems, (Kolmogorov) approximation theory, and information theory (lossless and lossy compression).

Signal representations: Frames in finite-dimensional spaces, frames in Hilbert spaces, wavelets, Gabor expansions
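As a small taste of the frame-theory part, the following sketch checks numerically that the standard textbook "Mercedes-Benz" frame — three unit vectors at 120° in R^2 — is a tight frame; this example is an illustration chosen here, not taken from the lecture notes.

```python
import numpy as np

# Three unit vectors at 120 degree spacing in R^2 (the "Mercedes-Benz" frame).
angles = np.array([np.pi / 2, np.pi / 2 + 2 * np.pi / 3, np.pi / 2 + 4 * np.pi / 3])
F = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # rows are frame vectors

# Frame operator S = sum_k f_k f_k^T; tightness means S = (3/2) * I.
S = F.T @ F
assert np.allclose(S, 1.5 * np.eye(2))

# Tightness gives a simple reconstruction: x = (2/3) * sum_k <x, f_k> f_k.
x = np.array([0.7, -1.2])
x_hat = (2 / 3) * (F.T @ (F @ x))
assert np.allclose(x_hat, x)
```

The frame bound 3/2 is the redundancy N/d = 3/2 of three vectors in two dimensions.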

Sampling theorems: The sampling theorem as a frame expansion, irregular sampling, multi-band sampling, density theorems, spectrum-blind sampling
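The classical sampling theorem underlying this part can be illustrated numerically: a bandlimited signal is recovered from its uniform samples through the sinc series x(t) = Σ_n x(nT) sinc((t − nT)/T). The sketch below (with illustrative parameter choices, not from the course material) reconstructs a sinusoid sampled well above its Nyquist rate.

```python
import numpy as np

def sinc_reconstruct(samples, T, t):
    """Evaluate the (truncated) sinc interpolation series at time t."""
    n = np.arange(len(samples))
    return np.sum(samples * np.sinc((t - n * T) / T))

T = 0.1                               # sampling period -> Nyquist frequency 5 Hz
n = np.arange(200)
f = 1.0                               # a 1 Hz sinusoid, well within the band
samples = np.sin(2 * np.pi * f * n * T)

# Evaluate at an off-grid point far from the truncation boundary.
t0 = 10.03
x_hat = sinc_reconstruct(samples, T, t0)
x_true = np.sin(2 * np.pi * f * t0)
assert abs(x_hat - x_true) < 0.05     # small residual error from series truncation
```

With infinitely many samples the reconstruction would be exact; the small error here comes purely from truncating the series to 200 terms.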

Sparsity and compressed sensing: Uncertainty relations in sparse signal recovery, recovery algorithms, Lasso, matching pursuit algorithms, compressed sensing, super-resolution
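One of the greedy recovery algorithms in this part, orthogonal matching pursuit, can be sketched in a few lines; the implementation and test setup below are a minimal illustration under standard assumptions (random Gaussian sensing matrix, exactly sparse signal), not code from the course.

```python
import numpy as np

def omp(A, y, k):
    """Greedy recovery of a k-sparse x from y = A x (orthogonal matching pursuit)."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        # Select the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the selected support, then update the residual.
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(n)
    x[support] = x_s
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60)) / np.sqrt(30)   # random sensing matrix
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [1.5, -2.0, 1.0]            # a 3-sparse signal
y = A @ x_true
x_hat = omp(A, y, 3)
```

Since the residual stays orthogonal to the span of the selected columns, each iteration picks a new atom; with these dimensions, exact recovery occurs with overwhelming probability.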

High-dimensional data and dimensionality reduction: Random projections, the Johnson-Lindenstrauss Lemma, sketching
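The Johnson-Lindenstrauss lemma mentioned above can be checked empirically: a random Gaussian projection to m = O(log(N)/ε²) dimensions preserves all pairwise distances up to a factor 1 ± ε. The dimensions below are illustrative choices, not from the course material.

```python
import numpy as np

rng = np.random.default_rng(1)
d, m, N = 1000, 300, 20              # ambient dim, target dim, number of points
X = rng.standard_normal((N, d))      # N points in R^d
P = rng.standard_normal((m, d)) / np.sqrt(m)  # scaled Gaussian JL projection
Y = X @ P.T                          # projected points in R^m

# Worst-case relative distortion over all pairwise distances.
eps_max = 0.0
for i in range(N):
    for j in range(i + 1, N):
        orig = np.linalg.norm(X[i] - X[j])
        proj = np.linalg.norm(Y[i] - Y[j])
        eps_max = max(eps_max, abs(proj / orig - 1.0))

assert eps_max < 0.3                 # distances preserved up to small distortion
```

Note that the target dimension m depends only on the number of points N and the distortion ε, not on the ambient dimension d.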

Randomized algorithms for large-scale sparse FFTs

Approximation theory: Fundamental limits on compressibility of signal classes, Kolmogorov epsilon-entropy of signal classes, optimal encoding and decoding of signal classes

Information theory: Entropy, mutual information, lossy compression, rate-distortion theory, lossless compression, arithmetic coding, Lempel-Ziv compression
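The central quantity of the lossless-compression part, the Shannon entropy, takes only a few lines to compute; the example source below is a standard illustration, not an example from the lecture notes.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A memoryless source over {a, b, c}: no lossless code can use fewer than
# H(p) bits per symbol on average.
p = [0.5, 0.25, 0.25]
print(entropy(p))            # 1.5 bits/symbol

# Entropy is maximized by the uniform distribution: H = log2(3) here.
print(entropy([1 / 3] * 3))
```

The value 1.5 is achieved by the prefix code a → 0, b → 10, c → 11, which matches the entropy because all probabilities are powers of 1/2.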

Prerequisites

This course is aimed at students with a background in basic linear algebra, analysis, and probability. We will, however, review the required mathematical basics throughout the semester in the discussion sessions.


Lecture and exercise notes

Lecture and discussion session notes will be posted here in due course.

Homework Assignments

There will be 6 homework assignments. You can hand in your solutions and get feedback from us, but turning them in is not mandatory. Complete solutions to the homework assignments will be posted on the course web page.

Homework Problem Sets

Homework 1: Problems | Solutions
Handouts



Recommended reading
If you want to go into more depth or if you need additional background material, please check out these books: