# Master Projects (Masterarbeiten)

If you are interested in one of the following topics, please contact the person listed with the respective topic.

If you don't find a compelling topic in our list, we are always keen on hearing about your ideas in the area of our research interests (see the Research section of our website). You are most welcome to discuss your interests with us.

A list of completed diploma theses is also available on our website.

## List of Projects

- Millimeter wave RF frontend for 5G applications
- Estimation of information measures for general analog sources
- Phase transitions for matrix separation
- Estimation of fractal dimension
- Analyzing cost functions for generative adversarial networks
- Connections between Shannon and Kolmogorov compression theories

### Millimeter wave RF frontend for 5G applications (DA)

With the increasing demand for bandwidth in mobile communications, extensive efforts are under way in industry and academia towards a future 5G mobile communication standard [1]. 5G promises vast improvements in available data rate, coverage, and latency, based on new technologies such as advanced modulation and coding schemes, (massive) MIMO, and, in particular, the use of new spectrum bands [2, 3]. The Federal Communications Commission (FCC) recently designated several new blocks of spectrum in the millimeter-wave frequency bands for these new-generation wireless broadband services.

The goal of this project is to implement a portable RF frontend for up- and downconversion to allow wideband radio channel measurements in the new spectrum bands. In a second step, it is planned to extend the system to a MIMO setup. The project is carried out in collaboration with Swisscom.

Type of project: 20% theory, 80% RF hardware design and measurements

Prerequisites: Interest in RF hardware and wireless systems

Supervisor: Michael Lerjen, Ruben Merz

Professor:
Helmut Bölcskei

References:

[1]
A. Osseiran, J. F. Monserrat, and P. Marsch, “5G mobile and wireless communications technology,” *Cambridge University Press*, 2016.

[2]
T. S. Rappaport, W. Roh, and K. Cheun, “Mobile's millimeter-wave makeover,” *IEEE Spectrum*, vol. 51, no. 9, pp. 34–58, 2014.
[Link to Document]

[3]
T. S. Rappaport, S. Sun, R. Mayzus, H. Zhao, Y. Azar, K. Wang, G. N. Wong, J. K. Schulz, M. Samimi, and F. Gutierrez, “Millimeter wave mobile communications for 5G cellular: It will work!,” *IEEE Access*, vol. 1, pp. 335–349, 2013.
[Link to Document]

### Estimation of information measures for general analog sources (DA)

Information measures such as entropy, Rényi entropy, mutual information, and divergence play a fundamental role in information theory. It is hence important to develop computationally efficient and consistent estimators for these information measures. To this end, numerous methods such as plug-in estimates [1], partitioning-based algorithms [2], and *k*-nearest neighbor methods based on density estimates [3] have been proposed for analog sources. Recently, a mutual information estimator for discrete-continuous mixtures was reported in [4]. Extending these methods to general sources, i.e., sources modeled by distributions that may include singular components, remains an important open problem.
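As a concrete starting point, the *k*-nearest-neighbor density estimates of [3] underlie, e.g., the classical Kozachenko–Leonenko entropy estimator. The following is a minimal brute-force sketch (the function name `knn_entropy` and the default k = 1 are illustrative choices, not part of the referenced works):

```python
import numpy as np
from math import gamma as gamma_fn, log, pi

def psi(n):
    # Digamma function at a positive integer: psi(n) = -gamma_E + sum_{m=1}^{n-1} 1/m.
    return -0.5772156649015329 + sum(1.0 / m for m in range(1, n))

def knn_entropy(x, k=1):
    # Kozachenko-Leonenko k-NN estimate of differential entropy (in nats)
    # for an (n, d) array of i.i.d. samples from an absolutely continuous law.
    n, d = x.shape
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))   # brute-force pairwise distances
    np.fill_diagonal(dist, np.inf)             # a point is not its own neighbor
    eps = np.sort(dist, axis=1)[:, k - 1]      # distance to the k-th nearest neighbor
    v_d = pi ** (d / 2) / gamma_fn(d / 2 + 1)  # volume of the d-dimensional unit ball
    return psi(n) - psi(k) + log(v_d) + d * np.mean(np.log(eps))

# Sanity check: the differential entropy of N(0, 1) is 0.5 * log(2 * pi * e) ~ 1.42 nats.
rng = np.random.default_rng(0)
print(knn_entropy(rng.standard_normal((2000, 1))))
```

Note that this estimator is consistent only for absolutely continuous distributions, which is precisely why extensions to sources with discrete or singular components are needed.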

The first goal of this project is to understand the information measure estimation algorithms in [5], and then to develop an efficient estimator for Rényi information dimension [6], an asymptotic information measure that has found relatively widespread use in information theory recently [7,8]. The second goal is to apply statistical methods to obtain a mutual information estimator for general sources with possibly singular components.
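For reference, the Rényi information dimension of a random variable X [6] is defined through the entropies of increasingly fine quantizations of X:

```latex
d(X) = \lim_{m \to \infty} \frac{H\!\left(\langle X \rangle_m\right)}{\log m},
\qquad
\langle X \rangle_m = \frac{\lfloor m X \rfloor}{m},
```

whenever the limit exists. For discrete X one has d(X) = 0, for absolutely continuous X (with finite H(⌊X⌋)) one has d(X) = 1, and distributions with singular components can attain any value in [0, 1].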

Type of project: 90% theory, 10% simulations

Prerequisites: Strong mathematical background, measure theory, probability theory

Supervisor: Recep Gül

Professor:
Helmut Bölcskei

References:

[1]
L. Györfi and E. C. van der Meulen, “Density-free convergence properties of various estimators of entropy,” *Computational Statistics and Data Analysis*, vol. 5, no. 4, pp. 425–436, Sept. 1987.
[Link to Document]

[2]
E. G. Miller, “A new class of entropy estimators for multi-dimensional densities,” *Proceedings of 2003 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)*, vol. 3, pp. 297–300, Apr. 2003.
[Link to Document]

[3]
D. O. Loftsgaarden and C. P. Quesenberry, “A nonparametric estimate of a multivariate density function,” *The Annals of Mathematical Statistics*, vol. 36, no. 3, pp. 1049–1051, June 1965.
[Link to Document]

[4]
W. Gao, S. Kannan, S. Oh, and P. Viswanath, “Estimating mutual information for discrete-continuous mixtures,” Sept. 20, 2017.
[Link to Document]

[5]
Q. Wang, S. R. Kulkarni, and S. Verdú, “Universal estimation of information measures for analog sources,” *Foundations and Trends in Communications and Information Theory*, vol. 5, no. 3, pp. 265–353, 2008.
[Link to Document]

[6]
A. Rényi, “On the dimension and entropy of probability distributions,” *Acta Math. Hungarica*, vol. 10, nos. 1–2, pp. 193–215, Mar. 1959.
[Link to Document]

[7]
Y. Wu and S. Verdú, “Rényi information dimension: Fundamental limits of almost lossless analog compression,” *IEEE Transactions on Information Theory*, vol. 56, no. 8, pp. 3721–3748, Aug. 2010.
[Link to Document]

[8]
D. Stotz and H. Bölcskei, “Characterizing degrees of freedom through additive combinatorics,” *IEEE Transactions on Information Theory*, vol. 62, no. 11, pp. 6423–6435, Nov. 2016.
[Link to Document]

### Phase transitions for matrix separation (DA)

A phase transition is a sharp division of the parameter space of a data processing problem into a region of success and a region of failure. This phenomenon was studied for the matrix separation problem [1, 2] and for the matrix completion problem [3] under specific recovery algorithms. Information-theoretic phase transitions do not depend on specific recovery algorithms and reveal what is best possible without stating how to achieve optimum performance. Recently, information-theoretic phase transitions were obtained for the matrix completion problem [4] and the signal separation problem [5].

The first goal of this project is to build on the techniques developed in [4, 5] to characterize information-theoretic phase transitions for the matrix separation problem, which in its traditional incarnation separates a low-rank matrix from a sparse matrix. The second goal is to investigate pairs of matrix structures (beyond low-rank and sparse) that allow for separation and to characterize their information-theoretic phase transitions.
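The traditional low-rank-plus-sparse separation can be illustrated numerically with a simple alternating-projections heuristic (a GoDec-style sketch under idealized assumptions, not the convex-programming algorithms of [1, 2]; the rank and the number of sparse entries are assumed known here):

```python
import numpy as np

def separate(m, rank, n_sparse, iters=100):
    # Alternating projections for M = L + S: project M - S onto rank-`rank`
    # matrices via a truncated SVD, then project M - L onto matrices with
    # `n_sparse` nonzeros via hard thresholding of the largest-magnitude entries.
    s = np.zeros_like(m)
    for _ in range(iters):
        u, sig, vt = np.linalg.svd(m - s, full_matrices=False)
        low_rank = (u[:, :rank] * sig[:rank]) @ vt[:rank]
        residual = m - low_rank
        s = np.zeros_like(m)
        keep = np.unravel_index(
            np.argsort(np.abs(residual), axis=None)[-n_sparse:], m.shape)
        s[keep] = residual[keep]
    return low_rank, s

# Synthetic example: a rank-2 matrix plus a few large sparse outliers.
rng = np.random.default_rng(1)
l0 = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 40))
s0 = np.zeros((40, 40))
s0[rng.integers(0, 40, 10), rng.integers(0, 40, 10)] = 20.0
l_hat, s_hat = separate(l0 + s0, rank=2, n_sparse=10)
print(np.linalg.norm(l_hat - l0) / np.linalg.norm(l0))  # relative recovery error
```

Such heuristics succeed only when the two structures are sufficiently incoherent; characterizing when separation is possible at all, independently of the algorithm, is the information-theoretic question this project addresses.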

Type of project: 80% theory, 20% simulation

Prerequisites: Strong mathematical background, measure theory, probability theory

Supervisor: Erwin Riegler

Professor:
Helmut Bölcskei

References:

[1] D. Amelunxen, M. Lotz, M. B. McCoy, and J. A. Tropp, “Living on the edge: A geometric theory of phase transitions in convex optimization,” *Information and Inference*, vol. 3, no. 3, pp. 224–294, Jun. 2014.
[Link to Document]

[2] V. Chandrasekaran, S. Sanghavi, P. A. Parrilo, and A. S. Willsky, “Rank-sparsity incoherence for matrix decomposition,” *SIAM Journal on Optimization*, vol. 21, no. 2, pp. 572–596, Jun. 2011.
[Link to Document]

[3] E. J. Candès and B. Recht, “Exact matrix completion via convex optimization,” *Foundations of Computational Mathematics*, vol. 9, no. 6, pp. 717–772, Dec. 2009.
[Link to Document]

[4]
E. Riegler, D. Stotz, and H. Bölcskei, “Information-theoretic limits of matrix completion,” *Proc. IEEE Int. Symp. on Inf. Theory (ISIT)*, pp. 106–110, Jun. 2015. [Link to Document]

[5]
D. Stotz, E. Riegler, E. Agustsson, and H. Bölcskei, “Almost lossless analog signal separation and probabilistic uncertainty relations,” *IEEE Trans. Inf. Theory*, vol. 63, no. 9, pp. 5445–5460, Sep. 2017.
[Link to Document]

### Estimation of fractal dimension (DA)

Fractal dimensions such as the box-counting dimension or the Rényi information dimension of a random vector often appear as a measure of the vector's description complexity. Recently, fractal dimensions were shown to characterize fundamental limits in information-theoretic problems such as analog data compression [2, 3] and communication over interference channels [4–6]. The exact computation of fractal dimensions is, in general, hard, as explicit formulae are available only for certain classes of well-behaved probability distributions.
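To make the box-counting dimension concrete, a naive sample-based estimator counts occupied grid cells at several scales and regresses log N(ε) against log(1/ε). The sketch below (the function name `box_counting_dimension` and the chosen scales are illustrative) is reliable only when enough samples are available to hit every cell the support touches:

```python
import numpy as np

def box_counting_dimension(points, scales):
    # Regression-based box-counting estimate from samples.
    # points: (n, d) array with coordinates in [0, 1); scales: box side lengths.
    counts = []
    for eps in scales:
        cells = np.floor(points / eps).astype(np.int64)
        counts.append(len(set(map(tuple, cells))))  # number of occupied boxes
    # Slope of log N(eps) versus log(1 / eps) approximates the dimension.
    return np.polyfit(np.log(1.0 / np.asarray(scales)), np.log(counts), 1)[0]

# Samples on the diagonal of the unit square: a set of box-counting dimension 1.
rng = np.random.default_rng(0)
t = rng.random(50000)
print(box_counting_dimension(np.stack([t, t], axis=1),
                             [2.0 ** -k for k in range(2, 8)]))
```

The statistical subtleties hidden in this naive recipe, e.g., how to choose the range of scales and how the estimate behaves for singular distributions, are exactly what the theory in [1] addresses.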

The goal of this project is to develop, based on the results in [1], a statistical theory and corresponding algorithms for the estimation of fractal dimensions. The resulting estimation procedures shall then be used to gain insight into the fractal dimension of sources with structure relevant to information-theoretic problems.

Type of project: 60% theory, 40% simulation

Prerequisites: Strong mathematical background, measure theory, probability theory

Supervisor: Erwin Riegler

Professor:
Helmut Bölcskei

References:

[1] C. D. Cutler, “A review of the theory and estimation of fractal dimension,” in *Dimension estimation and models*, H. Tong, Ed., Nonlinear Time Series and Chaos, vol. 1, pp. 1–107, World Scientific, 1993.

[2] Y. Wu and S. Verdú, “Rényi information dimension: Fundamental limits of almost lossless analog compression,” *IEEE Trans. Inf. Theory*, vol. 56, no. 8, pp. 3721–3748, Aug. 2010. [Link to Document]

[3]
D. Stotz, E. Riegler, E. Agustsson, and H. Bölcskei, “Almost lossless analog signal separation and probabilistic uncertainty relations,” *IEEE Trans. Inf. Theory*, vol. 63, no. 9, pp. 5445–5460, Sep. 2017.
[Link to Document]

[4]
Y. Wu, S. Shamai (Shitz), and S. Verdú, “Information dimension and the degrees of freedom of the interference channel,” *IEEE Trans. Inf. Theory*, vol. 61, no. 1, pp. 256–279, Jan. 2015. [Link to Document]

[5]
D. Stotz and H. Bölcskei, “Degrees of freedom in vector interference channels,” *IEEE Trans. Inf. Theory*, vol. 62, no. 7, pp. 4172–4197, Jul. 2016.
[Link to Document]

[6]
D. Stotz and H. Bölcskei, “Characterizing degrees of freedom through additive combinatorics,” *IEEE Trans. Inf. Theory*, vol. 62, no. 11, pp. 6423–6435, Nov. 2016.
[Link to Document]

### Analyzing cost functions for generative adversarial networks (SA/DA)

Generative adversarial networks (GANs) [1] are a family of machine learning models that have led to remarkable results in image generation tasks [2]. The GAN learning problem is a two-player game between a so-called generator, which learns to generate samples resembling the training data, and a so-called discriminator, which learns to discriminate between real and fake data points. Each player aims to minimize its own cost function until a Nash equilibrium is reached.
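For concreteness, the original formulation in [1] casts this game as a minimax problem over the value function

```latex
\min_{G} \max_{D} \; V(D, G)
  = \mathbb{E}_{x \sim p_{\mathrm{data}}}\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_{z}}\left[\log\bigl(1 - D(G(z))\bigr)\right],
```

where p_data is the data distribution and p_z the distribution of the generator's noise input. Practical implementations often train the generator on alternative, e.g., non-saturating, cost functions; such variants are the kind of cost functions to be compared here.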

The goal of this project is to analyze, mathematically and possibly experimentally, different cost functions for image generation tasks.

Type of project: 80–90% theory, 10–20% simulation

Prerequisites: Analysis, linear algebra, probability theory, Python

Supervisor: Thomas Wiatowski, Michael Tschannen

Professor:
Helmut Bölcskei

References:

[1] I. J. Goodfellow, J. Pouget-Abadie, M. Mirza, B. Xu, D. Warde-Farley, S. Ozair, A. Courville, and Y. Bengio, “Generative adversarial nets,” *Proc. of Neural Information Processing Systems (NIPS)*, pp. 2672–2680, 2014. [Link to Document]

[2] A. Radford, L. Metz, and S. Chintala, “Unsupervised representation learning with deep convolutional generative adversarial networks,” *arXiv:1511.06434*, 2015. [Link to Document]

### Connections between Shannon and Kolmogorov compression theories (DA)

Signal compression is at the core of modern Internet technology. Compression reduces storage requirements and transmission time. There are two theories for establishing the fundamental limits of compression, namely Shannon's rate-distortion theory [1], prominent in information theory, and Kolmogorov's ε-entropy theory [2, 3], an indispensable tool in approximation theory. While these theories were developed essentially independently of each other, they exhibit intriguing parallels. The former describes the number of bits required to approximate realizations of a stochastic process, while the latter quantifies the number of bits needed to approximate arbitrary elements of a given function class. Donoho [4] pointed out deep connections between the two theories and gave examples of how certain results in Shannon's rate-distortion theory can be transferred to Kolmogorov's ε-entropy theory. Specifically, it was shown in [4] that Shannon's theory allows one to elegantly determine the precise asymptotics of the ε-entropy of certain ellipsoid function classes.
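To illustrate the parallel, compare Shannon's rate-distortion function for an i.i.d. Gaussian source of variance σ² under mean-squared error with Kolmogorov's ε-entropy of a compact set K in a metric space:

```latex
R(D) = \frac{1}{2}\log_2\frac{\sigma^2}{D}, \quad 0 < D \le \sigma^2,
\qquad
H_\varepsilon(K) = \log_2 N_\varepsilon(K),
```

where N_ε(K) denotes the minimal number of ε-balls needed to cover K. Both quantities count the bits required for a prescribed approximation fidelity, the former on average over realizations of a stochastic source, the latter uniformly over a function class.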

The goal of this project is to first understand the results in [4] and to then further explore and possibly systematize the connections between Shannon's theory and Kolmogorov's theory.

Type of project: 100% theory

Prerequisites: Strong mathematical background, functional analysis, information theory

Supervisor: Dmytro Perekrestenko, Erwin Riegler

Professor:
Helmut Bölcskei

References:

[1]
T. Berger, “Rate distortion theory and data compression,” *Springer Vienna*, Vienna, 1975.
[Link to Document]

[2]
A. N. Kolmogorov and V. M. Tikhomirov, “ε-entropy and ε-capacity,” *Springer Netherlands*, pp. 231–239, 1993.
[Link to Document]

[3]
E. C. Posner, E. R. Rodemich, and H. Rumsey Jr., “Epsilon entropy of stochastic processes,” *The Annals of Mathematical Statistics*, vol. 38, no. 4, pp. 1000–1020, 1967.
[Link to Document]

[4] D. L. Donoho, “Wald lecture I: Counting bits with Kolmogorov and Shannon,” 2000. [Link to Document]