Information theory is the basis of modern telecommunication systems. Its main topics are source coding, channel coding, multi-user communication systems, and cryptology. These topics build on Shannon's work on information theory, which makes it possible to describe information with measures such as entropy and redundancy.

After a short overview of the whole field of information theory, we will consider concepts for the statistical modeling of information sources and derive the source coding theorem. Afterwards, important source coding algorithms such as Huffman, Tunstall, Lempel-Ziv, and Elias-Willems coding will be described.
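To give a flavor of this part, the following minimal sketch (illustrative, not part of the lecture material) builds an optimal binary Huffman code for a small source and compares its average codeword length with the source entropy; all function names are our own choices.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary Huffman code.

    Builds the Huffman tree bottom-up by repeatedly merging the two
    least probable nodes; each merge adds one bit to every leaf below it.
    """
    # heap entries: (probability, tiebreak id, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(H, L)  # 1.75 1.75
```

For this dyadic distribution the average code length meets the entropy exactly, illustrating the source coding theorem's bound H(X) <= L < H(X) + 1.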

The second part of the lecture investigates channel coding. Important properties of codes and fundamental decoding strategies will be explained. Moreover, we will introduce methods for estimating the error probability and analyze the most important channel models with respect to the channel capacity introduced by Shannon. The Gaussian channel is particularly important and is therefore treated extensively.
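As a small illustration of the capacity results covered here, the snippet below evaluates Shannon's capacity formula for the discrete-time Gaussian (AWGN) channel, C = (1/2) log2(1 + S/N) bits per channel use; the function name is illustrative.

```python
from math import log2

def awgn_capacity(snr):
    """Capacity of the discrete-time AWGN channel in bits per channel use,
    C = 0.5 * log2(1 + S/N), where snr is the signal-to-noise ratio S/N."""
    return 0.5 * log2(1 + snr)

# At high SNR, quadrupling the SNR buys about one extra bit per channel use:
print(awgn_capacity(15))   # 2.0  (0.5 * log2(16))
print(awgn_capacity(63))   # 3.0  (0.5 * log2(64))
```

The logarithmic growth shown here is why, at high SNR, large capacity gains require disproportionately large power increases.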

The third part deals with aspects of multi-user communication systems. We will introduce several models and investigate methods that achieve their capacity regions.

Finally, we will give an introduction to data encryption and secure communication.

In the projects, several information-theoretic topics (e.g., Lempel-Ziv coding) will be investigated by means of implementation tasks.
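A project task of this kind might start from a sketch like the following: an LZ78-style parser that splits a string into phrases (each a previously seen phrase extended by one new symbol) together with a matching decoder. This is one common Lempel-Ziv variant, offered here only as an assumed starting point.

```python
def lz78_parse(s):
    """LZ78 parsing: emit (dictionary index, symbol) pairs,
    where index 0 denotes the empty phrase."""
    dictionary = {"": 0}
    output = []
    phrase = ""
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch          # keep extending the current match
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:
        # flush a trailing phrase that is already in the dictionary
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

def lz78_decode(pairs):
    """Rebuild the string by concatenating dictionary phrases."""
    phrases = [""]
    for idx, ch in pairs:
        phrases.append(phrases[idx] + ch)
    return "".join(phrases[1:])

pairs = lz78_parse("abracadabra")
print(pairs)                 # [(0,'a'), (0,'b'), (0,'r'), (1,'c'), (1,'d'), (1,'b'), (3,'a')]
print(lz78_decode(pairs))    # abracadabra
```

As the dictionary grows, phrases get longer and repeated patterns are referenced by index instead of being spelled out, which is the mechanism behind the asymptotic optimality of Lempel-Ziv coding for stationary ergodic sources.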

Topics

  • Shannon's Information Measure
  • Coding for Discrete Information Sources
  • Discrete Memoryless Channels
  • Reliable Transmission over Noisy Channels
  • Continuous Random Variables and Shannon's Information Measure
  • Channels with Continuous In- and Output
  • Rate-Distortion Theory
  • Multiuser Communications
  • Cryptography