Applied Information Theory

Contents

Information theory is the basis of modern telecommunication systems. The main topics of information theory are source coding, channel coding, multi-user communication systems, and cryptology. These topics are based on Shannon's work on information theory, which makes it possible to describe information with measures such as entropy and redundancy.
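
As a brief illustration (not part of the official course description): for a discrete memoryless source X with symbol probabilities p(x), the entropy, i.e. the average information per source symbol in bits, is

    H(X) = -\sum_{x} p(x) \log_2 p(x),

and the redundancy of the source is the gap \log_2 |\mathcal{X}| - H(X) to the maximum entropy of its alphabet.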

After a short overview of the whole area of information theory, we will consider concepts for the statistical modeling of information sources and derive the source coding theorem. Afterwards, important source coding algorithms such as Huffman, Tunstall, Lempel-Ziv, and Elias-Willems coding will be described.
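
To give a flavor of these algorithms, the following is a minimal Huffman coding sketch in Python (an illustration only, not the lecture's or lab's reference implementation; the example symbols and probabilities are made up). It greedily merges the two least probable subtrees:

    # Minimal Huffman code construction (illustrative sketch).
    import heapq

    def huffman_code(probs):
        """Return a prefix-free binary code {symbol: codeword}
        for a dict {symbol: probability}."""
        # Heap entries: (probability, tie-breaker, partial code table)
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            # Merge the two least probable subtrees, prefixing 0 and 1.
            p0, _, c0 = heapq.heappop(heap)
            p1, _, c1 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c0.items()}
            merged.update({s: "1" + w for s, w in c1.items()})
            heapq.heappush(heap, (p0 + p1, tie, merged))
            tie += 1
        return heap[0][2]

    probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}    # example source
    code = huffman_code(probs)
    avg_len = sum(probs[s] * len(code[s]) for s in probs)  # 1.75 bits/symbol here
    print(code, avg_len)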

The second part of the lecture investigates channel coding. Important properties of codes and fundamental decoding strategies will be explained. Moreover, we will introduce methods for estimating the error probability and analyze the most important channel models with respect to the channel capacity introduced by Shannon. The Gaussian channel is particularly important and is therefore treated in detail.
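
As a pointer to what capacity means for the Gaussian channel (an illustration, not the lecture's derivation): a discrete-time AWGN channel with average input power constraint P and noise variance N has capacity

    C = \tfrac{1}{2} \log_2\left(1 + \tfrac{P}{N}\right) \quad \text{bits per channel use},

which is achieved by Gaussian-distributed channel inputs.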

The third part deals with aspects of multi-user communication systems. We will introduce several models and investigate methods that can achieve their capacity regions.

Finally, we will give an introduction to data encryption and secure communication.

In the lab projects, several information-theoretic topics (e.g., Lempel-Ziv coding) will be investigated by means of implementation tasks.
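
As a hint of what such an implementation task might look like, here is a minimal LZ78-style parsing sketch in Python (an illustration under assumed conventions, not the lab's reference implementation):

    # Minimal LZ78-style parser (illustrative sketch).
    def lz78_parse(text):
        """Parse text into (dictionary index, next symbol) pairs.
        Index 0 stands for the empty phrase."""
        dictionary = {"": 0}            # phrase -> index
        phrase, output = "", []
        for ch in text:
            if phrase + ch in dictionary:
                phrase += ch            # extend the current phrase
            else:
                output.append((dictionary[phrase], ch))    # emit (index, symbol)
                dictionary[phrase + ch] = len(dictionary)  # add new phrase
                phrase = ""
        if phrase:                      # flush an unfinished phrase at the end
            output.append((dictionary[phrase[:-1]], phrase[-1]))
        return output

    print(lz78_parse("abababa"))   # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a')]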

Overview

Basics:

  • Uncertainty (entropy), mutual information
  • Fano's lemma, data processing inequality

Source Coding:

  • Shannon's source coding theorem
  • Coding methods for memoryless sources: Shannon-Fano, Huffman, Tunstall, and arithmetic coding
  • Coding for sources with memory

Channel Coding:

  • Concepts of linear binary block codes
  • Shannon's channel coding theorem
  • Random coding and error exponent
  • MAP and ML decoding
  • Bounds
  • Channels and capacities: Gaussian channel, fading channel

Multi-User Systems:

  • Duplex transmission
  • Multiple-access channel (MAC)
  • Broadcast channel (BC)
  • MIMO channel

Cryptography:

  • Basics

Important News

  • The lecture on July 12 will start at 11:00am!

  • Frederic Knabe will be on vacation from July 16 until August 3. Questions concerning the lecture or exercise cannot be answered before August 6. For urgent questions, please contact Carolin Huppert.

Dates

  • April 19, 10:00am: Lecture 1
  • April 23, 1:30pm: Lecture 2
  • April 26, 10:15am: Exercise 1
  • April 30, 1:30pm: Lecture 3
  • May 3, 10:15am: Exercise 2
  • May 7, 1:30pm: Lecture 4, part of Exercise 3 (3.3 and 3.4)
  • May 10, 10:15am: Lecture 5, given by Prof. Bossert
  • May 14, 1:30pm: Rest of Exercise 3 (3.1 and 3.2), Exercise 4 (4.1)
  • May 17: Holiday
  • May 21, 1:30pm: Lecture 6
  • May 24, 10:15am: Lecture 7
  • May 28: Holiday
  • May 31, 10:15am: Lecture 8
  • June 4, 1:30pm: Rest of Exercise 4 (4.2), Exercise 5
  • June 11, 1:30pm: Exercise 6
  • June 14, 10:15am: Lecture 9
  • June 18, 1:30pm: Exercise 7
  • June 21, 10:15am: Lecture 10
  • June 25, 1:30pm: Exercise 8
  • June 28, 10:15am: Lecture 11
  • July 2, 1:30pm: Exercise 9
  • July 5, 10:15am: Lecture 12
  • July 9, 1:30pm: Exercise 10
  • July 12, 11:00am: Lecture 13
  • July 16: no exercise
  • July 19, 10:15am: Time for questions

Exercise Sheets

  • Sheet 1 (Introduction to probability theory, Expected value (2), Law of total expectation, Probability calculus, Moment generating function)
  • Sheet 2 (Entropy and mutual information during data transmission, Entropy of the geometric distribution, Conditional entropies, Conditional mutual information, Kullback-Leibler distance and average mutual information)
  • Sheet 3 (Typical sequences, Uniquely decodable and prefix-free codes, Shannon-Fano code, Huffman code)
  • Sheet 4 (Efficiency of different source coding schemes, Markov sources)
  • Sheet 5 (Linear block codes, Shortened Hamming code, Jointly typical sequences)
  • Sheet 6 (Binary symmetric erasure channel, Channel capacity for quantized AWGN channel)
  • Sheet 7 (Gaussian distribution maximizes entropy, The mutual information game)
  • Sheet 8 (Waterfilling, Tomlinson-Harashima precoding)
  • Sheet 9 (Degraded broadcast channels, Broadcast vs. TDMA)
  • Sheet 10 (TDMA in the multiple-access channel, Square multiple-access channel, M-user multiple-access channel)

Lab

References

  • Thomas M. Cover and Joy A. Thomas, "Elements of Information Theory", Library ID: QAA 170/2006 C
  • Rolf Johannesson, "Informationstheorie", Library ID: QAA 170/1992 J (in German, can also be bought in our secretariat for 20€)
  • James L. Massey, Lecture Notes on "Applied Digital Information Theory I", ETH Zürich (pdf)
  • Former German lecture notes by Prof. Bossert (pdf)

Summer Term 2012

Lecture: Thursday, 10:15 - 12:45, Room 45.2.102
Exercise: Monday, 13:30 - 15:00, Room 43.2.103

Contact

Lecturers:
Prof. Dr.-Ing. Martin Bossert
Dr.-Ing. Carolin Huppert
Supervisor:
Dipl.-Ing. Frederic Knabe

Language

English

Requirements

Bachelor
Probability Theory

Exams

Usually an oral exam; otherwise a written exam of 120 minutes duration.

More Information

Hours per Week: 3V + 2Ü + 1P (3 h lecture, 2 h exercise, 1 h lab)
8 ECTS Credits
LSF - ENGJ 8023