Advanced Topics in Information Theory

Contents

Building on the theory covered in the course Applied Information Theory, we will continue to explore fundamental topics concerning the reliable transmission of information and applications of information theory in other scientific fields.

In the first part of this course we will follow Shannon's ingenious approach of representing any communication system geometrically. Signals and noise are represented as points in high-dimensional spaces. Using geometric reasoning, a number of basic results in information and communication theory can be deduced, yielding a deeper understanding of some very complex subjects.

In the second part of this course we will discuss some successful and some controversial applications of information theory in cryptography, physics, and biology, as well as estimates of the ultimate limits of computation and communication.
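A central result behind Shannon's geometric treatment is the capacity of the band-limited Gaussian channel, C = W log2(1 + S/N) bits per second. As a small preview (the channel parameters below are illustrative values, not course material), this can be sketched as:

```python
import math

def awgn_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity of a band-limited AWGN channel, in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative example: a 3 kHz telephone-grade channel at 30 dB SNR.
snr = 10 ** (30 / 10)            # convert 30 dB to a linear power ratio (1000)
capacity = awgn_capacity(3000.0, snr)
print(round(capacity))           # about 29902 bits per second
```

No code can transmit reliably above this rate, and (by the channel coding theorem) suitable codes approach it arbitrarily closely.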

Topics

The course will cover approximately the following topics:

  • Information as a universal interpretation of Kolmogorov's mathematical expectation
  • Claude Elwood Shannon, the founding father of Information theory
  • Sampling and quantization - basic steps from the real to the digital world
  • Geometry of signals and codes
  • Optimal error control codes
  • Simplex conjecture - strong and weak interpretations
  • Asymptotic bounds on communications
  • The channel coding theorem - from a geometric point of view
  • Error exponent and channel capacity
  • Generalized error exponent and channel capacity
  • Channel coding theorem and ultimate limits of computation and communication
  • Information theory and security
  • Information theory and physics
  • Information theory and genetics
  • Future topics of information theory: Compressed Sensing and the Hilbert-Kolmogorov super-compression of information in neural networks

References

  • Claude E. Shannon: "Communication in the Presence of Noise", Proceedings of the IRE, Vol. 37, No. 1, pp. 10-21, January 1949
  • John M. Wozencraft and Irwin M. Jacobs: "Principles of Communication Engineering", Wiley, 1965, and Waveland Press, Inc., 1990
  • Claude E. Shannon: "Probability of Error for Optimal Codes in a Gaussian Channel", BSTJ, Vol. 38, No. 3, pp. 611-656, May 1959, http://www.alcatel-lucent.com/bstj/vol38-1959/articles/bstj38-3-611.pdf

Further References and Recommended Readings

  • Robert G. Gallager: "Information Theory and Reliable Communication", Wiley, 1968.
  • Charles L. Weber: "Elements of Detection and Signal Design", McGraw-Hill, 1968 and Reprinted Springer Verlag, 1988.
  • R. Blahut, "Principles and Practice of Information Theory", Addison Wesley, 1987.
  • D. J. MacKay, "Information Theory, Inference and Learning Algorithms", Cambridge, 2004.
  • Claude E. Shannon: "A mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October 1948.
  • Imre Csiszár, János Körner: "Information Theory: Coding Theorems for Discrete Memoryless Systems", 3rd edition, Akademiai Kiado, Budapest.
  • Thomas M. Cover and Joy A. Thomas: Elements of Information Theory, second edition, Wiley, 2006.
Materials

Lectures

Exercises

Further Material

Important News

Please check this site regularly for any last-minute changes and announcements!

Summer Term 2013

Lecture: Wednesday, 12:30 - 14:00,
Room 45.2.103
Exercise: Thursday, 12:30 - 14:00 (biweekly),
Room 45.2.103

Contact

Lecturers:
Dr.-Ing. Dejan Lazich
Supervisors:
Dipl.-Ing. Henning Zörlein

Language

English

Requirements

Bachelor/Vordiplom
Knowledge of basic terms in Information Theory is preferable but not necessary
Mathematical prerequisites: linear algebra, analysis, probability, and combinatorics

Exams

Usually an oral exam of 30 minutes duration.

More Information

Hours per Week: 2V + 1Ü (2 h lecture, 1 h exercises)
5 ECTS Credits
LSF - ENGJ 71520