Iterative Methods for Wireless Communications

First Lecture

Tuesday, 17 October 2023, 14:00 h

Registration

We use the learning platform Moodle to provide downloads. To access this material, you must register for the course in Moodle:

Moodle course


Contents

Iterative methods are motivated by two classical examples: Newton's method for finding the roots of nonlinear functions, and the Jacobi and Gauss-Seidel methods for solving large systems of linear equations. Based on these examples, the convergence and convergence rates of iterative methods are discussed. The concept of the fixed-point iteration is used to give a graphical interpretation of iterative processes.
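The two motivating examples can be sketched in a few lines of Python. This is a minimal illustration, not the lecture's formulation: the function names, the test function f(x) = x² − 2, and the small example system are chosen here only for demonstration.

```python
import numpy as np

def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:  # stop when the update is small
            break
    return x

def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Jacobi iteration: x^{(k+1)} = D^{-1} (b - (A - D) x^{(k)})."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    D = np.diag(A)               # diagonal of A as a vector
    R = A - np.diagflat(D)       # off-diagonal part of A
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D  # elementwise division by the diagonal
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

# Root of f(x) = x^2 - 2, i.e. sqrt(2)
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)

# Diagonally dominant system, so the Jacobi iteration converges;
# the exact solution is x = [1, 2]
A = [[4.0, 1.0], [2.0, 3.0]]
b = [6.0, 8.0]
x = jacobi(A, b)
```

Note how both routines share the same pattern discussed in the lecture: repeat an update rule until successive iterates change by less than a tolerance.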

Chapter 2 introduces the concept of vector-valued transmission. Based on this, we derive the optimum receiver structure for general linear modulation methods. In addition to the optimum vector equalizer, various suboptimum methods (block linear equalizer, block decision feedback equalizer, multistage detector) are discussed in Chapter 3. Furthermore, iterative equalizers are introduced and their relation to recurrent neural networks is described in Chapter 4. As an application, we consider code division multiplexing in Chapter 5.

Chapter 6 first introduces the basic concepts of iterative decoding: maximum a posteriori decoding, probability theory for iterative decoding, and Tanner graphs as a means of graphically representing iterative decoding. As applications, we consider low-density parity-check codes and convolutional self-orthogonal codes in Chapter 7.
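The parity-check structure that a Tanner graph represents (check nodes = rows of H, variable nodes = columns) can be illustrated with the small (7,4) Hamming code; this is only a sketch of the syndrome idea, not the iterative decoders covered in the lecture.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column i is the
# binary representation of i+1 (row 0 = least significant bit).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

c = np.array([1, 1, 1, 0, 0, 0, 0])  # a valid codeword: H c^T = 0 (mod 2)

r = c.copy()
r[4] ^= 1                      # channel flips bit at index 4
s = H @ r % 2                  # syndrome: each entry is one violated check
pos = int(s @ [1, 2, 4]) - 1   # syndrome read as binary gives the bit index
r[pos] ^= 1                    # single-error correction
```

In a Tanner graph, each nonzero entry of H becomes an edge between a check node and a variable node; iterative decoding passes messages along exactly these edges.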

In Chapter 8, iterative methods for concatenated systems are considered. This includes a discussion of classical turbo codes as well as receiver concepts based on joint demapping, equalization, and decoding (turbo equalization). As a further example, we consider the basic principle of interleave division multiplexing. The iterative methods are analysed using EXIT charts.

References

  • J. Lindner, "Informationsübertragung – Grundlagen der Kommunikationstechnik", Springer-Verlag, Berlin, 2005
  • S. Haykin, "Neural Networks – A Comprehensive Foundation", Prentice Hall, 1999
  • S. J. Johnson, "Iterative Error Correction – Turbo, Low-Density Parity-Check and Repeat-Accumulate Codes", Cambridge University Press, 2007

Semesterapparat

In addition, the "Semesterapparat" (course reserve collection) for this lecture may be of interest.

Winter Semester 2023/24
Lecture: Tuesday, 14:00 - 15:30 h
Exercise: Tuesday, 15:45 - 16:30 h
Language

English

Requirements

Bachelor

Exams

Usually an oral exam; otherwise a written exam of 90 minutes.

Further Information

Hours per week: 2V + 1Ü (2 h lecture + 1 h exercises)
4 ECTS Credits
LSF - ENGC 71150