Master's Thesis - Error Performance of Turbo Codes


In recent years iterative decoding has regained popularity, following the remarkable results presented in a paper by a group of French researchers. They introduced a new family of convolutional codes, nicknamed "Turbo codes" after their resemblance to the turbo engine. A turbo code is built from a parallel concatenation of two recursive systematic convolutional codes linked together by nonuniform interleaving. Decoding is performed iteratively by two separate a posteriori probability decoders, each using the soft decoding results of the other. For sufficiently large interleaver sizes, the error correction performance comes close to Shannon's theoretical limit.
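The parallel concatenation described above can be sketched in a few lines of Python. The memory-2 (7,5) RSC component code and the pseudorandom interleaver below are illustrative assumptions, not necessarily the specific codes examined in this thesis.

```python
import random

def rsc_encode(bits, state=(0, 0)):
    """Recursive systematic convolutional (RSC) encoder with feedback
    polynomial 1 + D + D^2 (7 octal) and feedforward polynomial
    1 + D^2 (5 octal); returns the parity sequence (an assumed,
    commonly used component code)."""
    s0, s1 = state
    parity = []
    for u in bits:
        a = u ^ s0 ^ s1        # feedback bit (taps 1 + D + D^2)
        parity.append(a ^ s1)  # feedforward output (taps 1 + D^2)
        s0, s1 = a, s0         # shift the register
    return parity

def turbo_encode(bits, interleaver):
    """Rate-1/3 turbo encoder: the systematic bits, parity from the
    first RSC encoder, and parity from a second identical RSC encoder
    fed with the interleaved bits."""
    parity1 = rsc_encode(bits)
    parity2 = rsc_encode([bits[i] for i in interleaver])
    return bits, parity1, parity2

# Example with a small pseudorandom interleaver (one of several
# interleaver types; real turbo codes use much larger block sizes).
random.seed(0)
n = 16
interleaver = random.sample(range(n), n)
u = [random.randint(0, 1) for _ in range(n)]
sys_bits, p1, p2 = turbo_encode(u, interleaver)
```

Because the component encoders are systematic, the information bits appear unchanged in the output, and the two parity streams differ only through the interleaver, which is what makes the iterative exchange of soft information between the two decoders effective.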

In this Master's Thesis we examine the performance of turbo codes on the additive white Gaussian noise channel. The influence of the encoder memory size and of different interleaver types and sizes is examined, together with two different decoding algorithms: the one-way algorithm and the two-way algorithm. We show that the two algorithms have the same performance and that the choice of interleaver and encoder is important.