20810545 - INFORMATION THEORY

Acquisition of theoretical background in information theory, and of methodologies and technologies for source coding of mono- and multimedia signals, i.e., for reducing redundancy in both lossless and lossy compression. Acquisition of theoretical background, methodologies, and technologies for channel coding, i.e., for protecting digital communications against errors caused by distortion and noise. Case studies on multimedia and multisensory systems.

Curriculum

teacher profile | teaching materials

Programme

Elements of information theory: entropy of a source, relative entropy. Joint entropy and conditional entropy. Sufficient statistics. Lossless source coding: Optimal codes. Codeword length limits for optimal codes. Kraft inequality for uniquely decodable codes. Huffman and Shannon-Fano-Elias encoders. Universal source coding. Arithmetic encoders. Lempel-Ziv encoder.
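As an illustrative sketch of the source-coding topics above (not part of the official course material), the following Python fragment computes the entropy of a discrete source and the codeword lengths of an optimal Huffman code; for a dyadic source the average codeword length meets the entropy bound exactly:

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal (Huffman) prefix code."""
    # Heap items: (probability, tie-break counter, merged symbol indices)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:           # every symbol in the merged subtree
            lengths[s] += 1         # moves one level deeper
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]          # a dyadic source
H = entropy(probs)                          # 1.75 bits
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
# For a dyadic source the Huffman code achieves L = H exactly.
```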
Equivocation, mutual information rate, channel capacity. Capacity of binary symmetric channels and of band-limited channels affected by additive Gaussian noise.
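The two capacity results above can be sketched as follows (an illustrative example, not course material): the binary symmetric channel with crossover probability p has capacity 1 − H_b(p), and the additive white Gaussian noise channel has capacity ½·log2(1 + SNR) bits per (real) channel use:

```python
from math import log2

def h2(p):
    """Binary entropy function H_b(p), in bits."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

def awgn_capacity(snr):
    """Capacity of an AWGN channel in bits per real channel use:
    C = 0.5 * log2(1 + SNR)."""
    return 0.5 * log2(1 + snr)
```

A useless channel (p = 0.5) has zero capacity, while a noiseless one (p = 0) carries one bit per use.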
Shannon's theorem on channel coding. Fano inequality. Separation theorem between source coding and channel coding.
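For reference, Fano's inequality (stated here in the notation of the core text by Cover and Thomas) bounds the equivocation by the error probability $P_e = \Pr(\hat{X} \neq X)$ of any estimator $\hat{X}$ of $X$ built from $Y$:

```latex
H(X \mid Y) \;\le\; H_b(P_e) + P_e \log_2\bigl(\lvert \mathcal{X} \rvert - 1\bigr)
```

where $H_b$ is the binary entropy function; it is the key step in proving the converse of Shannon's channel coding theorem.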
Linear block codes: definition, generator matrix, parity checks, systematic codes. Error detection and correction for linear block codes. Syndrome. Dual code of a linear block code. Optimal decoder. Error detection and correction for binary symmetric channels. Standard array. Performance. Galois fields: definitions and properties. Cyclic codes. Hamming codes. Reed-Solomon codes.
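As a hypothetical worked example of syndrome decoding for a linear block code (the specific systematic (7,4) Hamming code below is chosen for illustration and is not necessarily the one used in the course):

```python
import numpy as np

# Systematic (7,4) Hamming code: G = [I | P], H = [P^T | I], arithmetic mod 2.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
H = np.hstack([P.T, np.eye(3, dtype=int)])

def encode(msg):
    """Encode a 4-bit message into a 7-bit codeword."""
    return (np.array(msg) @ G) % 2

def decode(received):
    """Syndrome decoding: corrects any single-bit error."""
    r = np.array(received)
    syndrome = (H @ r) % 2
    if syndrome.any():
        # A single error at position pos produces syndrome = column pos of H.
        for pos in range(7):
            if np.array_equal(H[:, pos], syndrome):
                r[pos] ^= 1
                break
    return r[:4]  # systematic code: message bits come first

msg = [1, 0, 1, 1]
cw = encode(msg)
cw[2] ^= 1                       # inject a single bit error
assert list(decode(cw)) == msg   # the error is corrected
```

Since all seven columns of H are distinct and nonzero, every single-bit error has a unique syndrome, which is exactly the single-error-correcting property of the Hamming code.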
Convolutional codes: definitions and properties. Maximum likelihood decoding: binary symmetric channels and Gaussian channels. Markov chains: definitions and properties.
Viterbi algorithm: principle, implementation, and performance.
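A minimal hard-decision Viterbi decoder can illustrate the principle; the rate-1/2, constraint-length-3 code with generators (7, 5) in octal used below is a common textbook choice, assumed here for illustration:

```python
G1, G2 = 0b111, 0b101  # generator polynomials (7, 5) in octal

def conv_encode(bits):
    """Rate-1/2 convolutional encoder; state holds the last two input bits."""
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state
        out += [bin(reg & G1).count("1") % 2, bin(reg & G2).count("1") % 2]
        state = reg >> 1
    return out

def viterbi_decode(received, n_bits):
    """Hard-decision Viterbi: minimize Hamming distance over trellis paths."""
    INF = float("inf")
    metrics = [0, INF, INF, INF]   # encoder starts in state 0
    paths = [[], None, None, None]
    for t in range(n_bits):
        r = received[2 * t: 2 * t + 2]
        new_metrics = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metrics[s] == INF:
                continue
            for b in (0, 1):
                reg = (b << 2) | s
                o = [bin(reg & G1).count("1") % 2, bin(reg & G2).count("1") % 2]
                m = metrics[s] + (o[0] != r[0]) + (o[1] != r[1])
                ns = reg >> 1
                if m < new_metrics[ns]:      # keep the survivor path
                    new_metrics[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metrics, paths = new_metrics, new_paths
    return paths[metrics.index(min(metrics))]

msg = [1, 0, 1, 1, 0, 0]          # two tail bits drive the encoder to zero
enc = conv_encode(msg)
enc[2] ^= 1                        # single channel error
assert viterbi_decode(enc, 6) == msg
```

With free distance 5, this code lets the maximum-likelihood (Viterbi) decoder correct up to two channel errors within a constraint span.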
Turbo codes: definitions and operating principle.
Concatenated codes. Recursive systematic convolutional encoders. Interleavers for convolutional codes. Calculation of the a posteriori probability (APP) for turbo codes. Operating principle of hybrid ARQ protocols.
Turbo code decoders: decoding algorithm.

Core Documentation

Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd ed., John Wiley & Sons, 2006.

Reference Bibliography

C. E. Shannon, "A Mathematical Theory of Communication," The Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, July and October 1948.

Attendance

Attendance is not mandatory but is recommended.

Type of evaluation

During the course, ongoing tests will assess the students' preparation. There will be two tests: a) the first, in written form, aimed at assessing the student's analytical skills; b) the second, in oral form, aimed at evaluating the student's theoretical knowledge.
