20410442 - IN420 - Information Theory

Introduce the key questions in the theory of signal transmission and the quantitative analysis of signals, including the notions of entropy and mutual information. Present the underlying algebraic structure. Apply the fundamental concepts to coding theory, data compression, and cryptography.

BONIFACI VINCENZO


Programme

1. Introduction to information theory
Reliable transmission of information. Shannon's information content. Measures of information. Entropy, mutual information, informational divergence. Data compression. Error correction. Data processing theorems. Fundamental inequalities. Information diagrams. Informational divergence and maximum likelihood.
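As a small illustration of these measures of information (not part of the official course materials), entropy and mutual information can be computed directly from a joint distribution via the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the joint distribution below is an arbitrary example chosen for this sketch:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# An arbitrary joint distribution of (X, Y) on {0,1} x {0,1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
px = [sum(v for (x, _), v in joint.items() if x == a) for a in (0, 1)]
py = [sum(v for (_, y), v in joint.items() if y == b) for b in (0, 1)]

h_x = entropy(px)                     # H(X)
h_y = entropy(py)                     # H(Y)
h_xy = entropy(list(joint.values())) # H(X,Y)
mi = h_x + h_y - h_xy                # I(X;Y)
```

Here both marginals are uniform, so H(X) = H(Y) = 1 bit, and the dependence between X and Y shows up as a strictly positive mutual information.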

2. Source coding and data compression
Typical sequences. Typicality in probability. Asymptotic equipartition property. Block codes and variable length codes. Coding rate. Source coding theorem. Lossless data compression. Huffman code. Universal codes. Ziv-Lempel compression.
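A minimal sketch of Huffman coding (an illustration, not the course's reference implementation): the two least probable symbols are repeatedly merged, and each merge prepends one bit to the codewords in the merged subtree. For the dyadic frequencies used below, the resulting average codeword length matches the source entropy exactly.

```python
import heapq

def huffman(freqs):
    """Return a prefix-free code (symbol -> bitstring) for the given frequencies."""
    # Each heap entry: (weight, tiebreaker, partial code for the subtree).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```

With these frequencies the codeword lengths are 1, 2, 3, 3 bits, giving an expected length of 1.75 bits per symbol, equal to the entropy of the source.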

3. Channel coding
Channel capacity. Discrete memoryless channels. Information transmitted over a channel. Decoding criteria. Noisy channel coding theorem.
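As a worked example of channel capacity (added here for illustration; the names are this sketch's own), the binary symmetric channel with crossover probability p has capacity C = 1 - H2(p), where H2 is the binary entropy function:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0, 1):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H2(p) of the binary symmetric channel."""
    return 1 - h2(p)

# A noiseless channel (p = 0) carries 1 bit per use; a channel that
# flips each bit with probability 1/2 carries no information at all.
c_clean = bsc_capacity(0.0)
c_useless = bsc_capacity(0.5)
c_noisy = bsc_capacity(0.11)  # roughly half a bit per channel use
```

The noisy channel coding theorem says that any rate below C is achievable with vanishing error probability, so a BSC with p = 0.11 still supports reliable communication at rates just under about 0.5 bits per use.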

4. Further codes and applications
Hamming space. Linear codes. Generator matrix and parity-check matrix. Cyclic codes. Hash codes.
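These ingredients can be illustrated with the binary Hamming (7,4) code (a sketch in systematic form; the matrices below are one standard choice, not material from the course): messages are encoded as c = mG, and a single bit error is located because the syndrome Hr^T equals the column of H at the error position.

```python
# Systematic generator matrix G and parity-check matrix H of the
# binary Hamming (7,4) code; all arithmetic is modulo 2.
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def encode(msg):
    """Encode a 4-bit message as a 7-bit codeword: c = m G (mod 2)."""
    return [sum(msg[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def syndrome(word):
    """Compute H w^T (mod 2); nonzero iff the word is not a codeword."""
    return [sum(H[r][j] * word[j] for j in range(7)) % 2 for r in range(3)]

def correct(word):
    """Correct a single bit error by matching the syndrome to a column of H."""
    s = syndrome(word)
    if any(s):
        pos = next(j for j in range(7) if [H[r][j] for r in range(3)] == s)
        word = word[:]
        word[pos] ^= 1
    return word

c = encode([1, 0, 1, 1])
r = c[:]
r[2] ^= 1  # flip one bit in transit
```

Since the 7 nonzero columns of H are distinct, every single-bit error produces a distinct syndrome, which is exactly why the code corrects one error per block.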


Core Documentation

David J. C. MacKay. Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2004.


Reference Bibliography

Thomas M. Cover, Joy A. Thomas. Elements of Information Theory. Wiley, 1991.
Venkatesan Guruswami, Atri Rudra, Madhu Sudan. Essential Coding Theory. Draft available online, 2019.
Richard E. Blahut. Algebraic Codes for Data Transmission. Cambridge University Press, 2003.
Timothy C. Bell, John G. Cleary, Ian H. Witten. Text Compression. Prentice-Hall, 1990.

Type of delivery of the course

In-person lectures, with recitation sessions and/or laboratory activities.

Type of evaluation

Oral exam.
