20410442 - IN420 - Information Theory

Introduce key questions in the theory of signal transmission and the quantitative analysis of signals, such as the notions of entropy and mutual information. Show the underlying algebraic structure. Apply the fundamental concepts to coding theory, data compression and cryptography.

Curriculum

teacher profile | teaching materials

Shared course (mutuazione): 20410442 IN420 - TEORIA DELL'INFORMAZIONE in Scienze Computazionali LM-40, taught by VINCENZO BONIFACI

Programme

1. Introduction to information theory.
Reliable transmission of information. Shannon's information content. Measures of information. Entropy, mutual information, informational divergence. Data compression. Error correction. Data processing theorems. Fundamental inequalities. Information diagrams. Informational divergence and maximum likelihood.
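
The central quantities listed above (entropy, informational divergence) have direct one-line formulas. As an illustrative sketch, not part of the official course material, the following computes the Shannon entropy of a distribution and the informational (Kullback-Leibler) divergence between two distributions, in bits:

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution p, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Informational (Kullback-Leibler) divergence D(p || q), in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# A fair coin carries one bit of entropy; a biased coin carries less,
# and the deficit equals its divergence from the uniform distribution.
fair = [0.5, 0.5]
biased = [0.9, 0.1]
print(entropy(fair))                 # 1.0
print(entropy(biased))               # about 0.469
print(kl_divergence(biased, fair))   # about 0.531
```

Note the identity visible in the example: for a binary alphabet, D(p || uniform) = 1 - H(p).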

2. Source coding and data compression
Typical sequences. Typicality in probability. Asymptotic equipartition property. Block codes and variable-length codes. Coding rate. Source coding theorem. Lossless data compression. Huffman code. Universal codes. Lempel-Ziv compression.
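
The Huffman construction mentioned above is a short greedy algorithm: repeatedly merge the two least probable subtrees. A minimal sketch (illustrative, not from the course materials) using a binary heap:

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code from a symbol -> frequency map.

    Returns a dict mapping each symbol to its codeword (a bit string).
    Greedy rule: repeatedly merge the two least frequent subtrees,
    prepending '0' to one side and '1' to the other.
    """
    # Heap entries: (frequency, unique tiebreak, {symbol: codeword-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]
```

For a dyadic source such as {a: 1/2, b: 1/4, c: 1/8, d: 1/8}, the resulting codeword lengths (1, 2, 3, 3) achieve the source entropy exactly, matching the source coding theorem's bound.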

3. Channel coding
Channel capacity. Discrete memoryless channels. Information transmitted over a channel. Decoding criteria. Noisy channel coding theorem.
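
For the simplest discrete memoryless channel, the binary symmetric channel with crossover probability p, the capacity has the closed form C = 1 - H(p). A small sketch (an illustration under that standard formula, not course-provided code):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity, in bits per channel use, of a binary symmetric
    channel with crossover probability p: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0: a noiseless channel
print(bsc_capacity(0.5))   # 0.0: output independent of input
```

The noisy channel coding theorem says rates below this capacity are achievable with vanishing error probability, and rates above it are not.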

4. Further codes and applications
Hamming space. Linear codes. Generator matrix and parity-check matrix. Cyclic codes. Hash codes.
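
As a concrete instance of the generator/parity-check machinery, here is a sketch (illustrative, with one common systematic choice of matrices) of the [7,4] Hamming code: encoding is a vector-matrix product over GF(2), and a nonzero syndrome flags an error:

```python
# Generator matrix (systematic form) of the [7,4] Hamming code:
# the first 4 columns carry the message, the last 3 are parity bits.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
# Parity-check matrix H, chosen so that H * c = 0 for every codeword c.
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(msg):
    """Encode 4 message bits into a 7-bit codeword: c = m * G over GF(2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def syndrome(word):
    """Syndrome H * w over GF(2); all zeros iff w is a codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
```

Any single bit flip produces a nonzero syndrome, which is why this code corrects one error per block.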


Core Documentation

David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.


Type of delivery of the course

In-person lectures with recitation sessions.

Type of evaluation

Oral exam.
