Algorithmic Information Theory - FS23, USI

Course information

Time: Tuesdays and Thursdays, 14:30-16:15
Room: D1.14
Lecturer: Charles Bédard

For questions please contact Charles Bédard.

Course description

Algorithmic information theory uses optimal data compression to root the concept of information in computation rather than probability. Arising from the merger of Turing's theory of computation and Shannon's theory of information, algorithmic information encompasses many theoretical concepts, such as the Church-Turing thesis, prefix codes, Kolmogorov complexity, algorithmic randomness, Solomonoff induction and logical depth. It is also rich in applications, notably by giving a sharp expression of the incompleteness of mathematics and by detaching statistics and model selection from probabilities. Even more concrete are applications to machine learning, such as classification, generalization risk analysis and anomaly detection. All the above-mentioned concepts will be investigated to varying degrees of precision throughout the course.
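To make the compression-based view of information concrete, here is a minimal sketch of the normalized compression distance (NCD), the idea behind several of the seminar papers (information distance, compression-based data mining, gzip classification). It approximates the uncomputable Kolmogorov complexity C(x) by the length of a gzip-compressed string; the sample strings below are illustrative, not from the course.

```python
import gzip

def C(x: bytes) -> int:
    """Approximate Kolmogorov complexity of x by its gzip-compressed length."""
    return len(gzip.compress(x))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    Close to 0 for very similar strings, close to 1 for unrelated ones."""
    cx, cy = C(x), C(y)
    cxy = C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two highly similar repetitive strings and one unrelated byte pattern.
a = b"the quick brown fox jumps over the lazy dog " * 20
b = b"the quick brown fox jumps over the lazy cat " * 20
c = bytes(range(256)) * 4

print(ncd(a, b))  # small: similar strings compress well together
print(ncd(a, c))  # larger: concatenation saves almost nothing
```

Because NCD needs only a compressor, the same few lines support parameter-free classification or clustering of texts, genomes, or time series, as the seminar papers explore.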

AIT Seminar

On June 1st (14:30, D1.14), Charles Bédard will moderate a conversation between Charles Bennett and David Deutsch on the nature of computation, incompleteness and mathematics.
Papers, dates and speakers:
Information distance, Charles H. Bennett, Peter Gács, Ming Li, Paul M.B. Vitányi, and Wojciech H. Zurek (IEEE Transactions on Information Theory, 44(4):1407--1423, 1998)
May 25th, 14:30, D1.14: Marilena Palomba
Compression-based data mining of sequential data, Eamonn Keogh, Stefano Lonardi, Chotirat Ann Ratanamahatana, Li Wei, Sang-Hee Lee, and John Handley (Data Mining and Knowledge Discovery, 14:99--129, 2007)
May 25th, 14:30, D1.14: Pasquale Polverino
Algorithmic randomness and physical entropy, Wojciech H. Zurek (Physical Review A, 40(8):4731, 1989)
May 30th, 14:30, D1.14: William Schober
Logical depth and physical complexity, Charles H. Bennett (The Universal Turing Machine A Half-Century Survey, pages 227--257, 1988)
May 30th, 14:30, D1.14: Ali Egemen Ener
Stationary algorithmic probability, Markus Müller (Theoretical Computer Science, 411(1):113--130, 2010)
June 6th, 13:30, D1.15: Vincent Herrmann
Less is More: Parameter-Free Text Classification with Gzip, Zhiying Jiang, Matthew Y.R. Yang, Mikhail Tsirlin, Raphael Tang, Jimmy Lin (arXiv:2212.09410)
June 6th, 13:30, D1.15: Volodymyr Kyrylov
A theory of universal artificial intelligence based on algorithmic complexity, Marcus Hutter (arXiv preprint cs/0004001, 2000)
June 6th, 13:30, D1.15: Aditya Ramesh
Measuring complexity to infer changes in the dynamics of ecological systems under stress, Vasilis Dakos and Fernando Soler-Toscano (Ecological Complexity, 32:144--155, 2017)
June 6th, 13:30, D1.15: Tommaso Marzi
Fast whole-genome phylogeny of the COVID-19 virus SARS-CoV-2 by compression, Rudi L. Cilibrasi and Paul M.B. Vitányi (bioRxiv, pages 2020--07, 2020)
June 6th, 13:30, D1.15: Qianbo Zang
Algorithmic statistics: Forty years later, Nikolay Vereshchagin and Alexander Shen (Computability and Complexity, pages 669--737. Springer, 2017)
June 7th, 13:30, D1.15: Martina Boschi
The use of ideas of information theory for studying "language" and intelligence in ants, Boris Ryabko and Zhanna Reznikova (Entropy, 11(4):836--853, 2009)
June 7th, 13:30, D1.15: David Esteban Alarcón Flores, Davide Casnici
Quantum Kolmogorov complexity based on classical descriptions, Paul M.B. Vitányi (IEEE Transactions on Information Theory, 47(6):2464--2479, 2001)
June 7th, 13:30, D1.15: Lorenzo Laneve
Strongly universal quantum Turing machines and invariance of Kolmogorov complexity, Markus Müller (IEEE Transactions on Information Theory, 54(2):763--780, 2008)
June 7th, 13:30, D1.15: Carla Sophie Rieger