Telecom Paris
Dep. Informatique & Réseaux

J-L. Dessalles - Home page

December 2023

IA703: Algorithmic Information and A.I.

Lecturers:
            Jean-Louis Dessalles
            Nils Holzenberger










Objectives

Algorithmic Information Theory (AIT) is based on the mathematical notion of complexity, which was invented some 50 years ago to solve issues related to machine learning, randomness and proof theory. It derives from a fundamental intuition: complex objects cannot be described by short algorithms. Complexity corresponds to the size of algorithms (and not to their speed; see the caveat below).
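For instance (a minimal illustration in Python, not part of the course material): a very long but regular string can be produced by a program whose text is only a few characters long, whereas a string with no exploitable regularity cannot be described much more briefly than by quoting it in full.

    # Illustrative sketch: complexity as description (program) size, not speed.
    program = 'print("ab" * 500_000)'      # a ~21-character description...
    produced = "ab" * 500_000              # ...of a 1,000,000-character object

    print(len(program), "characters of program text describe",
          len(produced), "characters of output")

    # A "typical" random string of the same length admits no program much
    # shorter than the string itself: it essentially has to be written out.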

Creating artificial intelligence is one of the greatest challenges in the history of humankind. Programs are said to be "intelligent" because they solve difficult problems, such as playing the game of Go. Unfortunately, artificial intelligence is often perceived as no more than that: a collection of brilliant, innovative methods for solving problems. Most people don’t imagine that intelligent behaviour can be universally described in terms of algorithmic information.

There is currently a growing interest in complexity and AIT for their role in the theoretical foundations of Artificial Intelligence. Moreover, practical approaches to complexity based on compression techniques or minimum description length offer efficient techniques in machine learning. AIT plays an important role in mathematics, for instance to set limits on what a formal theory or an intelligent system can do. More recently, AIT has been shown to be essential for addressing aspects of human intelligence, such as perception, relevance, decision making and emotional intensity.
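As a hedged illustration of the compression-based approach mentioned above (using Python's standard zlib module as a crude stand-in for an ideal compressor; the compressed size is only an upper-bound proxy for description complexity), a regular string compresses to far fewer bytes than a random string of the same length:

    import random
    import zlib

    n = 100_000
    regular = b"ab" * (n // 2)                                # highly regular
    random.seed(0)
    noisy = bytes(random.getrandbits(8) for _ in range(n))    # looks incompressible

    # A short compressed output shows that a short description exists.
    print("regular string:", len(zlib.compress(regular, 9)), "bytes after compression")
    print("random string: ", len(zlib.compress(noisy, 9)), "bytes after compression")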

Caveat:

  1. This course does not address the notion of "computational complexity", which measures the speed of algorithms.







Content

Topics

Chapter 1.     Introduction to Algorithmic Information Theory (AIT)
    Complexity measured by code length.
    Complexity of integers.
    Conditional Complexity.

Chapter 2.     AIT and data: Measuring Information through compression
    Compressibility.
    Language recognition through compression.
    Huffman codes - Complexity and frequency.
    Zipf’s law.
    "Google" distance - Meaning distance.
    (A minimal compression-distance sketch is given after this list.)

Chapter 3.     Algorithmic information applied to mathematics
    Incomputability of C.
    Algorithmic probability - Algorithmic Information.
    Randomness.
    Gödel’s theorem revisited.

Chapter 4.     Machine Learning and Algorithmic Information
    Induction - Minimum Description Length (MDL).
    Analogy as complexity minimization.
    Machine Learning and compression.

Chapter 5.     Subjective information and simplicity
    Cognitive complexity.
    Simplicity and coincidences.
    Subjective probability & subjective information.
    Relevance.
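As announced under Chapter 2, here is a minimal compression-distance sketch (our own illustration, not official course code: it uses zlib as the compressor, and the function name ncd and the sample texts are ours). The normalized compression distance underlies topics such as language recognition through compression and meaning distance.

    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized compression distance, with zlib standing in for an ideal compressor."""
        cx = len(zlib.compress(x, 9))
        cy = len(zlib.compress(y, 9))
        cxy = len(zlib.compress(x + y, 9))
        return (cxy - min(cx, cy)) / max(cx, cy)

    # Texts that share vocabulary and structure should come out closer
    # (smaller distance) than unrelated texts; the values are only indicative.
    english_1 = b"the quick brown fox jumps over the lazy dog " * 20
    english_2 = b"the lazy dog sleeps while the quick brown fox runs away " * 20
    french    = b"le renard brun rapide saute par-dessus le chien paresseux " * 20

    print("English vs English:", round(ncd(english_1, english_2), 3))
    print("English vs French: ", round(ncd(english_1, french), 3))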

    
   read →     2023 Students’ micro-studies

Slides

    
   read →     Chapter 1 (coding & description complexity)
   read →     Chapter 2 (complexity, frequency & compression)
   read →     Chapter 3 (complexity & maths)
   read →     Chapter 4 (complexity & ML)
   read →     Chapter 5 (complexity, AI & cognition)

Readings

   read →     Interesting paper in The New Yorker on ChatGPT and compression

To read    ➜
   read →     Simplicité, abondance et évolution
   read →     Cryptographie en lien avec complexité de Kolmogorov

and also...







Validation

  1. Answers to questions during the lab sessions are recorded and evaluated.
  2. You will have to answer a short quiz on the last day.
  3. You will make a small original contribution (typically, as a continuation of a lab work question). The topic of the project must be related to K-complexity. You are expected to choose a topic of study and to do something for this project (typically, write a small program). The project can be done in pairs.
  4. You will write a small report (in English) to present your results. Your report should emphasize the link with Kolmogorov complexity and Algorithmic Information. All reports will be bundled up into proceedings made available to all registered students.
  5. You will have the opportunity to make an oral presentation of your work.

Your Project


Your project will typically build on some topic studied during the lab work sessions. You should pick a problem that you want to investigate further. A few suggestions are provided, but initiative is welcome in any case. Note: you have to "do" something (typically, write or extend a program).
  1. You should seek simple and clear-cut results from which we can learn something (even if your study is inconclusive, we want to know clearly why). Initiative and logical clarity will be appreciated.
  2. Try to be realistic about what you can do. Your study should not be trivial, and it should lead somewhere.
  3. Please favour Python when writing code.

Before December 12: ➜    Please indicate here what you intend to do as a project.

  -  If you change your mind, redo the registration.
  -  If you work as a pair, both members should enter the same title for the project.
  -  ➜    You may consult the others’ projects. Try to play a minority game in your choice (i.e. prefer a topic that few others have picked)!

Before December 17 at the latest:


Report

  1. Write your report    ➜    Please use this template: MSWord or LibreOffice or LaTeX or Pdf
  2. Upload your program and any relevant material    ➜    Uploading page







Short bibliography

In French:

    
