Computational Learning Theory and Beyond

Brought by: openHPI

Overview

In this T-shaped course you will be introduced to computational learning theory and get a glimpse of other research towards a theory of artificial intelligence. "T-shaped" means that, on the one hand, we concentrate on different learning models in depth; on the other hand, we give a broad overview and invite experts from other AI projects to show what else can be done in AI.

The focus is on learning from informant, a formal model of binary classification as performed, for example, by a support vector machine. Illustrative examples are linear separators and other uniformly decidable sets of formal languages. Due to results by Gold, the learning process can be assumed to be consistent, meaning that the current hypothesis never contradicts the data observed so far. Another legitimate assumption is that the learner performs mind-changes only when observing an inconsistency.
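
To make this concrete, here is a minimal Python sketch of a consistent and mind-change-on-inconsistency learner from informant. The class of 1-D linear separators (threshold classifiers over the integers), the data stream, and names such as ConservativeLearner are illustrative assumptions, not material from the course.

def classify(threshold, x):
    # Hypothesis h_t: label x positive iff x >= threshold.
    return x >= threshold

class ConservativeLearner:
    # Consistent: the hypothesis agrees with every datum seen so far.
    # Mind-changes happen only when the newest datum contradicts the
    # current hypothesis.

    def __init__(self):
        self.threshold = 0   # initial guess (illustrative)
        self.positives = []  # informant data seen so far
        self.negatives = []

    def update(self, x, label):
        (self.positives if label else self.negatives).append(x)
        # Change the hypothesis only on an inconsistency.
        if classify(self.threshold, x) != label:
            # Pick a threshold consistent with all data seen so far:
            # at or below every positive, strictly above every negative.
            hi = min(self.positives, default=None)
            self.threshold = hi if hi is not None else max(self.negatives) + 1
        return self.threshold

# An informant for the target threshold t = 5 eventually presents every
# integer together with its correct label.
learner = ConservativeLearner()
for x in [7, 2, 5, 9, 4, 1, 6]:
    print(x, x >= 5, "->", learner.update(x, x >= 5))

On this stream the hypothesis changes only on the data (2, False) and (5, True) and then stays at 5; in the limit such a learner identifies every threshold, which illustrates why both assumptions are legitimate for this class.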

After the proofs of these observations, the model is adjusted towards the setting of deep learning: the learner processes the data incrementally and keeps only its current hypothesis between presentations. This incremental model has less learning power than the full-information variant, as shown by a fundamental proof technique due to Blum and Blum. You will apply this technique to separate consistency, i.e., to show that requiring consistency properly restricts this model. Finally, we outline why this model suggests designing incremental learning algorithms that update their currently hypothesized classifier even when it is consistent with the observed datum.
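
As a hedged sketch of this last point, the fragment below models an incremental learner for the same hypothetical threshold class: between steps it keeps only its hypothesis, here an interval (lo, hi] of candidate thresholds, so information from a datum must be folded into the hypothesis immediately, even when the datum is already classified correctly. The interval encoding is an assumption chosen for illustration.

def iterative_update(hypothesis, x, label):
    # One step h_{n+1} = U(h_n, (x, label)) of an incremental learner.
    # The hypothesis (lo, hi) encodes that every threshold consistent
    # with the data seen so far lies in (lo, hi]; the classifier read
    # off from it predicts positive iff x >= hi.
    lo, hi = hypothesis
    if label:
        hi = min(hi, x)  # a positive datum bounds the threshold from above
    else:
        lo = max(lo, x)  # a negative datum bounds it from below
    return (lo, hi)

# Target threshold t = 5.
h = (float("-inf"), float("inf"))
for x in [9, 2, 7, 4, 5, 6, 1]:
    label = x >= 5
    consistent = (x >= h[1]) == label  # prediction before updating
    h = iterative_update(h, x, label)
    print((x, label), "consistent before:", consistent, "new h:", h)

The negative data (2, False) and (4, False) are consistent with the learner's current prediction, yet the update still raises lo: with no stored history, skipping these updates would lose information that no later step could recover.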

Beyond these models, you will get digestible insights into other approaches towards a theory of AI. These include stable matchings, evolutionary algorithms, fair clustering, game theory, low-dimensional embeddings, submodular optimization and 3-satisfiability.
Furthermore, additional models in computational learning theory are discussed.

  • Free
  • English
  • Certificate Available
  • Available at any time