Project 6: Explaining Classification Decisions of Hybrid Systems

Description of the project

In artificial intelligence, both machine learning approaches and approaches based on explicitly modeled knowledge, e.g. in the form of ontologies, are used. The modeled knowledge can be used to explain a system's decisions. Learning-based approaches, on the other hand, are useful when rule-based knowledge for decision-making is not (yet) available or only partially available; however, explanations usually cannot be derived from them directly. The aim of the proposed doctoral project is to develop methods for generating explanations for hybrid systems that have both a learning-based component (e.g. for the classification of sensor data) and a knowledge-based component (e.g. for deriving recommendations for action from the interpreted sensor data). In particular, the question arises as to how uncertain classifications produced by a machine learning system can be suitably integrated into coherent explanations of a hybrid system and communicated to human users.
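To make the setting concrete, the following is a minimal sketch of such a hybrid pipeline. All names, labels, and thresholds are illustrative assumptions, not part of the project description: a stand-in classifier returns an uncertain label, and a simple rule-based component derives a recommendation together with an explanation that surfaces the classifier's confidence.

```python
# Hypothetical hybrid pipeline: learned classification + rule-based
# recommendation, with uncertainty carried into the explanation.

def classify(sensor_value):
    """Stand-in for a learned classifier: returns (label, confidence).

    In a real system this would be a trained model over sensor data;
    the hard-coded values here are purely illustrative.
    """
    if sensor_value > 0.7:
        return "fall_detected", 0.85
    return "normal_activity", 0.95

# Knowledge-based component: rules mapping labels to actions, each
# guarded by a confidence threshold (all values are assumptions).
RULES = {
    "fall_detected": ("alert_caregiver", 0.8),
    "normal_activity": ("no_action", 0.5),
}

def recommend(label, confidence):
    """Apply the rule for the label and build a user-facing explanation
    that makes the classifier's uncertainty explicit."""
    action, threshold = RULES[label]
    if confidence >= threshold:
        explanation = (f"Recommended '{action}' because the classifier "
                       f"reported '{label}' with confidence {confidence:.2f} "
                       f"(rule threshold {threshold:.2f}).")
        return action, explanation
    explanation = (f"Withheld '{action}': confidence {confidence:.2f} for "
                   f"'{label}' is below the rule threshold {threshold:.2f}.")
    return "no_action", explanation

label, conf = classify(0.9)
action, why = recommend(label, conf)
```

The open research question is precisely what this sketch glosses over: how the classifier's confidence should be propagated through chains of knowledge-based inferences and communicated coherently, rather than merely reported alongside a single rule firing.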

Supervisors

First supervisor:

Prof. Dr. Birte Glimm, Institut für Künstliche Intelligenz, Universität Ulm

Tandem partner:

Prof. Dr. Michael Munz, Technische Hochschule Ulm

Consulting experts:

Prof. Dr. Hans Kestler, Institut für Medizinische Systembiologie, Universität Ulm

Prof. Dr. Matthias Klier, Institut für Business Analytics, Universität Ulm

Prof. Dr. Manfred Reichert, Institut für Datenbanken und Informationssysteme, Universität Ulm

Prof. Dr. Christian Schlegel, Technische Hochschule Ulm