P1: Event-Based Adaptive Vision for Sensing Computers
Cognitive technical systems require sensory devices to measure specific properties of the physical world and to interact successfully with their environment. For autonomous systems, such information-processing capabilities are constrained by, e.g., efficiency and storage requirements. Sparse and energy-efficient representations and processing architectures can address these constraints. For example, event-based cameras form part of a sensory front-end, while sparse representations in neuromorphic architectures define the basis for later interpretation of the data. New approaches to learning sparse event-based sensory representations will be investigated to enable the adaptation of such representations for task-specific sensory control.
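To make the notion of a sparse event-based representation concrete, the following sketch is illustrative only: the `Event` fields and the `events_to_frame` helper are assumptions for this example, not part of the project. It models an event camera's output as a stream of (t, x, y, polarity) tuples, emitted only where brightness changes, and accumulates them into a dense polarity frame:

```python
from dataclasses import dataclass
import numpy as np

# Illustrative model: an event-based camera emits sparse events
# (timestamp, pixel coordinates, polarity) instead of dense frames.
@dataclass
class Event:
    t: float  # timestamp in seconds
    x: int    # pixel column
    y: int    # pixel row
    p: int    # polarity: +1 brightness increase, -1 decrease

def events_to_frame(events, width, height):
    """Accumulate a sparse event stream into a dense polarity frame."""
    frame = np.zeros((height, width), dtype=np.int32)
    for e in events:
        frame[e.y, e.x] += e.p
    return frame

# Synthetic example: three events on a 4x4 sensor; only two pixels
# carry information, illustrating the sparsity of the representation.
events = [Event(0.001, 1, 2, +1), Event(0.002, 1, 2, +1), Event(0.003, 3, 0, -1)]
frame = events_to_frame(events, width=4, height=4)
```

Such an accumulated frame is only one possible downstream representation; learned sparse codes, as pursued in this project, would operate on the raw event stream instead.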
In this dissertation project, algorithms for sensory cognition will be investigated to support tasks such as object recognition or spatial navigation. These mechanisms will be specified, implemented, and evaluated in scenarios requiring efficient computation under limited energy resources. A special focus will be placed on learning and adaptation to complex real-world conditions, e.g., in scenarios of robotic sensing and interaction. Energy restrictions will be taken into account by utilizing suitable platforms for neuromorphic computing.