He has ink-blue ears and looks rather adorable with his dark camera eyes. The little robot carefully moves towards the scientists at Ulm University, who have affectionately nicknamed him Ninja Turtle. He helps researchers from neuroinformatics and cognitive psychology develop and test special algorithms that mimic human perception and cognition. The goal is to make the processing of visual and auditory sensor data more robust, efficient and fast. The project is part of the 'Neurorobotics' programme of the Baden-Württemberg Foundation and receives 500,000 euros in funding.
'The human brain remains one of the most effective data processing systems ever. Nervous systems in different species work extremely efficiently and are superior to many technical systems, particularly with regard to analysing sensory stimuli,' explains Professor Heiko Neumann. The Deputy Director of the Institute of Neural Information Processing is one of the successful applicants, as is Professor Marc Ernst, Director of the Department of Applied Cognitive Psychology. The VA-MORPH project's objective is to transfer neurobiological functions of the brain onto robotic and IT systems. The scientists focus on developing so-called neuromorphic algorithms that mimic the structure and working principles of the human brain and its elementary components, neurons. The starting point is this question: how can visual and auditory sensor streams be processed, consolidated and technically utilised, for example for spatial orientation and navigation?
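To give a flavour of what such an elementary building block looks like in code, here is a minimal sketch of a leaky integrate-and-fire neuron, one of the simplest standard models of a spiking neuron. It is purely illustrative: the parameter values are arbitrary and this is not code from the VA-MORPH project.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a sketch of the kind of
# elementary unit neuromorphic algorithms are built from. All parameters
# are illustrative, not taken from the VA-MORPH project.

def lif_simulate(inputs, tau=10.0, threshold=1.0, dt=1.0):
    """Integrate an input current sequence; emit a spike (True) whenever
    the membrane potential crosses the threshold, then reset to zero."""
    v = 0.0
    spikes = []
    for i in inputs:
        v += dt * (-v / tau + i)   # leaky integration of the input
        if v >= threshold:
            spikes.append(True)
            v = 0.0                # reset after the spike
        else:
            spikes.append(False)
    return spikes

# A constant drive charges the membrane until the neuron fires periodically.
spike_train = lif_simulate([0.2] * 20)
```

The essential point is that such a unit communicates only via discrete spikes rather than continuous values, which is what makes neuromorphic hardware so frugal with data.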
'Human perception is not clocked like a technical system, but event-based. This means that particular relevance is attributed to things that change over time. Out of all the information that floods the brain, it selects only those details that are "relevant to survival" and make sense in that particular situation,' elaborates cognitive psychologist Ernst. This process differs from a conventional camera, which captures the spatial environment in individual frames. The biological hearing process is just as complex and 'data-economical'. Here, the brain combines sensory signals with expectations based on various experiential contexts and uses this information to compute a multisensory overall impression. 'The integration of these sensory data streams is a masterly performance of the brain. Once we understand exactly how this works, we can try to transfer this operating mode onto technical sensor data processing systems,' the researchers from Ulm summarise their scientific mission. Brain-inspired hardware and software have become integral parts of today's cognitive computing, neurorobotics and artificial intelligence.
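The event-based principle described above is also how neuromorphic event cameras work: instead of streaming full frames at a fixed clock rate, each pixel reports an event only when its brightness changes noticeably. The following sketch illustrates the idea by comparing two frames; it is a simplified illustration, not the sensors or code used in the project.

```python
# Sketch of event-based sensing: a pixel generates an event only when its
# brightness changes by more than a threshold, with a polarity indicating
# the direction of change. Illustrative only; not the project's code.

def frame_to_events(prev_frame, frame, threshold=0.1):
    """Compare two frames pixel by pixel and return (row, col, polarity)
    events for pixels whose brightness change exceeds the threshold."""
    events = []
    for r, (prev_row, row) in enumerate(zip(prev_frame, frame)):
        for c, (p, q) in enumerate(zip(prev_row, row)):
            diff = q - p
            if abs(diff) > threshold:
                polarity = 1 if diff > 0 else -1
                events.append((r, c, polarity))
    return events

prev_frame = [[0.5, 0.5], [0.5, 0.5]]
frame      = [[0.5, 0.9], [0.1, 0.5]]
events = frame_to_events(prev_frame, frame)
# Only the two changed pixels produce events; the static ones stay silent.
```

Static regions of the scene generate no data at all, which is exactly the 'data-economical' behaviour the researchers aim to exploit.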
The scientists at Ulm University are developing biologically plausible learning methods that filter the 'relevant' information out of the entirety of the sensory data in order to derive neuromorphic algorithms. Neumann and Ernst and their doctoral candidates Christian Jarvers, Maximilian Löhr and Timo Oess now want to test the practicability and effectiveness of these human-style algorithms. They therefore implement them on the robot platform and start with simple orientation tasks. 'This is now the job of our little ninjabot: his task is to find and collect certain visual and acoustic "landmarks" without getting distracted by ambient noise and obstructed visibility,' explain Löhr and Oess.
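Combining a visual and an acoustic estimate of a landmark's position is a classic multisensory integration problem. A textbook approach, in the spirit of the maximum-likelihood cue integration studied in Ernst's field, weights each cue by its reliability (the inverse of its variance). The sketch below illustrates this principle with made-up numbers; it is not the project's actual fusion algorithm.

```python
# Reliability-weighted fusion of a visual and an auditory position estimate,
# following the standard maximum-likelihood cue-integration scheme.
# Values are illustrative, not from the VA-MORPH project.

def fuse_estimates(x_v, var_v, x_a, var_a):
    """Combine two noisy position estimates, weighting each cue by the
    inverse of its variance so the more reliable cue dominates."""
    w_v = 1.0 / var_v
    w_a = 1.0 / var_a
    x_fused = (w_v * x_v + w_a * x_a) / (w_v + w_a)
    var_fused = 1.0 / (w_v + w_a)      # fused estimate is more reliable
    return x_fused, var_fused

# Vision (reliable) localises a landmark at 2.0 m, audition (noisy) at 3.0 m:
x, var = fuse_estimates(2.0, 0.1, 3.0, 0.4)
```

The fused estimate lands closer to the more reliable visual cue, and its variance is smaller than either cue's alone, which is why integrating the senses pays off.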
Highly specialised computer architectures are employed to realise such algorithms. In this so-called brain-inspired hardware, processor and memory are not separated as they are in conventional computers. Instead, these components work together like neurons and their synaptic connections in the brain, allowing much faster and more efficient data processing. 'Thanks to our cooperation partners we have access to scientific equipment that is probably unique for a university,' the project team agrees excitedly. Technology giant IBM Research Almaden (USA) provides neuromorphic chip architectures from the field of brain-inspired computing. The scientists can furthermore access a hardware platform from the EU-funded Human Brain Project as well as special neuromorphic sensors from the company IniLabs.
If everything goes smoothly, the little robot Ninja Turtle will make his way through the laboratory unfazed by background noise and poor visibility, while using only a fraction of the computing capacity and memory of traditional computer architectures. The researchers from Ulm University are convinced: there is much to be gained from learning from the brain!
Text and media contact: Andrea Weber-Tuckermann