Learning Weights and Decay in SNN
Weight learning in ANNs is well established. Here, we explore how to simultaneously learn synaptic weights and the time constants of membrane decay in spiking neural networks, which addresses a fundamental research question: how to learn multiple independent quantities concurrently.
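A minimal sketch of the idea, assuming a PyTorch-style implementation with surrogate gradients; the class name, the log-parameterization of the time constant, and the surrogate slope are illustrative assumptions, not the project's actual code:

```python
# Sketch: a leaky integrate-and-fire layer where both the synaptic weights and the
# per-neuron membrane decay time constant receive gradients from the same loss.
import torch
import torch.nn as nn


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a fast-sigmoid surrogate gradient."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out / (1.0 + v.abs()) ** 2   # surrogate derivative


class LearnableLIF(nn.Module):
    def __init__(self, n_in, n_out, tau_init=10.0):
        super().__init__()
        self.fc = nn.Linear(n_in, n_out)                                  # learned weights
        # Parameterize tau > 0 via its log so gradient descent keeps it positive.
        self.log_tau = nn.Parameter(torch.full((n_out,), float(tau_init)).log())

    def forward(self, x):                                                 # x: (time, batch, n_in)
        decay = torch.exp(-1.0 / self.log_tau.exp())                      # per-neuron decay factor
        v = torch.zeros(x.shape[1], self.fc.out_features, device=x.device)
        spikes = []
        for x_t in x:                                                     # unroll over time
            v = decay * v + self.fc(x_t)                                  # leaky integration
            s = SpikeFn.apply(v - 1.0)                                    # threshold at 1.0
            v = v - s                                                     # soft reset
            spikes.append(s)
        return torch.stack(spikes)


# Both parameter groups are updated by the same backward pass:
layer = LearnableLIF(100, 10)
loss = layer(torch.rand(50, 4, 100)).sum()    # 50 time steps, batch of 4
loss.backward()
print(layer.fc.weight.grad.shape, layer.log_tau.grad.shape)
```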
Covariant Spatio-Temporal Receptive Fields in Spiking Neural Networks
We explore the initialization of visual receptive fields in spiking neural networks according to properties developed in scale-space theory for traditional (non-spiking) ANNs. Our work extends this theory to include temporal computation.
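A simplified illustration of the spatial part of this idea, assuming a PyTorch convolution layer; the helper names, the choice of scales, and the restriction to zeroth- and first-order Gaussian derivatives are assumptions for the sketch, and the temporal dimension is omitted:

```python
# Sketch: initialize convolutional kernels with Gaussian-derivative filters
# at several spatial scales, in the spirit of scale-space theory.
import math
import torch
import torch.nn as nn


def gaussian_derivative_kernel(size, sigma, order_x, order_y):
    """2D kernel built from a Gaussian (order 0) or its first derivative (order 1) per axis."""
    coords = torch.arange(size, dtype=torch.float32) - (size - 1) / 2.0
    g = torch.exp(-coords**2 / (2 * sigma**2)) / (math.sqrt(2 * math.pi) * sigma)
    dg = -coords / sigma**2 * g                      # first derivative of the Gaussian
    kx = dg if order_x == 1 else g
    ky = dg if order_y == 1 else g
    return torch.outer(ky, kx)                       # separable outer product -> 2D kernel


def init_scale_space(conv, sigmas=(1.0, 2.0, 4.0)):
    """Fill conv.weight with smoothing, d/dx, and d/dy kernels over several scales."""
    out_ch, in_ch, k, _ = conv.weight.shape
    orders = [(0, 0), (1, 0), (0, 1)]
    with torch.no_grad():
        for o in range(out_ch):
            sigma = sigmas[o % len(sigmas)]
            ox, oy = orders[(o // len(sigmas)) % len(orders)]
            conv.weight[o] = gaussian_derivative_kernel(k, sigma, ox, oy).expand(in_ch, k, k)
    return conv


conv = init_scale_space(nn.Conv2d(2, 9, kernel_size=9, bias=False))
print(conv.weight.shape)  # torch.Size([9, 2, 9, 9])
```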
Robotic Arm High-Speed Reaching and Grasping
A collaborative robot arm needs to be safe and convenient for humans to work with. We use extrinsic and intrinsic event-based vision cameras and event-based processing, e.g. spiking neural networks, to enable safe, fast, and intuitive robotic motion.
Event-based Vision Air Hockey Game
Play air hockey against a robot. Event cameras and low-latency signal processing on neuromorphic hardware give the robot player a competitive edge! Best of luck!
Event-based Vision in Challenging Automotive Scenarios
We investigate event-based cameras in cars. Besides the obvious benefits (e.g. lower latency and reduced processing demands), we are interested in possible advantages of event-driven systems in challenging scenarios, such as heavy rain, fog, snow, or glare.
Low-Power Event-Based Observation and Surveillance
We explore energy-efficient, real-time monitoring of complex traffic scenarios in urban areas. Using decentralized neuromorphic sensing and processing provides: (1) low latency, (2) minimal transmission bandwidth, (3) low power, and (4) enhanced privacy.
Embedded Event Cameras and Processing
Event cameras typically connect to a computer via USB, which is convenient for recording but forfeits many of their benefits: the system is no longer low-power or low-latency. We design embedded processing systems that directly connect event camera sensor chips to microcontrollers.
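Purely to illustrate the kind of lightweight, per-event processing that can run next to the sensor, here is a small sketch; the 32-bit word layout (timestamp / address / polarity fields) is a made-up example, not any specific sensor's protocol:

```python
# Sketch: unpack hypothetical address-event words and do a toy on-device task.
def decode_event(word: int):
    """Unpack one hypothetical 32-bit address-event word."""
    polarity = word & 0x1                # bit 0: ON/OFF polarity
    x = (word >> 1) & 0x1FF              # bits 1-9: column address
    y = (word >> 10) & 0x1FF             # bits 10-18: row address
    t = (word >> 19) & 0x1FFF            # bits 19-31: coarse timestamp
    return t, x, y, polarity


def count_on_events_in_region(words, x0, x1, y0, y1):
    """Toy example of on-device processing: count ON events inside a region of interest."""
    count = 0
    for w in words:
        _, x, y, p = decode_event(w)
        if p == 1 and x0 <= x < x1 and y0 <= y < y1:
            count += 1
    return count
```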