Jörg Conradt

Principal Investigator


EECS, CST

KTH Royal Institute of Technology, Sweden

Lindstedtsvägen 5
114 28 Stockholm, Sweden



NCS Lab Research Resources


The panoramic picture above gives an impression of the NCS lab facilities, with the robot arena (and the student relaxation area) in the center/right. In the left hallway, we offer temporary working desks for thesis and project students, a small mechanical workshop including a laser cutter and 3D printers (hidden), and a larger electronics (SMT) assembly and measurement workplace (middle, hidden).

More details about the lab infrastructure and the available basic lab equipment can be found here.

Below is a Selection of Research Equipment for Projects

Event-based Cameras
We have a large collection of event cameras, from early prototypes to recent HD-resolution models (1280x780 pixels). Most cameras are connected to a PC for data collection, streaming, or live processing in computing hardware. Specific expertise in the lab includes developing embedded prototypes of event cameras with microcontroller or neuromorphic on-board processing.
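Each event encodes a pixel position, a timestamp, and a polarity (brightness increase or decrease); a common first processing step is to accumulate a short slice of the event stream into a frame. The sketch below does this in plain NumPy; the tuple layout and resolution are illustrative assumptions, not the format of any specific camera SDK.

```python
import numpy as np

def events_to_frame(events, width=1280, height=780):
    """Accumulate (t, x, y, polarity) events into a 2D count image.

    ON events (polarity +1) increment a pixel, OFF events (polarity -1)
    decrement it, yielding a simple signed event-count frame.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for t, x, y, p in events:
        frame[y, x] += 1 if p > 0 else -1
    return frame

# Tiny synthetic event stream: two ON events at one pixel, one OFF event at another.
events = [(0.001, 10, 5, +1), (0.002, 10, 5, +1), (0.003, 20, 8, -1)]
frame = events_to_frame(events, width=32, height=16)
print(frame[5, 10], frame[8, 20])  # 2 -1
```

In practice the event tuples would come from the camera driver's output stream rather than a hand-written list, and the accumulation window would be chosen to match the experiment's latency budget.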
Omnidirectional Research Robot Durin
We have designed and built this omnidirectional mobile robot platform for student experiments. The robot measures about 16x16 cm at the base and can be remotely controlled from a PC over a wireless network, or equipped with on-board sensors and computers (as shown on the right) for low-latency closed-loop sensor-perception-action experiments. We have about 10 such robots in the lab.
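To illustrate how an omnidirectional base is driven, the sketch below maps a desired body velocity to individual wheel speeds. It assumes, purely for illustration, a four-mecanum-wheel layout with placeholder wheel radius and footprint dimensions; this is not Durin's actual geometry or control interface.

```python
import numpy as np

def mecanum_wheel_speeds(vx, vy, omega, r=0.03, lx=0.08, ly=0.08):
    """Inverse kinematics for a hypothetical four-mecanum-wheel base.

    Maps a body twist (vx forward in m/s, vy left in m/s, omega in rad/s)
    to wheel angular velocities [front-left, front-right, rear-left,
    rear-right] in rad/s, for wheel radius r and half-footprint lx, ly.
    """
    k = lx + ly
    return np.array([
        (vx - vy - k * omega) / r,  # front-left
        (vx + vy + k * omega) / r,  # front-right
        (vx + vy - k * omega) / r,  # rear-left
        (vx - vy + k * omega) / r,  # rear-right
    ])

# Pure forward motion: all four wheels spin at the same speed.
print(mecanum_wheel_speeds(0.3, 0.0, 0.0))
# Pure rotation: left and right wheels spin in opposite directions.
print(mecanum_wheel_speeds(0.0, 0.0, 1.0))
```

On a real robot these wheel speeds would be sent to the motor controllers over the wireless link or the on-board bus; here the function only computes them.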
Franka Emika Panda Robot Co-Working Arm
We use the robot arm for reaching-and-grasping co-working experiments. Event cameras observe the workspace and can be mounted on the robot arm for particular experiments. We explore low-latency sensory perception in close collaboration between human and robot.
Robot Dog (Quadruped) Fenrir
Our walking robot (a Unitree A1) is a mobile platform with on-board control that sees the world through event-camera eyes. We plan to run high-speed, low-latency experiments that react to the robot's environment in real time, e.g. detecting and approaching stairs.

Neuromorphic Programmable SpiNNaker System
The 24-board SpiNNaker computing machine executes large, densely connected spiking neural networks in real time (beyond what modern GPU cards can do).
In addition to large-scale simulation hardware, we also have neuromorphic processors in the lab, such as SynSense's Speck or BrainChip's Akida.
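The networks these machines execute are built from spiking neuron models such as the leaky integrate-and-fire (LIF) neuron. The sketch below simulates a single LIF neuron in plain NumPy to illustrate the model; it does not use SpiNNaker's actual programming interface, and all parameter values are arbitrary.

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=10.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron (Euler integration).

    Membrane dynamics: tau * dv/dt = -(v - v_rest) + I(t).
    When v crosses v_thresh the neuron spikes and resets to v_reset.
    Returns (membrane-potential trace, list of spike-time indices).
    """
    v = v_rest
    trace, spikes = [], []
    for step, current in enumerate(input_current):
        v += dt / tau * (-(v - v_rest) + current)
        if v >= v_thresh:
            spikes.append(step)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

# A constant super-threshold input makes the neuron fire periodically.
trace, spikes = simulate_lif(np.full(100, 1.5))
print(len(spikes))  # 9 regularly spaced spikes over 100 steps
```

A spiking network connects many such neurons with weighted synapses; hardware like SpiNNaker runs millions of them concurrently in real time.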
GPU-equipped Machines (RTX 4090 and RTX 3090)
Naturally, the lab also hosts a number of high-performance workstations, equipped with RTX 4090 and RTX 3090 GPUs and ample PC performance, for training and real-time inference of spiking neural networks.
Augmented Reality Glasses
We have a collection of XREAL augmented reality glasses (e.g. the XREAL Air 2 Ultra). We combine low-latency event-based vision and AR glasses to create immersive experiences within and outside the lab.
Anything else?
Are we missing a basic research or development tool that would enable exciting projects? Let us know!