Robotics and Perception Group

[Image slideshow: Throw and Go, Tango, Matia, SVO, Window Flight, RAL16_Giusti, DAVIS, CEBIT, Air Ground, car, Lab Retreat, Scientifica 2015, Kuka Innovation Award]

Welcome to the website of the Robotics and Perception Group, led by Prof. Dr. Davide Scaramuzza. Our lab was founded in February 2012 and is part of the Department of Informatics at the University of Zurich. Our mission is to develop autonomous machines that navigate entirely on their own using only onboard cameras, without relying on external infrastructure such as GPS or motion-capture systems. Our interests encompass ground robots, micro flying robots, and heterogeneous multi-robot systems that combine the two. We want our machines to be active rather than passive: they should react to and navigate within their environment so as to gain the best knowledge of it.

Follow us on Google+, Google Scholar, GitHub, and YouTube:

News

RPG open-sources package for event-based feature tracking analysis

Try out our open-source package for event-based feature tracking analysis! ECCV'18 paper.

RPG ranks 2nd at the IROS 2017 Autonomous Drone Race

Watch our performance at the IROS 2017 Autonomous Drone Race, where we ranked 2nd!

Quadrotor Flight with an Event Camera

Watch the first-ever autonomous quadrotor flight with an event camera, using our UltimateSLAM: RAL'18 paper.

Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization

Check out our latest work on event-based visual-inertial odometry in real-time: BMVC'17 paper.

Active Exposure Control for Robust Visual Odometry in High Dynamic Range Environments

Check out our latest work on active exposure control for robust visual odometry in high dynamic range environments: ICRA'17 paper.

EVO: A Geometric Approach to Event-based 6-DOF Parallel Tracking and Mapping in Real-time

Check out EVO, our latest work on parallel tracking and mapping with an event camera: RA-L'16 paper.

Accurate Angular Velocity Estimation with an Event Camera

Check out our latest work on rotational motion estimation with an event camera: RAL'16 paper.

Agile Drone Flight through Narrow Gaps with Onboard Sensing and Computing

Check out our latest work on agile quadrotor flight through narrow gaps with onboard sensing and computing. More info here.

EMVS: Event-based Multi-View Stereo

Check out our latest work on Event-based Multi-View Stereo, which uses a single, continuously moving event camera for accurate 3D reconstruction! BMVC'16 paper.

"On-the-spot" Terrain Classifier Training

Our latest work on search and rescue robotics is a system for training a terrain classifier "on-the-spot" in only 60 seconds. Our flying robot can then use this classifier to guide a ground robot through a disaster area. Details are in our ISER'16 paper.
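To give a rough feel for what "on-the-spot" training means in practice, here is a minimal, purely illustrative sketch: a few labeled image patches, toy color statistics as features, and a tiny nearest-centroid classifier. None of this reproduces the ISER'16 pipeline; the feature choice, class names, and synthetic data are all made up for the example.

```python
import numpy as np

def patch_features(patch):
    """Toy features for an image patch: per-channel mean and std."""
    return np.concatenate([patch.mean(axis=(0, 1)), patch.std(axis=(0, 1))])

class NearestCentroidTerrainClassifier:
    """Tiny classifier: one centroid per terrain class in feature space."""

    def fit(self, patches, labels):
        feats = np.array([patch_features(p) for p in patches])
        self.classes_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[l == c for l in labels]].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, patches):
        feats = np.array([patch_features(p) for p in patches])
        # Distance from every patch to every class centroid.
        d = np.linalg.norm(feats[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.classes_[i] for i in d.argmin(axis=1)]

# Synthetic demo: bright "path" patches vs. dark "grass" patches.
rng = np.random.default_rng(0)
path = [rng.normal(0.8, 0.05, (16, 16, 3)) for _ in range(10)]
grass = [rng.normal(0.2, 0.05, (16, 16, 3)) for _ in range(10)]
clf = NearestCentroidTerrainClassifier().fit(
    path + grass, ["path"] * 10 + ["grass"] * 10)
print(clf.predict([np.full((16, 16, 3), 0.8)]))
```

Because fitting is just a handful of feature averages, "training" on fresh patches takes milliseconds, which is the spirit of adapting a classifier in the field within seconds.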

Event-based, 6-DOF Camera Tracking for High-Speed Applications

We designed an event-based 6-DOF pose tracking pipeline with a latency of 1 microsecond, using the DVS sensor, for very high-speed (>500 deg/s) and high-dynamic-range (>130 dB) applications, where all standard cameras fail. All the details are in our arXiv paper.

Low-Latency Visual Odometry using Event-based Feature Tracks

We designed an event-based 6-DOF visual odometry pipeline with a latency of 1 microsecond, using the DAVIS sensor. All the details are in our IROS'16 paper and EBCCSP'16 paper.

Quadcopter Navigation in the Forest using Deep Neural Networks

We used deep neural networks to teach our drones to recognize and follow forest trails to search for missing people. Journal paper. More info.

Information Gain Metrics for Active 3D Object Reconstruction

Our active volumetric reconstruction software framework is now released open source. More details in our ICRA'16 paper.

How to Launch a Quadrotor

Our latest work on failure recovery from aggressive flight and how to launch a quadrotor by throwing it in the air! ICRA'15 paper.

Autonomous Quadrotor Landing using Continuous On-Board Monocular-Vision-based Elevation Mapping

Our latest work on autonomous landing-site detection and landing with onboard monocular vision! ICRA'15 paper.

Three-Year Anniversary of the Robotics and Perception Group!

To celebrate our lab's 3-year anniversary, we summarize in this clip our main achievements, projects, awards, exhibitions, and upcoming videos!

Aerial-guided Navigation of a Ground Robot among Movable Obstacles

Our latest work on Aerial-guided Navigation of a Ground Robot among Movable Obstacles. More details in our SSRR'14 paper.

Appearance-based Active, Monocular, Dense Reconstruction for Micro Aerial Vehicles

Our latest work on Appearance-based Active, Monocular, Dense Reconstruction for Micro Aerial Vehicles. More details in our RSS'14 paper.

Event-based, 6-DOF Pose Tracking for High-Speed Maneuvers using a Dynamic Vision Sensor

Our latest work on event-based vision: 6-DOF Pose Tracking for High-Speed Maneuvers. More details in our IROS'14 paper.

Quadrotor Demos - Live Dense 3D Reconstruction and Collaborative Grasping

Our quadrotor demo trailer: autonomous navigation, live dense 3D reconstruction, and collaborative grasping.

Collaboration between Ground and Flying Robots for Search-and-Rescue Missions

Our demo at the KUKA Innovation Award that shows the collaboration of flying and ground robots for search-and-rescue missions.

Autonomous Vision-based Flight over a Disaster Zone

Autonomous Vision-based Flight over a Disaster Zone using SVO (more details).

SVO: Fast Semi-Direct Monocular Visual Odometry

SVO - our new visual odometry pipeline for MAV state estimation. More details in our ICRA'14 paper.

REMODE: Probabilistic, Monocular Dense Reconstruction in Real Time

Our latest work on probabilistic, monocular dense reconstruction in real time. More details in our ICRA'14 paper.

A Monocular Pose Estimation System based on Infrared LEDs

Our monocular pose estimation system, now released as open source. More details in our ICRA'14 paper.

Torque Control of a KUKA youBot Arm

Torque Control of a KUKA youBot Arm (Master thesis of Benjamin Keiser)

Robotics and Perception Group on arte X:enius

RPG was featured on the German-French TV channel ARTE in their science programme X:enius. The French version is available here.

Air-Ground Localization and Map Augmentation Using Monocular Dense Reconstruction

MAV Urban Localization from Google Street View Data

Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles

Watch the video for our new IROS'13 paper "Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles".

Autonomous Flying Robots: Davide Scaramuzza at TEDxZurich

Autonomous Vision-Controlled Micro Flying Robots: Davide Scaramuzza at TEDxZurich.