
Robotics and Perception Group


Welcome to the website of the Robotics and Perception Group, led by Prof. Dr. Davide Scaramuzza. Our lab was founded in February 2012 and is part of the Department of Informatics at the University of Zurich. Our mission is to develop autonomous machines that navigate entirely on their own using only onboard cameras, without relying on external infrastructure such as GPS or motion-capture systems. Our interests encompass both ground robots and micro flying robots, as well as heterogeneous multi-robot systems that combine the two. We do not want our machines to be passive but active: they should react to and navigate within their environment so as to gain the best knowledge from it.

Follow us on Google+, Google Scholar, GitHub, and YouTube:

News

  • Code Release - EMVS: Event-based Multi-View Stereo

    We have released the code of EMVS, our Event-based Multi-View Stereo pipeline. Check out the BMVC'16 paper.

  • CoRL 2018 Best System Paper Award

    Our paper Deep Drone Racing: Learning Agile Flight in Dynamic Environments won the Best Systems Paper Award at the Conference on Robotic Learning (CoRL) 2018.

  • RPG won the IROS 2018 Autonomous Drone Race

    We are proud to announce that our team won the IROS Autonomous Drone Race Competition, passing all 8 gates in just 30 seconds! To succeed, we combined deep networks, local VIO, Kalman filtering, and optimal control. Watch our performance here.

  • Oculus Quest is out

    Mark Zuckerberg just announced the new Oculus VR headset, called Oculus Quest. This is what our former lab startup, Zurich Eye (now Oculus Zurich), has been working on for the past two years. Watch the video.

  • RPG live demo at the Lange Nacht der Zürcher Museen

    We performed a live quadrotor demo at the Kunsthalle Zürich during the Lange Nacht der Zürcher Museen, as part of the 100 Ways of Thinking show, in front of more than 200 people. Check out the media coverage here.

  • RPG featured in the NZZ

    Our research was featured in the Neue Zürcher Zeitung. Check out the article here.

  • Huge media coverage for search and rescue demonstration

    Our lab received great Swiss media attention (NZZ, SwissInfo, SRF) for our live flight demonstration of a quadrotor entering a collapsed building to simulate a search and rescue operation. Check out the video here.

  • RPG research featured on NewScientist

    Our research on autonomous drone racing was featured on NewScientist. Check out the article here.

  • Paper accepted in RA-L 2018

    Our paper on safe quadrotor navigation by computing forward reachable sets was accepted for publication in the IEEE Robotics and Automation Letters (RA-L) 2018. Check out the PDF.

  • Paper accepted at RSS 2018

    Our paper on drone racing was accepted to RSS 2018 in Pittsburgh! Check out the long version, the short version, and the video!

  • Paper accepted in IEEE TRO!

    Our paper on Continuous-Time Visual-Inertial Odometry for Event Cameras has been accepted for publication in the IEEE Transactions on Robotics. Check out the paper.

  • New Postdoc

    We welcome Dr. Dario Brescianini as a new postdoc in our lab!

     

  • RPG receives 2017 IEEE Transactions on Robotics (TRO) best paper award

    Our paper on IMU pre-integration received the 2017 IEEE Transactions on Robotics (TRO) best paper award at ICRA 2018 in Brisbane, Australia. Check out the paper here!

  • IEEE TRO Best Paper Award

    We are proud to announce that our paper on IMU pre-integration will receive the 2017 IEEE Transactions on Robotics (TRO) best paper award. On this occasion, IEEE made the article open access for the next ten years!

    C. Forster, L. Carlone, F. Dellaert, D. Scaramuzza
    On-Manifold Preintegration for Real-Time Visual-Inertial Odometry
    IEEE Transactions on Robotics, vol. 33, no. 1, pp. 1-21, Feb. 2017.
    PDF DOI YouTube

  • Qualcomm Innovation Fellowship

    Henri Rebecq, a PhD student in our lab, won a Qualcomm Innovation Fellowship with his proposal "Learning Representations for Low-Latency Perception with Frame and Event-based Cameras"!

  • Release of NetVLAD in Python/TensorFlow

    We are happy to announce a Python/TensorFlow port of the full NetVLAD network, approved by the original authors and available here (see also our software/datasets page). The repository contains code that allows plug-and-play Python deployment of the best off-the-shelf model made available by the authors. We have thoroughly verified that the ported model produces output similar to the original Matlab implementation, as well as excellent place-recognition performance on KITTI 00.

  • Release of Data-Efficient Decentralized Visual SLAM

    We provide the code accompanying our recent Decentralized Visual SLAM paper. The code contains a C++/Matlab simulation with all the building blocks of a state-of-the-art decentralized visual SLAM system. Check out the paper, the video pitch, the presentation, and the code.

  • Release of the Fast Event-based Corner Detector

    We provide the code of our FAST event-based corner detector. Our implementation can process millions of events per second on a single core (less than a microsecond per event) and reduces the event rate by a factor of 10 to 20. Check out our paper, video, and code.

  • Release of the RPG Quadrotor Control Framework

    We provide a complete framework for flying quadrotors based on control algorithms developed by the Robotics and Perception Group. We also provide an interface to the RotorS Gazebo plugins to use our algorithms in simulation. Check out our software page for more details.

  • Davide Scaramuzza gives an invited seminar at Princeton University
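
The throughput figures quoted in the corner-detector release above are easy to sanity-check. This is only a back-of-the-envelope sketch; the inputs are the rates stated in the news item (millions of events per second, and a 10x to 20x event-rate reduction), taking 1 million events per second as a lower bound:

```python
# Back-of-the-envelope check of the corner-detector throughput figures.
# Assumption: 1 million events/s as the lower bound of "millions per second".
events_per_second = 1_000_000

# Time budget per event, in microseconds: at 1e6 events/s it is exactly 1 us,
# so any higher event rate leaves less than a microsecond per event.
budget_us = 1e6 / events_per_second

# A 10x-20x event-rate reduction leaves 50k-100k events/s for downstream stages.
reduced_rates = [events_per_second // factor for factor in (10, 20)]

print(budget_us)      # 1.0
print(reduced_rates)  # [100000, 50000]
```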

Further information

RPG open-sources package for event-based feature tracking analysis

Try out our open-source package for event-based feature tracking analysis! ECCV'18 paper

RPG ranks 2nd at the IROS 2017 Autonomous Drone Race

Watch our performance at the IROS 2017 Autonomous Drone Race, where we ranked 2nd!

Quadrotor Flight with an Event Camera

Watch the first ever autonomous quadrotor flight with an event camera using our UltimateSLAM. RAL'18 paper.

Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization

Check out our latest work on event-based visual-inertial odometry in real-time: BMVC'17 paper.

Active Exposure Control for Robust Visual Odometry in High Dynamic Range Environments

Check out our latest work on active exposure control for robust visual odometry in high dynamic range environments: ICRA'17 paper

EVO: A Geometric Approach to Event-based 6-DOF Parallel Tracking and Mapping in Real-time

Check out EVO, our latest work on parallel tracking and mapping with an event camera: RA-L'16 paper.

Accurate Angular Velocity Estimation with an Event Camera

Check out our latest work on rotational motion estimation with an event camera: RAL'16 paper.

Agile Drone Flight through Narrow Gaps with Onboard Sensing and Computing

Check out our latest work on agile quadrotor flight through narrow gaps with onboard sensing and computing: More info here.

EMVS: Event-based Multi-View Stereo

Check out our latest work on Event-based Multi-View Stereo, which uses a single, continuously moving event camera for accurate 3D reconstruction! BMVC'16 paper.

"On-the-spot" Terrain Classifier Training

Our latest work on search and rescue robotics is a system for training a terrain classifier "on-the-spot" in only 60 seconds. Our flying robot can then use this classifier to guide a ground robot through a disaster area. Details are in our ISER'16 paper.

Event-based, 6-DOF Camera Tracking for High-Speed Applications

We designed an event-based 6-DOF pose tracking pipeline with a latency of 1 microsecond, using the DVS sensor, for very high-speed (>500 deg/s) and high-dynamic-range (>130 dB) applications where all standard cameras fail. All the details are in our arXiv paper.

Low-Latency Visual Odometry using Event-based Feature Tracks

We designed an event-based 6-DOF visual odometry pipeline with a latency of 1 microsecond using the DAVIS sensor. All the details are in our IROS'16 paper and EBCCSP'16 paper.

Quadcopter Navigation in the Forest using Deep Neural Networks

We used Deep Neural Networks to teach our drones to recognize and follow forest trails to search for missing people. Journal Paper. More info.

Information Gain Metrics for Active 3D Object Reconstruction

Our active volumetric reconstruction software framework is now released as open source. More details in our ICRA'16 paper.

How to Launch a Quadrotor

Our latest work on failure recovery from aggressive flight and how to launch a quadrotor by throwing it in the air! ICRA'15 paper.

Autonomous Quadrotor Landing using Continuous On-Board Monocular-Vision-based Elevation Mapping

Our latest work on autonomous landing-site detection and landing with onboard monocular vision! ICRA'15 paper.

Three-Year Anniversary of the Robotics and Perception Group!

To celebrate our lab's 3-year anniversary, we summarize in this clip our main achievements, projects, awards, exhibitions, and upcoming videos!

Aerial-guided Navigation of a Ground Robot among Movable Obstacles

Our latest work on Aerial-guided Navigation of a Ground Robot among Movable Obstacles. More details in our SSRR'14 paper.

Appearance-based Active, Monocular, Dense Reconstruction for Micro Aerial Vehicles

Our latest work on Appearance-based Active, Monocular, Dense Reconstruction for Micro Aerial Vehicles. More details in our RSS'14 paper.

Event-based, 6-DOF Pose Tracking for High-Speed Maneuvers using a Dynamic Vision Sensor

Our latest work on event-based vision: 6-DOF Pose Tracking for High-Speed Maneuvers. More details in our IROS'14 paper.

Quadrotor Demos - Live dense 3D reconstruction and collaborative grasping

Our quadrotor demo trailer: autonomous navigation, live dense 3D reconstruction, and collaborative grasping.

Collaboration between Ground and Flying Robots for Search-and-Rescue Missions

Our demo at the KUKA Innovation Award that shows the collaboration of flying and ground robots for search-and-rescue missions.

Autonomous Vision-based Flight over a Disaster Zone

Autonomous Vision-based Flight over a Disaster Zone using SVO (more details).

SVO: Fast Semi-Direct Monocular Visual Odometry

SVO - our new visual odometry pipeline for MAV state estimation. More details in our ICRA'14 paper.

REMODE: Probabilistic, Monocular Dense Reconstruction in Real Time

Our latest work on probabilistic, monocular dense reconstruction in real time. More details in our ICRA'14 paper.

A Monocular Pose Estimation System based on Infrared LEDs

Our monocular pose estimation system, released as open source. More details in our ICRA'14 paper.

Torque Control of a KUKA youBot Arm

Torque Control of a KUKA youBot Arm (Master thesis of Benjamin Keiser)

Robotics and Perception Group on ARTE X:enius

RPG was featured on the German-French TV channel ARTE in their science programme X:enius. The French version is available here.

Air-Ground Localization and Map Augmentation Using Monocular Dense Reconstruction

MAV Urban Localization from Google Street View Data

Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles

Watch the video for our new IROS'13 paper "Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles".

Autonomous Flying Robots: Davide Scaramuzza at TEDxZurich

Autonomous Vision-Controlled Micro Flying Robots: Davide Scaramuzza at TEDxZurich.