Elias Mueggler


MSc ETH Zürich

Robotics and Perception Group

Department of Informatics

University of Zurich

Email: mueggler (at) ifi (dot) uzh (dot) ch

Office: Andreasstrasse 15, AND 2.16

LinkedIn

Google Scholar


I am a PhD student at the Robotics and Perception Group led by Prof. Davide Scaramuzza. Currently, I am working on event-based vision for high-speed robotics and on air-ground robot collaboration. I received my Bachelor's degree (2010) and Master's degree (2012) in Mechanical Engineering from ETH Zurich, where I focused on robotics, dynamics, and computer vision. I wrote my Master thesis at MIT under the supervision of Prof. John J. Leonard on visual SLAM for space applications.


Awards

  • NCCR Robotics International PhD exchange programme 2016
  • Qualcomm Innovation Fellowship 2016 (40'000 USD) Website
  • KUKA Innovation Award 2014 (20'000 EUR) Video of Presentation at Automatica · Video from Submitted Paper
  • Convergent Science Network of Biomimetics and Neurotechnology CapoCaccia Fellowship 2014
  • Hans und Wilma Stutz Foundation Scholarship 2012

Research Interests

Event-based Robot Vision

Unlike a standard CMOS camera, a DVS does not wastefully send full image frames at a fixed frame rate. Instead, similar to the human eye, it transmits only pixel-level brightness changes at the time they occur, with microsecond resolution. This offers the possibility to build a perception pipeline whose latency is negligible compared to the dynamics of the robot. We exploit these characteristics to estimate the pose of a quadrotor with respect to a known pattern during high-speed maneuvers, such as flips with rotational speeds of up to 1,200 degrees per second. We presented this work at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in 2014, and it was also featured on IEEE Spectrum.
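Conceptually, the output of a DVS can be thought of as an asynchronous stream of timestamped tuples rather than frames. A minimal sketch of this representation (the field names and helper below are illustrative only, not the actual DVS driver API):

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One asynchronous DVS event: a single pixel's brightness change."""
    t: float        # timestamp in seconds (microsecond resolution)
    x: int          # pixel column
    y: int          # pixel row
    polarity: int   # +1 for brightness increase, -1 for decrease

def events_in_window(events, t0, t1):
    """Collect events with timestamps in [t0, t1).

    Event-based pipelines typically process events one by one or in
    short time windows, instead of waiting for a full frame.
    """
    return [e for e in events if t0 <= e.t < t1]
```

Because each event carries its own timestamp, the window boundaries can be chosen freely and need not align with any fixed frame rate.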





E. Mueggler, G. Gallego, H. Rebecq, D. Scaramuzza

Continuous-Time Visual-Inertial Trajectory Estimation with Event Cameras

(Under review)


DAVIS Dataset

E. Mueggler, H. Rebecq, G. Gallego, T. Delbruck, D. Scaramuzza

The Event-Camera Dataset and Simulator: Event-based Data for Pose Estimation, Visual Odometry, and SLAM

International Journal of Robotics Research, Vol. 36, Issue 2, pages 142-149, Feb. 2017.

PDF (arXiv) YouTube Dataset



G. Gallego, J. E. A. Lund, E. Mueggler, H. Rebecq, T. Delbruck, D. Scaramuzza

Event-based, 6-DOF Camera Tracking for High-Speed Applications

(Under review)

PDF (arXiv) (PDF, 3979 KB) YouTube

Event-based Visual Odometry

B. Kueng, E. Mueggler, G. Gallego, D. Scaramuzza

Low-Latency Visual Odometry using Event-based Feature Tracks

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, 2016.

Best Application Paper Award Finalist! Highlight Talk (acceptance rate: 2.5%)

PDF (PDF, 1623 KB) YouTube


D. Tedaldi, G. Gallego, E. Mueggler, D. Scaramuzza

Feature Detection and Tracking with the Dynamic and Active-pixel Vision Sensor (DAVIS)

International Conference on Event-Based Control, Communication and Signal Processing (EBCCSP), Krakow, 2016.

PDF (PDF, 1241 KB)


E. Mueggler, N. Baumli, F. Fontana, D. Scaramuzza

Towards Evasive Maneuvers with Quadrotors using Dynamic Vision Sensors

European Conference on Mobile Robots (ECMR), Lincoln, 2015.

PDF (PDF, 665 KB)

ISCAS Delbruck

T. Delbruck, M. Pfeiffer, R. Juston, G. Orchard, E. Müggler, A. Linares-Barranco, M. W. Tilden

Human vs. computer slot car racing using an event and frame-based DAVIS vision sensor

IEEE International Symposium on Circuits and Systems (ISCAS), Lisbon, 2015.



E. Mueggler, G. Gallego, D. Scaramuzza

Continuous-Time Trajectory Estimation for Event-based Vision Sensors

Robotics: Science and Systems (RSS), Rome, 2015.

PDF (PDF, 1806 KB)


E. Mueggler, C. Forster, N. Baumli, G. Gallego, D. Scaramuzza

Lifetime Estimation of Events from Dynamic Vision Sensors

IEEE International Conference on Robotics and Automation (ICRA), Seattle, 2015.

PDF (PDF, 726 KB)


Air-Ground Collaboration

We develop strategies for aerial and ground robots to work together as a team, so that the robots can profit from each other's capabilities. Our demonstration won the KUKA Innovation Award 2014 and was presented in a paper at the IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR) in 2014.





KUKA Award


Happy Easter


We released this video at Easter 2013. A quadrotor flies above a ground robot, looking for Easter eggs that lie on the ground. It then tells the ground robot the exact positions of the eggs so that all of them can be collected. This video was accepted for the video session at the International Joint Conference on Artificial Intelligence (IJCAI) 2013 in Beijing, China.



R. Kaeslin, P. Fankhauser, E. Stumm, Z. Taylor, E. Mueggler, J. Delmerico, D. Scaramuzza, R. Siegwart, M. Hutter

Collaborative Localization of Aerial and Ground Robots through Elevation Maps

IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Lausanne, 2016.

PDF (PDF, 3950 KB)


J. Delmerico, A. Giusti, E. Mueggler, L.M. Gambardella, D. Scaramuzza

"On-the-spot Training" for Terrain Classification in Autonomous Air-Ground Collaborative Teams

International Symposium on Experimental Robotics (ISER), Tokyo, 2016.

PDF (PDF, 4043 KB) YouTube


M. Faessler, F. Fontana, C. Forster, E. Mueggler, M. Pizzoli, D. Scaramuzza

Autonomous, Vision-based Flight and Live Dense 3D Mapping with a Quadrotor Micro Aerial Vehicle

Journal of Field Robotics, 2015.

PDF (PDF, 2438 KB) YouTube 1 YouTube 2 YouTube 3 YouTube 4 Software


E. Mueggler, M. Faessler, F. Fontana, D. Scaramuzza

Aerial-guided Navigation of a Ground Robot among Movable Obstacles

IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Toyako-cho, 2014.

PDF (PDF, 791 KB) YouTube Presentation at AUTOMATICA


M. Faessler, E. Mueggler, K. Schwabe, D. Scaramuzza

A Monocular Pose Estimation System based on Infrared LEDs

IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, 2014.

PDF (PDF, 1741 KB) YouTube


Supervised Student Projects

  • Timo Horstschaefer (Master Thesis - 2016). Winner of the Fritz Kutter Award 2016!
    Parallel Tracking, Depth Estimation, and Image Reconstruction with an Event Camera.
  • Jonathan Huber (Semester Thesis - 2016).
    Ground Robot Localization in Aerial 3D Maps.
  • Julia Nitsch (NCCR Internship - 2016).
    Terrain Classification in Search-and-Rescue Scenarios.
  • Beat Kueng (Master Thesis - 2016).
    Visual Odometry pipeline for the DAVIS camera. Paper (PDF, 1623 KB) Video
  • Mathis Kappeler (Master Project - 2016).
    Exposure Control for Robust Visual Odometry.
  • Imanol Studer (Master Project - 2015).
    Head Pose Tracking with Quadrotors.
  • Jon Lund (Master Thesis - 2015).
    Towards SLAM for Dynamic Vision Sensors.
  • Micha Brunner (Semester Thesis - 2015).
    Flying Motion Capture System.
  • Igor Bozic (Master Project - 2015).
    High-Frequency Position Control of the KUKA youBot Arm.
  • Joachim Ott (Semester Thesis - 2015).
    Vision-Based Surface Classification for Micro Aerial Vehicles.
  • David Tedaldi (Semester Thesis - 2015).
    Feature Tracking based on Frames and Events. Paper (PDF, 1241 KB)
  • Nathan Baumli (Master Thesis - 2015).
    Towards Evasive Maneuvers for Quadrotors using Stereo Dynamic Vision. Paper (PDF, 665 KB)
  • Amos Zweig (Semester Thesis - 2014).
    Event-based Depth Estimation.
  • Nathan Baumli (Semester Thesis - 2014).
    Event-Based Full-Frame Visualization. Paper (PDF, 726 KB)
  • Basil Huber (Master Thesis - 2014). Winner of the Fritz Kutter Award 2014!
    High-Speed Pose Estimation using a Dynamic Vision Sensor. Paper (PDF, 926 KB) Video
  • Karl Schwabe (Master Thesis - 2013).
    A Monocular Pose Estimation System based on Infrared LEDs. Paper (PDF, 1741 KB) Video Code
  • Benjamin Keiser (Master Thesis - 2013). Winner of the 2013 KUKA Best Student Project!
    Torque Control of a KUKA youBot Arm. Thesis (PDF, 4092 KB) Video Code

Master Thesis: Visual Mapping of Unknown Space Targets for Relative Navigation and Inspection


During my Master thesis at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT, under the supervision of Professor John Leonard, I implemented a visual mapping algorithm that creates a 3D model of an unknown and uncooperative space target, e.g., a satellite, using a stereo camera. The code was tested aboard the International Space Station (ISS). The algorithm will later be used for relative navigation, inspection, and docking maneuvers in space.

Publication: B. Tweddle, E. Müggler, A. Saenz-Otero, D. Miller. The SPHERES VERTIGO goggles: vision based mapping and localization onboard the International Space Station. International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS), Turin, 2012. PDF YouTube

Semester Thesis: Robotic calligraphy - A robot that learns how to write Chinese calligraphy

Robotic calligraphy

I wrote my Semester thesis on robotic calligraphy at the Institute for Dynamic Systems and Control at ETH Zurich. During this project, I implemented a trajectory generator that enabled the robot to draw Chinese characters, used computer vision algorithms to compare the drawn characters with a reference from a textbook, and applied an iterative learning controller to improve the robot's next drawing.
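The learning step can be illustrated schematically. A minimal sketch of a first-order iterative learning control (ILC) update, where the tracking error from one drawing corrects the input trajectory for the next (the gain and signal names are illustrative, not the controller actually used in the project):

```python
import numpy as np

def ilc_update(u, reference, measured, gain=0.5):
    """First-order ILC update: u_{k+1}(t) = u_k(t) + gain * e_k(t).

    u         -- input trajectory applied on iteration k (array-like)
    reference -- desired output trajectory (e.g. the textbook stroke)
    measured  -- output actually produced on iteration k (e.g. the drawn stroke)
    gain      -- learning gain; small enough to keep the iteration stable
    """
    error = np.asarray(reference, dtype=float) - np.asarray(measured, dtype=float)
    return np.asarray(u, dtype=float) + gain * error
```

Each drawing iteration thus reduces the repeatable part of the tracking error, which is why the robot's strokes improve from one attempt to the next.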

Publication: N. Huebel, E. Mueggler, M. Waibel, R. D'Andrea. Towards Robotic Calligraphy. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vilamoura, 2012. PDF YouTube