Department of Informatics Artificial Intelligence Lab (ARCHIVE)

Development of a new humanoid robot in only 9 months.

ROBOY will be developed in record time: only 9 months. The renowned Artificial Intelligence Lab of the University of Zurich will host the robotics fair ROBOTS ON TOUR in Zurich on March 8, 2013, for its 25th anniversary (www.robotsontour.com). Joining research and the private sector, a team of specialists set itself an ambitious goal in June 2012: to advance the ECCE technologies and present a novel, unique humanoid robot in March 2013: ROBOY.

Financing through crowdfunding

To markedly speed up decisions and cut down on development time, ROBOY will be financed with private funds and constructed by private companies. This means that ROBOY will be financed by exclusive sponsors and through crowdfunding. How far we will be able to develop ROBOY will depend on your support. In return for a contribution, every supporter will have their name or logo engraved on ROBOY and will receive another token of appreciation.

Open Source Development

Development of ROBOY will be conducted as open source. This means that all expertise, ideas, and inventions will not belong exclusively to a specific partner, and everyone will have the right to advance ROBOY's technologies. ROBOY will be an ideal research platform for pioneering robotics research and will be developed further at the AI Lab and other leading research institutions after ROBOTS ON TOUR.

Learn more about ROBOY at www.roboy.org


ECCEROBOT-2, also referred to as EDS (Embodied Design Study) or simply "Max" because of the numerous maxon motors employed, was publicly presented at the Hannover Messe 2010.

Photographer credit: Patrick Knab

Read more about "Max" here: www.eccerobot.eu

Credits go to:

University of Sussex, Artificial Intelligence Lab - University of Zurich, Elektrotehnicki Fakultet - Univerzitet u Beogradu, The Robot Studio

Funding:

The project is funded by the EU's 7th Framework Programme, under the ICT Challenge 'Cognitive Systems and Robotics'.


ECCEROBOT-1 (Embodied Cognition in a Compliantly Engineered Robot) is a project funded by the EU's 7th Framework Programme (ICT Challenge 2, "Cognitive Systems, Interaction, Robotics") with the goal of building and controlling the first anthropomimetic robot and, finally, investigating its human-like cognitive features.

Learn more about ECCEROBOT 1 at www.eccerobot.org/


We have built a miniature, low-cost version of the RHex robot, originally created by a group of universities under a large DARPA program in the US.

Our mini-RHex is 26 x 14 cm in size and weighs 530 grams. The whole robot costs USD 190 to produce. To increase speed, each leg has three evenly spaced pedals. We have also added a passive spine joint to the body of the robot, between the middle and hind legs. We will shortly conduct an experiment to evaluate the advantages and disadvantages of this spine joint for handling rough terrain.

Preliminary runs demonstrate that the robot can handle various terrain materials and climb over obstacles up to 110% of its own height (see video). On flat surfaces, the robot moves at about two body lengths per second with the current configuration; in theory, higher speeds should be possible.

This project is funded by the Locomorph project and supervised by Lijin Aryananda.


Novel Design Principles and Technologies for a new Generation of High Dexterity Soft-Bodied Robots inspired by the Morphology and Behaviour of the Octopus

The octopus is a marine invertebrate with amazing motion-control capabilities and intelligent behaviors. Its completely soft body has no internal or external skeleton and shows interesting characteristics from an engineering viewpoint. The eight soft tentacles of the octopus are highly dexterous and can bend at any point and in any direction along their length, and thus contain an infinite number of degrees of freedom (DOFs). The animal can control its totally soft arms for reaching, catching, precise point-to-point fetching, crawling, swimming, and even walking. The octopus is a conclusive biological demonstration of how effective behavior in the real world is tightly related to the morphology of the body.

The grand challenge of the OCTOPUS IP (Integrating Project) is to investigate and understand the principles that give rise to the sensory-motor control capabilities of the octopus, and to incorporate them in new design approaches and robotics technologies in order to build an embodied artifact, based broadly on the anatomy of the eight-arm body of an octopus, with similar performance in water in terms of dexterity, speed, control, flexibility, and applicability.

Funding

This work is funded by the European Commission under the 7th Framework Programme in the theme of Future and Emerging Technologies (FET) (OCTOPUS IP, FP7-ICT 2007.8.5, FET Proactive, Embodied Intelligence, Grant agreement no. 231608).

Using simple periodic motions, we showed that reservoir computing can be used to achieve behavior switching in a soft robotic arm.
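As a rough illustration of the reservoir computing idea (not the arm experiment itself; the echo state network below, its dimensions, and the two target rhythms are assumptions for this sketch), a fixed random recurrent network can be driven by a simple periodic input plus a constant command signal, and a single trained linear readout then produces a different rhythmic output depending on the command:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                             # reservoir size (illustrative)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # scale spectral radius below 1
W_in = rng.uniform(-0.5, 0.5, (N, 2))               # inputs: [periodic drive, command]

def run(command, steps):
    """Collect reservoir states while driving the network with a sine wave plus a
    constant command input."""
    x, states = np.zeros(N), []
    for t in range(steps):
        u = np.array([np.sin(2 * np.pi * t / 50), command])
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)

# Two target behaviours sharing the drive period: the drive itself and a
# double-frequency wave (stand-ins for two arm motions).
T, washout = 1000, 100
t = np.arange(T)
behaviours = {0.0: np.sin(2 * np.pi * t / 50), 1.0: np.sin(4 * np.pi * t / 50)}

# Train a single linear readout (ridge regression) over both command settings.
X = np.vstack([run(c, T)[washout:] for c in behaviours])
Y = np.hstack([y[washout:] for y in behaviours.values()])
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y)

# At run time, changing only the command input switches the generated rhythm.
for c in behaviours:
    output = run(c, 300)[washout:] @ W_out
    print(f"command={c}: first outputs {np.round(output[:5], 2)}")
```

In the physical experiment, the soft arm's own body dynamics can play the role of the reservoir; here a simulated random network stands in for it.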

The first complete robotic octopus (Tako V). It combines Tako IV with two Tako II units. The two big tentacles at the front are used as manipulators and are actuated by shape-memory alloys (SMAs), which change their length when a voltage is applied.


by Daniel Germann, Alexander Gilgen and Katja Dietrich


by Marc Ziegler

The diversity of animal morphology is particularly impressive in the underwater world. Various morphological properties have been optimized for efficient locomotion over the course of evolution. In this project we explore such morphological properties for the purpose of underwater robot locomotion. Toward adaptive underwater locomotion, this project investigates a fish-like swimming robot. Using motor control with only one degree of freedom, this robot exhibits surprisingly rich behavioral diversity in a three-dimensional underwater environment.

Proceeding (pdf)

Movie (avi)


by Gabriel Gomez, Alejandro Hernandez Arieta, Hiroshi Yokoi, and Peter Eggenberger Hotz

Produced by Tsukasa Kiko Engineering

The tendon-driven robot hand is partly built from elastic, flexible, and deformable materials: the tendons are elastic, the fingertips are deformable, and there is deformable material between the fingers. It has 15 degrees of freedom driven by 13 servomotors; a bending sensor on each finger measures its position, and a set of standard FSR pressure sensors covers the hand (e.g., on the fingertips, on the back, and on the palm), so that each finger is equipped with different types of sensors (flex/bend, angle, and pressure). At the Artificial Intelligence Lab, we use the robotic hand to investigate the relationship between morphology, intrinsic body dynamics, the generation of information structure through sensorimotor-coordinated activity, and learning. We have implemented biologically inspired learning mechanisms that allow the robotic hand to explore its own movement capabilities. Moreover, by correlating the sensory input that results from its motor outputs, the robotic hand can learn to manipulate and grasp objects by itself (Gomez et al., 2005; Gomez et al., 2006).
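As a loose sketch of this kind of exploration (the simulated hand, its response model, and the learning rule below are made-up stand-ins, not the method of Gomez et al.), a controller can issue random motor commands, accumulate the correlation between commands and the resulting tactile readings, and use that correlation to pick a command that maximizes contact pressure:

```python
import numpy as np

rng = np.random.default_rng(1)
n_motors, n_sensors = 13, 20          # 13 motor channels, a made-up number of pressure cells

# Hypothetical, unknown-to-the-learner mapping from motor commands to pressure readings.
coupling = rng.normal(0, 1, (n_sensors, n_motors))
def sense(motor_cmd):
    """Simulated tactile response to a motor command (a stand-in for the real hand)."""
    return coupling @ motor_cmd + 0.1 * rng.normal(size=n_sensors)

# Motor babbling: try random commands and accumulate the motor/sensor correlation.
correlation = np.zeros((n_sensors, n_motors))
for _ in range(2000):
    cmd = rng.uniform(-1, 1, n_motors)
    correlation += np.outer(sense(cmd), cmd) / 2000

# The learned correlation suggests which command pattern maximises overall pressure,
# a crude proxy for closing the hand on an object.
grasp_cmd = np.sign(correlation.sum(axis=0))
print("candidate grasp command:", grasp_cmd)
print("mean pressure it produces:", round(float(sense(grasp_cmd).mean()), 2))
```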


Developed within the European research project IST-004370 RobotCub. Maintained and run at the AI Lab by Jonas Ruesch.


by Raja Dravid, Martin F. Krafft, Gabriel Gomez and Jonas Ruesch.

The main objective of building this robot is to study the process of forming a coherent representation of visual, auditory, and haptic sensations, and how this representation can be used to describe or elicit the sense of presence. The goal is to understand representation in humans and machines. We intend to pursue this within a developmental framework, i.e., by studying the problem from the point of view of a developing system. Within this framework we will use two methodologies: on the one hand, we will investigate the mechanisms the brain uses to learn and build this unified representation by studying and performing experiments with human infants; on the other hand, we intend to use artificial systems (e.g., robots) as models and demonstrators of perception-action representation theories.


by Fumiya Iida

To achieve rapid locomotion, exploiting morphological properties is essential. The running quadruped robot "MiniDog" is capable of relatively robust, rapid legged locomotion by using the intrinsic body dynamics induced by its spring-like properties, weight distribution, and body dimensions. Owing to the use of body dynamics, the control of the robot is extremely simple and, moreover, it shows rich behavioral diversity.


When we look at the micro-world of a cell, we see a vast number of molecules interacting and somehow sustaining highly autonomous life activity.

There is no central "control"; the molecules simply self-assemble, and that is how (and why) you are reading this sentence.

In order to solve the mystery of life (and to create a "living" robot), we developed a research platform called Tribolon (derived from tribology), which is the only self-propelled self-assembly robot in the world.

Check out the details at www.tribolon.com, where you can also download some publications.


by Raja Dravid

This arm robot uses actuators based on highly non-linear pneumatic artificial muscles. For more details, please ask the developer, Raja Dravid.


by Andreas Fischer

The robot started as a remote-controlled car (RC car) with a rather special body: a hard-shell suitcase with an only slightly modified RC-car base built into it. The RC car is mounted so that the motor powers the rear wheels, while the steering servo is connected to one of the front wheels. The suitcase can be switched from microcontroller control to ordinary remote control; for this purpose a receiver, activated through a switch, is built into the suitcase. This was added for demonstration purposes and is of no further use in this assignment.


by Simon Bovet and Miriam Fend

We have developed an artificial whisker sensor based on microphones. Natural rat whiskers are glued onto capacitor microphones such that deformations of the whisker move the membrane of the microphone. This signal can be amplified and digitized. AMOUSE aimed at the construction of a mobile robot equipped with an artificial whisker system that serves as a means of validating models based on the results of neurophysiological experiments and neural modelling. The AMouse robot is a standard Khepera II equipped with two artificial whisker arrays. The whiskers are natural rat whiskers glued onto capacitor microphones, so each whisker is a single sensor. The whiskers can be moved actively. Data acquisition is done on a laptop with a PCMCIA data-acquisition card. Furthermore, the robot has an omnidirectional camera, allowing experiments on tactile perception, multimodal issues, and visual navigation.
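One simple way (an assumption for illustration, not the actual AMouse processing pipeline) to turn the digitized microphone signal into contact events is to rectify it, smooth it into an envelope, and threshold that envelope:

```python
import numpy as np

def contact_events(signal, fs, window_s=0.01, threshold=3.0):
    """Return sample indices where the smoothed envelope exceeds `threshold` times
    the median envelope (a crude contact detector)."""
    window = max(1, int(window_s * fs))
    envelope = np.convolve(np.abs(signal), np.ones(window) / window, mode="same")
    return np.flatnonzero(envelope > threshold * np.median(envelope))

# Synthetic example: quiet background with a short burst standing in for a whisker contact.
rng = np.random.default_rng(0)
fs = 20_000                                   # assumed sampling rate in Hz
t = np.arange(fs) / fs
signal = 0.01 * rng.normal(size=fs)
signal[8000:8400] += 0.5 * np.sin(2 * np.pi * 300 * t[8000:8400])
print("first contact samples:", contact_events(signal, fs)[:3])
```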


by Marc Ziegler and Lijin Aryananda

Through many experiments with this swimming humanoid robot, we have noticed that humans are restricted in many ways when swimming. For example, we have to come up for breath while swimming, which is not the case for the robot. Many aspects of the human system have also been revealed. A more detailed description will follow soon.


Used by Pablo Ventura
Produced by Kondo Kagaku Co., Ltd.

We currently have three Kondo humanoids purchased from the Japanese company. The choreographer Pablo Ventura has been using these robots to prepare a surprise for an upcoming exhibition!


by Mike Rinderknecht and Maik Hadorn

"Cheap" Quadrupedal Locomotion (AI Lab, University of Zurich, Switzerland) Body dynamics can reduce significantly both the computational effort and the complexity of an agentfs controller. In this work, we show that the phase delay between the legs of a quadrupedal agent as a unique controlling parameter is adequate to navigate on a 2D-surface.


by Lukas Lichtensteiger

The "Whirling Arm" will be used at the Artificial Intelligence Lab as an experimental tool for research on insect vision. It can be seen as a kind of "flight-simulator for insect eyes": An artificial insect eye (camera or specially constructed compound eye) is mounted on the Whirling Arm and is then subjected to fast and complex movements through space that can (to some degree) mimic the actual situation encountered by the head of a flying insect. One goal of these studies is to better understand how the specific features of insect eyes (e.g., its sensor morphology) relate to the visual input the animal encounters during its flight and how this can facilitate flight control. Since insects like the house-fly can navigate very fast the Whirling Arm has to be able to produce very fast reactions. Consequently it was designed for a minimum of inertia for each of its three rotational degrees of freedom while at the same time providing enough motor power for fast accelerations.


by Raja Dravid

The Stumpy project explores fundamental design principles of locomotion on the basis of our biological knowledge. However, we do not simply copy the design of biological systems; rather, we try to extract the underlying principles. One of the most fundamental challenges in this project is how to enhance the behavioral diversity of a robot while preserving the simplicity of its morphological and physiological design. From this perspective, we are investigating the interplay between oscillation-based actuation, material properties, and the interaction with the environment. Stumpy uses inverted pendulum dynamics to induce bipedal hopping gaits. Its mechanical structure consists of a rigid inverted "T" mounted on four compliant feet; an upright "T" structure is connected to this by a rotary joint, and the horizontal beam of the upright "T" is connected to its vertical beam by a second rotary joint. Using this two-degree-of-freedom mechanical structure with simple oscillatory control, the robot can produce many different locomotion behaviors, including hopping, walking, and running gaits.


by Fumiya Iida

Several small versions of Stumpy were built by Fumiya Iida. Although the size of a robot significantly affects its overall dynamics, these robots have allowed us to demonstrate the stability that arises from the morphology and to study the mechanisms underlying the dynamics.


by Arthur Korn and Fumiya Iida

The robot Rabbit was built following the same concept as Stumpy. It can move "forward" by jumping with two rotating masses. Robustness against different types of ground with different friction was also observed.


by Fumiya Iida and Hiroshi Yokoi

Dumbo is one of those outstanding robots that defy common wisdom. For more details, please visit our lab!


by Fumiya Iida

The main objective of this project is to explore the design principles of biologically inspired legged running robots. In particular, the project focuses on a minimalistic model of rapid locomotion for quadruped robots, inspired by biomechanics studies. The goal is therefore both to develop technology for rapid legged locomotion and to deepen our understanding of locomotion mechanisms in biological systems.


by Alex Schmitz

Schmaroo is a kangaroo robot that can jump a few centimetres. The robot has a camera and a long leg that generates the vertical force needed for jumping. The name combines the developer's name, Schmitz, with "kangaroo".


by Daisuke Katagami

Coffee was developed to investigate human-robot interaction. The robot has two actuators that allow its head to move in several ways. Through experiments with this robot, we learned that even a simple nodding movement of the head can be classified into many types.


by Koji Shibuya

In this project, we are trying to develop a robot capable of hovering by beating its wings. In building the robot, we focused on the concepts of "cheap design" and "morphological computation", and took advantage of "material properties", concepts recently proposed in the field of artificial intelligence. Based on these concepts, we designed a robot with one DC motor and a crank mechanism for beating its wings. The wings beat in the horizontal plane and were made of soft materials such as polyurethane, cardboard, and plastic to increase the downward air flow. We recorded videos of the flapping wings and measured the lift for each wing material and size. From the results, we concluded that the material and size of the wings must be chosen carefully according to the flapping frequency, the weight of the robot, and so on.


by Fumiya Iida

Most of the locomotion-related projects were launched by Fumiya Iida. The robot Puppy is one of the successful exemplars revealing how stability arises from the intrinsic dynamics together with the morphology. The project shows that an adequate morphology enables the dynamical system to achieve stable locomotion with a simple controller (brain).


by Kojiro Matsushita and Hiroshi Yokoi

In this project, which aims at acquiring a design scheme for a pseudo-passive dynamic walker, we have been developing a robot that models the lower limbs from both a systemic and a control perspective.


by Kojiro Matsushita and Hiroshi Yokoi

The relation between the morphology and the material properties of a biped robot is worth tackling given the current state of the art in the field. Considering the affinity of these two aspects, we designed the Fork Leg Robot.


by Dominic Frutiger, Fumiya Iida, and Josh Bongard

How do monkeys manage to jump and climb trees with such heavy bodies? In this project, we developed a monkey robot to reveal this mechanism, investigating in particular the intrinsic oscillation of the body.


by Fumiya Iida

Melissa was developed as a robotic platform for the Flying Robot Project, which is part of the biorobotics research at the AI Lab, Dept. of Information Technology, Univ. of Zurich. Melissa is a blimp-like flying robot consisting of a helium balloon, a gondola hosting the onboard electronics, and an offboard host computer. The balloon is 2.3 m long and has a lift capacity of approximately 400 g. Inside the gondola there are three motors for rotation, elevation, and thrust control, a four-channel radio transmitter, a miniature panoramic vision system, and the batteries.


by Raja Dravid

In its most complex configuration the Dextrolator is composed of seven segments actuated by seven motors, and it receives feedback from 126 sensors. The primary tasks the manipulator must perform are to move through a tube without touching the walls, to find its way to a specific point in space, and finally to navigate through an environment to a certain point while avoiding obstacles.


by Lukas Lichtensteiger

A robot that can position its sensors autonomously using electric motors. The task of the robot is to employ motion parallax to estimate a critical distance to obstacles. This is achieved by adapting the morphology of the compound eye with an evolutionary algorithm while a fixed neural network controls the robot. Each of the 16 long tubes contains a light sensor that detects light within an angle of about 2 degrees, and the tubes can be rotated about a common vertical axis.
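The underlying geometric relation can be written down directly (a simplified illustration assuming pure translation and a stationary obstacle, not the evolved controller): an object at distance d and bearing theta from the direction of motion drifts across the eye with angular velocity omega = v * sin(theta) / d, so d can be recovered from the measured drift:

```python
import math

def distance_from_parallax(speed, bearing_rad, angular_velocity):
    """Estimate obstacle distance (m) from translation speed (m/s), bearing (rad),
    and the measured angular velocity of the obstacle on the eye (rad/s)."""
    return speed * math.sin(bearing_rad) / angular_velocity

# Example: moving at 0.2 m/s, an obstacle 60 degrees off the heading drifts at 0.05 rad/s.
print(round(distance_from_parallax(0.2, math.radians(60), 0.05), 2), "m")  # ~3.46 m
```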


by Lukas Lichtensteiger

This robot is one example of a series of robots rapidly built from a children's construction kit using our Flexible Robot Building Kit. We used an artificial evolutionary system to evolve simulated agents that can complete a specific task, paying particular attention to the role the morphology of these robots plays in their fitness in a specific environment. The simulated agents were then used as blueprints to build real-world robots. Finally, the robots were tested in a real-world environment to evaluate their fitness.
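A minimal version of such an evolutionary loop looks as follows (the genome encoding and the toy fitness function are made-up stand-ins, not our actual simulator): each genome is a small vector of body parameters, the fittest individuals are kept, and mutated copies form the next generation:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(genome):
    """Toy stand-in for 'task performance in simulation': prefer a particular body shape."""
    target = np.array([0.3, 0.7, 0.5, 0.2])      # assumed optimal parameters
    return -np.sum((genome - target) ** 2)

population = rng.random((20, 4))                  # 20 genomes, 4 morphological parameters
for generation in range(50):
    scores = np.array([fitness(g) for g in population])
    parents = population[np.argsort(scores)[-5:]] # keep the 5 fittest
    children = np.clip(parents[rng.integers(0, 5, 15)]
                       + rng.normal(0, 0.05, (15, 4)), 0, 1)
    population = np.vstack([parents, children])

best = population[np.argmax([fitness(g) for g in population])]
print("evolved parameters:", np.round(best, 2))
```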


by Max Lungarella

Our experimental setup consists of (a) an industrial robot manipulator with six degrees of freedom (DOF), (b) a color stereo active vision system, and (c) a set of tactile sensors placed on the robot's gripper. This robot has been used for experiments in the field of developmental robotics.


by Hiroshi Kobayashi
Produced by Neuronics, Inc.

The Samurai robot was designed by Hiroshi Kobayashi and is being built by Neuronics, Inc., a spin-off company of the AI Lab. It will be used by undergraduate students in classes and tutorials on New Artificial Intelligence, but also for research purposes. The Samurai is equipped with an array of 12 infrared proximity sensors, 8 bumper sensors, an omnidirectional color camera, differential steering with two 15 W DC motors, and a Motorola 68336 main processor.
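For readers unfamiliar with differential steering, the sketch below shows the basic kinematics (the wheel radius and axle width are illustrative values, not the Samurai's specifications): the two wheel speeds determine the forward speed and the turning rate of the body:

```python
def body_velocity(omega_left, omega_right, wheel_radius=0.03, axle_width=0.15):
    """Return (forward speed m/s, yaw rate rad/s) from wheel angular speeds (rad/s)."""
    v_left = wheel_radius * omega_left
    v_right = wheel_radius * omega_right
    return (v_left + v_right) / 2.0, (v_right - v_left) / axle_width

# Equal wheel speeds drive straight; unequal speeds turn the robot.
print(body_velocity(10.0, 10.0))   # straight ahead
print(body_velocity(5.0, 10.0))    # curving to the left
```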


by Dimitrios Lambrinos and Ralf Moller, in cooperation with Rosys AG

Sahabot 2 was built by Dimitrios Lambrinos and Ralf Moller, in cooperation with Rosys AG, Hiroshi Kobayashi, and Marinus Maris. Like its predecessor, Sahabot, it was built for a specific experiment on the navigation behavior of the desert ant Cataglyphis, and was run in the Tunisian part of the Sahara desert in August 1997, in the same area where ethologists had collected data on the real Cataglyphis.


by Dimitrios Lambrinos, Hiroshi Kobayashi, and Marinus Maris

Sahabot was built by Dimitrios Lambrinos, Hiroshi Kobayashi, and Marinus Maris for a specific experiment on the navigation behavior of the desert ant Cataglyphis. It was run in the Tunisian part of the Sahara desert in July 1996, in the same area where ethologists had collected data on the real Cataglyphis.


by Hiroshi Kobayashi with some assistance from Rene Schaad

Honey is an autonomous flying robot: an indoor blimp controlled by an off-board PC. It carries various sensors, including a camera, and has four propellers for motion control. It was mainly developed by Hiroshi Kobayashi with some assistance from Rene Schaad. Honey was built mainly for navigation experiments and for experiments involving human-robot interaction.


by Hiroshi Yokoi?

Information wanted


by Rene Schaad

Gloria is a modified Didabot. It improves on the Didabot by providing longer battery life (currently up to 1.5 hours), a protective cover, bump sensors, and a real-time clock. The modifications were necessary because Gloria serves as a buddy to Rufus, which operates in an unmodified office environment for extended periods of time.


by Rene Schaad

The Analog robot performs visual homing purely in analog hardware. The hardware is based on the "Average Landmark Vector" model. For a description, see our paper "Landmark Navigation without Snapshots: the Average Landmark Vector Model", which is available on Ralf Moeller's home page.
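In software terms, the model can be sketched as follows (the landmark positions are made up for illustration; the Analog robot computes this in hardware): the unit vectors pointing to all visible landmarks are averaged in a compass-aligned frame, and the difference between the current average landmark vector and the one stored at home points approximately back home:

```python
import numpy as np

def alv(position, landmarks):
    """Average of unit vectors from `position` to each landmark (compass-aligned frame)."""
    vectors = landmarks - position
    return np.mean(vectors / np.linalg.norm(vectors, axis=1, keepdims=True), axis=0)

landmarks = np.array([[2.0, 0.0], [0.0, 3.0], [-2.0, -1.0]])   # illustrative landmarks
home = np.array([0.0, 0.0])
current = np.array([1.5, 1.0])

# Driving along this difference vector brings the robot approximately back to home.
home_vector = alv(current, landmarks) - alv(home, landmarks)
print("drive direction (unnormalised):", np.round(home_vector, 2))
```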


by Marinus Maris

The control architecture of the autonomous robot Morpho I, built by Marinus Maris, is based on a neuromorphic design. At its core is a complete sensory-motor chip for robot control that takes care of all sensing (a 23-pixel contrast retina array), edge-position detection (winner-take-all with position encoding), decision making (attention bias), and motor steering (a spike generator that delivers pulses to a servo). The robot's task is to follow one of two possible lines; which line is followed is controlled from outside the chip by adjusting the robot's attention.
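A software analogue of the winner-take-all stage might look like this (the 23 contrast values below are illustrative; on Morpho I this happens in analog circuitry): the strongest contrast response wins, and its position encodes where the line edge is:

```python
import numpy as np

def winner_take_all(contrast):
    """Return the index of the strongest contrast response and a vector in which
    all other responses are suppressed."""
    winner = int(np.argmax(contrast))
    output = np.zeros_like(contrast)
    output[winner] = contrast[winner]
    return winner, output

rng = np.random.default_rng(0)
contrast = 0.2 * rng.random(23)      # 23-pixel contrast retina (made-up values)
contrast[9] = 0.9                    # a strong edge under pixel 9
winner, _ = winner_take_all(contrast)
print("edge position (pixel index):", winner)
```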


by Marinus Maris

Sita was built by Marinus Maris on a model-car base, like its brother "Famez" (below). It is equipped with a 1D camera (64 pixels), 16 IR and ambient light sensors, bumpers, and a speech generator. The task of the robot is to run errands whenever asked. The speech generator (hopefully soon augmented with speech understanding) will enable the robot to interact verbally with humans.


by Marinus Maris with system software by Rene Schaad and Daniel Regenass

Ten educational robots were built by Marinus Maris, with system software by Rene Schaad and Daniel Regenass, for use in student education in the context of Prof. Pfeifer's class "New AI". Features: based on an R/C car (Tyco Scorcher), very fast, differential 4WD (4 of its 6 wheels driven), Intel 16-bit 196KD microcontroller (20 MHz), IR and ambient light sensors, programmable in C and assembler.


by Rene Schaad

Rufus T. Firefly was built by Rene Schaad. It is a multipurpose extensible platform for autonomous agents research.


by Marinus Maris

Famez is a fast robot relying entirely on a single sensor (one ultrasonic range finder). Three of them were built at our laboratory by Marinus Maris based on model car kits. Its top speed is about 10 mph. It features Motorola MC68331 and HC11 microcontrollers.


by Rene Schaad

This robot was built by Rene Schaad from "Stokys" metal construction parts. It features car-like steering, a 20 MHz Intel 196KD microcontroller, sonar, two antennae, a buzzer, and a gripper.


by Rene Schaad

developed by the Laboratoire de microinformatique at the Swiss Federal Institute of Technology in Lausanne

Cyclope was developed at the Laboratoire de microinformatique at the Swiss Federal Institute of Technology in Lausanne, Switzerland. We own one unit for evaluation purposes. Features include: circular shape, 12.5 cm diameter (5"), HC11 microcontroller, 64-element linear CCD array, bumpers, debugging board, IR remote control, graphic LCD, etc.


developed by the Laboratoire de microinformatique at the Swiss Federal Institute of Technology in Lausanne

Khepera was engineered at the Laboratoire de microinformatique at the Swiss Federal Institute of Technology in Lausanne, Switzerland. The AI Lab currently owns 15 Kheperas. Features include: circular shape, 5.5 cm diameter (2.2"), a small size that enables desktop experimenting, 2 DC motors for differential steering, 20 min autonomy or power-by-wire, Motorola MC68332 microcontroller, miniature gripper forthcoming.


by Rene Schaad

developed by the Laboratoire de microinformatique at the Swiss Federal Institute of Technology in Lausanne

The robot Koala has an architecture similar to the Khepera's, but a larger body.