Title: "Morphological Computation - The Computational Power of Soft Bodies"
Morphological computation captures the observation that biological systems exploit their morphology to carry out computations needed for successful interaction with the environment. The phenomenon can be encountered at different scales, from the molecular to the macroscopic level. One remarkable consequence is that part of the computation needed for complex interactions (such as stable locomotion) can be outsourced to the physical body of the agent. As a result, the remaining computations, and the corresponding learning and control tasks, become much simpler. Consequently, the concept has been, and remains, of great interest in the context of robot design. Moreover, recently developed theoretical models point to a completely new type of robot. These results support the emergence of a new field of research, called soft robotics.
With the help of examples from nature and robotics, we will provide an overview of the concept of morphological computation. We will discuss what kinds of morphologies and what types of computation should be considered. We will present state-of-the-art research in this field, especially in the context of soft robotics, and we will discuss the research opportunities ahead of us.
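As a toy illustration of the idea (our own sketch, not an example from the talk; all parameter values are arbitrary), even a damped mass-spring body performs a leaky temporal integration of its drive signal, so a controller that reads the body's state gets that computation from the morphology for free:

```python
# Toy illustration: a damped mass-spring "computes" a leaky temporal
# integration of its drive, so a controller reading the spring's position
# obtains this filtering from the morphology rather than from circuitry.
m, k, c = 1.0, 4.0, 2.0          # mass, stiffness, damping (arbitrary units)
dt, steps = 0.01, 2000
x, v = 0.0, 0.0                  # position and velocity of the mass

def step(x, v, u):
    """One explicit-Euler step of m*x'' = u - k*x - c*x'."""
    a = (u - k * x - c * v) / m
    return x + dt * v, v + dt * a

u = 2.0                          # constant drive
for _ in range(steps):
    x, v = step(x, v, u)

print(x)  # the body settles toward x = u/k = 0.5: integration done physically
```

The point of the example is only that the filtering is performed by the passive dynamics; any readout layered on top can then be correspondingly simpler.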
Morphology, Evolution and Cognition Laboratory, Vermont Complex Systems Center, University of Vermont, USA
Title: "Evolutionary Robotics for Embodied Cognition, Education and Neuroscience"
This talk will be composed of three parts. In the first part I will introduce the field of evolutionary robotics and demonstrate how it can be used to investigate various issues in embodied cognition. In the second part I will present a tool we have recently developed, Ludobots, which allows students outside my lab to conduct their own evolutionary robotics experiments. In the third and final part I will describe an interdisciplinary research effort in which we adapt some of the methods of evolutionary robotics for fMRI brain imaging studies.
Title: "Values, Timescales, Wholes and Parts in Cognitive Activity."
The enactive approach to cognitive science aspires, amongst other things, to directly address what Pfeifer and Bongard (2007) called "the value question": how it is that the activity of a system can come to have meaning, value, or what Varela (1991) calls a "surplus of significance". The enactivist account is grounded in notions of autonomy and identity (each related to organisational closure). These concepts provide structure to our understanding of how a system is to be distinguished from its environment and how it comes to govern its own behaviour. In this talk I will use the varying timescales of cognitive activity (from milliseconds to years) to examine, and to raise some challenges for, these ideas. The dynamic and sometimes evanescent conception of value to which enactivism is committed implies some strange and somewhat counter-intuitive notions of identity and agency. While these counter-intuitive notions may be somewhat uncomfortable at times (and might be seen by some as a challenge to the idea that enactivism really offers a coherent response to the value question), they are also potentially useful in considering other tensions in our understanding of cognitive systems, such as the opposing tendencies toward holism and componentialism, generality and specialism.
Title: "Motor primitives and central pattern generators: from biology to robotics"
The ability to move efficiently in complex environments is a fundamental property of both animals and robots, and the problem of locomotion and movement control is an area in which neuroscience and robotics can fruitfully interact. Animal locomotion control is in large part based on central pattern generators (CPGs): neural networks capable of producing complex rhythmic or discrete patterns while being activated and modulated by relatively simple control signals. In vertebrates, these networks are located in the spinal cord. In this talk, I will present how we model the pattern generators of lower vertebrates (lamprey and salamander) using systems of coupled oscillators, and how we test the CPG models on board amphibious robots, in particular a salamander-like robot capable of swimming and walking. The models and robots were instrumental in testing novel hypotheses concerning the mechanisms of gait transition, sensory feedback integration, and the generation of rich motor skills in vertebrate animals. I will also discuss how the concepts of motor primitives and CPGs can be useful more generally for controlling robots with many degrees of freedom, from quadrupeds to humanoids.
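The coupled-oscillator CPG models mentioned above can be sketched minimally as a chain of phase oscillators with phase-biased couplings (the parameter values below are illustrative, not those of the actual salamander model):

```python
import numpy as np

# Minimal CPG sketch: a chain of coupled phase oscillators that converges to
# a traveling wave, the basic ingredient of salamander-style swimming models.
n = 4                      # oscillators along the body
nu = 1.0                   # intrinsic frequency, Hz
w = 4.0                    # coupling strength
phi = 2 * np.pi / 8        # desired head-to-tail phase lag per segment
dt, steps = 0.002, 10000

theta = np.array([0.3, 2.0, 4.1, 1.0])    # arbitrary initial phases
for _ in range(steps):
    d = np.full(n, 2 * np.pi * nu)        # intrinsic rhythm
    for i in range(n):
        if i > 0:                         # pull toward lagging the rostral neighbor
            d[i] += w * np.sin(theta[i - 1] - theta[i] - phi)
        if i < n - 1:                     # pull toward leading the caudal neighbor
            d[i] += w * np.sin(theta[i + 1] - theta[i] + phi)
    theta = theta + dt * d

lags = (theta[:-1] - theta[1:]) % (2 * np.pi)   # settles near phi for each pair
```

From arbitrary initial phases the chain locks into a traveling wave in which each segment lags its rostral neighbor by `phi`, and the whole pattern is modulated by just two simple signals: the drive `nu` and the lag `phi`.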
Title: "Neurocognition of Prediction"
We live in an ever-changing environment and must constantly adapt to more or less predictable variation. How do we do so?
Our concept of what perception is, and of how it differs from action, has changed fundamentally with the advent and proliferation of imaging methods applied to brain function. This innovation has revitalized the control-theoretically inspired notion of the brain as a predictive machine implementing internal models of sensory change. According to the most recent and radical reading, perceptual representation amounts to what is generated in top-down feedback fashion, whereas the sensory feedforward streams provide nothing but prediction errors. The mismatch between the two is information in the proper sense.
Currently, however, sophisticated concepts on predictive perception run ahead of our understanding of how the brain exploits environmental cues to feed predictions and how dynamic prediction is organized in time. A series of experiments is presented that seek to contribute to three issues in this area: (a) the core network of sensorimotor prediction, (b) the exploitation of environmental cues driving predictions, and (c) the adjustments of predictions based on probability structures. Findings point to a complex but generic set of networks underlying predictions in the seconds range and beyond.
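The predictive-machine picture sketched above (top-down predictions, bottom-up prediction errors) can be illustrated with a minimal gradient-descent sketch in the spirit of predictive coding; the model sizes, weights, and rates are assumptions for illustration only:

```python
import numpy as np

# Predictive-coding sketch: a top-down generative model predicts the input,
# and only the bottom-up prediction error updates the internal estimate.
rng = np.random.default_rng(0)
U = rng.standard_normal((8, 3))          # generative weights: causes -> input
U /= np.linalg.norm(U, axis=0)           # unit-norm columns for stable updates

r_true = np.array([1.0, -0.5, 2.0])      # hidden causes that generated the input
x = U @ r_true                           # sensory input (noise-free for clarity)

r = np.zeros(3)                          # internal estimate of the causes
eta = 0.2                                # update rate
for _ in range(5000):
    e = x - U @ r                        # bottom-up residual: the prediction error
    r = r + eta * (U.T @ e)              # the error, not the raw input, drives learning

# the top-down prediction U @ r now reconstructs the sensory input
```

The feedforward stream carries only `e`; once the top-down prediction matches the input, the ascending signal vanishes, which is the sense in which the mismatch alone is informative.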
Title: "The Bayesian Brain: From Perception to Action"
A major impediment to understanding brain structure and function at the systems level is the lack of overarching computational theories describing how the brain combines sensory information with prior knowledge and rewards to generate behaviors. In this talk, I will discuss a Bayesian theory of perception and action that could serve as a candidate for such a theory. After briefly reviewing predictive coding models, I will describe a Bayesian decision making model that can provide normative explanations for neural and behavioral data from a range of sensory decision making tasks. The model suggests specific computational roles for the neocortex and the basal ganglia in transforming noisy sensory information into actions that maximize expected reward.
The Bayesian Brain: Probabilistic Approaches to Neural Coding, K. Doya, S. Ishii, A. Pouget, and R. P. N. Rao (Eds.), Cambridge, MA: MIT Press, 2007.
Probabilistic Models of the Brain: Perception and Neural Function, R. P. N. Rao, B. A. Olshausen and M. S. Lewicki (Eds.), Cambridge, MA: MIT Press, 2002.
R. P. N. Rao. "Decision making under uncertainty: a neural model based on partially observable Markov decision processes," Front. Comput. Neurosci. 4(146), 2010.
R. P. N. Rao and D. H. Ballard. "Predictive coding in the visual cortex: A functional interpretation of some extra-classical receptive-field effects," Nature Neuroscience, 2(1):79-87, 1999.
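A minimal sketch of Bayesian decision making under uncertainty in the spirit of this abstract (our own illustration, not the speaker's model; all numbers are arbitrary) is sequential updating of the log posterior odds until a decision bound is reached:

```python
import numpy as np

# Two-choice Bayesian decision sketch: accumulate the log-likelihood ratio of
# noisy samples until the posterior odds cross a bound, then act.
rng = np.random.default_rng(1)
mu, sigma = 1.0, 1.0           # hypotheses: x ~ N(+mu, sigma) or N(-mu, sigma)
threshold = 10.0               # log-odds bound: trades speed against accuracy

log_odds = 0.0                 # log P(H+|data) / P(H-|data), flat prior
n_samples = 0
while abs(log_odds) < threshold:
    x = rng.normal(+mu, sigma)             # noisy evidence; true state is H+
    log_odds += 2 * mu * x / sigma ** 2    # log-likelihood ratio of this sample
    n_samples += 1

decision = "H+" if log_odds > 0 else "H-"
```

The bound `threshold` is where reward enters: raising it buys accuracy at the cost of more samples, and an expected-reward-maximizing agent would tune exactly this quantity.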
Artificial Intelligence Laboratory, University of Zurich, and
National Competence Center for Research in Robotics, Switzerland
Title: "On the role of embodiment in the emergence of cognition: The four messages"
Traditionally, robotics, artificial intelligence, and neuroscience have focused on the study of the control or neural system itself. Recently there has been increasing interest in the notion of embodiment in all disciplines dealing with intelligent behavior, including psychology, philosophy, and linguistics. In an embodied perspective, cognition is conceived as emergent from the interaction of brain, body, and environment, or more generally from the relation between physical and information (neural, control) processes. It can be shown, and this is one of the underlying assumptions of the eSMC project, that through embodied interaction with the environment, in particular through sensory-motor coordination, information structure is induced in the sensory data, thus facilitating categorization, perception and learning. The patterns thus induced depend jointly on the morphology, the material characteristics, the action and the environment. Because biological systems are mostly "soft", a new engineering discipline, "soft robotics", has taken shape over the last few years. I will discuss the far-reaching implications of embodiment, in particular of having a soft body, for our view of the mind and human behavior in general. Cognition is no longer centralized in the brain but distributed throughout the organism, and functionality is "outsourced" to the morphological and material properties of the organism, which requires an understanding of processes of self-organization. Because in "soft" systems part of the functionality resides in the morphology and materials, there is no longer a clear separation between the controller and the to-be-controlled, which implies that we need to fundamentally rethink the notion of control. The ideas will be illustrated with case studies from biology (humans and animals) and robotics, and will be summarized as a set of four "messages" for embodied systems.
Title: "Informational Organization for Embodied Agents"
Recent years have seen a significantly improved understanding of how informational principles govern cognitive performance, especially embodied cognition. Embodiment imprints structure on the way information flows through the organism, both body and "brain".
Now, there are strong indications that biologically plausible cognition may adapt and evolve towards the most cognitively inexpensive solution. Under this hypothesis of "information parsimony", strong constraints can be imposed on the structure of information flows through the organism. In addition, with this view, concepts such as "morphological computation" can be given a well-defined quantitative meaning in informational terms and an immediate biological justification.
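The informational primitive underlying such quantitative readings of "morphological computation" is mutual information between variables in the sensorimotor loop; a minimal sketch (our own, with toy distributions) is:

```python
import numpy as np

# Mutual information between two discrete variables (e.g., a sensor channel
# and an actuator channel), computed from their joint distribution.
def mutual_information(p_xy):
    """I(X;Y) in bits from a joint probability table p_xy[i, j]."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)       # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)       # marginal of Y
    nz = p_xy > 0                               # convention: 0 * log 0 = 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

# a perfectly coupled 4-state channel carries log2(4) = 2 bits
coupled = np.eye(4) / 4
# statistically independent variables carry none
independent = np.full((4, 4), 1 / 16)
```

With estimates of such joint distributions from behaving agents, "information parsimony" becomes a measurable constraint rather than a metaphor.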
My talk will introduce informational techniques for modeling embodied cognition, discuss the reasons why information is relevant to cognitive processes at all, and discuss the ramifications for our understanding of embodiment and how it interacts with agents and their cognitive faculties. It turns out that the world is... well, if you want to know more, please attend the talk.
Active Perception Lab, Department of Engineering Management, University of Antwerp, Belgium
Title: "Bits of echolocation: applying information theory to bats"
It has been argued that an important part of understanding bat echolocation comes down to understanding the morphology of the bat's sound processing apparatus. In this presentation I will describe a method based on information theory that allows one to assess the target localization performance of bat sonar without a priori knowledge of the position, size, or shape of the reflecting target. Using simulated directivity patterns, this method is applied to the sonar system of the FM bat Micronycteris microtis. The results of this analysis indicate that the morphology of this bat's sound processing apparatus has evolved to be a compromise between sensitivity and accuracy, with the outer ears and the noseleaf playing different roles. The same information-theoretic analysis is also applied to Rhinolophidae, CF-FM bats that hunt among vegetation. The foliage returns clutter echoes that can mask the echoes of insect prey. However, prey introduces frequency and amplitude shifts, called glints, into the echo, to which these bats are highly sensitive. In contrast to the spectral cues used by FM bats, the localization cues in Rhinolophidae are most likely provided by self-induced amplitude modulations generated by pinna movements. Our model includes the spatial filtering of the echoes by the morphology of the sonar apparatus of Rhinolophus rouxii as well as the amplitude modulations introduced by pinna movements. Using this model, we evaluate whether the dominant glints provide Rhinolophidae with enough information to perform localization. Finally, robotic implementations of these principles will be demonstrated.
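A toy version of such an information-theoretic assessment (the cosine-squared beam and all numeric values here are assumptions for illustration, not the measured directivity of any bat) computes how many bits the received echo level carries about target azimuth for a given beam shape and noise level:

```python
import numpy as np

# How many bits does the received echo level carry about target azimuth,
# given a directivity pattern and Gaussian level noise?
az = np.linspace(-90, 90, 181)                 # candidate azimuths, degrees
gain = 20 * np.cos(np.radians(az)) ** 2        # assumed directivity pattern, dB
sigma = 3.0                                    # echo-level noise, dB

# received level r given azimuth is Gaussian around the beam gain
r = np.linspace(gain.min() - 5 * sigma, gain.max() + 5 * sigma, 2000)
lik = np.exp(-(r[None, :] - gain[:, None]) ** 2 / (2 * sigma ** 2))
lik /= sigma * np.sqrt(2 * np.pi)
p_r = lik.mean(axis=0)                         # marginal under a uniform prior
dr = r[1] - r[0]

h_r = -np.sum(p_r * np.log2(p_r + 1e-300)) * dr          # entropy of echo level
h_r_given_az = 0.5 * np.log2(2 * np.pi * np.e * sigma ** 2)  # noise entropy
mi_bits = h_r - h_r_given_az                   # information about azimuth
```

Sharpening the beam or lowering the noise raises `mi_bits`, which is the sense in which the morphology of ears and noseleaf can be scored as a localization channel without assuming anything about the target itself.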
Title: "Multisensory construction of body-space interactions"
Accurate body-space interactions are made possible through the complex integration of visual stimuli coming from the body and the surrounding environment with somatosensory input from our own body. I will present experimental data showing how visuo-tactile signals from the body are critical for constructing an accurate body metric, and how visual input about the size of body segments can, surprisingly, modulate somatosensory perception and action. Furthermore, I will review work suggesting that the multisensory coding of body and peripersonal space can be plastically changed through action and tool use. This work gives clues for understanding the determinants of body-space interactions in neurologically intact individuals, brain-damaged patients, and people using functional prostheses.
Visual Cognition Group, Laboratoire de Neurosciences Cognitives, École Normale Supérieure, Paris, France
Title: "Brain-body interactions in conscious vision"
Conscious vision is accompanied by both enhanced cognitive abilities and subjective experience. I will present experimental evidence showing that these two aspects of consciousness can be dissociated at both the neural and the behavioral level. I will propose that the first-person perspective necessary for subjective experience requires brain-body interactions, by showing that fluctuations in neural responses to heartbeats before stimulus onset can predict fluctuations in visual awareness.
Max Planck Institute for Human Cognitive and Brain Sciences, Department of Psychology, Leipzig, Germany
Title: "Perception and action: Strong interactions"
This talk gives a selective overview of a research program that our lab has pursued over the past two decades in the domain of action representation. I will start with the ideomotor principle which has served as a theoretical guideline for most of our studies. In what follows I will address experimental paradigms for the study of action interference, action induction, action coordination and action simulation. While I will be brief on the first three topics (focusing on major experimental findings) I will elaborate in greater detail on action simulation and pertinent experimental paradigms. While the focus will be on behavioral findings, some studies also address brain mechanisms involved in action simulation.
Bio-Inspired Robotics Laboratory, Institute of Robotics and Intelligent Systems, ETH Zurich, Switzerland
Title: "Guided self-organization in real-world machines"
Many technologies have evolved through biological inspiration, and we have learned that an adequate abstraction of mechanisms in nature can have a substantial impact on technological developments in, for example, control theory, computational algorithms, and advanced mechatronic systems. One of the most significant challenges in bio-inspired robotics is the extraction of feasible principles of real-world self-organization. In the biological world, self-organization processes take place on many different timescales and rest on many different physical principles. For example, biological systems are made of components that are continuously changing and adapting over different timescales (e.g. evolutionary, developmental and "here and now" timescales), and they rely on a variety of underlying physical principles such as spring-mass-damper and fluidic interactions, cell adhesion, and informational dynamics based on electric signals, most of which are not yet fully integrated into our robotic systems. In this talk, I introduce our efforts in developing and understanding self-organizing machines in the real world, structured into three basic research components. The first research component aims to establish the principles of self-organization through mechanical system-environment interactions, such as springy legs interacting with the ground for locomotion. These case studies provide the fundamentals of emergent behaviors, because system-environment interactions are the sole basis of self-organization in the real world. Second, we have also been exploring the self-organization of motion control processes, in which we study the underlying mechanisms of sensory-motor calibration in complex musculoskeletal dynamical systems. Together with our collaborating neurophysiologists, we investigate how muscle twitches during sleep could provide the basic circuitry for the sensory-motor coordination of spinal reflexes.
Third, we investigate the use of soft continuum bodies for the next generation of mechatronic systems. Here we make use of thermoplastic adhesive polymers that can serve as soft elastic mechanical bodies and also allow the dynamic reconfiguration of body shapes and other mechanical properties. Through these case studies of dynamic system-environment interactions, we discuss the implications of our embodiment research with physical dynamical systems for a comprehensive understanding of guided self-organization, and outline research directions for the future.