Details Colloquium Spring 2009
Speaker: Dr. Enrique Alfonseca, Google Zurich, Switzerland
Host: Michael Hess, IFI, UZH
Word and phrase similarity is a long-standing problem: determining how related or similar two words or phrases are. It has applications in many areas, including but not limited to word sense disambiguation, information retrieval, information extraction and machine translation. Traditional systems for word sense disambiguation or information extraction have relied on small annotated corpora on which to train classifiers, and on external knowledge sources such as lexico-semantic networks, but the availability of large amounts of unannotated data, and of the computational resources to process them, has motivated the development of weakly supervised and unsupervised systems. In this talk we explore the different directions that these unsupervised systems are taking, focusing on two approaches: large-scale analyses of noisy, unannotated data such as web corpora, and the leveraging of search engine results as a resource for linguistic processing.
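The abstract does not say which similarity measure the systems use; a minimal sketch of one common unsupervised approach, distributional similarity, in which each word is represented by a vector of co-occurrence counts from unannotated text and two words are compared by cosine similarity (the corpus and window size here are illustrative assumptions):

```python
from collections import Counter
from math import sqrt

def context_vector(word, corpus, window=2):
    """Count the words co-occurring within +/-window tokens of each occurrence of `word`."""
    vec = Counter()
    for sentence in corpus:
        tokens = sentence.split()
        for i, tok in enumerate(tokens):
            if tok == word:
                lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        vec[tokens[j]] += 1
    return vec

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (0.0 if either is empty)."""
    dot = sum(u[k] * v[k] for k in u if k in v)
    norm = sqrt(sum(c * c for c in u.values())) * sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

# Toy unannotated "corpus"; real systems would use web-scale text instead.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]
sim = cosine(context_vector("cat", corpus), context_vector("dog", corpus))
```

Because "cat" and "dog" appear in near-identical contexts here, their cosine similarity is close to 1; unrelated words would share few context words and score near 0.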
Enrique Alfonseca is a software engineer at Google Zurich, working on ads quality. His research interests include natural language processing, information retrieval and machine learning, and he has previously worked on information extraction, text summarization and automated essay grading. He holds a PhD from Universidad Autonoma de Madrid on automatic ontology population, and has also done research at the University of York (1998-2001, part time) and the Tokyo Institute of Technology (2005-2006). He has served as a reviewer for conferences such as ACL, EACL, NAACL, EMNLP and WWW.
Speaker: Prof. Oliver Staadt, University of Rostock, Germany
Host: Renato Pajarola, IFI, UZH
Tele-Immersion technology enables users at geographically distributed sites to collaborate in real time in a shared, simulated environment as if they were in the same physical space. This new paradigm for human-computer interaction is the ultimate synthesis of networking and media technologies. Prior projects, such as blue-c and the National Tele-Immersion Initiative, have developed prototype systems that addressed some of the challenges, but also highlighted unresolved issues and the need for further research. In this talk, I will present our vision of a novel compact tele-immersion system that will eventually support bidirectional communication and interaction between users located at more than two geographic sites.
Prof. Dr. Oliver Staadt is a full professor of computer science at the University of Rostock and an adjunct professor of computer science at UC Davis. He received a Master of Science in computer science and a PhD in computer science from TU Darmstadt and ETH Zurich, respectively. Prior to joining the University of Rostock, he was an Assistant Professor of computer science at the University of California, Davis, where he was also the director of the Virtual Reality Laboratory. His research interests include computer graphics, virtual reality, tele-immersion, visualization, and multiresolution analysis. He serves as a member of international program committees of many graphics, VR, and visualization conferences. Dr. Staadt is associate editor of Computers & Graphics and co-chair of the program committees of the EG/IEEE Symposium on Point-Based Graphics (PBG) 2008 and the Fourth International Symposium on 3D Data Processing, Visualization and Transmission (3DPVT) 2008. He is a member of ACM, ACM SIGGRAPH, the IEEE Computer Society, and the Eurographics Association.
Speaker: Prof. Colin Atkinson, University of Mannheim, Germany
Host: Martin Glinz
Although they differ significantly in how they decompose and conceptualize software systems, all advanced software engineering paradigms significantly increase the number of views involved in visualizing a system. Managing these different views can be challenging even when a paradigm is used independently, but when several are used together the number of views and inter-dependencies quickly becomes overwhelming. In this talk Colin Atkinson will present a new approach, referred to as Orthographic Software Modelling (OSM), for organizing and generating the different views used in advanced software engineering methods. It provides a simple metaphor for integrating different development paradigms and for leveraging domain-specific languages in software engineering projects. Development environments that support OSM essentially raise the level of abstraction at which developers interact with their tools by hiding the idiosyncrasies of specific editors' storage choices and view organization policies. The overall benefit is to significantly simplify the use of advanced software engineering methods.
Colin Atkinson holds the chair of Software Engineering at the University of Mannheim and is joint leader of its interdisciplinary Mobile Business research group. Before that he was a professor at the University of Kaiserslautern and a project leader at the affiliated Fraunhofer Institute for Experimental Software Engineering. From 1991 until 1997 he was an Assistant Professor of Software Engineering at the University of Houston-Clear Lake. His research interests are focused on the use of advanced software engineering approaches, such as model-driven development, service-oriented architectures and product line engineering, in the development of dependable computing systems. He received his Ph.D. and M.Sc. in computer science from Imperial College, London, in 1990 and 1985 respectively, and received his B.Sc. in Mathematical Physics from the University of Nottingham in 1983.
Speaker: Dr. Lorenz Hilty, Technology and Society Lab, Empa, St.Gallen, Switzerland
Host: Harald Gall, IFI, UZH
Information and Communication Technology (ICT) has the potential to change production and consumption processes, as well as our perception of reality. The design and use of ICT can therefore raise ethical issues. One central ethical challenge of our time is to reconcile intra- with intergenerational justice in a world of finite natural resources. The vision of how this dilemma could be solved has been called "sustainable development", defined by the Brundtland Commission in 1987 as a development that "meets the needs of the present without compromising the ability of future generations to meet their own needs". In my talk, I will discuss the relationship between ICT impacts and sustainable development. Is ICT part of the problem (e.g. due to its energy consumption) or part of the solution (e.g. by dematerializing the economy) or both? Two analytical frameworks will be presented which will help us to assess the contribution of ICT applications to sustainability: (1) a conceptual framework which helps us to analyse ICT effects on societal metabolism (the mass and energy throughput of society), and (2) the Socio-Technical Interaction Network (STIN) approach, which was originally created by the late Rob Kling to avoid technological determinism and conflicts when introducing ICT in socio-technical systems. Both approaches can serve computer scientists and application developers as tools for reflection on the sustainability of ICT applications.
Lorenz Hilty is Head of the Technology and Society Lab at Empa, the Swiss Federal Laboratories for Materials Testing and Research. From 1998 to 2005, he was Professor of Computer Science at the University of Applied Sciences Northwestern Switzerland. In parallel, he headed Empa's research program "Sustainability in the Information Society", funded by the ETH Board, from which the Technology and Society Lab emerged in 2004. Lorenz Hilty studied Informatics and Psychology at the University of Hamburg, where he also received his PhD and habilitation in informatics. His research focuses on environmental and social aspects of Information and Communication Technologies (ICT), and on the ethical implications of the convergence of information, nano- and biotechnology and cognitive science (NBIC).
Speaker: Prof. Alois Ferscha, University of Linz, Austria
Host: Rolf Pfeifer, IFI, UZH
Challenged by (i) technological progress (such as advances in sub-micron and system-on-a-chip designs, or the miniaturization of sensor/actuator systems, wireless communication technologies and micro-electromechanical systems) and (ii) the vast pervasion of global networks, Cooperative Systems research today addresses a whole new cosmos of research issues that goes well beyond the traditional computer-mediated "Person-to-Person" technologies, towards self-coordinated "Networks-of-Things" technologies. In this presentation, from a Pervasive Computing viewpoint, I will outline aspects of a new Cooperative Systems research agenda, as raised by the miniaturization and invisible integration of computing, communication and software technology into everyday objects like appliances, commodities, machinery, tools and environments. I will then address in depth the issue of spatially constrained spontaneous interaction among Cooperative Digital Artefacts. An explicit model of spatial proximity is investigated, based on geometric descriptions of the "Zones of Influence" (ZoI) of artefacts, enabling their interactions to be triggered by real-time verification of relations among ZoIs (intersection, inclusion).
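The abstract does not specify the geometry of a Zone of Influence; a minimal sketch, assuming circular 2-D zones (the class name and the example artefacts are illustrative assumptions), of how the intersection and inclusion relations between ZoIs could be verified:

```python
from math import hypot

class ZoneOfInfluence:
    """A circular 2-D zone of influence around an artefact (hypothetical model)."""
    def __init__(self, x, y, radius):
        self.x, self.y, self.radius = x, y, radius

    def intersects(self, other):
        # Two circles intersect when the distance between their centres
        # does not exceed the sum of their radii.
        return hypot(self.x - other.x, self.y - other.y) <= self.radius + other.radius

    def includes(self, other):
        # One circle contains another when the distance between centres
        # plus the smaller radius fits within the larger radius.
        return hypot(self.x - other.x, self.y - other.y) + other.radius <= self.radius

# Illustrative artefacts: interaction could be triggered when zones intersect.
phone = ZoneOfInfluence(0.0, 0.0, 5.0)
display = ZoneOfInfluence(3.0, 4.0, 2.0)   # centre is 5 units from the phone
phone.intersects(display)  # True: 5 <= 5 + 2
phone.includes(display)    # False: 5 + 2 > 5
```

A runtime system would re-evaluate these relations as artefacts move, firing an interaction event on each transition (e.g. when `intersects` becomes true).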
Univ. Prof. Alois Ferscha was with the Department of Applied Computer Science at the University of Vienna at the levels of assistant and associate professor (1986-1999). In 2000 he joined the University of Linz as full professor, where he heads the Excellence Initiative "Pervasive Computing", the Department of Pervasive Computing, the Research Studio Pervasive Computing Applications (as part of ARC Austrian Research Centers, Seibersdorf) and RIPE (Research Institute of Pervasive Computing). Ferscha has published more than a hundred technical papers on topics related to parallel and distributed computing. Currently he focuses on Pervasive and Ubiquitous Computing, Embedded Software Systems, Wireless Communication, Multiuser Cooperation, Distributed Interaction and Distributed Interactive Simulation. He has been a visiting researcher at the Dipartimento di Informatica, Universita di Torino, Italy; at the Dipartimento di Informatica, Universita di Genoa, Italy; at the Computer Science Department, University of Maryland at College Park, College Park, Maryland; and at the Department of Computer and Information Sciences, University of Oregon, Eugene, Oregon, U.S.A.
Speaker: Dr. Bernardo Huberman, HP Labs, USA
Host: Burkhard Stiller, David Hausheer, IFI, UZH
The past decade has witnessed a momentous transformation in the way people interact and exchange information with each other. Content is now co-produced, shared, classified, and rated on the Web by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. I will describe our research on the interplay between popularity, novelty and collective attention in the Web, as well as the role that attention plays in solving the tragedy of the digital commons.
Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He is also a Consulting Professor in the Department of Applied Physics at Stanford University. For a number of years he worked in statistical physics and dynamical systems and then moved on to study large distributed systems, both artificial and human. In that context he designed and implemented market mechanisms for resource allocation, and investigated the phenomenon of cooperation in large groups and organizations. His research into the phenomenon of the web led to the discovery of a number of strong regularities, which are described in his book "The Laws of the Web: Patterns in the Ecology of Information" (MIT Press). Presently, his work centers on the phenomenon of social attention and the design of novel mechanisms for discovering and aggregating information in distributed systems.
Speaker: Prof. Geoff Sutcliffe, University of Miami, USA
Host: Michael Hess, Norbert Fuchs, IFI, UZH
This talk gives an overview of activities and products that stem from the Thousands of Problems for Theorem Provers (TPTP) problem library for Automated Theorem Proving (ATP) systems. These include the TPTP itself, the Thousands of Solutions from Theorem Provers (TSTP) solution library, the TPTP language, the CADE ATP System Competition (CASC), tools such as my semantic Derivation Verifier (GDV) and the Interactive Derivation Viewer (IDV), meta-ATP systems such as the Smart Selective Competition Parallelism (SSCPA) system and the Semantic Relevance Axiom Selection System (SRASS), online access to automated reasoning systems and tools through the SystemOnTPTP web service, and applications in various domains. Current work extending the TPTP to higher-order logic will be introduced.
Geoff Sutcliffe is an Associate Professor, and Director of Undergraduate Studies, in the Department of Computer Science at the University of Miami. He received a BSc(Hons) and MSc from the University of Natal, and a PhD in Computer Science from the University of Western Australia. His research is in the area of Automated Reasoning, particularly in the evaluation and effective use of automated reasoning systems. His most prominent achievements are: the first ever development of a heterogeneous parallel deduction system, eventually leading to the development of the SSCPA automated reasoning system; the development and ongoing maintenance of the TPTP problem library, which is now the de facto standard for testing first-order automated reasoning systems; the development and ongoing organization of the CADE ATP System Competition, the world championship for first-order automated reasoning systems; and the specification of the TPTP language standards for automated reasoning tools. The research has been supported by grants from the German Ministry for Research, the Australian Research Council, the European Union, and also by internal university grants from Edith Cowan University, James Cook University, and the University of Miami. The research has produced over 60 journal and conference papers. Additionally, he has been guest editor of several special journal issues on topics in automated reasoning.
He has contributed to the automated reasoning and artificial intelligence communities as the conference chair of the 14th and 19th International Conferences on Automated Deduction (CADE), program co-chair of the 12th International Conference on Logic for Programming Artificial Intelligence and Reasoning (LPAR), program co-chair of the 19th and 20th International FLAIRS Conferences, chair of the 21st FLAIRS Conference, co-founder and organizer of the "ES*" series of workshops on Empirically Successful Automated Reasoning, and as a regular program committee member and reviewer for automated reasoning and artificial intelligence journals and conferences. He is currently a CADE trustee and the vice-president of FLAIRS. As a faculty member he has supervised and examined several graduate theses, serves on the University and College curriculum committees at the University of Miami, and is the PI of a $467,575 NSF grant providing scholarships for students taking Computer Science or Mathematics as a second major.
Speaker: Dr. Jacky Mallett, Sony Techsoft, Belgium
Host: Burkhard Stiller, IFI, UZH
Real-time distributed computing can be described as the problem of organizing large numbers of autonomous computers to exchange messages and participate in shared tasks. Coincidentally, this description, with appropriate modifications for much lower speeds and rather less reliability, also fits many of the activities we perform as participants in social and economic organizations. Can the study of distributed computer systems, then, be useful in understanding human society? In this talk we will discuss how concepts from packet-switched networking, such as communication latency, may provide explanations for social phenomena: the relative economic success of democratic countries, the general failure of revolutions to actually change the form of government, and the great mystery of why religions and martial arts clubs are so prone to schisms.
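The talk does not specify a model, but the latency analogy can be made concrete with a toy sketch: how many communication rounds a message needs to reach every member of an organization depends strongly on its topology (the breadth-first propagation function and the two example topologies below are illustrative assumptions, not from the talk):

```python
def propagation_rounds(adjacency, source):
    """Number of synchronous rounds for a message from `source`
    to reach every node, via breadth-first propagation."""
    seen = {source}
    frontier = [source]
    rounds = 0
    while len(seen) < len(adjacency):
        frontier = [n for cur in frontier
                    for n in adjacency[cur] if n not in seen]
        seen.update(frontier)
        rounds += 1
    return rounds

n = 8
# A chain: each member talks only to immediate neighbours (high latency).
chain = {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}
# A clique: every member talks to every other (low latency).
clique = {i: [j for j in range(n) if j != i] for i in range(n)}

chain_rounds = propagation_rounds(chain, 0)    # 7 rounds end to end
clique_rounds = propagation_rounds(clique, 0)  # 1 round
```

In the analogy, organizations whose communication graph looks more like the clique propagate information, and can react, far faster than chain-like hierarchies.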
Dr. Jacky Mallett is a Team Leader at Sony Techsoft, Belgium, responsible for research and development of peer-to-peer technologies for consumer electronics. Prior to 1998 she worked as a wide area network troubleshooter for Nortel Networks. From 1998 to 2005 she was a research assistant at the Massachusetts Institute of Technology (MIT), U.S.A., studying the problem of image analysis using co-operating robotic cameras, and obtained a Ph.D. for work on the problem of group-based co-operation in 2005. Since noticing similarities between the constraints on robotic organization in applications with high CPU capacity but relatively low bandwidth, and some of the more puzzling behaviors commonly seen in corporate organizations, she has been working on applying distributed computing theory to human social and economic organization.