The 2015 IFI Summer School is a week-long event for graduate students and research assistants in informatics and related fields, where invited experts teach day-long courses on a variety of topics in computer science.
The summer school will take place June 22-26, 2015 at the University of Zurich BIN (Department of Informatics, Binzmühlestrasse 14, 8050 Zürich). The courses will be held in two parallel sessions in rooms 2.A.01 and 2.A.10 from 9:00 - 17:00 (check-in starts at 8:45am) with coffee and lunch breaks. PLEASE NOTE: Dr. Angerer's course will take place in room 0.B.04!
|Mon, June 22||Principles of Transactional Memory||Prof. Rachid Guerraoui||0.5 Doctoral|
|Mon, June 22||Understanding research paradigms and Intro to qualitative research methods||Prof. Geraldine Fitzpatrick||0.5 Methodology|
|Tue, June 23||Simulation and Optimisation for Systems Planning and Management||Prof. Andrea Rizzoli||0.5 Methodology|
|Tue, June 23||SNA and its usage in IS and Software Engineering||Prof. Kevin Crowston||0.5 Methodology|
|Wed, June 24||Programming GPUs with CUDA and OpenACC||Dr. Christoph Angerer||0.5 Methodology|
|Wed, June 24||Big Data Programming and Algorithms||Prof. Kunpeng Zhang||0.5 Methodology|
|Thurs, June 25||Computer Supported Collaborative Work||Prof. Margaret-Anne Storey||0.5 Doctoral|
|Thurs, June 25||Designing Combinatorial Market-based Systems: Theory and Practice||Prof. Ben Lubin||0.5 Doctoral|
|Fri, June 26||User-Centric Assessment and Performance Evaluation of Internet Video Delivery||Prof. Tobias Hossfeld||0.5 Methodology|
|Fri, June 26||Resilience Mechanisms for Improved Safety and Security||Prof. Christof Fetzer||0.5 Doctoral|
|08:45 - 09:00||Check-in|
|09:00 - 10:15||Instruction|
|10:15 - 10:45||Coffee break|
|10:45 - 12:00||Instruction|
|12:00 - 13:00||Lunch (@mensa, not included in cost)|
|13:00 - 15:00||Instruction|
|15:00 - 15:30||Coffee break|
|15:30 - 17:00||Instruction|
All registered students are also invited to attend the summer school social event, which will take place on Wednesday, June 24th directly following the course. Details to follow.
The Summer School is open to doctoral students in computer science and related fields from the University of Zurich as well as other universities. Registration is free for IfI research assistants and IfI doctoral students. For all other students, fees are 90 CHF for the entire five-day summer school, or 20 CHF for individual courses. Attendance will be capped at 40 people per course.
Preference will be given to IfI doctoral students and research assistants; other participants will be admitted on a first-come, first-served basis.
Registration is now open and will close on Monday, 15th June 2015.
The fee will be paid on site in cash only at the Check-in desk outside the classrooms every day between 8:45 and 9:00am.
UZH students can find the ECTS credits awarded by each course in the overview above. Non-IfI students who would like to acquire credits should first talk to the person in charge of credit transfer at their school to find out whether it accepts or recognizes the ECTS credits awarded by IfI at UZH.
Principles of Transactional Memory
Instructor: Prof. Rachid Guerraoui
The hardware evolution is clearly towards multicore architectures, and it may well become the case that the only way for programmers to speed up their applications is by parallelizing them. Yet this is not easy, and simple programming abstractions for synchronizing concurrent tasks are badly needed. The Transactional Memory (TM) paradigm is an appealing candidate abstraction. It is argued to be as easy to use as coarse-grained locking and as efficient as hand-crafted, fine-grained locking.
Not surprisingly, a large body of work has been devoted to implementing the TM paradigm and exploring its ramifications. However, very little work has been dedicated to exploring the fundamental principles of the TM concept. Without these, it is simply impossible to establish the correctness of a TM algorithm, compare two algorithms, or determine whether certain performance limitations are inherent to the model or simply the result of an implementation artifact. This tutorial will recall the TM paradigm, summarize the results of some of the fundamental work achieved so far on TM, and open new research perspectives for establishing TM correctness properties and verifying them.
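To give a feel for the paradigm, here is a minimal, hypothetical sketch of an optimistic software TM in Python: a transaction reads a consistent snapshot, buffers its writes, and commits atomically only if no other transaction committed in the meantime (otherwise it retries). This is a deliberately coarse simplification for illustration, not any particular TM algorithm from the course.

```python
import threading

class TM:
    """Minimal optimistic software TM sketch (illustrative only):
    writes are buffered and applied atomically at commit time; any
    concurrent commit is treated as a conflict and forces a retry."""
    def __init__(self):
        self._lock = threading.Lock()
        self._version = 0   # global version clock
        self._store = {}    # shared memory: name -> value

    def atomic(self, txn):
        """Run txn(read, write) until it commits without conflict."""
        while True:
            with self._lock:
                start_version = self._version
                snapshot = dict(self._store)
            writes = {}
            read = lambda k: writes.get(k, snapshot.get(k, 0))
            write = lambda k, v: writes.__setitem__(k, v)
            txn(read, write)
            with self._lock:
                if self._version == start_version:  # no concurrent commit
                    self._store.update(writes)
                    self._version += 1
                    return
            # conflict detected: abort the buffered writes and retry

tm = TM()

def transfer(read, write):
    # atomically move 10 from 'a' to 'b' -- the classic TM example
    write('a', read('a') - 10)
    write('b', read('b') + 10)

tm.atomic(lambda r, w: w('a', 100))  # initialize account 'a'
tm.atomic(transfer)
print(tm._store)  # {'a': 90, 'b': 10}
```

The programmer writes `transfer` as if it ran alone; the conflict detection and retry logic, which fine-grained locking would force into the application code, lives entirely in the runtime.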
Understanding research paradigms and Intro to qualitative research methods
Instructor: Prof. Geraldine Fitzpatrick
This course will consist of two parts. In the introductory part we will discuss research paradigms, "the set of common beliefs and agreements between scientists about how problems should be understood and addressed" (Kuhn 1962). Research paradigms are important because they set up what we think can be known (and how we frame our research questions and claims) and how it can be known (what methods we choose). This is important for your methodology chapter to frame and justify the overall approach of a research thesis. It is also important for understanding and valuing that there is a diversity of paradigms and respecting that different researchers can make different albeit equally legitimate choices.
The second part of the course will focus on qualitative research methods, most often used in a constructionist/interpretivist paradigm but also possible to apply across different paradigms. As applications become more complex and open-ended, the limitations of traditional software requirements techniques become evident and there is an increasing shift in both industry and academia to qualitative methods. This course will provide a practical introduction to qualitative user research methods for the purposes of technology design and evaluation. Topics will include: core qualitative methods of observation and interview; planning and conducting fieldwork; analysing qualitative data; moving from data to design; writing up qualitative research. Case studies from both research and industry will be used to illustrate the types of insights that can be gained. Participants will also have the opportunity to apply the methods in practical exercises.
Simulation and Optimisation for Systems Planning and Management
Instructor: Prof. Andrea Rizzoli
Simulation models must be available at different levels of complexity: from the simplest, to efficiently synthesise control policies by means of optimisation algorithms, to the most complex, to deliver trustworthy “virtual worlds” for the exploration of alternative solutions and scenarios.
In this mini course we will examine how simulation and optimisation work hand in hand for the solution of complex planning and management problems.
SNA and its usage in IS and Software Engineering
Instructor: Prof. Kevin Crowston
Social network analysis (SNA) is a rapidly growing field of study and an increasingly popular tool for studying the ties or links between people, organizations, and phenomena. It is both an approach and a tool for uncovering and understanding the connections that drive phenomena involving interaction across a network. SNA techniques have been deployed to uncover and visualize hidden patterns in groups as diverse as academic communities and terrorist networks, and for phenomena as diverse as correlating performance with creativity and predicting who will be the next US president.
This class will provide a brief introduction to the concepts, methods and data analysis techniques of social network analysis. The course begins with a general introduction to the goals and perspectives of network analysis. This is followed by a practical discussion of network data, covering issues of collection, validity, visualization, and mathematical/computer representation. We then take up the methods of detection and description of structural network properties such as centrality, cohesion, subgroups, cores, or roles. After the course, participants will understand what it means to examine data in a 'social network way'.
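As a taste of one of the structural properties mentioned above, here is a toy sketch computing normalized degree centrality on a small, made-up collaboration network. Real analyses would use a dedicated library (e.g. NetworkX), but the core idea fits in a few lines.

```python
# Hypothetical collaboration network: node -> set of nodes it is tied to.
network = {
    'alice': {'bob', 'carol', 'dave'},
    'bob':   {'alice', 'carol'},
    'carol': {'alice', 'bob'},
    'dave':  {'alice'},
}

n = len(network)
# Normalized degree centrality: the fraction of the other n-1 nodes
# that each node is directly tied to.
centrality = {node: len(ties) / (n - 1) for node, ties in network.items()}

for node, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {c:.2f}")
```

Here 'alice' is tied to all three other nodes (centrality 1.00), while 'dave' has a single tie (0.33): even this simplest measure already distinguishes central brokers from peripheral members.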
Programming GPUs with CUDA and OpenACC
Instructor: Dr. Christoph Angerer
Massively parallel computing is expanding rapidly: the extreme parallelism provided by GPUs is essential for many modern applications, from game graphics on the smallest smartphones to long-running scientific simulations on the largest supercomputers. In this hands-on lab, we will explore state-of-the-art programming techniques using the CUDA and OpenACC paradigms and will dive into optimization, profiling, and debugging methods for modern GPU architectures. A basic understanding of C or C++ will be helpful, but no prior experience with GPUs is required.
Big Data Programming and Algorithms
Instructor: Prof. Kunpeng Zhang
The development of Web 2.0 and related technologies has led to an exponential increase in various types of user-generated content, including textual and networked information. Finding meaningful nuggets of knowledge in such big and diverse data has attracted a lot of attention. Hadoop and MapReduce, a well-known distributed environment and computing framework, have been widely and successfully deployed in many domains, particularly in the fields of business, healthcare, and social media. Many scalable machine-learning algorithms, such as clustering, association rule mining, recommender systems, topic modeling, and network analysis, have been proposed and implemented in open-source packages (e.g., Apache Mahout).
In this course, we plan to cover the following materials.
1) Introduction to big data: basic concepts, characteristics, challenges, and applications.
2) Distributed Hadoop system: architecture, installation and configuration.
3) MapReduce programming: the basic MapReduce framework and the advanced MapReduce framework (customized components). Two examples will show how to use MapReduce to solve real, large-scale problems.
4) Algorithms: implementing three algorithms using MapReduce, including classic clustering (K-means), item-based collaborative filtering, and PageRank. In addition, we have hands-on exercises on (1) pseudo-distributed Hadoop installation and configuration; (2) MapReduce implementations: word count example and single-source shortest path finding.
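The word-count exercise mentioned above can be sketched in plain Python, with no Hadoop cluster needed: the `map_phase` and `reduce_phase` functions below have the same shape as the map and reduce functions one writes for Hadoop, while `shuffle` plays the role of the framework's grouping step between them.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # map: emit a (word, 1) pair for every word in one input split
    return [(word, 1) for word in document.lower().split()]

def shuffle(pairs):
    # shuffle: group values by key, as the framework does between
    # the map and reduce phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # reduce: aggregate all values emitted for one key
    return (key, sum(values))

documents = ["big data programming", "big data algorithms"]
pairs = chain.from_iterable(map_phase(d) for d in documents)
counts = dict(reduce_phase(k, vs) for k, vs in shuffle(pairs).items())
print(counts)  # {'big': 2, 'data': 2, 'programming': 1, 'algorithms': 1}
```

Because each map call sees only one document and each reduce call only one key, both phases can be spread across machines; that independence is what makes the pattern scale.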
Computer Supported Collaborative Work
Instructor: Prof. Margaret-Anne Storey
Computer Supported Collaborative Work (CSCW) concerns the design, use and evaluation of technologies that support teams, groups and communities. It is an interdisciplinary topic that addresses both the technical and social aspects of collaboration technology. In this one-day course, students will gain an understanding of the theoretical underpinnings of CSCW while exploring a wide range of existing collaboration and social technologies that are used in the domains of software engineering and education. The course will be highly collaborative, requiring students to work with each other through the use of social technologies and through small group discussions.
Designing Combinatorial Market-based Systems: Theory and Practice
Instructor: Prof. Ben Lubin
This course will cover several key issues in the design and implementation of combinatorial markets, such as those used in the many recent billion dollar government spectrum auctions. We will first briefly review the motivation for such auctions and their design. The next part of the class will be focused on the particular problem of designing bidding languages for auctions that are both expressive and concise. We will then turn to the problem of pricing these auctions, with a specific focus on a surprising recent line of research that has established a connection between the pricing of combinatorial markets and kernel-based classifiers in machine learning (ML). Students will complete a hands-on exercise to help them rapidly understand the construction necessary for these ML techniques. We will then explain how to deploy such ML techniques in creating auctions of both theoretical and practical interest.
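To make the core computational problem concrete, here is a toy sketch of winner determination in a combinatorial auction with hypothetical bids: choose the set of non-overlapping bundle bids that maximizes total value. The brute-force search below is exponential and purely illustrative; real spectrum auctions rely on integer programming instead.

```python
from itertools import combinations

# Hypothetical bids: (bundle of items wanted together, offered price).
bids = [
    ({'A'}, 6),
    ({'B'}, 5),
    ({'A', 'B'}, 10),  # a bundle bid: wants A and B together or not at all
    ({'C'}, 3),
]

def best_allocation(bids):
    """Brute-force winner determination: try every subset of bids and
    keep the highest-value one in which no item is sold twice."""
    best_value, best_set = 0, []
    for r in range(1, len(bids) + 1):
        for subset in combinations(bids, r):
            items = [i for bundle, _ in subset for i in bundle]
            if len(items) == len(set(items)):  # bundles must not overlap
                value = sum(price for _, price in subset)
                if value > best_value:
                    best_value, best_set = value, list(subset)
    return best_value, best_set

value, winners = best_allocation(bids)
print(value)  # 14: selling A, B, C separately beats the {'A','B'} bundle
```

Even in this four-bid example the auctioneer must reason over all combinations; expressive bidding languages and clever pricing schemes, the subjects of this course, exist precisely to tame this combinatorial structure.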
User-Centric Assessment and Performance Evaluation of Internet Video Delivery
Instructor: Prof. Tobias Hossfeld
The purpose of the course is to develop an understanding of how to evaluate the performance of Internet applications from a user-centric point of view. As an example, we consider Internet video delivery, where users interact, e.g. by posting video links to friends, who then watch the videos. The focus is on the methodology: how to measure Quality of Experience (QoE) by means of crowdsourcing, and how to analyze the subjective results in a statistically sound way. As a result, key QoE influence factors can be derived to formulate a QoE model.
The performance evaluation of Internet video delivery must then consider the network conditions as well as the video characteristics. This makes it possible to quantify the user-perceived quality. To this end, an optimal playout strategy is derived. The video buffer at the application layer is modeled as a simple queueing system, which is analyzed with common queueing-theory methods. Together with the QoE model, a user-centric evaluation is possible.
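As an illustration of the "video buffer as a simple queueing system" idea, the sketch below evaluates textbook M/M/1 formulas. The M/M/1 model and the example rates are assumptions made here for illustration; the course may well use a different buffer model.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Mean number in system and mean time in system for an M/M/1 queue."""
    rho = arrival_rate / service_rate           # utilization, must be < 1
    assert rho < 1, "queue is unstable"
    mean_in_system = rho / (1 - rho)            # L = rho / (1 - rho)
    mean_time = mean_in_system / arrival_rate   # Little's law: W = L / lambda
    return mean_in_system, mean_time

# e.g. video segments arriving at 8 per second, played out at 10 per second
L_, W = mm1_metrics(8.0, 10.0)
print(L_, W)  # 4.0 segments buffered on average, 0.5 s mean time in system
```

The interesting QoE connection is the tension the model exposes: a fuller buffer means fewer playback stalls, but also a longer initial delay, which is exactly the trade-off an optimal playout strategy must balance.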
The course will be organized as a whole-day event. Lectures with integrated exercises will be given to interactively implement the theory into practice for selected examples.
Contents of the course
· Introduction to Performance Evaluation of Internet Applications
· Crowdsourced Quality of Experience
· Statistical Analysis of Crowdsourced Data
· Modeling a video buffer with Queueing Systems
· Optimization problem for QoE optimal video playout
· Network analysis of video graphs
Resilience Mechanisms for Improved Safety and Security
Instructor: Prof. Christof Fetzer
The dependability of systems, in particular their availability, integrity, and confidentiality, can be impacted by both software and hardware issues. For example, hardware reliability issues already lead to diminishing performance returns when transitioning to smaller CMOS gate lengths. Bugs at both the hardware and software layers can degrade dependability.
In this course, I will introduce the approach and the resilience mechanisms that we are currently investigating in the context of cfaed (a German excellence cluster with one focus on resilience) and SERECA (an H2020 project on cloud security). The main objective of these projects is to reduce the cost of resilience. The focus of this course will therefore be on resilience mechanisms that address both security and safety. Moreover, one of the goals of these projects is to be able to build safety-critical systems on top of commodity hardware and software. In particular, we are investigating resilience mechanisms that permit us to offload some safety-critical software components from cars to a cloud.
Prof. Rachid Guerraoui (EPFL)
Rachid Guerraoui (lpdwww.epfl.ch) is a professor at EPFL, where he leads the Distributed Programming Laboratory. He is an ACM Fellow and the winner of an advanced ERC grant and a Google Focused Award. He has been affiliated in the past with HP Labs in California and MIT.
Prof. Geraldine Fitzpatrick (TU Wien)
Geraldine Fitzpatrick is Professor of Technology Design and Assessment and leads the Human Computer Interaction group in the Institute for Technology Design and Assessment at Vienna University of Technology. She was previously Director of the Interact Lab at the University of Sussex, a User Experience consultant at Sapient London, and a Senior Researcher at the Distributed Systems Technology Centre and Centre for Online Health in Australia. Her research is at the intersection of social and computer sciences, using mobile, tangible and sensor-based technologies in everyday contexts of work, play and daily life. Particular interest areas include supporting: collaboration and social interaction; smart spaces; social and emotional skills learning; health, self-care and well-being; sustainability; and active engagement for older people. She has extensive experience in inter-disciplinary research projects in these areas, contributing expertise in the application of qualitative and mixed methods approaches to the design and evaluation of technologies in situ.
Prof. Andrea Rizzoli (SUPSI)
Dr Andrea E. Rizzoli is a professor at SUPSI and a senior researcher of IDSIA. His interests are in simulation and in decision support systems and he has been involved in the development of various simulation models in a number of different contexts, from natural resources modelling to logistics and industrial systems.
Prof. Kevin Crowston (Syracuse University)
Kevin Crowston is a Distinguished Professor of Information Science in the School of Information Studies at Syracuse University. He received his Ph.D. (1991) in Information Technologies from the Sloan School of Management, Massachusetts Institute of Technology (MIT). His research examines new ways of organizing made possible by the extensive use of information and communications technology. Specific research topics include the development practices of Free/Libre Open Source Software teams and work practices and technology support for citizen science research projects, both with support from the US National Science Foundation.
Dr. Christoph Angerer (NVIDIA)
Dr. Christoph Angerer is a developer in NVIDIA's European Developer Technology team. Based in Zurich, Switzerland, he works with developers accelerating applications on GPUs. He holds a Ph.D. in computer science from ETH Zurich in Switzerland.
Prof. Kunpeng Zhang (University of Illinois)
Kunpeng Zhang (KZ) is currently an Assistant Professor at the University of Illinois at Chicago and will join the University of Maryland, College Park in August 2015. He received his Ph.D. in Computer Science from Northwestern University in 2013. He works on large-scale data analysis, with particular focus on mining social media, business, and healthcare data through machine learning, social network analysis, and natural language processing techniques. He has published many conference and journal papers, serves on the program committees of many international conferences, and is currently an associate editor of the Electronic Commerce Research journal.
Prof. Margaret-Anne Storey (University of Victoria)
Margaret-Anne (Peggy) Storey is a professor of computer science at the University of Victoria and a Canada Research Chair in Human Computer Interaction for Software Engineering. Her research goal is to understand how technology can help people explore, understand and share complex information and knowledge. She applies and evaluates techniques from knowledge engineering, social software and visual interface design to applications such as collaborative software development, program comprehension, biomedical ontology development, and learning in web-based environments.
Prof. Ben Lubin (Boston University School of Management)
After receiving his Bachelor's degree in Computer Science from Harvard University in 1999, Dr. Lubin joined BBN Technologies, the research and development firm where the first internet routers in the world were developed, working on advanced multi-agent modeling, scheduling and logistics systems. After six years in industry, he returned to Harvard University to pursue a Ph.D. at the intersection of computer science, game theory and economics. He is now an Assistant Professor in the Information Systems department of the Boston University School of Management. His research is in three primary areas: (1) mechanism design, especially of combinatorial exchanges, mechanisms that support efficient reallocation of goods when participants have complex preferences regarding bundles of items, (2) the use of spectral graph theory to advance the analysis of social networks, and (3) applications of network science and machine learning to understanding and improving the healthcare delivery system. Dr. Lubin is a recipient of the Siebel fellowship and a Yahoo Key Technical Challenge award. Portions of his research are funded by NIHCM and the Veterans Administration.
Prof. Tobias Hossfeld (University of Duisburg-Essen)
Tobias Hoßfeld has been professor and head of the Chair "Modeling of Adaptive Systems" at the University of Duisburg-Essen, Germany, since 2014. He finished his PhD in 2009 and his professorial thesis (habilitation) "Modeling and Analysis of Internet Applications and Services" in 2013 at the University of Würzburg, Chair of Communication Networks, where he also headed the "Future Internet Applications & Overlays" research group. He has published more than 100 research papers in major conferences and journals, receiving 5 best conference paper awards, 3 awards for his PhD thesis, and the Fred W. Ellersick Prize 2013 (IEEE Communications Society) for one of his articles on QoE. He is a member of the advisory board of the ITC conference and the editorial board of IEEE Communications Surveys & Tutorials.
Prof. Christof Fetzer (TU Dresden)
Prof. Dr. Christof Fetzer received his diploma in Computer Science from the University of Kaiserslautern, Germany, and his Ph.D. from UC San Diego (1997). He joined AT&T Labs Research in 1999 and was a principal member of technical staff until 2004. Since then, he has headed the endowed chair (Heinz Nixdorf endowment) in Systems Engineering in the Computer Science Department at TU Dresden, Germany. He chairs the Distributed Systems Engineering International Masters Program at the Computer Science Department and leads the resilience path of the Center for Advancing Electronics Dresden (cfaed, https://cfaed.tu-dresden.de/index.php/resilience.html) and the H2020 SERECA project (http://www.serecaproject.eu). Prof. Dr. Fetzer has published more than 150 research papers in the field of dependable distributed systems and has recently won, together with his co-authors, multiple best (student) paper awards (DEBS2013, LISA2013, SRS2014, IEEE Cloud 2014, DSN). He has been a member of a large number of program committees.