Social Computing Group

Welcome to the Social Computing Group

The Social Computing group is led by Prof. Aniko Hannak. Our research focuses on how digitization affects people and society, especially on problems that arise from automated decision-making. In the quickly changing ecosystem of online platforms, companies track users' every move and feed the collected data into big data algorithms in order to match them with the most interesting, most relevant content. Since these algorithms learn from human data, they are likely to pick up on social biases and unintentionally reinforce them. Our projects focus both on uncovering the problems posed by these large algorithmic systems and on developing mitigation strategies and policy recommendations. Examples of past algorithmic audits include examining the "Filter Bubble effect" on Google Search, online price discrimination, inequalities in online labor markets, and inequality in open source communities.

Current Projects

Measuring Bias in Online Labor Markets

The labor economy has undergone significant structural changes in recent years. People use various online services to find employment, advertise freelance services, collaborate on projects, outsource work, and more. These online sites offer innovative mechanisms for organizing employment and hiring processes, and they may alter many of the social forces known to cause inequality in traditional labor markets. While worker protections in the traditional labor economy have been developed over hundreds of years, we are at the early stages of this process in the online context. Paradoxically, while meaningful policy making requires a good understanding of the mechanisms that create or reinforce inequalities, without regulations enforcing audits or some form of transparency it is very difficult to learn about these systems. This project combines a variety of methods, including online data collection and empirical analysis, online and field experiments, and survey-based data collection.
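To give a flavor of the empirical-analysis component, here is a minimal sketch of an audit-style regression in Python. The data file and all column names are hypothetical placeholders rather than the project's actual dataset; the idea is to test whether a perceived demographic attribute predicts worker outcomes after controlling for observable qualifications.

```python
# Hedged sketch: audit-style regression on hypothetical freelancer data.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical scraped profile data, one row per freelancer.
profiles = pd.read_csv("freelancer_profiles.csv")

# OLS of an outcome on perceived gender plus controls. A significant
# coefficient on perceived_gender, conditional on qualifications, is
# consistent with a disparity in how workers are reviewed or hired.
model = smf.ols(
    "num_reviews ~ C(perceived_gender) + experience_years"
    " + C(job_category) + avg_rating",
    data=profiles,
).fit()
print(model.summary())
```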

Gender Bias on Stack Overflow

Programming is a valuable skill in the labor market, making the underrepresentation of women in computing an increasingly important issue. Online question and answer platforms serve a dual purpose in this field: they form a body of knowledge useful as a reference and learning tool, and they provide opportunities for individuals to demonstrate credible, verifiable expertise. Issues such as male-oriented site design or the overrepresentation of men among the site's elite may therefore compound women's underrepresentation in IT. In this project we audit the differences in behavior and outcomes between men and women on Stack Overflow, the most popular of these Q&A sites, and investigate how inequalities known from the social sciences manifest in online communities, as well as their root causes.
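As a concrete illustration of this kind of audit, the sketch below compares answer acceptance rates between two groups of users. The input file and its columns are hypothetical stand-ins for features one would derive from the public Stack Overflow data dump.

```python
# Hedged sketch: comparing answer acceptance rates between two groups.
import pandas as pd
from statsmodels.stats.proportion import proportions_ztest

answers = pd.read_csv("so_answers.csv")  # hypothetical, one row per answer

# Accepted-answer count and total answers per group (assumes exactly
# two groups in the inferred_gender column).
grouped = answers.groupby("inferred_gender")["accepted"].agg(["sum", "size"])

# Two-sample z-test for a difference in acceptance rates.
stat, pval = proportions_ztest(count=grouped["sum"].to_numpy(),
                               nobs=grouped["size"].to_numpy())
print(grouped["sum"] / grouped["size"])  # acceptance rate per group
print(f"z = {stat:.2f}, p = {pval:.3f}")
```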

Simulating Income Inequality in Ride-Sharing

As they grow in popularity, ride-hailing and food-delivery services such as Uber, Lyft, Ola, or Foodora are quickly transforming urban transportation ecosystems. Despite their potential to democratize the labor market, these services are often accused of fostering unfair working conditions and low wages. In this project, we investigate the effect of algorithm design decisions on wage inequality in ride-hailing platforms. Using a simulation approach, we can overcome the difficulties stemming from both the complexity of transportation systems and the lack of data and algorithmic transparency. We calibrate our model with empirical data on the locations of drivers and passengers, traffic, the layout of the city, and the algorithm that matches requests with drivers, and then evaluate the simulation's output under various external conditions.
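A toy version of this simulation idea is sketched below. Every modeling choice here (a grid city, uniform demand, nearest-driver dispatch, fare equal to trip distance) is an illustrative assumption rather than the calibrated model described above; the point is that swapping in a different matching rule and recomputing an inequality measure such as the Gini coefficient lets one compare algorithm designs.

```python
# Hedged sketch: toy ride-hailing simulation with nearest-driver matching.
import random

def gini(values):
    """Gini coefficient of non-negative earnings (0 = perfect equality)."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    if total == 0:
        return 0.0
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * weighted / (n * total) - (n + 1) / n

def simulate(n_drivers=50, n_requests=2000, size=20, seed=0):
    rng = random.Random(seed)
    drivers = [(rng.randrange(size), rng.randrange(size)) for _ in range(n_drivers)]
    earnings = [0.0] * n_drivers
    for _ in range(n_requests):
        pickup = (rng.randrange(size), rng.randrange(size))
        dropoff = (rng.randrange(size), rng.randrange(size))
        # Matching rule under study: dispatch the driver closest to the
        # pickup point (Manhattan distance). Alternative rules go here.
        d = min(range(n_drivers),
                key=lambda i: abs(drivers[i][0] - pickup[0])
                            + abs(drivers[i][1] - pickup[1]))
        earnings[d] += abs(pickup[0] - dropoff[0]) + abs(pickup[1] - dropoff[1])
        drivers[d] = dropoff  # driver is now idle at the dropoff location
    return earnings

print(f"Gini of driver earnings: {gini(simulate()):.3f}")
```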

More details on past work can be found at Prof. Hannak's Northeastern Algorithm Auditing group page here.