The Algorithmic Fairness and Opacity Group (AFOG) is an interdisciplinary research group housed at the UC Berkeley School of Information, bringing together faculty, postdocs, and graduate students from Information Studies, Sociology, Law, Communication, Media Studies, Computer Science, and the Humanities, among others. We conduct research, teach, develop policy, build systems, bring theory into practice, and bridge disciplinary boundaries. Throughout our efforts, we center human values in the design and use of technical systems to support more equitable and just societies.
Algorithmic decision tools increasingly impact all facets of life, from high-stakes contexts like criminal justice to everyday interactions on social media. Once deployed, they become part of complex socio-technical systems, and the reality of these systems often deviates from what was planned or expected. We discuss and undertake new research to understand the effects of AI systems on the ground.
Designing technology to support just and equitable societies requires a broad consideration of the ‘solution space.’ We bring together scholars and practitioners with wide-ranging expertise to explore solutions that include improved algorithms, interface designs, legal reforms, better organizational policies and processes, collective action, and social movements. We also explore ways of configuring the handoffs between technical, organizational, and legal aspects of algorithmic systems to foster forms of oversight and accountability aligned with democratic norms of governance.
We teach courses and host public events that draw students from across campus. We strive to teach the next generation of professionals working in non-profit, corporate, and government settings to think deeply and critically about technology, human values, and the social and political implications of technical systems.
For the last three years, AFOG at UC Berkeley has benefited from the participation of several members of the Ethical AI team at Google in our working group meetings and workshops. Recently, Dr. Timnit Gebru, who co-leads the team, was fired from her position over her request for greater transparency and procedural fairness in the internal review process of the research produced by herself and her team members...
WHAT WE’RE READING
Jenna Burrell and Marion Fourcade
The pairing of massive data sets with processes—or algorithms—written in computer code to sort through, organize, extract, or mine them has made inroads in almost every major social institution. This article proposes a reading of the scholarly literature concerned with the social implications of this transformation.
A virtual conference intended to ‘meet the moment’ by exploring organized technology refusal from historical and contemporary vantage points.