The Algorithmic Fairness and Opacity Group (AFOG) is an interdisciplinary research group housed at the UC Berkeley School of Information, bringing together faculty, postdocs, and graduate students from Information Studies, Sociology, Law, Communication, Media Studies, Computer Science, and the Humanities, among others. We conduct research, teach, develop policy, build systems, bring theory into practice, and bridge disciplinary boundaries. Throughout our efforts, we center human values in the design and use of technical systems to support more equitable and just societies.
Algorithmic decision tools increasingly impact all facets of life, from high-stakes contexts like criminal justice to everyday interactions on social media. Once deployed, they become part of complex socio-technical systems. The reality of these systems often deviates from what was planned or expected. We discuss and undertake new research to understand the effects of AI systems on the ground.
Designing technology to support just and equitable societies requires a broad consideration of the ‘solution space.’ We bring together scholars and practitioners with wide-ranging expertise to explore solutions that include improved algorithms, interface designs, legal reforms, better organizational policies and processes, collective action, and social movements. We also explore ways of configuring the handoffs between technical, organizational, and legal aspects of algorithmic systems to foster forms of oversight and accountability aligned with democratic norms of governance.
We teach courses and host public events that draw students from across campus. We strive to teach the next generation of professionals working in non-profit, corporate, and government settings to think deeply and critically about technology, human values, and the social and political implications of technical systems.
Awarded by New America's Public Interest Technology University Network (PIT-UN), the grant will fund a series of hands-on workshops, along with a lecture series, career pathways panels, and lunches, with two campus academic partners: Cal NERDS (standing for “New Experiences for Research and Diversity in Science,” a program made up of several diversity-focused STEM initiatives for undergraduate and graduate students) and the D-Lab (a campus initiative that supports professionalization of students and diversity in data-intensive social science).
Recordings of the public sessions and our refusal reading list are now available on the conference page. We will be building on the Refusal theme over the next few months, using it to inform our public programming, our research, and our actions and partnerships.
WHAT WE’RE READING
Many critiques of the tech industry seem to take for granted that Silicon Valley engineers are behaviorists. But these engineers see themselves less as designers of stimulus-response mechanisms and more as "choice architects," and thus hold a very different theory of freedom than behaviorists do, a distinction that matters in the ongoing fight to regulate social media.
A virtual conference intended to "meet the moment" by exploring organized technology refusal from historical and contemporary vantage points.