Algorithmic Fairness and Opacity Group

The Algorithmic Fairness and Opacity Group (AFOG) is an interdisciplinary research group housed at the UC Berkeley School of Information, bringing together faculty, postdocs, and graduate students from Information Studies, Sociology, Law, Communication, Media Studies, Computer Science, and the Humanities, among others. We conduct research, provide education, develop policy, build systems, bring theory into practice, and bridge disciplinary boundaries. Throughout our efforts, we center human values in the design and use of technical systems to support more equitable and just societies.

Learn More

AI On the Ground

Algorithmic decision tools increasingly affect all facets of life, from high-stakes contexts like criminal justice to everyday interactions on social media. Once deployed, they become part of complex socio-technical systems whose reality often deviates from what was planned or expected. We discuss and undertake new research to understand the effects of AI systems on the ground.


Expanding the ‘Solution Space’ for Responsible AI

Designing technology to support just and equitable societies requires a broad consideration of the ‘solution space.’ We bring together scholars and practitioners with wide-ranging expertise to explore solutions that include improved algorithms, interface designs, legal reforms, better organizational policies and processes, collective action, and social movements. We also explore ways of configuring the handoffs between technical, organizational, and legal aspects of algorithmic systems to foster forms of oversight and accountability aligned with democratic norms of governance.

Education

We teach courses and host public events that draw students from across campus. We strive to teach the next generation of professionals working in non-profit, corporate, and government settings to think deeply and critically about technology, human values, and the social and political implications of technical systems.

PIT-UN: "We're bridging technology and the public interest."

Latest Announcement

AFOG receives a grant for Public Interest Tech

Awarded by New America's Public Interest Technology University Network (PIT-UN), the grant will fund a series of hands-on workshops, along with a lecture series, career pathways panels, and lunches, with two campus academic partners: Cal NERDS (“New Experiences for Research and Diversity in Science”), a program made up of several diversity-focused STEM initiatives for undergraduate and graduate students, and the D-Lab, a campus initiative that supports professionalization of students and diversity in data-intensive social science.

Learn More
"Refusal" by Schinria Islam-Zhu

Latest Announcement

The Refusal Conference - Outcomes

Recordings of the public sessions and our refusal reading list are now available on the conference page. We will build on the Refusal theme over the next few months, using it to inform our public programming, research, actions, and partnerships.

Learn More

WHAT WE’RE READING

Are surveillance capitalists behaviorists? Does it matter? No, and maybe.

Shreeharsh Kelkar

Many critiques of the tech industry seem to take for granted that Silicon Valley engineers are behaviorists. But these engineers see themselves less as designers of stimulus-response mechanisms and more as "choice architects," and thus hold a very different theory of freedom than behaviorists, a difference that matters in the ongoing fight to regulate social media.


Image: Mind control! CC-BY-2.0


Latest Recording

October 14, 2020

The Refusal Conference

A virtual conference intended to 'meet the moment' by exploring organized technology refusal from historical and contemporary vantage points.

Our partners work with us to examine topics in fairness and opacity. If you are interested in becoming a partner, please contact us at afog@berkeley.edu.