The Refusal Conference

A virtual conference intended to 'meet the moment' by exploring organized technology refusal from historical and contemporary vantage points

Refusal Painting

"Refusal" by Schinria Islam-Zhu

The idea of rejecting or refusing technology runs against the grain of the celebrated role tech has generally occupied in the West, wedded closely to the notion of progress itself (Marx 1997). By this cultural logic, refusal is cast as unwise because it is anti-innovation or it is cast as impossible because technological developments are presumed to be inevitable. And yet, this view is contradicted in practice. Research directions narrow the pathways of tech development through disciplinary logics, market possibilities, and life experience. In industry, projects are frequently cancelled when they cannot generate a profit. This financial logic is a kind of value that motivates refusal. What other values currently guide refusal or could in the future? What forms of justification are useful? What practices make refusal possible? At this conference we lean into the idea that sometimes making a more just and equitable society means refusing certain technologies or certain applications of technology.

  • Dates: Oct. 14–16, 2020
  • Location: Remote

In June 2018, the Algorithmic Fairness and Opacity Working Group (AFOG) held a summer workshop with the theme “Algorithms are Opaque and Unfair: Now What?” The event was organized by Berkeley I School professors (and AFOG co-directors) Jenna Burrell and Deirdre Mulligan, postdoc Daniel Kluttz, and Allison Woodruff and Jen Gennai from Google. Our working group is generously sponsored by Google Trust and Safety and hosted at the UC Berkeley School of Information.

Inspired by questions that came up at our biweekly working group meetings during the 2017-2018 academic year, we organized four panels for the workshop. The panel topics raised issues that we felt required deeper consideration and debate. To make progress we brought together a diverse, interdisciplinary group of experts from academia, industry, and civil society in a workshop-style environment. In panel discussions, we considered potential ways of acting on algorithmic (un)fairness and opacity. We sought to consider the fullest possible range of ‘solutions,’ including technical implementations (algorithms, user-interface designs), law and policy, standard-setting, incentive programs, new organizational processes, labor organizing, and direct action.

Opening Remarks

1pm – 1:30pm (PDT)

What has been refused? A history of refusal

YouTube Recording

  • Jenna Burrell, Associate Professor, School of Information, University of California, Berkeley
  • Deirdre Mulligan, Professor, School of Information, University of California, Berkeley

Panel 1

1:40pm – 2:40pm (PDT)

Feminist Data Manifest-No

Why refusal? What is the power and potential of refusal? What traditions (academic or otherwise) does refusal build upon? As we think about refusal, what should we keep in mind (e.g., what did you learn putting together the Feminist Data Manifest-No that we should be sure to keep at the front of our minds during this conference?). What makes refusal part of a feminist analytic?
https://www.manifestno.com/

YouTube Recording

  • Marika Cifor, University of Washington
  • Patricia Garcia, University of Michigan
  • Niloufar Salehi, University of California, Berkeley (moderator)
  • Anita Say Chan, University of Illinois
  • TL Cowan, University of Toronto
  • Anna Lauren Hoffmann, University of Washington
  • Jasmine Rault, University of Toronto
  • Tonia Sutherland, University of Hawai'i at Mānoa

Keynote PANEL

2:50pm – 3:50pm (PDT)

Indigenous Scholarship on Refusal and The Prospects for Remaking Tech

YouTube Recording

  • Marisa Duarte, Assistant Professor, Arizona State University

Marisa Elena Duarte (Pascua Yaqui/Chicanx) is an assistant professor in the School of Social Transformation. Her 2017 book Network Sovereignty: Building the Internet Across Indian Country examines how tribal command over Internet infrastructure and regulation strengthens the power of Native nations to enforce tribal sovereignty. Her recent work includes sociotechnical and network analytic investigations of Indigenous digital tactics toward decolonial resistance. She teaches courses in Justice Theory, Indigenous Methodologies, and Learning Technologies for Native Education through the School of Social Transformation.

  • Kimberly Christen, Professor, Washington State University

Dr. Kimberly Christen is the Director of the Center for Digital Scholarship and Curation at Washington State University, where she is a Professor in, and the Director of, the Digital Technology and Culture Program. Her work explores the intersections of cultural heritage, traditional knowledge, information ethics, and the use of digital technologies in and by Indigenous communities globally. She is the founder of Mukurtu CMS, an open-source community access platform designed to meet the needs of Indigenous communities. She is also the Director of the Sustainable Heritage Network and co-director of the Local Contexts initiative; both platforms provide practical tools and educational resources for stewarding digital cultural heritage and managing the intellectual property of Indigenous communities. You can follow her on Twitter @ProfChristen, and her work can be found on her website: www.kimchristen.com

Happy Hour

4pm – 5pm (PDT)

Nostalgic Tech Futures

Rotating breakout groups, activity with videos from the Prelinger archive.

Opening Remarks BY INVITATION ONLY

9:30am – 9:50am (PDT)

Day 1 Recap & Day 2 Preview

  • Jenna Burrell, Associate Professor, School of Information, University of California, Berkeley
  • Deirdre Mulligan, Professor, School of Information, University of California, Berkeley

Panel 2 BY INVITATION ONLY

10:00am – 11:15am (PDT)

Refusal through Social Movements (focusing on facial recognition)

In this panel we will discuss the power of social movements and other forms of organizing and activism, focusing especially on recent developments related to resisting the use of facial recognition technologies. In Hong Kong, protestors have broken hundreds of CCTV cameras and used umbrellas to subvert state surveillance of public protests. Cities including San Francisco, Oakland, and Boston have formally banned the use of facial recognition by city departments, including the police. The ACM’s US Technology Policy Committee published a letter this past June urging “an immediate suspension of the current and future private and governmental use of facial recognition (FR) technologies” where human and legal rights are likely to be violated. Recently, firms including IBM and Amazon have made public their decisions to suspend or discontinue the development or sale of such technologies. What were the precursors to such decisions? What groups were involved in effecting these outcomes? This panel will focus in particular on how collectives are effective in shifting the discourse on what is acceptable in the use of tech by public and private agencies and in public space.

  • Tawana Petty, Director, Data Justice Program, Detroit Community Technology Project
  • Tracy Frey, Director of Strategy, Cloud AI, Google
  • Lilly Irani, Associate Professor, Department of Communication, University of California, San Diego
  • Khalid (Paul) Alexander, President and Founder, Pillars of the Community
  • Nicole Ozer, Technology & Civil Liberties Director, ACLU of California

  • moderated by Morgan Ames, Assistant Adjunct Professor, UC Berkeley School of Information

Panel 3 BY INVITATION ONLY

11:25am – 12:40pm (PDT)

Stories of refusal from within the corporate firm

This panel considers instances of refusal within corporate firms and the logics underlying them -- tell us about a decision to refuse. What frameworks are useful? What evidence was available or necessary to influence key stakeholders (e.g., internal letters to the company, social movements, academic research papers, scenarios illustrating realized or potential harm)? How do power structures within the organization shape the possibility of refusal? What were the consequences? Individual vs. group action (acting within a sanctioned work role vs. unsanctioned employee activism)? Refusing an instance of tech implementation (e.g., a contract with a particular group), a domain (e.g., policing), or a type of tech altogether (e.g., facial recognition)? In what ways does the company’s business model (ad-supported vs. contract-based) shape possibilities of refusal? What role does corporate leadership play? This panel will discuss refusals large and small -- that feature is a bad idea, that contract is not worth taking, that category of tech development should be closed down or banned.

  • Alexandria Walden, Head of Human Rights, Google
  • Eshwar Chandrasekharan, Assistant Professor, University of Illinois Urbana-Champaign
  • Dunstan Allison-Hope, Vice President, Business for Social Responsibility (BSR)
  • Jeanna Matthews, Professor, Department of Computer Science, Clarkson University

  • moderated by Deirdre Mulligan, Professor, UC Berkeley School of Information

Breakout Groups BY INVITATION ONLY

2pm – 4pm (PDT)

Discussion, workshop activity

Concluding Keynote

9:30am – 10:30am (PDT)

YouTube Recording

  • Mutale Nkonde, AI For the People

Mutale Nkonde is a researcher, policy advisor and key constituent to the 3C UN Roundtable on AI.

Prior to starting AI for the People, she was part of a team that introduced the Algorithmic Accountability Act, the DEEP FAKES Accountability Act, and the No Biometric Barriers to Housing Act in the US House of Representatives. She is currently writing Automated Anti-Blackness: Why We Need to Name Race to Create Just Technical Futures, which will be published by Polity Press in Fall 2021.

Nkonde holds fellowships at the Digital Civil Society Lab at Stanford University and the Notre Dame Institute for Advanced Study, and is a member of the TikTok Content Advisory Council.

Discussion BY INVITATION ONLY

10:45am – 11:45am (PDT)

Why should we refuse? What should be refused and under what guidelines?

Rubrics for thinking about refusal. Guidelines and tools for refusing. What would we like to see stopped in its tracks? What are some tech trends that we would like to see refused but that will likely be especially hard to reverse? What is the justificatory basis? What specific values should guide decisions about tech development -- privacy? Human rights? How do we apply these to new instances of tech, and through what processes? At what decision points: research paper review? Contract negotiation? Code review? Where does legislation fit into the landscape of refusal? Should guidelines be technology-specific or not -- facial recognition vs. tech-enhanced surveillance practices more broadly? What is the purpose of refusal? Beyond the private sector, how should we think about refusal in the public sector and in government (e.g., procurement decisions)? How can refusal be implemented without professional bodies that have the power to censure members (e.g., through disbarment or loss of license)?

  • moderated by Richmond Wong, Postdoctoral Fellow, Center for Long-Term Cybersecurity,
    University of California, Berkeley

Closing Remarks BY INVITATION ONLY

12pm – 12:30pm (PDT)

Closing Remarks

  • Jenna Burrell, UC Berkeley School of Information
  • Deirdre Mulligan, UC Berkeley School of Information

Schinria Islam-Zhu is an acrylic painter currently based in Reno, Nevada. This original painting was created for the AFOG 2020 Refusal Conference in honor of the act of refusal at the individual level.

Through modern abstract landscapes, Schinria visually reimagines concepts of travel through time, space, and music. A believer in the power of representation and in the use of visual artifacts to amplify auditory experiences, Schinria leverages her synesthesia (a rare ability to “see” sound as colors) to invoke bold color combinations and unexpected uses of geometry.

This piece -- entitled “Refusal” -- is inspired by Rockwell’s 1984 pop hit, “Somebody’s Watching Me.” It depicts a lone character, a single apostrophe in the upper right quadrant towards the center of the piece who is choosing to go rogue, narrowly and forcefully ejecting themselves from the confines of a system designed to retain them indefinitely.

Every attendee of the Refusal Conference received three postcards, each containing one quarter of the “Refusal” landscape. The lone figure escaping the system exists in the final, missing quarter of the piece, representing new possibilities and justifications for refusal that may be realized as a result of engaging in this conference.

About the piece

Title: “Refusal”
Music inspiration: “Somebody’s Watching Me” by Rockwell (1984)
16”x20” acrylic on stretched canvas