Hey, I’m Bernard, Director of Engineering at Drift. I joined the company a year ago after meeting with Drift’s CTO, Elias Torres. When we met, Elias spoke about Drift’s mission to change the face of corporate America, and, more specifically, tech.
Throughout my years in software development, most companies I’ve worked for have aimed to create a workplace that promotes racial diversity and works to eliminate biased systems. But for one reason or another, they often fell short – not only in their hiring processes, but also in their approach to software development. While Black software professionals work intently to eliminate bias, our impact is often invisible.
So in an effort to better understand the latest industry research on coded biases, I turned to the work of Ruha Benjamin, an Associate Professor of African American Studies at Princeton. Her book, Race After Technology, draws attention to several forms of coded inequity and racism, including in predictive algorithms and AI.
We are living through a moment marked by an openness to listen and take action in order to dismantle systemic racism in our communities. While many of us are involved on an individual level, I’d be remiss if I didn’t mention the important responsibility tech companies have to play in finding a solution.
Why This Book Is a Must-Read for Product Managers & Developers
In her book, Ruha Benjamin discusses racial bias in software used for policing, health, marketing goods and services, employment, politics, and other domains. Each domain, she argues, “contains social biases embedded in technical artifacts, the allure of objectivity without public accountability” [p.53].
The book describes how racially biased logic enters through the back door of tech development, while the humans developing and optimizing algorithms remain “hidden from view” [p.11]. She identifies four patterns of discriminatory software design – “engineered inequity, default discrimination, coded exposure, and tech benevolence” – which “fall on the spectrum that ranges from most obvious to oblivious in the way it helps produce social inequity” [p.160].
The common view that AI and technology are “neutral tools” ignores how race also functions like a tool and becomes embedded in tech and systems [p.29]. This denial of racial fixes and categories can lead to “some of the most sinister and systemic forms of racism,” which arise “when people refuse to acknowledge and challenge how logics structure development” [p.157].
On diversity and inclusion, Ruha Benjamin explains how to hold tech platforms and product developers accountable “as we reckon with our desire for more diversity and inclusion” [p.33]. The book focuses on the “connections” that create biases in software rather than “comparing” different coded biases [p.160], and the author pushes back on the notion that tech bias is merely “unintentional” or “unconscious” [p.28].
How to Remove a Bias from Software (and Where to Start)
Deciding which problems need solving requires judgment that may itself be biased. But we must try. My top picks from Ruha Benjamin’s calls to action (and there’s far more to unpack throughout the book) include:
- Understanding the connections and “mak[ing] racial fixes visible” [p.158].
- “Tackl[ing] the many discriminatory designs that codify the value gap between Black and White by automating racial habits” in systems’ code [p.159]. Identify them when you design and build.
- When biases are uncovered, they should not be treated as software glitches or defects, but as signals [p.87]. The book explains why in more detail.
- Reviewing labeling in automated systems, and auditing biased datasets that “could not exist without data produced through histories of exclusion and discrimination” [p.10]. This applies not only to AI, but to any system marketed as “intelligent”. Ask which embedded preferences each algorithm carries [p.50].
- Tackling how data filtering is performed when a UI offers questions, answers, and visualizations [p.33]. An automated system can be less human in its interactions yet still discriminatory [p.142]. So ask yourself: is a race tax imposed by design? On this topic the book is exhaustive.
- Studying examples of structurally biased datasets affecting Black and Latinx communities.
- Mitigating risks when cleaning data as a solution to eliminate bias [p.141]. The book explains the risks in more detail.
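As one deliberately simple way to start the dataset and labeling review the list above calls for, here is a minimal sketch in Python. It is not from the book: the function names, the sample data, and the use of the “four-fifths” selection-rate heuristic are my own illustrative assumptions about what a first-pass audit of an automated system’s outcomes might look like.

```python
from collections import defaultdict

def selection_rates(records):
    """Compute the favorable-outcome rate per group.

    `records` is a list of (group, outcome) pairs, where outcome is
    True when the automated system produced a favorable result.
    """
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        if outcome:
            positives[group] += 1
    return {g: positives[g] / totals[g] for g in totals}

def flag_disparities(rates, threshold=0.8):
    """Flag groups whose rate falls below `threshold` times the
    highest group's rate (the 'four-fifths' heuristic)."""
    best = max(rates.values())
    return [g for g, r in rates.items() if r < threshold * best]

# Hypothetical audit data: (group, favorable_outcome)
data = [("A", True), ("A", True), ("A", False),
        ("B", True), ("B", False), ("B", False)]
rates = selection_rates(data)
print(flag_disparities(rates))  # groups whose outcomes need review
```

A check like this is only a starting signal, not a verdict: as the book stresses, a disparity it surfaces should be treated as evidence of embedded logic to investigate, not as a glitch to patch over.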
One Last Word…
Many of the concepts Ruha Benjamin lays out have become familiar to me over my years in software development. I recognize that coded inequity exists and that technology is not neutral; I also know that these problems will return, even after attempted fixes, unless software design itself changes. This book is a great synthesis of the reasons why.
“Race After Technology” is a practical book that everyone should read. After all, everyone in tech is accountable for building this awareness if real change is to happen.
What actions will you take?