
Ruha Benjamin

Race After Technology: Abolitionist Tools for the New Jim Code

Nonfiction | Book | Adult | Published in 2019



Chapter 2 Summary: “Default Discrimination: Is the Glitch Systemic?”

When programmers design databases, they project their own worldviews. What we often call glitches can function to exclude certain demographics, as when the Google Maps voice reads the “X” in “Malcolm X Boulevard” as the number ten. While glitches seem like fleeting errors, they reveal the inner workings of our social biases.

Predictive policing software aims to forecast where criminal activity might occur and who might reoffend, but it has been shown to make false predictions and to overrepresent Black criminality. Crime prediction algorithms lead law enforcement to over-criminalize certain neighborhoods. In The Matrix, the Oracle predicts that Neo is about to knock over a vase; the prediction takes him off guard and causes him to knock it over. Just as her prediction is self-fulfilling, predictive algorithms create the very situations in which police find crime. Glitches are not lapses in a benign system but signs of a flawed process.

“Defensive” architecture, such as armrests on public benches that discourage lying down, abounds in stratified societies. Structures can be engineered to reinforce hierarchies. We see this in Robert Moses’s overpasses, rumored to have been built purposely too low for buses, thereby limiting the mobility of the Black working class. When a school bus carrying affluent white children crashed at one such overpass, the accident demonstrated how discriminatory design can affect everyone. Google has likewise been shown to return search results that reflect societal prejudices about race; a search for “unprofessional hairstyles for work,” for example, turns up images largely of Black women’s hair. Ultimately, algorithms are not pure and benign but a reflection of their makers’ implicit and explicit biases.

Chapter 2 Analysis

Chapter 2 focuses on the “glitch” not as a random aberration in an otherwise benign program but as a symptom of a greater, systemic issue. Benjamin draws on an example from The Matrix (1999), in which a glitch, appearing as déjà vu, warns the characters of danger. Benjamin encourages the reader not to overlook small but telling moments of racism that appear to be mere glitches; instead, we ought to ask what those moments reveal about the systems that govern society.

Chapter 2 also deals with predictive policing technologies, which rely on racial profiles and stereotypes to target communities where police believe crime will be committed. Benjamin’s discussion reflects a broader Black Studies interest in futurity. A long history of oppression has led many Black intellectuals and creatives to adopt a forward-looking attitude. The enslaved engaged in rebellion, resistance, and escape in hopes of free futures. Through the late nineteenth and twentieth centuries, African Americans worked toward a future with full civil rights. Benjamin joins this long tradition of Black resistance and of imagining better futures; she asks her readers to interrogate technology today so that it does not continue to perpetuate inequities tomorrow. By discussing how government authorities use technology, Benjamin draws our attention to the importance of pursuing positive futures for marginalized communities.

This chapter also ties class to the book’s central concern with race. Defensive architecture is designed to discourage loitering and public gathering; on its surface, it aims to maintain the appearance of a polished urban landscape. However, one example Benjamin offers, a bench with timed spikes (90), discriminates against individuals who have no choice but to sleep outside. Further, the discouragement of public gathering extends to public protest, a basic right often invoked by those most in need of change. Benjamin reminds the reader that, just as benches and public squares can exclude people struggling with poverty, technological design can perpetuate class discrimination.
