The Critical Nature of Designing Inclusivity

Susan E. Williams · Published in Tincture · 5 min read · Jun 6, 2018


Safiya Noble, an assistant professor at the University of Southern California’s Annenberg School for Communication and Journalism, recently said during a presentation at Data & Society:

If you’re designing technology for society, and you don’t know anything about society, you’re unqualified.

As I unpacked this statement, I started considering: how does one begin to really “know” society? And what does it mean to know? How does one acquire enough knowledge to genuinely understand society? What is the language, and what are the methods, for achieving deeper insight and greater knowledge? And most importantly, how does one continue to learn and to fold new learnings into the products and systems designed to serve today’s society?

I recently read Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech by Sara Wachter-Boettcher. In it, I found a refreshing perspective and a strong call to action: to become more critical of the people and companies designing the algorithms that so many of us interact with every day.

“Technology is so often positioned as a disruptive good, that it inherently will make things better because it will drive progress,” said Sara in a recent conversation we had. “But the technology doesn’t by itself solve those fundamental social and cultural problems.”

Instead, technology mirrors — and in some ways conceals — many of those problems.

In Technically Wrong, Sara strips away Silicon Valley’s myth of complexity and superhero fame, and opens up a way of thinking about the kinds of cultural and social assumptions that are intentionally designed into default settings and a program’s “standard user base.”

She brings in simple examples of ethical oversights and biases (Facebook featuring a “best of your year” photo album with a user’s recently deceased child; digital forms that alienate people who don’t fit into one category or another in terms of race, gender, or marital status; or calorie-counting apps that warn someone they are eating too much when gaining, not losing, weight is the goal).

In so many ways, a programmer is a linguistic translator: someone who takes the language of society and culture and translates it into a program’s code, an algorithm. The German philosopher Walter Benjamin, in his essay “The Task of the Translator,” posits that a translation must carry the context of the culture into the translated text; otherwise, what a word connotes could end up different from what was originally intended.

For this reason, a translator must have an intimate understanding not only of the word, phrase, or idea, but of the context and the intention in which that word sits. In other words, the translator’s job is to identify the intended effect of the original. What results across the original and the translated text, then, is a new surface for critique.

Not unlike literary criticism, whose core is a close reading premised on a shared language and a knowledge of historical, societal, and cultural contexts, being critical of the language of coding, and of the experiences it produces (regardless of intention), requires words and concepts that people (users) can understand. This is what Sara does so well in her book: she simplifies, and translates, the intentions behind programming code into a language others can understand.

Introducing a way of speaking and thinking about technology that considers who programs are designed for, and why they are designed the way they are, must go hand in hand with questioning a company’s motives and incentives.

Sara explains:

Tech companies don’t tend to reward people who think about the implications of their work. The result is a toxic short-sightedness, and we’re seeing the impact everywhere.

An example Sara uses in her book is Facebook. In its sign-up form, Facebook forces users to enter their gender in binary terms. But many users might not identify with either gender, and many more might not want to indicate their gender at all, to avoid being fed a certain type of content. Facebook relies on that data so that advertisers can better target users and Facebook can continue to make revenue. This is where Facebook’s motives reside; the intended effect of its code is advertising revenue, not capturing the true diversity of its users.
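To make that design decision concrete, here is a minimal, hypothetical sketch in Python. It is my own illustration, not Sara’s example and not Facebook’s actual code; the option lists and the validate_gender function are invented. It simply shows how a forced binary field quietly decides who is able to finish signing up.

```python
# Hypothetical sketch (not Facebook's code): a sign-up field that forces a
# binary choice versus one that lets people self-describe or opt out.

RIGID_GENDER_OPTIONS = ["female", "male"]  # the assumed "standard user"

INCLUSIVE_GENDER_OPTIONS = [
    "female",
    "male",
    "non-binary",
    "prefer to self-describe",  # paired with a free-text field in the UI
    "prefer not to say",        # opting out should not block sign-up
]

def validate_gender(value: str, options: list[str]) -> bool:
    """Return True if the submitted value is accepted by the form."""
    return value in options

# Someone who doesn't identify with either binary option simply cannot
# complete the rigid form: the default settings decide who counts as a user.
print(validate_gender("non-binary", RIGID_GENDER_OPTIONS))      # False
print(validate_gender("non-binary", INCLUSIVE_GENDER_OPTIONS))  # True
```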

In another example, she shares ProPublica’s research on COMPAS, a piece of criminal-recidivism software that has been shown to incorrectly predict whether a person will commit another crime. The algorithm is based on numbers from the past, but, “if the past was biased (and it certainly was), then these systems will keep bias alive — even as the public is led to believe that high-tech models remove human error from the equation.” (p. 126)
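To see how “numbers from the past” can keep bias alive, here is a toy sketch of my own; it is not the COMPAS model and not ProPublica’s analysis, and the groups, offense counts, arrest rates, and threshold are all invented. Two groups behave identically, but one was historically arrested at twice the rate, so a simple score built on the recorded data flags that group far more often.

```python
# Toy illustration (not COMPAS): identical behavior, biased records, biased scores.
import random

random.seed(0)

def historical_record(group: str) -> int:
    """Offenses that made it into the records, not offenses that occurred."""
    true_offenses = random.randint(0, 3)        # same distribution for both groups
    arrest_rate = 0.5 if group == "A" else 1.0  # group B was over-policed
    return sum(random.random() < arrest_rate for _ in range(true_offenses))

def risk_score(recorded_offenses: int) -> str:
    """A naive scoring rule applied to the biased records."""
    return "high risk" if recorded_offenses >= 2 else "low risk"

for group in ["A", "B"]:
    flagged = sum(
        risk_score(historical_record(group)) == "high risk" for _ in range(10_000)
    )
    # Group B is labeled "high risk" far more often, despite identical behavior.
    print(group, flagged / 10_000)
```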

Furthermore, as machine learning is increasingly adopted into systems related to public and personal health, systems built on algorithms with fundamental biases and flaws, Sara’s call to action to level the playing field, and to criticize more ardently the products defining some of our most vulnerable life moments, becomes that much more vital.

As Sara writes in Technically Wrong (p. 75):

It’s never been more important that we demand [that tech companies be more accountable]. When systems don’t allow users to express their identities, companies end up with data that doesn’t reflect the realities of their users…when companies (and, increasingly, their artificial intelligence systems) rely on that information to make choices about how their products work, they can wreak havoc — affecting everything from personal safety to political contests and prison sentences.

These are complex and messy problems, layered between the realities of short-sighted business incentives, designers’ and programmers’ lack of knowledge about how to be more inclusive and self-reflective, and, of course, societal, political, and cultural structures. The weight of these issues is huge and should not be swept aside.

Though intense, open-minded work is needed at all of these levels concurrently, I’m optimistic that with more people like Sara Wachter-Boettcher and Safiya Noble introducing awareness and a language that establishes a space for active criticism, we’ll begin to see some baby steps toward change.
