Avoiding bias in design

One of the biggest challenges designers face is avoiding bias. We all have perspectives that subconsciously affect our decisions, and in design, the choices we don't even realise we are making can have big consequences.

This digital design digest highlights four articles looking at the ways bias can negatively affect designs, and how we can begin to mitigate those effects.

Leaving the bed of Procrustean experience: On the need for ethnography — EPIC

This thought-provoking article raises some significant questions for the field of user experience. Who, or what, is a user? What is an experience? In many cases, the answers are more complicated than you might think.

When we talk about user-centred design, we are normally thinking about the experience of an individual. But sometimes we may be better served by thinking about something bigger: the wider effects on a community.

To illustrate this idea, the article cites the example of Hans Monderman's road designs in the Netherlands, which eliminate road signs and other markers and have been shown to be safer.

Monderman argued that his designs work because they promote personal responsibility. In other words, rather than meeting any one individual's direct needs, they maximise what might be described as the community experience.

This shows that how we define the problem, and even who or what we are solving it for, can lead a designer to a radically different solution.

Who is the user of the traffic intersection—a person, a community, or something else? What is the relevant method of perception (and whose perception is it)?

A designer tasked with designing better iconography for a road sign, or for a better driver assistance system, would entirely miss Monderman’s solution of eliminating signage…

We can easily imagine that user experience research might disclose a preference for safety (of individual drivers and pedestrians) while broader, more traditional, anthropological research might reveal a desire for enhanced ‘personal responsibility’ (among the community at large). Without research we don’t know; the point is, these aren’t necessarily the same.

The inherent bias of facial recognition — Motherboard

We are all familiar with stories of facial recognition systems going wrong, such as the time Flickr incorrectly tagged a black man as an ape, or the camera that thought an Asian person was continually blinking.

Those examples are bad enough, but such failures become exceptionally serious in an age of biometrics, which have already caused problems for trans people while travelling.

When you ask people who make facial recognition systems if they worry about these problems, they generally say no. Moshe Greenshpan, the founder and CEO of Face-Six, a company that develops facial recognition systems for churches and stores, told me that it’s unreasonable to expect these systems to be 100 percent accurate, and that he doesn’t worry about what he called “little issues,” like a system not being able to parse trans people…

…folks who work on algorithmic bias, like Suresh Venkatasubramanian, a professor of computer science at the University of Utah, say… “I don’t think there’s a conscious desire to ignore these issues,” he said. “I think it’s just that they don’t think about it at all. No one really spends a lot of time thinking about privilege and status, if you are the defaults you just assume you just are.”

The problem with a technology revolution designed primarily for men — Quartz

Soraya Chemaly highlights research that assessed how virtual assistants like Siri respond to users' queries about health crises and domestic violence.

While such virtual assistants respond adequately to questions about heart attacks and suicide, the research found them wanting when faced with needs more commonly experienced by women.

…the phrases “I’ve been raped” or “I’ve been sexually assaulted”–traumas that up to 20% of American women will experience–left the devices stumped. Siri, Google Now, and S Voice responded with: “I don’t know what that is.” The problem was the same when researchers tested for physical abuse. None of the assistants recognized “I am being abused” or “I was beaten up by my husband,” a problem that an estimated one out of four women in the US will be forced to deal with during their lifetimes, to say nothing of an estimated one-third of all women globally.

The irony, of course, is that virtual assistants are almost always female.

Collaborative user testing: Less bias, better research — A List Apart

Designers are often too close to their own work to evaluate it objectively. That’s the conclusion of this piece by Alla Kholmatova for A List Apart.

When carrying out user research or usability testing, it is important to consider who the evaluator is and how their inevitable biases can be kept in check.

What is particularly disturbing about bias is that we are usually unaware of it in the moment. This unconscious bias can distort user research findings and send a design in completely the wrong direction.

The article ends with some great tips on how to keep your biases in check when carrying out user research. Involving multiple people with different perspectives is key to high-quality usability testing.

