If you’re on this blog, I’m assuming you’re familiar with biases in the field of technology, especially in the workplace. We have all seen companies run diversity programs to bring about more equity. But have you ever thought about how a lack of gender diversity in the workplace can lead to products and services that largely ignore the needs of women? I didn’t realize how pervasive this is until I started noticing that the Google and Amazon voice assistants were much better at recognizing my male friend’s prompts than mine. When I dug deeper, I realized that gender bias in product design is a much bigger concern than a few misheard commands. I hope this blog raises awareness of the issue and encourages us to consciously change how we think about design.
So let’s start by looking at the logos of a few period tracking apps. What do they have in common? Well, they’re all pink with floral patterns. Can you think of a reason why?
Most people would say it’s because the apps are meant for women, without even noticing the stereotypes embedded in that answer. From the moment we’re born, our gender decides whether we’re dressed in pink or blue and whether we play with dolls or cars. These stereotypes keep developing over the years and have slowly taken shape in technology and design as well. They might seem small to begin with, but almost every field, including technology, design, and even medicine, carries deeply embedded gender biases that need immediate attention.
Design of Seatbelts in Cars
Let’s start with the design of an everyday thing: the car. Did you know that safety features like seatbelts and airbags have been designed around an “average male body”? According to Consumer Reports [1], women are at a much higher risk of injury from seatbelts and airbags simply because their body types were not accounted for when these features were designed. It is even worse for pregnant women, who are not considered in the design of a standard seatbelt.
“The problem today is that most car companies design to the test, and no further,” says David Friedman, vice president of advocacy at Consumer Reports and a former NHTSA administrator. Friedman believes that regimented, overly predictable testing leads to engineering that simply checks boxes, rather than a more holistic approach to safety design. “That’s why the next generation of crash tests needs to include more representative dummies and some randomness, putting different dummies in different seats across the tests. Then automakers would finally have to protect everyone equally.”
According to Astrid Linder, Ph.D., a professor and safety researcher, one way that automakers and safety advocates are addressing this inequality in the short term is by developing computer models that can simulate how human bodies of different shapes, sizes, and sexes react in a crash. Some automakers, including Toyota and Volvo, already use this approach. Volvo developed a computer model of a midsized pregnant female in the early 2000s and worked with Chalmers University to create a computer model of an average-sized female to develop its whiplash protection system, the very same system that protects males and females equally.
Voice Assistants
Now let’s talk about voice assistants. I’m sure you’re all aware of Google Home, Amazon’s Alexa, Microsoft’s Cortana, and Apple’s Siri, and have probably had a conversation with at least one of them. But have you ever wondered why all of these systems have female names and voices by default? They often even take on a submissive and flirtatious tone.
The problem, according to a report released in 2019 by UNESCO, stems from a lack of diversity within the industry that is reinforcing problematic gender stereotypes. [4]
Embedded in their humanized personalities are generations of problematic perceptions of women. These assistants are capable of influencing how people interact with real women, the report warns. As the report puts it, “The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like.” [4]
This bias coded into technology is often unconscious, and it shows up not just in voice assistants but in other systems too, such as facial recognition software that misidentifies Black faces. [5] Joy Buolamwini describes the issue beautifully in her poetry, which I highly encourage you to watch.
What can help?
To start, there are a few things that we all can do to move towards a more equitable world.
- Identifying Bias: The first step toward fixing biases is identifying them. As we have seen, these biases are so deeply embedded that they are often unconscious and overlooked. Consciously seeking them out and naming them will help us figure out ways to address them.
- Research on Diverse Data Sets: Since products and technology have been built on years of biased historical data, the next step is to bring more diverse data sets into research. It’s essential that we study diverse groups of people and account for not just gender but also factors such as age, race, and disability, and that we evaluate results separately for each group (see the short sketch below for what that can look like).
- Building Structural Equity: Diversity, Equity & Inclusion efforts in organizations that encourage equity and promote inclusivity are a great way to begin. Being transparent about opportunities and encouraging diverse groups to pursue them will help build structural equity.
- Having a Seat at the Table: Last but certainly not least, change begins at the leadership level. Having diverse groups of people represented in leadership and decision-making will have a huge impact.
“Recognize that diversity is not simply a tick box to get past: It is the key to designing products that really work for everyone. And collect sex-disaggregated data, from the very beginning of every process!”
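To make the “sex-disaggregated data” point concrete, here is a minimal Python sketch of what disaggregated evaluation can look like. The numbers below are entirely hypothetical, a handful of made-up voice-assistant test utterances labeled by speaker gender; the point is simply that an overall accuracy figure can look fine while a per-group breakdown exposes the gap.

```python
import pandas as pd

# Hypothetical test results: one row per utterance, recording the speaker's
# gender and whether the assistant recognized the prompt correctly.
results = pd.DataFrame({
    "speaker_gender": ["female", "male", "female", "male",
                       "female", "male", "female", "male"],
    "recognized":     [0, 1, 1, 1, 0, 1, 0, 1],
})

# The aggregate number alone can look acceptable...
print(f"Overall accuracy: {results['recognized'].mean():.0%}")

# ...but breaking it down by gender makes the disparity visible.
print(results.groupby("speaker_gender")["recognized"].mean())
```

The same breakdown works for any attribute you collect: swap speaker gender for age band, skin tone, or disability status, and the aggregate-versus-disaggregated comparison tells the same story.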
References:
- The Crash Test Bias: How Male-Focused Testing Puts Female Drivers at Risk
- The Gender Bias Behind Voice Assistants
- Siri and Alexa Reinforce Gender Bias, U.N. Finds
- MIT Researcher Exposing Bias in Facial Recognition Tech Triggers Amazon’s Wrath
- How to Make Artificial Intelligence Less Biased
- Three Steps to Make Tech Companies More Equitable