Know your enemy: gender bias in design

Kamila Rodzinka
Dec 03, 2021
10 min read

What is gender bias?

Gender bias is defined as behaviours, beliefs, and mindsets that favour one gender over another. It is a form of unconscious bias that involves assigning certain characteristics and stereotypes to other individuals or groups. These ascribed attitudes affect the way a person interacts, engages, and understands relationships with others. 

In most cases, the term gender bias is used to describe the privileges and special treatment that white, heterosexual males receive in modern society. This advantage stems primarily from the deeply entrenched androcentric (male-centric) way in which culture, worldview, and history are constructed, perceived, and transmitted. This means placing the male point of view at the centre and taking the male experience as the norm and point of reference.

A 2020 UN report revealed that nearly 90% of all people hold some kind of bias against women. This is a vast and fundamental problem that affects many areas of life, including politics, work and education, relationships, sports, religion, and design. Design, broadly defined, is the focus of this article, so let's take a look at how an androcentric perspective in this area impacts our lives.

 

How biased design affects our lives

 

First of all, it is important to note that design has the potential to create social narratives as well as give them direction and momentum. Therefore, when practised in an environment where gender bias is present, it can contribute to the perpetuation, validation, and distribution of that bias. This leads to widening social inequalities, offering only selected groups access to benefits and improvements in quality of life.

In this section, I want to take you through examples of various products designed with a disregard or misunderstanding of the experiences of female users. These examples often highlight an unconscious aspect of gender bias. While we can assume with a high probability that their designers did not intentionally want to exclude an essential part of the audience, we can see that deep-seated androcentric thinking influenced the final solutions. 

 

- Car seatbelts

 

One of the most infamous examples of biased design is the car seatbelt. Historically, automotive product design and development was (and still is) defined primarily by men. The invention of the seatbelt dates back to the latter half of the 19th century, but even decades later, in the 1960s, its design standards were configured around men. The vehicle crash test protocol that emerged at the time relied on dummies modelled after the 50th-percentile male in height, weight, and stature. In effect, seatbelts were designed to be safe for men, while women (especially pregnant women) were not considered at all. Studies show that women are nearly 50% more likely to be severely injured in a crash than men, and their risk of death is 17% higher.

Although this situation is slowly beginning to change, we are still far from satisfactory safety standards. In 2011, female dummies became required in crash tests, yet male dummies still make up the majority, which does not correspond to the actual driving population (according to statistics, in the U.S. and Europe, roughly half of drivers are women). Additionally, few female dummies are tested in the driver's seat - they usually sit as passengers or are not used at all.

 

 

- Protective gear

 

In 2016, the U.S. military began recruiting women into units that had previously been male-only. However, it turned out that women's body-armour needs had not been adequately thought through. The Army did add some smaller sizes, but other pieces of equipment, such as boots and helmets, were not fully adapted. This causes many problems for women, notably the inability to fire their weapons properly. Moreover, some women in the military have been forced to modify their equipment themselves, which often means removing protective panels or making makeshift alterations to protect their organs.

This case is not isolated: it turns out that most protective equipment for workers is designed primarily for male bodies. With the arrival of the Covid-19 pandemic, health care workers began to point out that the lives of female professionals were at risk because PPE (personal protective equipment) is designed for men. For example, masks are produced from a male template, which is all the more surprising given that 75% of health workers are female.

However, these voices of outrage are having an effect. U.S. military officials have pledged to speed up the process of fitting body armour for women, and initiatives to improve PPE are emerging worldwide, such as Fit for Women, launched by the Women in Global Health (WGH) movement.

 

Screenshot from video posted on Twitter by Dr Arghavan Salles, in which she jokes about her head being too small for PPE.

 

- Smartphones and video controllers

 

"Size matters" is a phrase that many women who use smartphones can say. In their experience, many of the latest smartphone models have been designed without considering female users as most do not fit comfortably in the average woman's hand. Smartphones are getting bigger and bigger, with display sizes often exceeding 6 inches (15.25 cm). This size can make it difficult or impossible for many women (and men with small hands) to use the phone with one hand. This is because women's hands are on average about an inch (2.5 cm) smaller than men's.

The situation is similar with video-game controllers. Some gamers with smaller hands find that using standard controllers negatively affects their comfort, and the gaming experience itself is not as enjoyable as it could be.

 

Comparison of two iPhone sizes in women's hands.

 

- Virtual reality headsets

 

Motion sickness has afflicted VR since the technology's very beginning, and women tend to experience more VR-induced nausea than men. What explains this? One theory is that most systems convey depth through motion parallax, a cue that is easier to program and render but that men rely on more heavily, whereas women rely more on the shape-from-shading technique. These are two very different depth cues, so the implication is that VR sends the wrong signals to women's brains.

Another, simpler explanation is that women feel nauseous more often because around 90% of them have an interpupillary distance smaller than the default setting of a typical headset. In this case, the solution lies in minor adjustments to the headset design. Simple, isn't it?

 

Woman using a VR headset.

Source: Unsplash

 

- Health apps

 

It is impossible to forget an infamous episode in the history of the tech giant and leader in smartphone innovation. Following the release of Apple's Health app, the company was heavily criticised for overlooking important women's health issues. The app, which was supposed to be a comprehensive solution and included features such as tracking sodium intake, ignored a vital health aspect of half the population - the menstrual cycle. Apple users could not track their periods until the iOS 9 release. Could the explanation be that periods simply aren't a concern for the vast majority of the company's employees, who are men? Maybe.

 

Data showing Apple's workforce distribution in 2020

 

- Facial recognition

 

Studies have shown that commercially available facial analysis programs from major technology companies (used, for example, in law enforcement) do not always perform satisfactorily. They work best for light-skinned men - AI systems from IBM and Microsoft could correctly identify a man from a photo 99 per cent of the time. For women, however, especially those with darker skin, accuracy drops sharply, with error rates reaching as high as 35 per cent in some tests.

These findings raise questions about how today's neural networks are trained and evaluated and on which datasets they learn to perform computational tasks. After all, it is no secret that it is difficult to achieve satisfactory performance if images of white men dominate the dataset.
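To see why disaggregated evaluation matters, here is a minimal Python sketch (all group names, predictions, and numbers are invented for illustration) that computes accuracy per group rather than only overall - the same idea behind the audits described above:

```python
# Hypothetical evaluation sketch: a high overall accuracy can hide
# large per-group disparities that only disaggregated metrics reveal.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted, actual) tuples.
    Returns {group: accuracy} plus an 'overall' entry."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        totals["overall"] += 1
        if predicted == actual:
            hits[group] += 1
            hits["overall"] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Toy data skewed like the datasets described above:
# many light-skinned male samples, few darker-skinned female ones.
records = (
    [("lighter_male", "m", "m")] * 99 + [("lighter_male", "f", "m")] * 1
    + [("darker_female", "f", "f")] * 13 + [("darker_female", "m", "f")] * 7
)
scores = accuracy_by_group(records)
print(f"overall:       {scores['overall']:.0%}")        # 93%
print(f"darker_female: {scores['darker_female']:.0%}")  # 65%
```

The 93% headline figure looks acceptable precisely because the dominant group drowns out the minority group in the average.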

However, it is consoling to see that, in response to criticism, training datasets are becoming more diverse, and the accuracy of facial recognition systems is starting to improve.

 

Screenshot from a video posted on YouTube by MIT Media Lab, showing detection results from Microsoft's facial analysis demo.

 

- Voice assistants 

 

If you've dealt with a voice assistant even once in your life, the first thing you'll probably hear in your head right now is a gentle, maybe even slightly submissive or flirty female tone of voice. Apple's Siri, Google Assistant, Microsoft's Cortana, and Amazon's Alexa – all of these systems have default female voices and names. It is not difficult to identify the collective beliefs, stereotypes, and hurtful prejudices about women that are "hidden" here. They are easily transferred and replicated in the latest technologies, fostered by the undiversified environment and industry in which those technologies are created.

 

Voice assistants release dates and gender options.
Source: UN Report "I'd blush if I could" 

 

The roots of biased design

 

The examples I mentioned above are just the tip of the iceberg. Where can we begin to look for an explanation of this reality? Let's start at the very top of the ladder.

Owners, executives, leaders, and experts (engineers, designers) of technology companies – all of these high-profile positions are filled mostly by men. For example, in the U.S., the percentage of tech startups with at least one female founder stands at only 26% (though this is still better than a few years ago).

 

 

This means that wherever ideas are generated and decisions are made, there is a lack of female representation. Such homogeneous groups carry more limitations than benefits. First, they may be unaware of the problems faced by other social groups, and the solutions they build tend to respond mainly to their own needs and expectations. Second, when one homogeneous group designs and develops most technologies, it consciously and unconsciously conveys its own biases. Decision-making on product launches by such groups can also be problematic due to a phenomenon called status belief transfer: an asymmetric negative bias in which products made by women are disadvantaged in male-typed markets, while products made by men are not disadvantaged in female-typed markets. Research on this topic has provided compelling evidence that status beliefs transfer from producers to their products.

Someone may argue that the technology industry is changing, and the latest data does show more and more women in it. Indeed, the design industry has no problem attracting women: currently, up to 60% of junior positions are held by them. The problem lies in retention and promotion. Many companies worry that women will take leave to have a baby, so women have a harder time negotiating fair pay or earning promotions. These are just some of the obstacles women face in the workplace, and they significantly diminish women's chances of reaching senior positions. It is also worth emphasising that diverse teams alone do not guarantee success. Even in gender-balanced teams, project briefs that focus on women may still be underrepresented, because those who approve budgets or write project proposals (i.e. decision-makers) are guided by their own priorities, which may include gender bias.

Another critical driver of biased design is data. Project decisions are often data-driven, but they may not show gender-specific trends when considered as a whole. Additionally, the data sets used may simply be limited and problematic due to the overrepresentation of one group (as was the case with facial recognition systems).
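As a small illustration of how whole-population data can mislead, consider this Python sketch (all numbers are invented): a single aggregate metric looks acceptable, while the disaggregated view reveals a gender gap.

```python
# Hypothetical usability-test tallies (all numbers invented) showing
# why aggregate data can hide a gender-specific trend.
results = {
    # gender: (satisfied_users, total_users)
    "men":   (85, 100),
    "women": (12, 25),
}

satisfied = sum(s for s, _ in results.values())
total = sum(t for _, t in results.values())
print(f"overall: {satisfied / total:.0%}")  # 78% - looks acceptable
for gender, (s, t) in results.items():
    print(f"{gender}: {s / t:.0%}")  # men: 85%, women: 48% - reveals the gap
```

The aggregate figure is dominated by the larger group, which is exactly what happens when a sample overrepresents one gender.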

Data issues also arise during prototype testing and user research. Undifferentiated and often small samples may fail to capture needs and expectations that differ by gender. This is amplified by confirmation bias, where researchers and designers look to respondents' answers to confirm their personal beliefs.

How does this circle close? Data affects funding. Funding is decided by the people at the top... Do you see where this is going? In this environment, which is (still) male-dominated, you need a combination of big and small data. In other words, it is important to capture not only general trends but also those of specific groups. Otherwise, entrepreneurs developing new technologies will rely on their own experience and assumptions, which are not universal. Less data on women and other underrepresented groups means less chance of making ideas credible and getting funding.

 

Changes are coming. How do we support them?

 

Despite the many obstacles and problems facing the design and technology industry, we must admit that we live in the best of times to overturn outdated and harmful standards. And indeed, changes are coming. To the product improvements mentioned above we can add emerging legal standards on the horizon. For example, last year the U.S. Federal Trade Commission turned its attention to fairness in AI, and one of its commissioners publicly stated that it should expand its oversight of discriminatory AI. It is good to see designers, developers, engineers, leaders, and even legislators responding to previously overlooked user needs. It is just a shame that such changes happen slowly and only in response to loud protests and public criticism. It is no longer acceptable that, in the 21st century, thinking about the needs served by products and services intended for everyone is still dominated by the male perspective.

So, how can we accelerate and facilitate this change? Let's conclude this article with a handful of tips and directions.

  1. Diverse teams

Building diversified teams should include not only project or research teams but also managers, executives, and engineers. Hire more women and promote them to leadership positions. Don't see this as a box to check, but as an opportunity to create products that don't discriminate against certain users. Build a better future for all.

  2. Diversified and sex-disaggregated data

Collect diversified data and look not only for general trends but also gender-specific ones.

  3. Education & bias awareness

Educate yourself and your colleagues – even excellent data can fall short if those analysing it are unaware of the biases and stereotypes that influence the discovery of the right context and framing of insights.

  4. Algorithmic fairness

Companies using AI should use best practices to both identify and minimise instances where their AI generates unfair results. Emerging clear standards for fairness testing can help them do this. Companies can also draw on public guidance offered by experts in the field.

  5. Challenging priorities

Designers should be aware of decision-making priorities at earlier stages, such as funding or R&D, and question those that may contain gender biases.
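As a minimal sketch of the fairness testing mentioned in tip 4 (data and threshold are invented for illustration), the demographic parity gap - the difference in positive-outcome rates between groups - is one of the simplest metrics to compute:

```python
# Sketch of a basic fairness check: the demographic parity gap, i.e.
# the spread in positive-outcome rates across groups. All predictions
# and the review threshold below are invented for illustration.
def positive_rate(predictions):
    return sum(predictions) / len(predictions)

def demographic_parity_gap(preds_by_group):
    rates = [positive_rate(p) for p in preds_by_group.values()]
    return max(rates) - min(rates)

preds = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1],  # 75% positive outcomes
    "group_b": [1, 0, 0, 0, 1, 0, 0, 0],  # 25% positive outcomes
}
gap = demographic_parity_gap(preds)
print(f"{gap:.2f}")  # 0.50 - flag for review if above, say, 0.10
```

Real audits use richer metrics (e.g. equalised odds) and libraries built for the purpose, but even a check this simple makes disparities visible before launch.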

 

Sources:
Perez, C. C. (2019). Invisible women: Exposing data bias in a world designed for men. Random House.
Tak, E., Correll, S. J., & Soule, S. A. (2019). Gender inequality in product markets: When and how status beliefs transfer to products. Social Forces, 98(2), 548–577. https://doi.org/10.1093/sf/soy125

 

