Data is fundamental to our world. Data influences a wide range of things in our lives: from policy making and urban planning to the size of our medicine pills and the soles of our shoes. Data is involved in practically every design decision we experience in the world. Oftentimes, however, data is collected through a one-size-fits-all approach and is not disaggregated by gender. But men and women behave differently. Men and women react to things, both emotionally and physically, in different ways. This one-size-fits-all approach to data collection therefore has far-reaching implications for the well-being of women.

On March 25, NASA canceled its highly anticipated all-female spacewalk, citing a lack of spacesuits in the correct size. In short: there was only one medium-sized suit available, as most of the other suits were sized L or XL, designed for larger astronauts, i.e. men. Astronaut Anne McClain had learned during her first spacewalk that a smaller spacesuit fit her better, which means the medium suit is safer for women, who are typically smaller than men.

Astronaut McClain in a spacesuit. Photo: NASA.

That this realization did not come sooner is a prime example of data bias. The designers of the suits and their users (male astronauts) probably did not set out to exclude women. They simply did not consider the needs of a non-male-sized astronaut. In a way, it makes sense: historically, we have had roughly 300 male astronauts in space and only 40 female astronauts.

Spacesuits are also expensive to make. Therefore, in the past, smaller-sized female astronauts simply were not considered for spacewalks. The bias here is not that the data set out to exclude women, but that data is assumed, by default, to apply to the male observer and the male user. The gap only became apparent when McClain went on her first spacewalk and realized that a smaller suit did indeed fit her better. Ultimately, McClain recommended not pushing ahead with the all-female spacewalk, for the safety of both female astronauts.

Back on Earth, these data biases have even deadlier implications for women. In 2013, the Food and Drug Administration in the USA recommended that women reduce their sleeping pill dose by half (half a pill instead of a whole one), because women metabolize the drug roughly twice as slowly as men. Women are also 50% more likely to be misdiagnosed after suffering a heart attack, because for decades the heart attack symptoms treated as common knowledge have been male symptoms, while heart attacks presenting with female symptoms are treated as anomalies.

These kinds of data biases are everywhere. And so the need to close the gender data gap is urgent, as the world becomes increasingly inseparable from artificial intelligence.

Like data, artificial intelligence, or AI, is part of every aspect of our daily lives. At its simplest, AI is a system that makes basic computing decisions. Our email filters, automatically corrected typos, Google's predictive searches, and even chatbots are all examples of basic AI. Just as much of today's research, design, and policy relies on data, AI relies on data input to "learn". The problem is that many developers and decision-makers come from homogeneous groups, sharing similar backgrounds, education, race, and gender. Again, while these developers may not have gone out of their way to be racist or sexist, their biases can negatively influence women's lives.

Consider Google's infamous "racist" auto-tagging incident, in which its photo service identified two Black people as 'gorillas'. While Google made statements about modifying and improving its image-tagging algorithms, it ended up quick-fixing the problem by quietly removing the ability to tag gorillas at all.

Another example is Microsoft's bot Tay, which learned to be racist and sexist within 16 hours of going online. Tay was a Twitter bot designed to learn from conversations with other Twitter users. It was not racist or sexist by default; it became so by learning from massive amounts of racist and sexist tweets sent by internet trolls. Microsoft later suspended Tay indefinitely, saying the AI had learned to misbehave from its followers.

There is also a data bias when it comes to health: rarely do we see information about what a heart attack feels like for a woman. Photo: Kirstin Grace Simons, Arkansas Dept of Health.

While these examples may seem bizarre and – dare I say – somewhat dystopically hilarious, the reality is that AI is already influencing our daily and professional lives. Companies use AI recruiting tools to sort through résumés. Amazon shut down its recruiting AI after learning that it was biased against women.

Similar biases appear in facial recognition systems. MIT researchers have found that such systems are especially poor at recognizing women with darker skin. When engineers train these systems primarily on images of white men, the algorithm learns from that data and becomes biased.
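To make that mechanism concrete, here is a minimal sketch in Python using synthetic data and scikit-learn. It is not the MIT study or any real facial recognition system; the group labels, sample sizes, and "shift" parameter are illustrative assumptions. It simply shows that a classifier trained on data where one group makes up 95% of the examples learns the majority group's patterns and performs far worse on the under-represented group.

# Toy illustration (synthetic data only): a model trained mostly on
# "group A" learns group A's patterns and misclassifies "group B".
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n_per_class, shift):
    # Two classes of 2-D points; `shift` moves this group's class centers,
    # so the two groups follow different patterns for the same labels.
    X0 = rng.normal(loc=0.0 + shift, scale=1.0, size=(n_per_class, 2))
    X1 = rng.normal(loc=2.0 - shift, scale=1.0, size=(n_per_class, 2))
    X = np.vstack([X0, X1])
    y = np.array([0] * n_per_class + [1] * n_per_class)
    return X, y

# Training set: 95% group A, 5% group B -- the imbalance is the point.
Xa, ya = make_group(950, shift=0.0)   # over-represented group
Xb, yb = make_group(50, shift=3.0)    # under-represented group
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on fresh samples from each group separately.
Xa_test, ya_test = make_group(500, shift=0.0)
Xb_test, yb_test = make_group(500, shift=3.0)
print("accuracy on group A:", round(model.score(Xa_test, ya_test), 2))
print("accuracy on group B:", round(model.score(Xb_test, yb_test), 2))

The exact numbers will vary, but the pattern holds: the model scores well on the over-represented group A and badly on group B, purely because of what it was shown during training.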

When society collectively feeds these machine-learning systems with our own biases – whether through trolling or unconscious bias – the bias lives on. We have inherent biases, and AI inherits them. This is not an argument against AI or data collection, however. In aviation, AI helps planes and pilots make better and safer flying decisions. Auto-correct helps us catch misspellings in our essays. Data research helps us understand medicine, urban planning, and populations, and overall helps us make better-informed decisions. The only way to counter this problem is to recognize and encourage more women and greater diversity within these fields, accept that biases exist, and work constructively to create a better and more inclusive world.

Emily Hsiang
