“Imagine a world where your phone is too big for your hand, your doctor prescribes a drug that is wrong for your body, or the countless hours of work you do every week are not recognised or valued.

If any of this sounds familiar, chances are that you’re a woman,” writes Caroline Criado Perez, author of #1 Sunday Times best-selling book, Invisible Women: Exposing Data Bias in a World Designed for Men, published in 2019.

Her book aims to show how the gender data gap – “a gap in our knowledge that is at the root of perpetual, systemic discrimination against women” – has created a pervasive but invisible bias in everything from policy to infrastructure and product design.

In one example, she looks at toilets. In April 2017, during the interval of a show at a London theatre, BBC journalist Samira Ahmed went to use the loo. Typically, there are long queues outside the women’s toilets at intervals. The theatre had decided to “fix” the problem by replacing the ‘men’s’ and ‘women’s’ signage with ‘gender neutral with urinals’ and ‘gender neutral with cubicles’. Instead, this exacerbated queues, as only men were using the ‘gender neutral with urinals’ toilets, but both men and women were using the ‘gender neutral with cubicles’ toilets. “Rather than rendering the toilets actually gender neutral by this move, they had simply increased the provision for men,” writes Perez. “Women are generally not able to use urinals, while men are, of course, able to use both urinals and cubicles. There were also no sanitary bins in the ‘gender neutral with urinals’ toilets.”

She says that while it may seem fair to grant male and female toilets the same amount of floor space (often formalised in plumbing codes), if male toilets have both cubicles and urinals, the number of people who can relieve themselves at once is actually far higher.

Secondly, women take up to 2.3 times as long as men to use the toilet. The reasons include being more likely to be accompanied by someone who needs assistance (a child, or a disabled or elderly person), menstruation (which requires changing sanitary products) and pregnancy (which reduces bladder capacity), among other things.
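The capacity point can be made with simple arithmetic. The sketch below uses assumed fixture counts and an assumed average visit time (only the 2.3× figure comes from the text): equal floor space fits urinals plus cubicles on the men’s side but cubicles only on the women’s, and each female visit takes longer, so hourly throughput diverges sharply.

```python
# Illustrative numbers only - fixture counts and visit time are assumptions.
AVG_VISIT_MIN = 1.0   # assumed average length of a male toilet visit
FEMALE_FACTOR = 2.3   # women take up to 2.3x as long (per Perez)

mens_fixtures = 5 + 5   # 5 urinals + 5 cubicles in the same floor area
womens_fixtures = 5     # cubicles only

men_per_hour = mens_fixtures * 60 / AVG_VISIT_MIN
women_per_hour = womens_fixtures * 60 / (AVG_VISIT_MIN * FEMALE_FACTOR)

print(men_per_hour)             # 600.0
print(round(women_per_hour, 1))  # 130.4
```

With these assumed numbers, “equal floor space” serves more than four times as many men as women per hour.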

“Routinely forgetting to accommodate the female body in design – whether medical, technological or architectural – has led to a world that is less hospitable and more dangerous for women to navigate,” Perez notes.

Gender vs sex

To paraphrase Carolyn M. Mazure, a professor in women’s health and professor of psychiatry and psychology at Yale, the term ‘sex’ should be used as a classification, generally male or female, according to reproductive organs and one’s chromosomal complement (typically XX for female and XY for male). On the other hand, the term ‘gender’ refers to a person’s self-representation as male or female. However, Eric Swanson, director of research at Open Data Watch, says that while there is currently much debate around the recognition that gender is not binary, when it comes to statistics, data related to gender is still collected largely based on sex (usually, sex assigned at birth). “At this point, there is no practical way as yet to collect a wide range of data on non-binary genders,” he says.

When it comes to statistics, the terms 'gender' and 'sex' are often used interchangeably.

Visibility required

To address a problem, it has to be visible. We need good data and to be able to segment it to make clear any underlying trends and patterns. Tawheeda Wahabzada, data and policy specialist at Open Data Watch, explains that ‘gender data gaps’ refer to a lack of available data or sex-disaggregated or gender-disaggregated data.

In other words, where there’s a gender data gap, we don’t have data on how an issue affects women – whether because the data doesn’t exist or because it isn’t broken down into subcategories by gender/sex. These gaps limit our knowledge of the status of women and girls in terms of education, access to health, ICT, political participation, economic roles, and so forth.
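To make ‘disaggregation’ concrete, here is a minimal sketch with invented enrolment records (the figures and the `enrolled` field are hypothetical): the aggregate rate looks healthy, while the sex-disaggregated view exposes the gap.

```python
# Hypothetical records: the aggregate hides a gap that
# the sex-disaggregated view reveals.
records = [
    {"sex": "female", "enrolled": True},
    {"sex": "female", "enrolled": False},
    {"sex": "male", "enrolled": True},
    {"sex": "male", "enrolled": True},
]

# Aggregate enrolment rate across everyone.
overall = sum(r["enrolled"] for r in records) / len(records)

# The same indicator, disaggregated by sex.
by_sex = {}
for r in records:
    by_sex.setdefault(r["sex"], []).append(r["enrolled"])
rates = {sex: sum(v) / len(v) for sex, v in by_sex.items()}

print(overall)  # 0.75
print(rates)    # {'female': 0.5, 'male': 1.0}
```

A 75% headline figure conceals that, in this made-up sample, female enrolment is half the male rate.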

For example, Perez points out a 2008 study of textbooks recommended by Dutch medical schools, which found that sex-specific information was absent even in sections on topics where sex differences have long been established (such as depression and the effects of alcohol on the body). Furthermore, results from clinical trials were presented as valid for men and women, even when women were excluded from the study.

Perez calls this the “male-default bias” and points out how gender gaps in medicine mean that women are more likely than men to be misdiagnosed or given drugs that “don't work” as they’re meant to.

Why collecting and reporting data matters

Swanson says when it comes to collecting data, the assumption is that statistics exist because they're useful, partly because they help to guide our choices, including policy choices. “If you want them to guide policies about gender equity, for example, you need sex-disaggregated data.”

In partnership with Data2X, Open Data Watch led the Bridging the Gap project, assessing gender data gaps in Sub-Saharan Africa (including South Africa), Latin America and the Caribbean, and Asia and the Pacific.

Bridging the Gap assessed the availability of 104 gender-relevant indicators in 15 Sub-Saharan African countries, aiming to identify any gaps (because you can’t fill a gap until you’re aware of it). It revealed that 48% of gender-relevant indicators are missing or lack sex-disaggregated data in the countries studied.

“What was alarming to me is that there was potential for other indicators to be disaggregated by sex,” says Wahabzada. “For example, in Sub-Saharan Africa, 52% of gender data indicators are available with disaggregation, and 13% were available, but weren't disaggregated by sex.”

How South Africa stacks up

Bridging the Gap (published in 2018) found that South Africa lacked data on key aspects of women’s lives, with gaps identified in four dimensions:

  • Availability: South Africa’s national databases include only 62 of the 104 gender indicators.
  • Disaggregation: 16 gender indicators lack sex disaggregation.
  • Adherence to standards: 26 published gender indicators do not conform to internationally recommended definitions.
  • Timeliness: Six indicators have no published observations since 2015.

While closing gender data gaps requires time, resources, and capacity, Bridging the Gap suggests three opportunities to improve South Africa’s gender data ecosystem:

  • Publish indicators on human security (particularly crime and violence) that conform to international standards with appropriate disaggregation.
  • Improve sex disaggregation in international databases for economic opportunities and health indicators. While data exist for many of these indicators, they lack sex disaggregation.
  • Harness available administrative data as a primary source of information on education to improve the availability of sex-disaggregated education indicators that conform to international standards.

How business can get involved

Wahabzada and Swanson suggest several ways the private sector, academia and civil society can help to support an improved data ecosystem, including:

  • Assisting with ICT solutions or technologies
  • Data sharing (with the requisite privacy controls in place)
  • Developing data-related skills and capacity (from data literacy to analysis)
  • Funding or undertaking research
  • Helping to mobilise political commitments and meaningful actions to advance inclusive and disaggregated data (for example, the Global Partnership for Sustainable Development Data’s Inclusive Data Charter)

Avoiding bias in the age of big data

The Netflix documentary, Coded Bias, focusing on MIT Media Lab researcher Joy Buolamwini’s discovery that facial recognition does not see dark-skinned faces accurately, made many people question the power of artificial intelligence in our lives and the bias being baked into algorithms.

Prof. Manoj Chiba, associate professor and data scientist at GIBS, says, “The problem is that the data fed into algorithms is based on our previous decisions, which means it includes our unconscious biases. So how do we address the bias in the data? The truth is that we can’t. If we start artificially manipulating the data, we end up with artificial results. Artificial results will only have a short-term impact; they will not have any medium to long-term impact at all.”

Instead, Chiba contends, we should change how we work with data, acknowledge bias in the data, and let this awareness drive decisions and actions.

“People like me were taught that when you see an outlier in data, you should ignore it or take it out of your calculations and analysis. That’s probably one of the root causes of the problems,” he says.

For example, looking at employee remuneration, imagine a call centre where most staff are women working at the lower end of the pay scale. A male CEO’s salary might be much higher than the bulk of employees’ and thus be excluded as an outlier, unintentionally skewing the analysis when examining gender pay equity.
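Chiba’s call-centre example can be sketched numerically. The salaries below are invented: with the CEO included, the gap between mean male and mean female pay is stark; drop him as an ‘outlier’ and the measured gap almost vanishes, hiding the fact that the highest earner is a man.

```python
# Invented figures: monthly salaries in thousands. Most staff are women
# at the lower end of the pay scale; one male CEO earns far more.
women = [12, 13, 12, 14, 13]
men = [14, 15, 250]   # 250 = the CEO

def mean(xs):
    return sum(xs) / len(xs)

gap_with_ceo = mean(men) - mean(women)          # CEO kept in
gap_without_ceo = mean(men[:-1]) - mean(women)  # CEO dropped as an "outlier"

print(round(gap_with_ceo, 1))     # 80.2
print(round(gap_without_ceo, 1))  # 1.7
```

The standard “remove outliers” reflex makes the dataset look equitable precisely by deleting the data point that carries the inequity.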

“Our mindset needs to change, so we include outliers in our analyses,” says Chiba. “Then decision-makers need to look at the analyses and identify when they’re reflecting something undesirable, such as being biased against women. From there, decisions need to acknowledge the bias and focus on steps that can be taken. Those will start feeding these fancy machine learning algorithms, and so algorithms will learn as we go.”

Chiba adds that we can also address bias by correcting data weighting. For example, where pharmaceutical studies have historically largely focused on white, male test subjects, using AI to scan through previous studies and databases and adjusting weightings for factors like age, gender, race and location, future scenarios can now be simulated. This accelerates R&D and uses existing technology to help address historical imbalances.
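A minimal sketch of the reweighting idea Chiba describes, with assumed numbers throughout (the 80/20 sample split, the 50/50 target and the outcome scores are all hypothetical): records from the under-represented group are up-weighted so the weighted average reflects the target population rather than the skewed historical sample.

```python
# Assumed figures: a historical study sample is 80% male, but the
# target population is 50/50, so each female record counts for more
# and each male record for less.
sample = {"male": 0.8, "female": 0.2}   # share in historical data
target = {"male": 0.5, "female": 0.5}   # share we want to represent

weights = {g: target[g] / sample[g] for g in sample}
print(weights)  # {'male': 0.625, 'female': 2.5}

# Hypothetical measured outcome (e.g. a drug-response score) per record.
outcomes = [("male", 0.70)] * 8 + [("female", 0.40)] * 2

naive = sum(v for _, v in outcomes) / len(outcomes)
weighted = (sum(weights[g] * v for g, v in outcomes)
            / sum(weights[g] for g, _ in outcomes))

print(round(naive, 2))     # 0.64
print(round(weighted, 2))  # 0.55
```

The naive average is dragged towards the over-sampled male response; the weighted average is what a balanced sample would have shown.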

Taking action

To address gender gaps, we need to improve how we collect (and disaggregate) data and how we use data to shape decisions. This begins with acknowledging existing biases, which rest with people, not numbers or algorithms.

“The most important thing is to recognise the bias in our decision heuristics and to become conscious of this bias,” Chiba says. “For example, if I’m made aware that I only hire people who look and sound like me, it doesn’t mean I can’t ever hire an Indian male employee again – it means I become conscious of my familiarity bias. That will affect how I think and make decisions in the interview process. Remember, we are the data generators. We need to change our decisions to change the bias we see in AI.”

Perez notes that closing the gender data gap will not magically fix all the problems faced by women, “but getting to grips with the reality that gender-neutral does not automatically mean gender-equal would be an important start”.
