How to tackle the sticky gender bias in AI?
- Any examination of bias in AI must recognize that these biases stem mainly from humans’ own inherent biases.
- Natural language processing, a critical ingredient of common AI systems such as Amazon’s Alexa and Apple’s Siri, has been found to exhibit gender bias.
- The data that informs algorithms, AI, and automation should be sex-disaggregated.
Artificial Intelligence (AI) appears neutral, but it is made by humans, which means it internalizes the same biases we have – including gender bias. This can be seen in popular AI applications we use today, many of which propagate gender stereotypes. To produce fairer technology, there must be a concerted effort from researchers and machine learning teams across the industry to correct this imbalance.
It is important to note that gender bias is not merely a male problem. A UNDP report entitled Tackling Social Norms found that about 90% of people — both men and women — hold some bias against women. After all, AI applications are generally trained on data generated by humans, and humans are inherently biased. For instance, natural language processing (NLP), a critical ingredient of common AI systems such as Amazon’s Alexa and Apple’s Siri, has been found to show gender bias – and this is not an isolated case.
While most AI-powered virtual assistants have women’s voices, IBM’s Watson, one of the best-known AI systems, is named after a man. That is why a more concerted effort from researchers and machine learning teams across the industry is needed to correct this imbalance. Harvard Business Review has cited word embeddings as one source of bias in AI. Like a game of word association, these systems can often associate ‘man’ with ‘doctor’ and ‘woman’ with ‘nurse’. These associations don’t reflect modern society, or at least not how we want modern society to progress.
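The word-association effect can be sketched with a toy example. The vectors below are invented for illustration only (real embeddings are learned from large text corpora and have hundreds of dimensions); the second coordinate is deliberately written to encode a gendered skew, so the analogy “man is to doctor as woman is to ?” lands on “nurse”:

```python
import math

# Toy 3-dimensional "embeddings", hand-written for illustration.
# The second coordinate deliberately encodes a gendered skew --
# in a real model, a similar skew is absorbed from biased text.
vectors = {
    "man":      [1.0, 0.2, 0.1],
    "woman":    [1.0, 0.9, 0.1],
    "doctor":   [0.9, 0.3, 0.8],
    "nurse":    [0.9, 0.9, 0.8],
    "engineer": [0.9, 0.2, 0.8],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via the vector b - a + c."""
    target = [vb - va + vc for va, vb, vc in zip(vectors[a], vectors[b], vectors[c])]
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "doctor", "woman"))  # -> "nurse", because of the baked-in skew
```

In a real embedding the skew is not hand-written; it is learned from the statistics of the training corpus, which is exactly why biased text yields biased associations.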
How does gender bias occur in AI?
Simply put, gender bias enters during machine learning, and the training dataset is a common example. If not enough women contribute to the data, there will be holes in the AI’s knowledge, and that is why biased errors happen. Machine learning is, of course, led by humans, which means their own biases will be incorporated into the AI system.
The problem is that algorithms are everywhere, making decisions on our behalf in ways that are often opaque to us. With 78% of AI professionals being men, male experiences inform and dominate algorithm creation. This gender bias can have significant adverse implications for women. A real world built and designed using data about men ignores the needs of half its population. This holds true even when artificial intelligence is harnessed to solve challenges facing all of humanity.
What can we do about it?
Frankly speaking, technologies are rarely gender-neutral in practice. If AI and automation continue to ignore women’s experiences or leave women behind, everyone will be worse off. Experts reckon that all standards related to AI and automation should integrate a gender perspective in a holistic manner, rather than treating gender as merely a bias issue to be managed.
That is why the first and most important step in fighting gender bias in AI is to correct the datasets used to train and test systems, models, and algorithms. The data that informs algorithms, AI, and automation should be sex-disaggregated; otherwise, women’s experiences will not inform these technological tools, which may in turn continue to internalize existing gender biases against women. Moreover, even data related to women should be guarded against inherent gender bias.
For that to be possible, women should be active participants—rather than mere passive beneficiaries—in creating AI and automation. Women and their experiences should be adequately integrated into all steps related to the design, development, and application of AI and automation. In addition to proactively hiring more women at all levels, AI and automation companies should engage gender experts and women’s organizations from the outset in conducting human rights due diligence.
2 December 2022