Tackling online misogyny in Ethiopia
Ethiopian women face rising online discrimination, hindering equality. University of Manchester researchers, with the Centre for Information Resilience and local partners, used NLP to expose the issue and offer evidence-based steps for safer digital participation.
As more of our lives move online, new risks are emerging alongside new opportunities. One of the most concerning is technology-facilitated gender-based violence (TFGBV), the gendered harassment, abuse and discrimination carried out or amplified through digital platforms. For many women and girls, this creates barriers to safe, meaningful participation in public life.
In Ethiopia, TFGBV has become a serious challenge, yet little quantitative evidence has existed to measure its scale or inform solutions. Aiming to fill that gap, a University of Manchester team led by Dr Riza Batista-Navarro, in collaboration with the Centre for Information Resilience (CIR), carried out the ADAGE project to reveal the scale and nature of gendered hate speech online.
Natural language processing (NLP)
To carry out the research, CIR developed a lexicon of more than 2,000 inflammatory terms across four languages – Amharic, Afaan Oromo, Tigrigna and English.
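To make this concrete, the sketch below shows one way a multilingual lexicon like this could be used to surface candidate posts for closer analysis. The placeholder terms, language keys and matching logic are illustrative assumptions only, not the project's actual lexicon or code.

```python
import re

# Hypothetical placeholder lexicon: in practice this would be loaded from the
# curated term lists for Amharic, Afaan Oromo, Tigrigna and English.
LEXICON = {
    "english": ["slur_a", "slur_b"],   # placeholder terms
    "amharic": ["ቃል_ሀ"],               # placeholder terms
    "afaan_oromo": ["jecha_a"],         # placeholder terms
    "tigrigna": ["ቃል_ለ"],               # placeholder terms
}

# One case-insensitive, word-boundary pattern per language.
PATTERNS = {
    lang: re.compile(r"\b(" + "|".join(map(re.escape, terms)) + r")\b", re.IGNORECASE)
    for lang, terms in LEXICON.items()
}

def find_candidate_posts(posts):
    """Yield (post, language, matched_terms) for posts containing lexicon terms."""
    for post in posts:
        for lang, pattern in PATTERNS.items():
            hits = pattern.findall(post)
            if hits:
                yield post, lang, sorted(set(hits))

if __name__ == "__main__":
    sample = [
        "An ordinary post about the weather.",
        "A post containing slur_a aimed at a woman.",
    ]
    for post, lang, terms in find_candidate_posts(sample):
        print(lang, terms, "->", post)
```

Lexicon matching of this kind only flags candidates; deciding whether a flagged post actually constitutes hate speech is where the NLP framework described next comes in.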
Dr Riza Batista-Navarro’s team then combined expertise in computational linguistics and NLP with knowledge of the Ethiopian online context to develop a framework for identifying hate-containing posts on social media, factoring in dimensions such as the target, type and nature of the hate speech.
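As a rough illustration of those dimensions, the sketch below represents one annotated post as a simple record. The field names and category values are hypothetical assumptions for illustration; they are not the project's published annotation scheme.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical category values, for illustration only.
TARGETS = {"individual", "group"}
HATE_TYPES = {"insult", "mockery", "stereotype", "threat"}
NATURES = {"gendered", "ethnic", "religious", "intersectional"}

@dataclass
class HateSpeechLabel:
    """One annotated social media post, labelled along several dimensions."""
    post_id: str
    language: str                      # e.g. "amharic", "afaan_oromo", "tigrigna", "english"
    contains_hate: bool
    target: Optional[str] = None       # who is targeted (one of TARGETS)
    hate_type: Optional[str] = None    # how the hate is expressed (one of HATE_TYPES)
    nature: Optional[str] = None       # which characteristic is attacked (one of NATURES)

    def __post_init__(self):
        # Posts labelled as containing hate must carry values for every dimension.
        if self.contains_hate:
            assert self.target in TARGETS
            assert self.hate_type in HATE_TYPES
            assert self.nature in NATURES

# Example record: a mocking, gendered post aimed at an individual.
example = HateSpeechLabel(
    post_id="12345",
    language="amharic",
    contains_hate=True,
    target="individual",
    hate_type="mockery",
    nature="gendered",
)
```

Recording each dimension separately is what allows analyses like those below, for example comparing the types of hate directed at women with those directed at men, or examining how gender intersects with ethnicity.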
This approach enabled the analysis of millions of social media posts, of which more than 7,000 were examined in detail. The analysis led to two key findings: (a) unlike Ethiopian men, Ethiopian women receive substantial hate speech in the form of mockery, irony and gender stereotypes that imply inferiority; and (b) the risk of women being targeted by online hate speech is compounded by other protected characteristics such as ethnicity. Working closely with Ethiopian experts, the team ensured cultural and linguistic accuracy, producing the first large-scale labelled dataset of its kind.
Data to inform action
The findings show that women and girls face distinct forms of online abuse compared to men and boys. Gendered insults, stereotypes, and mockery are commonplace, often minimised or dismissed as less harmful than threats or aggressive language. Yet these forms of abuse reinforce harmful gender norms and contribute to the silencing of women in public life. Intersectional abuse, where gender combines with ethnicity or religion, was also prevalent, particularly during times of conflict.
Addressing TFGBV is vital to ensuring women and girls can participate safely and meaningfully in public life.
The project has already produced a report and a set of 34 recommendations across seven policy areas, designed to guide government, civil society and tech companies towards a safer online environment and greater gender equality. The recommendations include targeted, platform-specific responses; greater public education on hate speech; and stronger action from governments, civil society organisations and social media companies.
By strengthening the evidence base and providing practical recommendations, the ADAGE project has helped to make online spaces safer for women and girls and to support their participation in digital and public life.
Meet the researcher
Dr Riza Batista-Navarro is a Senior Lecturer in Text Mining in the Department of Computer Science at the University of Manchester. Her work focuses on developing natural language processing methods for information extraction, explainable text classification, machine reading comprehension and language modelling.