Measuring Gender Bias in Information Retrieval Models

Abstract

Information Retrieval (IR) is crucial for finding digital information, and search engines based on IR form an integral part of our daily lives. However, these systems can reinforce harmful societal biases, including gender bias, contributing to a vicious cycle of bias between data, algorithm, and user. It is therefore essential to define reliable metrics for quantifying gender bias in IR systems and to understand the reinforcing mechanisms in this ecosystem. In this thesis, we propose a novel extrinsic (application-level) metric that measures bias in search results, a semi-extrinsic metric that assesses bias in the rankings of selected documents, and an intrinsic metric that evaluates bias in query embeddings. We compare two types of extrinsic metrics: metrics based on the occurrence of gender-specific language in the search results, and metrics that define bias as a skew of the search results for a non-gendered query toward those for gender-specific variations of the same query. Conducting experiments on a set of non-gendered, bias-sensitive queries, we find that the former perform well only in certain subject areas, while the latter are more susceptible to the influence of irrelevant search results. We further investigate the correlation between the bias measurements of the three metrics. Our experiments indicate a moderate positive correlation between the intrinsic and extrinsic metric results across three analysed IR models, and a high positive correlation between the semi-extrinsic and extrinsic metric results across five IR models, although the per-model results for two of the five models show negative or no correlation. We further use the metrics to measure bias in models before and after fine-tuning them on biased user interaction data. We find that 10 out of 13 bias measurements across different models indicate an increase in bias; however, only the semi-extrinsic metric detects a significant effect.
Overall, our work contributes to the advancement of reliable metrics for measuring bias in IR.


Citation

Linda Ratz
Measuring Gender Bias in Information Retrieval Models
Advisor(s): Navid Rekab-saz,
Johannes Kepler University Linz, Master's Thesis, 2023.

BibTeX

@mastersthesis{Ratz2023master-thesis,
    title = {Measuring Gender Bias in Information Retrieval Models},
    author = {Ratz, Linda},
    school = {Johannes Kepler University Linz},
    year = {2023}
}