What is Demographic Parity? Explained Simply

In our last article, we delved into machine learning fairness and why it matters. If you missed that article, please see [insert link]. Having established the need for ensuring fairness in the models we develop, the next natural question is: How do we proceed?
Over the years, researchers have developed different ways to measure and mitigate bias in machine learning systems. In today’s article, we will focus on understanding one metric that can help us estimate how fair our machine-learning model is.
A classifier $\hat{Y}$ is said to be demographic parity fair if it assigns positive predictions at an equal rate across the groups defined by a sensitive attribute $A$:
$$P(\hat{Y} = 1| A = 0) = P(\hat{Y} = 1| A = 1)$$
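This check is straightforward to compute from a model's predictions. Here is a minimal sketch in NumPy (the function names `positive_rate` and `demographic_parity_gap` are ours, for illustration, not from any particular fairness library):

```python
import numpy as np

def positive_rate(y_pred, a, group):
    """Estimate P(Y_hat = 1 | A = group): the share of positive
    predictions among members of the given group."""
    return y_pred[a == group].mean()

def demographic_parity_gap(y_pred, a):
    """Absolute difference in positive prediction rates between
    groups A = 0 and A = 1. A gap of zero means demographic
    parity holds exactly."""
    return abs(positive_rate(y_pred, a, 0) - positive_rate(y_pred, a, 1))
```

In practice, the gap is rarely exactly zero, so one usually checks that it falls below a chosen tolerance rather than demanding strict equality.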
Let’s consider an example: we have a trained classifier that predicts whether someone should be granted or denied a loan. We also assume that gender, a binary sensitive attribute (Male or Female), is one of the factors being considered; it plays the role of $A$ in the definition above. What demographic parity seeks to achieve is that the probability of the classifier assigning a positive prediction (in this case, granting the loan) is equal for male and female applicants.
Let’s say we have a dataset of 100 loan applicants. The classifier predicts whether an applicant should be approved for a loan, with a "1" indicating approval and a "0" indicating rejection. For simplicity, we focus on the binary sensitive attribute, gender, where 50 applicants are male and 50 are female.
Now, let’s assume that after training the classifier, we have the following results:
60% of male applicants (30 out of 50) were granted the loan.
40% of female applicants (20 out of 50) were granted the loan.
So, the classifier assigns a positive prediction (loan approval) to 30 out of 50 male applicants, and 20 out of 50 female applicants.
Mathematically, we can calculate the probability of a positive prediction for each group:
$$P(\hat{Y}=1 | A = 0) = \frac{30}{50} = 0.60 \, \text{for males}$$
$$P(\hat{Y}=1 | A = 1) = \frac{20}{50} = 0.40 \, \text{for females}$$
Here, we see that the probability of getting a loan is higher for male applicants (0.60) compared to female applicants (0.40). This violates demographic parity because the classifier does not treat both genders equally in terms of the probability of receiving a loan. In other words, the classifier is not demographic parity fair.
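The same calculation can be reproduced in a few lines of NumPy. This is a hypothetical sketch: the arrays are constructed by hand to match the counts in the example, where real predictions would come from a trained model.

```python
import numpy as np

# Encode the 100 applicants: A = 0 for the 50 males, A = 1 for the 50 females.
a = np.array([0] * 50 + [1] * 50)

# Predictions: 30 of 50 males approved, 20 of 50 females approved.
y_pred = np.array([1] * 30 + [0] * 20 + [1] * 20 + [0] * 30)

p_male = y_pred[a == 0].mean()    # P(Y_hat = 1 | A = 0) = 30/50 = 0.60
p_female = y_pred[a == 1].mean()  # P(Y_hat = 1 | A = 1) = 20/50 = 0.40
gap = abs(p_male - p_female)      # ≈ 0.20, so demographic parity is violated
```

Boolean masks like `a == 0` select each group's predictions, and the group mean of a 0/1 prediction vector is exactly the positive prediction rate.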
Why Does This Matter?
The issue with this kind of disparity is that it may reflect unintended bias in the model, which could stem from historical inequalities, societal stereotypes, or biased training data. For example, if the classifier is trained on past loan approval data where males were more likely to be granted loans, it may learn to disproportionately favor male applicants, even when controlling for other factors such as credit score, income, and debt-to-income ratio.
This can have significant real-world consequences. In this case, women may be unfairly disadvantaged when applying for loans, potentially limiting their access to financial resources. This is why fairness metrics like demographic parity are important—they help us identify and address such biases in machine-learning models before they cause harm.
Written by

Women in Machine Learning and Data Science Accra