Machine Learning Fairness and Why It Matters
Why Machine Learning Fairness Matters
In recent years, machine learning has become increasingly embedded in decision-making processes across various sectors. These systems are now used in fields such as healthcare, where they help make predictions about cancer detection and drug discovery, and in finance, where they influence decisions like loan approval or fraud detection.
While machine learning has shown significant benefits in these areas, there are growing concerns about bias and discrimination embedded in these systems. These biases can emerge unintentionally, and when left unchecked, they can perpetuate inequality. This article aims to explain why this issue is important and why everyone, regardless of their background, should care about it.
To illustrate, I will use an example specific to the Ghanaian context.
A Real-World Example: The Gendered Career Pathways in Ghana
In Ghana, like many other parts of the world, societal expectations have historically shaped career choices. For many years, girls were often encouraged by their parents to pursue careers in nursing or midwifery, especially in the early 2000s. This was because these professions were seen as more accessible and offered steady job opportunities. Boys, on the other hand, were often encouraged to go to medical school if they had strong academic performance, as it was viewed as a more prestigious career path with high earning potential and job security.
Fast forward to today, and many women are thriving as nurses and midwives, and many men have become doctors. While both paths have been successful for many individuals, the career trajectories were, to some extent, influenced by gendered expectations. These societal biases played a role in determining career options, often limiting opportunities based on gender rather than individual capabilities or aspirations.
Linking the Example to Machine Learning
In machine learning, biases can emerge if systems are trained on historical data that reflects societal patterns. For example, if a loan approval system is trained on data from a society where men historically had greater access to high-paying jobs, the system might unintentionally favor male applicants over female applicants, even when other factors, such as creditworthiness, are equal.
This concern isn't just hypothetical; it has already been observed in real-world cases. For instance, a recruiting tool developed at Amazon was reportedly abandoned after it was found to penalize résumés associated with women, because it had been trained on past hiring records dominated by male applicants. Biases like these can perpetuate stereotypes and reinforce existing inequalities, making it harder for marginalized groups to secure equal opportunities.
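The mechanism behind these cases can be sketched in a few lines. The snippet below is a minimal illustration using synthetic, hypothetical numbers (not data from any real system): a naive model that simply learns per-group hiring frequencies from biased historical records will reproduce that bias in its predictions, even though it was never explicitly told to discriminate.

```python
# Synthetic, hypothetical historical hiring records.
# Each record: (gender, qualified, hired). In this made-up history,
# equally qualified men were hired twice as often as women.
historical = (
    [("M", True, True)] * 80 + [("M", True, False)] * 20 +
    [("F", True, True)] * 40 + [("F", True, False)] * 60
)

def hire_rate(records, gender):
    """Fraction of qualified applicants of a given gender who were hired."""
    outcomes = [hired for g, qualified, hired in records
                if g == gender and qualified]
    return sum(outcomes) / len(outcomes)

# A naive model that learns per-group frequencies "predicts" hiring
# at the historical rate for each group, carrying the bias forward.
print(hire_rate(historical, "M"))  # 0.8
print(hire_rate(historical, "F"))  # 0.4
```

Real systems use far richer features than this, but the core failure mode is the same: when the target labels encode past discrimination, optimizing for accuracy on those labels means learning the discrimination too.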
Why Should We Care?
The problem with biased machine learning systems extends beyond technical challenges; it has real-world consequences. Discriminatory algorithms can lead to unequal treatment, perpetuate societal inequalities, and impede progress toward a more just society. As more decisions are made by machines, we must ensure that these systems are fair and equitable for everyone, regardless of gender, race, or socioeconomic status. The increasing use of machine learning in fields such as hiring, healthcare, finance, and law enforcement raises the stakes significantly. If left unaddressed, biases in these systems can exacerbate existing social disparities and undermine public trust in technology.
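Ensuring fairness starts with measuring it. One simple diagnostic, shown below as a sketch with made-up predictions, is the demographic parity difference: the gap in positive-prediction rates between groups. The function name and numbers here are illustrative, not from any standard library; a gap near zero suggests parity, while a large gap flags a system worth auditing.

```python
def demographic_parity_difference(predictions, groups):
    """Gap in positive-prediction rates between the best- and
    worst-treated groups (0 = perfect demographic parity)."""
    rates = {}
    for g in set(groups):
        group_preds = [p for p, grp in zip(predictions, groups) if grp == g]
        rates[g] = sum(group_preds) / len(group_preds)
    ordered = sorted(rates.values())
    return ordered[-1] - ordered[0]

# Hypothetical model outputs: 1 = approved, 0 = denied.
preds  = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]
groups = ["M", "M", "M", "M", "M", "F", "F", "F", "F", "F"]

# Men approved at 0.8, women at 0.2: a large gap worth investigating.
print(round(demographic_parity_difference(preds, groups), 2))  # 0.6
```

Demographic parity is only one of several fairness criteria (others compare error rates rather than approval rates), and which one is appropriate depends on the context, but even a simple check like this can surface problems before a system is deployed.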
Conclusion
Machine learning is a powerful tool with the potential to enhance many aspects of our lives. However, it is crucial to recognize that it is not neutral. Like people, machine learning systems can reflect the biases present in society. By understanding these issues and advocating for fairness in these systems, we can work towards a future where technology benefits everyone equally, without perpetuating harmful stereotypes or discrimination.
Written by Women in Machine Learning and Data Science Accra