Media Narratives and Algorithmic Bias: A Conversation on Race, Identity, and Justice

December 17, 2024

As humans, our perceptions are shaped by memory, experience, and the lessons we learn early in life. Memories influence how we see the world, while experiences help us understand ourselves and others. Foundational beliefs, often formed through what we are taught and observe as children, can remain unexamined into adulthood. For the sake of convenience or comfort, we may accept these beliefs as truth, shaping our unique perspectives. At times, this can lead us to view our own perspective as superior, unaware of how deeply rooted and unquestioned these assumptions truly are.

CILAR recently organized an executive leadership program with the Canada School of Public Service. One of our guest speakers led a thought-provoking discussion on media representation and the importance of leadership. Given that Black federal government executives attended this engagement, our speaker focused on what it meant for him, as a Black man, to critically examine Black representation in the news organization he leads.

The speaker recounted how, in his community, turning on the television often meant seeing a Black person associated with a negative news story—such as an arrest. This led him, and the audience, to reflect on how the appearance of a Black face on the screen frequently elicited assumptions that the content to follow would be unfavourable. This pattern instilled fear and disappointment, adding to the perception that, once again, bad news was linked to a Black individual. However, the speaker also highlighted the infamous Bruce McArthur case, where the public was inundated with images of a convicted killer smiling on social media, enjoying the outdoors, and engaging in everyday activities. He questioned why McArthur’s mugshot wasn’t shown.

For example, why is a 15-year-old White boy arrested for burglary described as a 'boy,' while a 15-year-old Black boy arrested for the same crime is described as a 'man'? The speaker highlighted how media shapes perceptions, often driving negativity toward one race while showing leniency to another. As a news anchor, he emphasized his commitment to equitable representation.

The speaker’s reflections on media's role in shaping racial perceptions resonated with me during a recent conversation with one of CILAR’s community members on the subject of race, identity, and representation. She shared a comment her daughter had received, framed as a compliment, that made me stop and think: 'You don’t look Arab.' This led me to question: How was that a compliment? Why would looking Arab be perceived as undesirable?

Eliyana Haddad, a scholar from the Faculty of Education at York University, brings a thoughtful perspective to the complex discussions of race, racism, and racialization processes. Through her work in Critical Analysis of Early Understandings of Race, Racism, and Racialization Processes, Eliyana critically engages with historical and contemporary discourses, illuminating how these frameworks shape societal structures and lived experiences. The analysis was completed as part of GS/EDUC 5421: Discourses of Race and Racist Discourse under the guidance of Dr. Stephanie Fearon.

Allow me to introduce Eliyana Haddad, a proud Palestinian-Lebanese-Canadian. Inspired by CILAR’s recent research on algorithmic bias and the creation of discriminatory perspectives, I had the privilege of interviewing Eliyana to discuss power, positionality, and the shaping of one’s point of view.

Personal Experiences:
How can media narratives and Western representations of racial identities shape people's sense of self?  

As part of this discourse, we see how algorithms are designed to shape what people see when they scroll, reaffirming certain views and minimizing others.


Media narratives heavily influence individual and collective identities, especially for marginalized groups. Historically, these narratives have relied on stereotypes, framing certain communities, whether racialized, Indigenous, or immigrant populations, in ways that diminish their complexity. Algorithms exacerbate this by prioritizing sensational or conflict-driven content, amplifying biases and reinforcing narrow perceptions of groups. This creates an ongoing challenge for individuals navigating their identities in societies that undervalue or misrepresent them.
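To make that mechanism concrete, here is a minimal illustrative sketch, not any platform's actual ranking code, of a feed ordered purely by predicted engagement; the posts and scores are hypothetical. When engagement is the only objective, conflict-framed coverage crowds out everyday, humanizing stories.

```python
from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    predicted_engagement: float  # reactions/clicks a hypothetical model expects

def rank_feed(posts, top_k=3):
    """Order posts by predicted engagement alone, the single signal many
    recommender objectives optimize, and keep only the top slots."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)[:top_k]

feed = [
    Post("Community opens new youth arts centre", 0.21),
    Post("Arrest footage sparks outrage", 0.87),            # conflict-framed
    Post("Local volunteers rebuild flood-damaged homes", 0.18),
    Post("Protest clashes dominate downtown", 0.79),        # conflict-framed
]

for post in rank_feed(feed):
    print(post.headline)
# The two conflict-framed stories take the top of the feed; the everyday,
# humanizing coverage rarely reaches most readers.
```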

Data and Dehumanization:
How do geopolitical events and dominant Western narratives often erase or misrepresent stories, and what impact does that have on identity?

Dominant narratives often erase or distort the stories of marginalized communities during geopolitical crises, framing them through a lens that serves existing power structures. Examples include Indigenous struggles for sovereignty, Black movements against systemic violence, and the erasure of Palestinian voices in Western discourse. This particular erasure perpetuates the devaluation of Palestinian life and culture, framing Palestinians either as aggressors or as helpless victims. Rinaldo Walcott describes how media outlets use selective data to bolster these narratives, further alienating and dehumanizing marginalized communities (Walcott, 2020). For those like myself with ties to these misrepresented identities, such portrayals not only affect public perception but also contribute to the internalized erasure of identities, as individuals feel disconnected from the full scope of their histories and lived realities.

Algorithmic Decision-Making:
How do you feel about the growing reliance on algorithms in areas like policing, hiring, or loan approvals? Have you encountered or observed specific examples where this reliance perpetuated racial discrimination?

The increasing reliance on algorithms in areas such as hiring, policing, and social services perpetuates systemic inequities. Predictive policing, for instance, disproportionately targets Black and Brown communities because the underlying systems are, in many jurisdictions, built on data collected during “documented periods of flawed, racially biased, and sometimes unlawful practices and policies” (Richardson et al., 2019). Hiring algorithms trained on biased data sets likewise disadvantage women and non-binary individuals or exclude applicants from non-Western backgrounds. By automating inequality, these systems replicate and entrench the discrimination they are meant to overcome.
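As a rough illustration of the feedback loop Richardson et al. describe, the toy simulation below assumes two hypothetical neighbourhoods with identical true offence rates but a historically skewed arrest record. Allocating patrols in proportion to that record keeps the skew alive indefinitely, because the data only reflects where police were sent to look.

```python
# Two hypothetical neighbourhoods with the SAME true offence rate, but a
# historically skewed arrest record (the seed numbers are invented).
historical_arrests = {"Neighbourhood A": 120, "Neighbourhood B": 40}
DETECTIONS_PER_PATROL = 0.5   # identical in both places by construction
TOTAL_PATROLS = 100

for year in range(1, 4):
    total_records = sum(historical_arrests.values())
    # "Prediction": allocate patrols in proportion to past recorded arrests.
    patrols = {
        n: round(TOTAL_PATROLS * count / total_records)
        for n, count in historical_arrests.items()
    }
    for n, p in patrols.items():
        # Each patrol records the same expected number of offences, so the
        # neighbourhood receiving more patrols generates more records,
        # a product of where police look rather than of more crime.
        historical_arrests[n] += int(p * DETECTIONS_PER_PATROL)
    print(f"Year {year}: patrols={patrols}, cumulative records={historical_arrests}")
# The record never converges toward the equal reality: the initial skew in the
# data keeps justifying the same skewed patrol allocation, year after year.
```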

Representation and Erasure:
Who gets to decide which data is collected and used? How does this influence whose stories are told or erased in our digital systems?

The decision-makers behind data collection are typically those in positions of power, such as governments, corporations, and institutions, and often exclude marginalized communities, leading to their stories being erased or misrepresented. This control over data reinforces structural inequalities, as marginalized voices are excluded or tokenized. For instance, the experiences of Indigenous communities are frequently omitted from national narratives, Black and immigrant communities are disproportionately framed in terms of deficits rather than resilience, and while data on Palestinian casualties exists, Western media often downplays or skews it, erasing the human cost of conflict. This exclusion not only devalues these communities but also shapes public understanding and policy in harmful ways.

The Intersection of Race and Identity:
Describe the complexities of navigating identity as someone who is both deeply tied to Arab heritage and perceived differently in Western contexts.

Navigating this dual identity is fraught with challenges and contradictions. I’ve been told, “You don’t look Arab,” as if that were a compliment. Comments like these reflect Western ideals of desirability and the devaluation of Arab identities. Walcott’s critique of racialization applies well here: it helps explain how Arab identity is constructed in Western discourse and how that construction shapes both external perceptions and internal struggles. Being Palestinian and Lebanese in the diaspora means carrying the weight of misrepresentation while asserting pride in a heritage often vilified or misunderstood.

Bias in AI Development:
Many algorithmic biases are baked in at the design stage. Have you witnessed or heard about situations where diverse voices were excluded from the tech development process? How might their inclusion have changed the outcome?

Algorithmic systems frequently reflect the biases of their creators, often due to the exclusion of diverse voices during their development. When systems lack input from women, Black, Indigenous, LGBTQIA+, and disabled communities, to name a few, they fail to account for the unique challenges these groups face. For example, facial recognition technologies have higher error rates for darker skin tones and gender-diverse individuals, highlighting the consequences of this exclusion (Hardesty, 2018). Including these perspectives would lead to more equitable and effective outcomes.
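A small sketch of the kind of disaggregated evaluation such audits rely on follows; the groups, predictions, and counts are invented for illustration. Overall accuracy can look strong while error rates differ sharply between groups, which is exactly what aggregate metrics hide.

```python
def error_rate_by_group(records):
    """records: list of (group, prediction_correct) pairs -> error rate per group."""
    totals, errors = {}, {}
    for group, correct in records:
        totals[group] = totals.get(group, 0) + 1
        errors[group] = errors.get(group, 0) + (0 if correct else 1)
    return {g: errors[g] / totals[g] for g in totals}

# Invented results for a hypothetical face-analysis model: overall accuracy is
# about 94%, yet the error rate is 3% for one group and 20% for the other.
results = (
    [("lighter-skinned", True)] * 97 + [("lighter-skinned", False)] * 3
    + [("darker-skinned", True)] * 16 + [("darker-skinned", False)] * 4
)
print(error_rate_by_group(results))
# {'lighter-skinned': 0.03, 'darker-skinned': 0.2}
```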

Political and Structural Dynamics:
How are racialized narratives constructed and perpetuated in Western discourse and what does that mean for resistance and reclamation of identity?

Western discourses often perpetuate racialized and gendered narratives through media, education, and policy frameworks that simplify complex identities and portray marginalized groups as monolithic or defined by conflict and deficit. For instance, Black communities are often framed in terms of crime and poverty, Indigenous people as relics of the past, and immigrant communities as burdens. Resisting these narratives involves amplifying the voices of those impacted and challenging the systems that perpetuate these stereotypes.

Media and Algorithmic Framing:
Media algorithms often amplify specific narratives. How have you seen this play out in terms of race, particularly in moments of crisis or conflict, such as during uprisings or international disputes?

Media algorithms prioritize content designed to engage, often through sensationalism, which disproportionately affects marginalized groups. During moments of crisis, these algorithms amplify polarizing narratives that frame racialized groups as aggressors or victims, obscuring their humanity. For example, during uprisings in Palestine, algorithms suppressed pro-Palestinian content while amplifying state narratives. This phenomenon is equally evident in portrayals of Black Lives Matter protests, global refugee movements, and Indigenous land defence efforts, where the focus on conflict overshadows resilience and agency. This selective framing distorts public perception and silences dissent. Walcott makes a similar point: algorithms are tools of power, not neutral entities. Challenging this requires both public accountability and systemic reform.

Accountability and Regulation:
In your opinion, what steps can be taken to hold organizations accountable for algorithmic bias? Are there effective models of regulation or community oversight that you support?

Addressing algorithmic bias requires transparent practices, robust oversight, and meaningful inclusion of marginalized communities in decision-making. Models of community-based participatory governance—where impacted groups play a central role in regulating these systems—are promising. Additionally, implementing mandatory bias audits and diversifying the tech sector can help ensure algorithms serve equity rather than exploitation.
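As one hypothetical example of what a mandatory bias audit could check, the sketch below computes selection rates by group from a hiring model's decisions and flags disparities using the common four-fifths rule of thumb. The audit log, group labels, and threshold are all invented for illustration.

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, selected) pairs -> selection rate per group."""
    counts, selected = defaultdict(int), defaultdict(int)
    for group, picked in decisions:
        counts[group] += 1
        selected[group] += int(picked)
    return {g: selected[g] / counts[g] for g in counts}

def disparate_impact_flags(rates, threshold=0.8):
    """Flag any group whose selection rate is below `threshold` times the
    highest group's rate (the four-fifths rule of thumb used in audits)."""
    best = max(rates.values())
    return {g: (rate / best) < threshold for g, rate in rates.items()}

# Invented audit log from a hypothetical hiring model's decisions.
audit_log = (
    [("Group A", True)] * 45 + [("Group A", False)] * 55
    + [("Group B", True)] * 20 + [("Group B", False)] * 80
)
rates = selection_rates(audit_log)
print("Selection rates:", rates)                         # Group A: 0.45, Group B: 0.20
print("Flagged groups:", disparate_impact_flags(rates))  # Group B flagged (0.20 / 0.45 < 0.8)
```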

Reimagining AI for Equity:
If you could redesign an algorithmic system to actively combat racism and promote equity, what would it look like? How would you ensure it serves excluded communities rather than exploits them?

Reimagining AI to promote equity involves designing systems that prioritize inclusivity and fairness. This would mean creating datasets representative of all communities, building transparency into decision-making processes, and actively working to identify and address biases. An equity-focused AI system might prioritize amplifying excluded voices, recognizing intersectionality, and offering pathways to counteract historical injustices, ensuring that technology serves as a tool for empowerment rather than oppression.

This conversation around identity, representation, and the systemic biases embedded in media and technology is more relevant than ever. As highlighted by our speaker and community voices like Eliyana Haddad, these issues are deeply personal yet universal, touching every facet of how underserved groups are perceived and how they navigate their identities. Whether it's through challenging reductive media narratives, interrogating the biases in AI systems, or questioning who controls the stories that shape our world, the goal remains the same: to reclaim agency and foster equity. By addressing these critical questions and amplifying diverse perspectives, we can begin to dismantle the structures that perpetuate inequity and build systems rooted in fairness, representation, and humanity.

About our Guest: Eliyana Haddad
Eliyana holds a bilingual Honours BA in Sociology with a minor in Gender and Women's Studies and a Certificate in Sexuality Studies. She is pursuing a Master’s in Education at York University, focusing on Palestinian resistance, critical race theory, and media narratives. Eliyana also works in communications, leveraging her academic expertise to address systemic inequities and amplify marginalized voices.
Follow her on LinkedIn: https://www.linkedin.com/in/eliyana-haddad/

Author: Rayah Ali
Rayah Ali is a documentary filmmaker and communications professional at CILAR, amplifying underrepresented voices and advocating for mental health awareness.
Follow her on LinkedIn: https://www.linkedin.com/in/rayah-ali-3b31a4119/