Breaking the algorithm: The rise of 'algospeak' in social media
By Nancy Nuñez, María Morena Vicente and Emiliano Rodriguez Nuesh
In an age of information overload, the internet is often seen as a tool for knowledge, yet it remains surprisingly challenging to find reliable information on certain critical but underreported issues, such as ongoing conflicts and genocides. In this article, we explore the systemic barriers that limit the visibility of these topics and the role search engines play in reinforcing this invisibility.
Algorithmic bias
Search algorithms prioritize popular or mainstream content over issues that affect marginalized groups. This phenomenon, known as algorithmic bias, means that complex ranking systems can push lesser-known crises far down in search results, effectively isolating these topics from public consciousness.
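The dynamic described above can be sketched in a few lines of code. The following is an illustrative toy only, not how any real search engine works: it ranks posts by a simple engagement score, and the low-engagement crisis report lands at the bottom regardless of its importance. All titles, scores, and weights are hypothetical.

```python
# Toy engagement-weighted ranker (all data and weights are hypothetical).
posts = [
    {"title": "Viral dance challenge", "clicks": 90_000, "shares": 12_000},
    {"title": "Celebrity gossip roundup", "clicks": 55_000, "shares": 7_500},
    {"title": "Report on an underreported humanitarian crisis",
     "clicks": 1_200, "shares": 300},
]

def engagement_score(post):
    # A crude weighted sum; real ranking systems use far richer signals.
    return post["clicks"] + 5 * post["shares"]

ranked = sorted(posts, key=engagement_score, reverse=True)
for rank, post in enumerate(ranked, start=1):
    print(rank, post["title"])
# The crisis report ranks last: no signal for importance, only engagement.
```

The point of the sketch is that nothing in the scoring function penalizes the crisis report deliberately; it is simply invisible to a metric built around clicks and shares.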
Whether this isolation results from user preferences, search engine biases, or a combination of both remains a pressing question. Algorithms favor content that generates clicks and engagement, often at the expense of stories that may evoke discomfort or require more nuanced understanding.
How do our own psychological tendencies—such as avoiding distressing information—intersect with technological filters, contributing to a self-reinforcing cycle of ignorance?
The rise of social media platforms has created additional psychological and technological filters that shape how we consume information. The concept of psychic numbing finds new relevance in the way social media algorithms curate content. Platforms like TikTok increasingly rely on engagement metrics that prioritize entertainment or viral content over substantive global events. This fosters a form of psychological distancing, allowing users to disconnect from complex issues like genocide or humanitarian crises.
The emerging phenomenon of 'algospeak'
On social media platforms, the emerging phenomenon of 'algospeak' is a form of coded language that users develop to avoid content moderation, particularly when discussing sensitive or controversial issues. The term combines 'algo-', from 'algorithm', with 'speak', and describes these evasive language strategies.
A recent example involves how Palestinian voices on TikTok and Instagram have been forced to use euphemisms and indirect terms to discuss the Israel-Hamas conflict without triggering algorithmic censorship.
Another instance of algospeak emerged during the Black Lives Matter protests in 2020, where activists used modified hashtags and alternative spellings to continue conversations about systemic racism despite algorithmic suppression. Creative adaptations like using "BLM" instead of "Black Lives Matter" or adding characters and numbers to avoid flagging helped these movements remain visible online.
Algospeak has become essential to human rights discussions online, where a growing lexicon of coded phrases has emerged to evade content moderation:
"Unalive" – Used in place of words like "kill" or "death" to avoid content removal in discussions about violence, war, or protests.
"SA" or "S/A" – Abbreviation for "sexual assault," commonly used to prevent the content from being flagged or suppressed by social media algorithms.
These coded phrases create a digital workaround for discussing critical issues without triggering algorithmic moderation. While this allows users to evade content takedowns, it also presents a challenge: the need to constantly adapt language to stay visible.
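The mechanics of this workaround can be made concrete with a small sketch. The example below is a deliberate simplification: real moderation systems rely on machine learning classifiers, not simple word lists, but a naive blocklist is enough to show why substitutions like "unalive" and "SA" evade flagging.

```python
# Illustrative only: a naive keyword blocklist (hypothetical) and the
# kind of substitution map algospeak relies on.
BLOCKLIST = {"kill", "death", "sexual assault"}

def is_flagged(text: str) -> bool:
    # Flag the post if any blocked phrase appears anywhere in it.
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

# Coded substitutions drawn from the examples above.
ALGOSPEAK = {"kill": "unalive", "death": "unalive", "sexual assault": "SA"}

def encode(text: str) -> str:
    # Swap each flagged phrase for its coded equivalent.
    for plain, coded in ALGOSPEAK.items():
        text = text.replace(plain, coded)
    return text

original = "The report documents death tolls from the conflict."
print(is_flagged(original))          # → True: "death" matches the blocklist
print(is_flagged(encode(original)))  # → False: "unalive" slips through
```

The sketch also hints at the arms race the article describes: once moderators add "unalive" to the blocklist, users must coin a new substitution, and the cycle repeats.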
While algospeak enables users to share sensitive content, it also has drawbacks. The use of coded language can fragment public discourse, creating confusion or misunderstanding for audiences unfamiliar with the evolving terms. Additionally, important discussions may become hidden behind layers of cryptic language, limiting their reach and accessibility.
As social media continues to evolve, so too does the language we use to communicate within its constraints. While algospeak offers a temporary workaround for censorship, it also underscores the need for better mechanisms to ensure that vital discussions, especially those concerning human rights and global conflicts, can be held openly and without fear of suppression.
Here are some useful links to further explore how social media algorithms and phenomena like algospeak impact the visibility of critical issues:
Pro-Palestinian creators use secret spellings and code words to evade social media algorithms
Civil rights groups urge Facebook to fix 'racially biased' moderation system