Algorithms and Compassion Collapse

By María Morena Vicente and Emiliano Rodriguez Nuesch

Research shows that social media algorithms tend to show us content that aligns with our interests and beliefs, limiting our exposure to differing opinions and perspectives.

This reinforcement of like-minded views fosters polarization, as we become less likely to understand or empathize with those outside our bubble.

This dynamic can contribute to "compassion collapse," in which people feel less empathy for others.

How do algorithms determine what we read and see online?

Algorithms shape what we see online by analyzing our behavior—what we click, like, share, and spend time viewing—and using this data to curate content tailored to our interests and habits.

This personalization creates feedback loops that continually reinforce our preferences and limit our exposure to diverse perspectives.

While this makes our online experience feel relevant, it often traps us in filter bubbles that shape our knowledge, opinions, and interactions with the world.
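
To make this mechanism concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the POSTS catalog, the topics, and the scoring rule); it is a toy model of an engagement-driven feed, not any real platform's ranking system.

```python
from collections import defaultdict
import random

# Hypothetical catalog: 20 posts, each tagged with one topic.
POSTS = [
    {"id": i, "topic": topic}
    for i, topic in enumerate(["politics", "sports", "science", "music"] * 5)
]

def recommend(scores, k=3):
    """Rank posts by the user's accumulated interest in each topic."""
    ranked = sorted(
        POSTS,
        key=lambda p: scores[p["topic"]] + 0.1 * random.random(),  # small random tiebreak
        reverse=True,
    )
    return ranked[:k]

def simulate(rounds=50):
    scores = defaultdict(float)  # the user starts neutral on every topic
    for _ in range(rounds):
        feed = recommend(scores)
        clicked = random.choice(feed)    # the user engages with one item in the feed
        scores[clicked["topic"]] += 1.0  # engagement reinforces that topic
    return dict(scores)

print(simulate())
```

Run it a few times: whichever topic happens to attract the first clicks quickly dominates the scores, and soon the feed shows little else. That runaway amplification of early, partly random engagement is the feedback loop behind a filter bubble.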

How do we go beyond the bubbles?

Cultural psychologist Michele Gelfand conducted an experiment to test a simple idea: what if reading someone else's diary could reduce prejudice and increase empathy?

In her research, Gelfand used a method known as the "diary contact technique," where participants read authentic diaries from someone of another culture, aiming to reduce cultural distance and challenge stereotypes.

In the study, 10 Americans and 10 Pakistanis kept diaries for a week, recording the mundane details of their daily lives. Then, 200 participants from each culture were asked to read the diaries.

The results were compelling: participants who read diaries from the other culture showed reduced prejudice, viewing its members as more human and more similar to themselves. They still recognized cultural differences, but the experience narrowed the perceived distance between the two groups.

This illustrates how individual stories increase our empathy and compassion through the singularity effect: we respond more strongly to a single, identifiable person than to an anonymous group.

There are many other things we can do to break free from filter bubbles.

We can start by actively seeking diverse voices and perspectives. Follow people, organizations, and media outlets that challenge your usual views to broaden your understanding of different ideas and experiences.

For example, non-profit platforms like Braver Angels promote civil discourse between opposing political views, fostering understanding and reducing polarization. Engaging in such projects can provide a broader and clearer perspective on reality.

Beyond simply reducing your time online, there are experiential activities you can join to counteract constant algorithmic exposure.

The Empathy Museum's "A Mile in My Shoes" is a traveling exhibit that invites participants to literally walk in someone else's shoes while listening to an audio recording of their life story. It aims to build empathy by offering direct insights into someone else's experiences.

The Human Library builds a positive framework for conversations that challenge stereotypes and prejudices through dialogue. It is a place where real people are "on loan" to readers, and where difficult questions are expected, appreciated, and answered. The initiative now operates in more than 80 countries.

Engaging in meaningful face-to-face conversations with those who hold different views fosters empathy and understanding in ways that online interactions often cannot.

What other things can you do when you’re online?

Diversify Your Information Sources. Follow across ideologies by subscribing to news outlets and organizations that represent a range of political, cultural, and global perspectives.

Use Bias-Tracking Tools. Platforms like AllSides and Ground News show how stories are covered across ideological divides. Read articles from sources labeled left, center, and right to form a balanced view.

Search and Engage Intentionally. Interact with diverse posts to train recommendation systems to suggest a broader range of content (see the sketch after this list), and search for perspectives you don't usually engage with.

Follow Opposing Accounts. Intentionally follow accounts or pages that challenge your views, creating a more diverse feed.

Reset Algorithmic Personalization. Regularly clear your browsing and search history, and use incognito mode to explore topics without prior personalization influencing the results.

Turn Off Autoplay and Recommendations. Disable autoplay and, where the platform allows it, personalized recommendations on sites like YouTube to weaken echo-chamber reinforcement. Disable personalized ads and tracking as well.

Be Critical and Intentional. Before clicking or sharing, ask: Who benefits from this perspective? and What might the opposing side say?
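
Continuing the toy simulation from earlier (same invented POSTS catalog and recommend function, still purely illustrative), a small change models intentional engagement: each round the user also seeks out the topic the feed currently surfaces least.

```python
def simulate_with_intent(rounds=50):
    scores = defaultdict(float)
    topics = {p["topic"] for p in POSTS}
    for _ in range(rounds):
        feed = recommend(scores)
        clicked = random.choice(feed)
        scores[clicked["topic"]] += 1.0
        # Intentional engagement: also interact with the topic the feed
        # currently shows least, nudging the ranking back toward balance.
        least_seen = min(topics, key=lambda t: scores[t])
        scores[least_seen] += 1.0
    return dict(scores)

print(simulate_with_intent())
```

In this toy model, one deliberate click per round is enough to keep any single topic from monopolizing the feed; the interest scores, and therefore the recommendations, stay much more balanced.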

Keep in mind: algorithmic systems reflect human biases, but they can also be reshaped by human agency. By taking proactive steps like these, you can outsmart algorithms and cultivate a more nuanced understanding of the world, reducing polarization and fostering empathy.