Capilano Courier

Is Toxicity Hardwired into Social Media Platforms?

Posted on December 1, 2025 (updated December 11, 2025) by Editor-In-Chief

Research using AI suggests it is

Lily Dykstra (she/her) // Contributor
Rachel Lu (she/her) // Illustrator

It's no secret that, over the years, social media has garnered a reputation for being problematic. On platforms such as Instagram and TikTok, we sometimes blame this negative atmosphere on users, but, as it turns out, social media's toxicity may be more a product of its fundamental structure. While human interaction is a crucial component of social media, a recent study suggests that most of these platforms operate in ways that incentivize bad behaviour, and that neither users nor recommendation algorithms are necessarily to blame.

In 2025, two researchers at the University of Amsterdam used LLMs (large language models) to create bots that simulated social media users. The bots interacted with one another: they posted, reposted and followed each other. The main news feed showed 10 posts at a time, only five of which came from accounts the bot already followed, to mimic the way users discover new accounts and the way popular content tends to surface more often. The platform was built without any "complex recommendation algorithms"; the goal was to "construct a minimal environment capable of reproducing well-documented macro-level patterns." Each round of the simulation, in which the bots operated within the artificial platform much as humans would in real life, consisted of a randomly chosen user reposting, sharing or doing nothing with the posts shown on its feed. Who each bot followed was determined by what it reposted, and bots often interacted only with users who shared their own "beliefs" or views. In the end, the researchers found the same problems many real-life users have already been experiencing on social media, including political echo chambers, negative influence and followings concentrated among a small percentage of users.
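To make that setup concrete, here is a minimal sketch of what a simulation loop like this might look like. It is not the researchers' code: the bot names, the numeric "stance" score and the simple repost-what-agrees-with-me rule are illustrative stand-ins for the LLM-driven personas the study actually used.

import random
from dataclasses import dataclass, field

@dataclass
class Bot:
    name: str
    stance: float                         # hypothetical "belief" score between -1 and 1
    following: set = field(default_factory=set)

@dataclass
class Post:
    author: Bot
    stance: float
    reposts: int = 0

def make_feed(bot, posts, size=10):
    # Half the feed comes from accounts the bot already follows,
    # the rest from whatever has been reposted the most overall.
    followed = [p for p in posts if p.author.name in bot.following]
    feed = random.sample(followed, min(size // 2, len(followed)))
    for p in sorted(posts, key=lambda p: p.reposts, reverse=True):
        if len(feed) >= size:
            break
        if not any(p is q for q in feed):
            feed.append(p)
    return feed

def decide_to_repost(bot, post):
    # Stand-in for the LLM persona: repost only content close to the bot's own views.
    return abs(bot.stance - post.stance) < 0.3

def run_round(bots, posts):
    bot = random.choice(bots)             # one randomly chosen user acts per round
    for post in make_feed(bot, posts):
        if post.author is not bot and decide_to_repost(bot, post):
            post.reposts += 1
            bot.following.add(post.author.name)   # who you follow is driven by what you repost

bots = [Bot(f"bot{i}", random.uniform(-1, 1)) for i in range(50)]
posts = [Post(author=b, stance=b.stance) for b in bots for _ in range(3)]
for _ in range(2000):
    run_round(bots, posts)

# Inspect how follower counts end up distributed across accounts:
counts = sorted((sum(b.name in o.following for o in bots) for b in bots), reverse=True)
print(counts[:10])

Even in a toy version like this, reposts feed back into both visibility and follow decisions, which is the kind of structural loop the study set out to isolate.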

On social media platforms, it's common for extreme political opinions to garner more attention than less inflammatory content, which, in turn, creates a kind of feedback loop where users are exposed primarily to certain kinds of views. Whether they are aware of it or not, users are more likely to interact with content that is already attracting high numbers of likes and comments, furthering its reach and ensuring that only what is popular gets seen. Finally, over time, followers become concentrated among a small number of users, minimizing the visibility of everyone else and perpetuating already-popular ideas.

Within the study, after the bots began exhibiting the kinds of negative behaviours now stereotypical of social media platforms, the researchers implemented a number of changes to the platform's design, primarily ones that come up in popular discourse about how to make social media a more positive place. The changes included ordering the feed chronologically, obscuring like counts and hiding user bios to help limit the echo-chamber effect. Overall, these changes had little effect, and in some circumstances exacerbated the issues. In short, the researchers found that tweaking the algorithms does little to change the way social media operates, and in particular does little to remedy its harmful outcomes.
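Sticking with the toy simulation sketched above, tweaks of that kind could be wired in as simple switches like the ones below. The exact switches, the optional timestamp and the use of the author's name as a stand-in for a bio are assumptions of the sketch, not the study's implementation.

def build_feed(posts, size=10, chronological=False):
    # With the chronological switch on, popularity plays no role in ordering.
    # (Assumes posts carry a timestamp; the toy Post above would need one added.)
    if chronological:
        return sorted(posts, key=lambda p: getattr(p, "timestamp", 0), reverse=True)[:size]
    return sorted(posts, key=lambda p: p.reposts, reverse=True)[:size]

def render_post(post, hide_counts=False, hide_bio=False):
    # Controls what information reaches the bot's decision step.
    view = {"stance": post.stance}
    if not hide_counts:
        view["reposts"] = post.reposts
    if not hide_bio:
        view["author"] = post.author.name
    return view

The point of the finding is that interventions at this layer only change what a bot sees, not the repost-and-follow loop underneath, which is why they did so little to improve outcomes.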

In short, the study's "findings challenge the common view that social media's dysfunctions are primarily the result of algorithmic curation," and "the problems may be rooted in the very architecture of social media platforms." The findings bring into question not only how social media can be fixed, but also the ethics of using platforms that are, evidently, so inherently flawed.

 

Category: News
