Capilano Courier

Is Toxicity Hardwired into Social Media Platforms?

Posted on December 1, 2025 by Editor-In-Chief

Research into AI tells us it is

Lily Dykstra (she/her) // Contributor
Rachel Lu (she/her) // Illustrator

It’s no secret that social media has garnered a reputation over the years for being problematic. On platforms such as Instagram and TikTok, we sometimes blame this negative atmosphere on users, but, as it turns out, social media’s toxicity may be more a product of its fundamental structure. While human interaction is a crucial component of social media, a recent study suggests that most of these platforms operate in a way that incentivizes bad behaviour, and that neither users nor recommendation algorithms are necessarily to blame.

In 2025, two researchers at the University of Amsterdam used LLMs (large language models) to create bots that simulated social media users. The bots interacted with one another, posting, reposting and following each other on a platform built without any “complex recommendation algorithms”; the goal was to “construct a minimal environment capable of reproducing well-documented macro-level patterns.” Each bot’s main feed showed 10 posts, only five of which came from accounts the bot already followed, mimicking both the way real users discover new accounts and the way popular content tends to surface more often. In each round of the simulation, a randomly chosen bot either reposted, shared or did nothing with the posts shown on its feed. Who each bot followed was determined by what it reposted, and bots often interacted only with users who shared their own “beliefs” or views. In the end, the researchers found the same problems many real users already experience on social media, including political echo chambers, negative influences and followings concentrated among a small percentage of users.
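The loop described above can be sketched in miniature. The snippet below is a loose illustration, not the study’s actual code: simple random agents stand in for the LLM bots, a one-dimensional “opinion” stands in for their beliefs, and the specific numbers (50 users, 1,500 rounds, a 30% baseline repost chance) are invented for the sketch. What it does keep is the structure the study describes: a 10-post feed with five slots from followed accounts, one random user acting per round, and follows driven by reposts.

```python
import random

random.seed(7)

N_USERS = 50        # invented scale; the study's agents were LLM bots
FEED_SIZE = 10      # ten posts per feed, as described in the study
FOLLOWED_SHARE = 5  # five of them from accounts the user already follows
ROUNDS = 1500       # invented; one random user acts per round

# each agent gets a one-dimensional "opinion" in [-1, 1] as a stand-in for beliefs
opinion = [random.uniform(-1, 1) for _ in range(N_USERS)]
follows = [set() for _ in range(N_USERS)]           # who each user follows
posts = [(u, opinion[u]) for u in range(N_USERS)]   # (author, stance); one seed post each
repost_count = [0] * N_USERS                        # how often each author has been reposted

def build_feed(u):
    """Five slots from followed accounts, the rest biased toward popular authors."""
    own = [p for p in posts if p[0] in follows[u]]
    random.shuffle(own)
    feed = own[:FOLLOWED_SHARE]
    by_popularity = sorted(posts, key=lambda p: repost_count[p[0]], reverse=True)
    feed += [p for p in by_popularity if p not in feed][:FEED_SIZE - len(feed)]
    return feed

for _ in range(ROUNDS):
    u = random.randrange(N_USERS)
    for author, stance in build_feed(u):
        if author == u:
            continue
        # repost chance rises with opinion similarity, mirroring the bots'
        # tendency to engage mostly with like-minded accounts
        if random.random() < max(0.0, 1 - abs(opinion[u] - stance)) * 0.3:
            repost_count[author] += 1
            follows[u].add(author)   # following is driven by what gets reposted
            posts.append((author, stance))
            break                    # at most one action per round

# follows end up concentrated on a small fraction of accounts
followers = [sum(1 for f in follows if a in f) for a in range(N_USERS)]
top5 = sum(sorted(followers, reverse=True)[:5])
print(f"top 5 of {N_USERS} accounts hold {top5 / max(1, sum(followers)):.0%} of all follows")
```

Even without any recommendation engine, the half of the feed that favours already-reposted authors is enough to make visibility, and therefore followers, snowball toward a handful of accounts.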

On social media platforms, extreme political opinions commonly garner more attention than less inflammatory content, which, in turn, creates a feedback loop in which users are primarily exposed to certain kinds of views. Users are more likely to interact with content that already has high numbers of likes and comments, whether they are aware of it or not, furthering its reach and ensuring that what is popular is what gets seen. Finally, over time, followers become concentrated among a small number of accounts, minimizing the visibility of everyone else and perpetuating already-popular opinions.

Within the study, after the bots began exhibiting the kinds of negative behaviours now stereotypical of social media platforms, the researchers implemented a number of changes to the platform, primarily ones raised in popular discourse about how to make social media a more positive place. The changes included ordering the feed chronologically, obscuring like counts and hiding bios to limit the echo-chamber effect. Overall, these changes had little effect, and in some cases exacerbated the issues. In short, the researchers found that algorithmic tweaks do little to change the way social media operates, specifically when it comes to remedying its harmful outcomes.
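One of those tested changes, a chronological feed, amounts to swapping the function that ranks the feed. The toy comparison below, using invented post data, shows what that swap actually changes: the order in which posts appear, but not which posts users choose to engage with once they are shown, which is where the study suggests the problem lives.

```python
# invented toy posts: (timestamp, like_count, author)
posts = [(1, 500, "A"), (2, 12, "B"), (3, 980, "C"), (4, 3, "D"), (5, 41, "E")]

# engagement-ranked feed: most-liked posts first
by_engagement = sorted(posts, key=lambda p: p[1], reverse=True)

# chronological feed: newest posts first, likes ignored entirely
chronological = sorted(posts, key=lambda p: p[0], reverse=True)

print([p[2] for p in by_engagement])   # ['C', 'A', 'E', 'B', 'D']
print([p[2] for p in chronological])   # ['E', 'D', 'C', 'B', 'A']
```

Both feeds contain the same five posts; only their ordering differs, which is consistent with the study's finding that re-ranking alone did little to change the outcomes.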

As the researchers put it, the study’s “findings challenge the common view that social media’s dysfunctions are primarily the result of algorithmic curation,” and suggest that “the problems may be rooted in the very architecture of social media platforms.” The findings call into question not only how social media can be fixed, but also the ethics of using platforms that are, evidently, so inherently flawed.

 

Category: News
