Impact of Content Moderation Strategies on Cross-Platform Toxicity: An Agent-Based Analysis

Erik Nordby

Georgia Institute of Technology (OMSCS Student)

Date: 4/16/2026

Abstract

The rapid rise of social media has brought with it numerous challenges in moderating toxic and harmful content. Given the interconnected nature of social media sites and the ease with which users can move between platforms, overly strict moderation may push users to migrate to more toxic environments.

This paper presents an agent-based approach for modeling this migration and the impacts that different moderation techniques may have on social media users. Using the Construct simulation framework, I modeled user behavior across multiple platforms to better understand how varying moderation policies affect both individual platforms and the broader social media ecosystem.

Research Focus

Primary Questions

  • How do moderation strategies impact system-wide toxicity?
  • What are the patterns of user migration between platforms?

Methodology

  • Agent-based simulation using the Construct framework
  • Measured impacts across moderation strategy, strictness, and accuracy

Platform Impact
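To make the setup concrete, here is a toy Python sketch of the kind of agent-based dynamic studied here, not the Construct framework itself or the paper's actual model. Every name and parameter (the two platforms, the strictness weights, the moderation `accuracy`) is illustrative: agents with a fixed toxicity level post on a strict platform, and a flagged post pushes them to a laxer one.

```python
import random

random.seed(42)

# Hypothetical per-platform moderation strictness (probability scale).
PLATFORMS = {"strict": 0.8, "lax": 0.2}

class Agent:
    def __init__(self, toxicity):
        self.toxicity = toxicity   # probability that a given post is toxic
        self.platform = "strict"   # everyone starts on the strict platform

    def step(self, accuracy=0.9):
        """Post once; if moderation flags the post, migrate to the lax platform."""
        toxic_post = random.random() < self.toxicity
        flagged = toxic_post and random.random() < PLATFORMS[self.platform] * accuracy
        if flagged:
            self.platform = "lax"  # strict moderation pushes the user out

agents = [Agent(random.uniform(0, 1)) for _ in range(500)]
for _ in range(20):                # 20 simulated posting rounds
    for a in agents:
        a.step()

migrated = sum(a.platform == "lax" for a in agents)
print(f"{migrated}/500 agents migrated to the lax platform")
```

Even this minimal version reproduces the qualitative pattern in the abstract: stricter moderation concentrates toxic users on the laxer platform rather than removing them from the ecosystem.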

Moderation strategies showed varying impacts on platform toxicity:

  • The distribution of users across toxicity levels was the most important determinant of platform toxicity.
  • The moderation strategy also had a major impact on overall toxicity levels.
  • Individual platform toxicity was easier to control, while overall toxicity exhibited more complex relationships.

User Behavior

User responses to moderation varied by strictness and moderation techniques:

  • Strict moderation led to increased user migration
  • Soft moderation techniques like warnings and temporary bans mitigated these migration patterns
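One way to encode the finding above is to treat each moderation technique as exerting a different amount of "migration pressure," scaled by strictness. The weights and function below are purely illustrative, not values from the simulation.

```python
# Hypothetical migration-response model: hard removal pushes users toward
# leaving, while softer interventions (warnings, temporary bans) give the
# user a chance to adjust instead. All weights are made up for illustration.
MIGRATION_PRESSURE = {"remove": 0.6, "temp_ban": 0.25, "warn": 0.1}

def migration_probability(technique: str, strictness: float) -> float:
    """P(user leaves the platform after a moderation action), clamped to [0, 1]."""
    return min(1.0, MIGRATION_PRESSURE[technique] * strictness)

for tech in ("remove", "temp_ban", "warn"):
    print(tech, migration_probability(tech, strictness=0.9))
```

Under this toy model, the same strictness level produces far less migration when applied through warnings than through outright removal, matching the pattern the bullets describe.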