Bluesky Experiences Explosive 17x Surge in Moderation Reports in 2024 Amidst Rapid Growth

Bluesky has released its annual moderation report for 2024, detailing significant growth in its user base and the resulting strain on its Trust & Safety team. With the surge in users came a substantial increase in reports of harassment, trolling, and intolerance, raising concerns about online safety and community standards.

Bluesky’s Rapid User Growth

In 2024, Bluesky welcomed over 23 million new users, becoming a popular alternative for those migrating from Twitter/X. This influx can be attributed to various factors, including:

  • Changes in user blocking policies at X
  • Political shifts following the U.S. presidential election
  • A temporary ban on X in Brazil, which drove users to Bluesky

Increased Moderation Efforts

To cope with the heightened demand for moderation, Bluesky expanded its moderation team to approximately 100 moderators and is actively recruiting more staff. The company has also initiated psychological counseling for its moderators to support them in handling graphic content.

In total, Bluesky received an astonishing 6.48 million moderation reports in 2024, a dramatic increase from the 358,000 reports in 2023.

New Features for User Engagement

This year, Bluesky plans to enhance user interaction with moderation processes by allowing reports to be submitted directly through the app. This feature will enable users to track actions and updates on their reports more effectively.

During a peak period in August, when Brazilian users flocked to the platform, Bluesky recorded up to 50,000 reports per day, necessitating the hiring of additional Portuguese-speaking moderators.

Automation in Moderation

To manage the surge in reports, Bluesky has begun automating report categories beyond just spam. This automation has produced some false positives, but processing times have improved dramatically: high-certainty cases are now resolved in just a few seconds, down from roughly 40 minutes previously.
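Bluesky has not published details of its pipeline, but the behavior described above (auto-resolving high-certainty cases in seconds while routing the rest to human moderators) resembles a simple confidence-threshold triage. The sketch below is purely illustrative; the threshold, function names, and toy classifier are all assumptions, not Bluesky's actual system.

```python
# Hypothetical confidence-threshold triage for moderation reports.
# All names and values here are illustrative assumptions, not Bluesky's
# real pipeline, which is not public.

AUTO_ACTION_THRESHOLD = 0.95  # assumed cutoff for "high-certainty" cases

def triage(report: dict, classifier) -> str:
    """Route a report: auto-resolve high-certainty cases in seconds,
    queue everything else for a human moderator."""
    label, confidence = classifier(report["content"])
    if confidence >= AUTO_ACTION_THRESHOLD:
        return f"auto:{label}"        # resolved automatically
    return "human_review_queue"       # escalated to the moderation team

def toy_classifier(text: str):
    """Stand-in for a real ML model: returns (label, confidence)."""
    if "buy now" in text.lower():
        return ("spam", 0.99)
    return ("unknown", 0.40)

print(triage({"content": "BUY NOW cheap followers"}, toy_classifier))  # auto:spam
print(triage({"content": "an ambiguous post"}, toy_classifier))        # human_review_queue
```

In practice the trade-off is exactly the one the report hints at: a lower threshold clears the queue faster but produces more false positives, which is why an appeals process remains essential.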

Report Statistics and User Behavior

In 2024, approximately 4.57% of active users (around 1.19 million) made at least one moderation report, a decrease from 5.6% in 2023. The breakdown of reports includes:

  • 3.5 million for individual posts
  • 47,000 for account profiles
  • 45,000 for lists
  • 17,700 for direct messages (DMs)

The majority of reports were related to anti-social behavior, indicating a strong desire among Bluesky’s community for a healthier online environment compared to platforms like X.

Categories of Reports

Bluesky categorized reports as follows:

  • Misleading Content: 1.20 million
  • Spam: 1.40 million
  • Unwanted Sexual Content: 630,000
  • Illegal or Urgent Issues: 933,000
  • Other: 726,000

Moderation Appeals and Account Takedowns

In 2024, 93,076 users submitted a total of 205,000 appeals regarding moderation decisions. Additionally, there were:

  • 66,308 account takedowns by moderators
  • 35,842 automated account takedowns
  • 238 requests from law enforcement and governments

Bluesky responded to 182 of these requests and complied with 146, with most coming from Germany, the U.S., Brazil, and Japan.

Conclusion

Bluesky’s moderation report for 2024 provides valuable insights into the platform’s growth and the challenges it faces in maintaining a safe and welcoming environment. As the social media landscape continues to evolve, Bluesky’s commitment to addressing user concerns through enhanced moderation and user engagement initiatives will be crucial for its ongoing success.
