The Federated Moderation Toolkit

The Federated Moderation Toolkit is an evidence-based, human-centric initiative to enhance safety and moderation practices across federated digital spaces. By combining collective moderator expertise with current research in platform moderation, the project aims to create a safer, more inclusive federated digital commons.
Traditional social media moderation relies on centralized control and algorithmic enforcement. Federated platforms need something fundamentally different: a toolkit that enables coordination without centralization, consistency without uniformity.
Unified dashboard: We're developing a federated moderation dashboard that lets moderators manage safety across multiple platforms from a single interface. Whether you're moderating a Mastodon, Pixelfed, or Bonfire server, you could view reports, coordinate responses, and share safety intelligence through one unified system.
Two paths to adoption: Platforms could either implement our documented specifications directly into their codebases or integrate with our dashboard through APIs and webhooks. This flexibility ensures that both resource-rich and resource-constrained platforms can benefit from advanced moderation capabilities.
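For the second path, a platform would act on events pushed from the dashboard. The sketch below shows one common webhook pattern: verifying an HMAC signature on the raw request body before trusting the payload. The event type `report.created`, the report payload shape, and the signature header are all assumptions for illustration, not a published specification.

```python
import hashlib
import hmac
import json

def verify_webhook(secret: bytes, payload: bytes, signature: str) -> bool:
    """Check an HMAC-SHA256 signature (e.g. from a hypothetical
    X-Moderation-Signature header) against the raw request body,
    using a constant-time comparison to avoid timing attacks."""
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

def handle_report_event(payload: bytes) -> dict:
    """Parse a hypothetical 'report.created' event and return the report."""
    event = json.loads(payload)
    if event.get("type") != "report.created":
        raise ValueError(f"unexpected event type: {event.get('type')}")
    return event["report"]
```

A resource-constrained platform could stop here, letting the dashboard handle triage, while a larger platform might feed the same events into its own moderation queue.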
Evidence-based development: Every feature we build is grounded in systematic research and validated through continuous testing with practicing moderators. We're not guessing what federated moderation needs; we're building solutions based on documented evidence and real-world expertise.
Project approach
The project addresses critical gaps in federated platform safety through a comprehensive approach that includes:
- Research and evidence synthesis: Identifying and documenting best practices for federated platform moderation through systematic analysis of existing research and community knowledge
- Reference implementation: Developing practical solutions as extensions that will power the Bonfire framework and serve as implementation guides for other platforms
- Community co-design: Collaborating with 200+ experienced moderators from IFTAS's international community for user research, co-design, and rapid prototyping
- Cross-platform compatibility: Creating interoperable tools and documentation that work across different federated platforms
- Participation in standards development: Coordinating with the ActivityPub Trust and Safety Taskforce to develop and propose enhancement protocols, and reaching out to the Prosocial Design Network, Fires, TSF, and ROOST.tools to co-design, gather feedback, and develop integrations with relevant initiatives in the open
What we're building
The Federated Moderation Toolkit consists of three interconnected layers that work together to transform how moderation happens in federated environments:
Technical infrastructure
Community-informed moderation features developed through extensive co-design with practicing moderators and validated through iterative user research. These tools, built on Bonfire's extensible ActivityPub framework, represent ready-to-use solutions that any ActivityPub platform can choose to adopt directly or integrate via APIs.
Operational framework
Evidence-based workflows, response templates, and shared vocabulary developed through extensive research and co-design with practicing moderators, informed by evidence, experience, and standards from groups such as the Trust and Safety Professionals Association and the Trust and Safety Foundation, as well as the DTSP Safe Framework Specification (ISO/IEC 25389). This framework enables consistent, effective moderation practices while adapting to each community's unique needs and policies.
Knowledge ecosystem
Comprehensive documentation, implementation guides, and educational resources that make sophisticated moderation practices accessible to platforms regardless of their technical resources or moderation experience.
This integrated design approach ensures that each component works synergistically. For example, a reporting feature encompasses not just technical code, but also UX patterns informed by moderator research, configuration options adaptable to different community policies, and comprehensive documentation explaining implementation and interoperability considerations.
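To make the reporting example concrete: the ActivityStreams vocabulary that ActivityPub builds on already defines a `Flag` activity for reporting content. A toolkit reporting feature might construct one like the sketch below, where the `category` property is an assumed extension for shared safety vocabulary, not part of the ActivityStreams spec.

```python
def build_flag_activity(actor: str, target: str, reason: str, category: str) -> dict:
    """Build a minimal ActivityStreams 'Flag' activity (the vocabulary's
    report type). 'actor' is the reporter's ID, 'target' the reported
    object's ID; 'category' is a hypothetical extension property."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Flag",
        "actor": actor,
        "object": target,
        "content": reason,
        "category": category,  # assumed extension, not standard AS2
    }
```

Interoperability then comes down to every participating platform agreeing on how such activities are delivered and interpreted, which is exactly what the documentation layer is meant to pin down.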
Team
This collaboration combines moderation expertise, technical knowledge, and community support, positioning the project to develop crucial features benefiting the entire social web ecosystem.
- IFTAS (Independent Federated Trust and Safety): Led by Jaz-Michael King, providing access to an extensive international moderation community and established trust and safety expertise.
- Erin Kissane: Co-author of grant-funded research on Fediverse governance and moderation, bringing critical academic and policy perspectives.
- Bonfire Networks Team: Providing technical development expertise and a proven platform for federated social infrastructure.