User blog:DToast/Introducing Discussions AbuseFilter

We believe discussions are part of what makes a healthy, well-rounded community. They’re a place where people can share ideas, ask questions, and learn from each other. They provide a space for members to express themselves openly and honestly, which is critical for building trust in a community.

But what about the times when people abuse the feature with spam and vandalism? Over the years we’ve made incremental improvements to our moderators’ toolsets, but we’ve acknowledged that some major areas still need improvement. Earlier this year, at Community Connect, we committed to providing large, new, and impactful moderation tools for Discussions. Today we’re releasing the first such tool: Discussions AbuseFilter.

How It Works
You may be familiar with an existing tool called AbuseFilter, and the name of this new Discussions tool is no accident. We took the premise and general workflow of that MediaWiki tool, which covers wiki articles, and implemented it for Discussions while also making some feature enhancements.

Like AbuseFilter, Discussions AbuseFilter allows admins and moderators to proactively prevent certain bad or undesired content, like spam and vandalism, from being added to their wikis. While Special:Block lets you stop a specific user from editing, AbuseFilter tools make that decision based on the written content, not the contributor. To do this, moderators create filters. A filter automatically detects when a user has posted abusive content and flags the post for action. It can also automatically block users who try to make such posts, so you don’t have to worry about them adding bad content anywhere else in your community. And because filters are triggered automatically, once you set one up and verify it is working as expected, you can let it do the work for you.

This tool can help you to:
 * Keep your community safe by automatically detecting and removing abusive content before it ever gets published.
 * Prevent spam or other unwanted content from appearing in your discussion feed.
 * Handle content issues specific to your community. For instance, does your wiki have spoiler policies that people often ignore? You can design a filter that keeps posts within the guidelines by preventing certain phrases from being saved.
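
Filters are written as boolean conditions: when a condition evaluates to true for a new post, the filter’s configured action (flagging, disallowing, or blocking) fires. As a rough, hypothetical sketch in the style of the classic AbuseFilter rule language (the exact variables and functions exposed in Discussions AbuseFilter may differ), a spoiler filter like the one described above could look something like:

```
/* Hypothetical example: match posts that contain
   known spoiler phrases, case-insensitively */
contains_any(
    lcase(added_lines),
    "character x dies",
    "the true ending is"
)
```

Here `added_lines`, `lcase`, and `contains_any` are taken from the MediaWiki AbuseFilter rule language; treat the specifics as illustrative rather than as documented Discussions AbuseFilter behavior.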

Discussions AbuseFilter is also internationalized for our supported languages, so it will work the same on any written content, regardless of whether that content is in English, Spanish, French, etc.

All of this is done via a user interface (UI) built with Fandom’s design system and modern engineering principles, so this powerful tool is intuitive and easy to use. Prior to this full rollout, we released a beta version of Discussions AbuseFilter on 25 wikis so they could try it out and give us feedback. Through that we were able to make a number of improvements, and the feature’s ease of use was one of the most common pieces of positive feedback we received.

How to access Discussions AbuseFilter
Discussions AbuseFilter is turned on by default for wikis that have both AbuseFilter and Discussions enabled. On other wikis, the extension will be made available where there is a clear and current need for it; administrators can request it from their assigned Wiki Representative or from us.

Even if you don’t have Discussions enabled, the tool allows for content moderation on other parts of your community that rely on the same Feeds technology, such as Message Walls, Article Comments, and Blogs.

Once enabled, you can set filters and view hit logs at Special:DiscussionsAbuseFilter. Any other questions you may have can be answered on the Discussions AbuseFilter Help Page.

How Discussions AbuseFilter was built
The conception and development of Discussions AbuseFilter was unique for product development at Fandom, so we wanted to take a moment to talk about the collaborative process that got us here. You may know Noreplyz, a member of our SOAP team, which handles spam and vandalism across the site. On his own initiative, Noreplyz built a proof-of-concept Discussions AbuseFilter for Community Central that proved such a feature could work. Intrigued by the idea, the Community team took a proposal to our Chief Product Officer and Chief Technology Officer, saying they wanted to sponsor and develop this feature. They agreed.

From there, Noreplyz and his friend Alex, both engineers outside of Fandom, began building the feature with support from Máté, an engineer on our CATS (Creators Admins Tools and Staff) development team. Together they successfully integrated Discussions AbuseFilter into Fandom’s code base, enabling us to begin testing the feature on a set of test communities.

That testing experience was inspired by the great work done during the testing of Interactive Maps; the same process is now being used for mobile theming ahead of its full rollout, and it will be the model we use moving forward for new community features. Through that testing, we made a number of improvements based on community feedback that got us to where we are today with the full release of Discussions AbuseFilter.

We are very thankful to everyone who was involved in the building of this feature!

Looking Ahead
If you’d like to provide a suggestion or have a question, post a comment below, ask in the #discord-abusefilter channel on Fandom’s Discord, or send us a ticket!

As we said at the beginning of this blog, this tool is the first - but not the only - Discussions moderation improvement we are working on. Stay tuned to this blog in the coming weeks, because we have a few other excellent improvements coming before the year ends - including post and reply history, which our development team will work on after completing the mobile theming work.

We also know that some historically Gamepedia wikis have been eager to try out our social features, but we haven’t enabled that opportunity due to our own concerns around moderation. With today’s release, we are exploring what, if any, technical blockers still exist for enabling these features on Gamepedia wikis so we can finally provide the option to try them out on all wikis across the community platform.

Discussions AbuseFilter is an exciting new tool that will help us combat abuse in Discussions. We’re excited to see how it performs in the wild, and we hope you are too!