How frequently do you deal with flags, blocks, suspensions, and tough users?


(Joshua Rosenfeld) #1

So I originally posted this topic over on Meta (link), but it didn’t get any traction so @erlend_sh suggested I try asking here. I’ve modified the original post a bit, as Meta is filled with Discourse users, while a variety of platforms are used here. My apologies in advance if I use any Discourse-specific terminology.


So I’ve seen some questions come up over the last few weeks about moderating a Discourse instance. There was this discussion about dealing with overly emotional users, this topic about TL3 (“Trust Level 3” - users who have gained a large amount of “trust” on the site) users with too many flags, and more that I am not finding right now. I guess what I am curious about is what other communities see in terms of their users and content. Over at Stonehearth, in my time as a moderator over the last 4 months, we have seen very few issues. Since our Discourse started just before May of 2013, we have had a grand total of 544 flags (community reports of issues, like spam, inappropriateness, etc.). That works out to an average of just 16 flags per month, and we are seeing a negative (linear) trend in flags over time.
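(For anyone who wants to reproduce numbers like these for their own site, here is a minimal sketch of the calculation, assuming you have monthly flag counts exported from your admin dashboard. The counts below are illustrative placeholders, not our real data.)

```python
import numpy as np

# Hypothetical monthly flag counts exported from an admin dashboard
# (illustrative numbers only, not Stonehearth's actual data).
monthly_flags = [28, 22, 19, 25, 16, 14, 18, 12, 15, 9, 11, 7]

# Simple average of flags per month.
average = sum(monthly_flags) / len(monthly_flags)

# Fit a straight line (degree-1 polynomial) to the counts;
# a negative slope indicates a downward linear trend over time.
months = np.arange(len(monthly_flags))
slope, intercept = np.polyfit(months, monthly_flags, 1)

print(f"Average flags per month: {average:.1f}")
print(f"Linear trend slope: {slope:.2f} flags per month")
```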

We have only 3 long-term (indefinite) suspensions (users blocked from logging in and participating on the site), and very few (fewer than 10, to the best of my knowledge) total suspensions in our history. In the past 3 years, only 5 posts have been auto-hidden by 3 or more flags, though 2 other posts did receive 3+ flags, but of different types. All but one of the auto-hidden posts happened in February of 2015, and those flags were all for one of our now long-term-suspended users (who received 34 of the 60 flags for that month). I have wondered whether the small number of auto-hidden posts is due to the low volume of spam/inappropriate posts we receive, because users don’t tend to flag posts, or because our mods are particularly active. Of the 544 total flags, ~130 (more than a fifth) were raised by the mods themselves and were dealt with immediately.

I guess what I am wondering is what other communities see. Our community is built around an indie game, and the majority of the discussion is focused on bug reporting and game suggestions. We have just over 6000 total users, with an average of 170 user visits per day. We have 5 mods in 4 different time zones (3 US, 2 Europe), 4 active admins, and 9 total admins (if I skip the Discourse team, since we are hosted). What do you experience on your site? How big are you, what is your focus, how active is your mod team, etc.? Do you get a lot of spam/inappropriate posts? How do you decide when to block or suspend a user, and how frequently does that occur?


tl;dr edit per @Jeff_Atwood’s request on Meta:

Question 1: Please provide your general community details (like what your community is for, how busy it is, how large your moderation team is, etc.).

Question 2: How frequently do you run into “issues”, and what do they tend to entail? (like frequency of spam and flags)

Question 3: How do you deal with “big deal issues”, like suspensions, and how often do you encounter the need to do so?

Bonus Question: If your site has more than one moderator/manager, how do you decide when something needs to be discussed as a team, versus staff going ahead with something on their own?


(Richard Millington) #2

Gaming tends to be a fairly specific use case. The metrics don’t tend to work well in more corporate communities.

For us (and I’m not checking the stats here as I write this)…

  1. Our community is busy without being overwhelming. No moderation team, just Hawk and me (mostly Hawk) taking care of things. I tend to clear flags if I see them when I log in, but that’s usually only when Hawk is asleep on NZ time.

  2. I think we get a couple per month. Some are legitimate comments that got caught up in the system; a few are genuine spam. But it’s rare either way, fortunately.

  3. I tend to immediately suspend any spammer, but again, we really don’t get many; I suspect they are bots. We sometimes remove posts that we feel are overly promotional or off-topic.

BQ: Hawk and I generally duke it out until there’s only one person left standing.


(Sarah Hawk) #3

As Rich has already outlined our situation here (which, admittedly, is a fairly unique use case), I’ll discuss a previous Discourse community that I managed, which has more in common with your situation, Josh.

Question 1: SitePoint is a community for web developers. It is large (we migrated 280k active users and tens of millions of topics from vB to Discourse a few years back), and I had a mod team of ~50. That team was structured into 3 tiers.

Question 2: We’d clear ~10 flags per day and ban a few members each week.

Moderation was a very difficult road for us to navigate when we moved from vB. It was one of my motivating factors for the move, but it proved very hard for some of my team. I could see that we were stuck with antiquated processes (and ideas) that needed to be thrown out, while many of the team wanted to recreate those processes in Discourse.

The beauty of Discourse is that it empowers your members to define the kind of community that they want to be a part of. Flags exist because people felt the need to flag. That’s a good thing – it means that your education and communication processes are working. If a community is getting ‘too many flags’ then you need to start priming some different behaviours.

It looks to me like your community has a healthy culture and the tools that Discourse provides are working well.


(Joshua Rosenfeld) #4

Thanks Rich and Sarah, different communities certainly do have different levels of issues. I am glad to hear that it sounds like we have a healthy culture!

While I would agree with you that gaming tends to be a niche case, I would like to see how we compare to other “support”-type communities. The support category and its subcategories tend to be the busiest part of our forum, with users reporting crashes, bugs, and other issues that they need assistance with. I wonder how we compare to those types of communities (corporate or not).


(Sarah Hawk) #5

Interesting point re the support side of things. @Claudius manages the support community at Skype. He may have interesting insights (although they won’t be Discourse-based).


(purldator) #6

Aye. Discourse allows a community to define its unspoken Miller Test in a more public and open manner, without manual staff intervention.

SitePoint would be an online social psychologist’s dream to root around in: seeing the “problem” users (air quotes) in situ, along with the surrounding stimuli those users respond to and then express as unwanted behavior.

A reply to the topic in general: much appreciated for sharing your stats, @jomaxro. Very interested in hearing from others.

A trend toward fewer unwanted behaviors (and the flags that mark them) correlates with the use of Discourse’s trust levels feature. It was the same for my community (which used a “pseudo” trust-level system), and that is part of why it lasted so long. Over ten years I banned only around three to four individuals for unwanted behavior unrelated to spamming, and made one user read-only as requested by the entire userbase. Some were unbanned, too, after talking over the behaviors that warranted the ban.

I was the only true moderator and sole staff member, alongside a membergroup composed of what Discourse would call “Leaders” in its trust system, hand-picked by me and eventually picked by the Leaders already in the group. I also had one of them as a “pseudo co-admin” in their own hidden membergroup; they had the ability to move a user into a membergroup that prevented them from seeing any of the boards, silencing them without a formal ban in case of emergency. That pseudo co-admin only ever used it for the occasional spam bot.

Members used the “report post” feature to flag questionable content and the behaviors expressed therein; everything was taken care of discreetly, preventing or mitigating disharmony with the userbase at large.

(For context, I used SMF 2.0 beta for my fandom (see avatar) community of ~400 active users.)