Trolls and ignoramuses

(Sarah Hawk) #1

This article by Guy Kawasaki about trolls interests me. Some of the examples are hilarious (and it’s worth reading just for those), but I feel like it misses the mark in many ways.

If it’s to be believed, most trolls are American males who are married with children.

Is that your experience?

(Richard Millington) #2

I think he’s talking more about a caricature of trolls rather than most real trolls today.

Going by his definition (“these are the folks who combine a strong opinion with a lack of knowledge and whose main goal is attention”) you could classify the majority of the internet as trolls. Anyone on Facebook who posts about policy decisions without having read the policy could be a troll.

While most trolls are probably men, the rest would vary pretty wildly by community.

A far better definition I liked was ‘Future Banned Users’. There was a great study last year suggesting that, with enough data, you could identify the people most likely to be banned before they commit the act that gets them banned.

From a member’s first 10 posts alone, the researchers built a model with an 80% accuracy rate at predicting future banned users. I’m honestly surprised more people haven’t tried to build moderation businesses around this.

I blogged about it here. The key signs were:

[quote]
  • Future Banned Users (FBUs) concentrate on fewer discussions.
  • FBUs mostly post replies to existing threads.
  • FBUs post very frequently.
  • FBUs get fewer votes/rating points.

Other possible predictors include:

  • FBUs use less accommodating language.
  • FBUs use language which is less similar to other members.
  • FBUs swear more frequently.
[/quote]
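To make the idea concrete, here is a minimal sketch of how those signals could be turned into a simple risk score over a member’s first posts. The feature names, thresholds, and weights are all illustrative assumptions on my part, not the study’s actual model (which was a trained classifier, not a hand-tuned checklist).

```python
# Hypothetical FBU risk scorer. Thresholds are illustrative assumptions,
# loosely mirroring the signals quoted above; the real study trained a
# statistical model rather than using fixed cut-offs.
from dataclasses import dataclass


@dataclass
class PostStats:
    distinct_threads: int     # how many discussions the member posted in
    replies: int              # posts that were replies rather than new topics
    total_posts: int
    avg_votes_per_post: float
    swear_ratio: float        # fraction of posts containing profanity


def fbu_risk_score(stats: PostStats) -> float:
    """Return a 0..1 score; higher means more FBU-like early behaviour."""
    signals = [
        stats.distinct_threads <= 3,                       # few discussions
        stats.replies / max(stats.total_posts, 1) > 0.9,   # mostly replies
        stats.total_posts >= 10,                           # posts frequently
        stats.avg_votes_per_post < 0.5,                    # fewer votes
        stats.swear_ratio > 0.2,                           # swears more
    ]
    return sum(signals) / len(signals)


# A member whose first 10 posts trip every signal scores 1.0:
risky = PostStats(distinct_threads=2, replies=10, total_posts=10,
                  avg_votes_per_post=0.1, swear_ratio=0.4)
print(fbu_risk_score(risky))  # 1.0
```

In practice you would replace the fixed thresholds with a trained classifier and use the score to flag accounts for human moderator review rather than to ban automatically.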

(Darren Gough) #3

This is interesting @HAWK

(Mark Baldwin) #4

I love this. What a great definition.

(Sarah Hawk) #5

A troll-hunting unit!

(Duncan Field) #6

I’d be interested to know if there is data on specific topics/organizations and the presence of trolls. Is this problem as present in a community of practice as on a more general forum?

(Suzi Nelson) #7

I define a troll as someone who doesn’t care about a problem being solved; they just want to get an emotional response out of people. If you’re wondering whether someone has a genuine problem or is just being toxic, a sure-fire way to sort it out is to ask them a question.

If they reply with more information and context, great! If they respond with more inflammatory remarks, ban and move on with your life. Ain’t no one got time for that.

@duncanfield I think trolls will show up just about anywhere. Just my own primary research, though :wink:

(Duncan Field) #8

Just doing some research into faith-based forums, I’ve seen some expert-level trolling. It made me a little fearful, because we’re building a community with faith as a commonality (of course there are always differences), but it will mostly act as a community of practice in the non-profit sector, so I think that will cut down on the shenanigans. At least that’s the hope!

(Sarah Hawk) #9

I think that depends on the subject or nature of the practice. Tech communities of practice are often rife with trolls. Professional communities less so. That’s anecdotal though – I don’t have data.

This article about trolling has lots of interesting references and strategies.

It is definitely a subject that opens itself up to potential unrest, in the same way that politics does. Make sure you have a strong set of guidelines (which people won’t read, but they still need to exist) and moderate with transparency. Your primary goal is to create a safe place for people.

(Darren Gough) #10

In my experience the single best way to deal with trolls is absolute silence (member to member I mean).

We actually did a bit of reverse engineering on our old vB platform to create an ignore button, which essentially removes that person’s posts from another user’s line of vision. There’s also a pretty sweet function in vB (or there used to be) called “Tachy Goes to Coventry”. Essentially it hides a user’s posts from the entire community, apart from that user themselves.

In other words they can see their own posts and think it’s business as usual, but to everyone else they’ve simply stopped posting. The nice thing about it is that unless they keep logging in and out and running checks (time-consuming and annoying) they can’t tell when it’s been actioned.
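The mechanic Darren describes is what’s now commonly called a shadowban, and the core filtering logic is tiny. This is a hypothetical sketch, not vB’s actual implementation; the function and field names are my own.

```python
# Hypothetical shadowban filter ("Tachy Goes to Coventry" style).
# A shadowbanned member's posts stay visible to the member themselves,
# but are silently dropped for every other viewer.

def visible_posts(posts, viewer, shadowbanned):
    """Return the posts a given viewer should see in a thread."""
    return [
        p for p in posts
        if p["author"] not in shadowbanned or p["author"] == viewer
    ]


thread = [
    {"author": "alice", "text": "Welcome to the forum!"},
    {"author": "troll42", "text": "flame bait"},
]

# Regular members never see troll42's post...
print(visible_posts(thread, viewer="bob", shadowbanned={"troll42"}))

# ...but troll42 still sees it and assumes business as usual.
print(visible_posts(thread, viewer="troll42", shadowbanned={"troll42"}))
```

The key design point is that the filter runs per viewer at read time, so the shadowbanned user’s own view is indistinguishable from normal operation.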

(Duncan Field) #11

That’s the idea. I was reading in some FeverBee materials that co-creating a community ‘charter’ is often a good idea. Is this the sort of behavior to address in that process? Or is this more of a top-down approach?

(Sarah Hawk) #12

Others might feel differently, but I very much think this is the sort of behaviour to address in a community-based charter. Allowing people to make decisions around the kinds of behaviours they are comfortable with (and how to deal with things if they happen) supports autonomy and helps with accountability.

(Richard Millington) #13

The ignore or mute button is a pretty good idea, especially if you can apply it to a troll. Having them post at will while being ignored by everyone is rather clever.

(Sarah Hawk) #14

That was one of the few things that vB did in core that was a step above everyone else. I’m surprised it hasn’t translated into new platforms TBH.

(Mark Baldwin) #15

One feature on Facebook pages that I use quite a bit is the hide comment. This makes their comment only visible to the person who wrote it and their friends. This is particularly good when someone has not quite stepped over the mark to get banned and they will never know that the majority of people can’t see their comment. :slight_smile: