Some guidance on dealing with suicidal content online


(Mark Wilkin) #1

You might be interested in this new set of guidelines developed by the NSPA in the UK to help forum managers and moderators respond to suicidal content and help people get the support they need.

I’ve had to create a moderators’ policy on how to handle this kind of thing, so I know how useful it is to have something to work from rather than starting from scratch. It might be UK-focused in the resources it points to, but the processes and information are solidly useful to everyone, I think.

Feel free to pass it around as that’s what it’s there for and do throw them some feedback if you find it useful.

Cheers


(Sarah Hawk) #2

Thanks @Mark_Wilkin and welcome.
Have you had to deal with suicidal content at some stage in your career? I’m interested in their approach to removing content, more specifically the template for contacting someone afterward:

Hi, we’ve seen your post and we’re worried about you. Your last post included an image of a suicide method, so we’ve taken this down (see our house rules).

The mention of rule breaking seems misplaced. Thoughts?

As an aside, I see you work in non-profit/health/illness related communities. We have quite a number of members in similar areas.

Ping @Priscilla @colleenyoung @Sarah_Broughton @Jess_Evans @Michael @Breastcancer_org


(ForumSentinel) #3

This is always a tricky one and an especially heavy topic on our site, especially since back in 2008 someone actually carried out their threat over a video feed they linked. Basically, for any mention we will remove the content and ban the user. We will give the reason, e.g. “suicide thread, 1 month ban”, but won’t make any mention of getting help or try to offer advice, since that can get us into hot water.

We do now have policies for following up with such threats after the fact using the proper off-line channels (such as contacting local authorities) and documenting all the steps taken.


(Sarah Hawk) #4

Legal hot water, you mean?

That sounds like a really awful situation (in 2008). Do you think that banning is the right approach, or are you hamstrung by laws (or something)?


(Richard Millington) #5

@lizcrampton - tagging in Liz, who used to work at ReachOut


(Priscilla McClay) #6

Thanks, Sarah - yes, I was actually planning to pop in and share this myself at some point!

I was one of the community managers interviewed by the author of this document when she was doing her research (as, I think, were several of the others you have tagged here) and I think it’s a great resource.

I developed quite a detailed process for dealing with this type of content, which should be in line with these guidelines. Having a process is really helpful because it means that no one is left on the spot, feeling as though it is all on their shoulders to make a decision. Working in a healthcare organisation has its advantages, in that the process can involve escalation to health and social care staff.

Our processes don’t include banning members for suicidal content, but they do include scope to remove content if it involves a clear intention, or discussion of methods that could be used, and also the scope to contact the emergency services in certain situations. Thoughts and feelings about suicide can be discussed, although we would still get in touch with the member to signpost to more support.

I came across [this article about the Suicide Watch forum on Reddit](http://www.bbc.co.uk/newsbeat/article/35577626/social-media-and-suicide-what-its-like-being-a-moderator-on-rsuicidewatch) the other day - interestingly, it is completely anonymous and they never call the emergency services, which sounds terrifying to me, but arguably does allow people to open up who wouldn’t do so anywhere else.


(Sarah Hawk) #7

Right. Were it not for anonymity then many of those people wouldn’t post at all, affording them no support. I worked as a suicide counsellor and we used the anonymous online community to try and encourage people to call the helpline, but if they chose not to, then we didn’t mobilise services.

It’s a very fine line.


(ForumSentinel) #8

Right. It’s not our position as representatives of the company to offer advice, and no one wants to be the person whose words were the proverbial straw that broke the camel’s back. I just envision all kinds of legal nightmares should, for example, a moderator’s comments be taken that way.

I’m not sure banning is a perfect approach, but for this community it might be the least bad option. Most such claims are trollish, since this community attracts more of that type of posting behavior, and banning is perfect for removing that bait. Those making more serious claims are still better off not posting in the community, since the “alpha male, testosterone-fueled” environment isn’t conducive to talking anyone down or being kind and supportive.


(Sarah Hawk) #9

Makes total sense. It goes to show that it’s definitely a case of knowing your own community and understanding how to apply appropriate guidelines and boundaries.


(Priscilla McClay) #10

Yes, of course, it very much depends on the type of community you’re running. Are you able to signpost the user to other help privately when you notify them about the ban?

Sorry to hear that you had such a horrible situation in 08 - that’s pretty much the worst-case scenario that we’re all hoping we’ll never have to deal with.


(ForumSentinel) #11

In the case of a serious threat like that, we have a process for contacting offline authorities and they’ll step in from there. Attempting to offer help or advice, however, is not a liability we can take on as a company.


(purldator) #12

Disclaimer: I am not a medical professional. I only live it.

The top two answers to this StackExchange question are very good, as supplementary material to the new NSPA guidelines.

Due diligence on your part as a community manager means distracting and holding off a suicidal person until the authorities can reach them via any IP addresses you provide; checking the IP’s geolocation narrows down which police department should be notified. The NSPA guidelines have other good strategies for gathering information from the suicidal person to make in-person contact swift; supplement, add and alter as needed according to the situation’s unique context.

Consider yourself a Good Samaritan. This is “psychological CPR” at its pith.

I suggest treating all suicide threats as real and handling them accordingly. That is, those threats not easily handled and resolved with a bit of empathetic discussion.

Anyone joking will learn the lesson not to, when police and EMTs storm into their home.


(Liz Crampton) #13

Hey - thanks @Mark_Wilkin for sharing and @Priscilla for feeding into the resource. My colleague Celia developed this with the NSPA and worked with around 30 partners on it, so we hope it’s a reflection of all of that input.

@HAWK, you are right re. the template above being a little curt (we were very limited on space, so couldn’t include fuller response templates). Ideally this would read a little softer, and it’s very important to avoid shaming members for writing about suicidal feelings… but it’s also important to link it back to the house rules. The reason is twofold: so members are clear on what they can and can’t write about (e.g. you can write about thoughts/feelings around suicide but not plans), and to reinforce that it’s about protecting the whole community (not just the member who is struggling).

@ForumSentinel - Sounds like that was a really tough situation to be in. I wonder whether you can point members in the direction of support without it being ‘advice’. eg. ‘We’re sorry you’re feeling overwhelmed and having thoughts of suicide. You might find these resources helpful [helplines/links]’. Do check out the resource above for ideas; might be useful.

We’ve had interesting conversations around banning members for repeatedly posting suicidal content, which is technically breaking the house rules. Other mental health CMs, like Depression Alliance and Bipolar UK, have the same problem, as do psychologists and therapists in an in-person format. Interestingly, the Samaritans won’t ban users for crisis calls, but do cap service if the same member calls excessively.


(purldator) #14

That relates to my reply above:

One can only do so much, and say so much, until it becomes a true liability. A person can only do so much CPR.

It is a crisis line; it is not a therapy line. Crisis means “one-time use, designed to push the caller toward higher forms of care”.

Think of it as a person who constantly goes to their best friend for ranting sessions about their abusive spouse. That friend can only do so much. “Move out! Go to a shelter! They can help you more. I cannot help you beyond encouraging words. You need to do this yourself and do the right thing.”

Crisis lines/support communities must eventually cap their service, or outright ban someone. They must think for themselves first, even though it may hurt. Others need to use that service too. Repeat callers take away time others in crisis can use. These crisis lines did what they were designed to do. Now that user/caller must follow through. It is their Free Will to choose in the end.

I see repeat callers as scared to make the leap and get help. “Getting help” is perhaps the reason why they want to commit suicide. “If I go to the hospital, my religious family will disown me; if I leave my spouse, maybe they might try to find me and make my life worse than it is now; I am scared to go into a drastically new setting where I may have no control”.

Banning and capping service forces the caller to make the right choice. Cut them off from the addiction of the short-term cure. Make them go seek what they really need.

I know, because as stated above, I have lived this. I have friends who have, too. I was the one who urged them to go to the ER and told them to have no fear. If they keep talking to me and don’t get help, I will cut them off too. A suicide survivor (me) will cut them off.

And despite the pain of doing it, I know it is for them and not me.


(Sarah Hawk) #15

Thanks for your insight @purldator, you make a very important point. I think one of the tricky things is that people who have never had suicidal thoughts are often scared as hell that they might say or do something that could cause someone to act on their threat, or that if someone does act, they didn’t do enough and are therefore partially to blame. I realise that those thoughts aren’t rational, but it’s such a scary subject, and one which is very hard to be truly empathetic around.

Thank you also for your courage.


(purldator) #16

The mere fact that someone is willing to set aside time to talk to a person and lead or pressure them toward the best long-term solution speaks volumes. That can only be better than doing absolutely nothing and turning a blind eye.

It’s the same as someone willing to risk everything to save someone else in spite of having little or no knowledge or training to do it efficiently. Forget the technical aspects and the law: the empathetic trauma you mention (such as survivor’s guilt) bears down on one’s soul and slowly kills the spirit more surely than any kind of punitive fine or jail time ever could. Doing it anyway, knowing that risk, shows courage.


(Mark Schwanke) #17

Piggybacking on the subject a bit…

How did the community react to you banning the user? What if it isn’t as severe as suicide, but rather they’re just in distress (depressed, mentioning self-harm, etc.) and they need help? I think it is good that you post why the content was removed rather than leaving people to wonder. But what if the community is the only place they can really confide in someone openly? You’ve shut the last door. Some may speak out, but others may leave without saying anything to you.

Does your site have disclaimers that advice given doesn’t replace that of a professional healthcare provider? Does it state that content posted by members of the community isn’t the responsibility or the views of your organization even though you sponsor the community?

Would love to hear everyone’s thoughts on this.


(ForumSentinel) #18

We won’t ban for depressed talk, only for explicit mentions or discussion of suicide. The ramifications of making such threads have been written into the rules for years now, so users know what to expect, and there isn’t any backlash from the rest of the community (one or two here and there might say something, but you will always have a vocal few no matter what you do).

Since the suicide in 2008 was such a tragic and high-profile event, and since as a business it’s simply too much liability to host such discussions, we don’t allow them. If we find out about someone who is considering suicide, we’ll contact their local authorities for a wellness check. Any other assistance is beyond our scope.

We have a very active legal team who takes care of the proper phrasing and coverage statements :slight_smile:


(Mark Schwanke) #19

Thanks for the clarifications. Hope you have a great weekend!

Mark


(Sarah Hawk) #20

Is that something you can share?