The National Suicide Prevention Lifeline staffs the national suicide hotline (1-800-273-TALK) and has now teamed up with Facebook, the world’s largest social network, to offer online crisis services to certain Facebook members.
I say “certain” Facebook members, because you can’t just log onto Facebook and seek out this free service. You first have to publicly post a comment somewhere, such as on your wall, indicating that you’re suicidal. Then you have to wait for a concerned friend or family member to read your post, click the “Report” link, and report it to Facebook. A Facebook staffer then reviews the report and, if the post meets the company’s suicide criteria, sends the original user an email.
In this email from Facebook, the user will find a reminder about the national suicide hotline. But this special email also contains something you won’t find on the Facebook website or on the website of the National Suicide Prevention Lifeline: a link to chat immediately online with a volunteer crisis counselor.
Facebook is providing the financial support for this new service, so it’s not surprising the company wants to limit the service’s use. That’s a shame, because with the resources of a company like Facebook, this sort of suicidal crisis chat service could be made available to any user, without that person first having to publicly post about their suicidal intent.
This is a great new resource, and we commend both Facebook and the Lifeline for providing it as an option to their users. But the new service has a dark side as well…
Facebook is Suddenly Your Paternalistic Health Buddy
Not surprisingly, as a long-time patient advocate and expert in the world of online mental health, I do have some issues with how the service has been rolled out. I want to emphasize that I believe the service itself is an important resource.
My issues are related to concerns about a user’s privacy on Facebook and the unusual role Facebook has taken on for the first time: provider of personal health information. Your very own paternalistic health buddy, so to speak.
You see, in the world of mental health, even the fact that you’re seeing a mental health provider is considered privileged health information. You can’t just call a psychologist’s office and ask, “Hey, is so-and-so your patient?” That’s private health information.
By sending a well-intended email to the user’s private email account, Facebook is delivering personal health information to that account, something I doubt most Facebook users ever imagined Facebook doing. In this case, the personal health information is that you are someone who may be depressed, suicidal and in crisis.
In most cases, the assumption Facebook and Lifeline have made, namely that a person’s email account is private, is probably accurate. But in a small minority of cases it isn’t. The email address may be shared, or it may be monitored by a concerned parent (or by a nosy spouse or partner).
In these kinds of cases, sending such information clearly violates the person’s expectation of health privacy, especially if they’ve chosen not to share this information with that third party. Suddenly the third party knows about the suicidal person’s health status without their consent.
Many will write all of these concerns off as meaningless once someone has made the decision to take their own life; all bets are off, the thinking goes. In fact, a person who is actively suicidal can be committed to a hospital even against their will.
And in general, I agree — nearly any intervention for someone who is suicidal is better than none at all. ((Why? Because suicide is most often an irrational decision based upon a temporary, intense emotional state of depression. It’s a symptom of a treatable mental disorder. Add to that the fact that few suicidal people actually want to die (they really just want an end to their pain, and for someone to reach out and help them with that), and it makes sense we try our best to intervene in some manner.))
But I worry about the “slippery slope” such an innocuous email opens the door to. If an email is okay to save someone’s life, why wouldn’t it be okay to try to stop people from engaging in unhealthy (even life-threatening) behavior in general, or any behavior deemed potentially deadly or dangerous? ((And I haven’t even mentioned the problem of false positives: reports about someone who writes something “suicidal” on their Wall that is just a harmless morbid phrase making the meme rounds, such as “please shoot me in the face.” When is such a post a genuine cry for help, and when is it just a joke? Suddenly Facebook is in the unenviable position of having low-level staffers make psychologically-driven decisions in an area where they have little expertise.))
Imagine you post your drunken party photos on Facebook, and then mention you’re going to drive home. The next morning, you may find a helpful email from Facebook reminding you of the dangers of drunk driving. Or worse, the police are dispatched to your location that night (yay geo-tagging!) because Facebook staffers called them after your post was reported. Does anyone want Facebook to become Big Brother?
Last, it’s a concern that this email would be sent without the user ever having given Facebook permission to contact them about health issues. This is a service I’d want to “opt in” to, not one I’d have to “opt out” of (and I don’t think there is, today, any way to opt out of it even if you wanted to). It subtly suggests that some people’s privacy is less important and less valid than others’. The message this sends is, “If you’re suicidal, your health privacy isn’t valid any more. Sorry.”
There are few quick and easy answers to my concerns. One possible solution is to dump the email altogether and simply have Facebook send the message through its own messaging service; it’s not clear why Facebook chose not to go this route. Facebook might also be wise to ask its users for permission to opt in to such a service before implementing it, since the service fundamentally changes a user’s expectations of their experience with Facebook.
In general, I think this is a great service offered by people trying to address the societal problem of suicide, a problem that hasn’t budged much despite our efforts to solve it. Since most people are online nowadays (and I’ve seen suicidal pleas since I got started on the Internet in 1991), this is a move that makes sense.
But did anyone sign up for Facebook thinking they might wake up one day to find a concerned email in their inbox from Facebook discussing some personal health concern? No, I doubt many of us did. And for many people, that may present a bit of a problem, especially if Facebook doesn’t proactively let its users know about this new feature.
Because even the best of intentions can have far-reaching, unforeseen consequences.
I also thought it appropriate to mention another online chat service, one that has been around for about a year now and sadly hasn’t gotten as much attention. Unlike the Facebook service just announced, this one doesn’t require you to go public with your suicidal intentions or interact with Facebook staff through email.
It’s called CrisisChat.org and is available to anyone within the U.S. (sorry, it’s not available internationally) from 12 noon to 12 midnight EST every day. Trained volunteers and staff at any of its 10 centers nationwide (soon to be 12) answer the chats. All centers are members of CONTACT USA and are nationally accredited by CONTACT USA or AAS, and they meet additional accreditation standards in the area of Online Emotional Support.
The only problem with this other free service is that it doesn’t always have enough volunteers to cover the need (which is quickly growing online). There’s nothing more frustrating than being in crisis or suicidal and getting the equivalent of a busy signal. But that’s no fault of the service itself, which is limited by the funding and donations it receives.
Speaking of funding, I thought I’d also mention that the National Suicide Prevention Lifeline is funded by a grant from the U.S. Substance Abuse and Mental Health Services Administration (SAMHSA) and administered by Link2Health Solutions, a wholly owned subsidiary of the Mental Health Association of New York City (MHA-NYC). The National Suicide Prevention Lifeline provides free and confidential crisis counseling to anyone in need 24/7 and has answered over 3 million calls since its launch in 2005.
Read the press release: Facebook provides first-of-a-kind service to help prevent suicides
Read the PC Magazine article: Facebook Launches Suicide Prevention Effort
Comments
I would think that a friend would call 9-1-1, call the friend who posted the suicidal update, or call a family member or friend who could get to the suicidal friend quickly; any of these would be more likely than someone reporting the post to Facebook. I can’t see anyone I know clicking on the “Report” link and then walking away from the situation. In fact, that would really bother me.
It’s a good idea that Facebook is establishing this service, but sadly I cannot envisage it helping those who are truly hell-bent on ending their lives. My friend, who sadly killed himself a little over 2 years ago, wrote only in his final status update that he couldn’t sleep. There were no other warning signs.
Standard crisis counselling available to anyone through the chat facility would be better. I cannot count the number of times I have logged on to Facebook in a state of severe depression and felt alone when none of my close friends are available to chat.
I think this will make a lot of people feel like their privacy is being violated, and nobody wants to feel like something is being forced on them. It will make people feel attacked and criticized. I don’t know if I think this is a good idea.
What a waste of time and money. Anybody that is SERIOUS about killing themselves is not going to announce it on Facebook and risk being committed. Even if someone did post it, the procedure is so cumbersome that by the time the email was received, either they’d be over it or dead.