Facebook increasingly using Artificial Intelligence to screen your posts for suicide risk


    Facebook is increasingly using Artificial Intelligence to flag certain posts or private messages as suicidal. (Photo courtesy CBS Newspath)

    Depending on what you write on Facebook, an ambulance may show up at your doorstep. Facebook is increasingly using Artificial Intelligence to scan people's posts and private messages for signs of suicide risk, according to a recent New York Times report. Globally, algorithm-flagged posts are prompting Facebook to contact local law enforcement officials at a rate of 10 times a day.

    In the Corridor, Foundation 2 Crisis Center says it hasn't yet seen Facebook's algorithm play out locally, but on a number of occasions, third-party callers have contacted the center about a risky post they've seen.

    "So somebody saw a post from a friend, a family member, maybe someone they don’t know very well that concerns them," said Elisabeth Kissling, marketing director for Foundation 2 Crisis Center. "We try to coach them as much as we can if they know the person and where they live."

    Kissling says their team appreciates Facebook's close work with the National Suicide Prevention Lifeline. For years, Facebook has allowed users to flag posts they see as potentially suicidal.

    "It’s good to know that Facebook has those tools so we can coach someone to click that button to report," said Kissling.

    When it comes to A.I. doing the flagging, not everyone is jumping on board. Dr. John Torous is the head of the digital psychiatry division in the Department of Psychiatry at Beth Israel Deaconess Medical Center, a Harvard Medical School affiliated teaching hospital.

    "The concerning part is, as members of the public, we don't really know what Facebook is doing," said Dr. Torous, citing Facebook's lack of data showing the accuracy of the algorithm and subsequent ambulatory responses. "Is Facebook actually getting it right? Are they actually preventing and saving lives? Are they getting it wrong and causing people embarrassment and public resources? Are they scaring people from talking about mental health because people are worried that 'big brother' is always watching, so it's best not to even talk about it and reach out for help?"

    Kissling, with Foundation 2, says when it comes to assessing suicide risk, it's always best to err on the side of caution.

    "There may be times when law enforcement response really wasn’t needed, but hopefully it’s handled in the best way possible if that does happen," said Kissling.

    Similar to Foundation 2, the Cedar Rapids Police Department tells CBS2 News that Facebook has never contacted them about risky posts, but community members have.

    "I can recall at least two different incidents that I shared with the shift commander to have an officer check on the individuals’ welfare," said Cedar Rapids public safety spokesman Greg Buelow in an email. "One was a message that included a video of a female that appeared to take a bunch of pills and then the video ends. Another was that a male was going to take his life and officers were sent to check on his welfare, locating the subject’s significant other. I know that dispatch has received calls as well."

    Buelow says the challenges of acting on a Facebook post include confirming whether the account holder actually wrote it, determining how old the post is, and pinpointing the person's location.

    If you are feeling suicidal, or you are concerned that a friend or someone you know is at risk of suicide, contact Foundation 2 Crisis Center at (319) 362-2174.

    You can also reach the National Suicide Prevention Lifeline at 1-800-273-8255.



