Facebook plans to expand its pattern recognition software to other countries after successful tests in the United States of a system designed to detect users with suicidal intent.
The new suicide prevention campaign by the social network giant makes sense as a basic palliative, considering the release of several studies linking time spent on Facebook with corrosive effects on one's mental health, sense of well-being and self-esteem.
The technical details of the program remain unclear, but the company noted that the software detects certain phrases that act as red flags, such as the questions “Are you ok?” and “Can I help?”
If the software detects a potential suicide, it alerts a team of Facebook workers who specialize in handling such reports. The system suggests resources to the user or to friends of the person such as a telephone helpline. Facebook workers may even call local authorities to intervene.
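Facebook has not published how its detection works, so the mechanism described above can only be illustrated speculatively. The toy sketch below assumes a simple fixed-phrase matcher over a post's comments; the phrases and the function name are hypothetical, drawn only from the red-flag examples the company cited, and a real system would almost certainly use a trained statistical model rather than a keyword list.

```python
import re

# Hypothetical red-flag phrases taken from the examples Facebook cited;
# this fixed list stands in for whatever model the company actually uses.
RED_FLAG_PATTERNS = [
    re.compile(r"\bare you ok(ay)?\b", re.IGNORECASE),
    re.compile(r"\bcan i help\b", re.IGNORECASE),
]

def flag_for_review(comments):
    """Return True if any comment matches a red-flag pattern,
    i.e. the post would be routed to a human review team."""
    return any(p.search(c) for c in comments for p in RED_FLAG_PATTERNS)
```

Under this sketch, `flag_for_review(["Are you OK? I'm worried about you"])` would escalate the post to human reviewers, while an unremarkable comment thread would not; the consequential decisions (contacting the user, friends, or local authorities) remain with the human team, matching the workflow the company describes.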
During the past month, first responders checked on people more than 100 times after Facebook software detected suicidal intent, boasted Guy Rosen, Facebook’s vice president for product management. The success of those tests has led the company to roll the software out beyond the United States.
The initiative, some argue, represents a form of “empathy-washing” – a term introduced by tech analyst Evgeny Morozov in an article last year for The Guardian about large corporations taking up humanitarian causes as a means to tout their “compassionate” pretenses.
“Empathy-washing initiatives create the false impression that the crisis is under control, with individual ingenuity, finally unlocked by privatized technologies, compensating for the rapidly deteriorating situation on the ground,” Morozov wrote.
“And even if some of them do temporarily relieve the effects of the crises – against their causes, privatized technological solutions are impotent – they also entrench the power of technology platforms as indispensable intermediaries” that are essential in managing a post-crisis landscape.
Last year, when Facebook launched live video broadcasting, videos proliferated of violent acts including suicides and murders, presenting a threat to the company’s image. In May, Facebook said it would hire 3,000 more people to monitor videos and other content.
In April, a report released by Harvard Business Review provided data supporting the argument that excessive use of Facebook leads to negative self-comparisons with other online users, often detracting from more meaningful experiences enjoyed in offline, real-life social networks.
“Exposure to the carefully curated images from others’ lives leads to negative self-comparison, and the sheer quantity of social media interaction may detract from more meaningful real-life experiences,” the report says.
“Overall, our results showed that, while real-world social networks were positively associated with overall well-being, the use of Facebook was negatively associated with overall well-being,” added researchers Holly Shakya with the University of California, San Diego, and Nicholas Christakis with the Human Nature Lab at Yale University.
“These results were particularly strong for mental health; most measures of Facebook use in one year predicted a decrease in mental health in a later year. We found consistently that both liking others’ content and clicking links significantly predicted a subsequent reduction in self-reported physical health, mental health, and life satisfaction.”
Facebook enjoys access to a vast amount of personal data extracted from its 2.1 billion users, which it claims to use mainly for targeted advertising. However, this is the first time the company has acknowledged that it systematically scans conversations for patterns of harmful behavior.
One exception is its efforts to spot suspicious conversations between children and adult sexual predators. Facebook sometimes contacts authorities when its automated screens detect salacious or inappropriate language.
But it may be more difficult for tech firms to justify scanning conversations in other situations, said Ryan Calo, a University of Washington law professor who writes about tech.
“Once you open the door, you might wonder what other kinds of things we would be looking for,” Calo said.
Rosen declined to say if Facebook was considering pattern recognition software in other areas, such as non-sex crimes.