In Screening for Suicide Risk, Facebook Takes On Tricky Public Health Role
Credits: NYTimes
A police officer on the late shift in an Ohio town recently received an unusual call from Facebook.
Earlier that day, a local woman wrote a Facebook post saying she was walking home and intended to kill herself when she got there, according to a police report on the case. Facebook called to warn the Police Department about the suicide threat.
The officer who took the call quickly located the woman, but she denied having suicidal thoughts, the police report said. Even so, the officer believed she might harm herself and told the woman that she must go to a hospital — either voluntarily or in police custody. He ultimately drove her to a hospital for a mental health work-up, an evaluation prompted by Facebook’s intervention. (The New York Times withheld some details of the case for privacy reasons.)
Police stations from Massachusetts to Mumbai have received similar alerts from Facebook over the last 18 months as part of what is most likely the world’s largest suicide threat screening and alert program. The social network ramped up the effort after several people live-streamed their suicides on Facebook Live in early 2017. It now uses both algorithms and user reports to flag possible suicide threats.
Facebook’s rise as a global arbiter of mental distress puts the social network in a tricky position at a time when it is under investigation for privacy lapses by regulators in the United States, Canada and the European Union — as well as facing heightened scrutiny for failing to respond quickly to election interference and ethnic hatred campaigns on its site. Even as Facebook’s chief executive, Mark Zuckerberg, has apologized for improper harvesting of user data, the company grappled last month with fresh revelations about special data-sharing deals with tech companies.
The anti-suicide campaign gives Facebook an opportunity to frame its work as a good news story. Suicide is the second-leading cause of death among people ages 15 to 29 worldwide, according to the World Health Organization. Some mental health experts and police officials said Facebook had aided officers in locating and stopping people who were clearly about to harm themselves.
Facebook runs computer algorithms that scan the posts, comments and videos of users in the United States and other countries for indications of immediate suicide risk. When a post is flagged, whether by the technology or by a concerned user, it is routed to human reviewers at the company, who are empowered to call local law enforcement.
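The workflow described above, automated scoring followed by human review and possible escalation, can be pictured as a simple two-stage pipeline. The sketch below is a minimal illustration under assumed names and thresholds (Post, score_risk, RISK_THRESHOLD and the rest are hypothetical), not Facebook's actual code, classifier or policy.

```python
# Hypothetical sketch of a two-stage flagging pipeline like the one the
# article describes: an automated model scores each post, anything above a
# threshold is queued for a human reviewer, and the reviewer decides whether
# to escalate to local responders. All names, thresholds and the scoring
# function are illustrative assumptions, not Facebook's system.
from dataclasses import dataclass

RISK_THRESHOLD = 0.8  # assumed cutoff for routing a post to human review

@dataclass
class Post:
    post_id: str
    text: str
    user_region: str

def score_risk(post: Post) -> float:
    """Placeholder for a trained classifier; here, a trivial keyword check."""
    keywords = ("kill myself", "end my life", "suicide")
    return 1.0 if any(k in post.text.lower() for k in keywords) else 0.0

def review_queue(posts):
    """Yield posts whose automated score crosses the threshold."""
    for post in posts:
        if score_risk(post) >= RISK_THRESHOLD:
            yield post

def human_review(post: Post) -> bool:
    """Stand-in for the human reviewer's judgment call."""
    print(f"Reviewer examining post {post.post_id} from {post.user_region}")
    return True  # reviewer concludes the threat looks imminent

def escalate(post: Post) -> None:
    """Stand-in for contacting local first responders."""
    print(f"Escalating post {post.post_id}: notify responders in {post.user_region}")

if __name__ == "__main__":
    incoming = [Post("p1", "Walking home now and I want to kill myself", "Ohio")]
    for flagged in review_queue(incoming):
        if human_review(flagged):
            escalate(flagged)
```

The key design point the article highlights is that the algorithm only surfaces candidates; a person makes the escalation decision, which is where the mental health and privacy debates described below come in.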
“In the last year, we’ve helped first responders quickly reach around 3,500 people globally who needed help,” Mr. Zuckerberg wrote in a November post about the efforts.
But other mental health experts said Facebook’s calls to the police could also cause harm — such as unintentionally precipitating suicide, compelling nonsuicidal people to undergo psychiatric evaluations, or prompting arrests or shootings.