
Predicting the Risk of Suicide By Analyzing the Text of Clinical Notes

First-time accepted submitter J05H writes "Soldier and veteran suicide rates are rising due to a variety of factors, and have jumped sharply in recent years. Now, Bayesian search experts are using anonymized Veterans Administration clinical notes to predict suicide risk. A related effort by Chris Poulin, the Durkheim Project, uses opt-in social media data for similar purposes."

Comments Filter:
  • False positives (Score:4, Insightful)

    by dala1 ( 1842368 ) on Wednesday January 29, 2014 @10:04PM (#46106465)

    According to the study, this is 67% effective. But once it is applied to the general population you run into a problem, because the vast majority of people are not suicidal. In the US, about 122 in 100,000 people attempt suicide each year, and about one in ten attempts is successful. Even with a test that is 99% accurate, you would end up with somewhere around seven million false positives every year if you screened everyone.
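
    For concreteness, here's a rough back-of-the-envelope sketch of that base-rate effect in Python. The 122-per-100,000 attempt rate is the one quoted above; reading "99% accurate" as 99% sensitivity and 99% specificity is my assumption, and the absolute number of false positives depends on how many people you actually screen.

        def positive_predictive_value(base_rate, sensitivity, specificity):
            """Fraction of flagged people who are actually at risk (Bayes' rule)."""
            true_pos = sensitivity * base_rate
            false_pos = (1.0 - specificity) * (1.0 - base_rate)
            return true_pos / (true_pos + false_pos)

        # Base rate from above: roughly 122 attempts per 100,000 people per year.
        base_rate = 122 / 100_000

        # Hypothetical "99% accurate" screen, read as 99% sensitivity and 99% specificity.
        ppv = positive_predictive_value(base_rate, 0.99, 0.99)
        print(f"Share of flags that are real: {ppv:.1%}")  # roughly 11%

    Even under that generous reading, roughly nine out of ten flags would be false alarms, simply because so few of the people screened are actually at risk.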

  • Re: Swartz (Score:1, Insightful)

    by Anonymous Coward on Wednesday January 29, 2014 @10:56PM (#46106753)
    The ramblings of an AI trying to achieve sentience. They've got a long way to go if they want to rule the world!
  • Re:False positives (Score:4, Insightful)

    by dala1 ( 1842368 ) on Thursday January 30, 2014 @12:49AM (#46107275)

    Seven million extra doctors' visits are hardly inconsequential, especially considering that only about 1 in 175 would actually be suicidal. Consider the time and money spent, the extra doctors who have to be trained and hired (I'm assuming psychiatrists since a GP is hardly qualified to assess a potentially suicidal patient), and the days missed from work for false alarms. That's all before the psychological trauma and loss of trust from your doctor telling you out of the blue that they think you might be mentally ill when you're not.

    Besides, that's not how it would work, because once the tool is out there and used to profile everyone, someone who is suicidal will commit suicide before that 'extra session.' After that, it will be considered negligent not to 'do something' immediately once someone is flagged, and that something would likely be intrusive and expensive.

    Also, their accuracy rate is 67%, not 99%. I used 99% to demonstrate a best-case scenario. As it stands, they would flag around 83 million people while correctly flagging only around 200,000. Good luck with 99.75% false positives.
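
    Here's where numbers in that ballpark can come from. The 250 million screened adults is my assumption, and I'm again reading "67% accurate" as both 67% sensitivity and 67% specificity.

        # Rough illustration of the 67% scenario; population and accuracy reading are assumptions.
        population = 250_000_000
        attempt_rate = 122 / 100_000      # from the earlier comment
        sensitivity = 0.67
        specificity = 0.67

        at_risk = population * attempt_rate                          # ~305,000 people
        true_flags = sensitivity * at_risk                           # ~204,000
        false_flags = (1 - specificity) * (population - at_risk)     # ~82.4 million

        total_flags = true_flags + false_flags
        print(f"Total flagged:   {total_flags:,.0f}")
        print(f"False positives: {false_flags / total_flags:.2%} of all flags")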
