16 February 2020

Social Media Suicide

By Matthew Tuzel

Young people in America are losing their lives to suicide at an alarming rate, and tech companies like Facebook may have unwittingly enabled this disturbing trend. Tech companies and the rest of the country clearly need to do more to curb the suicide epidemic, but important questions remain about where to start and how to integrate tech companies, healthcare providers, and the government agencies that regulate them. The U.S. military is an ideal place to begin that integration: it has an acute need for improved mental health care, and it is well positioned to spearhead initiatives that bring tech, healthcare, and government together.

America’s armed forces are in particular need of improved mental health care. Recently, three sailors assigned to a single U.S. Navy ship died by suicide. Last year, 541 service members died by suicide, including a disproportionate number of young service members. That said, comparing suicide rates between the military and the general population is problematic: taken at face value, the military’s rate is higher, but when adjusted for age, sex, and other factors, it is similar to the rate among the general population. What is notable, however, is that the military concentrates young people more than just about any other single organization in America, which means that America’s youth suicide problem is particularly concentrated in the military.


Increased suicide rates are not the only feature that sets America’s youth apart. Young people, including young service members, are also more likely than older adults to use social media. Social media could be a powerful diagnostic tool for military healthcare providers, but tech companies, healthcare providers, and the government need to take steps to close the loop between social media and healthcare providers.

Facebook and Google have already made efforts to curb suicide among their users. The National Suicide Hotline is the top Google search result for “how to commit suicide.” Facebook has been using artificial intelligence and machine learning to help identify those likely to die by suicide. A Facebook news release states, “anyone who flags a potential cry for help is shown support options, including resources for help and ways to connect with loved ones.” Additionally, Facebook will contact local authorities, but only in “serious cases.” Notably, “local authorities” are probably not healthcare providers.

Efforts to provide resources to those identified as at risk are important, but they don’t take the critical step of closing the loop with healthcare providers. Failing to close that loop effectively prevents treatment, because healthcare providers don’t treat conditions they don’t know exist; your doctor won’t treat you for cancer unless she knows you have cancer. Diagnostic tools for cancer and other diseases are relatively abundant, but diagnostic tools for mental health are not, and many suicides occur without obvious warning signs. Warning signs that are not obvious to people, however, may be obvious to machines. Both the mechanics and the ethics of providing likely victims’ personal information to healthcare providers are problematic, but not insurmountable.

Some of the mechanics of closing the loop between tech giants and healthcare providers are dictated by legal requirements. To comply with the Health Insurance Portability and Accountability Act (HIPAA), tech companies would need to ensure data privacy, meaning that any communication with healthcare providers would need to be encrypted. Fortunately, both the tech companies and the military already make extensive use of encrypted communication, but work would need to be done to link the two systems.
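To make that linkage concrete, here is a minimal sketch, in Python, of what encrypting a risk alert in transit might look like. The payload fields, the pseudonymous ID, and the use of the cryptography library’s Fernet cipher are illustrative assumptions rather than a description of any existing Facebook, Google, or military system; genuine HIPAA compliance would also require key management, access controls, audit logging, and formal agreements between the parties.

    import json
    from cryptography.fernet import Fernet

    # Illustrative only: a real deployment would use keys managed under a
    # formal agreement and a key management service, not a key generated inline.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Hypothetical alert payload; the field names are assumptions for this sketch.
    alert = {
        "opt_in_id": "member-12345",          # pseudonymous ID chosen at opt-in
        "risk_level": "high",                 # output of the platform's screening model
        "timestamp": "2020-02-16T12:00:00Z",
    }

    # Encrypt the alert before it leaves the platform's systems.
    token = cipher.encrypt(json.dumps(alert).encode("utf-8"))

    # The receiving healthcare system decrypts with the shared key.
    received = json.loads(cipher.decrypt(token).decode("utf-8"))
    print(received["opt_in_id"], received["risk_level"])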

Linking two communications systems is not enough, though. Ultimately, information that someone is likely to die by suicide would need to flow from Facebook or Google to a mental health provider, but how would the tech giants know where to send it? The military is well equipped to make that connection because, in many cases, it employs both the patient and the provider. Employing both parties gives the military the contact information it needs and the means to connect them, but that’s not the only reason to start with the military.
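As a rough illustration of that routing problem, the sketch below maps an opted-in member’s pseudonymous ID to the secure contact point of an assigned provider. Every name and field here is hypothetical; the point is simply that an organization employing both the patient and the provider can maintain such a directory, whereas a tech company on its own cannot.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Provider:
        name: str
        clinic: str
        secure_inbox: str  # endpoint for receiving encrypted alerts

    # Hypothetical opt-in directory, maintained inside the military health
    # system: pseudonymous member ID -> assigned mental health provider.
    directory = {
        "member-12345": Provider("Dr. Smith", "Example Naval Hospital", "smith@example.mil"),
    }

    def route_alert(opt_in_id: str) -> Optional[str]:
        """Return the secure inbox for the member's provider, if they opted in."""
        provider = directory.get(opt_in_id)
        return provider.secure_inbox if provider else None

    print(route_alert("member-12345"))  # smith@example.mil
    print(route_alert("member-99999"))  # None (never opted in)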

The armed forces are well positioned to spearhead the integration of tech and healthcare for four reasons. First, as a government organization, the military is well positioned to navigate the legal requirements; lawmakers could narrowly adjust those requirements to ensure success. Second, tech companies should not be expected to shoulder the entire financial burden of standing up a program like this. There are many stakeholders in reducing suicide: individuals, healthcare providers, insurance companies, and the government. Within the military, all of those stakeholders except the tech companies themselves are part of a single organization, allowing a more focused approach. Third, the military has proven to be an effective leader of positive social change (e.g., racial integration). Finally, Facebook and Google are already trying to move into healthcare because of the financial incentives, but that worries many people with privacy concerns. The military would help the government drive the implementation, ensure security, and lead on privacy matters; having the military lead gives Uncle Sam more skin in the game to ensure proper implementation.

Ethically, there are legitimate concerns about tech companies sending mental health information to providers. For this reason, and for practical ones, this would need to be an opt-in system, and both the tech companies and the military could urge members to opt in. Facebook and Google already use personal data to target ads; in this case, the tech giants could target ads for the service to individuals they believe are likely to be in the military.

It’s time to close the loop between one of society’s most potent diagnostic tools and healthcare providers, and the best place to start is the U.S. military. The military has an acute need for improved suicide prevention, and it is also well positioned to spearhead the needed changes. The costs are too high, and the tools too readily available, to ignore this potential solution any longer.
