Abstract

Visual social media platforms can function as both facilitators of and intervenors in concerning behaviors. This study focused on a leading mental-health-related cause of death worldwide, suicide, in the context of a dominant visual social media platform, YouTube. The study employed a content analysis method to identify the factors predicting viewer responses to suicide-themed content, examining the who, what, and how of suicide-themed videos. The results of a hierarchical multiple regression showed that the characteristics of the content provider and of the content's expression were more significant predictors of viewer engagement than were the characteristics of the message. These findings have implications not only for platform service providers but also for the diverse groups of individuals who participate in online discussions of suicide. YouTube has the potential to function as a locus for open discussion, education, collective coping, and even the diagnosis of suicidal ideation.
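The hierarchical multiple regression described above enters blocks of predictors step by step (provider, message, and expression characteristics) and compares how much explained variance each block adds. Below is a minimal Python sketch of that block-wise procedure using statsmodels; the variable names and synthetic data are hypothetical illustrations, not the paper's actual coding scheme or results.

```python
# Minimal sketch of a hierarchical (block-wise) multiple regression.
# All column names and values below are hypothetical, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    # Block 1: content-provider characteristics (the "who")
    "provider_individual": rng.integers(0, 2, n),
    "provider_religious":  rng.integers(0, 2, n),
    # Block 2: message characteristics (the "what")
    "mentions_method":     rng.integers(0, 2, n),
    "prevention_framing":  rng.integers(0, 2, n),
    # Block 3: expression characteristics (the "how")
    "uses_text_overlay":   rng.integers(0, 2, n),
    "first_person_story":  rng.integers(0, 2, n),
    # Engagement outcome (e.g., log-transformed comment count)
    "log_comments":        rng.normal(3, 1, n),
})

blocks = [
    ["provider_individual", "provider_religious"],   # who
    ["mentions_method", "prevention_framing"],       # what
    ["uses_text_overlay", "first_person_story"],     # how
]

# Enter blocks cumulatively and track the change in R-squared at each step.
y = df["log_comments"]
predictors, prev_r2 = [], 0.0
for step, block in enumerate(blocks, start=1):
    predictors += block
    X = sm.add_constant(df[predictors])
    fit = sm.OLS(y, X).fit()
    print(f"Step {step}: R2 = {fit.rsquared:.3f}, "
          f"delta R2 = {fit.rsquared - prev_r2:.3f}")
    prev_r2 = fit.rsquared
```

In a design like this, the increment in R-squared at each step indicates how much predictive power a block of characteristics adds beyond the blocks entered before it, which is how one would judge whether provider and expression characteristics outpredict message characteristics.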

Highlights

  • According to the latest data on suicide from the World Health Organization [1], nearly 800,000 people die by suicide every year, meaning one person dies roughly every 40 seconds

  • The results showed that the regression model with three levels had different explanatory powers according to the degree of engagement, each measured by the number of views, likes, and comments

  • The findings showed that videos uploaded by educational facilities and religious groups, and videos textually expressing suicide methods had relatively fewer comments


Summary

Introduction

According to the latest data on suicide from the World Health Organization [1], nearly 800,000 people die by suicide every year, meaning one person dies roughly every 40 seconds. There have been longstanding concerns over how social networking services manage content that may negatively affect the psychological well-being of their audiences, especially young users. This became an urgent issue following the death of a British girl, Molly Russell, whose father, Ian Russell, stated in an interview with the BBC that Instagram encouraged his daughter to commit suicide [2]. In 2017, Molly Russell, who was known to have been posting and searching for keywords related to suicide and self-harm, such as “cutting,” “biting,” and “burning,” ended her own life. The posts that she “liked” were identified as images that glorified suicide. This prompted discussion of the need for an advanced platform policy to prevent such incidents from happening again.

