Instagram tightens rules on self-injury images

Instagram late Thursday announced it is clamping down on images related to self-injury such as cutting.

The move came after British Health Secretary Matt Hancock met with social media companies about doing more to safeguard the mental health of teenagers using their platforms.

British teenager Molly Russell was found dead in her bedroom in 2017. The 14-year-old had apparently taken her own life, and her Instagram account reportedly revealed she followed accounts related to depression and suicide.

“It is encouraging to see that decisive steps are now being taken to try to protect children from disturbing content on Instagram,” said the girl’s father, Ian Russell.

“It is now time for other social media platforms to take action to recognize the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable people.”

Changes to Instagram’s self-harm content rules follow a comprehensive review involving experts and academics on youth, mental health, and suicide from around the world, according to chief executive Adam Mosseri.

– Downplaying self-harm –

“Over the past month, we have seen that we are not where we need to be on self-harm and suicide, and that we need to do more to keep the most vulnerable people who use Instagram safe,” Mosseri said in an online post.

“We will not allow any graphic images of self-harm, such as cutting, on Instagram – even if it would previously have been allowed as an admission.”

Instagram has never allowed posts that promote or encourage suicide or self-harm.

The Facebook-owned service is also removing non-graphic content related to people hurting themselves, such as images of healed scars, from its search, hashtag, explore, and recommendation features.

“We are not removing this type of content from Instagram entirely, as we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help,” Mosseri said.

Instagram also planned to ramp up efforts to get counseling or other resources to people who post or search for self-harm related content.

“During the comprehensive reviews, the experts, including the Centre for Mental Health and Save.org, reaffirmed that creating safe spaces for young people to talk about their experiences — including self-harm — online is essential,” Mosseri said.

“However, it was collectively advised that graphic images of self-harm — even when it is someone admitting their struggles — have the potential to unintentionally promote self-harm,” he continued, citing this as the reason for the ban.

Instagram’s aim is to eliminate graphic imagery related to self-injury or suicide and to significantly reduce the visibility of related content across the service’s features while remaining a supportive community, according to Mosseri.

On Thursday, Mosseri joined representatives from Facebook, Google, Snapchat, Twitter and other companies who met with Hancock to discuss handling of content related to self-injury or suicide.

“What really matters is when children are on these sites they are safe. The progress we made today is good, but there’s a lot more work to do,” Hancock said after the meeting.

“What all the companies that I met today committed to was that they want to solve this problem, and they want to work with us about it.”

AFP
