Instagram: New tools to target suicide and self-harm posts

Instagram has launched new technology to recognise self-harm and suicide content on its app in the UK and Europe.

The new tools can identify both images and words that break its rules on harmful posts.

The tools make such posts less visible in the app and, in most cases, remove them entirely and automatically.

Facebook, which owns Instagram, said it was an important step but that it wanted to do much more.

Human referral

The technology already exists on Facebook and Instagram outside Europe.

Posts that the algorithm identifies as harmful can be referred to human moderators, who decide whether to take further action, including directing the user to help organisations or informing emergency services.

Instagram told the UK’s Press Association news agency that human referral was not currently part of the new tools in the UK and Europe, because of data-privacy considerations linked to the General Data Protection Regulation (GDPR).

The social media firm said implementing a referral process would be its next step.

Tara Hopkins, Instagram’s public policy director in Europe, said: “At the moment in the EU, we can only use that mix of sophisticated technology and human review if a post is reported to us directly by a member of the community.”

She added that because, in a small number of cases, a judgement would be made by a human reviewer on whether to send additional resources to a user, this could be considered by regulators to be a “mental health assessment”, which counts as special category data and therefore receives greater protection under GDPR.

Facebook and Instagram have come under fire in recent years over a lack of regulation of suicide and self-harm material.

In September, social media companies including Instagram, Facebook, YouTube, Twitter, Google and Pinterest signed up to guidelines published by mental health charity Samaritans, in an attempt to set industry standards on the issue.

Samaritans programme manager Lydia Grace said: “While we have seen many positive steps in the right direction in recent months, we know there is more work to do to tackle harmful online content.”

She added that regulation was needed to ensure technology platforms take swift action to protect users from harmful content, using the tools at their disposal to do so, while ensuring vulnerable users can still access supportive content when they need it.

Instagram also said there should be a place where users could admit they had considered self-harm or suicide.

“It’s okay to admit that, and we want there to be a space on Instagram and Facebook for that admission,” Ms Hopkins said.