Instagram has announced global restrictions on images related to self-harm, such as self-inflicted cuts, amid political and media pressure on social media platforms to take greater responsibility for tackling suicide.
The picture-sharing platform will continue to treat "graphic" and "non-graphic" self-harm images differently, but is taking steps to make both less easily viewable.
Graphic images, which it previously blurred but allowed users to click to uncover, will now be removed from the site.
Non-graphic images, such as healed scars, will be allowed to stay, but will now be removed from search results, hashtag pages, the "explore" tab and the recommendation engine.
Instagram stressed: "We want to support people in their time of need – so we are also focused on getting more resources to people posting and searching for self-harm related content and directing them to organizations that can help."
Instagram chief executive Adam Mosseri is currently in the UK to talk to politicians, who are preparing to publish plans for a regulator that would hold technology companies accountable for protecting children from a specified list of online harms.
In an interview with The Daily Telegraph, Mosseri admitted that there was a place for a duty of care obligation on companies such as his – something the newspaper has been pushing for.
Public interest in the issue is high in the wake of Briton Ian Russell's accusation that Instagram played a part in the death of his 14-year-old daughter, Molly, who took her own life after viewing self-harm images on the platform.
Advertiser body ISBA has also said it wants independent regulation of social media after a BBC investigation in January found that ads from the likes of Marks & Spencer and the Post Office had appeared next to self-harm content without their knowledge.
Instagram’s previous policy was defended at the time by Facebook’s UK chief executive, Steve Hatch, in an interview with the BBC. As well as saying he was "deeply sorry" about Molly Russell, Hatch warned that monitoring harmful content on Instagram was a "really complicated issue" and that designing policies around images of self-harm was "an incredibly tricky area to get right".
"The experts tell us that when those images are posted by people that are clearly in a really difficult situation, often they can be posted because they’re seeking help or support, which can be very supportive and useful," he said. "In those cases, those images are allowed on the platform and then we also follow up with the support that those individuals seek.
"What we don’t allow is sensationalising or glamourising. But we’re also constantly reviewing these policies to make sure we’re getting them right."