Facebook’s UK chief, Steve Hatch, has said that he is "deeply sorry" after material about depression and suicide was found on the Instagram account of a teenager who took her own life.
Ian Russell, the father of 14-year-old Molly, has claimed that Instagram is partly responsible for his daughter’s death after she viewed images that glorified self-harm and suicide.
Instagram, which is owned by Facebook, said in a statement that it "does not allow content that promotes or glorifies self-harm or suicide and will remove content of this kind".
In an interview, Hatch, Facebook’s regional director for Europe, told the BBC: "I’m deeply sorry for how this must have been such a devastating event for their family and everyone I’ve spoken to feels exactly the same."
Russell, a TV director, has called on social media companies to take more action over "harmful and disturbing content that is freely available to people online".
His intervention last night came a day after the UK government published a suicide prevention plan that includes a focus on how social media and artificial intelligence can identify people at risk of suicide.
In response to the case, ISBA, the trade body representing UK advertisers, said it is "very concerned" that Instagram is monetising self-harm content.
The organisation added: "Today, advertising in the news feed is targeted to the individual and there is no control over what else appears with it. Advertisers are therefore reliant on the strength of Facebook’s and Instagram’s content moderation policies and the effectiveness of their implementation.
"The self-moderation of content by individual companies continues to be a serious part of the problem."
However, Hatch also warned that monitoring harmful content on Instagram is a "really complicated issue" and that designing policies around images of self-harm is "an incredibly tricky area to get right".
"The experts tell us that when those images are posted by people that are clearly in a really difficult situation, often they can be posted because they’re seeking help or support, which can be very supportive and useful," Hatch said. "In those cases, those images are allowed on the platform and then we also follow up with the support that those individuals seek.
"What we don’t allow is sensationalising or glamourising. But we’re also constantly reviewing these policies to make sure we’re getting them right."
ISBA has called for an independent, industry-funded trade body that certifies content policies and audits transparency reporting.