Instagram's new policy on self-harm images - share your thoughts
Trigger warning – this thread talks about self-harm/suicide so please only read if you feel able to. If there’s anything you’ve found upsetting in this thread, please do reach out for support with us https://www.themix.org.uk/get-support or you can phone Samaritans on 116 123.
Hey everyone,
Some of you may have been following this in the news and seen that Instagram have changed their policy to protect users by banning graphic self-harm images. This is in light of a young girl taking her life recently.
Instagram and Facebook have been running expert panel discussions which The Mix have been part of and they’ve written a blog about this new policy which we’d really like to hear your thoughts on. Some of the language used in this blog may be triggering so please only read if you feel able to.
2. But the experts also advised that some graphic images of self-harm can have the potential to unintentionally promote self-harm, even when they are shared in the context of admission or a path to recovery. As a result, based on their feedback, we will no longer allow graphic cutting images even in the context of admission, and we will begin enforcing this policy in the coming weeks.
3. We also discussed whether other kinds of content — like scars shared to tell the story of recovery or certain sad memes — might unintentionally promote self-harm. This is an area where there is incomplete and sometimes competing research and the experts suggested that we continue to monitor the latest findings. We’ll do so as we continue discussions around this topic.
4. Finally, the experts emphasized the importance of building products that facilitate supportive connections, finding more opportunities to offer help and resources, and importantly, avoiding shaming people who post about their self-harm thoughts or actions. We will continue to provide resources, including messages with links to helplines, and over the coming weeks we will explore additional steps or products we can provide within Instagram and Facebook.
What are your thoughts on this blog and what do you think about the decision they’ve made to ban graphic self-harm images?
----
If you're interested in getting involved in more of the work we're doing with Instagram, head over to this thread to give your feedback on the wording they're using to communicate with people who are trying to view or post content which contains images of self-harm.
If you're interested in writing a blog post for our website about Instagram's new policy, send us a PM.
Aife & the team
Comments
But I think it's a good thing to ban it. I used to go on Instagram and type in "anorexia" (I don't know if that counts as self-harm), and though it does give support links when you type in those words, I would most likely look anyway, and would see pictures of self-harm. Though it wasn't necessarily promoting it, for me, seeing pictures of it when I was hating myself gave me thoughts like 'I deserve to harm myself like that'.
I'm not sure though about venting generally either. It can be good when done properly, but in unregulated situations I see that it can just encourage the bouncing of negativity between people, depending on the community, and also the reinforcing of those feelings generally. That said, it can be good too in getting people in touch with resources; it just depends.
Just bumping this thread in case anyone hasn't seen it yet. Feel free to continue sharing your thoughts below
- Aife
I do think there's a fine balance though. Social media is a great place to connect and access quick support from others. There's a fair few closed groups that allow people to share coping strategies and experiences with others. Educating people about what could trigger others is key. Maybe there should be closer monitoring of public posts that anyone can see, since people who join a closed group will have an idea of the sort of discussions that will happen there, and will presumably be comfortable with them, else they wouldn't have joined.
Personally I don't think a warning message before viewing a post would work - quite often you wouldn't know if something would trigger/offend you until you actually see it. (Unless it's blatantly graphic or unnecessary, then it should be banned outright imo).
Hope this helps, and cheers for the great work you're all doing to keep social media safe for us all! 😀
Also think if they’re going to spend extra effort in regulating then it would be great to also spend extra effort to signpost - perhaps they could send people those targeted ads signposting to support? I’ve noticed how targeted their advertising is so they could probably identify vulnerable people quite easily?
- Lucy
Chlöe here - I work for The Mix and have been part of the discussions with Facebook and Instagram. Just wanted to say thanks for all your thoughts and feedback - much appreciated. Good to hear that we're pretty much on the same page too... that it's a good move.
There have been a couple of questions about whether this will be automated removal or human. It's human. They have about 50,000 content moderators and they'll be the ones implementing the new policy change.
As for the thoughts around signposting, you're spot on with making it even better. We've been talking about how that's done, and whilst it won't change much today, we're going to keep working with them in the coming months to make it happen.
Thanks again for your thoughts so far - you're amazing. I'm going to be meeting with them tomorrow, so if you've got any last-minute thoughts, do shout!
It makes sense that it would be human removal. When listening to the arguments for and against, I did wonder how it would ever be possible for software to accurately detect self-harm images compared to others. I hope that the human moderators have access to mental health support too!
I do think that social media can be a good platform to build a supportive community on, and it makes sense that people might want to talk about things when they get low, but it's important for it to remain supportive and not encouraging/promoting self-harm.