If you need urgent support, call 999 or go to your nearest A&E. To contact our Crisis Messenger (open 24/7) text THEMIX to 85258.

Instagram's new policy on self-harm images - share your thoughts

TheMix Posts: 3,040 Boards Guru
edited February 2019 in General Chat

Trigger warning – this thread talks about self-harm/suicide so please only read if you feel able to. If there’s anything you’ve found upsetting in this thread, please do reach out for support with us https://www.themix.org.uk/get-support or you can phone Samaritans on 116 123.

 

Hey everyone,


Some of you may have been following this in the news and seen that Instagram have changed their policy to protect users by banning graphic self-harm images. This comes in light of a young girl recently taking her own life.


Instagram and Facebook have been running expert panel discussions, which The Mix has been part of, and they've written a blog about this new policy. We'd really like to hear your thoughts on it. Some of the language used in the blog may be triggering, so please only read it if you feel able to.


I've copied the main points from the blog below (some of this has been edited from the main blog so it's in line with our guidelines; you can read the full blog post here).

1. First, these experts unanimously reaffirmed that Facebook should allow people to share admissions of self harm and suicidal thoughts, but should not allow people to share content promoting it. They stressed the importance of giving room for people to share the challenges they are going through, including admitting thoughts or actions of self harm. They said this content, though tragic and upsetting to some, often helps people connect with support and resources, helping in their recovery and saving lives.

2. But the experts also advised that some graphic images of self-harm can have the potential to unintentionally promote self-harm, even when they are shared in the context of admission or a path to recovery. As a result, based on their feedback, we will no longer allow graphic cutting images even in the context of admission and we will begin enforcing this policy in the coming weeks.

3. We also discussed whether other kinds of content — like scars shared to tell the story of recovery or certain sad memes — might unintentionally promote self-harm. This is an area where there is incomplete and sometimes competing research and the experts suggested that we continue to monitor the latest findings. We’ll do so as we continue discussions around this topic.

4. Finally, the experts emphasized the importance of building products that facilitate supportive connections, finding more opportunities to offer help and resources, and importantly, avoiding shaming people who post about their self-harm thoughts or actions. We will continue to provide resources, including messages with links to helplines, and over the coming weeks we will explore additional steps or products we can provide within Instagram and Facebook.


What are your thoughts on this blog and what do you think about the decision they’ve made to ban graphic self-harm images?


----


If you're interested in getting involved in more of the work we're doing with Instagram, head over to this thread to give your feedback on the wording they're using to communicate with people who are trying to view or post content which contains images of self-harm.  


If you're interested in writing a blog post for our website about Instagram's new policy, send us a PM :) 


Aife & the team

We're @Aife, @Ella, @Emma_, @Gemma, and @JustV - the staff team here at The Mix.

Our DMs are monitored Monday - Friday (10am - 6pm) with limited cover on weekends. We have a great team of moderators looking after the community outside of those hours to check in on any reported posts.

We're not able to provide support on this account. If you need support urgently for any reason, please call 999, go to your nearest A&E or contact Crisis Messenger by texting THEMIX to 85258.

Comments

  • Siena Posts: 15,463 Skive's The Limit
    edited February 2019
    sorry, I didn't read the spoiler part / blog atm as I'm feeling a lil fragile, so I can only comment on the ban on self-harm. I don't know much about the ban and haven't read much, so I can't comment in detail.

    but I think it's a good thing to ban it. I used to go on Instagram and type in "anorexia" (idk if that counts as self-harm), and though it does give support links when you type in those words, I would most likely look anyway and would see pictures of self-harm. Though it wasn't necessarily promoting it, for me, seeing pictures of it when hating myself gave me thoughts like 'I deserve to harm myself like that'.
    “And when they look at you, they won't see everything you've been through. They won't see the **** that turned to scars that began to fade with time. They won't see the heartbreaking things that shook up and changed your entire world. They won't know how many tears you cried or even what it was you were crying about. They won't see how strong you had to be because you had no other choice. What they will see though is how compassionate you are because you experienced pain. What they will see is how kind you are because you experienced how cruel the world is. What they will see is how good you are because you've seen how bad things or people can be. The difference between you and your experiences are who you choose to be, despite everything that could have turned you cold and unkind.You are the good the world needs and the best of us.” ~ Kirsten Corley
  • independent_ Community Champion Posts: 8,610 Legendary Poster
    I think the ban is a good thing; these pictures can be really upsetting and triggering for people. I think people should definitely be signposted to helpful resources if their content is reported for this reason. I also think there should be a way for people who may accidentally come across a picture (there are always a few people who think they can get away with uploading things that are banned) to access phone numbers etc. quickly.
    “Sometimes the people around you won’t understand your journey. They don’t need to, it’s not for them.”
  • tkdog Posts: 281 The Mix Regular
    While I don't really know too much about it, I consider it a good thing to ban; these pictures can be quite harmful and upsetting to those who encounter them. Those who already have some issues can more easily fall prey to such behaviour. This stuff is way too graphic.
    I'm not sure about venting in general, though. It can be good when done properly, but in unregulated situations I can see it just encouraging the bouncing of negativity between people, depending on the community, and also the reinforcing of those feelings generally. That said, it can be good too in getting people in touch with resources; it just depends.
  • TheMix Posts: 3,040 Boards Guru
    edited February 2019
    Thanks so much for sharing your thoughts so far everyone. It's really interesting hearing your views on this.

    Just bumping this thread in case anyone hasn't seen it yet. Feel free to continue sharing your thoughts below :) 

    - Aife
  • HarryT Community Manager Posts: 320 The Mix Regular
    edited February 2019
    I think the banning of graphic images is really important to prevent these from acting as a trigger for other users.

    I do think there's a fine balance though. Social media is a great place to connect and access quick support from others. There are a fair few closed groups that allow people to share coping strategies and experiences with others. Educating people about what could trigger others is key. Maybe there should be closer monitoring of public posts that anyone can see, since people who join a closed group will have an idea of the sort of discussions that happen there, and will presumably be comfortable with them or else they wouldn't have joined.

    Personally I don't think a warning message before viewing a post would work - quite often you wouldn't know if something would trigger/offend you until you actually see it. (Unless it's blatantly graphic or unnecessary, in which case it should be banned outright imo.)

    Hope this helps, and cheers for the great work you're all doing to keep social media safe for us all! 😀
    Hello amazing human (yes, that's YOU). I wish that you could see the amazing person who I see within you  ✨
  • Lucy307 Posts: 1,171 Wise Owl
    Yeah, I totally agree that this is a good thing. In my personal experience, I didn't know about self-harm until I saw a picture of it online. As well as being extremely triggering, it can also glamorise self-harm for younger kids, which is so wrong.

    Also think if they’re going to spend extra effort in regulating then it would be great to also spend extra effort to signpost - perhaps they could send people those targeted ads signposting to support? I’ve noticed how targeted their advertising is so they could probably identify vulnerable people quite easily? 

    - Lucy
    Treat yourself as you would treat a good friend
  • Chloe Deactivated Posts: 25 Boards Initiate
    Hey guys,

    Chlöe here - I work for The Mix and have been part of the discussions with Facebook and Instagram. Just wanted to say thanks for all your thoughts and feedback - much appreciated. Good to hear that we're pretty much on the same page too... that it's a good move.

    There have been a couple of questions about whether this will be automated removal or human. It's human. They have about 50,000 content moderators and they'll be the ones implementing the new policy change.

    As for the thoughts around signposting, you're spot on that it could be made even better. We've been talking about how that's done, and whilst it won't change much today, we're going to keep working with them in the coming months to make it happen.

    Thanks again for your thoughts so far - you're amazing <3. I'm going to meet with them tomorrow, so if you've got any last-minute thoughts do shout!

      :3
  • Maisy Moderator Posts: 617 Incredible Poster
    I think banning self-harm images was a good call. I've never actually come across them myself, however I know others have, and I was shocked to hear how easy they are to come across. Since I've not self-harmed or been exposed to images of self-harm, it was really disturbing to learn how easy these images are to find, and I can't imagine what it feels like for those who struggle with self-harm and other issues.

    It makes sense that it would be human removal. When listening to the arguments for and against, I did wonder how it would ever be possible for software to accurately distinguish self-harm images from other content. I hope that the human moderators have access to mental health support too!

    I do think that social media can be a good platform to build a supportive community on, and it makes sense that people might want to talk about things when they get low, but it's important for it to remain supportive and not encouraging/promoting self-harm. 
    FAQ | How to report a post | How to report spam
    I'm a community moderator. I'm here to help guide discussions and make sure Community Guidelines are followed. I can't send DMs, but you can message @TheMix or email community@themix.org.uk with questions or concerns.