
Instagram's new policy on self-harm images - share your thoughts

The Mix · The Mix HQ · Posts: 2,524 · Community Managers
edited February 2019 in Anything Goes

Trigger warning – this thread talks about self-harm/suicide so please only read if you feel able to. If there’s anything you’ve found upsetting in this thread, please do reach out for support with us https://www.themix.org.uk/get-support or you can phone Samaritans on 116 123.

 

Hey everyone,


Some of you may have been following this in the news and seen that Instagram have changed their policy to protect users by banning graphic self-harm images. This follows the recent death of a young girl who took her own life.


Instagram and Facebook have been running expert panel discussions, which The Mix have been part of, and they've written a blog about the new policy that we'd really like to hear your thoughts on. Some of the language used in the blog may be triggering, so please only read it if you feel able to.


I've copied below the main points from the blog (some of this has been edited from the original so it's in line with our guidelines; you can read the full blog post here).

1. First, these experts unanimously reaffirmed that Facebook should allow people to share admissions of self harm and suicidal thoughts, but should not allow people to share content promoting it. They stressed the importance of giving room for people to share the challenges they are going through, including admitting thoughts or actions of self harm. They said this content, though tragic and upsetting to some, often helps people connect with support and resources, helping in their recovery and saving lives.

2. But the experts also advised that some graphic images of self-harm can have the potential to unintentionally promote self-harm, even when they are shared in the context of admission or a path to recovery. As a result, based on their feedback, we will no longer allow graphic cutting images even in the context of admission, and we will begin enforcing this policy in the coming weeks.

3. We also discussed whether other kinds of content — like scars shared to tell the story of recovery or certain sad memes — might unintentionally promote self-harm. This is an area where there is incomplete and sometimes competing research and the experts suggested that we continue to monitor the latest findings. We’ll do so as we continue discussions around this topic.

4. Finally, the experts emphasized the importance of building products that facilitate supportive connections, finding more opportunities to offer help and resources, and importantly, avoiding shaming people who post about their self-harm thoughts or actions. We will continue to provide resources, including messages with links to helplines, and over the coming weeks we will explore additional steps or products we can provide within Instagram and Facebook.


What are your thoughts on this blog and what do you think about the decision they’ve made to ban graphic self-harm images?


----


If you're interested in getting involved in more of the work we're doing with Instagram, head over to this thread to give your feedback on the wording they're using to communicate with people who are trying to view or post content which contains images of self-harm.  


If you're interested in writing a blog post for our website about Instagram's new policy, send us a PM :) 


Aife & the team

We're @Mike, @Italia and @Ed - the staff team here at The Mix. We don't provide support via this account, but if you have any questions about the boards or need a hand finding your way around, feel free to drop us a message. Alternatively, you can head over to the Help Desk.

Comments

  • Shaunie · England 🏠 · Posts: 7,365 · The Mix Elder
    edited February 2019
    Sorry, I didn't read the spoiler part/blog as I'm feeling a little fragile atm, so I can only comment on the ban itself. I don't know much about the ban or haven't read much, so can't comment in depth.

    But I think it's a good thing to ban it. I used to go on Instagram and type in "anorexia" (idk if that's considered self-harm), and though it does give support links when you type in those words, I would most likely look anyway, and would see pictures of self-harm. Though it was not necessarily promoting it, for me, seeing pictures of it when hating myself gave me thoughts like 'I deserve to harm myself like that'.
    “If we could look into each other’s hearts and understand the unique challenges each of us faces, I think we would treat each other much more gently, with more love, patience, tolerance, and care” Marvin J. Ashton
  • independent_ · Boards Legend · Scotland · Posts: 1,239 · Fanatical Poster
    I think the ban is a good thing; these pictures can be really upsetting and triggering for people. I think people should definitely be signposted to helpful resources if their content is reported for this reason. I also think there should be a way for people who may accidentally come across a picture (there are always a few people who think they can get away with uploading things that are banned) to access phone numbers etc quickly.
  • tkdog · Posts: 73 · Miniposter
    While I don't really know too much about it, I consider the ban a good thing; these pictures can be quite harmful and upsetting to those who encounter them. Those who already have some issues can more easily fall prey to such behaviour. This stuff is way too graphic.
    I'm not sure about venting more generally, though. It can be good when done properly, but in unregulated situations I can see it just encouraging negativity bouncing between people, depending on the community, and reinforcing those feelings. That said, it can be good too in getting people in touch with resources; it just depends.
  • The Mix · The Mix HQ · Posts: 2,524 · Community Managers
    edited February 2019
    Thanks so much for sharing your thoughts so far everyone. It's really interesting hearing your views on this.

    Just bumping this thread in case anyone hasn't seen it yet. Feel free to continue sharing your thoughts below :) 

    - Aife
  • htwohig2412 · Posts: 113 · The Mix convert
    edited February 2019
    I think the banning of graphic images is really important to prevent these from acting as a trigger for other users.

    I do think there's a fine balance though. Social media is a great place to connect and access quick support from others, and there are a fair few closed groups that allow people to share coping strategies and experiences. Educating people about what could trigger others is key. Maybe there could be closer monitoring of public posts that anyone can see, since people who join a closed group will have an idea of the sort of discussions that will happen there, and will presumably be comfortable with them or they wouldn't have joined.

    Personally I don't think a warning message before viewing a post would work - quite often you wouldn't know if something would trigger/offend you until you actually see it. (Unless it's blatantly graphic or unnecessary, then it should be banned outright imo).

    Hope this helps, and cheers for the great work you're all doing to keep social media safe for us all! 😀
    "I know what I have to do now, I’ve got to keep breathing because tomorrow the sun will rise. Who knows what the tide could bring?"
    Cast Away


  • Lucy307 · UK · Posts: 699 · Incredible Poster
    Yeh I totally agree that this is a good thing. In my personal experience, I didn’t know about self harm until I saw a picture of it online. As well as being extremely triggering it can also glamorise self harm for younger kids which is so wrong.

    Also think if they’re going to spend extra effort in regulating then it would be great to also spend extra effort to signpost - perhaps they could send people those targeted ads signposting to support? I’ve noticed how targeted their advertising is so they could probably identify vulnerable people quite easily? 

    - Lucy
    Treat yourself as you would treat a good friend
  • Chloe · Posts: 7 · Newbie
    Hey guys,

    Chlöe here - I work for The Mix and have been part of the discussions with Facebook and Instagram. Just wanted to say thanks for all your thoughts and feedback - much appreciated. Good to hear that we're pretty much on the same page too... that it's a good move.

    There have been a couple of questions about whether this will be automated removal or human. It's human. They have about 50,000 content moderators and they'll be the ones implementing the new policy change.

    As for the thoughts around signposting - you're spot on that it could be even better. We've been talking about how that's done, and whilst it won't change much today, we're going to keep working with them in the coming months to make it happen.

    Thanks again for your thoughts so far - you're amazing <3. I'm going to be going to meet with them tomorrow, so if you've got any last minute thoughts do shout!

      :3
  • Maisy · The Mix convert · Cymru · Posts: 238 · Moderator
    I think the banning of self-harm images was a good call. I've never actually come across such images, but I know others have, and I was shocked to hear how easy they are to find. Since I've not self-harmed or been exposed to images of self-harm, it was really disturbing to learn how easy they are to come across, and I can't imagine what it feels like for those who struggle with self-harm and other issues.

    It makes sense that it would be human removal. When listening to the arguments for and against, I did wonder how it would ever be possible for software to accurately detect self-harm images compared to others. I hope that the human moderators have access to mental health support too! 

    I do think that social media can be a good platform to build a supportive community on, and it makes sense that people might want to talk about things when they get low, but it's important for it to remain supportive and not encouraging/promoting self-harm. 