If you need urgent support, call 999 or go to your nearest A&E. For Crisis Support (open 24/7) text THEMIX to 85258.
Read the community guidelines before posting ✨
Want to share your experience of using our Community?
We're collecting Community Case Studies which could be used on our website, on social media, shared with our volunteers, or shared with third parties who may be interested to hear how online communities help people.
Click here to fill out our anonymous form
Comments
It has my face but not my date of birth.
I hope you don't mind me hopping in here, but this is the main reason the Childline boards are so hard to use: it makes it very difficult to make connections with people and have meaningful conversations. I do hope The Mix won't turn out the same way, as that would be a huge change and probably very inconvenient.
"The way that I have found the light in my life is through the expressive arts because I know that I will be accepted for the way I am." ~ Me
"I'm going to get strong again and see you soon. " ~ Anonymous
I'm not very comfortable with using a selfie, but it seems like if I still want to use The Mix (I'm still considering whether or not I should leave), I have no choice.
I just feel like too many rules and regulations may make things less friendly, and more corporate.
Honestly, there are times when I wish I'd found this place in 2015-16, because I think it would have helped a lot when I was doing my GCSEs and just with stuff in general.
@Amy22 I was referring to Childline in the comment above, but yes, having found this around then too, it was definitely very helpful through exams and everything.
I would also be interested to see The Mix's take on the law behind this - whether @Nathan is right and it's that section they're basing this decision on, or another one entirely.
Because if it's a section that is up for interpretation, it feels like a strange decision to leap to. If it's made clear in the law, that is one thing; but if it isn't, and it just requires safeguards to be put in place, why wouldn't reactive moderation be enough, as it is on other social media sites? (Think TikTok - your posts don't get approved before they appear there.)
This has created a hell of a lot of bad feeling and even sparked a personal row between some members, which I know has caused a significant level of upset. So I feel this really needs to be clarified: why is it necessary (and no, "it just is" is not an answer to this question), and why does it come above the feelings of the community?
What actually is your aim with this? If it's following the law, then I'd love to know which part of the law, and you'll be able to quote it. If it's community safety, arguably creating more loneliness jeopardises that - you may well end up with more people simply waiting for 9am, or worse, waiting for chat, to post in crisis.
Speaking of chat, I don't suppose this is being implemented there too (this is NOT a suggestion, by the way, and it would also go down like a lead balloon). Why is reactive moderation enough in chat but not on the boards? If the issue is that there aren't enough staff, then why aren't there more volunteers? Back when I first joined, there were volunteer mods online at all hours of the day. They couldn't PM, but you'd get a PM from staff the very next day if they'd had to take any action at all.
I think the main issue people are raising is the night-time posting. The difference between here and Childline is that Childline operates a 24-hour chat service alongside its community, so if someone was in need of support they could use that, in privacy, from the same device, without having to make a phone call. Yes, wait times could be exceptionally long at night, but they have a queue indicator telling people this. And they are not primarily a crisis service.

The Mix doesn't have its own standalone service like this, and before you say "but there's Crisis Messenger", the wait times for Shout at that time of night are astronomical - I have seen this from both sides, both as a texter and as a volunteer. And this is absolutely not made clear on either The Mix's website or the Shout website. Because it's meant to be a crisis service, people expect to get through quickly (and rightly so). The community is often people's only option to get something off their chest and have it acknowledged, or even just for a chat with somebody. Do any of you know of any helplines that are there... just for a chat with somebody? Because I certainly don't.
A particular time in my life springs to mind. It was very recently. Being able to use the venting threads at night to write about how my day went and how I was feeling was the highlight of my day, because I could be honest. Who knows how I would have felt had I not had that. Support and chat services for adults are very limited as it is, and even then, most are not open overnight.
Thank you to the few people left who made this place safe and special, and I wish you all the best.
@AnonymousToe I know it's tough, and your feelings are definitely valid. I don't agree with all this either, and of course many others don't. I'm going to miss you so much, and I'm sure many others will too. I remember the hugs thread - you gave nice comments and you've supported me, and I hope I've been helpful to you too. Even though I will miss you, I respect your decision to leave. Also, there are other spaces open, like Mind Side by Side, Kooth, and Qwell, though they might depend on your area. Thanks so much for your support over the years.
We're here on Boards monitoring, and today we'll have our GC and Support Thread as spaces to connect with each other.
But I have to admit the thought has crossed my mind too; before any of these changes it never had. That is the true extent of the impact this is having - you are likely going to lose long-term members because of this, and the community may well become a shell of its former self.
It is the members that make this community what it is, and without them, it isn't a community at all. I have to question whether the idea is to drive away people who will argue against the changes and adopt a new generation of members who blindly accept them.
Cool2Talk - https://peerchat.link/cool2talk
For: Young people aged 12 and over | Cool2talk provides a safe space where young people can get their questions answered accurately and without judgement. We respond to all questions within 24 hours.
No Panic - https://peerchat.link/nopanic
For: Adults | Online resource | Support | Offers an extensive range of resources online (some free, some with a fee) for people suffering from anxiety, panic attacks, OCD, phobias, and other related anxiety disorders.
SANEline services - https://www.sane.org.uk/how-we-help/emotional-support/saneline-services
SANEline is a national out-of-hours mental health helpline offering specialist emotional support, guidance and information to anyone affected by mental illness, including family, friends and carers. We are normally open every day of the year from 4pm to 10pm on 0300 304 7000, serving Great Britain and Northern Ireland.
Crisis services:
* If you need urgent help or have any concerns for your health or safety, the quickest way to get help is to call 999 or go to your nearest A&E.
* Crisis Messenger - Our crisis messenger text service provides free, 24/7 crisis support across the UK. If you're aged 25 or under, and are experiencing any painful emotion or are in crisis, you can text THEMIX to 85258.
* Papyrus - If you are having thoughts of suicide, you can contact HOPELINEUK for confidential support and practical advice. You can call them on 0800 068 4141 or text them on 07786209697.
* Samaritans are reachable by phone and email 24/7. Whatever you're going through, you can call them any time, from any phone on 116 123.
Please see here too our Self-Care prompts post, which walks through many options for self-care activities for a range of different feelings you might be experiencing: https://community.themix.org.uk/discussion/3607428/self-care-kit-the-mixs-ultimate-guide#latest.
Ah right, no worries - I just realised that it was for Childline. I've never actually used it, to be fair.
I don't think it will be locked, but it sounds like you will have to wait until the morning for it to be approved (if it is).
"The way that I have found the light in my life is through the expressive arts because I know that I will be accepted for the way I am." ~ Me
"I'm going to get strong again and see you soon. " ~ Anonymous
Honestly, I'm sorry to hear this and I will definitely miss you on here. It is quite sad to see people leaving, especially because of the new changes, but I can understand and respect your decision as well. Again, we're all here for you if you need anything.
Neither is a good situation, but one is marginally better than the other.
My prediction that the OSA would cause this to happen
So, back in October, 3 or 4 months ago, I posted this in my Online Safety Act thread, predicting that the vagueness of the Online Safety Act would be an issue that would cause organisations to jump to extremes with censoring and moderating, because the cost of interpreting it wrong is too high to risk anything. I went as far as to make it two of the 5 key points where I recommended improvement: setting exact definitions, and not leaving anything up to interpretation like the act has done:
Why the Online Safety Act has forced the Mix to change
I think the Mix's recently announced changes to the community in order to abide by the Online Safety Act (by their own words) are a very predictable consequence of this law. My reasoning for it being section 10 specifically is that section 10 actively states that organisations are responsible for removing content proactively and ensuring the platform can't be used to spread it, though this is only one of many sections that call for actions like these.

The issue is that the act doesn't define what "harmful but not illegal" means (especially in section 11), nor does it define what proportionate measures are, nor does it set timescales for removal, nor does it say what an acceptable level of preventative action from platforms looks like. It also doesn't define high-risk content, which means it could very much extend to anything, including mental health. In other words, it's up to interpretation, and the safest option to ensure full compliance is to pre-moderate everything, to stop "harmful but not illegal" content being posted and to prevent any users from seeing it.

That vagueness and lack of clear definitions is a critical flaw I spoke heavily about months ago, and as I predicted, organisations fearing a potentially several-million-quid fine will overcompensate and take the very actions currently being taken, to avoid Ofcom coming along and fining them on small technicalities, or because Ofcom interprets sections differently than organisations do. Ofcom may expect a certain amount of preventative measures and decide that not filtering content before it's posted, where people can see it, isn't enough. The act sets no time limits for removal of "harmful but not illegal" content either, and uses entirely vague terms and definitions, so it becomes a very simple question: should an organisation risk it, at a time when Ofcom are openly admitting to making up their own definitions for certain words to justify enforcement (see the Ofcom vs 4chan case, in Ofcom's own report), or should they play it safe and legally cover themselves from all possible interpretations, to prevent crippling, organisation-destroying fines?
This was one of my biggest concerns about the OSA. It doesn't specifically order platforms by law to "pre-moderate", but it makes it so that the only safe legal option is to go to the extreme and cover all bases with pre-moderation, so that no matter how someone at Ofcom interprets the vague requirements and what falls under them, the platform is still compliant, even when the act is stretched to its most extreme definitions. This is why I think the Mix jumped automatically to the safest option, pre-moderation - which is sadly also the most community-disrupting one - to cover themselves and protect the organisation.

Under the OSA as it currently stands, Ofcom could come along and say: somebody posted something in crisis during the night, and because it wasn't seen and removed until morning and other users will have seen it, "we think that's harmful content, you have failed to comply with sections 10, 11, 12, 19, 21, 23, 37, and 47 of the Online Safety Act, and so you're fined 10% of your revenue or several million quid, whichever is higher". That's the reality of the Online Safety Act looming over the Mix, and this is the exact reactionary behaviour I predicted back in my October post. We already saw this on other platforms, where an MP's speeches were censored because there are no clearly defined terms for "harmful but not illegal", and to avoid a fine the platform took no chances. This is what I think is happening here: no chances are being taken, because the cost of getting it wrong is too high and the definitions are too vague.
Specific Sections of the OSA
I want to point out that I don't just think it's section 10, although that's the one I think played the biggest factor and is also the section I'm most familiar with from discussions at uni with IT lecturers and masters students. I think it also falls under:
* Section 11, in which providers of regulated services must take all reasonable steps to ensure that their services do not facilitate harm to individuals using them
* Section 12, in which platforms must take steps to prevent children from encountering harmful content
* Section 19, in which platforms must have systems in place to prevent users from encountering harmful content, including through effective moderation systems
* Section 21, in which providers of regulated services must assess the risks to individuals and take appropriate steps to mitigate them
* Section 23, in which platforms must ensure that harmful content is identified and removed quickly after being posted
* Section 37, in which providers must have robust content moderation systems to ensure that harmful content is quickly identified and acted upon
* Section 47, in which providers of regulated services must have systems in place to ensure that harmful content is not readily accessible to users
So, in other words, the Mix has to tackle the challenge of trying to meet vague legal requirements correctly, whilst also trying not to lose the majority of the community in the process. To summarise, they essentially have to do 5 things by law:
1) Ensure that no content can cause "harm" to users, even though "harm" is not defined and could be read as somebody posting something in crisis, or something that may be triggering. If the Mix's definition doesn't match Ofcom's and is too lax, then it's a massive fine.
2) Ensure that any content considered "harmful" is not seen by any other users, which means it has to be stopped before it can be posted and seen - thus the pre-moderation option they're going for - or else, you guessed it, a massive fine.
3) Ensure "harmful content" is quickly removed, with no acceptable timescale given. It could be 10 minutes, it could be 10 hours; it's up to how Ofcom interprets it, and if the Mix falls short of this unknown timescale, then it's a massive fine.
4) Ensure that all possible risks are mitigated, though with no definition of what acceptable mitigations are. It could mean pre-moderation, it could mean a penalty system for posting "harmful but not illegal" content; again, it's entirely up to interpretation, but if the Mix isn't seen to be doing enough by Ofcom's standards, then it's a massive fine.
5) Ensure that content moderation is robust and fast-acting, again with no timescale attached, so it could be 10 minutes or 10 hours, and if it's not fast enough, you guessed it again, a massive fine.
Why the Mix's platform is not sustainable under the OSA without change
If the Mix were to keep things as they are, and somebody posted in crisis during the night and other users saw it, that's potentially a violation of the OSA. If the Mix didn't remove the posts of someone in crisis quickly (with no timescale given, it could be 10 minutes, it could be one day; we simply don't know, it all depends how Ofcom sees it), then it's a potential breach of the OSA. If the Mix don't prevent it being posted in the first place, that is another potential violation. If the Mix's interpretation of "harmful content" differs from Ofcom's and is too lax, because it's not clearly defined in the act, it's, you guessed it, a possible violation of the OSA. I think that if the Mix had any other alternative, they wouldn't be taking this extreme measure, but because of the poor quality of the Online Safety Act as a law and its total vagueness, there are no other legally safe options available to them other than to cover all bases and do what they've done. That is why I think the Mix specifically had to change the way it has to abide by the OSA.
This is how each change was done to comply with the OSA in my opinion:
1) Stopping posts going live during the night - to comply with the requirement to remove "harmful" content quickly. If mods aren't online during the night, they can't act as rapidly as the act requires.
2) Pre-moderating posts before they go live - to comply with the requirement to prevent users seeing "harmful" content and to prevent the platform being used to spread it. Harmful posts can be stopped this way before any users see them (I've sketched roughly how I imagine this works below the list).
3) ID checks - to verify users are within the correct age range, and also, to an extent, to prevent users outside the UK from accessing services here. Not so much a massive requirement under the OSA, but I assume this could just about fall under section 21's risk assessment and mitigation at a push.
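To make the mechanics concrete, here's a rough sketch of how I imagine points 1) and 2) fit together. To be clear, the hours, queue, and function names are entirely my own assumptions for illustration - I have no knowledge of the Mix's actual codebase:

```python
# Rough sketch of pre-moderation with a night-time hold.
# Hours, queue, and names are assumptions for illustration only.

from datetime import datetime, time

MOD_HOURS_START = time(9, 0)   # assumed: moderators online from 9am
MOD_HOURS_END = time(21, 0)    # assumed: moderators offline after 9pm

pending_queue: list[str] = []  # every post waits here for approval

def submit_post(body: str, now: datetime | None = None) -> str:
    """Pre-moderation: nothing goes live until a moderator approves it.
    Outside moderated hours, posts simply sit in the queue until morning."""
    now = now or datetime.now()
    pending_queue.append(body)
    if MOD_HOURS_START <= now.time() <= MOD_HOURS_END:
        return "queued: a moderator will review this shortly"
    return "queued: held until moderators are back online"
```

The point being: under pre-moderation the queue is the bottleneck, and overnight it simply doesn't drain.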
Why other big tech firms aren't using extensive measures like the Mix's
Now, it was raised that TikTok doesn't have pre-moderation, and that's a really good point. However, the key difference is that most of these big tech firms A) have formidable legal teams that make enforcement a massive fight and B) are not UK based, but international with US backing. I would point to the recent event of US senators proposing sanctions on the UK over the mere suggestion of an X ban, which the UK backed down on, and also to the fact that not a single Online Safety Act fine put on companies abroad has been successfully enforced, with some companies outright telling Ofcom "no thank you" and refusing to pay. The difference with the Mix is that it's entirely UK based, and so it can be hit with the full extent of the law in a way that big tech simply isn't. This is why it's more dangerous for the Mix not to go to extreme lengths, even whilst US and international tech firms are glossing over most of the OSA.
Why I think the Mix doesn't have a choice in this, but its approach is still flawed
I do fundamentally believe that there are different mitigations that could be used instead of their current plan to comply with the OSA, like automated AI moderation during the night and for screening posts before they go live, instantly and without delay. That said, I do feel I need to point out that the Mix's actions aren't being done maliciously here, but simply because they need to protect themselves as an organisation, and this is the safest way to do it: not taking any chances on vague, unclear definitions and requirements. Their hand is being forced by the Online Safety Act, but admittedly there is still room for an approach different from these changes.
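To show what I mean by that alternative, here's a minimal sketch of automated screening, assuming a hypothetical classifier. The risk_score function below is a toy stand-in (a real system would use a trained moderation model) and the threshold is invented, but it shows how low-risk posts could go live instantly at any hour while only borderline ones wait for a human:

```python
# Minimal sketch of automated pre-screening. The classifier is a toy
# stand-in and the threshold is invented - illustration only, not a
# description of any real system.

REVIEW_THRESHOLD = 0.5  # assumed cut-off; would need tuning on real data

def risk_score(text: str) -> float:
    """Toy stand-in for a real moderation model.
    Returns a score from 0.0 (benign) to 1.0 (high risk)."""
    flagged_terms = ("crisis", "self harm", "suicide")  # toy word list
    hits = sum(term in text.lower() for term in flagged_terms)
    return min(1.0, hits / len(flagged_terms))

def screen(body: str, review_queue: list[str]) -> bool:
    """Publish low-risk posts immediately, day or night;
    hold anything borderline for a human moderator instead."""
    if risk_score(body) < REVIEW_THRESHOLD:
        return True                # post goes live instantly
    review_queue.append(body)      # a human decides when mods are back
    return False
```

Whether Ofcom would accept an automated filter as a "proportionate measure" is, of course, exactly the kind of thing the act leaves undefined.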
Hopefully I don't get targeted again for this by people who think personal insults count as debate, but you never know. I'm happy for discussion and debate, but I'm not going to entertain personal attacks. I strongly disagree with the changes being made myself - I think there are far better, less extreme alternatives - but I understand why this act makes the Mix jump to the most extreme option, regardless of what they actually want.
That's my stance on it anyway. Again, this is my opinion as to why the Mix have made the changes that they have. And I apologise for the length of it.
Here's the actual Online Safety Act itself if you want to verify any of this for yourselves: https://www.legislation.gov.uk/ukpga/2023/50