If you need urgent support, call 999 or go to your nearest A&E. For Crisis Support (open 24/7) text THEMIX to 85258.

Community Updates 2026


Comments

  • Nemuritai Posts: 945 Part of The Mix Family
    I'm also unsure about how the age verification works exactly and whether I need something in particular, since I don't have any IDs or anything like that.
  • so_very_tired Posts: 701 Part of The Mix Family
    Does a disabled bus pass or a debit card work? If neither of those work I'm screwed. Also, sorry for being a dick earlier.

    @so_very_tired does your bus pass have the PASS logo on it, a picture of yourself and a date of birth? If so it would fall under PASS cards. Otherwise you may be able to verify using a selfie and not upload ID at all, which is what I plan to do.

    It has my face but not my date of birth.
  • independent_ Community Connector Posts: 10,113 An Original Mixlorian
    that probably wouldn't work then, no. But like I say, you might be able to do a selfie... but if that's not accurate you may have an issue. As I pointed out earlier in the thread, some people may not have the required ID or may be more comfy using ID than a selfie. I feel more comfortable with a selfie since my face is online anyway, whereas my actual ID isn't.
    “Sometimes the people around you won’t understand your journey. They don’t need to, it’s not for them.”
  • Nemuritai Posts: 945 Part of The Mix Family
    Question about the discussion boards. When you say chats will be checked before going online, will it be in a similar way to how Childline checks their chats? For people who have never used Childline: when you send a message, a counsellor checks it to make sure it is safe. This can take hours or even days, and I think it prevents users from making connections or friends. Will you be doing this, or doing it in a different way that won't sacrifice users making connections or friends?

    I hope you don't mind me hopping in here, but this is the main reason the Childline boards are so hard to use, and it makes it very difficult to make connections with people and have meaningful conversations. I do hope The Mix won't turn out the same way, as that would be a huge change and probably very inconvenient.
  • independent_ Community Connector Posts: 10,113 An Original Mixlorian
    I do worry about the meaningful conversations thing too. As heated as this thread got, it was still a conversation, and it would not be happening the way it is with pre-moderation. I found aging out of Childline OK because by the time I was aging out, I felt I had grown out of restrictions like that.
    “Sometimes the people around you won’t understand your journey. They don’t need to, it’s not for them.”
  • Dancer Community Connector Posts: 8,294 Legendary Poster
    Some aspects of Childline were good, such as the helpline, but their community was not always very efficient and didn't feel like a community.
    "There's a part of me I can't get back. A little girl grew up too fast. All it took was once. I'll never be the same." ~ Demi Lovato
    "The way that I have found the light in my life is through the expressive arts because I know that I will be accepted for the way I am." ~ Me
    "I'm going to get strong again and see you soon. " ~ Anonymous 
  • so_very_tired Posts: 701 Part of The Mix Family
    independent_ wrote: »
    that probably wouldn't work then, no. But like I say, you might be able to do a selfie... but if that's not accurate you may have an issue. As I pointed out earlier in the thread, some people may not have the required ID or may be more comfy using ID than a selfie. I feel more comfortable with a selfie since my face is online anyway, whereas my actual ID isn't.

    I'm not very comfortable with using a selfie, but if I still want to use The Mix (and I'm still considering whether or not I should leave), it looks like I have no choice.
  • Verity Community Manager Posts: 184 Helping Hand
    Hey guys, just a reminder that we will be coming back to you with answers. We encourage you all to keep discussing and putting questions together, and we will be getting back to you as soon as possible.
  • Nathan Community Connector Posts: 2,933 Boards Guru
    Is it possible in future to explore automated AI-filtered moderation for night-time discussions on the boards, a bit like automated flagging? That way, the boards can stay open during the night for those who just need a bit of company or a bit of a vent at those times, with any posts potentially in violation of the OSA filtered out automatically. It seems to me that such a system would allow the requirements of the OSA to be fulfilled while not impacting the community as severely as this measure currently does. It might just be worth looking at in future. I believe Side by Side even have a similar thing with their DMs, in which posts can be blocked if automated filters flag them, until a moderator sees them, whereas the posts that are fine are allowed to go up.
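    The hold-for-review flow Nathan describes could be sketched roughly as below. This is purely illustrative: the flag list, the Post model, and the status names are all invented here, and are not anything The Mix or Side by Side actually runs. A real system would use a trained classifier and a proper moderation queue rather than a keyword match.

```python
# Hypothetical sketch of automated flagging: posts that trip a filter are
# held for human review; everything else goes live immediately, so the
# boards can stay open overnight. All names and terms here are made up.
from dataclasses import dataclass

FLAG_TERMS = {"flagged-example-phrase"}  # placeholder, not a real word list


@dataclass
class Post:
    author: str
    body: str
    status: str = "pending"


def screen(post: Post) -> Post:
    """Auto-approve clean posts; hold flagged ones until a moderator reviews them."""
    text = post.body.lower()
    if any(term in text for term in FLAG_TERMS):
        post.status = "held_for_review"  # invisible until a moderator approves it
    else:
        post.status = "live"  # visible to the community straight away
    return post


overnight_queue = [
    Post("a", "just need a bit of company tonight"),
    Post("b", "this one contains a flagged-example-phrase"),
]
results = [screen(p) for p in overnight_queue]
# results[0].status == "live"; results[1].status == "held_for_review"
```

    The point of the sketch is the shape of the flow rather than the filter itself: flagged posts are delayed, not silently deleted, and clean posts are never blocked, which is the key difference from a full overnight lockout.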
  • Amy22 Posts: 5,748 Part of The Furniture
    edited January 16
    I think that's more to do with the times tbh. It was a lovely little place back in 2015-16 when I first joined. It absolutely has its place, and shows how pre-moderation can work in the right environment and circumstances. Posts were approved very quickly back then though. Those days will always have a special place for me.

    Honestly there are times when I wish I'd found this place in 2015-16, because I think it would have helped a lot when I was doing my GCSEs and just stuff in general.
    Just a person who likes pop culture and films
  • independent_ Community Connector Posts: 10,113 An Original Mixlorian
    Amy22 wrote: »
    I think that's more to do with the times tbh. It was a lovely little place back in 2015-16 when I first joined. It absolutely has its place, and shows how pre-moderation can work in the right environment and circumstances. Posts were approved very quickly back then though. Those days will always have a special place for me.

    Honestly there are times when I wish I'd found this place in 2015-16, because I think it would have helped a lot when I was doing my GCSEs and just stuff in general.

    @Amy22 I was referring to Childline in the comment above, but yes, having found this around then too, it was definitely very helpful through exams and everything
    “Sometimes the people around you won’t understand your journey. They don’t need to, it’s not for them.”
  • Animalloverb Posts: 655 Incredible Poster
    My keyboard changed some of my message and I have only just noticed. I do apologise that it may not have made sense; it was a rushed job as I needed to get to a meeting. My bad.
  • Redemption Community Connector Posts: 6,388 Master Poster
    Oh. I didn’t think it could get any worse. I’m not giving my face. And locking posting at night means I just won’t be able to post. I wanted to leave the site gradually but I feel like I’m being forced out now. I’ll probably leave as soon as this stuff comes out 😕
    Thank you to the few people left who made this place safe and special, and I wish you all the best.

    @AnonymousToe I know it's tough and you're definitely valid. I don't agree with all this either, and of course many others don't. I'm going to miss you so much, and I'm sure many others will too. I remember in the hugs thread you gave nice comments, and you've supported me; I hope I've been helpful to you too. Even though I will miss you, I respect your decision to leave. There are also other spaces open, like Mind Side by Side, Kooth, and Qwell, but they might depend on your area. Thanks so much for your support over the years.
  • Sian321 Community Manager Posts: 3,398 Boards Guru
    Thank you for continuing to express your thoughts here, everyone. We're taking note of all responses. I'm really mindful of the emotional impact these updates are having for some, and really hope today you can find whatever you're needing right now, whether that's care and comfort from yourself, from those around you, or just some space to process and let yourself feel.

    We're here on Boards monitoring, and today we'll have our GC and Support Thread as spaces to connect with each other.
  • ellie2000 Posts: 4,873 The Mix Elder
    I was wondering if some signposting would be needed, like other forums or help if others need it. Sites and whatnot
    Crazy mad insane
  • Sian321 Community Manager Posts: 3,398 Boards Guru
    Hey @ellie2000 , thanks so much for asking. I can certainly share some other options if anyone's wanting some extra support these days following the announcement. The following spaces are here for you:

    Cool2Talk - https://peerchat.link/cool2talk
    For: Young people aged 12 and over | Cool2talk provide a safe space where young people can get their questions answered accurately and without judgement. We respond to all questions within 24 hours

    No Panic - https://peerchat.link/nopanic
    For: Adults | Online resource | Support | Offers an extensive range of resources online (some free, some with a fee) for people suffering from anxiety, panic attacks, OCD, phobias, and other related anxiety disorders.

    SANEline services - https://www.sane.org.uk/how-we-help/emotional-support/saneline-services
    SANEline is a national out-of-hours mental health helpline offering specialist emotional support, guidance and information to anyone affected by mental illness, including family, friends and carers. We are normally open every day of the year from 4pm to 10pm on 0300 304 7000, serving Great Britain and Northern Ireland.

    Crisis services:
    * If you need urgent help or have any concerns for your health or safety, the quickest way to get help is to call 999 or go to your nearest A&E. 
    *Crisis Messenger - Our crisis messenger text service provides free, 24/7 crisis support across the UK. If you’re aged 25 or under, and are experiencing any painful emotion or are in crisis, you can text THEMIX to 85258.
    * Papyrus - If you are having thoughts of suicide, you can contact HOPELINEUK for confidential support and practical advice. You can call them on 0800 068 4141 or text them on 07786209697.
    * Samaritans are reachable by phone and email 24/7. Whatever you're going through, you can call them any time, from any phone on 116 123.


    Please see here too our Self-Care prompts post, which walks through many options for self-care activities for a range of different feelings you might be experiencing: https://community.themix.org.uk/discussion/3607428/self-care-kit-the-mixs-ultimate-guide#latest.

  • Amy22 Posts: 5,748 Part of The Furniture
    Amy22 wrote: »
    I think that's more to do with the times tbh. It was a lovely little place back in 2015-16 when I first joined. It absolutely has its place, and shows how pre-moderation can work in the right environment and circumstances. Posts were approved very quickly back then though. Those days will always have a special place for me.

    Honestly there are times when I wish I'd found this place in 2015-16, because I think it would have helped a lot when I was doing my GCSEs and just stuff in general.

    @Amy22 I was referring to childline in the comment above, but yes having found this around then too it was definitely very helpful through exams and everything

    Ah right, no worries, I just realised that it was about Childline. I've never actually used it, to be fair.
    Just a person who likes pop culture and films
  • Dancer Community Connector Posts: 8,294 Legendary Poster
    Redemption wrote: »
    Oh. I didn’t think it could get any worse. I’m not giving my face. And locking posting at night means I just won’t be able to post. I wanted to leave the site gradually but I feel like I’m being forced out now. I’ll probably leave as soon as this stuff comes out 😕
    Thank you to the few people left who made this place safe and special, and I wish you all the best.

    I don't think it will be locked, but it sounds like you will have to wait until the morning for it to be approved (if it is).
    "There's a part of me I can't get back. A little girl grew up too fast. All it took was once. I'll never be the same." ~ Demi Lovato
    "The way that I have found the light in my life is through the expressive arts because I know that I will be accepted for the way I am." ~ Me
    "I'm going to get strong again and see you soon. " ~ Anonymous 
  • Amy22 Posts: 5,748 Part of The Furniture
    Redemption wrote: »
    Oh. I didn’t think it could get any worse. I’m not giving my face. And locking posting at night means I just won’t be able to post. I wanted to leave the site gradually but I feel like I’m being forced out now. I’ll probably leave as soon as this stuff comes out 😕
    Thank you to the few people left who made this place safe and special, and I wish you all the best.

    Honestly I'm sorry to hear this and I will definitely miss you on here. It is quite sad to see people leaving, especially because of the new changes, but I can understand and respect your decision as well. Again, we're all here for you if you need anything.
    Just a person who likes pop culture and films
  • ellie2000 Posts: 4,873 The Mix Elder
    Ppl can use any person's photo if they don't want to use their own
    Crazy mad insane
  • Nathan Community Connector Posts: 2,933 Boards Guru
    edited January 18
    I've moved this post over from my OSA thread, because I feel it relates a lot more to this specific community update than to the general OSA criticisms, and I want to put across why I personally think The Mix have done what they've done here. I've also added a bit on sections beyond section 10, which are also extremely relevant, perhaps even more so than section 10 in some parts. I also want to clarify: this isn't support of The Mix's actions at all, I think they are making a huge mistake, but this is what I believe their reasoning for it is.

    My prediction that the OSA would cause this to happen
    So, back in October, 3 or 4 months ago, I posted this in my Online Safety Act thread, predicting that the vagueness of the Online Safety Act would be an issue that would cause organisations to jump to extremes with censoring and moderating. This is because the cost of interpreting it wrong is too high to risk anything. I went as far as to make it two of the 5 key points where I recommended improvement: setting exact definitions, and not leaving anything up to interpretation like the act has done:
    Nathan wrote: »
    Another major flaw is vagueness. Laws should define terms precisely to prevent loopholes or overreach. The OSA does the opposite: it leaves key phrases vague and introduces terms like “harmful but not illegal” without clarity on what is even covered by that. The predictable result is that platforms, fearing fines of up to 10% of global revenue, will over-censor to avoid risking said fine.

    Why the Online Safety Act has forced the Mix to change
    I think The Mix's recently announced changes to the community in order to abide by the Online Safety Act (by their own words) are a very predictable consequence of this law. My reasoning that it's section 10 specifically is that section 10 actively states that organisations are responsible for proactively removing content and ensuring the platform can't be used to spread it, though this is one of many sections that call for actions such as these, and it is not limited to this. The issue is that the act doesn't define what "harmful but not illegal" means (especially in section 11), nor does it define what proportionate measures are, nor does it set timescales for removal, nor point out what acceptable levels of preventative action from platforms are. It also doesn't define high-risk content, which means it could very much extend to anything, including mental health. In other words, it's up to interpretation, and the safest option to ensure full compliance is to pre-moderate everything, to stop "harmful but not illegal" content being posted and to prevent any users from seeing it.

    That vagueness and lack of clear definitions is the critical flaw I spoke heavily about months ago, and as I predicted, organisations fearing a potentially several-million-quid fine will overcompensate and take the very actions currently being taken, to avoid Ofcom coming along and fining them on small technicalities, or because Ofcom interpret sections differently than organisations do. Ofcom may expect a certain amount of preventative measures and decide that by not filtering content before it's posted, where people can see the post, it's not enough. The act sets no time limits for removal of "harmful but not illegal" content either, and uses entirely vague terms and definitions, so it becomes a very simple question: should an organisation risk it, at a time when Ofcom are openly admitting to making up their own definitions for certain words to justify enforcement (see the Ofcom vs 4chan case, in Ofcom's own report), or should they play it safe and legally cover themselves against all possible interpretations, to prevent crippling, organisation-destroying fines?

    This was one of my biggest concerns about the OSA. It doesn't specifically order platforms by law to "pre-moderate", but it makes it so the only safe legal option is to go to the extreme and cover all bases with pre-moderation, so that no matter how someone at Ofcom interprets the vague requirements and what falls under them, the platform is still compliant regardless, even when those requirements are stretched to their most extreme definitions. This is why I think The Mix jumped automatically to the safest option, pre-moderation, which is sadly also the most community-disrupting, to cover themselves and to protect the organisation. Under the OSA as it currently stands, Ofcom could come along and say: somebody posted something in crisis during the night, and because it wasn't seen and removed until morning, and other users will have seen it, "we think that's harmful content; you have failed to comply with sections 10, 11, 12, 19, 21, 23, 37, and 47 of the Online Safety Act, and so you're fined 10% of your revenue or several million quid, whichever is higher". That's the reality of the Online Safety Act looming over The Mix, and this is the exact reactionary behaviour I predicted back in my October post about it. We already saw this on other platforms, where an MP's speeches were censored due to there not being clearly defined terms for "harmful but not illegal", and to avoid a fine, the platform took no chances. This is what I think is happening here. No chances are being taken, because the cost of getting it wrong is too high, and the definitions too vague.


    Specific Sections of the OSA
    I want to point out that I don't think it's just section 10, although that's the one I think played the biggest factor, and is also the section I'm most familiar with from discussions at uni with IT lecturers and masters students about it. I think it also falls under:
    * Section 11, in which providers of regulated services must take all reasonable steps to ensure that their services do not facilitate harm to individuals using their services;
    * Section 12, in which platforms must take steps to prevent children from encountering harmful content;
    * Section 19, in which platforms must have systems in place to prevent users from encountering harmful content, including through effective moderation systems;
    * Section 21, in which providers of regulated services must assess the risks to individuals and take appropriate steps to mitigate these risks;
    * Section 23, in which platforms must ensure that harmful content is identified and removed quickly after being posted;
    * Section 37, in which providers must have robust content moderation systems to ensure that harmful content is quickly identified and acted upon;
    * Section 47, in which providers of regulated services must have systems in place to ensure that harmful content is not readily accessible to users.

    So in other words, The Mix have to tackle the challenge of trying to meet vague legal requirements correctly, whilst also trying not to lose the majority of the community in the process. To summarise, they essentially have to do 5 things by law:
    1) Ensure that no content can cause "harm" to users, even though "harm" is not defined and could be taken to cover somebody posting something in crisis, or something that may be triggering. If The Mix's definition doesn't match Ofcom's and is too lax, then it's a massive fine.
    2) Ensure that any content considered "harmful" is not seen by any other users, which means it has to be stopped before it can be posted and seen, hence the pre-moderation option they're going for, or else, you guessed it, a massive fine.
    3) Ensure "harmful content" is quickly removed, with no acceptable timescale given. It could be 10 minutes, it could be 10 hours; it's up to how Ofcom interprets it, and if The Mix fall short of this unknown timescale, then it's a massive fine.
    4) Ensure that all possible risks are mitigated, though with no definition of what acceptable mitigations are. It could mean pre-moderation, it could be a penalty system for posting "harmful but not illegal" content; again, it's entirely up to interpretation, but if The Mix aren't seen to be doing enough by Ofcom's standards, then it's a massive fine.
    5) Ensure that content moderation is robust and fast-acting, again with no timescale attached, so it could be 10 minutes or 10 hours, and if it's not, you guessed it again, it's a massive fine.

    Why the Mix's platform right now is not sustainable due to the OSA, without change
    If The Mix were to keep things as they are, and somebody posted in crisis during the night and other users saw it, that's by law a potential violation of the OSA. If The Mix didn't remove the posts of someone in crisis quickly, with no timescale given (it could be 10 minutes, it could be one day, we simply don't know, it all depends on how Ofcom see it), then it's a potential breach of the OSA. If The Mix don't prevent it being posted in the first place, that is another potential violation of the OSA. If The Mix's interpretation of "harmful content" is different from how Ofcom see it and is too lax, because it's not clearly defined in the act, it's, you guessed it, a possible violation of the OSA. I think that if The Mix had any other alternative, they wouldn't be taking this extreme measure, but because of the poor quality of the Online Safety Act as a law and its total vagueness, there are no other legally safe options available to them other than to cover all bases and do what they've done. That is why I think The Mix specifically had to change the way it has, to abide by the OSA.

    This is how I think each change was made to comply with the OSA:
    1) Stopping posts going live during the night - to comply with the requirement to remove "harmful" content quickly. If mods aren't online during the night, they can't act as rapidly as the act requires.

    2) Pre-moderating posts before they go live - to comply with the requirement to prevent users seeing "harmful" content and to prevent the platform being used to spread it. Harmful posts can be stopped this way before they're seen by any users.

    3) ID checks - to verify users are within the correct age range, and to an extent to prevent users outside the UK from accessing services here. Not so much a massive requirement under the OSA, but I assume this could just about fall under section 21's risk assessment and mitigation, at a push.

    Why other big tech firms aren't using such extensive measures like the Mix
    Now, it was raised that TikTok don't have pre-moderation, and that's a really good point. However, the key difference is that most of these big tech firms A) have amazing legal teams that make it a massive fight, and B) are not UK-based, but international with US backing. I would point to the recent event of US senators proposing sanctions on the UK over the mere suggestion of an X ban, which the UK backed down on, and also to the Online Safety Act having failed to successfully enforce a single fine put on companies abroad, with some companies outright telling Ofcom no thank you and refusing the fine. The difference with The Mix is that it's entirely UK-based, and so they can be hit with the full extent of the law in a way that big tech simply aren't being hit. This is why it's more dangerous for The Mix not to go to extreme lengths, even whilst US and international tech firms are glossing over most of the OSA.

    Why I think the Mix doesn't have a choice in this, but still are flawed
    I do fundamentally believe that there are different mitigations that could be used instead of their current plan to comply with the OSA, like automated AI moderation during the night and for screening posts before they go live, instantly and without delay. That said, I do feel I need to point out that The Mix's actions aren't being done maliciously here, but simply because they need to protect themselves as an organisation, and this is simply the safest way to do it: by not taking any chances on vague, unclear definitions and requirements. Their hand is being forced by the Online Safety Act, but there is admittedly still room for improvement, different from these changes.

    Hopefully I don't get targeted again for this by people who think personal insults count as debate, but you never know. I'm happy for discussion and debate, but I'm not going to entertain personal attacks. I strongly disagree with the changes being made myself, and I think there are far better, less extreme alternatives, but I understand why this act makes The Mix jump to the most extreme option, regardless of what they actually want.

    That's my stance on it anyway. Again, this is my opinion as to why the Mix have made the changes that they have. And I apologise for the length of it.


    Here's the actual Online Safety Act itself if you want to verify any of this yourself: https://www.legislation.gov.uk/ukpga/2023/50