Online Harms: Who is responsible, and what should happen to platforms that break the rules?
One of the main changes being discussed is that the Government will put in place a new “duty of care” to make companies take more responsibility for the safety of their users.
This means that if something happens online that is harmful or breaks the law, e.g. lots of people being bullied or lots of fake news being spread, social media companies and big technology firms could be fined.
So, in other words...
The regulator will act a bit like a sports referee. When you're playing sports, there are rules you need to stick to and consequences if you break them, and the ref is responsible for keeping an eye on things and making sure that everyone plays by the rules. In the same way, the regulator will do that for social media companies, making sure that they play by the rules so everyone is kept safe.
- What are your thoughts on a regulator? Do you think it will help?
- What should happen to platforms and companies which break the rules?
“People who wade into discomfort and vulnerability and tell the truth about their stories are the real badasses.”
- Brené Brown
Comments
Thanks for taking the time to think this through. I hear what you are saying. The next post is all about how we educate people to be responsible online.
So what you are saying is that people should be responsible rather than the platforms? This does make a lot of sense, but it could be more challenging to police, maybe? I really like what you have to say about individuals taking responsibility, though, just as we would expect people to do in public.
Thanks again! Super helpful