Online Safety Act receives Royal Assent putting rules to make the UK the safest place in the world to be online into law.

Can I get some opinions on this? Will it have any impact on community management?
The Online Safety Act has today (Thursday 26 October) received Royal Assent, heralding a new era of internet safety and choice by placing world-first legal duties on social media platforms.

The new laws take a zero-tolerance approach to protecting children from online harm, while empowering adults with more choices over what they see online. This follows rigorous scrutiny and extensive debate within both the House of Commons and the House of Lords.

The Act places legal responsibility on tech companies to prevent and rapidly remove illegal content, like terrorism and revenge pornography. They will also have to stop children seeing material that is harmful to them such as bullying, content promoting self-harm and eating disorders, and pornography.

If they fail to comply with the rules, they will face significant fines that could reach billions of pounds, and if they don’t take steps required by Ofcom to protect children, their bosses could even face prison.

Technology Secretary Michelle Donelan said:
Today will go down as an historic moment that ensures the online safety of British society not only now, but for decades to come.

I am immensely proud of the work that has gone into the Online Safety Act from its very inception to it becoming law today. The Bill protects free speech, empowers adults and will ensure that platforms remove illegal content.

At the heart of this Bill, however, is the protection of children. I would like to thank the campaigners, parliamentarians, survivors of abuse and charities that have worked tirelessly, not only to get this Act over the finishing line, but to ensure that it will make the UK the safest place to be online in the world.
The Act takes a zero-tolerance approach to protecting children by making sure the buck stops with social media platforms for content they host. It does this by making sure they:
  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content including pornographic content, content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, content depicting or encouraging serious violence or bullying content
  • enforce age limits and use age-checking measures on platforms where content harmful to children is published
  • ensure social media platforms are more transparent about the risks and dangers posed to children on their sites, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise
Home Secretary Suella Braverman said:
This landmark law sends a clear message to criminals – whether it’s on our streets, behind closed doors or in far flung corners of the internet, there will be no hiding place for their vile crimes.

The Online Safety Act’s strongest protections are for children. Social media companies will be held to account for the appalling scale of child sexual abuse occurring on their platforms and our children will be safer.

We are determined to combat the evil of child sexual exploitation wherever it is found, and this Act is a big step forward.
Lord Chancellor and Secretary of State for Justice, Alex Chalk said:
No-one should be afraid of what they or their children might see online so our reforms will make the internet a safer place for everyone.

Trolls who encourage serious self-harm, cyberflash or share intimate images without consent now face the very real prospect of time behind bars, helping protect women and girls who are disproportionately impacted by these cowardly crimes.
In addition to protecting children, the Act also empowers adults to have better control of what they see online. It provides 3 layers of protection for internet users which will:
  1. make sure illegal content is removed
  2. enforce the promises social media platforms make to users when they sign up, through terms and conditions
  3. offer users the option to filter out content, such as online abuse, that they do not want to see
If social media platforms do not comply with these rules, Ofcom could fine them up to £18 million or 10% of their global annual revenue, whichever is biggest – meaning fines handed down to the biggest platforms could reach billions of pounds.

The government also strengthened provisions to address violence against women and girls. Through the Act, it will be easier to convict someone who shares intimate images without consent and new laws will further criminalise the non-consensual sharing of intimate deepfakes.

The change in the law also now makes it easier to charge abusers who share intimate images and put more offenders behind bars. Criminals found guilty of this base offence will face up to 6 months in prison, but those who threaten to share such images, or share them with the intent to cause distress, alarm or humiliation, or to obtain sexual gratification, could face up to two years behind bars.

NSPCC Chief Executive, Sir Peter Wanless said:

Having an Online Safety Act on the statute book is a watershed moment and will mean that children up and down the UK are fundamentally safer in their everyday lives.

Thanks to the incredible campaigning of abuse survivors and young people and the dedicated hard work of Parliamentarians and Ministers, tech companies will be legally compelled to protect children from sexual abuse and avoidable harm.

The NSPCC will continue to ensure there is a rigorous focus on children by everyone involved in regulation. Companies should be acting now, because the ultimate penalties for failure will be eye watering fines and, crucially, criminal sanctions.
Dame Melanie Dawes, Ofcom Chief Executive, said:
These new laws give Ofcom the power to start making a difference in creating a safer life online for children and adults in the UK. We’ve already trained and hired expert teams with experience across the online sector, and today we’re setting out a clear timeline for holding tech firms to account.

Ofcom is not a censor, and our new powers are not about taking content down. Our job is to tackle the root causes of harm. We will set new standards online, making sure sites and apps are safer by design. Importantly, we’ll also take full account of people’s rights to privacy and freedom of expression.

We know a safer life online cannot be achieved overnight; but Ofcom is ready to meet the scale and urgency of the challenge.
In anticipation of the Bill coming into force, many social media companies have already started making changes. TikTok has implemented stronger age verification on their platforms, while Snapchat has started removing the accounts of underage users.

While the Bill has travelled through Parliament, the government has worked closely with Ofcom to ensure protections will be implemented as quickly as possible once the Act received Royal Assent.

From today, Ofcom will immediately begin work on tackling illegal content, with a consultation process launching on 9th November 2023. They will then take a phased approach to bringing the Online Safety Act into force, prioritising enforcing rules against the most harmful content as soon as possible.

The majority of the Act’s provisions will commence in two months’ time. However, the government has commenced key provisions early to establish Ofcom as the online safety regulator from today and allow them to begin key preparatory work such as consulting as quickly as possible to implement protections for the country.
Link: https://www.gov.uk/government/news/...afer-online-as-world-leading-bill-becomes-law
 
I have so many questions about this law. They did have to concede that the central plank of it was actually impossible for the targets they wanted, and watered it down to “if feasible”.

It’s also woefully naive about what happens online. So naive it isn’t funny.

I think small sites are *probably* fine, because you don’t want CSAM and similar material on your site, you never did, and if such things ever got posted you’d remove them as soon as possible. The question is whether that is good enough (and if you ask 10 lawyers you’ll get 11 different opinions), but realistically you’re probably fine because they’ll have bigger targets to go after. (Disclaimer: I am not a qualified lawyer and my opinion should not be considered legal advice. It is based primarily on observing what other broad enforcement regimes look like in the UK and how meaningfully they’re ever implemented.)

I continue to wonder if laws like this are pushed by the big companies to edge out smaller rivals.
 
The thing I took from it was the part about preventing children from accessing p0rn. There are a number of small forums which have NSFW areas. They're gonna need to tighten up their rules a bit.
Most likely yes. But as you experienced first hand, it’s easily avoided. Even p0rnhub’s first popup just asks you to acknowledge you are over 18, and that alone is enough to access the content.

They will likely need to step up their game and perhaps require ID verification tied to an approved, registered account (something like the gate sketched below). If anything, I think forums are not the main focus; p0rn is.
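For a forum, the strict reading of "age-checking measures" would mean gating NSFW boards behind accounts that have actually passed an age check rather than a self-declared checkbox. Here's a minimal sketch of that idea, assuming a hypothetical age_verified flag set by some external ID or age-assurance provider; none of these names come from any real forum software.

```python
# Hypothetical sketch only: User, Board and age_verified are invented for illustration,
# not taken from any real forum platform.
from dataclasses import dataclass
from typing import Optional


@dataclass
class User:
    username: str
    age_verified: bool = False  # set only after an external ID / age-assurance check


@dataclass
class Board:
    name: str
    nsfw: bool = False


def can_view(user: Optional[User], board: Board) -> bool:
    """Guests and unverified accounts are refused NSFW boards; everything else is open."""
    if not board.nsfw:
        return True
    return user is not None and user.age_verified


# Usage:
after_dark = Board("After Dark", nsfw=True)
print(can_view(None, after_dark))                            # False - guest
print(can_view(User("alice"), after_dark))                   # False - account not verified
print(can_view(User("bob", age_verified=True), after_dark))  # True
```

The hard part, obviously, is what is allowed to flip that flag; a tick-box popup clearly won't satisfy the Act, which is the point above.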
 
I would not begin to suggest this is anywhere near as benign as it seems, though.

There is a lot of nuance to this that is in no way obvious if you’re not from the UK and not familiar with exactly what’s going on.

“Won’t someone think of the children” is a very common tactic leveraged by several of our far-right and not-so-far-right institutions, and the fact this was pushed by Braverman says a lot. There are many examples of her apparent cruelty to other human beings on record, especially to asylum seekers.

More concerning, and more relevant, is her general stance on civil disobedience. Protesting is slowly but surely being made illegal under her watch. Aside from what that means in terms of a slide towards a dictatorship, it necessarily comes with a mandate to play at thought-police. After all, we can’t have unmutual citizens calling out the government, now, can we?

Whatever your stance towards p0rn is, and whatever your stance on children getting access to it (which, frankly, they will one way or another because horny teenagers are stupid, because horny), this is one of those measures that’s “for your safety” but is really more of a thin end of a larger wedge.

Protecting people from material they shouldn’t see is complicated. Who decides what people shouldn’t see? Who decides what is acceptable?

And that’s the real crux of it. “Everyone” agrees that CSAM is bad, so it’s banned. “Everyone” agrees that “children shouldn’t see p0rn” so that’s banned. But there’s other things in there you have to watch out for.

Some thought is given to making it harder to buy and sell drugs. “Everyone” would agree this is a worthy goal, in theory.

Some thought is also given to “harmful content”. Well, “everyone” would agree… wait, what are we calling harmful?

Hate speech? Inciting terrorism? Could the antics of Just Stop Oil count for these things? Today, no, but tomorrow?

There’s other things that are considered “potentially harmful” too, such as eating disorder content. Content encouraging self harm. These things “should” be easy to reason through but they’re really not. This legislation is terrifyingly wide ranging, and I can absolutely see it being wielded in bad faith. A group that talks about body positivity / body acceptance for “overweight” people? Suddenly you can argue this is potentially harmful because it encourages bad eating.

As I said I don’t think most people running a forum have anything to fear because I think most forums are simply too small to be worth anyone’s time investigating.

But don’t make the mistake of assuming any of this is in good faith or that it will be used as such by the authorities. This is overly broad legislation that in its original form would have seen WhatsApp leave the UK because it would have compelled them to allow the govt access to all conversations “but only when necessary” or something like that. Trouble is, once you make a master key, *you’ve made a master key* and it doesn’t matter if “only the good guys have it” because one way or another the bad guys will make their own.
 
I seriously don't know how big sites are going to implement this. Sure, they can take the same approach that Snapchat did and remove underage accounts, but it shouldn't be the government's job to dictate what can and cannot be said or shared on a platform; at the end of the day it should be up to parents to monitor their kids' online usage. I know a few of my cousins who refuse to let their children have social media accounts, and my sister has done the same thing. If I had kids of my own, I wouldn't allow them on social media either, and I would try my best to monitor their internet usage. I'm surprised the US isn't trying this, though it seems a lot of libraries here are under attack more so than the internet these days.
 
The big sites already have this covered to some degree.

X is already trialling pay-to-sign-up, which won't fix this but does move the liability; if a child uses a parent's card to sign up, that's hardly the platform owner's fault.

Meta relies heavily on advertising, so they'll already want to keep basically everything on that list off the platform because advertisers won't pay to appear next to it.

YouTube is leaning harder on AI to identify content that might be questionable, because AI moderation is cheaper than humans.
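For what it's worth, the cost saving mostly comes from triage: auto-action the confident cases and only put the uncertain ones in front of a human. A rough, hypothetical sketch of that shape; the classify function and the thresholds below are placeholders, not any real platform's pipeline.

```python
# Toy triage sketch: classify() and the thresholds are invented for illustration.
from typing import Literal

Decision = Literal["remove", "human_review", "allow"]


def classify(text: str) -> float:
    """Placeholder standing in for a real model: returns a made-up P(violates policy)."""
    risky_terms = ("buy drugs", "how to self-harm")
    return 0.95 if any(term in text.lower() for term in risky_terms) else 0.05


def triage(text: str, remove_above: float = 0.9, review_above: float = 0.5) -> Decision:
    score = classify(text)
    if score >= remove_above:
        return "remove"        # confident enough to auto-action, no human involved
    if score >= review_above:
        return "human_review"  # uncertain: queue for a person
    return "allow"             # the cheap path most content takes


print(triage("where can I buy drugs"))  # remove
print(triage("nice weather today"))     # allow
```

Whether the model handles the edge cases well is, of course, exactly where the "harmful content" argument earlier in the thread gets uncomfortable.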


This is not about safety; the government does not for one minute seriously believe this will improve safety (and if they do, they're more clueless and inept than even I gave them credit for). This is about the government thinking it can direct what people can see, which is a far different problem.

The US has a harder time implementing something like this because there are a lot of people who will double down (hard) on 'freedom of speech'. Even if it might be to defend speech I might personally find offensive, there are enough who claim it that it would be scrutinised *hard* on that line of attack. See the back and forth over Section 230 as an example. Though depending on the next election I could totally see it happening; the new Speaker of the House is sufficiently regressive (see his quotes about "18th Century values") that if he is any indication of the 2024 government, you can expect a full doubling-down on what some have been calling Christofascism, and this would *absolutely* be something they'd go after.
 
