Ofcom is trying to destroy our privacy at a slightly slower pace. This is still unacceptable.
It’d affect people abroad too, I suppose, in that some platforms operate in both the UK and abroad. Either they leave the UK, spin off a British subsidiary, or follow the most-intrusive monitoring requirements of any country in which they operate.
It’s 1000 pages apparently, and applies equally to Lemmy/Mastodon servers.
So unless you’re into reading lots of legal text, running a server in the UK just got a whole lot messier.
Have the draft guidelines actually been published? I can’t find a link to them anywhere.
This is the best summary I could come up with:
Social media platforms should fight online grooming by not suggesting children as “friends” by default, the communications watchdog says.
The warning is contained in Ofcom’s first guidance for tech platforms on complying with the Online Safety Act.
This first draft code of practice published by Ofcom in its role enforcing the Online Safety Act covers activity such as child sexual abuse material (CSAM), grooming and fraud.
These include requiring the largest platforms to change default settings so children aren’t added to suggested friends lists, ensure children’s location information cannot be revealed in their profile or posts, and prevent them receiving messages from people not in their contacts list.
The method is already widely used by social media and search engines, according to Professor Alan Woodward of Surrey University.
Asked if Ofcom had the resources it needed, Dame Melanie admitted it was a “really big job” but added “we’re absolutely up for the task.”