A previous statement from X's Safety account said the social media platform had stopped Grok from digitally altering pictures of people to remove their clothing in "jurisdictions where such content is illegal". But campaigners and victims said the ability to generate sexually explicit pictures using the tool should "never have happened" in the first place, and Ofcom said its investigation was ongoing. The EU regulator said it may "impose interim measures" if X refuses to implement meaningful adjustments.
Technology Secretary Liz Kendall will use her speech to the Labour Party conference to order firms to detect and remove unsolicited explicit images being sent online. Firms that fail to comply could be fined up to 10% of their qualifying global revenue and potentially see their services blocked in the UK. Ms Kendall will tell activists in Liverpool that cyberflashing will be made a priority offence under the Online Safety Act, placing extra duties on firms to protect users from seeing unsolicited nude images or videos.
The viral video app said several hundred jobs in its trust and safety team could be affected in the UK, as well as in South and South East Asia, as part of a global reorganisation. The affected teams' work will be reallocated to other European offices and third-party providers, with some trust and safety jobs remaining in the UK, the company said. It is part of a wider move at TikTok to rely on artificial intelligence for moderation.