Elon Musk’s AI tool Grok will no longer be able to edit photos of real people to show them in revealing clothing in jurisdictions where it is illegal, after widespread concern over sexualised AI deepfakes.
"We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing," reads an announcement on X.
It also reiterated that only paid users will be able to edit images using Grok on its platform.
This will add an extra layer of protection by helping to ensure that those who try to abuse Grok to violate the law or X’s policies are held accountable, according to the statement.
A spokesperson for UK regulator Ofcom said it was a "welcome development" – but added its investigation into whether the platform had broken UK laws "remains ongoing".
X’s change was announced hours after California’s top prosecutor said the state was probing the spread of sexualised AI deepfakes, including of children, generated by the AI model.