
Malicious deepfakes - Ofcom calls for immediate action

The application of much of the UK's Online Safety Act (OSA) is still some way off, but with growing pressure to tackle fast-improving deepfakes, Ofcom published a discussion paper on Mitigating the Harms of Deceptive Deepfakes on 23 July 2024. 

The paper does not constitute guidance for regulated services, and Ofcom says it does not recommend or require specific actions, but it nevertheless concludes that “it is imperative…that all players in the technology supply chain take action today to curtail this creation and sharing of malicious content. This includes services regulated by the [OSA] as well as non-regulated services that lie outside of the regime”.

Ofcom identifies four broad categories of intervention:

  • Prevention - blocking the creation of harmful deepfakes, with model developers introducing safeguards, using prompt and output filters, and removing certain types of content from model training datasets.
  • Embedding - attaching information to content to indicate its origins (eg using watermarks, labels or metadata) as it is uploaded - this is done primarily by model developers and online platforms but potentially by others like cloud providers.
  • Detection - using tools to reveal the origins of content - primarily undertaken by online platforms using automated and human review.
  • Enforcement - online platforms setting and communicating clear rules about synthetic content and acting against those who breach the rules by taking down content and suspending or removing accounts.

Ofcom recognises that many services are already taking steps to mitigate the harm of malicious deepfakes, both in their terms of service and in their use of watermarking and other techniques. However, the final section of the discussion paper sets out Ofcom's expectations of what needs to be done.

Regulated services

Under the OSA, regulated user-to-user and search services have safety duties that will cover assessing and mitigating the risks of harm associated with deepfakes. Ofcom is consulting on its Codes of Practice for illegal harms and the protection of children, which include measures to help services tackle illegal and harmful deepfakes. While the final Codes of Practice and guidance have not been published, and the related duties have therefore not commenced, Ofcom says “there is no excuse for these services to sit idle in the meantime. We encourage them to act now - as some are already doing - to protect their users from encountering and being targeted by this content”.

Non-regulated services

Ofcom recognises that the OSA applies predominantly to ‘downstream’ services that interface with users, but says it will be “challenging to make inroads into tackling deepfakes without intervening upstream”. As a result, it encourages model developers and hosts to implement appropriate safeguards to help them identify deepfakes produced or facilitated using their products. It suggests a wide range of potential measures based around the four intervention themes.

Next steps for Ofcom

Ofcom's work on deepfakes will continue, not only with OSA Codes and guidance but also with a planned focus on the role of deepfakes in fraudulent advertising, the facilitation of CSAM, and online gender-based violence and abuse. It will also work to close regulatory gaps and look at technical solutions.

What does this mean for you?

Ofcom emphasises the role of the entire technology supply chain and recommends a multi-pronged approach to tackling deepfakes, while campaigners continue to advocate for new UK legislation to regulate Generative AI models. The Labour government said in the King's Speech that it would “seek to establish the most appropriate legislation to place requirements on those working to develop the most powerful AI models”, but it did not propose an AI Bill. As Ofcom suggests, any legislation that does emerge is likely to look at ‘upstream’ technologies so they, as well as those impacted by the OSA, should pay careful attention to Ofcom's discussion paper.

We'll be covering new and changing digital and data regulation at our event, Digital Forum, on Thursday 12 September. Click the banner below to visit the event page for more information and to register your place.


Tags

technology media & communications, copyright & media law, data protection & cyber, advertising & marketing, artificial intelligence & machine learning, ai, new digital products & data