
Controlling deepfakes - Stephen Fry shouldn't panic

English national treasure Stephen Fry is known for many things, but to a large audience above all for his mellifluous readings of the Harry Potter books. Fry has complained that an artificial intelligence tool, trained on those same Harry Potter readings, accurately imitated his voice for a history documentary without his permission. Similarly, Tom Hanks has warned fans that his image and voice in a US dental ad are not actually him, nor authorised by him, but were created using AI.

Meanwhile, US actors remain on strike seeking, among other things, stronger reassurances about how AI may be used to manipulate their image in filmmaking. Do AI, and the increasing realism with which it imitates celebrities, really threaten their livelihoods, particularly where their images or voices are used without authorisation, such as in advertising?

Despite much hand-wringing that the law lacks remedies to prevent the imitation of famous voices and images, things are not so bleak under UK law.

Passing off

Although AI threatens to have such a profound impact on the world that what went before is sometimes forgotten, these are not actually new problems. Despite there being no formal UK right of publicity, celebrities do enjoy rights to protect commercial uses of their image such as of their likeness or voice.

The primary form of protection lies in passing off. If the celebrity has enough goodwill in their image, and this is used in a way that deceives consumers into believing that it is the celebrity and was done with their consent, it can be passing off. 

Passing off has been relied upon in this way by celebrities over the years in a number of cases:

  • The racing driver Eddie Irvine successfully sued a radio station, Talksport, for doctoring a photograph of him on a phone so he appeared to be holding a portable radio and listening to the station, implying an endorsement of it. 
  • Rihanna successfully sued the retailer Topshop for selling a range of garments bearing her image. 

Commercial imitation of voices is also nothing new. Before the advent of smartphones, there was a fad for customised mobile phone ringtones, many of which imitated celebrity voices. Letters threatening passing off claims usually resulted in the ringtone being withdrawn from sale. 

There is no reason why commercial use of an AI-generated likeness or voice of a celebrity should be any different. The scale of the potential problem now is perhaps greater, thanks to the ease of access to the necessary technology, but the legal remedies are there.

Advertising regulation

The UK's non-broadcast advertising rules, set out in the Advertising Standards Authority's CAP Code, should also give some comfort to celebrities. Marketers must be careful how they portray public figures and are 'urged' to obtain written permission before referring to a person with a public profile or implying any personal approval of the advertised product. It is a self-regulatory code rather than a statute, but the consequences of failing to comply with the CAP Code can be severe. The rules around broadcast advertising and public figures are stricter still.

Defamation and privacy

UK law also affords strong protection for one's reputation. One of Fry’s reported concerns about AI imitations is that they could be of his image reading anything, even illegal material. If so, whoever is responsible for the creation and publication of the deepfake may be at risk of claims for defamation and potentially invasion of privacy. 

Post-mortem rights

Less clear-cut is the position around the image of deceased celebrities, especially where it is questionable to what extent the celebrity would have accrued relevant rights in their lifetime. 

Attempts by Elvis Presley's estate to enforce 'Elvis Presley' as a UK trade mark on soap famously failed, but advertisers are still playing it safe by obtaining consents: witness the recent campaign featuring an AI likeness of Albert Einstein's image and voice advertising smart meters. 

An AI example is the revival of another familiar UK voice, that of the late James Alexander Gordon, who broadcast the Saturday football results throughout the 70s and 80s in a distinctive style whose intonation told listeners the result before the scores had even been read. With his family's permission, his voice is now being generated by AI to read out modern results. While asking their permission was surely the correct approach, it might not have been legally necessary.

The ability of AI to 'revive' dead celebrities and make them appear as young actors in their prime may hugely benefit the producer, but there is no requirement under UK law to obtain permission from, or compensate, the celebrity's heirs. The US, with its patchwork of state laws on post-mortem rights of publicity, goes some way to addressing this, but UK law currently does not. Perhaps this is where today's celebrities should be directing their outrage.
