Law Society of Scotland

Deepfakes, and how to avoid them

Technology can now permit the creation of convincing, but fake, video or audio evidence of key players in an action. How do you go about trying to detect these?
11th March 2020 | Douglas McGregor, Christy Foster

Evidence in court traditionally consists of paper documents and the oral evidence of witnesses. But with the rise of portable technology, almost everyone can now take a picture, shoot a video or record a voice clip. These contemporaneous records of events are increasingly being taken into court and used as key pieces of evidence. But is seeing really believing? Litigants and legal advisers need to be aware that things are not always as they seem.

What are deepfakes?

Deepfakes are fabricated videos or voice clips so realistic they can fool even the savviest viewer. They are created using AI trained on existing images or recordings of a person speaking. The end product can be highly convincing “evidence” of something that never actually happened.

This technology has been used to create viral videos such as the one that appears to show Barack Obama insulting Donald Trump. However, the increasing availability of the technology involved in creating deepfakes means a growing risk of them slipping into evidence.

Family lawyer Byron James has recently drawn attention to the use of deepfakes in litigation. In one case, a voice clip of a threatening message apparently left by his client was lodged with the court. Despite matching his client's accent, tone and use of language, the clip was ultimately proven to be a deepfake: the client had never left the message.

Another risk comes from the potential for ultra-realistic masks to fool witnesses – even from close range. A recent article on the use of masks draws attention to research indicating that witnesses are poor at spotting when a mask is being worn, whether in photographs or in real life. A further example involved a man arrested in the US after being identified in CCTV footage by his own mother. It turned out that the real culprit had been wearing an ultra-realistic mask, and the individual originally accused was not involved in the crime.

Spotting the fakes

Concerns about the rise in deepfakes have prompted AI firms to act, with many now working on “deepfake detectors” and online security systems offering protection against the technology.

Researchers from UC Berkeley and the University of Southern California have created a tool that can detect when videos have been synthetically generated, with at least a 92% success rate. It currently identifies deepfakes of political figures by tracking minute facial movements that are unique to each individual. Fortunately, deepfakes are not yet sophisticated enough to mimic real-life movements perfectly, which allows the tool to distinguish what is fake from what is real.

However, this detector relies on large volumes of existing footage to learn the individual's unique quirks, so it is of little use to individual litigants, who are unlikely to have hundreds of hours of footage of themselves available for analysis.

Additionally, experts anticipate that almost as soon as cybersecurity researchers find a way to detect the fakes, the creators will adapt to evade detection, quickly rendering each new detector obsolete.

This game of cat and mouse was first seen in 2018, when a detector was launched that successfully recognised deepfake videos by identifying a lack of blinking in the individuals featured. Shortly after the detector was announced, deepfake AI was updated to make individuals blink, rendering this method of detection largely defunct.

However, all is not lost; technology may not be needed to spot a deepfake. A recent study showed that people can already spot fakes 88% of the time. An increased awareness of the potential for deepfakes and how to spot them can only increase this number – and there are several clues to look out for.

  • Does the person say something strange? Do they use a turn of phrase you wouldn’t expect? In a recent French case, a fraudster using a hyper-realistic mask was only uncovered after a minor linguistic slip-up: the use of "vous" rather than "tu".
  • Do they promise something that never arrives? A UK executive was recently convinced by a deepfake of his CEO's voice telling him to send $250,000 to a fraudulent account. The fraud was only discovered when the executive realised that the reimbursement the CEO had promised never appeared.
  • Do the facts add up? In the above case the executive's suspicions were first raised when he realised that he had been called by an Austrian number. His CEO was based in Germany.
  • Do you have an original electronic copy of the image or clip? Being able to see the original metadata of a file could highlight if it has been altered. If a video or recording is lodged with the court, ask to see an electronic copy of the original.
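The last point can be checked in part without any specialist tools: a cryptographic hash of the lodged copy can be compared against the original to confirm the two files are byte-for-byte identical. Below is a minimal sketch using Python's standard library; the file names and contents are purely hypothetical stand-ins for an original recording and the copy lodged with the court.

```python
import hashlib
from pathlib import Path

def file_sha256(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical files: an "original" clip and a lodged copy
# that differs by a single byte near the end.
original = Path("original_clip.bin")
lodged = Path("lodged_clip.bin")
original.write_bytes(b"\x00\x01\x02" * 1000)
lodged.write_bytes(b"\x00\x01\x02" * 999 + b"\x00\x01\x03")

match = file_sha256(original) == file_sha256(lodged)
print("Files identical:", match)  # prints: Files identical: False
```

A matching hash only shows the copy is unaltered relative to the original supplied; it says nothing about whether that original was itself genuine, which is why metadata and provenance still matter.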

Fortunately, current instances of deepfakes in court actions are rare, but that doesn't mean the legal sector should be complacent. Ever-improving technology means this is an area that is only likely to develop further, and something that litigants and legal advisers will need to be alert to in appropriate cases.

The Authors

Douglas McGregor is an associate and practice development lawyer, and Christy Foster is a trainee, with Brodies LLP
