14 October 2024
With the rapid advance of technology, it’s getting harder and harder to tell what’s real and what’s fake in today’s digital world. One alarming trend is the rise of “deepfakes”.
These AI-generated audio and video clips are now so realistic they have the potential to fool anyone. They're being used to spread fake news, impersonate others, and even carry out financial fraud.
The good news is that deepfakes are not infallible, and there are steps you can take to keep your guard up, as this article explains.
The misuse of deepfake technology has spanned everything from political deception to elaborate financial crimes.
Earlier this year, a Hong Kong employee was deceived into believing they were on a video call with their boss, leading to a transfer of HK$200 million (£20 million) to criminals¹. Another case in December 2023 saw scammers in Australia using deepfake videos of government officials to promote a fake investment scheme².
New threats are also surfacing, including malware that steals personal data like biometric facial profiles, which can then be used to create tailored deepfakes for identity theft and fraud³. It’s a grim reality we now face.
“The sophistication of deepfakes is growing, and with it, the threat to individuals everywhere,” cautions Tessa Bosschem, a Cyber and Information Security Specialist at Barclays Private Bank.
“These fake videos and audio clips can manipulate reality, but many people are still unaware of the danger. And as AI technology only gets better, we can expect to see even more realistic and convincing deepfakes in the future.”
When deepfakes target you directly, the risks can be significant, as impersonation may lead to financial loss and reputational harm.
But regardless of whether these deepfakes involve you, your friends, celebrities, unfamiliar individuals or even your financial adviser, the warning remains the same: you need to be vigilant against financial fraud, blackmail, identity theft and investment scams.
To protect yourself against these threats, it’s important to stay alert during both your online interactions and phone conversations.
“Follow these guidelines and think about reaching out to a security expert to assess your risks and protect your safety,” recommends Bosschem.
What should you do if you find yourself in a deepfake video? Realising you’ve been targeted can be extremely upsetting.
To protect yourself and minimise the risk of it happening again, be careful about what you share online. Limit the sharing of personal information, especially high-quality images and videos, to make it more difficult for scammers to create realistic deepfakes. Adjust your privacy settings and only connect with trusted individuals.
“Being a victim of a deepfake can shake your confidence and leave you feeling powerless,” adds Bosschem. “If you find yourself in this situation, make sure you document everything, report the content, and consider seeking legal advice. Recovery includes pursuing justice and finding the support you need.”
While deepfakes are alarming, it’s important to remember that they remain relatively rare. You can greatly minimise your risks by maintaining awareness and exercising caution.
“High-net-worth individuals are particularly susceptible to deepfake scams,” warns Bosschem. “However, you can safeguard yourself by being vigilant, informed and taking proactive measures.”
And if you’re ever in doubt about an email or call from your bank, don’t hesitate to delete it or hang up. You can always contact your bank back through its official channels, by phone or email, to confirm the authenticity of any message you’re unsure about. Caution should be the name of the game.
This communication is general in nature and provided for information/educational purposes only. It does not take into account any specific investment objectives, the financial situation or particular needs of any particular person. It is not intended for distribution, publication, or use in any jurisdiction where such distribution, publication, or use would be unlawful, nor is it aimed at any person or entity to whom it would be unlawful to provide access.
This communication has been prepared by Barclays Private Bank (Barclays) and references to Barclays include any entity within the Barclays group of companies.
This communication:
(i) is not research nor a product of the Barclays Research department. Any views expressed in these materials may differ from those of the Barclays Research department. All opinions and estimates are given as of the date of the materials and are subject to change. Barclays is not obliged to inform recipients of these materials of any change to such opinions or estimates;
(ii) is not an offer, an invitation or a recommendation to enter into any product or service and does not constitute a solicitation to buy or sell securities, investment advice or a personal recommendation;
(iii) is confidential and no part may be reproduced, distributed or transmitted without the prior written permission of Barclays; and
(iv) has not been reviewed or approved by any regulatory authority.
Any past or simulated past performance including back-testing, modelling or scenario analysis, or future projections contained in this communication is no indication as to future performance. No representation is made as to the accuracy of the assumptions made in this communication, or completeness of, any modelling, scenario analysis or back-testing. The value of any investment may also fluctuate as a result of market changes.
Where information in this communication has been obtained from third party sources, we believe those sources to be reliable but we do not guarantee the information’s accuracy and you should note that it may be incomplete or condensed.
Neither Barclays nor any of its directors, officers, employees, representatives or agents, accepts any liability whatsoever for any direct, indirect or consequential losses (in contract, tort or otherwise) arising from the use of this communication or its contents or reliance on the information contained herein, except to the extent this would be prohibited by law or regulation.
¹ The Guardian, ‘Company worker in Hong Kong pays out £20m in deepfake video call scam’, February 2024
² AFP, ‘Deepfake of Australian treasury, central bank officials used to promote investment scam’, December 2023
³ Bleeping Computer, ‘New ‘Gold Pickaxe’ Android, iOS malware steals your face for fraud’, February 2024