Partnership. Expertise. Commitment.
Our industry experts provide insurance coverage, services and solutions tailored to meet your specific needs.
Advances in artificial intelligence (AI) and deep-learning technologies are making synthetic media, known as deepfakes, more convincing and harder to distinguish from genuine content. Deepfakes are images, text, audio or video that have been altered or generated, often by manipulating real media with AI and deep-learning techniques, to make it appear that people did or said something they never actually did or said.
Deepfake technology has been available since at least 2017. Since then the technical quality of deepfakes has improved rapidly, and the tools to create them have become easier to access. In 2023, popular generative AI platforms such as Midjourney 5.1 and OpenAI's DALL-E 2 emerged as widely available tools for threat actors conducting deepfake campaigns.1
As this technology has evolved, so too have the criminal tactics to exploit it. Threat actors are using it to create synthetic media that can be used in a variety of destructive ways, creating a new and frightening reality in the 2023 cyber threat landscape.
Key deepfake technologies include face swapping, lip syncing, voice cloning and generative AI models that produce synthetic images and text.
Various deepfake creation communities are online, connecting deepfake experts with those who want to create synthetic media. Here are some common deepfakes and how threat actors use them.
Deepfake pornography accounts for the vast majority of deepfake videos. Victims are typically women from a range of professions, and non-consensual deepfake pornography can circulate online indefinitely.
An individual or group with a particular political ideology could seek to disrupt an election by using deepfake video or audio to attack an opposing party. Political leaders around the globe have already been targeted, and the threat goes beyond elections: impersonations of political leaders and high-ranking military personnel could lead to geopolitical conflict.
For several years hackers have convinced victims to transfer funds to false accounts, typically by using emails impersonating CEOs and other business leaders.
We now have evidence that hackers have progressed to using synthetic audio to execute the same crime. Criminals could expand on this method by impersonating business leaders to manipulate stock prices, for example by having a fake CEO announce false information.
Individuals with grudges could attack others with deepfake technology in both personal and business environments. The outcomes of divorce proceedings, job applications and vendor bidding competitions could all be affected.
No single person, entity or technology solution can control the creation and distribution of digital content end to end. Its lifecycle is facilitated by a combination of people, hardware and software, and it lives in cyberspace, which is designed to share information, including deepfake videos and audio, easily and quickly. Once content is shared on the internet, it can be extremely difficult, if not impossible, to remove.
The Australian Government's eSafety Commissioner notes that while deepfake technology is advancing rapidly, some telltale signs can help identify fake photos and videos.2
These include blurring or visual distortions, unnatural blinking or lip movement, inconsistent lighting or skin tone, and audio that does not match mouth movements. If in doubt, question the context: ask yourself if it's what you'd expect that person to say or do, in that place, at that time.
The cyber insurance industry is evolving as new cyber threats surface. The most comprehensive policies pay for data breach crisis management, including lawyers, IT forensics investigators, credit monitoring services and public relations experts. They may also reimburse their clients for defending and settling lawsuits.
However, many policies require specific conditions to trigger coverage, and damage caused by impersonation in a deepfake video or audio may not be covered. In view of the latest deepfake threats, there are several potential losses to consider when negotiating insurance cover.
Read your cyber insurance policy carefully, explore other policies and consult your broker for advice on managing the deepfake threat. In addition to cyber insurance protection, Gallagher offers expertise, advice and resources for building business resilience to withstand cyber security incidents.
1Nelson, Jason. "FBI Warns of AI Deepfake Extortion Scams," Decrypt, 5 Jun 2023.
2"Deepfake Trends And Challenges — Position Statement," Australian Government eSafety Commissioner, 23 Jan 2022.
Gallagher provides insurance, risk management and benefits consulting services for clients in response to both known and unknown risk exposures. When providing analysis and recommendations regarding potential insurance coverage, potential claims and/or operational strategy in response to national emergencies (including health crises), we do so from an insurance and/or risk management perspective, and offer broad information about risk mitigation, loss control strategy and potential claim exposures. We have prepared this commentary and other news alerts for general information purposes only and the material is not intended to be, nor should it be interpreted as, legal or client-specific risk management advice. General insurance descriptions contained herein do not include complete insurance policy definitions, terms and/or conditions, and should not be relied on for coverage interpretation. The information may not include current governmental or insurance developments, is provided without knowledge of the individual recipient's industry or specific business or coverage circumstances, and in no way reflects or promises to provide insurance coverage outcomes that only insurance carriers control.
Gallagher publications may contain links to non-Gallagher websites that are created and controlled by other organisations. We claim no responsibility for the content of any linked website, or any link contained therein. The inclusion of any link does not imply endorsement by Gallagher, as we have no responsibility for information referenced in material owned and controlled by other parties. Gallagher strongly encourages you to review any separate terms of use and privacy policies governing use of these third party websites and resources.
Insurance brokerage and related services to be provided by Arthur J. Gallagher & Co (Aus) Limited (ABN 34 005 543 920). Australian Financial Services License (AFSL) No. 238312