- Anežka Karpjáková
Communication with colleagues and clients through phone calls or video calls is common practice for companies. While these technologies greatly facilitate the day-to-day functioning of businesses, they can also pose certain risks. One such threat, particularly due to the massive advancements in artificial intelligence development, is fraud using so-called deepfakes.
What is a deepfake?
A deepfake is false or manipulative digital content, whether visual or audio, generated using AI algorithms or advanced machine-learning techniques, and it has great potential to deceive. For instance, using AI technology, an attacker can create a fake audio recording or a video depicting a fictitious or real person (such as the CEO of a well-known company).
Due to the rapid development of artificial intelligence, deepfakes are becoming increasingly accessible to attackers. To create fake audio today, an attacker needs only a 20-second recording of the person they wish to impersonate. The results can be so convincing that in some cases they are nearly indistinguishable from reality.
Anyone can become a victim
The number of cyberattacks using deepfake technology is on the rise, and it is important to note that these frauds target not only ordinary users but also companies and government institutions. A fake video or audio recording can cost a company tens of millions, and on top of the financial loss, it may also suffer damage to its reputation or a breach of sensitive data.
One example of a deepfake fraud is the attack on a British energy company in 2019, where an attacker, using a fake audio recording, pretended to be the CEO of a German energy firm and convinced the CEO of the English company over the phone to make a bank transfer of £243,000 to a foreign account controlled by the attacker.
Such cases are not limited to other countries; attackers are also targeting Czech companies. The Czech company GymBeam, for instance, experienced this type of attack when an employee joined a video call with an attacker who impersonated the company's CEO using a deepfake video.
How can companies defend against deepfake frauds?
How can companies proactively prevent attacks that use deepfakes? Given that there is currently no reliable software that can definitively detect deepfakes, it is crucial to increase vigilance and raise awareness of this trend among employees as well as the company's top management, since these individuals are frequent targets.
According to a 2024 CEDMO Trends study, more than half of respondents reported that they do not know what a deepfake is or how to recognize one. A KPMG study from last year states that approximately 46% of companies have not taken any measures to protect against deepfakes.
How to recognize fake recordings?
The quality of deepfake content can be very high, and many experts say that in some cases, especially audio recordings, fraudulent deepfakes are practically impossible to identify. Even so, here are a few things to pay attention to in videos, photos, and audio recordings:
Unrealistic movements and expressions
Watch for inconsistencies in facial or body movements, such as unsynchronized sound with lip movements, lack of blinking, strange mouth movements, or inconsistent lighting and shadows.
Verification questions
During the call, use questions that only the other party should know the answers to (for example, personal experience, internal data, details on a shared topic).
Distorted sound
Voices in videos may sound strange, monotonous, or synthetic. There may be a discrepancy between the sound and lip movements.
Suspicious content
Videos may contain unusual or extreme statements or requests that are not consistent with the person the video allegedly portrays.
Verification tasks
Deepfakes struggle to respond in real time. Ask about what is currently in the daily news, what you are wearing, or what is visible behind you in the video-call frame.
Unnatural details
Pay attention to minor errors in the background, sudden changes in lighting, or unnatural emotions on the faces of the individuals.