A Real Case of Payment Fraud by Deepfake Audio

By Debra R. Richardson, MBA, CFE, CAPP

According to IBM, generative artificial intelligence (AI) refers to deep-learning models that can generate high-quality text, images, and other content based on the data they were trained on. AI tools appear to offer nearly unlimited ways to make most of what you do easier and more efficient, both at home and at work. You and many of your colleagues may be waiting for your companies to approve access to AI tools. Fraudsters are not waiting. AI tools are helping fraudsters make their requests appear legitimate, and it's proving successful in perpetrating payment fraud.

In this three-part fraud series, we look at three real cases of payment fraud using AI tools and identify how you can mitigate that fraud.

This first article looks at payment fraud using deepfake audio.

Victim Loss $35M: Is That Really Your Exec Calling? 

A Forbes article, entitled Fraudsters Cloned Company Director's Voice In $35 Million Heist, Police Find, reported that in 2020, a company involved in an acquisition fell victim to a $35M loss. Why? Because a financial employee received a phone call from his director confirming a series of transfers, with instructions following in multiple emails. The problem is that it was not the director on the phone. It was a fraudster using one of the many AI tools that can clone a voice.

Today, AI tools need only audio of a person's voice to reproduce it. Combined with text-to-speech capabilities or real-time voice cloning, the fraudster was able to carry on a conversation in the director's voice, all without arousing the suspicion of the financial employee, especially (and shockingly) since the employee knew the director's voice and was sure it was him.

How to Mitigate AI Voice Cloning to Avoid Payment Fraud

  • Human-Based Mitigations you can implement at the user level. Note that one or both may require management approval for the implementation and/or cost.
    • Initiate the confirmation call yourself. The case above was, in effect, a "reverse confirmation call". If a confirmation call is required, ensure you initiate the call to the phone number on file. By accepting a confirmation based on an inbound call, where the 'from' number can be spoofed, you have no way to verify where the call actually originated.
    • Authenticate inbound callers. When receiving calls related to vendor inquiries and payments, authenticate the requestor to verify the caller is who they say they are. The Masterfile Fraud Prevention Specialist Certification course includes a section on authentication that explains how to build an authentication reference: two to three identifying questions used to confirm identity.
  • Technology-Based Mitigations that may require IT/systems team collaboration to identify a suitable tool, as well as management approval for the implementation and/or cost.
    • Voice Biometric Authentication: This technology uses unique voice characteristics to verify the identity of the speaker. Voice biometrics can analyze hundreds of voice characteristics, such as pitch, tone, and accent, which are difficult to replicate accurately with deepfakes. A minimal sketch of the verification step appears after this list.
    • Advanced Behavioral Analysis: Systems that analyze transaction behavior in real time can detect anomalies that may indicate fraud, including the context of the conversation, the timing of the request, and any deviation from the normal pattern of interaction between the parties. AI and machine-learning models can be trained to detect subtle cues a human might miss, providing an additional layer of defense against sophisticated deepfake audio fraud. See the second sketch below.
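To make the voice-biometric control concrete, here is a minimal Python sketch of what the verification step might look like. It assumes a speaker-embedding model (not shown) that converts call audio into fixed-length vectors; the placeholder embeddings, the 256-dimension vectors, and the 0.80 threshold are illustrative assumptions, not any specific vendor's API.

```python
# Minimal sketch: verifying an inbound caller against a voiceprint on file.
# Assumes a speaker-embedding model (not shown) converts audio into fixed-
# length vectors; the embeddings and threshold below are illustrative only.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two speaker embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_caller(enrolled: np.ndarray, live_call: np.ndarray,
                  threshold: float = 0.80) -> tuple[bool, float]:
    """Accept the caller only if the live voiceprint matches the enrolled one."""
    score = cosine_similarity(enrolled, live_call)
    return score >= threshold, score

# Placeholder embeddings stand in for real encoder output.
rng = np.random.default_rng(seed=1)
enrolled_voiceprint = rng.random(256)   # director's voice, enrolled in advance
live_voiceprint = rng.random(256)       # embedding of the incoming call audio

accepted, score = verify_caller(enrolled_voiceprint, live_voiceprint)
print(f"match score {score:.2f} -> {'accept' if accepted else 'escalate to callback'}")
```

In practice the threshold is tuned against false-accept and false-reject rates, and a failed match should route the caller to the human-based callback control above rather than simply denying the request.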
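Likewise, here is a hedged sketch of the behavioral-analysis idea: an anomaly detector (scikit-learn's IsolationForest) is trained on historical payment requests and flags a request that deviates from the learned pattern. The feature set (amount, hour of day, days since the last request) and the sample values are assumptions for illustration.

```python
# Minimal sketch: flagging out-of-pattern payment requests as anomalies.
# The features and sample values are illustrative assumptions.

import numpy as np
from sklearn.ensemble import IsolationForest

# Historical requests from this vendor/director pairing:
# [amount_usd, hour_of_day, days_since_last_request]
history = np.array([
    [12_000, 10, 30],
    [11_500, 11, 29],
    [12_300,  9, 31],
    [11_900, 10, 30],
    [12_100, 10, 28],
])

model = IsolationForest(contamination=0.1, random_state=0).fit(history)

# A $35M transfer requested late in the day, out of cycle:
new_request = np.array([[35_000_000, 17, 2]])
if model.predict(new_request)[0] == -1:   # -1 marks an outlier
    print("Anomalous request: hold payment and trigger the callback control.")
```

In production, such a score would be one signal among several, feeding a hold-and-verify workflow rather than an automatic block.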

Stay Safe from Deepfake Audio Payment Fraud

Fraudsters are improving their tools to make fraudulent requests appear legitimate in order to perpetrate payment fraud. And it's working. The best way to mitigate this is to put authentication techniques, internal-control best practices, and vendor validations in place, so that when (not if) a fraudulent request is received, a fraudulent payment does not occur.

Check back for Part 2 of this three-part series: Three Examples of How Fraudsters Used AI Successfully for Payment Fraud – Part 2: Deepfake Video.

About the author – Debra R. Richardson

Formerly the Accounts Payable Senior Manager for Vendor Setup and Maintenance, Debra now works with AP teams to mitigate fraud, avoid fines, and enhance vendor data integrity in the setup and maintenance process.

Join our free online community to connect with Debra and access more of her insights.
