When Your CFO on Zoom Is a Deepfake: The Singapore US$499K "Boardroom" Scam

John Marzella

In early 2025, a finance director at a multinational firm in Singapore joined what looked like a routine Zoom call with the company’s senior leadership. On the agenda: a confidential cross-border transaction. On the call: the CFO and several other executives, all familiar faces.

After the meeting, the director wired roughly US$499,000 to the account specified on the call.

Every person on that call was a deepfake.


What Happened

Based on the police and media reports, the incident unfolded like this:

  1. The hook - A finance executive received a message appearing to come from the company’s UK-based CFO, asking for urgent help with a confidential acquisition.
  2. The trust-building call - To make the story feel more legitimate, the “CFO” proposed a Zoom call. When the executive joined, they saw multiple “senior leaders” - faces and voices that matched people they knew.
  3. The instructions - During the call, the fake CFO outlined the need for a series of transfers to support the acquisition. The request was urgent, confidential and framed as part of a tightly controlled deal.
  4. The transfers - Reassured by the presence of leadership on video, the finance director followed the instructions and transferred approximately US$499,000 to a new beneficiary.
  5. The discovery - Only later, when the real CFO and executives denied any knowledge of the meeting, did the victim realise the entire call had been synthetic.

This is the nightmare scenario many security teams have been warning about: AI-generated video and voice convincingly impersonating leadership, in real time, to push through fraudulent payments.


Tactics: Deepfake + Authority + Urgency

This attack layered classic fraud techniques with cutting-edge generative AI:

  • Multi-party deepfakes - The attackers didn’t just fake one executive. They simulated a full “boardroom” of familiar faces, making it socially harder for the victim to push back or ask awkward questions.
  • Believable business context - The pretext was a confidential M&A transaction - a perfect blend of secrecy and urgency where extra scrutiny can feel “off”.
  • Weaponised “best practice” - How many security trainings say “If you’re unsure, jump on a call”? In this case, the video call became the proof, not something to be verified.
  • High-pressure timeline - Tight deadlines and confidentiality discouraged the finance director from running the request through normal internal controls and checks.

Where Controls Broke Down

Several assumptions that used to be safe are now dangerously outdated:

  • “If I can see and hear them, it’s really them.” - Deepfakes break this. Video is now just another untrusted medium.
  • “Unusual requests are okay if I get verbal confirmation.” - Verbal confirmation on top of a compromised channel is still compromised.
  • “Senior leadership can override process.” - The process allowed a single executive-sounding instruction to trump standard finance controls.

What was missing: an independent, cryptographic check that the human asking for money really was the person they claimed to be.


How Veraproof Challenge Can Defend Against Deepfake “Boardrooms”

Deepfakes work because they exploit informal trust signals: voice, face, and familiarity. Veraproof Challenge cuts through all of that and anchors decisions to your identity provider and policies.

Here’s how this scenario looks in a company using Veraproof Challenge.

1. Policy: Every High-Value Transfer Needs a Challenge

You implement a rule such as:

Any payment above US$50,000, or to a new beneficiary, must be authorised by a Veraproof Challenge issued to the named requester (e.g. CFO) and approved by them via SSO.

Now, when finance receives an instruction - even one delivered live on Zoom - the only valid authorisation is a completed Veraproof Challenge.
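The policy rule above can be sketched as a simple gate in code. This is an illustrative sketch only: `PaymentRequest`, `requires_challenge`, `authorise_payment`, and the US$50,000 threshold are hypothetical names modelling the rule, not part of any real Veraproof SDK.

```python
from dataclasses import dataclass

# Hypothetical policy gate - names and threshold are illustrative,
# not a real Veraproof API.
HIGH_VALUE_THRESHOLD_USD = 50_000

@dataclass
class PaymentRequest:
    amount_usd: float
    beneficiary: str
    requester: str           # e.g. the named CFO's corporate identity
    known_beneficiary: bool  # have we paid this account before?

def requires_challenge(req: PaymentRequest) -> bool:
    """A transfer needs a Veraproof Challenge if it is high-value
    or goes to a beneficiary we have never paid before."""
    return req.amount_usd > HIGH_VALUE_THRESHOLD_USD or not req.known_beneficiary

def authorise_payment(req: PaymentRequest, challenge_approved: bool) -> bool:
    # challenge_approved would come back from the IdP-backed challenge flow.
    # Nothing said on a video call, email or chat message can set it to True.
    if requires_challenge(req):
        return challenge_approved
    return True
```

The key design point: the video call never appears as an input to the decision. The only signal the gate accepts is the completed, SSO-backed challenge.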

2. The Deepfake Hits an Identity Wall

The attacker can imitate the CFO’s face and voice, but they cannot:

  • Log into the real CFO’s SSO account
  • Complete the Veraproof Challenge using strong IdP-backed authentication

If no valid challenge comes back, the payment does not proceed - no matter how convincing the video call felt.

3. Multi-Party Verification for Sensitive Flows

For especially sensitive payments, you can require multiple verified humans:

  • CFO and Treasurer must verify their identity via Veraproof Challenge
  • New overseas beneficiaries require an additional Challenge from Legal or Procurement
  • Exception thresholds for “confidential” deals are clearly defined

Deepfaking an entire board is hard enough. Deepfaking their IdP-backed identities at the same time is practically impossible without full account compromise across the leadership team.
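A multi-party rule like the one above reduces to a set check: funds move only when every required approver has independently completed their own challenge. Again a minimal sketch with hypothetical names (`REQUIRED_APPROVERS`, `all_parties_verified`), not real product code.

```python
# Hypothetical multi-party rule: every named approver must complete
# their own IdP-backed challenge before the payment is released.
REQUIRED_APPROVERS = {"cfo@example.com", "treasurer@example.com"}

def all_parties_verified(completed_challenges: set[str]) -> bool:
    """True only when each required approver has independently
    completed a Veraproof Challenge under their own SSO account."""
    return REQUIRED_APPROVERS.issubset(completed_challenges)
```

Because each approval is tied to a separate SSO account, an attacker would need to compromise every listed account at once - faking the faces on a call gets them nothing.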

4. Training People to Say: “I’ll Challenge That”

Veraproof Challenge turns “pause and verify” from optional advice into muscle memory.

When staff are trained that any unusual or high-value request must go through a Challenge, they have a simple, non-confrontational script:

Happy to help. As per policy, I’ll send a Veraproof Challenge to confirm, and then we can proceed.

That line works whether the real CFO is on the other end… or a deepfake is.


Takeaways

  • Live video is no longer proof of identity. Treat it as untrusted input.
  • Wire transfers and vendor updates should never be authorised purely on email, chat or a video call.
  • A repeatable, strong IdP-anchored challenge flow is how you stop a single deepfake meeting from rewriting your balance sheet.

If you’re worried about the “first deepfake incident” in your own organisation, the best time to bake in Veraproof Challenge is before you see your CFO on a call they never attended.