October 6, 2024

26% of execs targeted by deepfakes said fraudster’s aim was financial and accounting data

Deepfakes, images or recordings digitally fabricated or altered to manipulate or misrepresent someone or something with malicious intent, have been on the rise as businesses turn to technology to support their most essential functions. As one Chinese company and its CFO learned the hard way earlier this year, these attacks can happen anywhere at any time and can carry unpredictable costs.

According to new data from Deloitte, not only are the occurrence and fear of these types of cybersecurity attacks on the rise, but CFOs themselves may be the most affected by these breaches, given the aims of bad actors.

According to data from multiple polls during a Deloitte webcast titled “Generative AI and the fight for trust,” more than a quarter of attendees (26%) who were impacted by deepfake fraud over the last 12 months said their company’s financial and accounting data was the target.

Confidence in protecting financial data

The ability to protect financial data, arguably the lifeblood of any modern business's decision-making, is far from widespread. Fewer than half (46%) of respondents said they are confident they can protect financial and accounting data from deepfakes. A fifth (20%) admitted they weren't confident they could, and a notably large share (33%) answered "don't know" or "not applicable."

Experience appears to be the deciding factor: respondents who had already weathered one or more deepfake attacks were far more likely to express confidence in their ability to counter the next one.

Warding off fakes

As data breaches reach costs upward of seven figures, even strong cybersecurity technology may not be enough for organizations whose leaders don't prepare employees for deepfake-initiated fraud. Through training, skepticism and risk management, finance leaders and their fellow executives are preparing for deepfakes and the risks they pose to businesses' most precious commodity: data.

Findings showed a variety of ways in which organizations are preparing their teams for deepfake threats. Over a fifth (22%) said they educate their teams and articulate new threats, while slightly fewer (19%) said they conduct training such as role-playing scenarios.

Notably, more respondents said they have done nothing to counter deepfake threats than said they have turned to additional technology to do so, highlighting the continuing pressure on leaders to show a return on the tech stacks they have already invested in heavily.

However, deepfake-powered fraud hasn't erased trust in generative AI for business purposes. More than two-thirds (68%) of respondents said some trust remains, though it largely depends on the use case the generative AI serves.

Tomorrow’s risks

When asked about risks associated with AI technologies over the next 12 months, results indicated a mix of concerns: loss of trust between stakeholders (25%), further compromise of company data (22%), financial loss (18%), and reputational damage (14%).

However, when it comes to preparing for those risks, leaders are largely split on whether they're taking action now. While more say they're preparing than not in most categories, the risk that is hardest to prepare for, a loss of trust among business stakeholders, is also the one organizations are least prepared to manage.


A variety of C-suite and other executives were polled online during a Deloitte webcast titled "Generative AI and the Fight for Trust" on May 21; the number of responses varied by question. The results were published Sept. 16.
