The use of artificial intelligence (AI) is growing in the accounting world and among scammers alike. With all of AI’s benefits, there is a dark side that CPAs must be aware of so they can protect themselves and avoid becoming victims of ever more sophisticated schemes.
by Jacob R. Hough, CFE
Jun 14, 2024, 12:00 PM
You have probably seen news covering the latest advancements in artificial intelligence (AI). One “advancement” in particular, AI and deepfake technology, convinced the actor, filmmaker, and studio owner Tyler Perry to indefinitely pause his $800 million movie studio expansion.1 So, what exactly is deepfake technology and how will it impact everyday people and businesses?
Whether you realize it or not, AI is a part of many aspects of modern daily life. It powers the websites we visit, and we are greeted by AI-driven chatbots for online customer service. But with AI’s helpful progress comes a darker side: its potential for exploitation and manipulation through scams and fraud. As AI continues to advance, so do the tactics used by malicious actors to deceive and defraud.
AI, with the ability to analyze vast amounts of data and mimic human behavior, has become a powerful tool for scammers. These perpetrators leverage AI tools to orchestrate sophisticated scams that are difficult to detect and combat. One of the most unsettling examples is deepfake content, where scammers use machine learning techniques to manipulate audio and video recordings and create convincing but entirely fabricated content.
Specifically, AI voice cloning is on the rise, according to the computer security company McAfee. Approximately 77% of AI voice scam victims lost money.2 Whenever we answer the phone and hear a loved one’s or close friend’s voice on the other end, we instantly know and trust that voice. It is reasonable for us to believe that we would recognize their voices and speaking habits. However, according to McAfee, voice-cloning tools are capable of replicating how a person speaks with up to 95% accuracy.
How are they doing this? In short, we are careless about sharing our likeness, giving cybercriminals the data they need to replicate it. On video-based social media platforms such as TikTok and Instagram, we unknowingly create templates fraudsters can use to clone our voices and appearances. Based on a McAfee survey conducted in the United States, about 52% of adults share their voice online at least once per week.3 Cybercriminals can now target individuals by cloning their voices from the targets’ own social media submissions.
These advanced schemes prompted the Federal Trade Commission in 2023 to establish the Office of Technology, which tracks and keeps pace with advancements in AI and bolsters the government’s ability to protect consumers.4
Deepfake technology and AI have been exploited in various ways, from impersonating public figures to creating fraudulent videos designed to generate panic. One example of how damaging and convincing deepfakes have become was a viral photo that circulated on the internet in 2023 showing the Pentagon in Washington, D.C., on fire after an alleged explosion. It created a panic and a temporary dip in the stock market.5 The image was later determined to have been created by AI, but it was convincing enough on first viewing to elicit widespread concern.
There is more to the threat than national and international affairs. Deepfakes can be convincing enough to get a company employee to send millions of dollars to cybercriminals. An employee at one multinational firm received an email, seemingly from the company’s chief financial officer (CFO), requesting a large sum of money in an unusual transaction. Rightfully, the request raised red flags with the employee. However, the employee subsequently joined a video conference call with the CFO and other staff members whom the employee recognized. After the video call, the employee was convinced to proceed with the unusual transaction. It turns out the supposed CFO and staff members the employee spoke with were all part of an AI deepfake operation that duped the employee into handing over $25 million.6
AI tools are increasingly being adopted by CPAs and other accountants to assist with their day-to-day tasks. Some benefits associated with AI include bookkeeping automation, client communication, data analysis, and fast tax research.7 According to a study by the University of Pennsylvania, about 24% of top-performing client advisory services are using AI in their practices.8 As AI continues to advance, it is reasonable to assume that more CPA firms will follow suit. As this occurs, it is incumbent on practitioners and accountants in industry to institute safeguards to protect themselves, their companies, and their clients.
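To give a concrete sense of what "bookkeeping automation" can look like at its simplest, here is a minimal, hypothetical Python sketch that sorts bank-feed transaction descriptions into ledger categories using keyword rules. The category names, keywords, and sample transactions are illustrative assumptions, not tools or data referenced in this article; real-world systems typically layer machine learning and human review on top of rules like these.

```python
# Minimal sketch of rule-based bookkeeping automation.
# Categories, keywords, and sample transactions are illustrative assumptions,
# not tools or data named in the article.

CATEGORY_RULES = {
    "Office Supplies": ["staples", "office depot"],
    "Software Subscriptions": ["adobe", "quickbooks", "microsoft"],
    "Travel": ["airline", "hotel", "uber"],
}

def categorize(description: str) -> str:
    """Assign a ledger category by matching keywords in the transaction description."""
    text = description.lower()
    for category, keywords in CATEGORY_RULES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "Uncategorized"  # flag for human review

if __name__ == "__main__":
    transactions = [
        ("2024-05-01", "ADOBE CREATIVE CLOUD", 59.99),
        ("2024-05-03", "OFFICE DEPOT #1123", 84.10),
        ("2024-05-07", "DELTA AIRLINES TICKET", 412.00),
    ]
    for date, description, amount in transactions:
        print(f"{date}  {description:<25} ${amount:>8.2f} -> {categorize(description)}")
```

Even a toy example like this illustrates the point of the paragraph above: the more routine work a firm hands to automated tools, the more important it becomes to keep a human in the loop and to safeguard the data those tools touch.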
Here are a few tips and good practices to protect yourself from AI-powered scams:9
As technology continues to advance, we must adapt quickly to protect ourselves. AI and the benefits that stem from it can help businesses and individuals enhance their daily routines, but alongside those benefits there will always be a dark side. As these technologies evolve, we must stay at the forefront of self-protection and avoid becoming victims of ever more sophisticated and nefarious schemes.
1 www.hollywoodreporter.com/business/business-news/tyler-perry-ai-alarm-1235833276
6 https://edition.cnn.com/2024/02/04/asia/deepfake-cfo-scam-hong-kong-intl-hnk
7 https://tax.thomsonreuters.com/blog/how-do-different-accounting-firms-use-ai
8 www.journalofaccountancy.com/news/2023/nov/how-artificial-intelligence-can-help-save-accounting.html
9 https://states.aarp.org/arizona/chatbots-and-voice-cloning-fuel-rise-in-ai-powered-scams; www.mcafee.com/blogs/family-safety/how-to-protect-your-family-from-ai-scams; www.cnbc.com/2024/01/24/how-to-protect-yourself-against-ai-voice-cloning-scams.html
10 www.scientificamerican.com/article/how-to-keep-ai-from-stealing-the-sound-of-your-voice
Jacob R. Hough, CFE, is a senior consultant with Forensic Resolutions Inc. (a part of J.S. Held) in Westmont, N.J. He can be reached at jacob.hough@jsheld.com.