CPA Now Blog

Data Analytics Brings Lasting Change to World of Auditing

Discussing their feature from the winter 2020 Pennsylvania CPA Journal, Christopher T. Kosty, CPA, and Matthew Kraemer, CPA, of Schneider Downs & Co. Inc. in Pittsburgh, explore audit data analytics. Specifically, they talk about the details of a recent exposure draft on the topic, its benefits over traditional sampling, and its effect on recent accounting standards updates regarding revenue recognition and leases.

Nov 18, 2019, 07:00 AM

If you’d like, you can download this episode’s audio file. Additionally, you can follow us on iTunes, Google Play, or subscribe to our RSS feed.

By: Bill Hayes, Pennsylvania CPA Journal Managing Editor



Podcast Transcript

A recent exposure draft released by the AICPA Auditing Standards Board highlights the significant impact data analytics is set to have on the world of auditing. To walk us through this exposure draft, and to explain why data analytics can be so much more efficient than the data sampling used in traditional auditing, we are joined by Christopher T. Kosty and Matthew R. Kraemer of Schneider Downs & Co. Inc. in Pittsburgh, who authored a feature on the topic for the winter 2020 Pennsylvania CPA Journal. They join us today to provide a preview of their feature. Chris, Matt, thanks for joining us today.

In June 2019, the AICPA Auditing Standards Board released an exposure draft that addressed data analytics and the auditing process in a pretty significant way. What is the focus of this exposure draft, and how is it set to affect auditing?

[Kosty] I think I can look at this a few ways. As a whole, I think the overarching theme of the exposure draft was to address the evolving nature of business and audit services technology, how that's impacting the way we audit, and how companies run their business, quite frankly. At a more detailed level, for the auditors, it's really aimed at addressing some different assertions as it relates to audit evidence.

The AICPA took a stab at redefining a few things as it relates to data analytics that were previously pretty vague. I know there's a lot of buzzwords that kind of fly around in the data analytics community, and sometimes it's hard to differentiate what's hype versus what's reality and what's actually going to be implemented in our day-to-day processes. This was really their attempt, I think, at starting to integrate some of this new technology that we see coming through into our existing auditing standards.

There are three pillars of focus that they have throughout the exposure draft. The first one is emerging techniques and technologies, which is kind of the point of your question: how are some of these new things being integrated into existing audits? Pillar two is professional skepticism. I'll hit on that one just a little bit. There are some relevant things as it relates to data analytics there that I think are important to bring up. The third one is external information sources, and that's really just as we start to pull in outside information that we're not getting directly from a company during our audits: how are we as auditors going to be expected to properly vet the reliability of that data coming from a third party?

I won't talk about that one too much, but I do want to go back to the emerging techniques and technologies, because I think this is really the largest portion of this exposure draft and where it has the most impact for us. So, for the first time that I've seen, the AICPA takes a stab at defining audit data analytics. I'm not going to read the exact definition. It's somewhat broad, but basically the kinds of tools and techniques they consider audit data analytics, AI, which is artificial intelligence, robotic process automation, blockchain, all of these buzzwords that we've been hearing for a couple of years, they kind of loop those all into one term that they're going to call automated tools and techniques.

For the emerging techniques and technologies, the question that they're proposing is, how are auditors going to be expected to implement some of these automated tools and techniques within their audits? I think it's a good question because, as I'm sure you know, accountants aren't really known for embracing change so much.

We enjoy having some guidance and a little bit of clarification on where's the line, and can we walk up to it? We don't want to go over that line, so on and so forth. They do a nice job of providing a few examples throughout the exposure draft of what they would consider to be some audit data analytics that you can integrate into your process. Whether that's for your risk assessment, for recalculating some calculations that your client prepares, whether it's just overall analytics.

The feeling that I got from the exposure draft is there's a lot of support and encouragement to embrace this new technology, start implementing it within your engagements, and figure out where it makes sense to use, where it's maybe not going to be as relevant to use, and where it maybe crosses that line where it's not really appropriate for us to use as auditors.

I think the key takeaway from there is that they don't really believe that audit data analytics can stand alone as an audit procedure. It really needs to be paired with additional testing, some sort of client inquiry, things like that to become sufficient and appropriate audit evidence.

The big takeaway there is that they view audit data analytics as another tool in the auditor's toolbox, not necessarily a brand-new stand-alone procedure. That's the overview of that first pillar, the emerging techniques and technologies.

The second one I want to hit on real quick, because I think there is a pretty significant change as it relates directly to data analytics, is the professional skepticism focus. One of the themes there is that they're going to try to clarify how auditors need to document, or clearly demonstrate, that they have applied an appropriate level of professional skepticism.

That's something that currently isn't treated as much more than a checklist question: Have you applied the appropriate amount of professional skepticism? Yes. Okay. They're now saying, "Maybe we need to take it a little bit further. How can you demonstrate that to us?"

One of the things that's important to note here, and I think it's their aim in doing so, is that they're proposing to refocus the measure of sufficient and appropriate audit evidence. The word sufficient, they're going to start leaning more toward persuasiveness. I think historically when you hear sufficient and appropriate audit evidence, you think of quantity and quality of that audit evidence.

Some of the new automated tools and techniques, to use their phrase, and audit data analytics allow us to evaluate and process much larger volumes of data a lot more quickly. Knowing that, I think they're shifting that focus away from just sheer quantity of data to persuasiveness. What are the results of your analytics telling you? How does that align with your expectations? It's not just being able to apply a coverage approach and say, "Well, we looked at 60% of the dollar value of this balance, so we feel pretty confident that it's sufficient."

That is going to be, I think, a pretty big shift. But some of these, again, automated tools and techniques are going to help to be able to satisfy that from an audit perspective.

Just looking at an excerpt from your piece, I'll read it verbatim: "With data analytics and processing tools, identification of these factors can be applied to 100% of the population, allowing the auditor to sample much more efficiently by extracting journal entries matching this specified criteria." Can you explain to us how it is that data analytics is able to work so much more efficiently than the data sampling you'd usually see in a traditional audit?

[Kraemer] Whenever I think about audit data analytics, I always say it's not only that we're becoming more efficient, but we're also becoming more effective with our procedures. So, I think right off the bat, we're going to become more effective with our sampling. The ultimate goal for audit sampling is that we want a sample that represents the population. If we can apply these data analytics to 100% of the population, then we're able to get a better cross-section of the actual population that we're sampling from and become more effective and more systematic with our selections.

The other thing is the efficiency, obviously. I think a big piece of data analytics is being able to make them repeatable in nature. Another big buzzword is automation, robotic process automation, which our coworker, Adam Costa, talked about with you last week. Whenever you're able to apply these data analytics in a repeatable fashion, you're ultimately going to become more efficient year-to-year or period-to-period on the audit.

I'd say it's kind of twofold. We're becoming more efficient because we're able to make it more repeatable, and we're becoming more effective because we can target the higher-risk entries. Obviously, you have to be able to dispose of the risk with the high-risk entries, so maybe you test a few of those and you're able to analyze the remaining population of the high-risk entries and determine that, "Hey, there actually isn't as much risk as we thought here originally."
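As a rough illustration of the population-wide screening Kraemer describes, the sketch below flags higher-risk journal entries across an entire extract rather than a sample. The file name, column names, and risk criteria are hypothetical and would vary by engagement; this is a minimal sketch, not the firm's actual procedure.

```python
import pandas as pd

# Hypothetical journal entry extract; file and column names are assumptions.
entries = pd.read_csv("journal_entries.csv", parse_dates=["posting_date"])

# Example risk criteria applied to 100% of the population rather than a sample:
# round-dollar amounts, weekend postings, and entries posted by unexpected users.
round_dollar = entries["amount"] % 1000 == 0
weekend_post = entries["posting_date"].dt.dayofweek >= 5
unexpected_user = ~entries["posted_by"].isin(["AP_CLERK", "GL_ACCOUNTANT"])

entries["risk_flags"] = (
    round_dollar.astype(int) + weekend_post.astype(int) + unexpected_user.astype(int)
)

# Extract the higher-risk entries for targeted testing; the remaining
# population can then be analyzed in aggregate.
high_risk = entries[entries["risk_flags"] >= 2].sort_values("amount", ascending=False)
print(f"{len(high_risk)} of {len(entries)} entries flagged for follow-up")
```

Because the same script can be rerun against next period's extract, it also illustrates the repeatability point above: the selection logic stays consistent from period to period.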

Another interesting segment of your article talks about the accounting standards updates on revenue recognition and leases, which are obviously so huge for CPAs, and how data analytics can help in adjusting to these standards. Can you walk us through how data analytics helps improve the processes connected to those impactful standards in particular?

[Kraemer] I can talk about revenue recognition first because this is the one that becomes effective first. So, obviously, the revenue recognition standard has that five-step model now, and in our data analytics world we tend to focus on two of the steps in particular that we think offer the best opportunity to apply data analytics.

The first one is identifying the performance obligation and, hand-in-hand with that, determining the transaction price. The whole spirit of the revenue recognition standard is essentially, "Hey, we can't recognize revenue that we may reverse in the future." A big piece of that is making sure that there's the proper transaction price.

What we've seen in practice is you can take a transaction and follow that through the subsequent period and identify if there's been any adjustments to that same revenue transaction through a transaction ID. That will allow you to better estimate the actual transaction price after all the adjustments and be able to apply the proper transaction price to similar transactions in the future.
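One hedged sketch of that approach is below: it joins a current-period revenue extract to subsequent-period adjustments by transaction ID and computes an average adjustment rate. The files and column names are illustrative assumptions, not a prescribed format.

```python
import pandas as pd

# Hypothetical extracts; file and column names are illustrative assumptions.
current = pd.read_csv("revenue_current_period.csv")   # transaction_id, revenue
subsequent = pd.read_csv("revenue_adjustments.csv")   # transaction_id, adjustment

# Total any subsequent-period adjustments per original transaction ID.
adj = subsequent.groupby("transaction_id", as_index=False)["adjustment"].sum()

merged = current.merge(adj, on="transaction_id", how="left").fillna({"adjustment": 0})
merged["adjusted_price"] = merged["revenue"] + merged["adjustment"]
merged["adjustment_rate"] = merged["adjustment"] / merged["revenue"]

# The historical adjustment rate can help estimate the transaction price
# (e.g., expected reversals or credits) for similar transactions going forward.
print(merged["adjustment_rate"].mean())
```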

Then, the second big piece that we see is the actual recognition of the revenue. That's step five of the five-step model, and this is probably the most impactful for a lot of people because the big thing here is recognizing revenue over time as opposed to the traditional “at a point in time” revenue recognition.

One example that we've talked about, and I think we talk about in the piece as well, is the trucking industry. This impacts them. Traditionally, they would recognize revenue on a shipment whenever it reaches its destination. Now they have to recognize revenue over the in-transit period. So, pretty quickly, you can test 100% of the population or, if you're management, you can make your adjustment based on 100% of the loads in transit at the measurement date by doing a simple calculation with the origination date and the load delivery date, and allocating the revenue on 100% of those loads based on how much of the transit occurred before the measurement date and how much occurred after.
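A simplified sketch of that calculation, assuming a hypothetical loads file with origination dates, delivery dates, and total revenue per load, might look like this. Each in-transit load's revenue is allocated pro rata to the portion of the transit period completed on or before the measurement date.

```python
import pandas as pd

MEASUREMENT_DATE = pd.Timestamp("2019-12-31")  # illustrative measurement date

# Hypothetical load data; file and column names are assumptions.
loads = pd.read_csv("loads.csv", parse_dates=["origination_date", "delivery_date"])

# Loads still in transit at the measurement date.
in_transit = loads[(loads["origination_date"] <= MEASUREMENT_DATE) &
                   (loads["delivery_date"] > MEASUREMENT_DATE)].copy()

# Allocate each load's revenue pro rata to the days of transit completed
# on or before the measurement date.
total_days = (in_transit["delivery_date"] - in_transit["origination_date"]).dt.days
days_completed = (MEASUREMENT_DATE - in_transit["origination_date"]).dt.days
in_transit["revenue_to_recognize"] = in_transit["revenue"] * days_completed / total_days

print(in_transit["revenue_to_recognize"].sum())
```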

Those are really the biggest areas that we've used data analytics over the past year and a half or so to kind of address the revenue recognition standard.

The findings that come from data analytics are often really complex, and yet they then need to be communicated to people without an extensive background. Are there ways that CPAs responsible for communicating data findings can simplify them for those who are less initiated?

[Kraemer] Definitely. One of the terms that we like to throw around here is the role of a data translator. So, someone who knows enough about the data, how it's generated, where it comes from, but also knows enough about operations of the business to be able to translate that data into actionable insights or communicate precisely how that impacts the business.

Probably the biggest tool that a data translator can use to quickly communicate those trends or findings is data visualization. Data visualization is just presenting the data findings in a way that's quick and easy to interpret, using charts and graphs. There are a lot of tools out there.

Some of them are probably more in depth than others, but you can even use something as simple as Microsoft Excel to perform data visualization. We've found that, even on the audit side, it helps reduce the time spent interpreting the data. We use it a lot with our audit procedures to present our findings, so the reviewer can then come in with more of a 10,000-foot view and identify the things that are high risk on the engagement pretty quickly.

[Kosty] Going back to the first question: in the AICPA's definition of audit data analytics, they actually reference the word visualization. So, to Matt's point that we're utilizing some of those tools and concepts within our audit engagements, I think that's also something they're proposing and expecting to see adopted more industry-wide.

To Matt's point, it allows for ease of interpretation, a little bit quicker review, and can even help to point out some abnormalities that maybe weren't as easy to see just looking at the straight data set. When you give it a graphical representation, it's easier to see that something looks a little bit off or is trending in a different way.
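As a minimal illustration of the kind of visualization described here, the sketch below charts a monthly revenue trend so abnormalities stand out more readily than in the raw data set. The file and column names are hypothetical, and any charting tool (including Excel, as noted above) would serve the same purpose.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical monthly revenue extract; file and column names are assumptions.
revenue = pd.read_csv("revenue_by_month.csv", parse_dates=["month"])

fig, ax = plt.subplots()
ax.plot(revenue["month"], revenue["amount"], marker="o")
ax.set_title("Monthly revenue trend")
ax.set_xlabel("Month")
ax.set_ylabel("Amount ($)")
fig.autofmt_xdate()
plt.show()
```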

This obviously isn't a process that you're going to wing. It seems like an established action plan is going to be necessary to begin instituting data analytics into a firm's audit processes. What do you think are the key steps that need to be included in an action plan like this?

[Kosty] It's definitely necessary. When you're implementing change anywhere, it's going to be challenging, and having something in place upfront is important. In terms of key steps, the number one most important thing, and Matt would agree with me, is setting a goal or some sort of objective right off the bat before you do anything else.

With something like this, it's very easy to get into the weeds or run in a couple of different directions without a clear-cut path or goal or something you're trying to accomplish. We asked ourselves from the beginning and continue to ask ourselves, "What are we trying to accomplish? Is it time savings? Is it the effectiveness of our audits? Is it some of both? What's the intent?" Establishing that first and foremost is going to help in the long run.

Past that, get some quick wins off the bat. Maybe you have some new ideas that you want to propose to people or implement in your process. Why not just pick off the low-hanging fruit and prove to people that your theories or your ideas of how you can implement some of this stuff into your processes actually work in real practice?

Maybe take some of those things that you're used to doing on a day-to-day basis that you're very familiar with and target those first. Or something that you know people are going to get excited about to see as a quick win. Make your proof of concept that way. It'll help support your value proposition as a whole when you start trying to scale this a little bit bigger than just one or two tasks.

Another word we throw around a lot, and I think it is key throughout this whole process, is collaboration. You need to get input from all levels. Some of the staff or seniors who are working in the field, managers up through shareholders that kind of have a little bit broader oversight of the engagement as a whole. What's everybody's input, and how can you really pull that together to accomplish what you've set out as your number one objective?

Then lastly, just define success and remeasure your results and redefine success, because it's going to change throughout the process. You're going to start off going one direction, you're going to sift through the weeds a little bit and see that maybe we need to veer off this way and target something different. That's okay, and it's going to happen. Embrace that and redefine what you're trying to accomplish.

Tell people about your successes. It's really easy to hide in your corner, get your work done, maybe implement something that saves you a couple of hours on an audit engagement, and nobody will ever know about it. It's a little bit incumbent upon you to tell people about the successes, what kind of changes you've implemented, how that helped you and your team, and how you can then turn around and help that person.

Because, going back to the collaboration, the more people you tell about these things, the more they're going to come up with their own ideas, and you're going to create this environment of creativity: how can we think outside the box to get some of these things accomplished in a more efficient and effective way?

[Kraemer] You have to be a little bit boastful whenever you get those quick wins. You also have to relate how those wins actually impact everyone at the organization. I know in our journey getting the data analytics practice off the ground over the past year or so, one of the biggest wins that we've seen is whenever we get people excited about something that impacts them, because they not only want to start there, but they then want to start going at every other engagement they have that might have a better opportunity, or more opportunity, to become more efficient for themselves so they can free up and do more value-add analysis for our clients.

Disclaimer

Statements of fact and opinion are the authors’ responsibility alone and do not imply an opinion on the part of PICPA officers or members. The information contained herein does not constitute accounting, legal, or professional advice. For professional advice, please engage or consult a qualified professional.
