In a preview of her presentation at the Jan. 7, 2021, PICPA Technology for Public Accounting Conference, Leslie Kirsch, former director of central analytics for Withum, joins us to discuss the importance of data readiness and integrity, including ways to make more effective client data requests.
Consider taking a minute to participate in Leslie’s dissertation research study on the relationship between tone at the top and the risks of the fraud triangle.
By: Bill Hayes, Pennsylvania CPA Journal Managing Editor
On January 7, 2021, at the PICPA Technology for Public Accounting Conference, Leslie Kirsch, former director of central analytics for Withum, will address the importance of data readiness and integrity, including ways to make more efficient and effective data requests to clients and tips for planning data requests that tie to the objectives of an analysis. Today, she is with us to provide a sneak peek at her presentation.
You have an extensive background in data analytics, obviously. What should firms be focusing on to ensure their electronic data is reliable?
[Kirsch] Data reliability is always a really hard question, I have to admit. One of the things I always tell firms and accountants to focus on, whether you're looking at internal data or client data, is the data life cycle. You want to think about all the different spots where fraud or error could be introduced. There are going to be points where data is collected, a system where it sits and gets processed, and some point where reports get generated for you to use. At all of those points, there are going to be different types of risks.
Thinking holistically about the entire data life cycle, I think that's really critical. I know, in the session, we're going to cover a lot of specific questions you can ask about that data life cycle and how you can focus your efforts a little bit.
What would you say are some of the software tools available to people that could help accountants with ETL challenges?
[Kirsch] So ETL – that's extract, transform, and load – is how we get our data into a state where we can actually answer questions with it.
There are tons of different tools available. You don't have to worry: whatever your level of maturity in dealing with data, there are going to be tools that are right for you. I've used a variety of different tools, some of which are pretty simple. They're the next step up, let's say, from Excel, where you have a lot of control and can do a lot of things, but they're a lot more efficient at cleaning up ugly data and putting it in the shape that you need. There are a lot of vendors out there with tools that connect directly with client systems or help you access your own systems. Those don't necessarily require a lot of technical skill; you don't have to be a data scientist to use them. And if you want to take those next steps into the data science world, there are also tools you can learn to program yourself.
I know we're going to be covering these different levels of tools, and the pros and cons and what you have to be ready for if you want to use them. There are tons of different tools depending on what your specific challenges are that you're facing.
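To make the "extract, transform, load" idea concrete, here is a minimal sketch of the kind of cleanup step these tools automate, written in Python with pandas as a stand-in for any ETL tool. The column names and messy values are hypothetical examples of a typical "ugly" export, not taken from the discussion:

```python
import pandas as pd

# Extract: simulate a messy general ledger export (hypothetical columns/values)
raw = pd.DataFrame({
    " Account ": ["1000", "1000", "2000"],
    "Posting Date": ["2020-01-15", "2020-02-01", "not a date"],
    "Amount": ["1,250.00", "300.50", "75.00"],
})

# Transform: normalize headers, parse numbers and dates, drop unusable rows
raw.columns = raw.columns.str.strip().str.lower().str.replace(" ", "_")
raw["amount"] = pd.to_numeric(raw["amount"].str.replace(",", ""), errors="coerce")
raw["posting_date"] = pd.to_datetime(raw["posting_date"], errors="coerce")
clean = raw.dropna(subset=["amount", "posting_date"]).reset_index(drop=True)

# Load: at this point the cleaned frame could be written wherever the
# analysis needs it (a database, a Parquet file, a workpaper, etc.)
print(len(clean))             # rows that survived cleanup
print(clean["amount"].sum())  # total of the parseable amounts
```

The point of the sketch is the middle step: headers get standardized, formatted numbers and dates get parsed, and rows that can't be parsed get surfaced rather than silently carried along.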
What are the important professional standards for data reliability that firms should rely on when they're solving those ETL challenges?
[Kirsch] That's always, to me, one of the most fun things about being a data scientist who works in accounting and auditing: not only do we have to worry about the best practices data scientists think about every day, we also have to worry about what the professional standards say we need to do. Document, document, document – that's what the standards say when we think about the replicability of our work and what we need to document so that peer reviewers can see it, and when we think about audit standards for supervision and review of work. Even when all we're doing is cleaning ugly data, we still have to make sure the process we used was reliable.
Sometimes when we think about the software tools that are available, especially AI-based tools coming from a vendor, you have to worry about understanding why they're accurate and why you can trust them to do the cleanups. Going back to our software tools question, that's really tied to which tools you would choose and why. There's an aspect of professional standards that we think about in that too.
I wonder if you could share a few tips to help firms with their data requests to clients.
[Kirsch] Definitely this one is a sneak peek, but this is so critical. I'll say it to people 100 times. I'll give them the same advice here that they're going to hear in the session, which is specificity. All of the time, I see the same mistake happen when we talk to clients, where we ask them, "Hey, give me this report, give me a general ledger for the year." The thing is, when you ask for that, the client's going to give you whatever they use day-to-day for a general ledger. Not only is it probably missing data points you care about, it's also possibly not in the format that you want it in.
All the time, I find that if you just say to a client, "Hey, give me a general ledger that includes X, Y, and Z," and you say, "Give it to me in Excel or give it to me in a CSV," you avoid getting a PDF when you could have gotten something different just by being specific when you asked for it. So, part of solving ETL challenges is just being a lot more specific. It doesn't hurt you. The worst thing is that they say, "I can't do that." And you say, "Well, then let's do this instead."
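One practical companion to a specific request is an equally specific check on what comes back. Below is a small Python sketch that verifies a client's CSV export actually contains the columns you asked for; the required column names are hypothetical placeholders for the "X, Y, and Z" in a real request:

```python
import csv
import io

# Hypothetical checklist: the columns we asked the client to include
REQUIRED_COLUMNS = {"account", "posting_date", "amount", "description"}

def check_gl_file(text):
    """Return a sorted list of requested columns missing from a CSV export."""
    reader = csv.reader(io.StringIO(text))
    # Normalize the header row the same way the request was phrased
    header = {col.strip().lower().replace(" ", "_") for col in next(reader)}
    return sorted(REQUIRED_COLUMNS - header)

# A client export that left out one requested field
sample = "Account,Posting Date,Amount\n1000,2020-01-15,1250.00\n"
print(check_gl_file(sample))  # → ['description']
```

Running a check like this on day one turns "the export is missing something" into a concrete follow-up request, rather than a discovery made mid-analysis.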
When you're planning data requests, what's the best way for firms to ensure their results meet the analysis objective?
[Kirsch] That's really the core of it. I've been an audit data analytics specialist for a little over 15 years now, and I think one of the hardest things is to really make sure that you've backed yourself up to, "Why am I doing this task? What is it that I'm hoping to achieve by it?" One of the biggest questions you get is around things like the level of analysis: what level of detail would it take to meet that objective?
Another common mistake I see all the time is that, at the end of the day, we perform data analytics where we've kind of checked the system against itself – we got two different reports, but they came from the same system. A lot of the time, backing up to the objective and reminding yourself, "Well, I want an independent source of information for this," means you're thinking about not just what reports you need, but what systems they need to come from for you to get to that objective.
Leslie, thanks for being with us today and for sharing this information with our audience. As you were good enough to mention a couple of times during the discussion, we're looking forward to the presentation at the January 7, 2021, PICPA Technology for Public Accounting Conference. We're sure our audience is in for an informative session. Thank you very much. And I believe you had some information about an anonymous survey that you wanted to share with our audience.
[Kirsch] So I'm so excited and I hope that I'll see a lot of folks at the conference. I'm so looking forward to talking to everybody about all things data. I'd love anybody who listens to this podcast, if you'd be willing to participate, I'm actually conducting some research right now on the fraud triangle. I have an anonymous survey. I'm looking to talk to people who haven't committed fraud about some of their experiences with the fraud triangle. Anyone willing to participate, I know we're putting the link to the survey on the page for this podcast. Hopefully you'll be able to take it. I hope I'll see you all at the conference.