Data analytics can be a powerful tool – when you understand how to successfully implement it.
The subject of data analytics is being discussed more frequently in audits and is often touted as ‘the future of auditing’.
However, this has led many audit teams to believe that they must use data analytics to perform an effective audit, and the result is often not what they hoped for or expected.
Trying to force data analytics into an audit, rather than looking at where it might align with the audit strategy, wastes time and resources, and can give false comfort to those who rely on the results without understanding how the tests work.
The main issue is almost a catch-22: you can’t implement suitable data analytics without prior experience and understanding, and you can’t gain that experience and understanding without applying data analytics in a real audit.
Choosing your tools appropriately
To help audit teams get started, there are plenty of data analysis tools available, ranging from easy-to-use Excel add-ins to more complex and powerful systems which often require additional investment in training. There are even tools which use machine learning (often under the banner of artificial intelligence, or AI) to review large amounts of data against criteria, adapting their reporting and varying the criteria based on the auditor’s responses or unusual activity. This allows the focus to remain on key issues while ignoring standard exceptions which are known and have mitigating controls.
This variety leads many audit teams to their first big question – which one do I need for my business?
Due to the cost and complexity of these tools, simply exploring them all to find the best fit is inefficient. Understandably there is a lot of hesitation when deciding whether to take a risk and invest in software which might not be right for you.
The main driver for the software should always be to match an audit need and achieve a required goal. For example, if you have low volumes of data for which you want to do straightforward analytical review (like looking at large values or duplicates) then a simpler solution which requires minimal training, such as Excel, is most likely to suit your needs to begin with.
Alternatively, if you want to review data from multiple sources in a variety of formats to look for any correlations and trends then you are likely to need more powerful tools (resulting in more software and training costs).
Whatever the complexity, if you haven’t at least explored solving your analytical needs using basic spreadsheet software to understand your current limits, then you risk taking on advanced tools without fully understanding your own requirements. Often a small investment in Excel training is enough to drive the role of analytics forward in a business.
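The kind of straightforward analytical review mentioned above (flagging large values and duplicates) can be sketched in a few lines. This is a minimal illustration using Python's pandas library in place of a spreadsheet; the column names and the 10,000 threshold are illustrative assumptions, not prescribed values.

```python
# Minimal sketch of a simple analytical review: flag large values and
# potential duplicates in a ledger. Column names and threshold are
# illustrative assumptions only.
import pandas as pd

ledger = pd.DataFrame({
    "invoice_no": ["INV-001", "INV-002", "INV-002", "INV-003"],
    "amount":     [1200.00, 15000.00, 15000.00, 350.00],
})

# Flag items above a materiality-style threshold.
large_items = ledger[ledger["amount"] > 10_000]

# Flag potential duplicate postings (same invoice number appearing more
# than once anywhere in the population).
duplicates = ledger[ledger.duplicated(subset="invoice_no", keep=False)]

print(large_items)
print(duplicates)
```

The same two tests are exactly what conditional formatting and `COUNTIF` achieve in Excel, which is why a spreadsheet is often enough to begin with.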
Thankfully, many software providers are happy to offer detailed demonstrations (often for free) and discuss their suitability in relation to your needs.
When should we use data analytics in the audit?
One good indicator that analytics could help is where the current substantive testing approach involves reviewing a large number of items from single or multiple data sources against fixed, measurable criteria (such as items over a given amount). In this case there’s a good chance that automating the substantive test can give comfort over 100% of the population with a high level of efficiency.
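A hedged sketch of what automating such a test looks like: every record in the population is checked against the same fixed criteria, and only exceptions are surfaced for manual follow-up. The field names and the 5,000 approval threshold are illustrative assumptions.

```python
# Sketch of a 100%-coverage substantive test: apply a fixed, measurable
# criterion to the whole population and report only the exceptions.
# Field names and the 5,000 threshold are illustrative assumptions.
import pandas as pd

payments = pd.DataFrame({
    "payment_id": [1, 2, 3, 4],
    "amount":     [4500, 9800, 120, 7600],
    "approved":   [True, False, True, True],
})

# Fixed criterion: payments over 5,000 must carry approval.
exceptions = payments[(payments["amount"] > 5_000) & ~payments["approved"]]

# Only the unapproved large payments are left for manual review.
print(exceptions)
```

Because the rule is applied to every record rather than a sample, the auditor's effort shifts from selecting and vouching items to investigating the exception listing.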
However, a better question might be: when should you not use data analytics? Many audit teams have become frustrated when their first attempt at using analytics has failed.
Situations which often result in analytics failing are:
Data unreliable - if you cannot demonstrate that the source data is complete and accurate or that the system is robust to prevent unauthorised manipulation, then you cannot have comfort in your analytical conclusions. In this case, the first step would be to seek to improve the client’s control environment so that future analysis can be performed on reliable data.
Data extraction issues - many systems and applications can’t easily generate the level of detail required for analysis in a single report. For example, if the aim is to review all customer details but the system can only extract one customer report at a time then the effort required to pull together all the required data may outweigh any analytical benefits.
Walkthrough failed - before you perform any analytical test, you should perform a walkthrough with a smaller data set (even one record would do). If you cannot apply your analysis to a single record and get an understandable result, then there is no point in attempting to extract and test 100% of a large data set. For example, you might be matching sales orders to invoices, but on testing one you find that matching them requires manual inspection of the physical invoice, in which case you are unlikely to be able to match them analytically.
Data is unstructured - if the data is not in a tabular format (with field titles and ordered columns) it becomes more challenging and time-consuming to ‘clean’ the data manually, and the potential to make mistakes at this point is high. Data analysis is still possible, but the risk of error increases, so in such cases it would be advisable to explore different ways of producing the data.
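The 'walkthrough first' point above can be illustrated in code: try the analytical match on a single record before extracting the full population. This is a minimal sketch; the join key (`order_no`) and field names are illustrative assumptions about how orders and invoices might share a reference.

```python
# Sketch of a one-record walkthrough: if a single sales order cannot be
# matched to its invoice on a shared key, full-population matching will
# fail too. Key and field names are illustrative assumptions.
import pandas as pd

orders = pd.DataFrame({"order_no": ["SO-100"], "order_value": [250.0]})
invoices = pd.DataFrame({"order_no": ["SO-100"], "invoice_value": [250.0]})

# indicator=True adds a "_merge" column showing whether each order found
# a matching invoice.
walkthrough = orders.merge(invoices, on="order_no", how="left", indicator=True)

matched = bool((walkthrough["_merge"] == "both").all())
print("walkthrough matched:", matched)
```

Only once the single-record walkthrough produces an understandable result is it worth the effort of extracting and testing the full data set.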
Getting the most out of data analytics
To maximise the benefits from data analytics it is essential to build a suitable framework and methodology which suits your auditing needs and maintains consistency in practice. But creating this is often a challenge without prior experience, so the audit team should consider the following:
Start small - understanding and experience are essential to implementing data analytics. To build this expertise, it can be useful to start with straightforward analytical routines in Excel, either replacing basic tests or running in parallel with normal testing to ensure that the results meet expectations. This is an investment in time and resources, but it quickly builds up understanding and can lead to more complex analytical testing.
Focus training - training too many staff in analytics at the start is often problematic, as without regular use of the tools and techniques the training can quickly be forgotten. Instead, a more suitable use of resources can be to train a small number of staff who, through repeated application of those skills, develop their expertise and then train others as required.
Don’t replace thinking with analysis - many auditors have found themselves simply running the same analysis scripts and routines from previous years, putting the results on file and moving straight on to the next test without challenging what just happened. The risk is that circumstances have changed, either in the process itself or through corruption of the scripting. It’s not unheard of for people to discover new ways around automated detection. For example, analysis may pick out staff who post large, unusual expense claims without authorisation, but then someone discovers that posting multiple small claims goes undetected. As with all audit tests, analysis should be continually challenged to ensure it meets the audit requirements.
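The expense-claims example above shows why an analysis must evolve with the behaviour it tests. A hedged sketch of one possible response: alongside the single-claim threshold test, aggregate claims per employee per day so that values split into several small postings are still caught. The column names and the 500 authorisation threshold are illustrative assumptions.

```python
# Sketch of extending a threshold test to catch split postings.
# Column names and the 500 threshold are illustrative assumptions.
import pandas as pd

claims = pd.DataFrame({
    "employee": ["A", "B", "B", "B"],
    "date":     ["2024-03-01", "2024-03-01", "2024-03-01", "2024-03-01"],
    "amount":   [700, 200, 200, 200],
})

# Original test: any single claim over the authorisation threshold.
single_hits = claims[claims["amount"] > 500]

# Extended test: the same employee's claims on one day, taken together.
daily_totals = claims.groupby(["employee", "date"], as_index=False)["amount"].sum()
split_hits = daily_totals[daily_totals["amount"] > 500]

print(single_hits)   # catches employee A's single 700 claim
print(split_hits)    # also catches employee B's three 200 claims
```

The unchanged script would have reported employee A year after year while employee B's split claims sailed through, which is exactly the kind of drift that re-running last year's routine unchallenged fails to notice.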
Data analytics can drive large efficiencies in audits and allow a deeper and wider understanding of the challenges facing a business, but it might not suit every audit. Starting small and building up knowledge and understanding is key to successfully implementing data analytics. More advanced toolkits can then be explored to address understood needs.
Finally – never forget that the role of the auditor hasn’t changed. The audit still requires interpretation of results, professional scepticism and a challenge for continual improvement. Data analytics simply allows the auditor to apply these skills more effectively.
Andrew Davidson – IT audit senior manager, Johnston Carmichael
Additional resources: webinars
ACCA UK's Internal Audit Network ran a series of four webinars on big data and how to use it from March to May this year on the following topics:
what is big data?
the legislation around big data
data analytics – assurance from an audit perspective
how internal audit can use data to provide assurance.
Each webinar lasts approximately one hour and provides one unit of verifiable CPD where it is relevant to your work. You can register for the on-demand version of these webinars here.