In the words of William Shakespeare from The Merry Wives of Windsor, "Better to be three hours too soon, than a minute too late."
Robotic Process Automation (RPA) is commonly used by finance teams to automate and simplify processes that are part of everyday accounting operations. Many great RPA vendors help finance teams automate previously time-consuming, manual accounting processes with a standardized, auditable approach that improves speed to value and accuracy while lowering costs.
However, when it comes to producing repeatable analytics, many finance teams are still stuck in legacy manual processes. The lack of automation in the analysis of accounting/finance data leads to untimely results for business users and makes it difficult to leverage the analysis for more informed business decisions.
In fact, many finance analysts are still stuck in “Excel Hell” when it comes to analytics. This creates a gap between when and how analytical results are delivered to business users. Continuous manual data movement not only slows time to value, but also carries other inherent risks, including:
- Risk to data accuracy. Constant data movement results in a higher risk of human error due to manual (and undocumented) transformations, data re-keying errors and inaccurate interpretation of source data.
- Lack of detail. Data is commonly summarized to fit the analytic, resulting in a lack of the detail that supports the summarized results and requiring the consumer to ‘trust’ the analysis.
- Inability to socialize the results. A spreadsheet can be shared with other users, assuming those users know how the analysis was performed and that it is in the format they are looking for. Often, however, the results are not available to other users because the analysis was performed in a siloed environment. This inability to socialize results leads to different users repeating the exact same analysis.
Innovative finance teams are now turning to automation to solve the complexity and challenges of these manual processes. Automation not only significantly improves the timeliness of results, it also produces analytical data sets that are more accurate, detail-driven and flexible.
How is this done?
The first step in the automation of manual processes is the documentation of current methodologies. Because these manual tasks are completed monthly, this step is usually not difficult. In fact, this process should be documented regardless of whether it is automated: documentation provides an audit trail of the required steps and often leads to new ideas for completing the tasks more efficiently. A benefit/risk assessment of the process should also be completed. The documented process is a critical resource when turnover or transition occurs in the organization.
Identification of data sources is the next critical step. Where are the data sources required for the analysis located? How is the data acquired? How do I incorporate it into my current process? Is the process manual or automated? Is data manually entered, and what is the risk if it is entered incorrectly? Is the detailed data that supports the summary results available? Answering these questions helps determine the risk of inaccurate data integration and the benefits of an automated approach.
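One way to assess the risk of manually entered data is to reconcile the detail rows from a source system against the totals re-keyed into the analysis. The sketch below illustrates the idea in Python; the account names and amounts are hypothetical, not taken from any particular system.

```python
# Minimal sketch: reconcile detail rows against re-keyed summary totals.
# All data below is hypothetical, for illustration only.
from decimal import Decimal

# Detail rows as exported from a source system: (account, amount)
detail_rows = [
    ("4000-Revenue", Decimal("1250.00")),
    ("4000-Revenue", Decimal("310.50")),
    ("5000-COGS", Decimal("-600.25")),
]

# Summary totals as manually re-keyed into the analysis spreadsheet
summary_totals = {
    "4000-Revenue": Decimal("1560.50"),
    "5000-COGS": Decimal("-600.25"),
}

def reconcile(detail, summary):
    """Return accounts whose detail does not tie to the summary total."""
    computed = {}
    for account, amount in detail:
        computed[account] = computed.get(account, Decimal("0")) + amount
    exceptions = {}
    for account, total in summary.items():
        diff = computed.get(account, Decimal("0")) - total
        if diff != 0:
            exceptions[account] = diff  # non-zero difference flags a re-keying error
    return exceptions

print(reconcile(detail_rows, summary_totals))  # {} -> everything ties out
```

In an automated pipeline, a non-empty result would halt the process and route the exceptions to a reviewer instead of letting a re-keying error flow silently into the analysis.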
Another step is to understand the uses of the analysis. Who are the users? What are they using it for? What critical business decisions are based on these results? Do they require summary information, or is supporting detail required? What is the business value?
Another important question to ask is: what happens if this analysis is no longer available? Too often, legacy processes are carried out without understanding who is using the analysis and whether it still serves their business needs. Automating a process without understanding the business needs and the benefits delivered to end users will result in failure.
Once the process has been documented, automation can begin. Select workflow and modeling tools that are understandable to business users and don’t require technical coding skills or difficult maintenance. Modeling tools should be designed for the business and be powerful enough to drive the analysis down to granular detail. The results can be summarized in the reporting tools but will have the underlying detail to support anomaly analytics. Storing the results (both summary and detail) in a platform that is accessible to all is invaluable. The result set will contain detail-driven results and the source data used to create the analysis, delivering more consistent, accurate and timely results to the business.
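The idea of summarizing for reporting while retaining the underlying detail for anomaly analytics can be sketched in a few lines of Python. The cost centers, accounts and amounts below are invented for illustration.

```python
# Minimal sketch: build summary totals for reporting while keeping a
# drill-down link to the detail rows behind each total.
# All data is hypothetical, for illustration only.
from collections import defaultdict

# Detail-level result set: (cost_center, account, amount)
detail = [
    ("CC-100", "Travel", 420.00),
    ("CC-100", "Travel", 5800.00),   # outlier worth investigating
    ("CC-100", "Supplies", 75.25),
    ("CC-200", "Travel", 390.00),
]

summary = defaultdict(float)       # (cost_center, account) -> total
drill_down = defaultdict(list)     # (cost_center, account) -> detail amounts
for cost_center, account, amount in detail:
    key = (cost_center, account)
    summary[key] += amount
    drill_down[key].append(amount)

# A report shows the total; anomaly analytics can drill into the rows
print(summary[("CC-100", "Travel")])      # 6220.0
print(drill_down[("CC-100", "Travel")])   # [420.0, 5800.0]
```

Because the detail travels with the summary, a consumer who questions the CC-100 travel total does not have to ‘trust’ the number: the 5800.00 outlier is immediately visible behind it.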
Automation reduces manual errors, provides more timely results, and delivers analysis that is more accessible, detail-driven and strategically capable. However, the decision to automate must be cost effective, generate a return on the investment and provide business value to the ultimate end user.
Now that the data foundation is established, the process is automated and a roadmap for finance analytics is in place, we should consider what is in store for the future. In the next CFO Analytics blog, we will address the Future of CFO Analytics and how it can drive new insights for your business users.