I was sitting in a prospective client meeting not long ago, discussing their current monthly cost modeling process. The client described the roles and responsibilities of each financial analyst. The analysts worked closely with their business partners to update the model drivers, volumes, and expenses, using Excel spreadsheets as their collection tool.
The conversation then switched to how the analysts got the collected data in and out of their cost modeling tool. The analysts had to consolidate their Excel spreadsheets and send them to an IT analyst, who had total responsibility for loading the data, running the tool, and extracting the results for validation by the analysts. The process could run within a day, but most often it was a two- to three-day exercise. Depending on when the financial analysts started working with their business partners, it could take anywhere from three days to two weeks to communicate the impact of changes made in their Excel spreadsheets back to their partners. It didn't matter whether the change was a minor adjustment to a percentage or a major overhaul of an operations group; it could take weeks to fully understand the impact on service or product costs and rates.
It made me reflect on my career and how cost modeling has changed, or more importantly hasn't. My first major cost modeling project was 10 years ago. That client had a single modeler/financial analyst responsible for consolidating every individual model for her colleagues. She aggregated all of the data in an Access database and loaded it into a consolidated master model. It could take upwards of three weeks for her to report back the impact of any individual modeler's changes to the data.
Given the time required to report back the impact of modeling changes, you might assume these examples are outliers: that these organizations weren't using the latest tools and best practices, or that they were simply inefficient. Unfortunately, scenarios like these are more the norm than the exception. These organizations operate much like the majority of clients Armada has worked with over the last 10 years, and works with currently.
I would have hoped that, with advancements in technology and the rollout of new cost modeling applications from SAS, Oracle, Acorn, and others, the pace would have quickened. I would have hoped that financial analysts would possess the tools they need to make changes and report the impact of those changes in a matter of hours, not days.
Somehow the industry has adopted the belief that running cost modeling applications, getting data in and out, and reporting are the responsibility of IT; that an advanced technical skill set is needed; that understanding and writing query languages is required. This acceptance has led to building large teams around technology. Too often, companies have as many resources running the application as they have analyzing the data coming out of it. Instead of using the wealth of information cost models provide, they spend their time maintaining production environments.
It is time for all of this madness to stop. It is time to empower financial analysts with the tools they need to build, update, maintain, and report on their cost model without complete reliance on an IT counterpart.
As the Product Manager for Armada's cost modeling tool, Acumen, I have made it my mission to create a tool with the financial analyst in mind. Every enhancement and release of Acumen is focused on one primary objective: the user experience. Specifically, can a finance user do everything they need within Acumen? Can they make changes, load bulk data, and get results without having to know SQL and without ever leaving Acumen? And can we do all of that while still providing a tool that IT can script to run a minimal-touch production cycle?
We know that when a model is in a production cycle, a significant portion of the hierarchies, drivers, volumes, and other data should be loaded systematically, without manual touches from a financial analyst. By giving organizations access to Acumen's table structure, we give them the ability to create ETL processes to load all of that data.
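To make the idea concrete, here is a minimal sketch of what such an ETL load might look like. Acumen's actual schema is not described here, so the staging table name (`stg_driver_volumes`) and its columns are purely hypothetical, and SQLite stands in for whatever database the real process would target.

```python
# Illustrative only: the table and column names below are hypothetical
# stand-ins, since Acumen's real table structure is not public here.
import sqlite3


def load_driver_volumes(conn, rows):
    """Bulk-load driver/volume rows into a hypothetical staging table.

    Each row is (period, cost_center, driver, volume). Returns the total
    row count in the staging table after the load.
    """
    conn.execute(
        """CREATE TABLE IF NOT EXISTS stg_driver_volumes (
               period TEXT, cost_center TEXT, driver TEXT, volume REAL)"""
    )
    # executemany performs the bulk insert in a single statement loop,
    # which is the kind of systematic load a scheduled ETL job would run.
    conn.executemany(
        "INSERT INTO stg_driver_volumes VALUES (?, ?, ?, ?)", rows
    )
    conn.commit()
    return conn.execute(
        "SELECT COUNT(*) FROM stg_driver_volumes"
    ).fetchone()[0]


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    sample = [
        ("2024-01", "CC100", "server_hours", 1200.0),
        ("2024-01", "CC200", "tickets_closed", 340.0),
    ]
    print(load_driver_volumes(conn, sample))
```

In a real production cycle this load would be scheduled and monitored rather than run by hand, which is exactly the hands-off path the analyst should never have to think about.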
In our next major release of Acumen, we are integrating it with Aegis, Armada’s data manager and monitoring tool. Aegis will allow both finance and IT users to script loads from data sources without ever leaving Acumen. We have built a tool that provides the best of both worlds.
By allowing finance analysts to manage the modeling process and data, organizations can focus on what is important: driving business value from the modeling data.