It’s time to improve cost model efficiency

Not long ago, I was sitting in a meeting with a prospective client, discussing their monthly cost modeling process. The client described the roles and responsibilities of each financial analyst. The analysts worked closely with their business partners to update the model’s drivers, volumes, and expenses, using Excel spreadsheets as their collection tool.

The conversation then turned to how the analysts moved the collected data in and out of their cost modeling tool. The analysts had to consolidate their Excel spreadsheets and send them to an IT analyst, who was solely responsible for loading the data, running the tool, and extracting the results for validation by the analysts. The process could finish within a day, but more often it was a two- to three-day exercise. Depending on when a financial analyst started working with their business partners, it could take anywhere from three days to two weeks to communicate the impact of spreadsheet changes back to those partners. It didn’t matter whether the change was a minor adjustment to a percentage or a major overhaul of an operations group; it could take weeks to fully understand the impact on service or product costs and rates.

It made me reflect on my career and how cost modeling has changed, or more importantly, hasn’t. My first major cost modeling project was 10 years ago. That client had a single modeler/financial analyst responsible for consolidating every individual model for her colleagues. She aggregated all of the data in an Access database and loaded it into a consolidated master model. It could take upwards of three weeks for her to report back the impact of any individual modeler’s changes.

Looking at the time required to report the impact of modeling changes, you might think these examples are outliers. You might assume these organizations weren’t using the latest tools and best practices, or that they were simply inefficient. Unfortunately, scenarios like these are the norm rather than the exception. These organizations operate much like the majority of clients Armada has worked with over the last 10 years, including those we work with today.

I would have hoped that advancements in technology and the rollout of new cost modeling applications from SAS, Oracle, Acorn, and others would have quickened the pace. I would have hoped that financial analysts would possess the tools they need to make changes and report the impact of those changes in a matter of hours, not days.

Somehow the industry has adopted the belief that running cost modeling applications, getting data in and out, and reporting are the responsibility of IT; that an advanced technical skill set is needed; that understanding and writing query languages is required. This acceptance has led companies to build large teams around the technology. Too often, they have as many resources running the application as they have analyzing the data coming out of it. Instead of using the wealth of information cost models provide, they spend their time maintaining production environments.

It is time for all of this madness to stop. It is time to empower financial analysts with the tools they need to build, update, maintain, and report on their cost model without complete reliance on an IT counterpart.

As the Product Manager for Armada’s cost modeling tool, Acumen, I have made it my mission to create a tool with the financial analyst in mind. Every enhancement and release of Acumen is focused on one primary objective: the user experience. Specifically, can a finance user do everything they need within Acumen? Can they make changes, load bulk data, and get results without having to know SQL and without ever leaving Acumen, all while IT can still script the tool to run a minimal-touch production cycle?

We know that when a model is in a production cycle, a significant portion of the hierarchies, drivers, volumes, and other inputs should be loaded systematically, without manual touches from a financial analyst. By giving organizations access to Acumen’s table structure, we give them the ability to create ETL processes to load all of that data.
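To make that ETL path concrete, here is a minimal sketch of what such a load might look like. Acumen’s actual table structure is not documented in this post, so the table name, the columns, and the use of a SQLite database below are purely illustrative assumptions; a real implementation would target the organization’s own database and Acumen’s published schema.

    import csv
    import sqlite3

    def load_volumes(db_path, csv_path):
        """Bulk-load a volume extract into a hypothetical staging table."""
        conn = sqlite3.connect(db_path)
        # stg_volume and its columns are illustrative stand-ins for the
        # model's real staging tables, which Acumen's schema would define.
        conn.execute(
            "CREATE TABLE IF NOT EXISTS stg_volume ("
            "period TEXT, cost_center TEXT, driver TEXT, quantity REAL)"
        )
        # Read the consolidated extract and stage every row in one pass.
        with open(csv_path, newline="") as f:
            rows = [
                (r["period"], r["cost_center"], r["driver"], float(r["quantity"]))
                for r in csv.DictReader(f)
            ]
        conn.executemany("INSERT INTO stg_volume VALUES (?, ?, ?, ?)", rows)
        conn.commit()
        conn.close()
        return len(rows)

Scheduled as part of the production cycle, a script along these lines replaces the manual consolidate-and-email step described earlier: the data lands in the model on a schedule, and the analyst’s time goes to validating results rather than shepherding spreadsheets.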

In our next major release of Acumen, we are integrating it with Aegis, Armada’s data management and monitoring tool. Aegis will allow both finance and IT users to script loads from data sources without ever leaving Acumen. We have built a tool that provides the best of both worlds.

By allowing finance analysts to manage the modeling process and its data, organizations can focus on what is important: driving business value from the modeling data.

When ad hoc reporting becomes the norm

Why do ‘ad hoc’ reports exist? By definition, these reports are meant to be run only once, to answer a specific business question that cannot be answered with an organization’s standard reporting. However, many organizations end up regularly running reports that are not included in their standard reporting, defying that one-time definition. There are three common reasons this occurs:

  • Changes or developments in the market that require additional analysis valuable to the business.
  • Proof of concept: a new idea or way of reporting that requires company buy-in before it can be implemented as a standard report.
  • Progression of manager knowledge: when implementing a new reporting system, many companies limit the amount of data given to managers to avoid information overload. As managers adapt and begin to understand the information given to them, reports can be expanded to include additional data.

While these are valid, even desirable, reasons to expand upon standard reporting, regular “ad hoc” reporting is a practice that can lead to multiple inefficiencies, including excess reporting and quality errors, while consuming unnecessary IT resources. Management should therefore anticipate and plan for these occurrences and create procedures to manage and implement the ongoing changes in reporting requirements.

A great way to manage this process is to facilitate manager collaboration. Doing so not only increases manager engagement, it also lets managers communicate the information they need and how they consume and analyze it. This can help identify obsolete reporting and determine where to redeploy those resources to better serve current information needs.

Collaboration also promotes the transfer of knowledge between managers. Managers can share the ideas and techniques they use to analyze the data already available to them, techniques with which some of their peers may not be familiar. In many instances, ad hoc requests and business problems can be solved with data that is already available, but the business managers lack the know-how to get that data into a format they can leverage. A collaboration group lets the managers who have solved those problems share their techniques, resulting in greater overall efficiency and less demand on IT resources.

As previously mentioned, it is important for an organization to understand that updating reporting requirements should be a continuous activity. To best manage those changes, manager reviews, collaboration sessions, and implementation should be set on a recurring timeline. While monthly or quarterly reviews may not be necessary, annual or semi-annual reviews can help de-clutter and streamline the reporting process while keeping managers engaged.

Implementing a reporting system in any organization, whatever its size, is a considerable task that can consume significant resources. Integrating ongoing maintenance into your implementation plan helps preserve the quality and continuity of your investment.