There are some axioms in financial management you simply can’t violate, no matter how smart or how good you are. For instance: when it comes to FP&A reporting, you may want fast, accurate and cheap, but you can only pick two.

Of the three, in my experience, the only “non-negotiable” choice is “accurate.” Choosing between the remaining two is where conflict often arises.

Indeed, in an environment of almost constant change (one of the challenges of working at a large corporation), even basic financial reporting and the economic rules of the company can be altered from year to year. So, while flexibility and nimbleness may be great for leadership in adapting to changing business conditions and needs, they can be a real challenge for finance and IT.

Case in point: years before a company I once worked for took the plunge and installed a top-tier ERP, we used a homegrown financial reporting tool. It wasn’t a “bad” tool per se, and most finance users had back-end access to the data and built custom reports using manual tools. But any time users start building nonstandard reports, it takes more time to get results and the possibility of errors increases.

(To clarify: nonstandard reporting does not mean reorganizing a P&L or drilling a report down from the back end to show more detail. Rather, it’s when users pull data from multiple sources and filter items in and out based on their own criteria. These reports are then used for review and analysis, but users in the field can’t tie them back to any standard report. While the report creator likely had good intentions, the result is usually confusion. Moreover, such reports usually take significant time and effort to build, and if users don’t have confidence in the results, they simply ignore them, which means wasted effort and zero results.)

At the time, big ERPs like SAP, Oracle, PeopleSoft, and JD Edwards were exploding onto the scene, many of them geared toward manufacturing companies. We were a service company still on a homegrown system that gave almost all of finance access to the back-end tables that supported all reporting. While this kind of access was empowering and facilitated analysis, it also led to reports being created that didn’t tie back correctly. More than one fire drill was caused by bad reporting.

In an effort to address this, leadership installed a dashboard and reporting tool on top of our existing financial system. The tool made use of existing reports, allowed for ad hoc reporting, and did not require some of the technical skills our homegrown tool demanded. In this instance, they chose accurate and cheap. The new reporting layer added some time to getting results out, but it also limited users’ ability to manipulate the back end, thus increasing accuracy (bye-bye, nonstandard user-created reports!). The approach was also orders of magnitude less expensive than a new ERP.

Because I was intimately familiar with the back end of the existing tool and had a penchant for tying schedules out, I was tasked with being a functional expert on the dashboard project. It was a typical role: describe the reporting process to the consultants assisting us, point out tables in the back end and how to use them, and build check data for testing. I was there to answer any questions about how end users see and use data.

There was one additional request, however: a nonstandard report for direct nonpayroll (DNP) spend. This was a hot topic with leadership, and we had no standard report to satisfy the request. The design had all the hallmarks of a nonstandard report: data from various sources, back-end manipulations based on management decisions, and no tie-out to any known report. It required considerable time in meetings devising how to build a report that would satisfy everyone. That process, along with circulating sample reports to test users, took weeks but was eventually successful. (The measurement of DNP depended on the current economic model; more on that in a minute.)

As with any other report, I then gave the specs for DNP reporting to my technical counterpart. Since this report was nonstandard, it required quite a bit more effort from the technical team (i.e., a scratch build) to implement. It was challenging enough that the entire dashboard project was taking more time than anticipated. Then, as we were approaching a new fiscal year, leadership decided to make changes to the economic model for the upcoming year. That meant every report would have to go through remediation to make sure it complied with the new economic model.

As with most testing, we knocked out the simplest and easiest reports first. After some small fixes we moved on to the more complicated ones. The next thing I knew, we were meeting with project leaders asking why all the reports were not yet updated, with specific questions about the DNP report and why it was still outstanding. I had to remind the leads that the DNP report was scratch-built, not a copy/import from our existing financial system, and that changing the economic model heavily impacted all the prior work done on designing the queries that supported it. Bottom line: the age-old reporting rule (fast, accurate, cheap … pick two) had come home to roost.

Things settled down once our project leads understood the economic model’s impact on rebuilding the DNP report. The technical consultant assisting me was assigned to the project full time. It took time (and money) to properly rebuild and test the report, but it got done and delivered on its requirements. (Though additional coding was required when the economic model changed once again.) Ultimately, we delivered a fast and accurate report … it just wasn’t cheap.

These types of situations don’t have to happen

This all happened years ago. Today’s technology makes it possible to avoid these situations with automated systems, integrated ERPs and robust reporting tools. Cloud-based, purpose-built systems like Finario, for instance, are continuously updated to stay best-in-class and ahead of the curve; they combine standard functions with the ability to customize where needed; and they reinforce sound governance through the consistent application of required data, decision rationale and analyses.

Though, as the axiom goes, it may not be “cheap” to digitally transform key elements of the finance function, it’s not “expensive” either. That’s because the return on investment from reduced labor hours, better cost management, improved decision making and the ability to pivot quickly in times like these is, as another saying goes, a “no-brainer.”