Executive management often has the wrong idea about how to do cost optimization. It’s a misunderstanding that stems from the traditional benchmarking process, which may follow this common storyline:
1. Business leaders hire a generic management consulting firm, which typically lacks practical implementation experience with benchmarking. Unfortunately, even within specialist advisory firms, the benchmarking capability is almost always housed under the research function instead of advisory, eliminating the possibility of any meaningful context being applied to the benchmarks.
2. They sign up with a traditional benchmarker that produces research- and survey-based data. They pay for access to what is essentially a data catalog compiled by spreadsheet analysts who are far removed from the realities of an actual transaction. The data consists of averages of averages and is unlikely to reflect any unique IT environment.
3. The organization runs a shallow rate exercise that takes a narrow, rates-only view, missing the broader factors that determine the total cost reduction opportunity.
4. Executive management continues to treat benchmarking as a quick and tactical procurement exercise, as opposed to a strategic undertaking that can guide critical decisions involving employee retention, outsourcing strategy, and contract renewals.
This is why traditional benchmarking is a failed process.
Data is meaningless without proper analysis and context. Even if you see results, you have no idea what was bloated or inefficient in the first place. Worse, the committee may now believe that IT costs are too high and must be cut when, in reality, the costs are appropriate for the current environment, and the right move is to transition to a new environment (which requires investment). In that case, doing nothing would have been better than acting on the benchmark.
Moreover, traditional benchmarking providers are typically housed under research outfits that treat the process as a productized data catalog split up into a roster of roles, with the goal to circulate and upsell as many times as possible.
And when there are more salespeople than actual IT advisors, there's no strategy behind the data extrapolation. Researchers have no experience sitting at the negotiation table. They've never done a cost assessment, and the numbers they share are based on surveys rather than hands-on work. At best, the data is funneled over from the advisory side of the business, and much is lost in translation.
Is the data archived and normalized? Is it an apples-to-apples comparison? Probably not—and there’s no chance an IT organization can achieve true optimization potential when rudimentary spreadsheet data is applied to chaos.
Complicating things further is the sales buzz surrounding automation. The sheer volume of newfangled technologies and services claiming to solve all your ills is overwhelming. Global investment in robotic automation is projected to grow at a 60% CAGR through 2020, reaching $6.5 billion. But how much of that will be wasted because no one has figured out how the IT organization is run in the first place?
The path to proper cost reduction must begin with a more holistic benchmarking methodology: one based on data sourced from actual engagements that accounts for the bigger picture of the organization.