It’s a truism that European financial services providers face continuing environmental and market pressures, and that their appetite for transformation keeps growing; low interest rates, regulatory and supervisory pressure and evolving competition have been par for the course for the past decade, forcing banks and insurers to adopt a range of ‘change the business’ strategies.
That said, only a small number of European institutions have been able to sustain exceptional cost-to-income ratios (CIR), which we categorise as below 45%. These ‘Performers’ are far outnumbered by the organisations in the ‘Sustainer’ category (CIR 45-55%) and by the majority, the ‘Laggers’ (CIR above 55%). The difference between the three categories is typically explained by an organisation’s ability to build a sustainable cost advantage from its investments in transformation: reducing product complexity, automating manual processing, optimising processes, implementing new architectures, or managing people and transformation smartly.
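The categorisation above amounts to a simple threshold rule. As a minimal sketch (the thresholds are from the text; the function name and the sample figures are purely illustrative):

```python
def categorise(cost_to_income_ratio: float) -> str:
    """Classify an institution by cost-to-income ratio (CIR)."""
    if cost_to_income_ratio < 0.45:
        return "Performer"   # exceptional: CIR below 45%
    elif cost_to_income_ratio <= 0.55:
        return "Sustainer"   # CIR between 45% and 55%
    else:
        return "Lagger"      # CIR above 55%

# Hypothetical figures, for illustration only
for cir in (0.42, 0.50, 0.63):
    print(cir, categorise(cir))
```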
Conversely, the Laggers tend to maintain complex product portfolios, optimise manual labour rather than automating it, and retain high-cost legacy IT systems.
The first of these strategies has been the business model review, in which organisations redefine their business model by focusing on products, clients and markets, either enhancing existing offerings or creating brand-new models and products. We have found that these efforts typically offer cost savings potential of around 10-20%, but with only a short-term scope and effect.
The second popular approach has been the redesign of procurement and IT sourcing practices: increasing the efficiency of sourcing goods and services and finding the best and cheapest providers. While the cost savings potential of sourcing optimisation is higher, at 15-30%, the effect and scope are again short-term, and procurement optimisation can typically only be carried so far before it starts to affect quality.
Organisational restructuring is a third common approach, involving the merging, demerging or redesign of operating models. Its savings effect, typically around 10%, is sustained over the medium to longer term. Complicating restructuring, of course, are the cultural issues that must be addressed alongside changes to people’s roles, responsibilities, competencies and headcount.
The last of the four common cost and efficiency strategies is process improvement. It offers a high cost savings potential, at 30-40%, but requires an ongoing commitment from the organisation to constantly reinvent and reengineer the value chain. The banking and finance industry has spent billions over the past decade implementing process improvement programs such as Lean Six Sigma, with results ranging from the exceptional to the dismal. Outcomes have depended on the organisation’s willingness to invest over extended time frames, the work done to instil the required process mindsets and competencies, and the stamina to commit to a program of work that may outlast the tenure of most CEOs. Needless to say, results have varied significantly, and the reputation of methodologies such as Lean has suffered from inconsistent application.
For operations managers, a more recent ‘silver bullet’ has been the appearance of automation technologies. In fact, the recent demise of Lean Six Sigma as a ‘brand’ can be at least partially attributed to the perception that automation will enable the evolution of banking operations and significant cost savings without lengthy process discovery and improvement. Unfortunately, the reality has been less compelling: many banks and insurers have found that their automation programs have ‘hit the wall’, with initial programs stalling, cultural issues abounding, quality and performance challenges persisting, and the overall maturity of automation not progressing past its initial state. Automation in isolation has clearly not been the answer, so the question is: what can organisations do to achieve efficiency and cost control while maintaining their ability to transform to meet evolving customer and environmental needs?
The answer comes from a combination of methodologies and approaches that the industry has already dabbled in, plus a newer approach to process discovery and control. Process improvement is still necessary, although in a modified form. The paradigm of the 12-week process improvement effort has given way to new methodologies of process discovery that yield a deeper understanding of process flows and frequencies, plus data on time, cost and quality that was previously available only after significant data collection and statistical analysis. I am talking about process mining: the technology that sits at the foundation of financial services transformation.
To see why, consider the three ways we typically measure processes: time, cost and quality. Traditional process discovery required operational excellence professionals to run multiple workshops with subject matter experts, each with their own perception of how the process ‘really runs’. The result is a combination of discrete process views that may or may not reflect reality, and that certainly will not encompass every process path. The time taken for discovery and mapping can be measured in weeks to months, the cost is significant and the quality of the result inconsistent.
Compare this to process-mining-based discovery. Every interaction with a system leaves a digital footprint; system logs from one or more of the systems involved are consolidated to provide full process transparency, along with the time, cost and quality data that would normally take further weeks to collect. This full-spectrum collection can be further enriched with data from other sources, such as trade processing, markets or procurement systems.
Using these digital footprints, optimal process sequences can be identified, from the ‘happy path’ to the so-called ‘process strangers’: process variations that occur rarely but represent significant risk and performance aberrations. KPIs and targets become easier to define, and benchmarks can be drawn against both internal and external sources.
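The core of this discovery step can be sketched in a few lines of standard Python. The event log below is invented for illustration (real logs would come from a core system’s audit trail, and commercial tools such as ProcessAnalyzer do far more), but it shows the essential move: group log events by case, collapse each case into its sequence of activities, and count the resulting variants, so the most frequent variant surfaces as the ‘happy path’ and the rare ones as candidate ‘process strangers’.

```python
from collections import Counter
from itertools import groupby
from operator import itemgetter

# Invented event log: (case_id, timestamp, activity) tuples, as might be
# extracted from a loan-processing system's logs.
events = [
    ("c1", 1, "Receive application"), ("c1", 2, "Check credit"), ("c1", 3, "Approve"),
    ("c2", 1, "Receive application"), ("c2", 2, "Check credit"), ("c2", 3, "Approve"),
    ("c3", 1, "Receive application"), ("c3", 2, "Check credit"),
    ("c3", 3, "Request documents"), ("c3", 4, "Check credit"), ("c3", 5, "Approve"),
]

# Sort by case then time, and collapse each case into its activity sequence.
events.sort(key=itemgetter(0, 1))
variants = Counter(
    tuple(activity for _, _, activity in trace)
    for _, trace in groupby(events, key=itemgetter(0))
)

# The most frequent variant is the 'happy path'; rare variants (here, the
# rework loop in case c3) are where risk and waste tend to hide.
for variant, count in variants.most_common():
    print(count, " -> ".join(variant))
```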
From an automation perspective, process mining represents the state of the art. Process variations can be quickly identified and understood, and the organisation can use this in-depth knowledge to first optimise the process, removing waste and enhancing quality; then digitalise the underlying data; and finally pinpoint where automation will deliver the greatest return on investment and how it needs to evolve to keep delivering maximum value. This three-step process is the key to automation efficiency and effectiveness, and it is only achievable through the insights and awareness that process mining brings.
Author:
Ashley Cooper
Senior Consultant, Financial Services
BearingPoint Finland
Join us on December 8th at 3:00 pm EEST, when I will talk further about the challenge of financial services transformation and how process mining can act as a catalyst for ongoing change in environments of uncertainty and cost pressure. We will be joined by Mika Moisio from QPR Software, who will demonstrate the ProcessAnalyzer platform and describe how process mining is the key to ongoing change in your organisation.