
Facing down the noise challenge: finding answers in weak signals

A leading financial services company was faced with a conundrum. It was dealing with over 55,000 car loan applications a year, of which only 30 (roughly 0.05%) could be proven to be fraudulent. However, the overall cost of fraud to the organisation amounted to some €600,000 per year. Clearly the situation merited closer analysis, yet no matter what the organisation tried, from asking loan applicants for more information to running increasingly complex queries against its database and business intelligence tools, it came no closer to identifying potentially fraudulent applications in advance. The volume of information was simply too great.

The organisation then tried something new: running an exhaustive analysis of its entire data set using complex event intelligence techniques. Within a matter of days, this yielded specific rules identifying the 7% of loan applications most likely to prove fraudulent – which could then be treated with greater scrutiny. It was also possible to derive rules for applicants who posed minimal risk, enabling the loans process to be simplified and freeing up resources to treat more complex cases. The result: faster processing for the majority of applicants, more accurate identification of problems and, overall, reduced costs to the organisation.
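To make the idea concrete, the sketch below shows one way screening rules of this kind can be derived from historical application data: a shallow decision tree is trained on past outcomes and its branches are read off as human-reviewable rules. This is an illustration only, not the method used in the case study; the file names, column names and thresholds (loan_applications.csv, loan_amount, prior_defaults and so on) are invented for the example.

  # Illustrative sketch only: deriving simple screening rules from historical
  # loan data. Column names, file names and the choice of scikit-learn are
  # assumptions; the case study does not disclose its actual technique.
  import pandas as pd
  from sklearn.tree import DecisionTreeClassifier, export_text

  # Historical applications with a known outcome (1 = proven fraudulent, 0 = genuine)
  applications = pd.read_csv("loan_applications.csv")   # hypothetical file
  features = ["loan_amount", "applicant_income", "vehicle_age", "prior_defaults"]

  # A shallow tree keeps the resulting rules few enough for analysts to review
  tree = DecisionTreeClassifier(max_depth=3, class_weight="balanced", random_state=0)
  tree.fit(applications[features], applications["fraud"])

  # Print the branches as if/else rules that can be inspected and refined
  print(export_text(tree, feature_names=features))

  # Flag the riskiest share of new applications for closer scrutiny,
  # fast-tracking the remainder
  new_apps = pd.read_csv("incoming_applications.csv")   # hypothetical file
  new_apps["needs_review"] = tree.predict_proba(new_apps[features])[:, 1] > 0.5

In practice the risk threshold would be tuned so that the flagged share of applications matches the scrutiny capacity available, much as the case study settled on the riskiest 7%.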

The kinds of issues illustrated by this case study reflect those faced by the majority of organisations we meet. Business leaders and front-line staff want to make the best possible decisions to drive their organisations forward. Too frequently the challenge is having too much information, meaning that any desire to drive innovative thinking and more insightful decision-making is quashed.

According to the Washington Post, the amount of information available in the world increased by a factor of 100 in 20 years, from 2.6 to 276 billion gigabytes. Without a doubt, businesses have benefitted: they reach new markets, understand their customers far better and are more productive. At the same time, companies face a growing challenge: the quantities of data now in existence can make it harder, not easier, to support decision making. Businesses are battling exploding quantities of information: the current buzz around “big data” equates to dramatic increases in the volume, velocity and variability of information, which combine to create what, for decision makers, feels like little more than noise.

Retail, for example, has become a complex web of supply chain management, customer buying behaviours, store locations, loyalty schemes and joined-up marketing. Healthcare is equally complex, with data coming from demographic shifts, organisational structures, pre-emptive diagnostics and integrated therapeutic pathways.

Given the inordinate amounts of data being generated across these activities, is it really any wonder that signals get lost? The consequence is that decision makers fall back on ‘best-worst’ current models, public hospitals continue to focus on beds over successful treatments, and sales team behaviours reflect where they are in the quarter.

The inevitable result is a less efficient business, as decision makers have no time to work out a better alternative. Organisations can sometimes be their own worst enemies, focusing on the numbers rather than the strategy, or continuing to use out-dated Key Performance Indicators (KPIs) whether or not they remain meaningful for the evolving business. Equally, each department can have its own drivers and measures, which can sometimes come into conflict.

Businesses can stumble or even fail due to having an incomplete picture. This was brought into sharp relief at J.P. Morgan Chase, where senior executives’ exclusive focus on the ‘bigger picture’ of trading risk was central to the company’s recent $25 billion loss in shareholder value. Meanwhile, back in 2000, deregulated UK power company Independent Energy went bust because of its inability to pull together billing data.

How can these challenges be resolved?

Most organisations we talk to are on a continuous improvement journey – be it in terms of product quality, business process re-engineering or supply chain rationalisation. A similar path is both possible and desirable for achieving higher levels of insight. However, as long as it is too difficult to extract real meaning from the wealth of available information, decision makers will fail to see the wood for the trees.

Information technology is supposed to hold the answer 

If IT were doing a better job of delivering the right information to the right people at the right time, organisations would be better able to get on with the job. Not all of the blame can be placed at the door of technology, but several reasons exist to explain why it has not yet delivered on its promise.

Why are companies stuck with traditional models?

Data has traditionally existed in ‘silos’. The first data stores were little more than internal repositories for software applications, and even with the arrival of relational databases the link between an application and its data has proved hard to break. Even today, applications from CRM and ERP systems to social networking tools tend to ‘sit’ on top of a repository built and architected for that purpose.

As a result, organisations depend on a series of data islands that contain data of varying quality. Various attempts to improve data access, integration, transformation and analysis have been made over the years, under headings such as data warehousing, business intelligence and analytics. Meanwhile, numerous techniques have been used to mine that information, from fuzzy logic to statistical analysis.

Despite the serious levels of innovation and thinking that have gone into such tools, they have, in general, been limited by the siloed nature of data repositories. Rather than working horizontally across all data islands, organisations have been forced to make extensive use of ‘the human layer’: they have had to engage smart people who can decide which repositories to access, which queries to write and which variables to test. The human layer has also been critical when it comes to interpreting the outputs of analysis.

Nothing is wrong with this model in principle. However, most organisations recognise that even with a great deal of effort put into using such tools and employing smart people, the results they arrive at never present a complete picture. Traditional approaches and the tools they use are non-exhaustive: they cannot execute all possible analyses. Equally, they lack the intelligence to interpret data in any meaningful way and rely on human intervention to do so.

The result is partial interpretation of an incomplete picture, based on an historical understanding. Even the brightest people can limit their imagination to what is already there (e.g. existing KPIs) which means it can be difficult for an organisation to move from where it is currently. The opportunities for leaps in understanding or breakthroughs in how business is done are minimised by the fact that the organisation is starting from a restricted and therefore restrictive position. 

What if organisations were able to work with the fullest set of data at their disposal and still derive useful insights? Such ideas have been considered in the past, but then ruled out for one simple reason: the amount of processing power required was simply ‘too great’. By ‘too great’, read that the costs of processing outweigh the potential returns or are unaffordable for many organisations, or alternatively that processing massive data sets takes so long that, by the time the results arrive, the world has already moved on. The result is that organisations continue to employ traditional models that look at information vertically, employing non-exhaustive solutions to specific problems and relying on the human layer to decide where to focus their efforts.

But this is about to change. 

Are we on the brink of a breakthrough?

Computers are able, at least in theory, to work exhaustively on a problem. The heritage of ‘artificial intelligence’ goes back at least 50 years, to when rule-based algorithms were first used to parse sentences and prove mathematical theorems. The field has seen continued progress, particularly in sectors such as investment banking and advanced research, where the economics of high performance computing make sense.

The opportunity to use these advances to solve complex business problems is huge. Consider the fraud example described earlier: while it is possible to take a best guess at the causes of fraud, today most organisations spend a great deal of time checking customers in advance, hoping that this will minimise the chances of fraud down the line. Using the kinds of complex event intelligence tools available today, it is possible to define a detailed set of rules about which situations lead to fraud, and which are highly unlikely to do so. This enables the right questions to be asked of applicants, saving time for both company and customer, and minimising risk in the process.
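As a rough sketch of how such rules might then drive the questions asked of applicants, the snippet below routes each application either to fast-track processing or to manual review with targeted follow-up questions. The rule conditions, field names and thresholds are hypothetical and are not drawn from the case study.

  # Illustrative sketch only: applying derived screening rules so that extra
  # questions are asked only where a rule fires. Conditions and field names
  # are invented for the example.
  from dataclasses import dataclass
  from typing import Callable, Dict, List

  @dataclass
  class Rule:
      name: str
      applies: Callable[[Dict], bool]   # predicate over one application record
      extra_questions: List[str]        # only asked when the rule fires

  RULES = [
      Rule("high_amount_low_income",
           lambda a: a["loan_amount"] > 8 * a["monthly_income"],
           ["Request proof of income"]),
      Rule("prior_default_on_record",
           lambda a: a["prior_defaults"] > 0,
           ["Request explanation of the prior default"]),
  ]

  def triage(application: Dict) -> Dict:
      """Return the routing decision and any follow-up questions for one application."""
      questions = [q for rule in RULES if rule.applies(application)
                   for q in rule.extra_questions]
      return {"route": "manual_review" if questions else "fast_track",
              "questions": questions}

  print(triage({"loan_amount": 30000, "monthly_income": 2500, "prior_defaults": 0}))
  # -> {'route': 'manual_review', 'questions': ['Request proof of income']}

The point of the sketch is the design choice it illustrates: low-risk applicants pass straight through with no extra questions, while additional checks are applied only where a specific rule gives reason for them.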

While we have some great examples of what such models can make possible, they have remained in the hands of the few rather than the many. The key for business is return on investment (ROI) – that is, how can organisations be sure that the money spent on exhaustive data analysis will result in either increased revenues or reduced costs? The answer is that in the past they couldn’t: processor time was simply too expensive to justify running such a task. Yet even though data volumes have been increasing rapidly, the cost of processing has been plummeting at a faster rate.

An additional variable concerns the size of each problem, which has not seen the same level of growth as data volumes or processing power. For example, the quantity of information available about a malaria outbreak today has not grown significantly compared with that of ten years ago – nor have the variables required to understand it. What has changed, however, is that due to the continued miniaturisation of technology, both the time taken to execute analytics tasks and the overall cost of processing have fallen sharply.

A task that took 3 months of supercomputer time and €100 million in 1999 now takes under 24 hours to run and costs less than a tenth of this figure. Computing power crossed this threshold about two years ago: tasks such as the fraud example moved from being a costly option to an affordable reality. As the relative price continues to drop, we are convinced that complex event intelligence will become a business necessity, translating directly into ROI. “We’re looking at significant cost savings in R&D, and much faster time to market,” explains a key decision maker at a healthcare product manufacturing company.

An equally significant contributory factor, one which will further catalyse adoption, is the mainstream arrival of cloud computing. While critics have suggested that it offers little more than a return to the ‘computer bureau’ models of the 1970s, the hosted processing model is ideal for the delivery of such services. Given that many organisations will want to run complex event intelligence as a one-off, such capabilities can be made available for a far wider set of possible uses – from healthcare and manufacturing to market analysis and fraud prevention – than if organisations had to buy in their own infrastructure. At last, all the pieces of the jigsaw are in place, enabling a new approach to complex event intelligence.

Becoming an insight-driven organisation

This goes far deeper than a simple, stepwise improvement in technology. The goal is to break out of the cycles of traditional analytical reporting and use tools to deliver new levels of insight, not just more numbers. By applying such tools, businesses become free to concentrate on what the data actually says and where to focus their efforts, rather than drilling ever further into the detail in the hope it will reveal some answers.

Becoming an insight-driven business is not just about the tools. It is also up to organisations to become smarter, both in their broader understanding of what is going on and in their ability to enact day to day business decisions. At one end of the spectrum are companies that are content to work within traditional structures and follow traditional approaches, either because their business models are relatively fixed (for example, the legal and accounting professions) or because they see no real benefit in changing. At the other end, many organisations will look for opportunities to use higher-order analytics tools such as complex event intelligence, either as a one-off or on a repeated basis, to be sure that they are covering all the bases.

There are no wrong answers; instead each organisation can consider where it fits on the insight maturity scale:

  • Traditional – the current approach, which explores existing data pools vertically, using specialists to drive statistical tools.
  • One-off – in which a broader view across data sets is geared around solving a specific problem, as a one-off initiative with predictable ROI.
  • Repeatable – where the organisation makes use of complex event intelligence techniques on a regular basis, for example to identify new growth areas as part of business strategy development.
  • Insight-driven – real-time use of complex analytics to keep a picture of the changing business context, adjusting business models and driving operational decisions.

With processing costs continuing to fall, the pool of problems that can be solved cost-effectively will broaden. While today complex event intelligence may be better suited to one-off tasks delivering specific insight, the ROI threshold is quite clearly moving (see figure 5). This reveals new opportunities and enables more organisations to continue further on the journey towards greater insight.

BearingPoint's HyperCube

HyperCube is a complex event intelligence tool, enabling exhaustive analysis of massive data sets to identify new insights – possibilities that the business might not have thought of, or the actual causes of situations such as staff retention, systems failure or insurance fraud. HyperCube offers a response to the challenge, enabling a bigger-picture view to be established while at the same time providing an unprecedented understanding of the detail. It is recognised by industry analyst firms and research bodies such as IDC and the Institut Pasteur as offering breakthrough potential for organisations across a variety of domains.

Taking the leap – from insight to action

With such capabilities as complex event intelligence reaching the mainstream, we believe organisations have an unprecedented opportunity. It is rare to see such a close alignment between the long-standing need for greater insight from the still-growing pools of data businesses have at their fingertips, and the availability – finally! – of technologies that can deliver on this requirement.

While the opportunity is compelling, such a transformation cannot happen overnight. Becoming an insight-driven business is not for the faint-hearted. The business will have to change both its organisational structure and its day to day activities. This affects everyone from senior executives to front line staff.

To become truly insightful, the organisation needs to integrate the outputs of such tools into every strand of business decision making, while at the same time ensuring that compliance and governance rules are adhered to. Ignorance can be bliss: some businesses might not be happy to discover that they have been doing it all wrong, or may prefer not to find out. Such businesses, however, face an even greater risk: that of being caught out by the unexpected.

Ultimately, the question is not whether to adopt complex event intelligence technologies, but what are the risks of not doing so – both in terms of dealing with specific challenges and in terms of the lost opportunities to gain competitive advantage. While the promise is great, the first step does not need to be far-reaching, as the power of such tools quickly becomes apparent when applied to a compelling need.

Whether an organisation chooses to act on the insights the tool reveals is a matter for the decision makers concerned. Whatever happens, decision makers can no longer claim that they were working on the basis of insufficient information.

  • Acknowledgements
    The author would like to thank Eric Falque, Thierry Lalande and Ludovic Leforestier at BearingPoint; Jon Collins at Inter Orbis
