A mountain of template-based reporting is pushing the financial sector to the precipice; it is time to consider the input approach to lighten the compliance load and release the information bottleneck
In previous studies from the BearingPoint Institute, we have looked at how financial regulation is changing across Europe, highlighting significant regional differences. Across the board, however, regulation has underperformed over the past decade. From the scandals that first brought Sarbanes-Oxley and Basel II into force through to more recent initiatives, regulations have often been treated less as a way of improving governance than as a charter to exploit loopholes.
By continuing down the path of increased regulation for banks and insurers, the main result has been more complexity (and therefore more loopholes) without having a noticeable effect on either good governance or reduced risk. Hundreds of thousands of data points must be reported in an increasingly onerous template-based reporting system that is pushing an already weakened financial sector to the precipice.
So, where next for regulatory reporting? In this article we capture the views of some of the main players in the debate and ask the difficult questions. Are organizations really geared up to deal with the increasing paperwork mountain being created, and to what extent does it meet the objectives of the regulators? Could they cope with real-time reporting, and what are the alternatives? Can organizations and regulators work together to deliver a better framework which benefits all parties?
One example of a country looking to buck the trend is Austria, where such moves are already being considered. We look at this example to see how countries and regulators are starting to think differently about the role of regulation, what fundamentals need to be kept and where there is room for change.
The solution creates a software platform that bridges the gap between the IT systems of Austria’s central bank (OeNB) and the banks. This allows critical information to be extracted from the sector at will by the central bank without increasing the administrative burden for the data providers. It marks a significant shift in regulatory and statistical reporting, away from the archaic system of form-filling, to a future framework better able to cope with the growing demands of supervisors, including ad hoc requests that fall outside the regulatory reporting cycle.
The European Central Bank is now looking into the data input approach to manage the mammoth task of supervising the eurozone’s most important banks. And with careful handling, this approach may be emulated elsewhere.
Is your organization ready for a paradigm shift in regulatory reporting?
Austria’s Alpine slopes tend to get most attention at the start of a year, but in central bank circles, all eyes in early 2015 are on the opposite end of the country. In Vienna, ambitious changes to the collection and interrogation of bank data by Austria’s central bank, the Oesterreichische Nationalbank (OeNB), are causing quite a stir.
The solution adopted by the country’s forward-thinking central bank and banking sector represents a new approach to regulatory reporting, leaving formatted templates to the annals of history.
The new methodology creates a software platform that bridges the gap between the IT systems of the OeNB and the banks. This allows critical information to be extracted from the sector at will by the central bank without increasing the administrative burden for the data providers.
It marks a significant shift in regulatory and statistical reporting, away from the archaic system of form-filling, to a future framework better able to cope with the growing demands of supervisors, including ad hoc requests that fall outside the regulatory reporting cycle.
From conversations with our banking and insurance clients, we know how much pressure they are under from their growing compliance duties, which often force repetition of data entry and have started to affect the efficiency of day-to-day operations.
Since the latest regulatory rules were introduced in the eurozone in 2014, all 8,000 or so EU banks are committed to reporting up to 700,000 data points (note 1) quarterly, in layer upon layer of different templates, in a digital language called XBRL. And this is just for prudential reporting; combined with statistical reporting, the total is far higher. How long before we reach 1,000,000 data points? Is that really sustainable?
Frustrations were apparent at the annual gathering of the financial services industry at Euro Finance Week in Frankfurt am Main in November 2014. Christian Clausen, who was president of the European Banking Federation (EBF) until the end of 2014, told the audience that it was time to ask whether we have gone too far, too fast: ‘It is time to recalibrate the regulations … banks need to be competitive and need to be able to lend’ (note 2).
Even though reporting timeframes have been squeezed from months to weeks to ensure a more timely view of financial risks, we do not believe this is adequate to prevent the failure of another financial institution that is systemic to economic stability – precisely the end goal of the supervisors who beefed up the rulebook.
Could we be looking down the barrel of real-time reporting? This could mean tipping the balance too far the other way, with implications for the quality of information delivered to regulators as well as the risk of keeping to the form whilst losing the substance of reported data.
One thing is clear: prescriptive templates no longer work in a fast-changing digital world. Urgent debate is needed on how the world’s financial services industry could be better and less onerously supervised via a smarter approach to regulatory reporting.
It is time to recalibrate the regulations … banks need to be competitive and need to be able to lend.
Christian Clausen, former president of the European Banking Federation
Since many of the world’s biggest economies were brought to the brink by the shocking collapse of Lehman Brothers in 2008 (note 3), the focus has been on addressing the cracks in the global financial architecture.
Some of the deepest fissures were caused by gaps in data. Not being able to identify the scale of exposure to Lehman Brothers or its affiliated network, traders panicked and pulled out of positions that may well have been sound. Lehman Brothers wasn’t an isolated case.
All of a sudden, not being able to identify counterparty risk turned a bad situation into a catastrophic one. The crisis exposed the need for high-quality, comparable and timely data on the global financial network. Since then, policymakers, regulatory agencies and standard-setters in Europe have been collaborating to further harmonise and standardise supervisory reporting for banks and insurance companies.
A milestone in this journey was reached in November 2014 (note 4), when the European Central Bank (ECB) took over the supervision of about 130 of the eurozone’s largest financial institutions under the Single Supervisory Mechanism (SSM).
In regulatory circles this event has been hailed as a major breakthrough. Mario Draghi, ECB president, described the SSM as an ‘absolute necessity’ at the seventh ECB Statistics Conference in October 2014 (note 5).
One thing is clear: prescriptive templates no longer work in a fast-changing digital world
The SSM is a step in the right direction, with common rules helping to monitor risk more effectively. The stress tests carried out in 2014 on 123 major European banks by the European Banking Authority (EBA) (note 6), the EU bank watchdog, were lauded as an example of what can be achieved using these common methodologies.
The exercise was designed to provide supervisors, market participants and institutions with consistent data to compare and contrast EU banks’ resilience under adverse market conditions. Stress tests are likely to become a fixture in the already demanding reporting landscape – with additional compliance work for banks under the watch of the EBA.
European insurers are also battling their own regulatory reporting demons with the EU’s pensions and insurance regulator, the European Insurance and Occupational Pensions Authority (EIOPA), conducting its own stress tests in 2014 (note 7).
The EIOPA exercise took six months to complete; the regulator issued nine sets of questions, each with its own reporting template. With the new regulatory framework, Solvency II, set to be introduced in January 2016, even more scrutiny is on its way.
EIOPA is also preparing for the implementation of Pillar 3 of the Solvency II regime (note 8), where regulatory reporting is even more complex than the submissions required by the EBA. Asking for large numbers of so-called quantitative reporting templates (QRTs) will set a new benchmark in the volume of data that can be collected using outdated methodologies and technologies.
It is right that supervisors should be able to demand data more frequently and with more granularity from organisations that could represent systemic risk, but the current framework of reporting is becoming increasingly costly and time-consuming for data providers.
So we come to the radical solution being adopted in Austria, where the regulator and the regulated joined forces to turn the tables on the template-driven model and use new technologies to create a new regulatory value chain (note 10). The initiative is based on greater harmonisation and integration of data within banks as well as greater integration of the IT systems of the supervisory authority and the supervised entities.
The way it works is through a buffer company, called Austrian Reporting Services GmbH (AuRep), which is co-owned by seven of the largest Austrian banking groups, representing 87% of the market (note 11). This allows cost-sharing of compliance as well as standardisation of data collection.
AuRep runs on BearingPoint ABACUS, a common software platform, which works as the central interface between the banks and the OeNB (note 12). Granular bank data sets are captured automatically for supervisors to interrogate in whichever way they want, whilst the banks retain control over their commercially sensitive data, maintaining only the so-called ‘passive data interface’ on the AuRep platform.
AuRep is owned by seven institutions representing the majority of the Austrian market
The operating model solves the problems with the status quo, which Dr Johannes Turner, Director, Directorate General of Statistics, OeNB, outlined at the Euro Finance Week conference in November 2014 (note 13). ‘You take a simple, plain vanilla loan and you have to report it five times,’ he said. ‘Different departments within the bank will be required to provide the same data to the regulator at different times.’
Austria’s new framework has the potential to succeed in clearing the information bottleneck. It represents a paradigm shift in bank supervision and statistical data remittance, finally putting an end to the delays associated with requests and formatting, and allowing greater reconciliation between numbers collected for various purposes.
700,000 data points must be reported by banks monitored by the European Central Bank
Austria’s new framework has the potential to succeed in clearing the information bottleneck; it represents a paradigm shift in bank supervision and statistical data remittances
The Austrian model is a data-input approach: each regulated entity prepares its data in a standard format, in a series of basic ‘datacubes’ prescribed by the national central bank, the Oesterreichische Nationalbank (OeNB), and defined by business type, such as mortgages or business loans.
The granular data stored in the cubes is the data the supervisory authority prescribes as input, and it is kept separate from the commercially sensitive operations of the bank.
Direct interrogation of the basic cubes by the supervisor would raise concerns over confidentiality and data protection, as well as over the closeness of the supervisor to the day-to-day operations of banks and other regulated entities.
The Austrian reporting company (AuRep) is a buffer between the supervisor and the banks. The basic cubes are uploaded to AuRep and transformed into a series of smart cubes formatted to the OeNB’s remittance requirements.
These smart cubes form a single interface with the supervisor. Because the data AuRep receives is in a standard format, a change in required data needs only a single coordinated update across all members.
Ad hoc data requests no longer require the completion of multiple templates: the required data can be gathered from the uploaded basic datacubes and transformed into a smart cube by AuRep, which forms the supervisor’s dataset.
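The basic-cube-to-smart-cube flow described above can be pictured as a simple data transformation. The following is a minimal sketch in Python; the cube layout, field names and aggregation rule are hypothetical illustrations, not the OeNB’s actual data model, which is far richer:

```python
from collections import defaultdict

# Hypothetical granular loan records in a bank's "basic cube".
# Real basic cubes carry many more attributes; these fields are illustrative.
basic_cube = [
    {"loan_id": "L1", "type": "mortgage", "country": "AT", "exposure": 250_000},
    {"loan_id": "L2", "type": "mortgage", "country": "AT", "exposure": 180_000},
    {"loan_id": "L3", "type": "business", "country": "DE", "exposure": 500_000},
    {"loan_id": "L4", "type": "business", "country": "AT", "exposure": 320_000},
]

def to_smart_cube(records, dimensions=("type", "country"), measure="exposure"):
    """Aggregate granular basic-cube records into a 'smart cube':
    one cell per combination of reporting dimensions, summing the measure.
    This stands in for the kind of transformation AuRep applies before
    remitting data to the supervisor."""
    cube = defaultdict(int)
    for rec in records:
        key = tuple(rec[d] for d in dimensions)
        cube[key] += rec[measure]
    return dict(cube)

# Standard remittance: exposures by business type and country.
smart_cube = to_smart_cube(basic_cube)
# smart_cube[("mortgage", "AT")] -> 430_000

# An ad hoc supervisory request needs no new template: the same basic
# cube is simply re-aggregated along a different dimension.
by_country = to_smart_cube(basic_cube, dimensions=("country",))
```

The point of the sketch is the second call: because the granular data already sits in a standard format, answering a new question is a re-aggregation, not a fresh round of form-filling.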
Different departments within the bank will be required to provide the same data to the regulator at different times.
Dr Johannes Turner, director, directorate general of statistics, OeNB
Market players do not like regulators telling them what IT solutions to use.
Adam Farkas, executive director, European Banking Authority
Mario Draghi, ECB, said in a speech in 2014: ‘It is one thing to have information, which, like blood, flows through the veins of the system. It is another to ensure that everything beats at the same rhythm and all organs in the body get all they need from the same single flow’ (note 14).
Italy has been a poster child for this ‘input approach’ for 30 years, and now Austria’s innovative solution is getting supervisory authority officials enthused. With the ECB tasked with handling both the statistical and the supervisory data of the eurozone banking sector, there is a strong need for consistency, innovation and a smarter way of approaching regulatory reporting.
In an exclusive interview with the BearingPoint Institute (note 16), Dr Johannes Turner said that the Austrian model ‘ensures consistent and highly qualitative data’, whilst ‘reducing the amount of checking we have to do … The big win for the banks is that they are not burdened with the problem of completing templates on many different topics’.
In a sign that Mario Draghi is aware of the limitations of the template approach, he introduced the seventh ECB Statistics conference in 2014 saying: ‘Data integration on the side of the ECB and the other authorities only comes at the end of a data-production process, the first input of which is in the internal systems of the bank’ (note 17).
Amongst regulators there seems to be a general acknowledgement that better insight is needed, although there is not yet universal agreement about how this information should be gathered. Also speaking exclusively with the BearingPoint Institute (note 18), Patrick Hoedjes, Head of Oversight and Operations at EIOPA, agreed that transparency was ‘far from where it should be’.
He said: ‘We can see from the financial crisis how much impact the financial services sector has if it doesn’t perform well. It cuts deep into society, so we need to raise our game. We still don’t know where we would be if another Lehman Brothers happened. That has to be a key objective for 2020, and better data will help towards that’ (note 21).
For many regulators, the data input approach offers a way to increase consistency and quality of data as well as transparency, which is very much on their post-crisis agenda. Some regulators go even further.
The SSM is the first – and most important – of the two ‘pillars’ of reform aimed at making huge government bailouts a thing of the past, at least among banks in the eurozone. Its main aims (note 19) are to ensure the safety and soundness of the European banking system, to increase financial integration and stability, and to ensure consistent supervision.
It has been described as a Herculean task, requiring about 1,000 new staff at the ECB (note 20). The SSM, along with the Single Resolution Mechanism (SRM), will manage the processes around rescuing a troubled bank, or around bank failure, should the need arise. The SRM is expected to be introduced in 2016.
With the arrival of the SSM, the ECB combines supervisory and statistical data collection under one roof. This in turn could herald the introduction of a European input approach to regulatory reporting based on the prominent examples operated in Austria and Italy.
The ECB has already kickstarted the discussion under the so-called European Reporting Framework (ERF). The ERF consists of a Banking Data Dictionary (BDD), similar to the Austrian basic cube, and a Statistical Data Dictionary (SDD), harmonising reporting requirements from various domains on the output side.
130 of the eurozone’s largest financial institutions are regulated under the Single Supervisory Mechanism
In her concluding speech at the ECB Conference on Statistics in 2014, Danièle Nouy, Chair of the Supervisory Board of the SSM, said: ‘Integration, harmonisation and standardisation are necessary conditions, although not sufficient for achieving a fully satisfactory degree of transparency for the banking system. We also need to properly disseminate and communicate the data. In that sense, creating a common repository (“European Hub”) for publicly available data could be a relatively simple task with a very important and positive impact’ (note 22).
Ms Nouy also addressed the central preoccupation of regulators, policymakers and society: to help prevent future financial crises – or at least make them less likely. She highlighted the benefits that data input could bring: ‘I cannot promise that the ECB can once and for all eliminate the risk of another financial crisis. But the ECB is equipped to minimise this risk, and statistics play a crucial role here. Remember that the inability to correctly measure and analyse the risks associated [with] banking activity was one of the reasons [for] the current financial crisis. Developing and communicating accurate and timely statistics is essential for avoiding the repetition of this failure in the future’ (note 23).
The world’s financial system is not prepared for real-time regulatory reporting, and it is questionable what value the real-time tracking of all financial transactions would add
However, for this model to work, buy-in must go beyond the central bankers. Wide cooperation would be needed from the market.
Incentives, including liberation from a labour- and time-intensive process of repeated reformatting of data points seem clear. However, discussions with industry bodies in the banking and insurance sector and their comments at the Euro Finance Week conference in 2014 suggest that, whilst momentum for change is gathering, the mood is still cautious.
Speaking at the conference, Adam Farkas, Executive Director, European Banking Authority (EBA), said the Austrian model was producing ‘nice’ data, but cautioned that there was still more work to be done before regulators embraced this approach with confidence. ‘The compromise is there and the incentive is there but there is no detailed, instructive prescription to an individual bank as to how it should report.’
He added that the large-scale move to digital to produce granular data had to be driven by the banks. ‘Market players do not like regulators telling them what IT solutions to use.’
Also at the Euro Finance Week conference, Robert Priester, Deputy Chief Executive of the EBF, said that European banks are very interested in tackling the problems of an out-of-date and cumbersome reporting methodology that ‘is not working in the current state of IT systems’ (note 24).
He suggested that this made the Austrian model worth exploring: ‘Within the EBF it has produced a very prominent echo,’ he said, but remained vague about his support. ‘We all agree on data integration,’ he added. ‘The question is how to do that’.
Relevant supervisory reporting … has to come up with a means to stagger deadlines.
Andreas Ittner, vice governor, OeNB
European banks and insurers can be proactive in initiating change.
I believe reporting requirements will be like a datacube, though how that works will remain to be seen.
Patrick Hoedjes, head of oversight and operations, European Insurance and Occupational Pensions Authority
It is clear the tectonic plates that have been shifting under the regulatory reporting landscape in Europe have not yet settled.
Only a few years ago, banks and insurance companies were obliged to report once a year, using paper forms with a six-month remittance period. In just a short time, the changes have been enormous. As demands for disaggregated and more complex data have risen, templates have increased from a handful to hundreds at a time, and fields for entry have grown, too.
As policymakers and regulators seek a more timely systemic risk assessment through reporting harmonisation and standardisation, they turn up the dial on reporting frequency. Time-to-report has been shortened from months to weeks and data requirements are increasingly granular and comparable.
Does this mean we are facing the prospect of real-time reporting? We don’t think so – at least in the medium term. Against the imperative to build an up-to-date and accurate regulatory picture to assess systemic risks, a move closer to real-time reporting would require a complete overhaul of the regulatory framework. Finance and risk departments predominantly work on quarterly or month-end computations, and a move to real-time reporting would mean moving them to weekly or day-end activities. Such an approach would not be economically viable, but it is also questionable whether regulators are equipped to deal with the consequences, which would see them forced to deal with mountains of raw data, rather than the qualitative information banks and insurance companies provide today. The question remains as to the added value such an approach would bring.
Andreas Ittner, Vice Governor at the OeNB, told the audience at the ECB Statistics Conference in 2014 that the big challenge for supervisors with regard to collecting supervisory statistics is how it fits together with real-time market data and early quarterly reporting by banking groups. He said: ‘Relevant supervisory reporting … has to come up with a means to stagger deadlines in order not to become irrelevant at the shorter end, whilst allowing for completeness and consistency at the longer’ (note 25).
We all agree on data integration. The question is how to do that.
Robert Priester, deputy chief executive, European Banking Federation
To borrow an expression from Patrick Hoedjes (note 26), real-time reporting would not cater for completeness at the longer end because it would be the equivalent of a SWAT team raiding a dark room without knowing what they are looking for. Without predefining the substance and the goals of the data for which they are searching, the exercise loses value and the risk is simply that the burden slides from industry to the regulator, which doesn’t ultimately solve the problem.
On the other end of the spectrum, the document-orientated approach does not satisfy the requirement for relevance at the shorter end, and will hamper the drive for more up-to-date regulatory feeds. Old habits die hard; considerable investments have been poured into the current model over the past few years. But as with running a second-hand car, there comes a point when maintenance costs overtake the residual value, and the first serious fault can be the signal to buy a brand-new vehicle.
A move closer to harmonisation of the data input to regulatory reporting would be beneficial for both financial entities and financial supervisors
The current regulatory reporting approach is making it harder to respond effectively with the tight data quality and frequency required to meet the goal of more stringent supervision: to prevent another global financial crisis. Going forward, regulators and industry must agree on a sensible ‘demarcation line’ in the supervisory and statistical data exchange, to reduce the reporting burden for industry whilst improving the transparency of the data in question.
Italy has been doing this successfully for decades under the radar, and now Austria is following suit, although it is early days. With the ECB now looking into the data input approach to manage the mammoth task of supervising the eurozone’s most important banks, it could be that the regulatory value chain in all member countries is ready to explore new and easier terrain.
Can you tell us about the journey to transform bank reporting in Austria and does it work?
It started about two and a half years ago. The discussions about the reporting company have been much more intensive than about the data model. The data model was a relatively quick discussion, as it was clear the banks wanted it.
It’s a one-stop shop for us as a central bank, with responsibility for the report remaining with the bank itself.
What has been the biggest win from making the change?
It ensures more consistent, higher-quality data. This reduces the amount of checking we have to do and means we can rely on the quality of the data. It also means we can compile and review more data, and that will be a cost for us, so it is a zero-balance exercise. The big win for the banks is that they are not burdened with the problem of completing templates on many different topics, many different times. They also save costs by sharing and centralising the process.
Were there concerns about the sharing of competitive intelligence among banks?
In the reporting company there are clear borderlines between the data from one bank to another bank. There is no possibility that one bank can look at the data of another. Maybe five or ten years ago banks had concerns about the amount of data they shared with the authorities, but that’s not really an issue anymore.
What are the biggest challenges in operating this new model?
First of all, there are technical challenges in creating the multidimensional datacubes. Specific IT is needed to handle it, but that isn’t the biggest problem. The much bigger problem is finding out the right data for our users. The cubes are not always filled. You have to know exactly what the definitions are. The statistics department has to prepare the data for customers in a way they can work with it. With forms, we didn’t have this.
What concerns did you have with the Austrian approach?
You can do all kinds of nice things with a general database, but any kind of leak of the system can severely hurt the business of an individual bank. So they are hesitant to share information that could lead to finding out competitive information like risk appetite for loans. The governance of data is another concern. What are supervisors doing with this information? It’s like a SWAT team going to a dark building; you need to have a map, to know what you are going to encounter and which rooms you want to search. A supervisor needs to know what risk they want to investigate.
How does the consumer benefit from this transparency?
The level of transparency in the financial industry is far from where it should be. I think you can compare it with other businesses that are just as important to people’s lives. You can see from the crisis the impact it has when it fails. It cuts deep into society, and so we need to raise our game here. One of the blocks is that some players have an interest in keeping the market very opaque. I think the industry needs to understand that this is not in its long-term interest, because eventually it will lose all its clients if customers start to feel cheated.
If you don’t go down the data input approach, what do you do?
I think there is a middle way. The Austrian model goes further because it is integrated into a single institution. Industry is always free to take such an initiative ... but you risk losing some of the governance ... Eventually, I believe reporting requirements will be like a datacube, though how that works will remain to be seen. The challenge for the supervisors will be to find the best way to analyse increasing levels of data. They use various techniques in accounting to throw up anomalies. We haven’t begun to investigate the possibilities.
And real-time regulatory reporting?
I wouldn’t say real-time supervisory reporting is necessary. A company should be responsible for running itself. We don’t need to be stepping into the CEO’s chair. That goes too far.
1,000 new staff will be required at the European Central Bank to put the Single Supervisory Mechanism into place