In May-June 2018, the European Central Bank (ECB) and the Basel Committee on Banking Supervision (BCBS) published reports on the progress of the largest, internationally active banks towards compliance with the BCBS Principles for Effective Risk Data Aggregation and Reporting – known as BCBS 239.
Whilst the two reports approach the topic from different angles, the similarities in their findings are striking and paint a picture in which, two years after the original compliance deadline, gaps remain significant and widespread.
This blog discusses those findings, examines what we can learn from them and considers what may be next as banks continue to confront the challenges posed by BCBS 239.
Approach to the assessment exercise
The BCBS and the ECB adopted different approaches to assessing the “state of the nation” in relation to BCBS 239 compliance:
- The BCBS based its assessment[1] on the approach followed in previous years[2]: the relevant home supervisors provided their assessments of Global Systemically Important Banks (G-SIBs) using benchmark questions marked against a four-point scale. The results were then normalised and collated centrally by the BCBS.
- The ECB document[3], on the other hand, covers both G-SIBs and other banks directly supervised by the ECB. It is based on a thematic review run in 2017 and includes a “fire drill” exercise in which banks were asked to provide detailed information about risk reporting governance, processes and methodology for two risk indicators covering credit and liquidity risk[4]. The exercise was run under the direction of the Single Supervisory Mechanism (SSM[5]). Although this exercise covers a smaller geographical area, it was more intrusive and, thanks to its standardised nature, its results offer better comparability.
Despite the different methodologies applied, the results described in both documents touch on common themes.
The observations made by both the ECB and the BCBS focus mainly on two areas.
What do the results tell us?
Both the ECB and the BCBS acknowledge that full implementation of the BCBS 239 principles will not be achieved until at least the end of 2019 and that some of the most significant programmes are likely to extend until 2021.
However, the fact that supervisors continue to focus on the progress banks are making tells us that BCBS 239 remains a priority for them. Indeed, this ongoing focus on data as the foundation underpinning demonstrably good processes and correct, traceable results is reinforced by the fact that data considerations sit at the heart of other significant regulatory initiatives: Basel III elements such as the P&L attribution test in the Fundamental Review of the Trading Book, and the attention the Federal Reserve has devoted to data quality in CCAR, both place data consistency and lineage at their core.
The industry should therefore expect no reduction in supervisory focus on data until banks achieve acceptable quality standards. The finalisation of Basel III, and changes across Europe around the definition of default and the further strengthening of models, highlight the importance of effective data management. In addition, with the advent of smaller, more nimble competitors, including fintechs, old and cumbersome systems put traditional banks at risk of losing a younger, more agile customer base to those rivals. Banks will need to move quickly to retain their advantage in a competitive market.
Against this backdrop, it is also clear that across the industry, existing participants continue to struggle to achieve the traction on their data agenda that both they and the supervisors want, due to a variety of factors:
- Banks’ regulatory agendas are still overloaded – Banks continue to face a multitude of regulations and requests by supervisors to fix specific pressing problems or face sanction. Given this, although data as a broad, enterprise-wide topic is a key enabler of many regulatory requirements, it may not be the most immediate concern in its own right. Whilst in some jurisdictions, such as the UK, specific conduct rules around senior management accountability have helped reinforce the importance of long-term data remediation, at a global level the uncertainty around the consequences of non-compliance with broader data requirements makes it harder to place data at the top of the priority list.
- Addressing BCBS 239 requires “deep surgery” – Prior to the global financial crisis, the drive to expand business coverage and product capability, rather than to build streamlined and efficient data flows, left many banks with patchy data architectures and a reliance on increasingly complex end-user computing tools. Disentangling this situation requires many banks to dig deep into systems and processes – often necessitating fundamental “re-writes” that come with a price tag hard to accommodate at a time of increasing margin pressure.
- Data is everyone’s problem, but no one’s problem - Data spans all aspects of the Front Office, Risk, Finance and Operations within a bank and is generated, aggregated and consumed at all levels. Banks have struggled to determine who owns what data and how to fix deep-rooted data flow problems. This results in governance and oversight gaps and a lack of drive to take on the challenges posed by data flows at a fundamental level – instead, banks skirt the issues, trying to enhance existing controls and management mechanisms in lieu of addressing the base data challenge.
- Data is an asset but is not always correctly valued – Whilst transforming the data has a cost, correct data is a valuable asset. Not only does it drive better decision making, it also increases reporting quality and reduces regulatory, reputational and operational risk. However, given that data issues are often felt downstream of those originating the data, banks struggle to value data on an end-to-end basis and instead focus on cost and quality issues within current organisational silos. This can, again, lead to a reluctance to fix data in a definitive way.
All these issues have left banks in a quandary and, faced with it, they have on occasion presented optimistic views[6] of their actual level of compliance to their supervisors, giving rise to unrealistic expectations. These in turn are increasing the pressure to deliver on those banks which continue to have deep underlying data issues.
What comes next?
Whilst the data challenge is undoubtedly a significant one, the current supervisory focus and the latest edicts themselves point to the fact that over time banks will be expected to comply. This will not be easy - there is no silver bullet offering an immediate resolution to the co-mingled issues of ownership, governance, conflicting priorities, pre-existing process and infrastructure challenges and mounting cost pressures.
Instead, banks wishing to retain their existing competitive advantage will have no choice but to learn to value their data and to place it at the heart of their next waves of organisation, process and technology developments whilst avoiding implementing changes which exacerbate rather than solve the core issues. Fixing data as a core part of “what we do” rather than as a bolt-on to the core business will be key.
Banks will adopt a range of strategies to achieve this, from large-scale centralised data programmes to more federated programmes united by common principles, governance and purpose. How well these approaches succeed will depend on how well they suit a given organisation and how firmly they are committed to and executed. The banks that succeed will be those that find the best match between their needs, their proposed solution, and their readiness and ability to execute.
Analysing industry examples, the banks that have had greater success in addressing their data challenges are those best able to frame their data efforts within a clear ownership structure, often placing responsibility for driving data improvements at least in part on those who originate and maintain the data in question. Similarly, tying specific inbound data quality issues to specific cost and downstream productivity issues, and understanding what business opportunities better data may bring, enables successful banks to link data programme deliveries to “monetised” data benefits and thereby to treat improved data as a valuable asset. This ability to monetise the benefits of improved data can provide a competitive advantage for front office teams seeking to increase revenue, as well as allowing support functions to work efficiently and better manage financial resources such as RWA.
Alongside this, banks are increasingly looking to emergent technologies to help them meet their data challenges – intelligent solutions to fix, manage, store, aggregate and distribute data can offer some respite from wholesale infrastructure re-writes by providing simpler, technology-based paths through those challenges.
In the meantime, Deloitte remains committed to helping our clients however we can as they seek to tailor their own approach to their own situation and their short, medium and long term challenges.
[1] BCBS, Progress in adopting the Principles for effective risk data aggregation and risk reporting, June 2018
[2] BCBS, Progress in adopting the Principles for effective risk data aggregation and risk reporting, January 2015, p. 3, para. 1.4
[3] ECB, Report on the Thematic Review on effective risk data aggregation and risk reporting
[4] The risk measures requested were the FINREP-based non-performing loans granted to SMEs and the COREP-based retail deposit outflows within the liquidity coverage ratio.
[5] The SSM is the supervisory arm of the ECB which, since 2014, has provided a common supervisory approach across the most significant banks in the Eurozone.
[6] See BCBS, Progress in adopting the Principles for effective risk data aggregation and risk reporting, December 2015, p. 4, para. 1