BCBS 239, the Basel Committee’s set of principles reinforcing the importance of data risk management, comes into force this January – and the early signs for increased harmonisation of banking oversight, and for initial market ‘compliance’, are not encouraging. Some banks appear to have interpreted the principles very comprehensively; others have not.
Launched in response to the financial crisis, during which banks and financial institutions across the world struggled to aggregate the information they needed quickly enough to take action on risk, BCBS 239 is certainly well-intentioned. But its authors perhaps underestimated the scale of the challenge – in terms of modernising systems, data and processes – that it posed to participants.
BCBS 239 compliance – two major barriers…
After it was announced in January 2013, it soon became clear that the regulation came with two key stumbling blocks.
First, the short timescale for compliance. For G-SIBs (Global Systemically Important Banks), the first institutions in scope, the Basel Committee stipulated compliance just three years after the principles were announced.
For the majority of G-SIBs – arguably the institutions with the largest, most complex data infrastructures – the task of reworking long-standing processes is colossal. As Neil Vernon, Gresham’s CTO, noted in his recent blog, some providers to the industry have suggested that building a single control, robust enough to meet the requirements mandated by BCBS 239, could take 64 days. Sixty-four days for a single control across a single process. Multiply that by the myriad systems, locations and formats these banks deal with; add the challenge of handling significant volumes of ‘Big Data’ (with so many options available, from Hadoop to MongoDB); factor in the need to agree which ‘version of the truth’ is used for reference data; and work out how to provide the analytics (through existing systems or new approaches), and the picture becomes muddier still.
The timescale issue is compounded by the second challenge of BCBS 239: a lack of clarity around the principles themselves. The regulation is a guideline only. There are no metrics provided, nor prescriptive rules. Instead, banks are told that reports should be ‘timely’ and ‘comprehensive’, and that reporting frequency ‘should reflect the needs of recipients’. But what works for one bank will be wholly inappropriate for another. And if the principles are open to interpretation, is there a risk that some banks will pay lip service only – and won’t realign their internal strategies to place risk at the very heart of their operations, as the regulations intend? No doubt the regulators’ proposed ‘audits’ in June will yield some riveting conclusions, and subsequent repercussions.
How can banks ensure data integrity?
The Basel Committee has thankfully acknowledged that the timeframe for BCBS 239 compliance is tight, and that adherence to all of the stringent criteria may be a step too far for some. For the G-SIBs likely to miss January’s deadline, the admission brings some welcome relief.
But evidence of action is required, so putting the issue on the back burner is most definitely not an option.
Of course, many of the foundations and principles of BCBS 239 assume some form of reconciliation, matching and control as part of processing. But there appears to be industry-wide acknowledgement that existing, traditional legacy reconciliation tools are not really fit for purpose when tackling a programme such as BCBS 239.
For the internal teams responsible for governance, risk and compliance, this is a huge – and potentially litigious – headache.
But help is available.
Flexible data integrity platforms, such as CTC from Gresham, are designed specifically to offer a control framework that reflects the pace and growth of modern banking. Able to handle multiple complex and wide data sets, CTC is built on big data and in-memory data grid technology, on which detailed reporting and analytics can be overlaid.
Specific new controls (often complementing the existing infrastructure) can be implemented in a few short weeks, making this a much faster, more cost-effective way to address BCBS 239 and other regulatory compliance initiatives.
Banks must be prepared to embrace change
Yet despite the availability of technologies like CTC, the only way regulation like BCBS 239 will have its intended impact is if banks embrace the shift to data integrity wholeheartedly.
With the short timescales of the new regulation, some banks have been forced to take a sticking-plaster approach to compliance. Whether they are prepared to follow the spirit of the principles longer term and place data risk at the very heart of their operations remains to be seen. But one thing is clear: regardless of the stern eye of the regulator, the potential cost of ignoring gaps in their data processes is far too great.
For more on the potential impact of BCBS 239, and the new technologies that can ensure data integrity without the hassle of wholesale infrastructure change, download the latest Gresham Guide: ‘Banking in the dark: data control frameworks for the new risk era’.