IFRS 9 – time for data integrity to drive cultural change?

10 April 2017
Robyn da Silva

January 2018 looks set to be a watershed moment for financial firms, as two of the most critical financial regulations resulting from the global financial crisis finally go live.

Alongside MiFID II, with its onerous new reporting requirements, IFRS 9 looks set to vex banks and financial institutions (FIs) further, as it fundamentally changes credit risk modelling from an incurred-loss to a forward-looking, expected-loss approach.

Data integrity is a pre-requisite for both.

IFRS 9 – a recap

Under the standard, firms must now take a charge immediately on each new loan, based on its 12-month expected loss, and raise a further provision covering lifetime expected losses if there is a ‘significant deterioration’ in asset quality. These expected losses will need to be offset with capital provisions.
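To make the staging mechanics concrete, here is a deliberately simplified sketch in Python. It assumes the textbook decomposition ECL = PD × LGD × EAD and a single binary staging trigger; real IFRS 9 models use PD term structures, discounting and multiple macro-economic scenarios, and all figures and names below are illustrative only.

```python
# Simplified illustration of IFRS 9 staging (not a production model).
# ECL = PD x LGD x EAD; Stage 1 uses a 12-month PD, Stage 2 a lifetime PD.

def expected_credit_loss(pd_12m: float, pd_lifetime: float,
                         lgd: float, ead: float,
                         significant_deterioration: bool) -> float:
    """Return the loss allowance for a single loan.

    pd_12m / pd_lifetime : probability of default over 12 months / lifetime
    lgd                  : loss given default (fraction of exposure lost)
    ead                  : exposure at default (outstanding amount)
    """
    if significant_deterioration:            # Stage 2: lifetime ECL
        return pd_lifetime * lgd * ead
    return pd_12m * lgd * ead                # Stage 1: 12-month ECL


# Illustrative loan: 1m exposure, 2% 12-month PD, 8% lifetime PD, 40% LGD.
print(expected_credit_loss(0.02, 0.08, 0.40, 1_000_000, False))  # 8,000 at origination
print(expected_credit_loss(0.02, 0.08, 0.40, 1_000_000, True))   # 32,000 after deterioration
```

Even in this toy example, a move from Stage 1 to Stage 2 quadruples the provision – which is why the data used to judge ‘significant deterioration’ matters so much.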

Unlike many newer regulations, which focus on control, transparency and robust reporting, IFRS 9 concerns long-term financial security – and getting these forward-looking calculations right (or wrong) could mean the difference between institutional success and failure.

Most global banks expect IFRS 9 to increase their loan loss provisioning by around 50% across asset classes¹. With capital ratios already under strain from the Basel III capital framework, it’s a further dent to the purse that many would prefer to avoid – so getting the calculations right, and avoiding tying up more capital than is needed, is critical.

And the key? Effective data management.

What’s the data challenge with IFRS 9?

The leap in the volume of data that must be analysed to calculate a loan’s Expected Credit Loss (ECL) will be significant: historical customer data, together with macro-economic information (such as GDP, interest rates and house price indices), will need to be gathered, verified and analysed.
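As a rough illustration of that convergence, the sketch below (using pandas, with entirely hypothetical column names and values) joins a loan-level extract to an external macro-economic feed, so that each exposure carries the drivers an ECL model would consume.

```python
import pandas as pd

# Hypothetical internal loan book extract: one row per loan per reporting date.
loans = pd.DataFrame({
    "loan_id":        ["A1", "A2"],
    "reporting_date": pd.to_datetime(["2018-01-31", "2018-01-31"]),
    "balance":        [250_000, 400_000],
    "days_past_due":  [0, 45],
})

# Hypothetical external macro-economic feed for the same reporting dates.
macro = pd.DataFrame({
    "reporting_date":    pd.to_datetime(["2018-01-31"]),
    "gdp_growth":        [0.015],
    "base_rate":         [0.005],
    "house_price_index": [112.4],
})

# Converging internal and external data: every loan row is enriched with
# the macro drivers that the PD/LGD models will use.
ecl_inputs = loans.merge(macro, on="reporting_date", how="left", validate="m:1")
print(ecl_inputs)
```

The join itself is trivial; the hard part in practice is sourcing, verifying and reconciling both sides at scale and on a reporting deadline.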

This convergence of internal and external data will prove particularly difficult for organisations using data management tools with fixed data models. For many, an additional ETL layer will be required to manipulate the data, causing project delays and making it harder to drill down into, and understand, the data.

Add in the governance requirements around reporting, transparency and new controls, and institutions running legacy systems face a painful future.

Agility is the only way to handle the IFRS 9 challenge

Only a platform that can collect, aggregate and process the ‘big data’ required under IFRS 9 will be able to satisfy the regulation and ensure appropriate capital provision. 

Agility – the ability to accept multiple feeds, in multiple formats, and to verify and validate them in real time – is a fundamental requirement. Legacy systems without this functionality present a very real risk of over-calculating capital provisions and unnecessarily triggering a lifetime ECL calculation.

Data integrity as a cultural shift

If the data requirement alone isn’t enough to encourage banks and FIs to review their data integrity platforms, the ongoing climate of regulation should be. As with BCBS 239 before it (and no doubt, many more to come), IFRS 9 requires the convergence of risk and financial systems, and the introduction of wholesale changes to the way data is handled, presented, and controlled. It demands a coordinated approach between accounting, risk and IT – and a new culture of collaboration.

Banks can continue to patch up legacy data management systems to comply with each new round of industry standards, but when senior managers’ own financial and professional security is on the line, it’s an approach that won’t hold water for very much longer.

Indeed, January 2018 may enter the history books as the date data integrity shifts to the centre of banking strategy, and becomes the only effective way to ensure long-term stability.

¹ Fifth Global IFRS Banking Survey, Deloitte