How regulators are driving a data quality revolution | Gresham
In a global landscape characterised by regulatory reporting complexity and scrutiny, regulators are ‘cracking down’: tolerance for poor quality data and reporting errors is fast declining, and the time for turning a blind eye is over.
However, this increased regulatory focus has not quite delivered the reduction in reporting errors and poor data quality that regulators hoped for, raising the question: is fear enough to compel financial institutions to get their ducks in a row?
For firms globally, achieving high-quality, accurate data has never been more important to the reporting process – or less straightforward. Data is often stored across multiple repositories and jurisdictions, hampered by manual processes and a lack of oversight. Complexity is rife, mistakes commonplace and data confidence low, exacerbated by regulatory divergence across jurisdictions.
Methods for motivating firms to get regulatory compliance right can be broadly categorised into two camps: the carrot and the stick.
The stick often comes in the form of fines. According to ESMA's Sanctions report, fines imposed by National Competent Authorities (NCAs) under MiFID II more than quadrupled in value in 2020, reaching an aggregated €8.4 million (comprising 613 sanctions and measures), compared to just €1.8 million (371 sanctions and measures) the year prior.
Yet data integrity and reliability have not improved in tandem. ESMA's EMIR and SFTR 2020 data quality report, released in April 2021, highlighted this in detail for the first time since the European Market Infrastructure Regulation (EMIR) came into effect over seven years ago.
Under EMIR requirements, around 7% of daily submissions are currently being reported late by counterparties. Additionally, up to 11 million open derivatives did not receive daily valuation updates, and there were between 3.2 and 3.7 million open non-reported derivatives on any given reference date during 2020. Approximately 47% of open derivatives (totalling circa 20 million) remain unpaired.
Attempts to re-use existing legacy solutions already prone to data quality issues have further compounded the problem. This is particularly evident in firms' approach to SFTR, a regulation that many have viewed as a close enough cousin to EMIR to simply hit copy and paste.
These ongoing challenges demonstrate that, while the stick does have a role to play, it alone is not enough to address the issue of low data quality.
This is where the carrot comes in. Rather than penalising firms financially for poor data quality and reporting errors, helping them realise the potential that strong data integrity can bring – such as reduced costs, increased efficiencies, and a more competitive business offering – may prove more effective in encouraging the prioritisation of data integrity at the C-suite level.
That’s not to say the carrot and the stick must be mutually exclusive. Rather, to ensure financial institutions adhere to high data quality standards, regulators must leverage both in tandem.
Beyond the threat of fines, regulators can help financial institutions to get on the front foot by proving the benefits of high data quality. Above all, they must clearly convey that the businesses that will prove successful will not be those acting out of fear, but out of ambition.