Using data integrity to tackle regulatory requirements

Amid increasingly complex regulatory reporting and compliance requirements and waning tolerance for poor data quality, how can financial institutions take control of their data to meet the great expectations of the modern regulator?

For financial institutions across the UK and Europe, ensuring the quality and accuracy of their data has never been more important to the reporting process – or less straightforward. With COVID-19 becoming the world's 'new normal', regulators have made it clear that the time for excuses is over: they will no longer tolerate compliance or reporting errors.

But data is often stored across multiple repositories and jurisdictions, hampered by manual processes and a lack of oversight. Complexity is rife, mistakes are commonplace and data confidence is low. Our recent whitepaper on data integrity revealed that regulators will not only refuse to accept reporting errors, but will also enforce regulations even more strictly. To avoid non-compliance, financial institutions must ensure true data integrity – and they can do so by following five key principles.

1. Data integrity must start at the source

Problems often start early on, in the data collection phase. Financial institutions struggle to deal with disparate data sources, feeds and formats, and find themselves fighting an uphill battle on data quality.

And banks recognise this as an issue themselves: “The issue is that we’re not collecting data properly at source,” the Director of Data Management Technology at a major French bank stated in our whitepaper. “The fact that data is such a mess coming in is what makes it difficult to use.”

Fortunately, early on is also when problems are easiest to tackle. By simplifying processes and using automation to eliminate manual errors, firms can make their lives a whole lot easier.

While the initial instinct might be to attempt to create a ‘golden source’ – a single view of the truth – “the effort required to create and manage it can reduce the flexibility and responsiveness of the organisation, stifling innovation”, according to the whitepaper.

It’s a delicate balancing act – one facilitated by data-agnostic solutions, which allow firms to take any data, from anywhere, and use it as needed, without additional transformation work.
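
To make that idea concrete, the sketch below shows one minimal way format-agnostic ingestion could look in Python. It is purely illustrative – not Gresham's implementation – and the canonical field names are hypothetical: feeds arriving as CSV or JSON are normalised into a single record shape, so downstream controls never need to care about the original format.

```python
import csv
import io
import json
from typing import Iterator

# Hypothetical canonical record shape used by downstream controls.
CANONICAL_FIELDS = ("trade_id", "counterparty", "notional", "currency")

def normalise(raw: bytes, fmt: str) -> Iterator[dict]:
    """Turn a CSV or JSON feed into canonical records, whatever the source format."""
    if fmt == "csv":
        rows = csv.DictReader(io.StringIO(raw.decode("utf-8")))
    elif fmt == "json":
        # Assumes the JSON feed is an array of objects.
        rows = json.loads(raw.decode("utf-8"))
    else:
        raise ValueError(f"unsupported feed format: {fmt}")
    for row in rows:
        record = {field: row.get(field) for field in CANONICAL_FIELDS}
        # Flag incomplete records at the source rather than letting gaps
        # propagate into downstream reporting.
        record["complete"] = all(record[f] is not None for f in CANONICAL_FIELDS)
        yield record
```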

2. Stay in step with regulators

Financial institutions must be able to monitor and keep up with new areas of scrutiny and importance for regulators. Increasingly, they face a “global landscape characterised by regulatory complexity and scrutiny.” Every aspect of their business – reporting, trading, operations – is being watched, as regulatory tolerance for errors declines. Those who have done their due diligence will have noticed that regulators are increasingly focused on data quality, with new initiatives and investigations dedicated to this space.

As a case in point, the European Securities and Markets Authority (ESMA) recently launched an investigation into data quality for both the European Market Infrastructure Regulation (EMIR) and the Securities Financing Transactions Regulation (SFTR). Similarly, the Bank of England (BoE) imposed a £44 million fine on Citigroup’s UK operations in 2019 for failing to provide accurate regulatory returns over a four-year period.

3. Higher expectations; greater granularity

Regulators are not only analysing the reports that firms submit. “Increasingly, they want to understand the data that underpins them – how it was sourced, validated and where values have come from,” our whitepaper revealed.

It is therefore not enough for a firm’s data simply to be right. Firms must be able to prove the source and validity of every piece of data given to the regulator – which means mapping data with greater granularity and maintaining more comprehensive lineage.
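
As a rough illustration of what machine-readable lineage might look like, the hedged Python sketch below attaches a history of sourcing and validation events to each reported value; the class and field names are invented for the example, not taken from any particular reporting standard.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageEvent:
    step: str       # e.g. "sourced", "validated", "transformed"
    detail: str     # human-readable description of what happened
    timestamp: str

@dataclass
class ReportedValue:
    name: str
    value: float
    source_system: str
    lineage: list[LineageEvent] = field(default_factory=list)

    def record(self, step: str, detail: str) -> None:
        """Append an auditable lineage event to this value."""
        self.lineage.append(
            LineageEvent(step, detail, datetime.now(timezone.utc).isoformat())
        )

# Every value submitted to the regulator carries its own history.
notional = ReportedValue("gross_notional", 12_500_000.0, source_system="trade_store_eu")
notional.record("sourced", "loaded from trade_store_eu end-of-day extract")
notional.record("validated", "passed completeness and tolerance checks")
```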

This can result in greater costs for those reliant on manual processes. According to the Head of Business Data at a Tier 1 European bank, “achieving greater granularity and lineage of data enables more use-cases for the data, but it also means greater cost as this is still a manual exercise.”

As such, the need for automation in data integrity is greater than ever. Automation doesn’t just make things cheaper – it yields better results. It removes the risk of human error inherent in manual work, which is why many firms are now looking to automate their data integrity and control processes.
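
A minimal sketch of what such automated controls could look like is shown below, assuming a simple rule-per-check model with hypothetical field names. Real data integrity platforms apply far richer rule sets, but the principle – machines catch the exceptions, humans only review the breaks – is the same.

```python
# A small library of automated data quality rules, run over every record
# before it reaches a regulatory report. Field names are hypothetical.
RULES = {
    "notional_present": lambda r: r.get("notional") is not None,
    "notional_positive": lambda r: (r.get("notional") or 0) > 0,
    "currency_is_iso_code": lambda r: isinstance(r.get("currency"), str) and len(r["currency"]) == 3,
}

def run_checks(records):
    """Return records that fail any rule, together with the rules they failed."""
    exceptions = []
    for record in records:
        failed = [name for name, rule in RULES.items() if not rule(record)]
        if failed:
            exceptions.append((record, failed))
    return exceptions

# Only the exceptions need human attention; everything else flows straight through.
sample = [
    {"notional": 1_000_000.0, "currency": "EUR"},
    {"notional": None, "currency": "E"},
]
for record, failed in run_checks(sample):
    print(record, "failed:", failed)
```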

4. An ever-changing landscape

Regulations are rarely, if ever, a one-and-done. Financial institutions face frequent compliance updates that add complexity, increase the number of data fields, and require more frequent reporting.

For example, the US Securities and Exchange Commission (SEC) recently introduced the Consolidated Audit Trail (CAT), bringing in a new requirement for financial institutions to track all US listed equity and options activity and to map their transaction data against counterparties’ transaction reports – a mammoth task considering the volumes involved.

Meeting these reporting obligations requires an extremely low error rate, as well as a robust mechanism for communicating with counterparties to resolve issues quickly. The sheer scale of CAT reporting is the ultimate motivation for firms to drive their error rates down as far as possible.
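
The sketch below is not the CAT specification itself, just a hedged illustration of the underlying mechanic: pairing a firm's transactions with counterparty reports on a shared identifier (the field names are invented) and measuring the resulting error rate.

```python
def match_reports(ours, theirs, key="transaction_id", tolerance=0.0):
    """Pair our records with counterparty reports by a shared key and list the breaks."""
    theirs_by_key = {t[key]: t for t in theirs}
    breaks = []
    for record in ours:
        other = theirs_by_key.get(record[key])
        if other is None:
            breaks.append((record[key], "missing at counterparty"))
        elif abs(record["quantity"] - other["quantity"]) > tolerance:
            breaks.append((record[key], "quantity mismatch"))
    # Share of our records that failed to match cleanly.
    error_rate = len(breaks) / max(len(ours), 1)
    return breaks, error_rate

ours = [
    {"transaction_id": "T1", "quantity": 100},
    {"transaction_id": "T2", "quantity": 250},
]
theirs = [{"transaction_id": "T1", "quantity": 100}]

breaks, rate = match_reports(ours, theirs)
print(f"{len(breaks)} break(s), error rate {rate:.0%}")  # 1 break(s), error rate 50%
```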

Similar changes are expected across Europe, too. Firms can expect to be compelled, more or less in perpetuity, to keep updating and adjusting their data collection, validation and reporting processes. When even seemingly minor adjustments can result in a great deal of work on the back end, solid data integrity is essential.

5. Power in numbers

Finding the right technology partner – one that can empower a firm to connect and control its data across the enterprise and the wider financial ecosystem, covering not only regulatory reporting but also reconciliation, post-trade automation, cash management and payments – can make all the difference when it comes to meeting these ever-increasing requirements.

Firms are saddled with the twin challenges of greater expectations and greater scrutiny, as rules become more complex and regulators less flexible. Yet they must persevere. The entire business needs to be aware not only of the risk of compliance failure or inaccuracies, but also of the cost burden that a lack of data integrity places on the organisation in the long run.

To learn more about how data integrity can support your regulatory compliance and reporting, download our whitepaper: Data integrity: Your key to confidence in a complex regulatory environment.