Data quality under the microscope: Meeting regulatory expectations

Data reconciliation and data quality are often seen as operations or back-office topics, but the implications stretch across your business, particularly as regulatory scrutiny increases. This topic was recently discussed at our data integrity webinar, and Gresham CTO Neil Vernon explores it further below.

Data opens up a world of exciting possibilities for financial institutions, but what about the more onerous, practical work that goes on ‘behind the scenes’ to make this data usable? Understanding the path of your data, and the contexts in which it is used, is key. Without this control, firms find themselves reporting and making decisions based on data they are not fully confident in: a recipe for disaster.

Obtaining this true understanding means knowing your data inside out. How does the way it was collected affect it? What data validation – if any – has taken place? Where have changes occurred, can you track them, and can you demonstrate this to auditors? It’s not enough simply to focus on the data requirements for getting a report submitted on time. Only by looking at the bigger picture and the processes behind the scenes can you identify data quality issues before they become problematic for your business.
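
As an illustration only, the sketch below shows the kind of validation and change-tracking these questions imply. The record layout, rules, and function names are assumptions made for this example, not a description of any particular system.

    from datetime import datetime, timezone

    # Hypothetical record layout and rules, for illustration only; real
    # validation would be driven by your own data dictionary and the
    # relevant regulatory schema.
    REQUIRED_FIELDS = {"trade_id", "notional", "currency", "trade_date"}

    def validate(record: dict) -> list[str]:
        """Return the data quality issues found in a single record."""
        issues = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
        if "notional" in record and record["notional"] <= 0:
            issues.append("notional must be positive")
        if "currency" in record and len(str(record["currency"])) != 3:
            issues.append("currency should be a 3-letter ISO 4217 code")
        return issues

    def record_change(audit_log: list, record_id: str, field_name: str, old, new, user: str) -> None:
        """Log every amendment with who, what and when, so changes can be
        tracked and demonstrated to auditors later."""
        audit_log.append({
            "record_id": record_id,
            "field": field_name,
            "old": old,
            "new": new,
            "user": user,
            "at": datetime.now(timezone.utc).isoformat(),
        })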

It isn’t only the desire to leverage the commercial power of data that makes this topic important. Regulators are scrutinising the data they receive from financial institutions more closely than ever before. Increasingly, regulators don’t just want to know what your numbers are; they want to know where those numbers have come from and how you know they are correct.

Conducting a detailed analysis of your data and its usage is only part of the story. This activity needs to be treated as a living process, not a one-off task: documentation must be kept up to date, changes managed, and data monitored in real time. Taking a holistic approach avoids the fragmentation that creates unnecessary complexity and leads to errors further down the line.

A perfect example is regulatory reporting. Handling everything from the ingestion and submission of data to the management of exceptions, optimisation, and reconciliation in one integrated system lets you see and verify what you are reporting all the way back to the source. That end-to-end traceability gives you the confidence in your data that regulators now expect, and have begun challenging firms on. With regulatory scrutiny only set to increase, a granular understanding of your data and controls is more important than ever.
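
For the reconciliation step specifically, a minimal sketch might look like the following. The feeds, keys, and tolerance are assumptions chosen for illustration: it matches two hypothetical sources on a shared key and flags any breaks as exceptions to investigate.

    # Minimal two-way reconciliation sketch. 'internal' and 'reported' are
    # hypothetical feeds keyed by trade_id; a production system would also
    # carry lineage metadata so every figure traces back to its source.
    TOLERANCE = 0.01  # assumed tolerance for notional differences

    def reconcile(internal: dict[str, float], reported: dict[str, float]) -> list[dict]:
        """Return the breaks between two feeds as exceptions to investigate."""
        exceptions = []
        for key in sorted(internal.keys() | reported.keys()):
            a, b = internal.get(key), reported.get(key)
            if a is None or b is None:
                exceptions.append({"trade_id": key, "issue": "present in only one feed"})
            elif abs(a - b) > TOLERANCE:
                exceptions.append({"trade_id": key, "issue": f"notional break: {a} vs {b}"})
        return exceptions

    breaks = reconcile(
        {"T1": 1_000_000.0, "T2": 250_000.0},
        {"T1": 1_000_000.0, "T2": 250_500.0, "T3": 75_000.0},
    )
    # breaks -> T2 (notional break) and T3 (present in only one feed)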

To hear Neil, along with experts from AIG and Scotiabank, discuss how financial institutions are tackling data integrity to satisfy regulatory expectations, view last month's data integrity webinar.
