A golden source of data - Has the industry changed its mind?

Written by Gresham | 25-May-2021 10:24:05

A golden source of data was once high on the priority list for many financial institutions - but have things changed? The topic was raised at our recent data integrity webinar, and Gresham CTO Neil Vernon explores it further below.

When discussing ‘data quality’ with any of the world’s financial institutions, you are quickly taken down the path of ‘golden source is the solution’. The idea is that firms should have one single source of truth they can turn to, supplying the enterprise with data it can rely on.

There is no doubt about the importance of data confidence to an organisation, but is a golden source truly a practical – or desirable – state? Once upon a time, the answer was yes, and financial institutions worked towards this utopia of data perfection. Now, however, we need to ask ourselves three questions: what does it cost, and not only in financial terms, to achieve a golden source? How realistic is the goal? And how helpful will it actually be to the organisation?

Enhancing reporting across multiple counterparties

Golden sources were originally focused on a financial institution’s internal data repositories. However, a firm’s ability to report accurately, on time, and in full is often bound up with that of its counterparties – as is the case under the recently introduced Consolidated Audit Trail (CAT) regulation.

This has led to questions over whether golden sources should be developed across the industry, which each firm could access as needed for regulatory reporting and other requirements. It is easy to see the appeal: one single source of truth, properly managed, would dramatically reduce error rates in transaction reporting and reduce or eliminate the time spent on counterparty communication. If you and your counterparty report from the same repository, how could you possibly report differently?

An innovation roadblock

However, creating such a golden source would drastically limit financial institutions in another current ‘hot topic’ area: innovation. A true golden source comes with strict usage and management requirements, which would impede the kind of ‘fail fast’ experimentation with processes and products that the industry has been at such pains to encourage.

Then there’s the unavoidable fact that everyone’s truth is different. As one of our webinar panellists pointed out, the questions you are using your data to answer determine the lens through which you should view it. Analysing data for product purposes, for example, requires an accurate view of the product hierarchy.

Practical realities: Accuracy holds the key

If creating a golden source, in the truest sense of the definition, is neither practical nor desirable, what should firms do instead?

As far as internal data goes, firms should certainly still strive for a go-to data source – but they should also recognise that a one-size-fits-all solution is not always possible. Instead, appropriate control processes should be applied to ensure that there is no slippage in data quality and that the organisation fully understands how its data is used. Managing data lineage through systems that give complete visibility of the data lifecycle makes the source of any data item easily traceable, deepening that understanding and ensuring that issues can be fixed quickly. Organisations should also educate data consumers so that, when using data, the right ‘lens’ for the situation is applied.
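To make the idea of lineage tracking concrete, here is one minimal sketch of a data value that carries its own lifecycle with it. This is purely illustrative, not a description of Gresham’s (or any vendor’s) implementation; the names LineageEvent, TrackedValue, record, and trace are invented for the example.

from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative sketch only: the structure and field names are assumptions,
# not any specific product's data model.

@dataclass
class LineageEvent:
    """One step in a data item's lifecycle: where it came from, what was done."""
    source: str       # upstream system or feed, e.g. a trade capture platform
    operation: str    # the transformation applied, e.g. currency normalisation
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class TrackedValue:
    """A data item that travels with its full lineage."""
    value: object
    lineage: list[LineageEvent] = field(default_factory=list)

    def record(self, source: str, operation: str) -> None:
        """Append a lifecycle event each time the value is touched."""
        self.lineage.append(LineageEvent(source, operation))

    def trace(self) -> str:
        """Render the lifecycle so the original source is always traceable."""
        return " -> ".join(f"{e.source}:{e.operation}" for e in self.lineage)

price = TrackedValue(101.25)
price.record("trade_capture", "ingest")
price.record("enrichment", "normalise_currency")
print(price.trace())  # trade_capture:ingest -> enrichment:normalise_currency

Because every transformation appends an event, any downstream consumer can see exactly how a value was produced – the kind of visibility that makes issues easy to locate and fix.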

These steps will also help to resolve many of the issues that banks experience when dealing with their counterparties, since each side will have improved the accuracy of its data. Resolving linkage issues with counterparties consumes valuable resources, particularly where escalation is required, but by giving themselves maximum visibility and control over their data, financial institutions can stop many of these issues before they start.

A clean and consistent source of data is undoubtedly an important part of a firm’s data integrity toolkit – but this simplification needs to be taken beyond the golden source and applied across the organisation, to the way that data is sourced, tracked, used, and delivered. By removing the complexity of legacy technology and the silos that leave data trapped in different systems, you gain the ability to achieve a holistic view of your data, providing the foundation for true data integrity.

To hear more from Neil, as well as data experts at AIG and Scotiabank, view our data integrity webinar on demand.