Data integrity is key to reducing error rates
As COVID-19 continues to disrupt global markets, banks, asset managers and capital markets firms alike are facing unprecedented levels of volatility.
And whilst enforced working from home has placed an emphasis on improving the connectivity and security of firms’ data, the trouble is this: before you can make any kind of business decision based on that data, you need to be sure it’s data you can trust.
Right now, due to the sheer surge in trading volumes, firms are operating at the very margins of how many transactions they can realistically execute and accurately process across the trade lifecycle. It’s a bit like the M25: on a busy day, you can just about cope. One major incident, and the whole system goes down.
Are firms just one car accident away from an explosion of data problems?
On an average day, an enterprise bank may incur 5,000 data errors that need to be dealt with. And with the right identification and remediation technology infrastructure in place, they can be.
But at this level of market volatility, firms can be facing up to 600,000 errors a day, each relating to a transaction that has happened in the real world and therefore needs to be reconciled efficiently.
The bottom line is that firms which don’t have the right processes and technology in place to identify, categorise and remediate this data automatically are putting themselves at significant financial and reputational risk in what is an already uncertain future for us all.
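To make the identify-and-categorise step concrete, the sketch below shows the simplest form of automated reconciliation: comparing an internal book of trades against an external (e.g. counterparty or exchange) record and tagging each break by type so it can be routed for remediation. This is a minimal illustration only; the field names and break categories are hypothetical and do not represent any particular vendor's product.

```python
# Minimal reconciliation sketch: match internal trade records against an
# external feed and categorise the breaks. Trade IDs and amounts are
# illustrative assumptions, not real data.

def reconcile(internal, external):
    """Compare two dicts of trade_id -> amount; return categorised breaks."""
    breaks = []
    for trade_id, amount in internal.items():
        if trade_id not in external:
            breaks.append((trade_id, "missing_externally"))
        elif external[trade_id] != amount:
            breaks.append((trade_id, "amount_mismatch"))
    for trade_id in external:
        if trade_id not in internal:
            breaks.append((trade_id, "missing_internally"))
    return breaks

internal = {"T1": 100.0, "T2": 250.0, "T3": 75.0}
external = {"T1": 100.0, "T2": 255.0, "T4": 40.0}
print(reconcile(internal, external))
# → [('T2', 'amount_mismatch'), ('T3', 'missing_externally'), ('T4', 'missing_internally')]
```

At 600,000 breaks a day, the value of a step like this is not the matching itself but the categorisation: only breaks tagged as genuine exceptions need a human, while the rest can flow into automated remediation queues.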
Gresham Technologies plc
E - firstname.lastname@example.org
T - 020 7242 8867