Data remains a key asset for any enterprise, even more so for banks and other financial institutions subject to the multiple competing pressures of regulation, budgetary constraints, and the proliferation of data sources both within and outside the firm.
Customers expect more real-time and on-demand modes of service. Regulators expect provably accurate, granular detail in ever-shorter timeframes. The need to evidence control and accuracy has never been greater.
Data is central to these concerns – how it is governed, where it originates, who is accountable for its quality, and how that quality is measured, proven and published. Escalating data volumes present both risk and opportunity: the risk that legacy systems and infrastructures cannot cope, and the opportunity to unlock insights of genuine value to the business.
In financial services, regulation cuts to the heart of the problem. BCBS 239 demands accuracy, integrity, completeness and timeliness in risk reporting; MiFID II and the Consolidated Audit Trail (SEC Rule 613) expand the scope of trade and transaction reporting across asset classes. The diversity of data attributes, and the interpretations placed on them, require a strict understanding of where key data originates, how it is used, and whether it is accurate against external references, provably consistent, and up-to-date.
Clareti Data Accelerator (CDA) launches Gresham’s award-winning Clareti platform firmly into the world of Big Data, combining the established data quality and integrity engine behind CTC with our industry-strength messaging, standards and rule-based data integration. Unique ‘fast data’ techniques allow high-speed ingestion and rule execution over any data at massive scale, simultaneously exposing a fully granular query interface aimed squarely at the ‘Citizen Data Scientist’ – the business user who needs to surface business insights at speed without the friction and inertia of a large IT project.