Introduction.
Market reference data provides the descriptive and economic information needed to complete and settle financial transactions in securities. It is shared across business areas and stakeholders, and provides the context for specific transactions and for client or in-house commercial activities. At its most basic level, the reference data for a simple trade, such as the sale of a stock, involves referencing standard identifiers (e.g. the ISIN) for the underlying security, the buyer, the broker-dealer(s) or execution venue, and the price. At its most complex, reference data can cover all relevant details of the most intricate transactions and products, including numerous terms and conditions, different entities, contingencies and dependencies.
This data is ubiquitous and deceptively complex: a product may look simple, yet describing all of its economics and terms can involve thousands of data groups and attributes per product.
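To make this concrete, the sketch below models the reference data behind a simple equity trade. It is a minimal illustration under assumed field names, not a standard message format; the LEI values are placeholders, while the ISIN (Apple Inc.) and the MIC ("XNAS", Nasdaq) are real identifiers.

```python
from dataclasses import dataclass
from decimal import Decimal

# Minimal sketch of the reference data behind a simple equity trade.
# Field names are illustrative assumptions, not a standard message format.

@dataclass(frozen=True)
class EquityTrade:
    isin: str        # ISO 6166 identifier of the underlying security
    buyer_lei: str   # Legal Entity Identifier of the buying party
    broker_lei: str  # Legal Entity Identifier of the broker-dealer
    venue_mic: str   # ISO 10383 Market Identifier Code of the execution venue
    quantity: int
    price: Decimal   # Decimal avoids binary floating-point rounding of prices
    currency: str    # ISO 4217 code, e.g. "USD"

trade = EquityTrade(
    isin="US0378331005",               # Apple Inc.
    buyer_lei="PLACEHOLDER_LEI_00001",   # placeholder, not a real LEI
    broker_lei="PLACEHOLDER_LEI_00002",  # placeholder, not a real LEI
    venue_mic="XNAS",                    # Nasdaq
    quantity=100,
    price=Decimal("189.25"),
    currency="USD",
)
```

Even this toy example relies on four identifier standards, which is why consistent reference data matters from the very first field.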
Over the past few years, the financial services industry and regulatory agencies have pursued a policy of standardising the reference data that define and describe trade transactions. However, standardisation has not been straightforward for institutions, for a number of reasons:
- Differences in semantics for common terminology, expressed through different identifiers and taxonomies for, e.g., instrument types, markets, ESG criteria and industry sectors (see the mapping sketch after this list).
- The large number of data elements that make up a transaction.
- The high rate of change in markets and their products.
- Variation in data types, e.g. static, dynamic and bounded.
- Different data and data-quality needs across use cases in the trade lifecycle, from research, market making, execution, post-trade reporting, settlement and asset services through to finance and risk.
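The semantic differences in the first bullet can be illustrated with a small mapping exercise: two vendors describe the same instruments with different codes, and a shared internal taxonomy reconciles them. The vendor codes and internal category names below are hypothetical.

```python
# Hypothetical vendor taxonomies for the same instrument universe.
# Vendor codes and internal category names are illustrative assumptions.
VENDOR_A_TO_INTERNAL = {
    "EQS": "equity.common_share",
    "PRF": "equity.preferred_share",
    "DBT": "debt.bond",
}
VENDOR_B_TO_INTERNAL = {
    "Common Stock": "equity.common_share",
    "Preference Share": "equity.preferred_share",
    "Corporate Bond": "debt.bond",
}

def normalise(vendor: str, code: str) -> str:
    """Map a vendor-specific instrument type onto the internal taxonomy."""
    mapping = {"A": VENDOR_A_TO_INTERNAL, "B": VENDOR_B_TO_INTERNAL}[vendor]
    return mapping[code]

# Two different vendor codes resolve to the same internal category.
assert normalise("A", "EQS") == normalise("B", "Common Stock")
```

Maintaining such mappings is itself a data management task, which is one reason the industry keeps pushing for shared identifiers and taxonomies.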
The Current State and Challenges.
Financial institutions are grappling with significant challenges in reference data management due to their convoluted, siloed approach based on product lines, customer segments, and geographies, often compounded by past mergers and acquisitions. This fragmentation has resulted in a costly and opaque "spaghetti" architecture characterised by multiple data sources, legacy systems, undocumented connectors, and redundant infrastructure that inflates maintenance costs and creates unpredictable change cycles.
This situation prevents effective standardisation: consumers create local copies of fundamental data elements, and 20-50% of database tables contain reference data that drifts out of sync across applications. This proliferation of customised copies forces stakeholders, particularly in risk and finance departments, to navigate disparate local datasets when aggregating information, leading to data quality issues, reporting errors, and significant time spent on remediation rather than value-adding activities.
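One way to surface this drift is to reconcile each local copy against the golden source, key by key. The sketch below assumes a hypothetical record shape keyed by ISIN; in practice the comparison would run over database tables rather than in-memory dictionaries.

```python
# Minimal reconciliation sketch: compare a consumer's local copy of a
# reference table against the golden source, keyed by ISIN.
# Record shape and field names are illustrative assumptions.
golden = {
    "US0378331005": {"name": "APPLE INC", "sector": "Technology"},
    "DE0007164600": {"name": "SAP SE", "sector": "Technology"},
}
local_copy = {
    "US0378331005": {"name": "APPLE INC", "sector": "IT"},  # drifted value
    "DE0007164600": {"name": "SAP SE", "sector": "Technology"},
}

breaks = [
    (isin, field, golden[isin][field], record[field])
    for isin, record in local_copy.items()
    for field in record
    if isin in golden and record[field] != golden[isin][field]
]
for isin, field, want, got in breaks:
    print(f"{isin}: {field} golden={want!r} local={got!r}")
# -> US0378331005: sector golden='Technology' local='IT'
```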
Organisations also face several urgent operational issues:
- Lack of metering, measuring, and monitoring capabilities.
- High cost of change and lack of a centralised view.
- High level of manual data validation and verification (see the automation sketch after this list).
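Much of that manual validation is deterministic and can be automated. As one well-known example, the ISIN check digit (ISO 6166) is computed with a Luhn-style algorithm over the letter-expanded identifier, so malformed identifiers can be rejected on ingestion:

```python
def isin_check_digit(body: str) -> int:
    """Check digit for an 11-character ISIN body (Luhn over expanded digits)."""
    digits = "".join(str(int(ch, 36)) for ch in body)  # A=10 .. Z=35
    total = 0
    for pos, ch in enumerate(reversed(digits), start=1):
        d = int(ch)
        if pos % 2 == 1:        # double every other digit from the right
            d *= 2
            if d > 9:
                d -= 9          # equivalent to summing the product's digits
        total += d
    return (10 - total % 10) % 10

def is_valid_isin(isin: str) -> bool:
    return (
        len(isin) == 12
        and isin[:2].isalpha()          # two-letter country prefix
        and isin[:11].isalnum()
        and isin[11].isdigit()
        and isin_check_digit(isin[:11]) == int(isin[11])
    )

assert is_valid_isin("US0378331005")       # Apple Inc.
assert not is_valid_isin("US0378331004")   # corrupted check digit
```

Checks like this, run automatically at the point of capture, replace a whole class of manual verification work.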
A Market Data Transformation Has Begun.
An industry-wide exercise has begun to reassess current vendor market data offerings, and the changes vendors are introducing, with the aim of driving new business and technological value.
A shift of focus and alignment of governance is needed. Specifically:
- Data modelling should be treated as business-as-usual, enabling self-service for business users.
- Data and metadata should be managed cohesively (see the sketch after this list).
- Data management and analytics should be treated as coupled disciplines rather than separate functions.
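Managing data and metadata cohesively can be as simple as refusing to let them travel separately. The hypothetical envelope below attaches ownership, source and quality-status metadata to every reference data record, so governance information cannot drift away from the data it describes:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical envelope keeping governance metadata attached to the data
# it describes, rather than in a separate, drift-prone catalogue.
@dataclass
class GovernedRecord:
    payload: dict         # the reference data itself
    source: str           # upstream system or vendor feed
    owner: str            # accountable data owner
    quality_status: str   # e.g. "validated", "pending", "quarantined"
    as_of: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

record = GovernedRecord(
    payload={"isin": "US0378331005", "name": "APPLE INC"},
    source="vendor_feed_a",
    owner="reference-data-ops",
    quality_status="validated",
)
```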
There’s also a need for infrastructure improvements:
- Moving to the cloud to reduce costs and increase scalability.
- Adopting data-as-a-service models (sketched after this list).
- Implementing better data quality controls and governance.
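As a flavour of the data-as-a-service model, the sketch below exposes a golden-source lookup over HTTP using only the Python standard library. The endpoint path, record shape and in-memory store are illustrative assumptions; a production service would add authentication, caching and a governed backing store:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative in-memory golden source; a real service would sit on a
# governed reference data store.
SECURITIES = {
    "US0378331005": {"name": "APPLE INC", "type": "equity.common_share"},
}

class ReferenceDataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve GET /securities/<ISIN> as JSON.
        parts = self.path.strip("/").split("/")
        record = (
            SECURITIES.get(parts[1])
            if len(parts) == 2 and parts[0] == "securities"
            else None
        )
        body = json.dumps(record or {"error": "not found"}).encode()
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # e.g. curl http://127.0.0.1:8080/securities/US0378331005
    HTTPServer(("127.0.0.1", 8080), ReferenceDataHandler).serve_forever()
```

Serving reference data through one interface like this, instead of letting each consumer copy the tables, is what makes the earlier reconciliation problem shrink.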
Act Now.
Financial services firms are experiencing unprecedented margin pressure while constrained by legacy infrastructure built during early automation efforts. Their fragmented approach to data management results in high fixed costs and complex change cycles, leaving them ill-equipped to leverage new analytics technologies, adequately serve business users, or meet regulatory requirements.
To address these challenges, organisations should:
- Develop comprehensive market data transformation strategies focused on creating recurring value and fostering collaboration among vendors, IT, and business units.
- Conduct thorough market data and platform maturity assessments before undertaking large-scale transformation, in order to identify duplications and redundancies.
- Prioritise reusable assets that generate ongoing value while creating sustainable, cost-effective solutions.
This structured approach enables financial institutions to develop a strategic solution that addresses current pain points while positioning them for future innovation and competitive advantage.
March 1, 2024
