
FRTB and optimal market data management

 

FRTB: A New Era for Market Data Management in Banking

The implementation of the Fundamental Review of the Trading Book (FRTB) marks a turning point for banks. More than just a regulatory update, FRTB demands a complete rethink of how financial institutions manage, govern, and utilise market data. As capital implications grow more severe and the volume and complexity of required data increase, a robust data infrastructure has become essential—not optional.

This blog breaks down the key sections of the FRTB whitepaper by Gresham, connecting the dots between regulation, technology, and operational change.

 

Why FRTB Changes the Game

FRTB introduces stricter rules for how banks assess risk and manage capital. At its core, the regulation emphasises complete, consistent, and high-quality data—particularly over long historical periods. Any missing or unverifiable data can inflate capital charges, especially when risk factors fail modellability tests due to gaps in observed trades.

Banks face the challenge of integrating vast amounts of market data across disconnected systems and departments. Traditional infrastructure simply isn't equipped to handle the scale and audit demands of FRTB.

 

The Need for Scalable, Transparent Data Systems

Legacy systems, built for lower data volumes and simpler reporting cycles, are struggling to keep up. FRTB drives up data requirements through longer time series, granular tracking of risk parameters, and metadata like lineage and proxy documentation.

This shift pushes banks toward more advanced data management platforms capable of handling high-frequency, high-volume demands. The Basel Committee’s BCBS 239 principles and other regulatory frameworks emphasise transparency, consistency, and traceability across all data operations—exactly what FRTB now enforces.

 

Business User Enablement and Operational Agility

Beyond compliance, FRTB highlights a growing demand for faster, more intuitive access to trusted data. Risk and finance teams need “self-service” tools that allow them to explore risk factor histories, trace price lineage, or deploy new models without long lead times.

Modern data management systems should minimise unnecessary data transfers by bringing analytics closer to the source. Agile workflows allow users to adapt quickly, reducing reliance on IT or manual intervention, and speeding up time-to-insight across the business.

 

Three Big Shifts in Market Data Management

The paper outlines three core transformations:

  1. Risk Measurement with Expected Shortfall (ES): Unlike VaR, ES requires banks to maintain 10 years of historical data to assess tail events. This demands reliable storage, flexible sourcing, and the ability to identify the most stressful periods over a decade (a worked sketch follows this list).

  2. Real Prices and Modellability: FRTB classifies risk factors based on whether they can be modelled using real prices. Banks must collect and link these prices from internal trades, vendors, and data pools—any gaps can lead to higher capital charges due to Non-Modellable Risk Factors (NMRFs).

  3. Risk Factor Classification: New classification rules require banks to tag and group risk factors across liquidity horizons, risk classes, and sensitivities. This calls for extended metadata support and time series calculations, both of which demand powerful data modelling capabilities.
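
To make the first of these shifts concrete, here is a minimal sketch of computing a 97.5% Expected Shortfall from a daily P&L series and scanning roughly ten years of history for the most stressful 12-month window. This assumes pandas and NumPy; the function names, the 250-business-day window, and the synthetic data are illustrative assumptions rather than the whitepaper's prescribed method, and a production FRTB calculation would add liquidity-horizon scaling and the reduced risk factor set used for stressed-period calibration.

```python
# Illustrative only: a simplified Expected Shortfall (ES) calculation and a
# scan for the most stressful 12-month window in a long daily P&L history.
import numpy as np
import pandas as pd

def expected_shortfall(pnl: pd.Series, level: float = 0.975) -> float:
    """Average loss beyond the (1 - level) tail of the P&L distribution."""
    losses = -pnl.dropna()                      # convert P&L to losses
    var = np.quantile(losses, level)            # 97.5% Value-at-Risk
    tail = losses[losses >= var]                # losses at or beyond VaR
    return float(tail.mean())

def worst_stress_window(pnl: pd.Series, window: int = 250):
    """Scan the history for the ~12-month window with the highest ES."""
    worst_start, worst_es = None, -np.inf
    for start in range(0, len(pnl) - window + 1):
        chunk = pnl.iloc[start:start + window]
        es = expected_shortfall(chunk)
        if es > worst_es:
            worst_start, worst_es = chunk.index[0], es
    return worst_start, worst_es

# Example usage with synthetic data standing in for ~10 years of daily P&L.
dates = pd.bdate_range("2014-01-01", periods=2500)
pnl = pd.Series(np.random.default_rng(0).normal(0, 1e6, len(dates)), index=dates)
start, es = worst_stress_window(pnl)
print(f"Most stressful 12-month window starts {start.date()}, ES = {es:,.0f}")
```

Even in this toy form, the scan only makes sense if a complete, clean ten-year history exists for every risk factor, which is exactly the storage and sourcing demand described above.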

Getting Data Sourcing and Integration Right

FRTB’s ‘zero tolerance’ stance on poor data management means that banks must track everything: where data comes from, how it is transformed, and how proxies are applied. Dashboards and automated alerts help maintain data integrity and highlight issues before they hit reports.
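
As an illustration of the kind of automated check that can feed those alerts, the sketch below counts real price observations per risk factor over a trailing 12-month window and flags likely Non-Modellable Risk Factor candidates. The thresholds (at least 24 observations with no 90-day period containing fewer than four, or 100 observations in total) reflect one common reading of the Basel eligibility test; the column names and data layout are assumptions made purely for illustration.

```python
# Illustrative modellability screen: counts real price observations per risk
# factor over the last 12 months and flags likely NMRF candidates.
import pandas as pd

def modellability_flags(observations: pd.DataFrame, as_of: pd.Timestamp) -> pd.DataFrame:
    """observations: one row per real price observation,
    columns ['risk_factor', 'obs_date']."""
    window = observations[
        (observations["obs_date"] > as_of - pd.DateOffset(months=12))
        & (observations["obs_date"] <= as_of)
    ]
    # Candidate 90-day windows, slid one day at a time across the 12 months.
    starts = pd.date_range(as_of - pd.DateOffset(months=12),
                           as_of - pd.Timedelta(days=90), freq="D")
    results = []
    for rf, grp in window.groupby("risk_factor"):
        dates = grp["obs_date"]
        total = len(dates)
        min_in_90d = min(
            int(((dates >= d) & (dates < d + pd.Timedelta(days=90))).sum())
            for d in starts
        )
        passes = (total >= 24 and min_in_90d >= 4) or total >= 100
        results.append({"risk_factor": rf, "observations": total,
                        "min_in_any_90d": min_in_90d, "modellable": passes})
    return pd.DataFrame(results)

# Example: a swap rate observed weekly versus a thinly traded credit name.
obs = pd.DataFrame({
    "risk_factor": ["EUR_SWAP_10Y"] * 52 + ["XYZ_CORP_CDS"] * 5,
    "obs_date": list(pd.date_range("2024-06-03", periods=52, freq="7D"))
              + list(pd.date_range("2024-09-01", periods=5, freq="30D")),
})
print(modellability_flags(obs, pd.Timestamp("2025-05-31")))
```

A check like this is only as good as the observation data behind it, which is why linking internal trades, vendor feeds, and data pools into one auditable store matters so much.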

But clean sourcing is only half the story. The ability to standardise and distribute data across risk and valuation systems is just as critical. Effective data management helps reduce the burden on quant teams who otherwise spend too much time cleaning or formatting raw data.

 

Building the Right Infrastructure

Banks must shift away from patchwork solutions and toward fully integrated data environments. This includes:

  • A common data model for product terms, prices, and risk factors (a minimal sketch follows this list)

  • Workflow automation to validate, proxy, and trace data changes

  • Configurable dashboards to track volumes, exceptions, and proxy usage

  • Distribution tools to ensure clean, consistent data reaches consuming systems
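
As a rough idea of what the first bullet might look like in practice, the sketch below defines minimal record types for a price observation and a risk factor, carrying the classification metadata (risk class, liquidity horizon) and the lineage and proxy fields discussed earlier. The field names, enum values, and use of Python dataclasses are assumptions for illustration rather than a reference to any particular vendor schema.

```python
# A minimal, illustrative common data model: record types shared by sourcing,
# validation, and distribution. Field names and enum values are assumptions.
from __future__ import annotations
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class RiskClass(Enum):
    GIRR = "general interest rate risk"
    CSR = "credit spread risk"
    EQUITY = "equity risk"
    COMMODITY = "commodity risk"
    FX = "foreign exchange risk"

@dataclass
class PriceObservation:
    risk_factor_id: str
    obs_date: date
    value: float
    source: str                      # e.g. internal trade, vendor feed, data pool
    is_real_price: bool              # counts toward the modellability test
    lineage: list[str] = field(default_factory=list)  # transformation history

@dataclass
class RiskFactor:
    risk_factor_id: str
    risk_class: RiskClass
    liquidity_horizon_days: int      # FRTB horizons: 10, 20, 40, 60 or 120 days
    proxied_by: str | None = None    # id of the factor used as a proxy, if any
    modellable: bool | None = None   # set by the eligibility check

# Example usage: a 10-year EUR swap factor with one validated vendor observation.
rf = RiskFactor("EUR_SWAP_10Y", RiskClass.GIRR, liquidity_horizon_days=10)
obs = PriceObservation("EUR_SWAP_10Y", date(2025, 5, 30), 0.0274,
                       source="vendor_feed", is_real_price=True,
                       lineage=["raw_quote", "eod_snapshot", "validated"])
```

Because every consuming system reads and writes the same record types, validation, proxying, and distribution can operate on one consistent view of the data instead of reconciling several.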

Strong governance is key, not just for compliance but for keeping pricing and risk functions aligned on the same data and models, avoiding costly discrepancies between the two.

 

Best Practices for FRTB Readiness

FRTB brings complexity, but also a chance to modernise. The whitepaper recommends three key strategic moves:

  • Simplify: Retire legacy systems and consolidate data into centralised, machine-learning-ready environments.

  • Integrate: Align sourcing, modelling, and reporting with replicable, auditable workflows.

  • Enable the Cloud: Cloud infrastructure offers the scalability, flexibility, and cost-efficiency needed to meet evolving demands while maintaining security and agility.

 

Final Thoughts: A Strategic Approach to Regulation

FRTB isn’t just a regulatory hurdle—it’s a catalyst for transformation. The institutions that respond with robust, integrated data platforms will not only stay compliant but also gain a competitive edge in how they manage risk, capital, and change.

A future-proof strategy starts with high-quality data, clear governance, and the tools to turn compliance into confidence.