The implementation of the Fundamental Review of the Trading Book (FRTB) marks a turning point for banks. More than just a regulatory update, FRTB demands a complete rethink of how financial institutions manage, govern, and utilise market data. As capital implications grow more severe and the volume and complexity of required data increase, a robust data infrastructure has become essential—not optional.
This blog breaks down the key sections of the FRTB whitepaper by Gresham, connecting the dots between regulation, technology, and operational change.
FRTB introduces stricter rules for how banks assess risk and manage capital. At its core, the regulation emphasises complete, consistent, and high-quality data—particularly over long historical periods. Any missing or unverifiable data can inflate capital charges, especially when risk factors fail modellability tests due to gaps in observed trades.
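To make the modellability point concrete, here is a minimal sketch of the kind of eligibility check involved, assuming the commonly cited FRTB thresholds (at least 24 real price observations over the trailing 12 months, with no 90-day window containing fewer than four). The exact criteria come from the Basel text and local rulebooks, not from this example.

```python
from datetime import date, timedelta

def is_modellable(observations: list[date], as_of: date) -> bool:
    """Sketch of a risk factor eligibility check.

    Assumes the commonly cited FRTB thresholds: at least 24 real price
    observations over the preceding 12 months, with no 90-day window
    containing fewer than 4. Real implementations must follow the exact
    Basel text and supervisory guidance.
    """
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observations if window_start <= d <= as_of)
    if len(obs) < 24:
        return False
    # Slide a 90-day window one day at a time across the year:
    # every window must contain at least 4 observations.
    start = window_start
    while start + timedelta(days=90) <= as_of:
        end = start + timedelta(days=90)
        if sum(1 for d in obs if start <= d < end) < 4:
            return False
        start += timedelta(days=1)
    return True
```

Note that a factor trading in quarterly bursts can fail the 90-day condition even with well over 24 observations in the year, which is precisely how gaps in observed trades turn into non-modellable risk factors and inflated capital charges.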
Banks face the challenge of integrating vast amounts of market data across disconnected systems and departments. Traditional infrastructure simply isn't equipped to handle the scale and audit demands of FRTB.
Legacy systems, built for lower data volumes and simpler reporting cycles, are struggling to keep up. FRTB drives up data requirements through longer time series, granular tracking of risk parameters, and metadata like lineage and proxy documentation.
This shift pushes banks toward more advanced data management platforms capable of handling high-frequency, high-volume demands. The Basel Committee’s BCBS 239 principles and other regulatory frameworks emphasise transparency, consistency, and traceability across all data operations—exactly what FRTB now enforces.
Beyond compliance, FRTB highlights a growing demand for faster, more intuitive access to trusted data. Risk and finance teams need “self-service” tools that allow them to explore risk factor histories, trace price lineage, or deploy new models without long lead times.
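The "trace price lineage" idea can be pictured as nothing more than walking a dependency graph. The sketch below uses an invented in-memory lineage store to show the shape of such a self-service query.

```python
# Invented lineage store: each derived series lists the inputs it was built from.
LINEAGE = {
    "eur_swap_10y_proxy": ["eur_swap_5y", "eur_swap_15y"],
    "eur_swap_5y": ["vendor_feed_a"],
    "eur_swap_15y": ["vendor_feed_a"],
}

def trace(series: str, depth: int = 0) -> None:
    """Print the ancestry of a series, answering 'where did this number
    come from?' without a ticket to IT."""
    print("  " * depth + series)
    for parent in LINEAGE.get(series, []):
        trace(parent, depth + 1)

trace("eur_swap_10y_proxy")  # shows the proxy, its inputs, and their vendor source
```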
Modern data management systems should minimise unnecessary data transfers by bringing analytics closer to the source. Agile workflows allow users to adapt quickly, reducing reliance on IT or manual intervention, and speeding up time-to-insight across the business.
The paper outlines three core transformations, summarised in the paragraphs that follow.
The ‘zero tolerance’ stance on poor data management means banks must track everything: where data originates, how it is transformed, and where proxies are applied. Dashboards and automated alerts help maintain data integrity and surface issues before they hit reports.
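A first approximation of such automated alerting can be sketched in a few lines. The checks and thresholds below are illustrative, not regulatory.

```python
from typing import Optional

def integrity_alerts(series: dict[str, list[Optional[float]]],
                     max_gap: int = 5, max_stale: int = 10) -> list[str]:
    """Toy integrity scan over daily risk factor histories.

    Flags runs of missing values longer than max_gap days, and values
    that repeat for more than max_stale days (a likely stale feed).
    Checks and thresholds are illustrative only.
    """
    alerts = []
    for factor, values in series.items():
        gap = stale = 0
        prev = None
        for v in values:
            if v is None:
                gap += 1
                stale = 0
            else:
                if gap > max_gap:
                    alerts.append(f"{factor}: {gap}-day gap in history")
                gap = 0
                stale = stale + 1 if v == prev else 1
                prev = v
                if stale > max_stale:
                    alerts.append(f"{factor}: unchanged for {stale} days")
                    stale = 1  # reset so one stale run is flagged once
        if gap > max_gap:
            alerts.append(f"{factor}: {gap}-day gap at end of history")
    return alerts
```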
But clean sourcing is only half the story. The ability to standardise and distribute data across risk and valuation systems is just as critical. Effective data management helps reduce the burden on quant teams who otherwise spend too much time cleaning or formatting raw data.
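To see why standardisation lightens that workload, consider a toy normaliser that maps two invented vendor layouts onto a single canonical schema before distribution:

```python
def normalise_quote(raw: dict, vendor: str) -> dict:
    """Map vendor-specific field names and units onto one canonical
    schema. Both vendor layouts here are invented for the example."""
    if vendor == "vendor_a":  # hypothetical feed quoting prices in percent
        return {"instrument": raw["ric"],
                "mid": raw["px_pct"] / 100.0,
                "ts": raw["timestamp"]}
    if vendor == "vendor_b":  # hypothetical feed quoting bid/ask only
        return {"instrument": raw["isin"],
                "mid": (raw["bid"] + raw["ask"]) / 2.0,
                "ts": raw["time"]}
    raise ValueError(f"unknown vendor: {vendor}")
```

Downstream risk and valuation systems then consume one shape of record, instead of every quant team re-implementing the same parsing and unit fixes.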
Banks must shift away from patchwork solutions and toward fully integrated data environments that bring sourcing, validation, standardisation, and distribution together in one place.
Strong governance is key, not just for compliance but for keeping pricing and risk functions aligned on consistent models; misalignment between the two is costly.
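One concrete governance control is a daily reconciliation of the marks each function actually used. The sketch below, with an invented tolerance and field layout, reports instruments where pricing and risk diverge.

```python
def alignment_report(pricing_marks: dict[str, float],
                     risk_marks: dict[str, float],
                     rel_tol: float = 1e-4) -> list[str]:
    """Toy reconciliation of end-of-day marks between the pricing and
    risk functions. The relative tolerance is illustrative."""
    issues = []
    for inst in sorted(set(pricing_marks) | set(risk_marks)):
        p, r = pricing_marks.get(inst), risk_marks.get(inst)
        if p is None or r is None:
            issues.append(f"{inst}: present in only one system")
        elif abs(p - r) > rel_tol * max(abs(p), abs(r), 1.0):
            issues.append(f"{inst}: pricing={p} vs risk={r}")
    return issues
```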
FRTB brings complexity, but also a chance to modernise; the whitepaper closes with three key strategic moves for doing exactly that.
FRTB isn’t just a regulatory hurdle—it’s a catalyst for transformation. The institutions that respond with robust, integrated data platforms will not only stay compliant but also gain a competitive edge in how they manage risk, capital, and change.
A future-proof strategy starts with high-quality data, clear governance, and the tools to turn compliance into confidence.