Wrangling FRTB: A Year in (Trading Book) Review

2018 was a year when certain jumbo-sized, data-heavy regulatory regimes finally smoothed out and settled down. Among the few that haven't, the Basel Committee-driven Fundamental Review of the Trading Book (FRTB) remains one of the most consequential and contentious, threatening the market-making businesses of global systemically important banks (G-Sibs) and smaller institutions alike.

At its core, FRTB is a pricing exercise, requiring firms to analyze their trading books against an array of risk models and then assign additional capital charges for holding those assets, as appropriate. Part of the foundational challenge is proving to regulators that these "internal models" are parameterized and applied consistently. But another, potentially far more costly element lies in managing the punitive costs of exotic instruments exhibiting "non-modellable risk factors" (NMRFs) under the revised internal models approach. A risk factor is deemed non-modellable when it has fewer than 24 real price observations in a year, or when consecutive observations fall more than one month apart. Sovereign credit default swaps (CDS), swaptions, and long-dated swaps are among the instruments likely affected.
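To make that threshold concrete, here is a minimal sketch of what such a modellability test might look like, assuming nothing more than a list of observation dates per risk factor. The is_modellable function and the 31-day stand-in for "one month" are illustrative assumptions, not language from the Basel text.

    from datetime import date, timedelta

    def is_modellable(observation_dates: list[date]) -> bool:
        """Rough modellability test: at least 24 real price observations
        in the trailing year, with no gap between consecutive observations
        longer than one month (approximated here as 31 days)."""
        obs = sorted(observation_dates)
        if len(obs) < 24:
            return False  # too few observations: the factor is an NMRF
        gaps = (later - earlier for earlier, later in zip(obs, obs[1:]))
        return all(gap <= timedelta(days=31) for gap in gaps)

    # A sovereign CDS quoted only quarterly fails on both counts
    sparse_quotes = [date(2018, 1, 2) + timedelta(days=91 * i) for i in range(4)]
    print(is_modellable(sparse_quotes))  # False -> attracts an NMRF add-on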

The industry has voiced serious misgivings about this particular treatment, given the so-called "seasonality" of some trading. The reasons why are clear. As Risk.net reported earlier this fall, Basel Committee surveys indicate that average market risk capital costs under FRTB will increase for G-Sibs by over 50 percent. One bank, the surveys noted, faces a bump three times that size: 160.5 percent. Smaller firms are even worse off, averaging an increase of 76.4 percent. And at an almost unfathomable extreme, the bank worst affected will see its costs rise by 469.5 percent.

Three Approaches

Given those numbers, FRTB immediately renders certain market-making activities uneconomic, and banks must decide what to do. For many firms, the add-on costs associated with keeping NMRF-laden assets on the balance sheet exceed the potential benefits, and so the first approach is to trim. Going into 2019, many firms are reading tea leaves and picking their battles, trying to figure out which competitors may be most affected, in which markets and how badly, and what is still worth the trouble. As much pain as FRTB promises, decreasing liquidity also means there are opportunities to be identified.

In addition, the past year has seen many institutions take a more deliberate and proactive approach to the FRTB plights they face. There are two additional options to consider here, and both are data-intensive. First, firms are adding tools to wrangle and aggregate as much new asset data as possible, filling in pricing gaps and building out market risk models where there were none. The idea is simple: fewer NMRFs lead directly to cost savings, even at the margins, and better modeling capability and cross-bank methodology standardization give institutions fresh ammunition in arguing with regulators to soften the damage from add-on charges.
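As a simple illustration of the payoff, the snippet below (reusing the hypothetical is_modellable helper sketched earlier) pools observation dates from two invented vendor feeds. Each feed fails the 24-observation test on its own, but the factor becomes modellable once the feeds are combined, which is exactly where the savings come from.

    from datetime import date, timedelta

    # Two hypothetical vendor feeds, each too sparse on its own (14 < 24)
    feed_a = [date(2018, 1, 2) + timedelta(days=26 * i) for i in range(14)]
    feed_b = [d + timedelta(days=13) for d in feed_a]

    print(is_modellable(feed_a))           # False: only 14 observations
    print(is_modellable(feed_a + feed_b))  # True: 28 observations, 13-day gaps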

Second, firms are also looking a level above, asking how they can more effectively spread these added capital charges across different trading desks and even legal entities. This is a more conceptual and enterprise-specific task, which involves building a second set of computational models and scenario analyses on top of the FRTB modeling layer. In short: can firms fight FRTB from the inside, with rejiggered trading operations and creative balance-sheet organization as the remedy?
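As a purely conceptual illustration of that second layer, the toy comparison below assumes a hypothetical stand-alone charge for each pairing of book and desk, then enumerates every assignment to find the cheapest structure. Real FRTB capital aggregation is far more involved; this only sketches the shape of the scenario exercise.

    from itertools import product

    # Hypothetical NMRF add-on for each (book, desk) pairing; in practice
    # these figures would come out of the FRTB modeling layer itself
    charge = {
        ("emerging_cds", "credit_desk"): 4.2,
        ("emerging_cds", "macro_desk"):  3.1,
        ("long_swaps",   "credit_desk"): 2.8,
        ("long_swaps",   "macro_desk"):  3.9,
    }
    books = ["emerging_cds", "long_swaps"]
    desks = ["credit_desk", "macro_desk"]

    # Total the charges for every possible assignment of books to desks
    scenarios = {
        assignment: sum(charge[(book, desk)] for book, desk in zip(books, assignment))
        for assignment in product(desks, repeat=len(books))
    }
    best = min(scenarios, key=scenarios.get)
    print(best, scenarios[best])  # cheapest structure under these toy numbers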

Brick by (Data) Brick

There is a strong argument to be made that FRTB, more than any other post-crisis regulation, implies a direct link between data dexterity and dollars, and the epic (though mostly, to this point, fruitless) haggling over FRTB this year therefore also speaks to its impact at the data management level. In its early days, the compliance emphasis was on FRTB reporting engines: making sure the back-end compilation process was sturdy enough to pull the various modeling outputs together and could effectively manage the transition into 2021. If not quick, then it was certainly meant to be smooth and tidy.

That was then. Today, the FRTB response has moved beyond compliance, and a much stronger data foundation is required earlier on. Firms must govern the clean intake of a wider variety of pricing data, not all of which is electronic or standardized. FRTB-associated computations will sit in different business functions, from risk to finance and accounting to compliance. And end decisions about altogether redistributing assets, restructuring entities, or ending certain market-making activities will require a strong narrative all the way through, answering questions like these (see the sketch after this list):

  • Where is the data from?
  • Where did it travel?
  • What modeling has it touched or influenced?
  • What reference data work is required to facilitate the risk-management process moving forward?
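One way to keep that narrative queryable is to carry a lightweight lineage record alongside every observation as it moves through the stack. The structure below is a hypothetical sketch, assuming each hop and each model touch gets logged; it is not a prescription for any particular platform.

    from dataclasses import dataclass, field

    @dataclass
    class LineageRecord:
        """Hypothetical audit trail answering the four questions above."""
        source: str                                               # where the data is from
        hops: list[str] = field(default_factory=list)             # where it traveled
        models_touched: list[str] = field(default_factory=list)   # modeling it influenced
        reference_data_tasks: list[str] = field(default_factory=list)

    rec = LineageRecord(source="vendor_feed_a")
    rec.hops.append("risk_engine")
    rec.models_touched.append("expected_shortfall_model")
    rec.reference_data_tasks.append("map instrument to legal entity identifier")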

Time is Now

With regulators holding firm, firms are beginning to move. We've seen institutions taking a more transformative approach to FRTB data, introducing big data applications and reengineering their infrastructure platforms to meet these demands. All of this can help not only in navigating FRTB, but also in reacting in a way that preserves markets under threat. But as these platforms are built out, banks should also ask first-order questions about how they govern and control the data, and the models, in question. The answers will have a significant, wide-ranging impact on reporting, modeling analysis, and ultimately decision-making down the line.

With FRTB, it really is about “the more you know”—and that demands a strong start, as well as creative maneuvering.

