By Jeremy Taylor, Head of Capital Markets Business Consulting, EPAM UK
October 27, 2017
The Fundamental Review of the Trading Book (FRTB), sometimes dubbed Basel IV, has been in development for a long time. It introduces a paradigm shift in the market risk regulatory framework, imposing a complete overhaul of market risk capital rules across the globe.
Although FRTB introduces a number of quantitative changes, its data management and business complexity implications present the most formidable challenges for banks. In order to meet regulatory requirements with an ironclad strategy, it’s pivotal to understand some of the impending changes.
Data Quality Will Be a Crucial Component in Modeling and Reporting
One of the key data issues addressed is the concept and treatment of Non-Modelable Risk Factors (NMRFs), which underscores regulators’ long-held concern that market risk models have been approved where data is insufficient or of poor quality.
Going forward, firms must demonstrate that model inputs are derived from real data based on real transactions. Ultimately, NMRFs will alter the way model approval is granted and monitored.
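To make the modellability idea concrete: the January 2016 FRTB text requires at least 24 "real" price observations over the preceding year, with no more than one month between consecutive observations. The following is a minimal sketch of that check; the function name and data layout are hypothetical, and real implementations must follow the precise regulatory definitions.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, window_end):
    """Illustrative FRTB-style modellability check: at least 24 real
    price observations in the trailing year, with no gap longer than
    roughly one month between consecutive observations."""
    window_start = window_end - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= window_end)
    if len(obs) < 24:
        return False
    # No more than ~one month (31 days) between consecutive observations.
    gaps = [(later - earlier).days for earlier, later in zip(obs, obs[1:])]
    return all(gap <= 31 for gap in gaps)
```

A risk factor with weekly transaction evidence passes; one priced off a handful of quarterly observations does not, and would attract the NMRF capital add-on under IMA.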
FRTB also introduces an enhanced Standardised Approach (SA) alongside the Internal Models Approach (IMA). The SA will use a set of sensitivities that are defined and determined by the regulator. Despite this enhanced, sensitivities-based regulatory approach, a gap between IMA and SA capital outcomes is still likely.
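As a sketch of how the sensitivities-based SA works mechanically: sensitivities are scaled by regulator-prescribed risk weights into weighted sensitivities, which are then aggregated within a bucket using prescribed correlations. The risk weights and flat correlation below are illustrative placeholders, not the actual calibrations, which vary by risk class and bucket.

```python
import math

def weighted_sensitivities(sensitivities, risk_weights):
    """WS_k = RW_k * s_k, with risk weights prescribed by the regulator
    (the weights passed in here are illustrative)."""
    return [rw * s for s, rw in zip(sensitivities, risk_weights)]

def bucket_charge(ws, rho):
    """Within-bucket charge in the spirit of the SA:
    K_b = sqrt(max(0, sum_k WS_k^2 + sum_{k != l} rho * WS_k * WS_l)).
    A single flat correlation rho stands in for the prescribed
    pairwise correlation matrix."""
    total = sum(w * w for w in ws)
    total += sum(rho * ws[i] * ws[j]
                 for i in range(len(ws))
                 for j in range(len(ws)) if i != j)
    return math.sqrt(max(total, 0.0))
```

The point of the structure is that offsetting sensitivities within a bucket receive partial, correlation-driven netting rather than full offset, which is one source of the residual gap between SA and IMA numbers.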
Risk measures are currently calculated at an entity level. Under the new framework, they will need to be calculated at the desk level. This will lead to a monumental increase in data, analysis and reporting compared to what is required today, shifting banks from a traditional approach of storing data in RDBMSs toward technologies like Hadoop, Spark and Scala. In short, poor data will lead to higher capital requirements.
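The move from entity-level to desk-level measures is, computationally, a very large group-by over simulated P&L scenarios. The toy sketch below uses pure Python for readability, with a hypothetical (desk, pnl) row layout and a simple tail-average expected shortfall; at FRTB data volumes this is precisely the kind of job that gets pushed onto a Spark cluster.

```python
from collections import defaultdict
import statistics

def desk_level_es(pnl_rows, alpha=0.975):
    """Group simulated P&L outcomes by desk and compute a simple
    expected shortfall (average loss in the worst (1 - alpha) tail)
    per desk. The (desk, pnl) row layout is hypothetical."""
    by_desk = defaultdict(list)
    for desk, pnl in pnl_rows:
        by_desk[desk].append(pnl)
    es = {}
    for desk, pnls in by_desk.items():
        pnls.sort()  # worst losses first
        tail_size = max(1, int(len(pnls) * (1 - alpha)))
        es[desk] = -statistics.mean(pnls[:tail_size])
    return es
```

Each desk now needs its own scenario set, its own tail, and its own reporting trail, which is what drives the data explosion relative to one entity-level number.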
Since risk modeling and reporting will occur at a desk level, a desk’s ability to calculate RWA and demonstrate good returns under the new framework will determine its viability and survival.
Complex products and data gaps will drive the choice of model approach, along with the proportion of the RWA calculation attributable to the ‘other’ sensitivity bucket under SA and to NMRFs under IMA.
Addressing the Underlying Systems to Achieve Alignment
While data quality is important, desks also need to be able to run processes and calculations efficiently and quickly. Given the amount of data involved, any delays can result in catastrophic processing queue build-ups that will put an enormous strain on any firm’s infrastructure. Basically, the whole technology stack must change with the introduction of these strict standards.
Despite significant investment, large banks will still face fragmented and siloed system infrastructure. FRTB insists that data and processing practices be brought together in a single system where enrichment and transformation are managed consistently and uniformly. This can be done in a data lake-type big data store, which permits transformation and enrichment to happen using different technologies.
Backtesting and PnL attribution are essential tools for assessing how closely the PnL produced by the risk infrastructure aligns with that produced by the Finance sub-ledger. Again, data will be integral to ensuring alignment. Consistent data sourcing and snap times both imply that the Risk and Finance functions will need to be re-assessed in terms of role and underlying systems.
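For illustration, the January 2016 FRTB text framed the PnL attribution test around "unexplained" PnL: the hypothetical PnL from the Finance side minus the risk-theoretical PnL from the risk infrastructure. The sketch below follows the spirit of that text; the function names are hypothetical and the thresholds shown reflect the 2016 calibration, which has since been revisited by the Basel Committee.

```python
import statistics

def pla_metrics(risk_theoretical_pnl, hypothetical_pnl):
    """Two PnL-attribution ratios in the spirit of the January 2016
    FRTB text. Unexplained PnL = hypothetical - risk-theoretical."""
    unexplained = [h - r for h, r in zip(hypothetical_pnl, risk_theoretical_pnl)]
    mean_ratio = statistics.mean(unexplained) / statistics.pstdev(hypothetical_pnl)
    var_ratio = (statistics.pvariance(unexplained)
                 / statistics.pvariance(hypothetical_pnl))
    return mean_ratio, var_ratio

def passes_pla(mean_ratio, var_ratio):
    # Illustrative 2016-style thresholds; repeated breaches over
    # several months push a desk off IMA and onto SA.
    return abs(mean_ratio) <= 0.10 and var_ratio <= 0.20
```

The test only works if Risk and Finance snap the same data at the same times, which is exactly why FRTB forces the two functions onto consistent sourcing.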
Implementing Real-World Data to Validate Analyses
Although FRTB is primarily concerned with determining absolute risk and the requisite capital, banks’ senior managers must also understand and explain movement in RWA and pinpoint which sensitivity caused it. This vital measurement capability will help managers run their businesses effectively. In addition, if managers wish to know the capital impact of acquiring certain assets or trading positions, they will need to implement some form of ‘what-if’ analysis.
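At its simplest, such a what-if analysis is a re-run of the capital calculation with and without the candidate position. A minimal sketch, where capital_fn stands in for whichever capital engine (SA or IMA) the desk uses; the toy capital function in the usage note is purely illustrative:

```python
def incremental_capital(portfolio, candidate, capital_fn):
    """Marginal capital impact of adding a candidate position:
    run the (hypothetical) capital engine with and without it
    and take the difference."""
    return capital_fn(portfolio + [candidate]) - capital_fn(portfolio)
```

With a toy netting-style capital function such as `lambda p: 0.08 * abs(sum(p))`, adding an offsetting -100 position to a +100 portfolio produces a negative incremental charge, illustrating why marginal rather than standalone impact is the number managers need.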
Yet another important element of FRTB compliance is desk structure validation. Banks will have to submit their desk set-ups to regulators prior to January 2019. The regulators will then grant the appropriate model approach for each submitted desk structure. Data lineage and quality need to be established for each business unit or desk and must be BCBS 239 compliant. In some cases, this will be a huge undertaking and will need to be ready by mid-2018.
Fortunately, in Europe, MiFID II will help deliver improvements to market data quality and coverage, since no single bank will be able to offer sufficient data on its own. The establishment of trade repositories – particularly for OTC derivatives – has also increased the availability of real transaction-based data. Data projects are notoriously difficult and come with a great deal of inertia to overcome. Therefore, banks must begin now to design and build a single trusted data source.