

Round Table Report: Responding to the RegTech data management challenge

Wed, 01 Nov 2017 04:36:00 GMT           

Verne Global and The Realization Group held a round table event on 19 September 2017 to discuss “Responding to the RegTech Data Management Challenge”.

Regulation is imposing risk measures upon banks that make specific demands upon the data they own. For a long time the data within banks has been influential but unmeasured, akin to the firing of brain cells driving human consciousness. With external demands requiring a more precise alignment of cause and effect, banks are having to analyse and rebuild their regulatory technology (RegTech) infrastructure to ensure it fulfils the regulators’ wishes. On 19 September 2017, a roundtable organised by The Realization Group and Verne Global met to discuss these pressures.

When the Basel Committee on Banking Supervision Paper 239 (BCBS 239) was published in January 2013, it laid some of the groundwork to help banks handle the forthcoming deluge of data-led regulation. It established principles for data aggregation and enhanced reporting in order to overcome the challenges seen in the financial crisis. For many banks, that incentive was needed in order to drive the investment in infrastructure necessary to build a single data architecture.

By complying with BCBS 239, firms have begun to see commonality in the data they use to deliver regulatory reporting and risk measurement solutions. That is increasing the demand for central storage and has pushed them toward addressing regulatory compliance as a programme, rather than as individual projects.

However, the granularity of information involved in compliance means this cannot be a high-level activity, nor a top-down imposed project that will fit all of the new rules.

Fragmented data and fine-tuned rules
Some regulations are so demanding, such as the Fundamental Review of the Trading Book (FRTB), that banks are struggling with their capacity to manage and report on risk. That means there is a demand for dedicated technology to resolve the risk reporting required, using discrete regulatory data solutions.

FRTB requires firms to assess risk exposure at the trading desk level, in order to work out a firm’s risk exposure on the trading book and its associated capital requirements. They must assess whether risk factors are ‘modellable’ or not, based upon the level of granularity the data contains. If third-party data is used, it cannot be assumed to be correct; its lineage must be established and evidenced. Consequently, large datasets need to be tested, mapping risk factors to real price observations. FRTB also requires a year of running before any actual benefit to Tier 1 capital is realised in 2020.
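To make that mapping exercise concrete, the short Python sketch below counts real price observations per risk factor and flags those that fall short of an assumed threshold. The 24-observation count, the one-month gap limit and the data layout are illustrative assumptions, not a statement of the final rules or of any bank’s implementation.

from collections import defaultdict
from datetime import date, timedelta

# Hypothetical input: (risk_factor_id, observation_date) pairs for the past 12 months.
observations = [
    ("EURUSD_vol_1Y", date(2017, 1, 10)),
    ("EURUSD_vol_1Y", date(2017, 2, 3)),
    # ... in practice a very large dataset mapped from trade and quote feeds
]

MIN_OBSERVATIONS = 24          # assumed observation-count threshold
MAX_GAP = timedelta(days=31)   # assumed maximum gap between consecutive observations

def classify(observations):
    """Return {risk_factor: 'modellable' | 'non-modellable'} under the assumed test."""
    by_factor = defaultdict(list)
    for factor, obs_date in observations:
        by_factor[factor].append(obs_date)

    result = {}
    for factor, dates in by_factor.items():
        dates.sort()
        gaps_ok = all(b - a <= MAX_GAP for a, b in zip(dates, dates[1:]))
        enough = len(dates) >= MIN_OBSERVATIONS
        result[factor] = "modellable" if (enough and gaps_ok) else "non-modellable"
    return result

if __name__ == "__main__":
    print(classify(observations))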

Other regulations refer to specific outputs and demand frequent checking and rechecking of them. One example is MiFID II, which requires the testing of execution algorithms to limit their potential to create disorderly markets.

In order to conduct that testing, market participants must assess their algorithms against an emulated market microstructure, with the test data stored for five years and retesting required on any significant change to an algorithm, including a parameter change. That is a lot of data and compute power. The risks of a compliance failure are considerable: penalties of up to 15% of the firm’s turnover, along with potential jail time and industry bans for the individuals responsible.
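A minimal sketch of the bookkeeping this implies is shown below: a hash of the algorithm’s parameters detects any change, a stubbed market-simulation test is re-run when the hash moves, and each archived result is stamped with a five-year retention date. The simulator, record layout and function names are hypothetical.

import hashlib
import json
from datetime import date, timedelta
from typing import Optional

RETENTION = timedelta(days=5 * 365)   # assumed five-year retention period

def config_fingerprint(params: dict) -> str:
    """Stable hash of the algorithm parameters; any change forces a retest."""
    return hashlib.sha256(json.dumps(params, sort_keys=True).encode()).hexdigest()

def run_simulated_market_test(params: dict) -> dict:
    """Placeholder for testing the algorithm against an emulated market microstructure."""
    return {"disorderly_behaviour_detected": False, "params_tested": params}

def maybe_retest(params: dict, last_fingerprint: Optional[str], archive: list) -> str:
    """Re-run the test and archive the result whenever the parameters change."""
    fingerprint = config_fingerprint(params)
    if fingerprint != last_fingerprint:            # includes the first-ever run
        result = run_simulated_market_test(params)
        archive.append({
            "fingerprint": fingerprint,
            "result": result,
            "retain_until": (date.today() + RETENTION).isoformat(),
        })
    return fingerprint

if __name__ == "__main__":
    archive = []
    fp = maybe_retest({"max_order_size": 1000, "throttle_ms": 50}, None, archive)
    fp = maybe_retest({"max_order_size": 1000, "throttle_ms": 25}, fp, archive)   # parameter change -> retest
    print(len(archive), "archived test runs")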

Even when faced with regulations that have common goals, banks can struggle to use common technology. The interplay between the 2010 Dodd-Frank Act and the European Market Infrastructure Regulation (EMIR) was an example of this. Both were built on the same commitments made by G20 countries, but EMIR, which came later, required a different data set that was hard to build from the existing Dodd-Frank one. As a result, many businesses had to develop separate data sets to work from.

The new calculations being demanded are an order of magnitude more complex than those firms face today. FRTB and MiFID II are expanding firms’ existing workloads to reach across a wider range of asset classes than has ever been tested before. The testing, checking and tuning for these discrete rules rely on an infrastructure that can support heavyweight computing, including parallel and grid processing, and that infrastructure needs to sit somewhere the bank can afford to run it.

Smarter application of technology
In firms with complex architectures and interconnectivity between teams, building bespoke tools that can be tested and controlled within their own environment, with limited impact on the rest of the organisation, is a common but costly short-term way to support regulatory obligations.

However, there are ways to reduce the total cost of ownership when developing RegTech programmes. Firstly, compliance with these and other rules may demand different data sets, but they also require a flexible testing environment. Technology-based approaches such as cloud computing can be used to scale calculations and test models. Where there is a demand for sophisticated hardware, such as graphics processing units (GPUs) to parallelise calculations, or the application of data science techniques, there are ways of mutualising investment and building capability that is not a fixed cost.
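The kind of workload in question is typically embarrassingly parallel, which is why it scales well onto grid, cloud or GPU capacity. The toy Python example below revalues a small portfolio under many scenarios using a local process pool as a stand-in for whatever elastic compute is actually used; the portfolio and shocks are invented.

from multiprocessing import Pool

def revalue(args):
    """Toy revaluation: sum of position values shocked by a scenario factor."""
    positions, shock = args
    return sum(value * shock for value in positions)

if __name__ == "__main__":
    positions = [100.0, 250.0, -75.0]                       # hypothetical position values
    scenarios = [1.0 + i / 1000.0 for i in range(10_000)]   # hypothetical scenario shocks

    with Pool() as pool:                                    # swap for a grid/cloud executor in practice
        results = pool.map(revalue, [(positions, s) for s in scenarios])

    print("worst scenario P&L move:", min(results) - sum(positions))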

The big challenge for a lot of organisations is their cost allocation model. In conversations about the allocation of shared costs, often no one internally wants to take responsibility for solving the problem. One solution suggested at the round table was to borrow the model from collateral management, where allocation teams were given the task of pushing down the cost of collateral, and apply it to data, in order to actively drive down the costs of ownership and management.
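As a rough illustration of that allocation idea, the snippet below charges the cost of a shared data platform back to the desks that consume it, in proportion to recorded usage, so that each desk has an incentive to push the cost down. The desk names and figures are invented.

def allocate_cost(total_cost: float, usage_by_desk: dict) -> dict:
    """Split total_cost across desks in proportion to their recorded usage."""
    total_usage = sum(usage_by_desk.values())
    return {desk: total_cost * usage / total_usage for desk, usage in usage_by_desk.items()}

if __name__ == "__main__":
    usage = {"rates": 1_200, "credit": 800, "fx": 2_000}   # e.g. queries or GB read per month
    print(allocate_cost(50_000.0, usage))                  # hypothetical monthly platform cost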

From an IT viewpoint, cloud and dedicated hosting solutions can both be employed, together in a hybrid model, which adds flexible cloud access to supplement fixed storage. In many cases, cloud deployments within firms running efficient infrastructures only remain public for a short time before being pulled into a private cloud. The cost of public cloud for some tasks has proven considerable, and as a result it is often used only for specific tasks within hybrid models.
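A simple sketch of that bursting policy is shown below: steady-state work stays on the private or dedicated estate, and jobs spill over to public cloud only when the backlog exceeds what the fixed capacity can absorb. The threshold and venue labels are assumptions for illustration.

PRIVATE_CAPACITY = 100   # jobs the private/dedicated estate can hold at once (assumed)

def route(job_queue_depth: int) -> str:
    """Decide where the next job should run under the assumed bursting policy."""
    return "private" if job_queue_depth < PRIVATE_CAPACITY else "public-cloud-burst"

if __name__ == "__main__":
    for depth in (10, 99, 100, 500):
        print(f"queue depth {depth:4d} -> {route(depth)}")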

Banks that do weigh dedicated hosting against the cloud must consider a number of factors. For example, under FRTB, data retention requirements can stretch to 10 years, and that data needs to remain accessible. It may be needed when a firm runs recalculations, so it must be under the control of the bank and its heritage must be certain. Realistic testing is also required, and therefore data must be sited the same distance from test servers as the firm’s live server is from its closest market. Having solutions in data centres that can provide that level of compute power, storage and set-up cost-effectively is very important, given the pressure on budgets that firms face.

Given the expense of the major capital markets data centres in London, New York and Tokyo, firms will need to address the cost of their footprint for private cloud or dedicated hosting solutions. These data centres will be running hot, making power and infrastructure a key consideration, alongside location, access and control.


This article was first published by Verne Global (a client of The Realization Group) as “Raising the bar - Responding to the RegTech data management challenge”.