The Trading Mesh

Data on demand: Preparing for the next regulatory challenge

Tue, 20 Jun 2017 10:06:10 GMT

Introduction

On 1st June 2017, The Realization Group, together with technology provider Napatech, hosted a round table on the challenge of capturing, storing and accessing trading data. European markets are about to enter a new era: MiFID II and related measures will require market participants to provide far more data than ever before, and regulators are expected to be more demanding when investigating suspicious or irregular trading. In the past, firms had to provide transaction data; the new rules call for quote data as well, and they apply to a wider array of participants. Taken together, the measures represent what some observers believe will be a fundamental shift for the marketplace, one that puts an onus on speed and completeness when reporting data. That creates acute pressure for the industry, because increasingly complex trading ecosystems and the proliferation of venues and products have led to an explosion in the amount of data participants generate. As a result, market participants are bracing for what they expect will be an onerous period, with MiFID II representing just the start.

Mike O’Hara of The Realization Group facilitated a wide-ranging discussion featuring Emiliano Rodríguez, Senior Director of Business Development at Napatech; Clive Posselt, Commercial Director at Instrumentix; Stephen Taylor, CEO of Stream Financial; David Grocott, Director at Financial Technology Advisers; and William Garner, Partner at Charles Russell Speechlys. While they were clear that the road ahead will involve hard work, they were also quick to point out that the technology exists to create seamless systems that work across jurisdictions, asset classes and technical environments. As the regulatory net widens, participants know they have no choice but to prepare. But there is an important silver lining: putting robust data capture systems in place can also deliver significant business benefits.

New rules, new struggles

When it comes to MiFID II, the operative word is prescriptive. Regulators broadly know what they want from market participants: any data, at any time. That was the opening message at the round table discussion as the audience grappled with the implications of a regulatory environment in which authorities are expected to show little patience when they ask for information about trades.

Apart from top-tier trading firms, many participants are not prepared for the scale of the regulatory measures. For many, this will also be the first time they come under the regulatory spotlight, as buy-side and high-frequency trading firms will be deemed responsible for evidencing all their trading activity, down to the last quote and timestamp. It is not just about suspicious trading activity either. A major theme in MiFID II is best execution, so firms may need to provide data on fill rates across different venues to prove they obtained the best possible deals for clients.
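
As a rough illustration of the kind of evidence involved, the Python sketch below aggregates fills by venue to produce per-venue fill rates. The record layout and figures are hypothetical, intended only to show the shape of the calculation, not any prescribed MiFID II reporting format.

```python
from collections import defaultdict

# Illustrative order records: (venue, quantity ordered, quantity filled).
# Field layout and values are hypothetical, for demonstration only.
orders = [
    ("VENUE_A", 1000, 1000),
    ("VENUE_A", 500, 300),
    ("VENUE_B", 2000, 2000),
    ("VENUE_B", 750, 0),
]

ordered = defaultdict(int)
filled = defaultdict(int)
for venue, qty, fill in orders:
    ordered[venue] += qty
    filled[venue] += fill

# Fill rate per venue: total filled quantity over total ordered quantity.
for venue in sorted(ordered):
    rate = filled[venue] / ordered[venue]
    print(f"{venue}: fill rate {rate:.1%}")
```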

Nonetheless, as painful as the regulations may be for some firms, there is another way to view the approach of MiFID II: as a business opportunity. Companies that have hitherto lacked the business case for a data system overhaul now have one. And with better means of capturing and accessing their own information, companies will have more insight for security, legal affairs or a range of business optimisation initiatives. That could translate into a more strategically designed, future-proofed infrastructure and a greater return on investment.

“In the end everything is about the data,” says Emiliano Rodríguez of Napatech, which specialises in packet capture technology for monitoring and collecting data on networks. “You can always go back to the data. Data doesn’t lie.”

But historically, “going back to the data” has been a cumbersome affair, involving time-consuming forensic work. That is because trade-related data can sit anywhere, and getting different systems to talk to each other has not always been straightforward. Varying data protocols, and the problems inherent in operating a centralised database, have made querying specific trades problematic.

One approach discussed was to adopt a federated data model, in which queries are made across systems dynamically. Essentially, this does away with the notion of a centralised database and takes the view that it is better to query local data sources directly to create reports. The problem is that few financial firms have shown awareness of the advantages of a federated model, with many still constantly copying data into large repositories. That copy-everything approach will be hard to sustain in the MiFID II era.
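
To make the federated idea concrete, here is a minimal Python sketch, with hypothetical source names and record fields: the same query is fanned out to each local system of record, and the partial answers are merged into a single timeline at query time, with nothing copied into a central store.

```python
from concurrent.futures import ThreadPoolExecutor

class LocalSource:
    """A stand-in for one system of record (an OMS, a gateway log, etc.)."""

    def __init__(self, name, records):
        self.name = name
        self.records = records  # list of dicts, e.g. {"order_id": ..., "ts": ...}

    def query(self, order_id):
        # Each source answers from its own store; nothing is replicated centrally.
        return [r for r in self.records if r["order_id"] == order_id]

def federated_query(sources, order_id):
    """Fan the same query out to every source in parallel and merge the results."""
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(lambda s: s.query(order_id), sources))
    merged = [row for part in partials for row in part]
    return sorted(merged, key=lambda r: r["ts"])  # one timeline across systems

sources = [
    LocalSource("oms", [{"order_id": "42", "ts": 1, "event": "new_order"}]),
    LocalSource("gateway", [{"order_id": "42", "ts": 2, "event": "ack"}]),
]
print(federated_query(sources, "42"))
```

The design point is that each system remains authoritative for its own data; the consolidated view exists only for the lifetime of the query.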

A critical question from audience members was how tough regulators would be from day one. It was noted that with MiFID I, authorities did not immediately pounce on firms but rather showed a degree of “regulatory forbearance”. Can market participants expect the same this time around?

One audience member suggested that if regulators feel the market is not taking the new rules seriously enough, some penalties could be handed down for show. It was generally agreed that if an authority wants to make an example of a firm, it certainly will.

Focusing on the upside

Participants noted that working across functions, regions and application stacks involves multiple owners. Technology can be an enabler, but implementation still requires careful planning.

The common denominator for understanding what happened in a given trade is time. But even with precise timestamps based on Coordinated Universal Time (UTC), there may be issues. For instance, timestamps generated by different algorithms or systems may not be standardised against one another. At the other end of the trading spectrum, voice-based trading raises questions as to when, according to the electronic records, a trade was actually agreed.
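
One basic discipline that helps is stamping every event with an explicit, timezone-aware UTC clock rather than naive local time. A minimal Python sketch follows; the `utc_timestamp` helper and the event record are illustrative, and genuine microsecond accuracy also depends on how well the host clock is synchronised to UTC, which the code alone cannot guarantee.

```python
from datetime import datetime, timezone

def utc_timestamp():
    """Event time as an explicit, timezone-aware UTC timestamp.

    Logging local time without an offset is a common source of the
    inconsistencies described above: two systems can record the "same"
    moment differently. Pinning everything to UTC keeps one timeline.
    """
    return datetime.now(timezone.utc)

event = {
    "event": "order_ack",
    "ts": utc_timestamp().isoformat(timespec="microseconds"),
}
print(event)  # e.g. {'event': 'order_ack', 'ts': '2017-06-01T09:30:00.123456+00:00'}
```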

But there is an upside when regulators call for granular data such as timestamps. As market participants are forced to capture so much more data, they will also be in a position to use that information themselves to understand trading patterns, perform back-testing or carry out a range of business-enhancing initiatives. In that regard, Rodríguez noted that even with multiple formats for trade-related information – such as emails or other messaging – the data can be pulled together, provided a firm has packet capture systems set up in the right places. For tier-two firms and smaller organisations that manage to implement data capture in a granular way, this could be a massive opportunity to enhance their business practices.
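
As a sketch of what that capture-first approach can look like, the snippet below uses the open-source scapy library to pull timestamped TCP payloads out of a capture file into one ordered stream. The file name is hypothetical, and decoding the individual payload formats (FIX, email, chat and so on) is deliberately left to a separate translation step.

```python
from scapy.all import TCP, Raw, rdpcap  # scapy: pip install scapy

# Hypothetical capture file produced by a packet-capture appliance.
packets = rdpcap("trading_link.pcap")

# Pull every timestamped TCP payload into one ordered record stream.
# Interpreting the payloads is a separate layer on top of this raw one.
records = [
    {"ts": float(pkt.time), "payload": bytes(pkt[Raw].load)}
    for pkt in packets
    if pkt.haslayer(TCP) and pkt.haslayer(Raw)
]
records.sort(key=lambda r: r["ts"])
print(f"captured {len(records)} application messages")
```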

The discussion highlighted two issues: cost and complexity. As far as cost is concerned, Rodríguez argued that firms need to look at the problem in terms of different layers and tackle it layer by layer. Capturing data on a network via packet capture is one layer; translating that data into usable records is another. Separating the layers, he argued, is far more cost-effective.
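
A toy Python sketch of that separation follows, with hypothetical interfaces and a deliberately simplified tag=value decoder standing in for a real protocol parser: the capture layer records raw bytes and timestamps without interpreting them, and protocol-specific translation is layered on afterwards.

```python
def capture_layer(raw_packets):
    """Layer 1: persist raw, timestamped payloads; no interpretation here."""
    return [{"ts": ts, "raw": payload} for ts, payload in raw_packets]

def fix_translation_layer(records):
    """Layer 2: decode one protocol (here, a toy FIX-like 'tag=value' format)."""
    decoded = []
    for rec in records:
        fields = dict(
            pair.split("=", 1)
            for pair in rec["raw"].decode().split("|")
            if "=" in pair
        )
        decoded.append({"ts": rec["ts"], **fields})
    return decoded

raw = [(1497952570.000001, b"35=D|55=VOD.L|38=100")]  # illustrative data
print(fix_translation_layer(capture_layer(raw)))
```

The appeal of the split is that the capture layer is deployed once and never needs to understand the traffic, while translation layers can be added or replaced per protocol without touching the capture infrastructure.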

As for complexity, firms need to be prepared to take a leap. “It is true there’s going to be a mindset change,” said Rodríguez, adding that the silo approaches of the past need to be abandoned. “It is possible. It just means taking a step and starting to do it.”