The Good and Bad of HFT and The Importance of Consistent, Informed Regulation
Fri, 09 Sep 2011 04:44:00 GMT
An Interview with Hirander Misra
In this interview for the High Frequency Trading Review, Mike O’Hara talks to Hirander Misra, who is well-known in the electronic trading world as co-founder of Algo Technologies and a founding Director of the phenomenally successful MTF, Chi-X Europe.
Earlier this year, Hirander founded Misra Ventures.
HFT Review: Hirander, maybe we can start by looking at the recent volatility in the markets. August was a very volatile month and there has been much finger-pointing at high frequency trading firms from various quarters, with a number of commentators accusing them of exacerbating that volatility. So I’d be interested to get your take on the relationship between HFT and market volatility.
Hirander Misra: Sure. Beyond HFT, a lot of the issues we’ve seen with global markets recently have been driven more from a macro level, by economic and political factors. The Greek debt crisis and aspects of that extending into Italy, the Middle Eastern conflicts, all those factors have had an impact. At a micro level, trading activity will conform largely to those trends. With so much happening globally at macro level and given that we’ve been in this period of sustained lower volumes in general, that effect is exacerbated.
HFT, in fact trading activity in general, can’t really influence the trends at a macro level, but at the micro level it can impact peaks and troughs of individual securities. And because of the multitude of trading strategies, you’ll certainly have some firms that are trying to take advantage and profit from that situation. Although most might be running what we deem good strategies - whether liquidity provision or whether they have an element of alpha or predictability built into them - there are also a minority of strategies that people may deem not so good, for example quote stuffing. Unfortunately the media tend to glorify the minority cases to the detriment of the majority cases.
But HFT is an iterative process. If you stand still as a high frequency trading firm, others come in and literally eat your lunch, so in that respect, firms are continually looking to improve. In fact, most HFT firms are of the opinion that to actually survive and make this sustainable, they have to build a level of predictability into their algos and strive for alpha, rather than just look at low latency and latency arbitrage strategies, which are unsustainable.
HFTR: You mentioned “good” strategies like liquidity provision. However, there have been a number of debates recently about how real the HFT liquidity actually is and whether there’s any substance behind the volume or whether it’s just “noise”. I’d be interested to get your view on that.
HM: It’s an interesting point to discuss. Absolutely there are a number of very good liquidity providers that you can class as HFT, although the boundaries are becoming less well defined. A firm like Getco for example provides liquidity to markets in a vast range of securities on a global basis. And although they started out as what you would deem to be a traditional HFT, in February last year they became a registered NYSE market maker. Equally, some of the banks that had more traditional businesses have set up quantitative trading and statistical arbitrage desks. Even though they might be saying they’re getting out of prop trading per se, these sorts of algos very much come to the fore. Goldman Sachs, who have hired ex-Citadel people and implemented Millennium, are a prime example of that. So when we look at HFTs in that way, there’s definitely a good purpose in terms of liquidity provision; it moves beyond just “fair weather market making”.
Now I know that during the flash crash for example, HFT firms appeared to withdraw their quotes, but even registered market makers were withdrawing their quotes in contracts like the S&P eMini futures at that time. When you get really abnormal market events, even back in the days of floor trading a manual trader would take a step back to assess what the hell was going on in the market before coming back in. And that was no different during the flash crash - with both registered market makers and non-registered market makers - because it was a very abnormal event.
The experience I had from my time at Chi-X showed that as far as liquidity provision is concerned, a lot of these firms do help in narrowing spreads and deepening the order books. Having said that, not all HFT strategies are liquidity provision; you’ve also got things like latency arbitrage, where firms for example will undertake some form of distributed co-location by locating next to every single trading venue, then interconnecting those nodes and having some form of centralized risk management. The problem with latency arbitrage is that the speed of light is the ultimate constraint; latency can’t be reduced to zero. There’s a massive infrastructure cost. You might be fast today but tomorrow someone else might be faster, so it’s unsustainable. Even the smartest HFTs will acknowledge that, which is why they’re increasingly looking at speed as the enabler, but beyond speed they’re looking at continually adapting smarter algorithms.
Initially someone might look at exploiting latency across various markets and if a new fiber link comes in or something to that effect the volumes might be really, really high. Then over the following couple of months when others catch up, the volumes on their sessions go down to near zero, which just means it’s very short-termist. You can’t build a sustainable business on short-termism, whether you’re an HFT or anyone else.
HFTR: So if latency arbitrage is a valid strategy, albeit not particularly sustainable, where should the line be drawn between what is acceptable and valid practice in the market versus manipulative or unacceptable behaviour?
One comment on page 2 of the interview. You mention that manual traders would "step away" on the floor back in the day. If you are referring to specialists, that is not the case. On the AMEX or NYSE floor, specialists had to maintain consistent and orderly markets. If an unusually large order came in, the specialist would fill it to the best of his ability with orders residing on the book. Then he would seek a governor's approval to halt the stock and announce the large order to the crowd. The floor broker community would step up, providing additional liquidity, a market clearing price would be found, and then the specialist would be obligated to take the other side of the trade for any amount without a natural buyer. While no market structure is perfect, I don't recall a "flash crash" ever happening when we had good old-fashioned human beings involved in the trading process!
John McGonegal (2011/10/20)
I think "best of his ability" is the key term here. Minimum resting periods don't equate to "best of his ability"; they are a cast-iron guarantee to underwrite liquidity at a price known to be the wrong one given the known facts, or the absence of facts. I can't see how a commitment to make a blind price in the absence of information, or contrary to known information, can possibly result in more efficient markets, as to compensate for the stochastic probability of occasional large losses, market makers will need to make returns on every quote that provide economic compensation over time. That can only result in wider spreads, and because it would replace relative certainty with uncertainty, that process would involve an excess return on the capital committed.
The role formerly provided by the "floor broker community stepping up" is precisely the role that market participants in general now provide, without the associated excess returns being restricted to a small number of people with the privilege of an information advantage based on institutionalised physical proximity. Physical proximity still matters, but now it is the "virtual" physical proximity afforded by efficient electronic connections. It was ever thus.
Ken Yeadon (2011/10/21)