Inequality in Equity Trading: How Did We Get Here?
Wed, 18 Apr 2012 08:57:46 GMT
(This article is part one of a four-part series called A Question of Fairness.)
Financial markets and the exchanges that support them depend on a public perception of fairness. This perception is essential to maintain the confidence of investors.
Today, however, there are a number of technological vulnerabilities in the market that give a few ultra-sophisticated traders a decided advantage over others. High-frequency trading, algorithmic trading, ultra-low-latency trading and co-location are the most obvious examples of this destabilizing trend.
In addition, given the current state of the network architecture that supports most of the major exchanges, it is now possible for unethical traders to use technology to delay, disrupt and gain unauthorized access to the orders of other buyers and sellers. Given the enormous profits that can be made, it is reasonable to assume that unscrupulous traders will take or already have taken advantage of this technological vulnerability.
Regardless of the scope and nature of the problems, it is the responsibility of self-regulating exchanges and regulatory bodies to correct any trading inequities as soon as they become apparent. Unfortunately, effective oversight of these very important issues is almost non-existent — for two basic reasons.
- The public is not yet fully aware of the transformative dangers posed to the market by inequitable practices.
- The current set of regulatory tools does not include a practical, economical solution.
There is, however, an efficient and cost-effective way for exchanges and regulators to address the problems noted above, provide a comprehensive audit trail for both regulatory bodies and the public at large and level the playing field for all traders and investors.
This solution — an Exchange Gateway Device based on proven network concepts and architecture — can ensure fairness and transparency, moderate volatility and support the original free market goals of equitable trading and efficient capital formation.
Fairness in the markets depends on equal access to information
If you visit the website of the Securities and Exchange Commission, you will find that the organization’s very existence depends on its ability to maintain equal access to information for all investors. An SEC article entitled "The Investor's Advocate" states the following:
“The laws and rules that govern the securities industry in the United States derive from a simple and straightforward concept: all investors, whether large institutions or private individuals, should have access to certain basic facts about an investment prior to buying it and so long as they hold it. To achieve this, the SEC requires public companies to disclose meaningful financial and other information to the public. This provides a common pool of knowledge for all investors to use to judge for themselves whether to buy, sell, or hold a particular security. Only through the steady flow of timely, comprehensive and accurate information can people make sound investment decisions.”
It’s impossible to overemphasize the importance of that last point: The markets need to ensure the transparency of price discovery, a responsibility that was re-affirmed by James Brigagliano, an Acting Director of the Division of Trading and Markets, in testimony to the Senate Banking Subcommittee on Oct. 28, 2009:
“As markets evolve,” he said, “the commission must continually seek to preserve the essential role of the public markets in promoting efficient price discovery, fair competition, and investor protection and confidence.”
But Brigagliano also addressed another issue that is having a transformative impact on the markets: technological change.
“The U.S. equity markets have undergone a transformation in recent years due in large part to technological innovations that have changed the way that markets operate,” he said. And later in his testimony, he added the following:
“An exchange brings together the orders of multiple buyers and sellers and is required to provide the best bid and offer prices for each stock that it trades, as well as last-sale information for each trade that takes place on that exchange. This information is collected and made public through consolidated systems that are approved and overseen by the SEC. Any investor in the United States can see the best quotation, and the last-sale price of any listed stock, in real time. This transparency is a key element of the national market system mandated by Congress.” (Emphasis added.)
In theory, Brigagliano’s testimony is relatively reassuring. Unfortunately, the critical trading information he describes is not available in real time. Nor is it available to all investors at the same time. The result is a disparity in information access that undermines the fairness of the market and provides an opening for manipulation and exploitation.
Recognizing the real-time myth
At most major exchanges, trading today is handled by computers that match buy and sell orders. And the speed of execution is almost beyond comprehension. Trades are now routinely executed in fewer than 200 microseconds. (A microsecond is a millionth of a second.) For comparison, the blink of an eye takes about 350,000 microseconds. That gives you some idea of the turbocharged speed of trading today.
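The arithmetic behind that comparison is worth making concrete. A minimal sketch, using only the figures cited above (200 microseconds per trade, roughly 350,000 microseconds per blink):

```python
# Illustrative arithmetic only; both figures are taken from the article above.
TRADE_US = 200       # typical trade execution time, in microseconds
BLINK_US = 350_000   # approximate duration of an eye blink, in microseconds

trades_per_blink = BLINK_US // TRADE_US
print(f"Trades executed in the blink of an eye: {trades_per_blink}")
```

In other words, an exchange's matching engine can complete on the order of 1,750 trades in the time it takes a human to blink.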
In addition to executing trades, high-speed computerized systems generate and distribute reports of trading activity that heavily influence buying and selling decisions. But it is important to remember that in each case, the communication is not instantaneous. It is not real-time.
As the information travels from sender to receiver, an infinitesimal but measurable amount of time elapses. And geographical distance is a key factor in the timeframe involved. That’s why it takes more time to send trading information from New York to London than it does from the Bank of New York Mellon at One Wall Street to the NYSE Euronext exchange down the block. And that communication takes more time than it does to send an order to the exchange from a broker located in the same building.
To a layman, the incredibly short delay between sending and receiving — a factor network computing experts call latency — may seem irrelevant. But in a world where an increasing number of trades are managed by computers using sophisticated algorithms to make unthinkably fast decisions, a few microseconds difference in price discovery or order execution could give one trader a significant and highly profitable advantage over another.
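The physics of this delay can be sketched in a few lines. The sketch below assumes signals travel through optical fiber at roughly two-thirds the speed of light, a commonly cited figure; the distances are rough illustrations, not measured route lengths (actual fiber paths are longer than straight-line distances):

```python
# Rough sketch of propagation latency. Assumptions: signals travel at
# ~2/3 the speed of light in optical fiber; distances are illustrative
# straight-line figures, not actual cable-route lengths.
SPEED_OF_LIGHT_KM_S = 299_792                      # km/s, in a vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3     # typical speed in fiber

def one_way_latency_us(distance_km: float) -> float:
    """One-way propagation delay in microseconds for a given fiber run."""
    return distance_km / FIBER_SPEED_KM_S * 1_000_000

for label, km in [("New York to London (~5,600 km)", 5_600),
                  ("Down the block (~0.2 km)", 0.2)]:
    print(f"{label}: {one_way_latency_us(km):,.1f} microseconds")
```

Even under these idealized assumptions, the transatlantic leg alone costs tens of thousands of microseconds, while a trader down the block pays roughly one — a gap many orders of magnitude larger than the time it takes to execute a trade.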
In a March 2 post on the Huffington Post Business blog entitled “Business at the Speed of Light: What Is a Millisecond Worth?” Tony Greenberg, CEO of RampRate, a consultant for IT and cloud computing sourcing decisions, captured his Chief Technology Officer’s view of the ultra-low-latency arms race:
“‘It can be difficult to imagine how milliseconds or nanoseconds of latency make a significant difference,’ says Internet technologist and RampRate CTO Steve Hotz, ‘but from the viewpoint of a data transaction making the trip hundreds or thousands of times, that incremental advantage can add up.’”
Understanding the problems caused by co-location
This critical split-second advantage is one of the reasons why high-frequency traders — who account for only about 2 percent of today’s 20,000 trading firms, according to a survey by the Aite Group — are so interested in placing their trading technology as close as possible to the supercomputing systems operated by the major exchanges.
This practice — called co-location — could involve a building close to the exchange or even space in the same building. In fact, some exchanges offer co-location services directly to their customers, a questionable activity that has raised concerns at regulatory agencies like the SEC.
According to a March 1 article in Wall Street & Technology, Nasdaq OMX established the lowest-latency route from the New York metro region to Brazil’s leading exchange in December 2011.
The new connection provided Nasdaq customers with a 2-millisecond round-trip data transmission advantage over the route operated by NYSE Euronext. The difference is less time than it takes for a housefly to flap a wing. But clearly it was meaningful to Nasdaq customers who are willing to pay for the benefits of co-location.
“When trading advantages are measured in mere thousandths or millionths of a second, co-location could be the difference between success and failure,” explained David S. Hilzenrath in a Feb. 22 article on HFT published in the Washington Post.
From a competitive standpoint, it makes perfect sense for the exchanges — many of which now operate as for-profit corporations — to offer the best possible services to their customers and to seek meaningful differentiation.
Nevertheless, the trend toward co-location and the dramatic increase in HFT pose significant threats to the fundamental principle of fairness that’s a professed goal of regulators and is essential to the long-term health of the financial markets.
In fact, in a 2011 report to Congress recommending major changes in the organization of the SEC, the Boston Consulting Group pointed out that today’s computerized, high-speed trading opens the door to market manipulation and potentially creates “an uneven playing field.” (The report was cited in the Washington Post article noted above.)
As you can see, this concern about inequitable trading is based on commonly recognized market practices. But there are other issues that threaten to undermine the operations of the major international exchanges. We’ll explore their potential impact in the next installment of A Question of Fairness.
This article originally appeared on the Tabb Forum: http:/