
Understanding Complex Event Processing, it's all about the Data

Thu, 25 Jul 2013 09:04:57 GMT           

For the first half of this year equity markets have been soaring on a sugar high; market indices regularly hit new highs, as exhibited by the Dow's 18 percent rise and the S&P 500's gain of more than 12 percent since the end of last year. The debate rages on over how long this bullish market will continue. There are numerous factors that could make this year different from the past three, ranging from the continuation of central bank easing policy to improved economic conditions. How do we know this? It's all in the data, as the major economic indicators and market indices are tracked, scrutinized and compared to past results.

Yet the undercurrent of the equity markets' exuberance is a continued downward trend in volumes and in the volatility traders love. U.S. equity trading volume across all the major exchanges has dropped around 7 percent so far in 2013. NYSE's volume composite index (MVOLNYE) has been on a slow slide reaching back into last year, down nearly 10 percent year over year. And while the VIX spiked above 20 in June, overall it too is at a six-year low. Again, how do we know this? It's all in the data, or more specifically the analysis of the data over time.

For the professional trader, volumes are a reflection of money flows; achieving margins hinges on total volume and a sprinkle of volatility, all the while maintaining an accurate audit trail of trading activity. With the third anniversary of the Flash Crash just behind us comes the crush of compliance, with increasing regulatory actions cascading from Dodd-Frank, the Consolidated Audit Trail (CAT) and, in the wake of Knight Capital's mishap, the SEC's recently proposed RegSCI (Regulation Systems Compliance and Integrity). We live under a cloud of market uncertainty, regulatory oversight and increasing competition. It is a new normal, a fait accompli that is shaping the future and forcing firms to elevate their game. And how do we know this? It's all in the data.

The new normal may represent a dearth of market activity, but it also makes it imperative that firms recognize that data's intrinsic value impacts the bottom line. Sluggish reactions to dynamic markets lead to missteps in business decisions, which can unknowingly result in risk-laden exposure. The challenge of the new normal in financial markets is the motivation to think outside the box in the hunt for alpha.

The disruptive power of innovation

Amid the cacophony of the algorithmic trading narrative unfolds the story of Complex Event Processing (CEP), a new breed of technology and a tool for understanding data. And understanding data is a game changer, one where quality is critical.

Data management takes center stage across the trade lifecycle, from market research through live trading and post-trade transaction cost analysis (TCA). Market data, whether years of captured history or a live stream, has been and will continue to be a primary business driver. CEP becomes an enabler of better business decisions through better data management and analysis.

CEP is a story of the disruptive power of innovation, and a natural segue to understanding data, specifically temporal analysis of time-series data. It excels at enforcing data consistency across trades, quotes, order books, executions, even news and social sentiment, which instills the trader confidence needed to ensure profit and minimize risk.

With so many liquidity sources, having a consistent and uniform data model across fragmented markets enables effective analysis for trade model design, statistical pattern analysis and understanding order book dynamics. This spans real-time, historical and contextual content; practically speaking, it is hard to separate them. The efficacy of CEP, while commonly understood to be real-time analytics, is wholly dependent on precedent established in historical data. This is based on the simple premise that the past can be a rational predictor of the future. And it starts with an understanding of what a time series is.
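As a rough illustration of what such a uniform data model might look like, here is a minimal sketch in Python; the venue-specific field names are hypothetical, not taken from any actual feed specification. The idea is simply that every liquidity source is mapped onto one record shape so downstream analysis can treat all venues identically.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class NormalizedQuote:
    """One uniform quote record, regardless of originating venue."""
    symbol: str
    venue: str
    timestamp: datetime   # normalized to UTC
    bid: float
    ask: float
    bid_size: int
    ask_size: int

def from_venue_a(raw: dict) -> NormalizedQuote:
    """Map a hypothetical venue-specific record onto the uniform model."""
    return NormalizedQuote(
        symbol=raw["sym"],
        venue="VENUE_A",
        timestamp=datetime.fromtimestamp(raw["ts_epoch"], tz=timezone.utc),
        bid=raw["bid_px"],
        ask=raw["ask_px"],
        bid_size=raw["bid_sz"],
        ask_size=raw["ask_sz"],
    )

With every source mapped onto one schema, trade-model design, pattern analysis and order-book studies can be written once and run against any venue's data.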

In techie-speak, a time series refers to data that has an associated time sequence, a natural ordering to its content, such as rates, prices, curves, dividend schedules, index compositions and so on. Time-series data is often of very high velocity; the UTP Quote Data Feed (UQDF) provides continuous time-stamped quotations from 13 U.S. market centers, representing literally hundreds of terabytes annually. The data's temporal ordering allows for distinct analysis, revealing unique observations and patterns and the possibility of predicting future values. Time series are often called data streams, which represent infinite sequences (i.e. computation that does not assume the data has an end), or simply real-time data, such as intra-day trades. CEP is a temporally sensitive programming paradigm designed for calculating and extracting meaningful statistics that are unique to and dependent on the data's temporal nature. This includes not just the notion of duration and windows of time, but also temporal matching logic of a fuzzy nature, such as matching trade prices to the nearest or prevailing quote.
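To make the prevailing-quote idea concrete, here is a minimal sketch in Python with pandas rather than a CEP engine, using illustrative timestamps and prices: each trade is matched to the most recent quote at or before its timestamp, an as-of join rather than an exact key match.

import pandas as pd

# Illustrative intra-day ticks for a single symbol.
trades = pd.DataFrame({
    "time":  pd.to_datetime(["2013-07-25 09:30:00.105",
                             "2013-07-25 09:30:00.530",
                             "2013-07-25 09:30:01.020"]),
    "price": [101.02, 101.05, 101.01],
})
quotes = pd.DataFrame({
    "time": pd.to_datetime(["2013-07-25 09:30:00.100",
                            "2013-07-25 09:30:00.500",
                            "2013-07-25 09:30:01.000"]),
    "bid":  [101.00, 101.03, 100.99],
    "ask":  [101.04, 101.07, 101.03],
})

# Prevailing-quote match: for each trade, take the latest quote whose
# timestamp is at or before the trade's timestamp (a fuzzy temporal join).
matched = pd.merge_asof(trades, quotes, on="time", direction="backward")
print(matched)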

Consider the scenario where there is a need to understand historic price volatility in order to determine accurate statistical thresholds for future price movements. It's not simply a matter of detecting price spikes but of discerning when they occur, for how long, and when a high (or low) threshold is crossed. It is CEP's intrinsic sense of time that makes it uniquely suited to analyzing time series for achieving data consistency, the foundation for accurate trade decisions. Consistency is also about eliminating anomalous and spurious conditions, bad ticks if you will. But the trick is telling a bad tick from a good one. Historical precedent, ranging from the last millisecond to the previous year, provides the benchmark for the norm and the means to recognize deviations. CEP's analytical effectiveness is relative to the depth of the data set: the further back you look, the more confidence can be achieved going forward. Of course, this assumes that the future behaves like the past, which is the basis for back-testing algorithmic trading models.
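A rough sketch of that kind of check, again in Python with pandas rather than a CEP engine (the window length and sigma multiple are arbitrary illustrative choices), flags ticks whose move exceeds a threshold derived from a rolling window of prior returns.

import pandas as pd

def flag_bad_ticks(prices: pd.Series, window: int = 100,
                   n_sigmas: float = 4.0) -> pd.Series:
    """Return a boolean series marking prices whose tick-to-tick move
    breaches a volatility threshold built from historical returns.

    prices   -- tick-by-tick prices indexed by timestamp
    window   -- number of prior ticks used to establish the norm
    n_sigmas -- how many standard deviations count as anomalous
    """
    returns = prices.pct_change()
    # Historical precedent: rolling mean and standard deviation of past
    # returns, shifted by one so each tick is judged only on earlier data.
    mu = returns.rolling(window).mean().shift(1)
    sigma = returns.rolling(window).std().shift(1)
    return (returns - mu).abs() > n_sigmas * sigma

A deeper history simply widens the window from which the norm is drawn, which is exactly the "further back you look" point above.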

It's all about the Data, all in good time

Data can be an ally for back-testing, simulation, valuation, compliance, benchmarking and numerous other business-critical decisions. It is the fodder for understanding the global economy and the markets. The natural temporal ordering of time-series data invites analysis distinct from any other and has given rise to a whole field of study and discourse. For understanding complex event processing, it's all in the data.

A revision of this article first appeared in Futures Magazine, July 2013.

Once again thanks for reading.
Louis Lovas

For an occasional opinion or commentary on technology in Capital Markets, you can follow me on Twitter, here.
