Cutting latency in any one layer is a task distinct from that of cutting it in any of the others. For the physical or interface layer (the ground floor of our ziggurat), optimization involves fiber optics and efficient queue management.
Algorithmic and high-frequency trading
A forthcoming paper by Goldstein et al. opens a window onto the convergence of two market-structure issues that, until quite recently, were seldom thought of as related.
The U.S. Supreme Court has now agreed to hear arguments in Alice, litigation that squarely raises a question with which lower courts have struggled ever since the Bilski decision in 2010 failed to offer them any guidance: is all software 'abstract' in the legal sense, and thus as such unpatentable? If not, what is the legal sense of 'abstract'?
The share of equity trading volume accounted for by dark pools has risen steadily in the U.S. in recent years. What have been the preconditions of this rise? What are its benefits, and what are its costs?
The key to understanding the mechanics of bitcoin is that at roughly ten-minute intervals the 'miners' gather up the recently proposed transactions and try to add them to the block chain, the universal ledger of bitcoin transactions.
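A minimal sketch of that mining step may make the mechanics concrete. This is an illustration only, not Bitcoin's actual implementation: real mining hashes a binary 80-byte block header against a 256-bit difficulty target rather than checking a hex prefix, and the function and field names here are invented for the example.

```python
import hashlib

def mine_block(prev_hash: str, transactions: list, difficulty: int = 2):
    """Toy proof-of-work loop: vary a nonce until the double-SHA-256
    digest of the candidate block starts with `difficulty` zero hex
    digits (a stand-in for the network's real difficulty target)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        header = f"{prev_hash}|{'|'.join(transactions)}|{nonce}".encode()
        digest = hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()
        if digest.startswith(target):
            # A valid nonce is found; the block can be appended to the chain.
            return nonce, digest
        nonce += 1
```

Raising `difficulty` by one hex digit makes a valid nonce roughly sixteen times harder to find, which is the lever the real network turns to hold block production near one block every ten minutes.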
A recent article on the use of catalytic events to predict volatility, written by Paul Rowady of the TABB Group, provides food for thought for derivative traders, crystal-ball gazers, and compliance officers alike.
Anshuman Jaswal, senior analyst at Celent, has prepared a report on execution quality in the NYSE market that measures such quality along two axes: pricing and speed. Speed is straightforward; the metric for price as an execution issue is a bit trickier.
The CFTC has issued a thoroughly researched report, or 'concept release,' on algorithmic trading, high frequencies, and related issues in today's derivatives markets. Christopher Faille looks at the report and marvels at the term 'execution throttle,' employed for a monitoring system the report suggests.
The CFTC is said to be close to issuing a concept release on high-frequency trading, pushing the regulatory process beyond the agency's earlier talkfests. Christopher Faille muses about an approach the concept release will almost certainly not advocate.
Summary/excerpt: If Clark-Joseph is wrong in his worries about the "exploratory trading" of high frequency traders, he should be shown to be wrong with the use of facts and reason. He shouldn't be shushed, directly or indirectly.
MiFIR includes provisions that allow for "dark pools" and that limit the size -- or, if you will, the depth -- of such pools. On June 10, 2013, authorities in Brussels released a new proposal for tweaks of MiFIR in general and these provisions in particular.
With a wide range of troubling issues on my mind, I recently consulted a sage of the trading-algorithm world, Greg Woods, the head of algorithmic execution, listed derivatives and foreign exchange for Deutsche Bank Securities. He has more than twenty years of experience in the broad IT area.
Guest columnist Louis Lovas looks at data management in the world of algorithmic trading.
Dark trading in Australia is becoming more multilateral and 'market-like' over time, a task force has found. That doesn't sound especially alarming, but ASIC believes the situation may encourage breaches of the Market Integrity Rules and the Corporations Act.
The Germans seem prepared to experiment with limits on high-frequency trading, as we see in a recent Bundestag vote that leaves the particulars to BaFin. I spoke recently to David Weild, a former vice chairman of NASDAQ, about this experiment and about related issues.
Bank of England white paper on high-frequency trading yields little in the way of return.
The report seeks to alleviate certain concerns about both high-frequency and algorithmic trading. In particular, "The evidence available to this Project provides no direct evidence that computer-based HFT has increased volatility in financial markets." Still, the absence of evidence is not the evidence of absence, and the Project does acknowledge prudential concerns.
The real problem behind the 2010 flash crash, Hull says, is that once again, as in 1987 (in a different way, of course), traders were working within a market structure that allowed “the illusion of liquidity” to displace the real thing. He cites an authority because, as he says, his firm, Ketchum, likes to stay close to the academic literature.
The new technologies make less difference than some might think. "Even back in the days of physical market makers, when things went bad, as for example in the crash of '87, the market makers would head for the hills," said James Angel. Nowadays the computers go dark. Or (worse) they don't.
There has been a perhaps-unexpected consequence of the disappearance of the old-fashioned floor traders. Floor trading used to serve as a training regimen, "from which many of the industry's leading discretionary traders originated." Without the floors, the talent pool has dried up.
What happened after decimalization? Spreads did fall, but these authors say that “displayed liquidity at the National Best Bid and Offer” also fell, at least in part because there were now 100 price points for each dollar, where once there had been eight or 16, so limit orders no longer came in clusters. This in turn made “pinging and sniffing for order flow” a lot easier, heralding the rise of the sort of algorithmic trading that is to the ordinary retail investor what a hawk is to a mouse.
Here is a draft definition of high frequency trading presented to the CFTC on June 20. HFT is a form of automated trading that employs: (a) algorithms for decision making, order initiation, generation, routing, or execution, for each individual transaction without human direction; (b) low-latency technology that is designed to minimize response times, including proximity and co-location services; (c) high-speed connections to markets for order entry; and (d) high message rates (orders, quotes, or cancellations).
Quite aside from the neat through-the-planet short-cuts they might allow: how fast is a neutrino? This turns out to be a very controversial matter. Last year, scientists working at CERN set off weeks of feverish speculation with reports indicating that neutrinos travel faster than light. If I understand this at all, it would mean if true that a New York or London trader could in theory accept a Tokyo trader’s offer before the offer had actually been made. Now that would be the ultimate in HFT: negative latency.
Chief Judge Dennis Jacobs said that the statutory language refers on the one hand to products that have “already been introduced into [placed in] the stream of commerce” and on the other hand to those that “are still being developed or readied” [produced for] such placement. The words evoke two distinct sets of products with a sequential relationship to one another, which satisfies well-established rules of statutory construction. The district court had upheld the indictment against challenge along these lines, because it had construed the language to include the production of anything whose purpose is “to facilitate or engage in such commerce.” The appeals court panel found error here.
Cinnober has sold a customized form of its Scila Surveillance software -- a product designed to detect abnormal market behavior -- to the Qatar Exchange. One of the purposes of Scila Surveillance is the detection of harmful variants of algorithmic trading, such as the trading "snipers" who drive off market makers and reduce liquidity.
The Internet and technology have changed the face of trading forever. To get a glimpse of the future we sat down with the oracles of The Daily Delphi to see what the next newest thing might be.
Reliance on optimization tools that in turn rely on standard “user risk factors” will make factor alignment worse, caution three executives of Axioma. An optimizer will cherry-pick “the aspects of the model of expected returns that it deems desirable when gauged on the yardstick of marginal contribution to systemic risk.” This amounts to making, and betting on, the erroneous assumption that a lack of correlation with the risk factors in use is a lack of systemic risk altogether.
Three lawyers with Covington & Burling write about the newly intensified scrutiny to which regulators are subjecting algorithmic and high-frequency trading. They place it in the context of an old dispute over what constitutes market manipulation. According to the broadest view, if a trader's 'sole intent' in making even a quite ordinary buy or sell order is to move the price, then the resulting trade is market manipulation.
Some managers of HFT or algorithmic funds must have felt some relief upon the arrest of Sergey Aleynikov in July 2009, his conviction in December 2010, or his imprisonment the following March. Programmers in the financial world were put on notice that criminal prosecution was among the possible consequences were they to treat their knowledge of their employer's edge as a marketable commodity. Thus, the news on Friday [February 17, 2012] that Aleynikov is now a free man came as something of a jolt.
Godfrey Cadogan's formula, linking high-frequency trading, bubbles, and crashes in a single expression of extreme simplicity (or "parsimony," as Cadogan puts it), leaves our reporter wondering: does the rendering of facts as a formula make them clearer, or does it just create a misleading patina of precision? Emanuel Derman recently warned of the overly simple models of finance economists, and perhaps this is a new token of that type.
Celent says that both the IT world and the financial markets long assumed that the latter had to employ the former in a “batch mode.” A firm would process trades through the day, recording these in a database. Then at the end of the day, programs that could manipulate this data in search of exploitable patterns would operate on the day’s results as a batch. That is the mode of operation that “has become increasingly obsolete.”
A recent meeting of an advisory group of the CFTC discussed the proper definition for high frequency trading, on the premise that only once a definition is in place can there be focused monitoring of the consequences of HFT.
Could it be that truth is finally stranger than science fiction? The next battle of the quantitative trading strategies may offer some algorithmic features that look downright qualitative.
Spread capture is a percentage of the bid-ask spread, so that if an algorithm always accepts the outstanding offer when buying -- if it is always the one to cross -- it will capture 0 percent of the spread. If a deal is concluded on its own bid, determined through limit orders, on the other hand, it captures 100 percent of the spread. The spread capture metric is popular, the Pragma paper says, because it “reflects a widespread assumption that the higher the number the better the performance of the algorithm.”
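The metric as described can be expressed in a few lines. The function below is a sketch under the convention stated above (buy at the ask captures 0 percent, buy at one's own bid captures 100 percent); it is our illustration, not code from the Pragma paper.

```python
def spread_capture(fill_price: float, bid: float, ask: float,
                   side: str = "buy") -> float:
    """Fraction of the bid-ask spread captured by a single fill.

    A buy that crosses (fills at the ask) captures 0.0; a buy filled
    at the algorithm's own bid captures 1.0. Sells mirror this: a sell
    that hits the bid captures 0.0, a sell at the ask captures 1.0.
    """
    spread = ask - bid
    if side == "buy":
        return (ask - fill_price) / spread
    return (fill_price - bid) / spread
```

By this convention a buy filled at the midpoint, say at 10.01 against a 10.00/10.02 market, captures 50 percent of the spread.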
It is by now common knowledge that high-frequency trading accounts for 70 to 75% of market trading in North America and western Europe. And it helps to remember that the primary usefulness of algorithmic trading is to provide liquidity to the investor class. We presume that the distinction of said class is that its investment [...]
By Christopher Faille Charles Jones of Columbia Business School made a presentation at an SIFR event in Stockholm, titled “What do we know about algorithmic and high-frequency trading?” Although a distinguished professor like Jones would not put the matter this way, his thoughts do have me thinking of the old human-inhabited trading floors, and their [...]
Dennis Lohfert, founder of Ion Asset Architecture, discusses quantitative trading strategies and how they are affected by current market conditions.
AllAboutAlpha.com interviewed Arzhang Kamarei, a partner at Tradeworx, a quantitative investment management firm with expertise in high-frequency and medium-frequency equity market-neutral strategies.
By Christopher Faille It is possible to be nostalgic about the era before computerized trading, and thus before high frequency trading (HFT), when the transmission of orders took place by means of a human voice over a landline telephone. But we must qualify any such nostalgia. Consider that the market makers at Nasdaq as recently [...]
In the alpha world, skill often depends on the support of other factors. Some call them pedigree, process and performance, and rank them in that order. There's only one problem: despite the best pedigree, and the best process, performance can suffer tremendously when something unpredictable occurs – a black swan or tail risk.
Hamlin Lovell reports from The Battle of the Quants 2011.
Carrier pigeons and post roads gave way to telegraph lines, then telephones, then networks of computers. In our own century automation has gone so far that actual human beings on an exchange floor seem destined to share the obsolescence of those pigeons.