Observers of the slow slog toward an empirical test of larger tick sizes have raised concerns about the details of the three-track plan under consideration. In particular, an order-protection feature attached to one of the three "tracks" has raised the hackles of Larry Tabb and the STA.
Algorithmic and high-frequency trading
One of the unintended consequences of high-frequency trading may be that it forces a market re-design. Guest columnist Ginger Szala looks at the issues.
What is the real issue behind intermarket sweep orders, and the recent dust-up over an NYSE rules change? Faille answers: Privilege.
The most intriguing revelation in the exchange of briefs between the State of New York and Barclays appears in a humble footnote, where Barclays seems to concede that an employee was pressured to change an internalization number. But it was just the once....
Guest columnist and intrepid reporter Doug Friedenberg talks to Brad Katsuyama about HFT, Michael Lewis and more.
What do I mean by "front run," asked a reader. I use the term for a range of situations in which one party trades on the basis of advance [non-public] information of another party's upcoming trade, Faille replies.
Christopher Faille reviews the basic facts about SIP, the Securities Information Processor, and cites (with some incredulity) a new contention in some quarters that SIP isn't all that important because nobody really relies upon it.
It does appear that speed is helpful in generating alpha. How is it helpful? Here there are two views, and the less HFT-friendly of these views has received some scholarly/empirical support.
Neither liquidity nor a small spread is the be-all and end-all of markets. A spread is a sort of price and, like other prices, spreads can sometimes get too small because someone is cutting corners.
Who or what is responsible if an ATS' self-learning behavior drifts into terrain that, performed by a human, would constitute manipulative behavior? Does it matter that another algorithm has lately passed the Turing test?
HFTs and trading venues alike have worked hard to fit their practices into the Reg NMS framework. As a consequence, violations of NMS “are unlikely,” Dolgopolov writes, “to provide a basis for civil liability of HFTs who use such orders because of their compliance – however formalistic – with this regulatory norm.”
ESMA defines HFT as “a special class of algorithmic trading in which computers make decisions to initiate orders based on information that is received electronically, before human traders are capable of processing the information they observe and of taking a decision in relation thereto.” It then decides that this needs further definition.
After some preliminaries, McGonagle got around to the central subject of his testimony, the Concept Release on Automated Trading that the CFTC had issued back in September 2013. Much of his testimony was designed to give Congress an inkling of the range of reactions the CR has since elicited.
Consciousness did evolve. Why? I submit that there is no good answer to this question unless consciousness makes a difference to behavior. Trading and investing must be included as "behaviors" in that generalization.
Michael Lewis portrays Aleynikov, the Russian-born coder convicted of two counts of theft in 2010 and imprisoned, then released by decision of an appeals court two years later, as a central figure in this dramatic tale about high-frequency trading. Aleynikov is not one of the bad guys, as such: but he is a self-blinkered tool of the bad guys. Some sympathy is extended: not much.
The patent dispute at issue before the Supreme Court March 31st involved a computerized escrow system that serves as a third party to a deal, eliminating settlement risk. A business-method patent, in short: nothing at all to do with speed of execution, or data compression, or other such trading-infrastructure-related feats.
Guest columnist Ginger Szala on Michael Lewis' new book.
There is an arms-race aspect to the trend toward ever-higher speeds in trading, and this has created a "latency risk" in the markets that has an adverse impact on liquidity, a new study shows. This feeds into some ongoing arguments.
Cutting latency in any one layer is a task distinct from that of cutting it in any of the others. For the physical or interface layer (the ground floor of our ziggurat), optimization involves fiber optics and efficient queue management.
A forthcoming paper by Goldstein et al opens a window onto the convergence of two market-structure issues that, until quite recently, had not even been thought closely related.
The U.S. Supreme Court has now agreed to hear arguments about Alice, litigation that squarely raises a question with which lower courts have struggled ever since the Bilski decision in 2010 failed to offer them any guidance: is all software 'abstract' in the legal sense, and thus as such unpatentable? If not, then what is the legal sense of "abstract"?
The share of equity trading volume accounted for by dark pools has risen steadily in the U.S. in recent years. What have been the preconditions of this? What are its benefits, and what are its costs?
The key to understanding the mechanics of bitcoin is that at ten-minute intervals the 'miners' gather up the recent proposed transactions and try to add them to the block chain, the universal ledger of bitcoin transactions.
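The mining step can be sketched as a toy proof-of-work loop. This is an illustration only, not real Bitcoin code: the actual protocol double-hashes a binary block header against a 256-bit difficulty target, whereas the sketch below simply demands a few leading hex zeros.

```python
import hashlib
import json

def mine_block(prev_hash, transactions, difficulty=3):
    """Toy proof-of-work: find a nonce such that the hash of the
    candidate block starts with `difficulty` hex zeros. Once found,
    the block can be appended to the chain."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        # Serialize the candidate block deterministically.
        header = json.dumps(
            {"prev": prev_hash, "txs": transactions, "nonce": nonce},
            sort_keys=True,
        )
        digest = hashlib.sha256(header.encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

# Hypothetical genesis block and a made-up transaction list.
nonce, block_hash = mine_block("0" * 64, ["alice->bob: 1 BTC"])
print(nonce, block_hash)
```

Because the winning nonce can only be found by brute force, but anyone can verify it with a single hash, the ledger is expensive to rewrite and cheap to audit.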
A recent article on the use of catalytic events to predict volatility, written by Paul Rowady of the TABB Group, provides food for thought for derivative traders, crystal-ball gazers, and compliance officers alike.
Anshuman Jaswal, a senior analyst at Celent, has prepared a report on execution quality in the NYSE market that measures such quality along two axes: pricing and speed. Speed is straightforward; the metric for price as an execution issue is a bit trickier.
The CFTC has issued a thoroughly researched report, or 'concept release,' on algorithmic trading, high frequencies, and related issues in today's derivatives markets. Christopher Faille looks at the report and marvels at the term 'execution throttle,' employed for a monitoring system the report suggests.
The CFTC is said to be close to issuing a concept release on high-frequency trading, pushing the regulatory process beyond the agency's earlier talkfests. Christopher Faille muses about an approach the concept release will almost certainly not advocate.
Summary/excerpt: If Clark-Joseph is wrong in his worries about the "exploratory trading" of high frequency traders, he should be shown to be wrong with the use of facts and reason. He shouldn't be shushed, directly or indirectly.
MiFIR includes provisions that allow for "dark pools" and that limit the size -- or, if you will, the depth -- of such pools. On June 10, 2013, authorities in Brussels released a new proposal for tweaks of MiFIR in general and these provisions in particular.
With a wide range of troubling issues on my mind, I recently consulted a sage of the trading-algorithm world, Greg Woods, the head of algorithmic execution, listed derivatives and foreign exchange for Deutsche Bank Securities. He has more than twenty years of experience in the broad IT area.
Guest columnist Louis Lovas looks at data management in the world of algorithmic trading.
Dark trading in Australia is becoming more multilateral and 'market-like' over time, a task force has found. That doesn't sound especially alarming, but ASIC believes the situation may encourage breaches of the Market Integrity Rules and the Corporations Act.
The Germans seem prepared to experiment with limits on high-frequency trading, as we see in a recent Bundestag vote that leaves the particulars to BaFin. I spoke recently to David Weild, a former vice chairman of NASDAQ, about this experiment and about related issues.
Bank of England white paper on high-frequency trading yields little in the way of return.
The report seeks to alleviate certain concerns about both high-frequency and algorithmic trading. In particular, "The evidence available to this Project provides no direct evidence that computer-based HFT has increased volatility in financial markets." Still, the absence of evidence is not the evidence of absence, and the Project does acknowledge prudential concerns.
The real problem behind the 2010 flash crash, Hull says, is that again as in 1987 (in a different way of course) traders were working within a market structure that allowed “the illusion of liquidity” to displace the real thing. He cites an authority because, as he says, his firm, Ketchum, likes to stay close to the academic literature.
The new technologies make less difference than some might think. "Even back in the days of physical market makers, when things went bad, as for example in the crash of '87, the market makers would head for the hills," said James Angel. Nowadays the computers go dark. Or (worse) they don't.
There has been a perhaps-unexpected consequence of the disappearance of the old-fashioned floor traders. Floor trading used to serve as a training regimen, "from which many of the industry's leading discretionary traders originated." Without the floors, the talent pool has dried up.
What happened after decimalization? Spreads did fall, but these authors say that “displayed liquidity at the National Best Bid and Offer” also fell, at least in part because there were 100 price points for each dollar now, where once there had been eight or 16, so limit orders no longer come in clusters. This in turn made “pinging and sniffing for order flow” a lot easier, heralding the rise of the sort of algorithmic trading that is to the ordinary retail investor what a hawk is to a mouse.
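The arithmetic behind that claim can be sketched directly; the tick sizes below are the historical ones the authors cite (eighths, sixteenths, pennies), and the function name is illustrative.

```python
from fractions import Fraction

def price_points_per_dollar(tick):
    """Number of distinct quotable price levels within a
    one-dollar range at a given tick size."""
    return int(Fraction(1) / Fraction(tick))

# Pre-decimalization: ticks of 1/8, later 1/16, of a dollar.
eighths = price_points_per_dollar(Fraction(1, 8))      # 8 levels
sixteenths = price_points_per_dollar(Fraction(1, 16))  # 16 levels
# Post-decimalization (2001): a one-cent tick.
pennies = price_points_per_dollar(Fraction(1, 100))    # 100 levels

print(eighths, sixteenths, pennies)  # 8 16 100
```

With displayed limit orders scattered across 100 price levels rather than clustered on 8 or 16, the size shown at any single level, including the NBBO, is far thinner, which is the effect the authors describe.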
Here is a draft definition of high frequency trading presented to the CFTC on June 20. HFT is a form of automated trading that employs: (a) algorithms for decision making, order initiation, generation, routing, or execution, for each individual transaction without human direction; (b) low-latency technology that is designed to minimize response times, including proximity and co-location services; (c) high-speed connections to markets for order entry; and (d) high message rates (orders, quotes, or cancellations).
Quite aside from the neat through-the-planet short-cuts they might allow: how fast is a neutrino? This turns out to be a very controversial matter. Last year, scientists working at CERN set off weeks of feverish speculation with reports indicating that neutrinos travel faster than light. If I understand this at all, it would mean, if true, that a New York or London trader could in theory accept a Tokyo trader’s offer before the offer had actually been made. Now that would be the ultimate in HFT: negative latency.
Chief Judge Dennis Jacobs said that the statutory language refers on the one hand to products that have “already been introduced into [placed in] the stream of commerce” and on the other hand to those that “are still being developed or readied” [produced for] such placement. The words evoke two distinct sets of products with a sequential relationship to one another, which satisfies well-established rules of statutory construction. The district court had upheld the indictment against challenge along these lines, because the district court had construed the language to include the production of anything whose purpose is “to facilitate or engage in such commerce.” The appeals court panel found error here.
Cinnober has sold a customized form of its Scila Surveillance software -- a product designed to detect abnormal market behavior -- to the Qatar Exchange. One of the purposes of Scila Surveillance is the detection of harmful variants of algorithmic trading, such as the trading "snipers" who drive off market makers and reduce liquidity.
The Internet and technology have changed the face of trading forever. To get a glimpse of the future we sat down with the oracles of The Daily Delphi to see what the next newest thing might be.
Reliance on optimization tools that in turn rely on standard “user risk factors” will make factor alignment worse, caution three executives of Axioma. An optimizer will cherry-pick “the aspects of the model of expected returns that it deems desirable when gauged on the yardstick of marginal contribution to systemic risk.” This amounts to making, and betting on, the erroneous assumption that a lack of correlation with the risk factors used is a lack of systemic risk altogether.
Three lawyers with Covington & Burling write about the new, intensified scrutiny to which regulators are subjecting algorithmic and high-frequency trading. They place it in the context of an old dispute over what constitutes market manipulation. According to the broadest view, if a trader's 'sole intent' in making even a quite ordinary buy or sell order is to move the price, then the resulting trade is market manipulation.
Some managers of HFT or algorithmic funds must have felt some relief upon the arrest of Sergey Aleynikov in July 2009, his conviction in December 2010, or his imprisonment the following March. Programmers in the financial world were put on notice that criminal prosecution was among the possible consequences were they to treat their knowledge of their employer's edge as a marketable commodity. Thus, the news on Friday [February 17, 2012] that Aleynikov is now a free man came as something of a jolt.
Godfrey Cadogan links high-frequency trading, bubbles, and crashes in a single formula of extreme simplicity (or "parsimony," as Cadogan puts it), leaving our reporter wondering: does rendering facts as a formula make them clearer, or does it just create a misleading patina of precision? Emanuel Derman recently warned of the overly simple models of finance economists, and perhaps this is a new token of that type.
Celent says that both the IT world and the financial markets long assumed that the latter had to employ the former in a “batch mode.” A firm would process trades through the day, recording these in a database. Then at the end of the day, programs that could manipulate this data in search of exploitable patterns would operate on the day’s results as a batch. That is the mode of operation that “has become increasingly obsolete.”