
Why we all should be following high frequency trading (and how to make the most of the robot revolution)

High frequency trading keeps showing up in the news. We’ve seen flash crashes, system inefficiencies, and most recently, as Matthew O’Brien outlines in his recent piece on The Atlantic’s blog, data sources selling early information to these traders.

We should be concerned about wasteful speculation, and the specter of super-fast, quant-designed algobots skimming profits from ‘ordinary’ investors naturally raises hot-button questions in a climate of concern over financial markets.

But high frequency trading, and the various proposals to address it, are equally valuable as a case study in how we interact and govern in a world increasingly enabled by digital and intelligent machines.


Speaking of ‘ordinary’ traders: a sleek new design for the Bloomberg terminal from Colin P Kelly et al. Photo: colinpkelly.com

O’Brien highlights recurring news of HFT firms buying corporate and economic information from third parties fractions of a second ahead of the public release, giving them time to trade on the data before the rest of the market. I would tend to agree with O’Brien that this differs only in style and timescale (and current legality) from a company CFO selling earnings data to a trader the day before the public report. The fact that an illicit insider gain that used to take several days can now be made in a fraction of a second is beside the point. (Business Wire announced Thursday it would stop selling its reports directly to HFT firms.) And technology cuts both ways: some have argued that certain kinds of traditional insider trading might become moot if we updated the way companies report information to a more continuous stream that better aligns with today’s information technology.

Using data sold a second ahead is just an easier-to-visualize version of what some types of HFT are designed to do routinely, however. Acting on a millisecond timescale, high frequency trading algorithms – what finance writer Felix Salmon has called ‘algobots’ – are designed to jump in between ‘regular’ buy and sell bids as they travel between exchanges and capture the gains (so-called “latency arbitrage”). While speed and automation can certainly improve the functioning of the market, several recent studies have found that much HFT does not; instead, it reduces efficiency and profits in the market overall. In addition, the volume and speed of HFT algobots’ machine-to-machine decisions can amplify errors into ‘irrational’ feedback loops and brief but wild market swings – so-called “flash crashes”.
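To make “latency arbitrage” concrete, here is a minimal Python sketch. It is not the studies’ model or any real trading system; the exchange names, prices, and latencies are all invented for illustration. The point is simply that a trader who learns of a price move before a slower venue’s quote updates can buy at the stale price and resell at the new one, essentially risk-free.

```python
# Hypothetical illustration of latency arbitrage across two venues.
# All names, prices, and latencies here are invented for this sketch.

new_bid_on_A = 100.05    # exchange A's bid after fresh news moves the price
stale_ask_on_B = 100.00  # exchange B still quotes the pre-news ask

bot_latency_ms = 0.1     # co-located algobot sees A's move almost instantly
quote_sync_ms = 2.0      # time before B's quote catches up with A's

if bot_latency_ms < quote_sync_ms:
    # Inside that ~1.9 ms window, buy B's stale ask and sell into A's new bid.
    profit_per_share = new_bid_on_A - stale_ask_on_B
    print(f"window: {quote_sync_ms - bot_latency_ms:.1f} ms, "
          f"riskless profit: ${profit_per_share:.2f}/share")
else:
    print("no window: quotes update before the bot can act")
```

Multiplied across thousands of stocks and millions of quote updates a day, those fractions of a cent are the “gains” being skimmed from slower orders.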

For all these reasons there are calls to regulate.  O’Brien calls for a financial transactions tax. But this is where things get interesting.

There are many good arguments to be made for an FTT. Such a tax could be a fair and effective instrument to retard some speculative trading and/or to generate revenue to address the various public costs of unproductively short-term, risky, or speculative behavior in financial markets. But it may be a blunt instrument for addressing predatory high frequency trading per se.

By contrast, this past summer two sets of researchers from different disciplines – economics and computer science, respectively – separately demonstrated the market loss resulting from HFT. They then proposed a technological fix to the exchanges instead of a tax: rather than trading continuously, the exchange system would move to a ‘discrete time market’ or ‘frequent batch auction’ in which buy and sell orders clear at discrete intervals of less than a second. This is plenty fast for market liquidity and better tracks “real” changes in price, but it virtually eliminates the opportunity for latency arbitrage. These studies are a great read, and their technological fix is elegant. In October, IEX, an “upstart” stock exchange, launched using just this type of technology.
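For intuition, here is a toy Python sketch of a uniform-price batch auction in the spirit of those proposals. This is my simplification, not the papers’ mechanism or IEX’s actual matching logic: all orders arriving during an interval are pooled and clear together at one price, so a few milliseconds of speed advantage buys no priority.

```python
# Toy frequent batch auction: a simplified, illustrative mechanism,
# not the actual matching rules of any real exchange.

def clear_batch(buys, sells):
    """Clear one interval's orders at a single uniform price.

    buys/sells are lists of (limit_price, quantity). Returns
    (clearing_price, volume), or None if the book doesn't cross.
    """
    # Expand orders to per-share prices for simplicity.
    bids = sorted((p for p, q in buys for _ in range(q)), reverse=True)
    asks = sorted(p for p, q in sells for _ in range(q))
    volume = 0
    while (volume < min(len(bids), len(asks))
           and bids[volume] >= asks[volume]):
        volume += 1
    if volume == 0:
        return None
    # Price the whole batch at the midpoint of the marginal matched pair.
    price = (bids[volume - 1] + asks[volume - 1]) / 2
    return price, volume

# Orders arriving anywhere in the same (say, 100 ms) interval are pooled
# and clear together; arriving 2 ms earlier confers no priority.
price, volume = clear_batch(buys=[(100.05, 10)], sells=[(100.00, 10)])
print(f"cleared {volume} shares at {price:.3f}")  # cleared 10 shares at 100.025
```

Because everyone in the interval gets the same price, competition shifts from speed back to price – which is exactly the property the batch-auction proposals emphasize.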

My guess is that some combination of tax and technology policy is probably ideal. But whether or not we’re concerned with financial markets, it’s worth reading the HFT studies and keeping an eye on the progress of IEX, because finance is also a great place to explore a world in which we’re all already cyborgs of a sort, grappling with the challenge that poses to work, ethics, and government.

For example, in assessing the impacts of HFT, the authors of these studies reach inside the technobabble, explore what the algorithms actually do, and quantify their effects in the non-digital world. Then they propose a tailored fix. We should take notice, because it’s not just HFT algobots that are running about. Our lives are increasingly shaped by a host of algorithms, bots, and machine-to-machine communications whose specifics and impacts are completely nontransparent. Where bank managers once sat down with a computer and a set of statistics and prejudices to red-line ‘no loan’ districts, companies and search engines now routinely offer you and your neighbor different products at different prices, or not at all. Why, how, and to what effect? We won’t know unless we make this kind of systematic inquiry routine.

Second, and more philosophically, they propose a somewhat counter-intuitive solution to the critical current challenge of digital speed and information. After centuries in which decision-making was generally improved by more and faster data, we’ve reached an inflection point where we have so much data and speed that they can degrade the efficiency with which we understand and act. Learning to harness big data is one of the most exciting opportunities today, and our usual inclination is to build an additional tool or algorithm to help us understand, understand faster, or identify errors. The discrete time markets proposal does the opposite. It slows down action in the market to the speed (still very fast) at which real, information-rich decisions can be made, whether those decisions are made by humans or bots. Of course the “speed of reality” (as Michael Wellman puts it in his blog post on the University of Michigan HFT research he led) – the speed at which we generate sound, actionable information – is different in every network and has different consequences. In the HFT case, trading churn that adds nothing to price discovery likely costs the stock market billions every year. Actions taken too quickly on crowd-sourced data during the Boston bombing, by contrast, fingered innocent bystanders as criminals.

And finally, back to the robot revolution. There’s hardly any trader left today, not even grandmoms and pops, who makes trading decisions without a computer. And when we talk about “ordinary investors” above, what we mean are hedge funds and institutional investors whose trading desks are profoundly digitally connected and running analytical software rife with algorithms of their own. So when algobots talk to each other, execute high frequency trades without much adult supervision, and are prone to irrational swings, are they just acting like regular traders? Who’s best placed to regulate, then – human or machine? And does any of this affect the underlying direction and value of our investments?

Debating HFT reminds us that it is worth continuously asking whether our current frameworks of oversight are sufficient to capture the unique problems and opportunities that arise from machine-to-machine decision-making. At the same time, the larger challenge is ensuring that our activities and our regulatory approaches (whether carried out by humans or bots) accord with the societal objectives we wish to achieve. And that’s our task as citizens.

————————

For fabulous further reading….

Felix Salmon puts the race for (and perhaps past) better market information in historical context: http://blogs.reuters.com/felix-salmon/2012/10/06/the-problem-with-high-frequency-trading/

Is HFT wasteful?  And how to fix it:

Latency Arbitrage, Market Fragmentation, and Efficiency: A Two-Market Model, from Michael Wellman and Elaine Wah of the University of Michigan’s Strategic Reasoning Group, and Wellman’s article on the same: Trading faster than the speed of reality

The High-Frequency Trading Arms Race: Frequent Batch Auctions as a Market Design Response, from Eric Budish et al. at the University of Chicago Booth School of Business, and the blog post on their study: An alternative to high frequency trading

Research firm Nanex’s now-famous clip of what happens in ½ second of trading in a single stock, shown at original and slowed-down speeds; and Mark Gongloff’s discussion of what’s happening in it (albeit with very colorful adjectives).
