New book

The War Among Algorithms


July 1, 2021

Twenty years ago, a financial trader was still usually a human being, either sharing a trading pit with dozens or hundreds of other sweaty human bodies, or sitting at a computer terminal, talking into a telephone and buying and selling with keyboard and mouse. A decade later, digital algorithms had made decisive inroads into trading, but those algorithms still mostly ran on conventional computer systems. Nowadays, a trader is very often a specialised silicon chip known as an FPGA, or field-programmable gate array: a large, square chip of the kind pictured in the original post, coated with white paste that had held a cover in place.

The FPGAs that do so much of today’s trading are mainly to be found in about two dozen anonymous, warehouse-like buildings in and around Chicago, New York, London, Frankfurt and other major global financial centres. To walk through one of these computer datacentres is to listen to the hum of tens of thousands of computer servers in row upon endless row of metal cages and to glimpse the incomprehensible spaghetti of cables that interconnect the machines packed into those cages. When I first did so, in October 2014, I was still struggling to find a way of understanding the complex new world of ultrafast trading algorithms that was evolving.

I’ve gradually come to realise that one way of making sense of it is to focus on two of the different species of trading algorithm that are run on FPGAs and other forms of ultrafast computer hardware. One species is ‘market-making’ algorithms. Their chief characteristic is that they continuously bid to buy the stock or other financial instrument being traded and also continuously offer to sell it, at a marginally higher price. Consider, for example, the trading of a US stock. At almost any point in time, there will be an array of bids to buy it at, for instance, $31.48, $31.49 and $31.50, and a corresponding array of offers to sell at $31.51, $31.52, and so on.
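
To make that structure concrete, here is a minimal sketch in Python of such a two-sided book and a market-maker’s quote. The prices follow the example above; the sizes, and the code itself, are purely illustrative.

```python
# A minimal sketch of a two-sided order book and a market-making quote.
# Prices follow the example in the text; sizes are invented, and real
# books are far deeper and change millions of times a day.

from dataclasses import dataclass

@dataclass
class Quote:
    price: float  # in dollars
    size: int     # number of shares

# Bids (to buy), best price first, and offers (to sell), best price first.
bids = [Quote(31.50, 100), Quote(31.49, 300), Quote(31.48, 500)]
offers = [Quote(31.51, 100), Quote(31.52, 200), Quote(31.53, 400)]

# A market-making algorithm quotes on both sides at once, hoping to buy
# at its bid and sell at its marginally higher offer.
my_bid = Quote(31.50, 100)    # join the best bid
my_offer = Quote(31.51, 100)  # join the best offer

spread = round(my_offer.price - my_bid.price, 2)
print(f"Quoted spread: ${spread:.2f} per share")  # $0.01
```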

Many, perhaps most, of those bids and offers will have been placed by market-making algorithms. An example of a successful operation by such an algorithm would be to buy a hundred shares (a standard size of trade in US stocks) at $31.50 and to sell them at $31.51. With a profit per share traded of as little as a single cent, a trading firm that specialises in market-making needs to buy and sell on a huge scale to earn any substantial amount of money. The largest such firms trade hundreds of millions or even billions of shares every day, which is one reason the activity is often called ‘high-frequency trading’ or HFT.
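
A back-of-the-envelope calculation shows why scale matters. The one-cent profit and hundred-share trade size come from the text; the daily volume is an assumed round number, not a figure for any particular firm.

```python
# Back-of-the-envelope economics of market-making at a one-cent spread.
# The per-share profit and hundred-share trade size come from the text;
# the daily volume is an assumed round number, and the result is gross,
# before any of the (considerable) costs of doing this business.

profit_per_share = 0.01       # buy at $31.50, sell at $31.51
shares_per_trade = 100        # a standard size of trade in US stocks
shares_per_day = 500_000_000  # assumption: a large firm's daily volume

print(f"Profit on one round trip: ${profit_per_share * shares_per_trade:.2f}")
print(f"Gross profit per day:     ${profit_per_share * shares_per_day:,.0f}")
```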

The second species of algorithm would be referred to by market participants as ‘taking’, ‘removing’ or ‘aggressive’. Like their market-making cousins, these algorithms constantly process data on price movements, but unlike them they aren’t always trying to trade. They wait until their calculations identify a probable profit opportunity and then they pounce. If, for example, that opportunity involves buying shares, a ‘taking’ algorithm will typically do so by snapping up as many as it can of the offers from market-making algorithms. Should the taking algorithm’s calculations be correct, prices will rise, and – perhaps after a few seconds, or maybe even a few minutes, which is a long time in today’s frantic trading – it will then sell the shares it has just bought. The market-making algorithms from which the taking algorithm has bought those shares have been left nursing a loss. They don’t, however, succumb to the dangerous human temptation to hold on to a trading position in the hope that the loss can be recovered. Unemotionally, they will seek to close that position by buying shares even at the new higher price and even from the ‘taking’ algorithm that has, in market parlance, just ‘picked them off’ or ‘run them over’.
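
In stylised form, the ‘taking’ logic described above might look something like this. The signal, its threshold and the resting offers are all invented for illustration.

```python
# Stylised 'taking' logic: wait until a signal predicts a price rise,
# sweep the resting offers, and plan to sell once the move has happened.
# The signal, its threshold and the offers are invented for illustration.

from dataclasses import dataclass
from typing import List

@dataclass
class Offer:
    price: float
    size: int

def predicted_move_is_large(signal_value: float, threshold: float = 0.02) -> bool:
    """Placeholder for the algorithm's signal calculation."""
    return signal_value > threshold

def take(offers: List[Offer], signal_value: float) -> List[Offer]:
    """Snap up resting offers, cheapest first, when the signal fires."""
    if not predicted_move_is_large(signal_value):
        return []  # no probable profit opportunity: keep waiting
    return sorted(offers, key=lambda o: o.price)  # buy everything on offer,
                                                  # then sell at the higher price

offers = [Offer(31.52, 200), Offer(31.51, 100)]
bought = take(offers, signal_value=0.05)
print(f"Swept {sum(o.size for o in bought)} shares")  # 300
```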

The divide between market-making and taking is not absolute – for example, a fast market-making algorithm can ‘pick off’ its slower siblings – but individual traders, groups of traders and sometimes even entire firms tend to specialise in one or the other. Some of them view the divide as moral. ‘I tend to want to work at [HFT] companies that are “makers”,’ one of them said to me, ‘because I see the inherent evil in the “takers”.’ Because other market participants’ bids and offers tend to arrive only sporadically, financial markets have long been populated by human market-makers, who make their money by enabling those who wish to transact immediately to do so. Algorithmic market-making thus inherits the legitimacy of a traditional human role.

In my experience, only a minority of people in HFT firms, even on the market-making side of the divide, say that one side is more moral than the other. I have, however, found widespread acknowledgment of the way in which the competition between market-making and ‘taking’ algorithms gives rise to ferocious speed races. The ‘signals’ (data patterns) that inform algorithms’ trading decisions are often surprisingly simple and awareness of those patterns is commonplace in the world of HFT. The archetypal signal, focused on in insightful work by the University of Chicago economist Eric Budish and colleagues, is a sharp move in the price of share-index futures, which in the US are traded in a datacentre in the suburbs of Chicago. Such a move is almost always followed, a tiny fraction of a second later, by a change in the same direction in the prices of the underlying shares and of exchange-traded funds, which are shares whose prices track stock market indexes. When the prices of share-index futures change in Chicago, a speed race immediately breaks out in the New Jersey datacentres that host the trading of shares and of these exchange-traded funds. Market-making algorithms rush to cancel bids or offers whose prices are now out-of-date, while taking algorithms dash to execute against those ‘stale’ price quotes before they are cancelled. Winning or losing this race can be a matter of nanoseconds (billionths of a second).
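
A toy model of that race: both the market-maker’s cancellation and the taker’s aggressive order travel to the same New Jersey matching engine, and whichever arrives first wins. The latency figures below are invented; only the nanosecond scale reflects the text.

```python
# A toy model of the cancel-versus-take race. When index futures jump in
# Chicago, a market-maker rushes to cancel its stale offer in New Jersey
# while a taker rushes to hit it; whichever message reaches the matching
# engine first wins. All latency figures below are invented.

maker_reaction_ns = 350    # maker decides to cancel its stale offer
taker_reaction_ns = 345    # taker decides to hit that stale offer
shared_path_ns = 20_000    # assumed identical network path for both

maker_arrival = shared_path_ns + maker_reaction_ns
taker_arrival = shared_path_ns + taker_reaction_ns

if taker_arrival < maker_arrival:
    print("Taker wins: the stale offer is 'picked off'.")
else:
    print("Maker wins: the quote is cancelled in time.")
print(f"Margin: {abs(maker_arrival - taker_arrival)} nanoseconds")
```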

There are also speed races among market-making algorithms. Most of today’s automated trading of financial instruments involves electronic queues. For example, a newly arrived bid to buy a stock at $31.50 will be executed only when earlier bids have either been executed or cancelled. Since market-making algorithms need their bids and offers to be executed in order to make any money, they want to be as close to the head of the queue as possible. Being at the back of the queue, in contrast, is dangerous: those bids or offers are at higher risk of being picked off.
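
A minimal sketch of that price-time priority, using a simple first-in, first-out queue; the order identifiers are invented.

```python
# Price-time priority in miniature: at a single price level, bids are
# filled in order of arrival, so a newly arrived bid waits at the back.
# The order identifiers are invented.

from collections import deque

# Resting bids at $31.50, oldest first (a FIFO queue).
queue_at_31_50 = deque(["order_A", "order_B", "order_C"])

# A newly arrived bid joins the back of the queue...
queue_at_31_50.append("my_order")

# ...and trades only after everything ahead has executed or been cancelled.
while queue_at_31_50[0] != "my_order":
    ahead = queue_at_31_50.popleft()
    print(f"{ahead} is executed or cancelled first")

print("my_order has reached the front of the queue")
```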

The competition in speed among market-making algorithms, and between them and taking algorithms, shapes the technologies of high-frequency trading profoundly. Algorithms involved in those races have to run on the fastest available computer hardware, which typically means FPGAs, even though they are far less flexible than standard computer systems. Skilled, experienced FPGA programmers are also hard to find and command high salaries. Fibre-optic cables are no longer fast enough to transmit information between datacentres, because the laser-generated pulses of light that flow through those cables are slowed by the glass of their cores. It is faster – although typically much more expensive – to send signals through the atmosphere, using microwave or millimetre-wave antennas (or sometimes lasers) on top of towers or high buildings.
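
A rough calculation shows why the transmission medium matters, using the standard figure of roughly two-thirds of the vacuum speed of light for pulses travelling through glass. The distances assumed for the Chicago–New Jersey route, and the extra length of a real fibre path, are illustrative, but the result is of the same order as the saving of around two milliseconds mentioned below.

```python
# Rough one-way latency for the Chicago-New Jersey route, showing why
# microwave beats fibre. The roughly two-thirds speed of light in glass
# is a standard figure; the straight-line distance and the extra length
# of a real fibre path are assumptions for illustration.

C_KM_PER_S = 299_792.458  # speed of light in vacuum

straight_line_km = 1_180  # assumed line-of-sight distance
fibre_route_km = 1_250    # assumed: cables rarely run straight

microwave_ms = straight_line_km / C_KM_PER_S * 1_000      # ~c through air
fibre_ms = fibre_route_km / (C_KM_PER_S * 2 / 3) * 1_000  # ~2/3 c in glass

print(f"Microwave, one way: {microwave_ms:.2f} ms")
print(f"Fibre, one way:     {fibre_ms:.2f} ms")
# The saving comes out at roughly 2.3 ms under these assumptions.
print(f"Saving:             {fibre_ms - microwave_ms:.2f} ms")
```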

An old technology, shortwave radio (once used by broadcasters such as the BBC World Service, the Voice of America and Radio Moscow), has been brought back into use by HFT firms. Shortwave transmission is notoriously fickle, and its bandwidth is limited, but it has a crucial speed advantage over light pulses travelling through the glass cores of the cables that carry Internet traffic beneath the world’s oceans. There is even talk among practitioners of HFT of launching a globe-spanning network of satellites in near-earth orbit to transmit the data crucial to algorithms’ decisions to buy or sell. More mundanely, HFT algorithms’ need for speed means that exchanges and datacentres can charge high fees for the fastest datafeeds, for the right to place FPGAs, digital switches and computer servers in the racks of those datacentres, and for use of the cables inside datacentres that connect that equipment to exchanges’ systems.

Automation and speed have brought benefits to financial markets. Human market-makers charged high prices for the service they provided, prices that were often inflated by the de facto oligopolies traditionally to be found in many markets. It’s cheaper to run a cluster of market-making algorithms than to pay dozens of human traders. Those algorithms are also able to change their price quotations very quickly as market conditions alter, so they are typically exposed to less risk than a human market-maker would be. Those factors allow them to provide keener prices. But the extremity of HFT’s speed competition does sometimes bring to mind the Cold War and the nuclear arms race. For example, transmitting prices from Chicago to New Jersey by microwave link rather than fibre-optic cable saves only around two milliseconds (thousandths of a second) and the difference in speed among the fastest such microwave links is now no more than one microsecond (a millionth of a second). Tiny speed gains of that kind offer little benefit to HFT firms in aggregate, much less to wider society. However, just as neither the United States nor the Soviet Union felt able to pause its nuclear weapons programme, even though the race between them was making neither party any safer, so HFT firms often have little alternative but to keep spending heavily on ever diminishing speed advantages.

This sometimes disturbs even those whose business it is to develop the technologies of speed. The man who first showed me an FPGA told me that the specialist vendors of these chips ‘every so and so [often] release an update that’s going to shave 5-10 nanoseconds’ off an FPGA’s processing time. He continued:

I don’t think … there is any other industry than the finance industry that can pay for it. … [I]t’s … mind-numbing to look at this whole industry where you have a lot of people with extended training that spend night and day shaving nanoseconds. Where, if you could put that brainpower to something else, maybe something different, but that’s what it is.

Eric Budish has suggested replacing today’s continuous financial trading with auctions that take place at specific points in time. Such auctions would greatly reduce the incentive for extreme speed and would do so even if they were very frequent (thus preserving much of the immediacy of transactions possible with today’s technology). Exchanges themselves have been seeking to bolster market-making by implementing ‘speed bump’ software that slows ‘taking’ algorithms, typically by around three milliseconds, so giving even reasonably slow market-making algorithms time to cancel their out-of-date price quotations. But Budish’s suggestion has so far achieved only limited traction, while regulators often seem to view speed bumps of this kind as unfairly discriminating between different categories of market participant. My informant’s resigned ‘that’s what it is’ thus largely still holds. The war among algorithms, endlessly fascinating but hugely expensive, continues.
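
For readers curious how Budish-style auctions would work in miniature, here is a sketch that collects orders over a short interval and clears them together at a single price. The orders, the interval and the clearing rule (the marginal offer price) are illustrative choices, not details of any actual proposal.

```python
# A minimal sketch of a frequent batch auction: orders that arrive during
# a short interval are matched together at one price, so being nanoseconds
# earlier within the interval confers no advantage. The orders are invented
# and the clearing rule (the marginal offer price) is one simplification
# among several possibilities.

def batch_auction(bids, offers):
    """bids/offers: lists of (price, size); returns (clearing_price, volume)."""
    bids = sorted(bids, key=lambda o: -o[0])     # highest bid first
    offers = sorted(offers, key=lambda o: o[0])  # lowest offer first
    clearing_price, volume = None, 0
    while bids and offers and bids[0][0] >= offers[0][0]:
        traded = min(bids[0][1], offers[0][1])
        volume += traded
        clearing_price = offers[0][0]            # clear at the marginal offer
        bids[0] = (bids[0][0], bids[0][1] - traded)
        offers[0] = (offers[0][0], offers[0][1] - traded)
        if bids[0][1] == 0:
            bids.pop(0)
        if offers[0][1] == 0:
            offers.pop(0)
    return clearing_price, volume

# Orders collected during one short interval (say, a tenth of a second):
bids = [(31.52, 200), (31.50, 100)]
offers = [(31.51, 150), (31.53, 300)]
print(batch_auction(bids, offers))  # (31.51, 150): one price for the batch
```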

Read more

Donald MacKenzie. Trading at the Speed of Light: How Ultrafast Algorithms are Transforming Financial Markets. Princeton University Press, 2021.

Reposted from Berfrois.

Comments

  • John Charity Spring, August 14, 2021 at 3:20 am

    You’re accurate about a lot of stuff here.

    Speed advantage is also “winner takes all” because if I am even just a few micros ahead of everyone else then that’s enough to see just far enough ahead into the future to take the lion’s share of the action. Conversely it’s easy for a firm to be profitable for a while before suddenly dropping off – and figuring out why is hard. So staying on top is the tough part.

    Understanding market structure (which is dynamic) goes hand in hand with speed. 2 microseconds on a mw link is well worth it to me – that’s the closest thing one gets to a free lunch in this business.

    As Torvalds said, when you make something go faster, it doesn’t just work more efficiently but people use it differently.

    FPGAs are not a panacea because layout is inherently hard and time-consuming from a theoretical perspective. And the industry moves too fast for “front line” ASICs. So conventional software still plays a big part, but finding the balance of the right tools is a skill.

    Is this a waste of talent? I don’t know. Society is very good at squandering talent through inefficiency in other areas and your view presupposes there is an obvious “more useful” opening. Having seen very good engineers go nowhere doing ground breaking science work due to conventional hierarchies and “old school” people in charge who’d have trouble finding a power button on a PC, there is certainly an immediacy to HFT that measures success or failure instantly and in an unambiguous way. Or perhaps I have become too jaded and cynical.

