The HFT Debate

If you’ve read any finance or investment blogs over the last couple of weeks you can’t have missed the fuss over high-frequency trading (HFT).

It’s a pretty involved story to get your head around, filled with impenetrable jargon (“low latency”, “co-location”, “dark pools”, “algorithmic trading”) and with a juicy spy scandal thrown in: a couple of weeks ago a former Goldman Sachs employee was arrested by the FBI and charged with stealing software programs that the bank had used for its computerised trading.

So what’s it all about? The critics – and there are many of them – hint that HFT may be open to exploitation by the big bank trading desks to make unfair profits at the expense of ordinary investors. Paul Wilmott, an academic who has popularised quantitative finance, even suggests that HFT may “distort the underlying markets and perhaps the economy”.

Others, including a trader I spoke to today, argue that the increasing use of computer-based trading programs simply reflects the efficiencies offered by technology and new communications infrastructure, and that the cost of trading for investors remains on a declining trend overall – which isn’t a bad thing at all.

Of course, the ability to trade in very large volumes over very short timeframes confers a lot of power on the institutions that have access to these systems. But that doesn’t imply that this power will be abused. In a New York Times article, Wilmott argues that HFT might increase the chance of destabilising feedback mechanisms leading to financial bubbles and busts, and refers to the contribution of “portfolio insurance” – a type of feedback strategy – to the 1987 stock market crash.
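The feedback mechanism Wilmott describes is easy to sketch. The toy simulation below (my own illustration, not a model from the article) compares a price driven only by random news with one where "feedback" traders also chase the most recent move, the way portfolio insurance mechanically sold into falling markets. The feedback coefficient `k` and the noise parameters are arbitrary assumptions.

```python
import random

def simulate(k, steps=250, seed=7):
    """Simulate a price path; k is the strength of trend-chasing feedback.

    Each step the price moves by random 'news' plus k times the previous
    move (buying after rises, selling after falls). k=0 means no feedback.
    """
    rng = random.Random(seed)          # same shocks for every k, so paths are comparable
    price, prev = 100.0, 100.0
    path = [price]
    for _ in range(steps):
        news = rng.gauss(0, 1)         # exogenous information arriving
        feedback = k * (price - prev)  # feedback traders chase the last move
        prev = price
        price = max(price + news + feedback, 1.0)
        path.append(price)
    return path

def max_drawdown(path):
    """Largest peak-to-trough fall, as a fraction of the peak."""
    peak, worst = path[0], 0.0
    for p in path:
        peak = max(peak, p)
        worst = max(worst, (peak - p) / peak)
    return worst

calm = simulate(k=0.0)
amplified = simulate(k=0.5)
print(f"max drawdown without feedback: {max_drawdown(calm):.1%}")
print(f"max drawdown with feedback:    {max_drawdown(amplified):.1%}")
```

Because both runs see identical shocks, any difference in the drawdowns comes purely from the feedback term – which is exactly the point of contention: the shocks are the same, but the trading rule changes how far prices travel.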

Whether the existence of super-powerful computerised trading programs is destabilising in itself is debatable. A sceptic might argue that markets are by nature prone to bubbles and busts because they reflect human beings acting in crowds. Whether computing power might increase the likelihood of destabilisation depends on how the programs are constructed. The burden of proof therefore remains with the prosecution, in my opinion.

But the debate over computerised trading brings to mind an important question about the ETF market. One of John Bogle’s major criticisms of the fund management industry has been that portfolio turnover is too high, resulting in unnecessary costs for the end-investor. With the steady decline in trading costs that we are witnessing in securities markets, is it still reasonable to suggest that an index investor should be “passive”, with infrequent changes in his or her portfolio? Or should investors embrace the brave new world of systematic, quantitatively driven, high-frequency fund management?
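Bogle's turnover critique is, at bottom, simple arithmetic: annual turnover multiplied by the round-trip cost of trading is a drag on returns that compounds over an investor's horizon. The figures below – a 0.5% round-trip cost, 7% gross return, and the two turnover rates – are illustrative assumptions of mine, not numbers from the article.

```python
def cost_drag(turnover, round_trip_cost):
    """Annual return given up to trading: turnover x round-trip cost."""
    return turnover * round_trip_cost

def terminal_wealth(gross_return, drag, years, start=10_000):
    """Compound a starting sum at the gross return net of the trading drag."""
    return start * (1 + gross_return - drag) ** years

for label, turnover in [("passive index, 5% turnover ", 0.05),
                        ("active fund, 100% turnover", 1.00)]:
    drag = cost_drag(turnover, round_trip_cost=0.005)
    wealth = terminal_wealth(gross_return=0.07, drag=drag, years=30)
    print(f"{label}: drag {drag:.3%}/yr -> ${wealth:,.0f} after 30 years")
```

The interesting question the post raises is what happens to this arithmetic as `round_trip_cost` keeps falling: the smaller it gets, the weaker the penalty on turnover, and the less self-evident the case for strict passivity becomes.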

Author

  • Luke Handt
