The Knight Capital glitch, commented on by ETNA Software CEO Roman Zhukov

The glitch that happened with Knight Capital's algo trading robot can be called the tip of the iceberg. The same sort of situation happens every day, with less dramatic effect, in many algo strategies. The main reasons for these situations are:

  1. Inadequate quality assurance testing
  2. The need to handle a huge flow of information
  3. The increasing complexity of regulation and a fast-changing market environment
  4. The lack of artificial intelligence in algo strategies

If we talk about quality assurance, an algo trading system must be tested twice as thoroughly as other financial software, such as credit card processing systems: the algo itself is complex, the cost of a failure is too great, and it is impossible to cancel a trade once it is made.

Every millisecond, the algo robot needs to handle a huge flow of information. Any error in the data it receives from the market, or any error in the algorithm itself, can lead to opening an incorrect position. Given that many strategies do not look at the current P/L status when making a decision, there is a real chance that the algorithm will get stuck in a loop and lose money.
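The risk described above can be made concrete with a minimal sketch (not any real firm's system; all names, quantities, and the loss threshold are illustrative assumptions): a strategy that ignores its running P/L keeps buying into a falling market, while the same strategy behind a simple loss guard halts early and caps the damage.

```python
# Minimal sketch: a P/L-blind strategy vs. the same strategy behind a loss
# guard. Everything here is illustrative, not a real trading system.

MAX_LOSS = 100.0  # assumed loss threshold in account currency

class Account:
    def __init__(self):
        self.position = 0      # shares held
        self.cash = 0.0        # net cash flow from trades
        self.halted = False

    def pnl(self, price):
        # mark-to-market P/L: current value of holdings plus net cash flow
        return self.position * price + self.cash

    def buy(self, qty, price):
        self.position += qty
        self.cash -= qty * price

def naive_strategy(account, price):
    # "buy on every tick", with no awareness of its own losses
    account.buy(10, price)

def guarded_strategy(account, price):
    # the check many strategies skip: stop trading once losses pass a limit
    if account.pnl(price) < -MAX_LOSS:
        account.halted = True          # kill switch: no further orders
    if not account.halted:
        account.buy(10, price)

def run(strategy, prices):
    acct = Account()
    for p in prices:
        strategy(acct, p)
    return acct

falling = [100 - i for i in range(50)]   # a steadily falling market
naive = run(naive_strategy, falling)
guarded = run(guarded_strategy, falling)
print(naive.pnl(falling[-1]), guarded.pnl(falling[-1]))
```

In this toy run the guarded account stops buying after a few ticks, while the naive one accumulates the full losing position; the guard does not make the strategy profitable, it only bounds how far a bad loop can run.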

There are also rapid changes in market behavior and in regulation, all of which influence the development timeframe for coding algos.

Nowadays, more than 70% of all trades are made by algo robots of some kind. The increase in trading volume handled by robots eventually leads to more mistakes made by them. An algorithm acts like a set of instructions: a condition for opening a position, a condition for increasing or decreasing it, and a condition for closing it. The algorithm does not take into account that its own actions may affect the market.
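The "set of instructions" structure described above can be sketched as a single decision function with three fixed conditions; the thresholds and the moving-average signal are invented for illustration. Note that nothing in such a function models the strategy's own impact on the market price, which is exactly the blind spot the article points at.

```python
# Illustrative sketch of an algo as three fixed conditions: open, scale,
# close. Thresholds and the moving-average signal are assumptions.

def decide(position, price, moving_avg):
    """Return a signed order quantity from three fixed conditions."""
    if position == 0 and price < 0.98 * moving_avg:
        return +100                  # open: price dipped below its average
    if position > 0 and price < 0.95 * moving_avg:
        return +50                   # increase: the dip deepened
    if position > 0 and price > 1.02 * moving_avg:
        return -position             # close: price recovered above average
    return 0                         # otherwise, do nothing
```

Feeding it its own fills back in shows the blind spot: if the strategy's buying is itself what pushes the price around, these conditions will still fire exactly as written, with no notion that "something is wrong".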

And when something goes wrong from the human point of view, everything is fine from the algorithm's point of view (worse still, it may decide it is time to buy more and more stock).

One way to overcome these threats would be to introduce exchange-level regulations for robots registered on the exchange; but that would make the trading infrastructure more complex and could itself cause failures because of that complexity. Nor is it wise to extend the algo testing period too far, since the cost of producing the algo would then exceed the profit it generates. Furthermore, many algos are created to trade for a fairly short period of time and do not cost very much.

From our point of view, the optimal solution is for algo trading companies to use professional software development teams that are experienced in trading and can design risk controls into the algo from the very beginning, before even coding the strategy. All failure-critical algo strategies must be heavily monitored, with thresholds on the number of transactions, trade volume, price difference, and many other parameters that imitate the human way of making decisions and analyzing the market.

Roman Zhukov, CEO at ETNA Software

3 Comments

  1. Anderssonfilip on August 27, 2012 at 7:06 am

    I agree that professional software teams or developers should be in control of algos so that risk and errors can be minimized, but I think it's very sensitive for trading companies to get external help in coding their strategies. How can a company be sure the software firm, or its employees, won't use the trading idea for its own benefit? Many trading strategies work best if the strategy is kept secret. I think it's better that professional software companies be responsible for designing test cases and providing validation of the algo; this could be done with automated tests for normal and abnormal market situations.



  2. Leonid Kurguzov on August 28, 2012 at 1:18 pm

    I think the main reason here is a lack of artificial intelligence.
    What was observed in the market 30 minutes before Knight Capital lost $400M? Huge traded volume.
    For some stocks (in which KC had positions), the daily volume was even higher than a typical monthly volume!
    A human would close all positions immediately, because "something" was wrong.
    But a computer program can't understand that "something" is wrong. It understands nothing; it can only open and close positions.

    My conclusion is that HFT has to be heavily regulated: thresholds of any kind, limits on the number of transactions, on trade volume – every restriction will be helpful.



  3. Independent Developer on August 29, 2012 at 5:27 am

    I like that you wrote an article whose conclusion is that algo trading companies should hire a software firm like yours instead of doing the work themselves.

    I have spent years in the business, and the best algo strategies do have risk controls built in. It's not that algos need to be tested twice as much; what they need is to be tested under different sets of market conditions. All backtesting is flawed, for algos and for every other trading strategy, because, as you mention, it does not take into account your own impact on the market. You are correct that most algos do not take P&L into account, but most should take their position into account and should shut themselves down when certain position limits are reached (whether this results in positive or negative P&L).