From Wikipedia, the free encyclopedia
In electronic financial markets, algorithmic trading or automated trading, also known as algo trading, black-box trading or robo trading, is the use of computer programs to enter trading orders, with the algorithm deciding on aspects of the order such as the timing, price, or quantity, and in many cases initiating the order without human intervention. Algorithmic trading is widely used by pension funds, mutual funds, and other buy-side (investor-driven) institutional traders to divide large trades into several smaller trades in order to manage market impact and risk.[1][2] Sell-side traders, such as market makers and some hedge funds, provide liquidity to the market, generating and executing orders automatically. A special class of algorithmic trading is "high-frequency trading" (HFT), in which computers make elaborate decisions to initiate orders based on information received electronically, before human traders are capable of processing the information they observe; proponents argue that it also reduces trading costs and improves shareholder returns.[citation needed]
Algorithmic trading may be used in any investment strategy, including market making, inter-market spreading, arbitrage, or pure speculation (including trend following). The investment decision and implementation may be augmented at any stage with algorithmic support or may operate completely automatically ("on auto-pilot").
A third of all EU and US stock trades in 2006 were driven by automatic programs, or algorithms, according to Boston-based financial services industry research and consulting firm Aite Group.[3] As of 2009, high frequency trading firms account for 73% of all US equity trading volume.[4]
In 2006 at the London Stock Exchange, over 40% of all orders were entered by algo traders, with 60% predicted for 2007. American markets and equity markets generally have a higher proportion of algo trades than other markets, and estimates for 2008 range as high as an 80% proportion in some markets. Foreign exchange markets also have active algo trading (about 25% of orders in 2006).[5] Futures and options markets are considered to be fairly easily integrated into algorithmic trading,[6] with about 20% of options volume expected to be computer generated by 2010.[7] Bond markets are moving toward more access to algorithmic traders.[8]
One of the main issues regarding high frequency trading is the difficulty in determining just how profitable it is. A report released in August 2009 by the TABB Group, a financial services industry research firm, estimated that the 300 securities firms and hedge funds that specialize in rapid-fire algorithmic trading took in roughly $21 billion in profits in 2008.[9]
History
Computerization of the order flow in financial markets began in the early 1970s. Landmarks included the introduction of the New York Stock Exchange's "designated order turnaround" system (DOT, and later SuperDOT), which routed orders electronically to the proper trading post to be executed manually, and the "opening automated reporting system" (OARS), which aided the specialist in determining the market-clearing opening price.
Program trading is defined by the New York Stock Exchange as an order to buy or sell 15 or more stocks valued at over $1 million total. In practice this means that all program trades are entered with the aid of a computer. In the 1980s program trading became widely used in trading between the S&P 500 equity and futures markets.
In stock index arbitrage a trader buys (or sells) a stock index futures contract such as the S&P 500 futures and sells (or buys) a portfolio of up to 500 stocks (can be a much smaller representative subset) at the NYSE matched against the futures trade. The program trade at the NYSE would be pre-programmed into a computer to enter the order automatically into the NYSE’s electronic order routing system at a time when the futures price and the stock index were far enough apart to make a profit.
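The trigger described above can be sketched as a comparison of the futures price against a cost-of-carry fair value. This is a minimal illustration, not a production strategy; the fair-value model, threshold, and figures below are simplified assumptions.

```python
def index_arbitrage_signal(futures_price, index_level, cost_of_carry, transaction_cost):
    """Return a trade direction when the futures-cash basis exceeds round-trip costs.

    cost_of_carry: net carry (financing minus dividends) over the contract life,
    as a fraction of the index level. Simplified assumption for illustration.
    """
    fair_value = index_level * (1 + cost_of_carry)  # simple cost-of-carry fair value
    basis = futures_price - fair_value
    if basis > transaction_cost:
        # Futures are rich: sell the futures, buy the stock basket
        return "sell_futures_buy_stocks"
    if basis < -transaction_cost:
        # Futures are cheap: buy the futures, sell the stock basket
        return "buy_futures_sell_stocks"
    return None  # spread too small to cover transaction costs
```

In practice the basket leg would be a representative subset of the index, and the threshold would reflect commissions, market impact, and hedge slippage.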
At about the same time portfolio insurance was designed to create a synthetic put option on a stock portfolio by dynamically trading stock index futures according to a computer model based on the Black-Scholes option pricing model.
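The dynamic hedge behind portfolio insurance amounts to holding short index futures in proportion to the delta of the put being replicated. A minimal sketch of the Black-Scholes put delta (European option, no dividends; all parameter values below are illustrative):

```python
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def put_delta(spot, strike, rate, vol, t):
    """Black-Scholes delta of a European put; always between -1 and 0."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    return norm_cdf(d1) - 1.0

# Synthetic put: short index futures in proportion to |delta| of the target put.
hedge_ratio = -put_delta(spot=100, strike=100, rate=0.05, vol=0.2, t=1.0)
```

As the market falls, the put delta moves toward -1 and the model calls for selling more futures, which is the feedback loop the Brady report later criticized.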
Both strategies, often simply lumped together as "program trading," were blamed by many (for example, by the Brady report) for exacerbating or even starting the 1987 stock market crash. Yet the impact of computer-driven trading on stock market crashes is unclear and widely debated in the academic community.[10]
Financial markets with fully electronic execution and similar electronic communication networks developed in the late 1980s and 1990s. In the U.S., decimalization, which changed the minimum tick size from 1/16th of a dollar ($0.0625) to $0.01 per share, may have encouraged algorithmic trading as it changed the market microstructure by permitting smaller differences between the bid and offer prices, decreasing the market-makers' trading advantage, thus increasing market liquidity.
This increased market liquidity led to institutional traders splitting up orders according to computer algorithms in order to execute them at a better average price. These average-price benchmarks are measured and calculated by computers, typically using the time-weighted (i.e., unweighted) average price (TWAP) or, more commonly, the volume-weighted average price (VWAP).
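The two benchmarks can be stated in a few lines. TWAP is a plain average of prices sampled at equal time intervals; VWAP weights each price by the volume traded at it:

```python
def twap(prices):
    """Time-weighted (unweighted) average price over equally spaced price samples."""
    return sum(prices) / len(prices)

def vwap(prices, volumes):
    """Volume-weighted average price: each price weighted by traded volume."""
    total_volume = sum(volumes)
    return sum(p * v for p, v in zip(prices, volumes)) / total_volume
```

An execution algorithm is judged by how close its achieved average fill price comes to the market VWAP (or TWAP) over the same window.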
As more electronic markets opened, other algorithmic trading strategies were introduced. These strategies are more easily implemented by computers because machines can react more rapidly to temporary mispricing and examine prices from several markets simultaneously. Examples include Stealth (developed by Deutsche Bank), Sniper and Guerilla (developed by Credit Suisse[11]), arbitrage, statistical arbitrage, trend following, and mean reversion.
This type of trading is driving the new demand for low-latency proximity hosting and global exchange connectivity. When putting together a strategy for electronic trading, it is imperative to understand latency: the delay between the transmission of information from a source and its reception at a destination. Latency has the speed of light as a lower bound, corresponding to a few microseconds per kilometre of optical fibre; any signal-regenerating or routing equipment introduces latency above this speed-of-light baseline.
Strategies
Many different algorithms have been developed to implement different trading strategies. Much early algo trading was developed for the buy side in order to reduce transaction costs. Basic "benchmarking" algorithms can be used by traders attempting to mimic an index's return. Recently, high-frequency trading, which comprises a broad set of buy-side as well as market-making sell-side traders, has become more prominent and controversial.[12] These algorithms or techniques are commonly given names such as "Stealth", "Iceberg", "Dagger", "Guerrilla", "Sniper" and "Sniffer",[13] yet at their core they are quite simple mathematical constructs.[14]
Transaction cost reduction
Most strategies referred to as algorithmic trading fall into the cost-reduction category. Large orders are broken down into several smaller orders and entered into the market over time. This basic strategy is called "iceberging". The success of this strategy may be measured by comparing the average purchase price against the VWAP for the market over that period. One algorithm designed to find hidden orders or icebergs is called "Stealth". Most of these strategies were first documented in [15].
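The order-splitting step is mechanically simple. A minimal sketch of slicing a parent order into child orders of a fixed maximum size (timing and randomization, which real algorithms use to avoid detection, are omitted):

```python
def slice_order(total_quantity, child_size):
    """Split a parent order into child orders of at most child_size shares each."""
    slices = []
    remaining = total_quantity
    while remaining > 0:
        qty = min(child_size, remaining)  # last slice may be smaller
        slices.append(qty)
        remaining -= qty
    return slices
```

Real iceberg logic would also randomize slice sizes and submission times, since a predictable pattern is exactly what detection algorithms like "Stealth" look for.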
Other strategies
Any type of algorithmic trading which depends on the programming skills of other traders is called "gaming". Dark pools are alternative electronic stock exchanges where trading takes place anonymously, with most orders hidden or "iceberged."[16] Gamers or "sharks" sniff out large orders by "pinging" small market orders to buy and sell. When several small orders are filled the sharks may have discovered the presence of a large iceberged order. Gaming strategies are widely discouraged by dark pools, where purveyors of such strategies are financially penalized for their behavior.
“Now it’s an arms race,” said Andrew Lo, director of the Massachusetts Institute of Technology’s Laboratory for Financial Engineering. “Everyone is building more sophisticated algorithms, and the more competition exists, the smaller the profits.”[17]
The arms race has allegedly included stealing computer code. UBS has sued three of its former traders and Jefferies & Company for stealing algorithmic trading programs.[18]
High-frequency trading
In the U.S., high-frequency trading firms represent 2% of the approximately 20,000 firms operating today, but account for 73% of all equity trading volume.[19] As of the first quarter of 2009, total assets under management for hedge funds with high-frequency trading strategies were $141 billion, down about 21% from their high.[20] The high-frequency strategy was first made successful by Renaissance Technologies.[21] High-frequency funds started to become especially popular in 2007 and 2008.[20] Many high-frequency firms say they are market makers and that the liquidity they add to the market has lowered volatility and helped narrow spreads, but unlike traditional market makers, such as specialists on the New York Stock Exchange, they have few or no regulatory requirements.
High-frequency trading is quantitative trading that is characterized by short portfolio holding periods (see Wilmott (2008), Aldridge (2009)). There are four key categories of high-frequency trading strategies: market-making based on order flow, market-making based on tick data information, event arbitrage and statistical arbitrage. All portfolio-allocation decisions are made by computerized quantitative models. The success of high-frequency trading strategies is largely driven by their ability to simultaneously process volumes of information, something ordinary human traders cannot do. Various types of high-frequency strategies are covered in Aldridge, I., "High-Frequency Trading: A Practical Guide to Algorithmic Strategies and Trading Systems" (Wiley, 2009).
Market making
Market making is a set of high-frequency trading strategies that involve placing a limit order to sell (or offer) above the current market price or a buy limit order (or bid) below the current price in order to benefit from the bid-ask spread. Automated Trading Desk, which was bought by Citigroup in July 2007, has been an active market maker, accounting for about 6% of total volume on both NASDAQ and the New York Stock Exchange.[22]
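The quoting step can be sketched in a few lines: place a bid below and an offer above the midpoint, rounded to the exchange tick. This is a bare illustration; real market makers skew and resize quotes based on inventory and order flow.

```python
def make_quotes(mid_price, half_spread, tick=0.01):
    """Quote a bid below and an offer above the midpoint, aligned to the tick grid.

    If both quotes fill, the market maker earns the full bid-ask spread
    (2 * half_spread) while holding the position only briefly.
    """
    bid = round(round((mid_price - half_spread) / tick) * tick, 2)
    ask = round(round((mid_price + half_spread) / tick) * tick, 2)
    return bid, ask
```

The economics rest on turning inventory over quickly: the per-trade edge is tiny, so profitability depends on volume and on avoiding adverse selection when prices move through the quotes.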
Statistical arbitrage
Another set of high-frequency trading strategies are classical arbitrage strategies. Such a strategy might involve several securities, as in covered interest rate parity in the foreign exchange market, which gives a relation between the prices of a domestic bond, a bond denominated in a foreign currency, the spot price of the currency, and the price of a forward contract on the currency. If the market prices are sufficiently different from those implied by the model to cover transaction costs, then four transactions can be made to guarantee a risk-free profit. High-frequency trading allows similar arbitrages using models of greater complexity involving many more than four securities. The TABB Group estimates that annual aggregate profits of low-latency arbitrage strategies currently exceed US$21 billion.[4]
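The parity relation above can be written down directly: the no-arbitrage forward equals the spot rate scaled by the ratio of the two interest accruals. A minimal sketch, using simple (non-compounded) annual rates as an assumption:

```python
def cip_forward(spot, r_domestic, r_foreign, t=1.0):
    """Forward FX rate implied by covered interest rate parity.

    spot is quoted as domestic currency per unit of foreign currency;
    rates are simple annual rates over horizon t (in years).
    """
    return spot * (1 + r_domestic * t) / (1 + r_foreign * t)

def cip_arbitrage(spot, forward_quoted, r_domestic, r_foreign, cost, t=1.0):
    """Signal when the quoted forward deviates from parity by more than costs."""
    mispricing = forward_quoted - cip_forward(spot, r_domestic, r_foreign, t)
    if abs(mispricing) <= cost:
        return None  # deviation too small to cover the four transactions
    return "sell_forward" if mispricing > 0 else "buy_forward"
```

When the signal fires, the four offsetting legs (borrow in one currency, convert at spot, lend in the other, and lock in the forward) capture the mispricing with no price risk, before costs.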
A wide range of statistical arbitrage strategies have been developed whereby trading decisions are made on the basis of deviations from statistically significant relationships. Like market-making strategies, statistical arbitrage can be applied in all asset classes.[23]
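A common way to quantify "deviation from a statistically significant relationship" is a standard score on the spread between two related instruments. A minimal pairs-trading sketch (the entry threshold of two standard deviations is an illustrative convention, not a rule from the text):

```python
def zscore(spread_history, current_spread):
    """Standard score of the current spread against its historical mean."""
    n = len(spread_history)
    mean = sum(spread_history) / n
    variance = sum((s - mean) ** 2 for s in spread_history) / n
    return (current_spread - mean) / variance ** 0.5

def pairs_signal(spread_history, current_spread, entry=2.0):
    """Trade when the spread is an unusual number of standard deviations wide."""
    z = zscore(spread_history, current_spread)
    if z > entry:
        return "short_spread"  # spread unusually wide: sell the rich leg, buy the cheap one
    if z < -entry:
        return "long_spread"
    return None
```

The bet is mean reversion: the position is unwound when the spread returns toward its historical mean, so the strategy loses if the statistical relationship itself breaks down.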
Low-latency trading
High-frequency trading is often confused with low-latency trading, which uses computers that execute trades within milliseconds ("with extremely low latency" in the jargon of the trade). Low-latency trading is highly dependent on ultra-low-latency networks: firms profit by delivering information, such as competing bids and offers, to their algorithms microseconds faster than their competitors can.[4] The advance in speed has led firms to need a real-time, colocated trading platform in order to benefit from high-frequency strategies.[4] Strategies are constantly altered to reflect subtle changes in the market and to counter the threat of being reverse-engineered by competitors. There is also very strong pressure to continuously add features or improvements to a particular algorithm, such as client-specific modifications and various performance-enhancing changes (regarding benchmark trading performance, cost reduction for the trading firm, or a range of other implementations). This reflects the evolutionary nature of algorithmic trading strategies: they must be able to adapt and trade intelligently regardless of market conditions, which requires being flexible enough to withstand a vast array of market scenarios. As a result, a significant proportion of firms' net revenue is spent on the R&D of these autonomous trading systems.[4]
Strategy implementation
Most algorithmic strategies are implemented using modern programming languages, although some firms still implement strategies designed in spreadsheets. Basic models can rely on as little as a linear regression, while more complex game-theoretic, pattern-recognition, or predictive models can also be used to initiate trading. Neural networks and genetic programming have been used to create such models.
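A model relying "on as little as a linear regression" can be written from scratch in a few lines: fit a line relating a predictive signal to subsequent returns, then trade on the sign of the prediction. A minimal ordinary-least-squares sketch with one predictor (the data below are made up for illustration):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x with a single predictor."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Illustrative: regress next-period return on a lagged signal, trade on the sign
a, b = fit_line([1, 2, 3, 4], [2.1, 3.9, 6.0, 8.1])
predicted = a + b * 5  # model's forecast for a new signal value of 5
```

In practice such a model would be estimated on far more data, with out-of-sample validation, and the forecast would feed a position-sizing rule rather than being traded directly.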
Issues and developments
Algorithmic trading has been shown to substantially improve market liquidity[24] among other benefits. However, improvements in productivity brought by algorithmic trading have been opposed by human brokers and traders facing stiff competition from computers.
Some[who?] have claimed that the models used in algorithmic trading are known to have limitations, and the programs may break down under stress. However, model limitations are not always known, so it is common for a major part of the effort in developing a system to be the introduction of fail-safes and sanity checks.[citation needed]
“The downside with these systems is their black box-ness,” Mr. Williams said. “Traders have intuitive senses of how the world works. But with these systems you pour in a bunch of numbers, and something comes out the other end, and it’s not always intuitive or clear why the black box latched onto certain data or relationships.”[17]
Regulators in both the US and UK have long watched algorithmic trading, given its alleged role in the 1987 Black Monday crash, where automated trading was perceived to have accelerated the downturn.
“The Financial Services Authority has been keeping a watchful eye on the development of black box trading. In its annual report the regulator remarked on the great benefits of efficiency that new technology is bringing to the market. But it also pointed out that ‘greater reliance on sophisticated technology and modelling brings with it a greater risk that systems failure can result in business interruption’.”[25]
UK Treasury minister Lord Myners has warned that companies could become the "playthings" of speculators because of automatic high-frequency trading (HFT). Lord Myners said the process risked destroying the relationship between an investor and a company.[26]
Other issues include the technical problem of latency or the delay in getting quotes to traders,[27] security and front running, and the possibility of a complete system breakdown leading to a market crash.[28]
Although some systems, such as those developed by Goldman Sachs, are expensive to build, the market also includes many players whose total staffing for developing and maintaining their systems is between one and three people.
The cost of developing and maintaining algorithms is still relatively high, especially for new entrants, as the need for stability, bandwidth, and speed is even higher than for regular order execution. Firms that have not developed their own algorithmic trading capabilities have had to buy competing firms.
"Goldman spends tens of millions of dollars on this stuff. They have more people working in their technology area than people on the trading desk...The nature of the markets has changed dramatically."[29]
Financial market news is now being formatted by firms such as Thomson Reuters, Dow Jones, and Bloomberg, to be read and traded on via algorithms.
“Computers are now being used to generate news stories about company earnings results or economic statistics as they are released. And this almost instantaneous information forms a direct feed into other computers which trade on the news.”[30]
The algorithms do not trade only on straightforward news stories but also interpret harder-to-understand news. Some firms are also attempting to automatically assign sentiment (deciding whether the news is good or bad) to news stories so that automated trading can act directly on the story.
“There is a real interest in moving the process of interpreting news from the humans to the machines,” says Kirsti Suutari, global business manager of algorithmic trading at Reuters. “More of our customers are finding ways to use news content to make money.”[30]
An example of the importance of news reporting speed to algorithmic traders was an advertising campaign by Dow Jones (appearances included page W15 of the Wall Street Journal, on March 1, 2008) claiming that their service had beaten other news services by 2 seconds in reporting an interest rate cut by the Bank of England.
In July 2007, Citigroup, which had already developed its own trading algorithms, paid $680 million for Automated Trading Desk, a 19-year-old firm that trades about 200 million shares a day.[31] Citigroup had previously bought Lava Trading and OnTrade Inc.
Effects
Though its development may have been prompted by decreasing trade sizes caused by decimalization, algorithmic trading has reduced trade sizes further. Jobs once done by human traders are being switched to computers. The speeds of computer connections, measured in milliseconds and even microseconds, have become very important.[32][33]
More fully automated markets such as NASDAQ, Direct Edge and BATS, in the US, have gained market share from less automated markets such as the NYSE. Economies of scale in electronic trading have contributed to lowering commissions and trade processing fees, and contributed to international mergers and consolidation of financial exchanges.
Competition is developing among exchanges for the fastest processing times for completing trades. For example, in June 2007 the London Stock Exchange launched a new system called TradElect, which promises an average 10 millisecond turnaround time from placing an order to final confirmation, and can process 3,000 orders per second.[34] This speed would already be considered a quaint benchmark, as competitive exchanges in the US now offer 3 millisecond turnaround times. Speed matters greatly to high-frequency traders, because they must attempt to pinpoint the consistent and probable performance ranges of given financial instruments. These professionals often deal in versions of stock index funds, such as the E-mini S&Ps, because they seek consistency and risk mitigation along with top performance. They must filter market data into their software so that there is the lowest latency and highest liquidity at the time of placing stop-losses and/or taking profits. With high volatility in these markets, this becomes a complex and potentially nerve-wracking endeavor, where a small mistake can lead to a large loss. Absolute frequency data play into the development of the trader's pre-programmed instructions.[35]
Spending on computers and software in the financial industry increased to $26.4 billion in 2005.[1]
Communication standards
Algorithmic trades require communicating considerably more parameters than traditional market and limit orders. A trader on one end (the "buy side") must enable their trading system (often called an "Order Management System" or "Execution Management System") to understand a constantly proliferating flow of new algorithmic order types. The R&D and other costs of constructing complex new algorithmic order types, along with the execution infrastructure and the marketing costs of distributing them, are fairly substantial. What was needed was a way for marketers (the "sell side") to express algo orders electronically such that buy-side traders could simply drop the new order types into their systems and be ready to trade them, without constantly coding custom new order-entry screens each time.
FIX Protocol Ltd (http://www.fixprotocol.org) is a trade association that publishes free, open standards in the securities trading area. The FIX language was originally created by Fidelity Investments, and the association's members include virtually all large, and many midsized and smaller, broker-dealers, money-center banks, institutional investors, mutual funds, and others. This institution dominates standard setting in the pre-trade and trade areas of securities transactions. In 2006-2007 several members got together and published a draft XML standard for expressing algorithmic order types, called FIX Algorithmic Trading Definition Language (FIXatdl).[36] The first version of the standard, 1.0, was not widely adopted due to limitations in the specification, but the second version, 1.1 (released in March 2010), is expected to achieve broad adoption and, in the process, dramatically reduce the time-to-market and costs associated with distributing new algorithms.