A recent study into high-frequency trading has concluded that computer-based trading has “several beneficial effects” and has advised regulators to exercise caution “to avoid undoing the advantages that high-frequency trading has brought”.
The UK Government’s Foresight Study, conducted by the Government Office for Science and sponsored by HM Treasury, published its final report into HFT after two years.
The report found that there was “no direct evidence that computer-based high-frequency trading has increased volatility in financial markets”. It said: “The evidence suggests that computer-based trading has several beneficial effects on markets, notably liquidity has improved, transaction costs have fallen for both retail and institutional traders… and market prices have become more efficient.”
It did, however, also call for “immediate evidence-based regulatory action” to “incentivise accident-avoiding practices and behaviour”, a “larger role for standards” and “better surveillance of financial markets.”
This report comes as European regulators contemplate stricter controls over HFT after a number of ‘flash crashes’ on public exchanges, amid growing fears that its practitioners benefit at the expense of other investors and that the activity increases volatility.
This has prompted the European Commission to propose several pieces of legislation designed to curb HFT – including a minimum order resting time, order-to-trade ratios and Europe-wide circuit breakers. The report states that “caution needs to be exercised to avoid undoing the advantages that high-frequency trading has brought.”
As such an in-depth report, conducted over two years and drawing on the views of more than 150 academics and 350 stakeholders, it is generally seen as positive news for an industry that has been under the cloud of increased regulation for a number of years now.
High-speed trading tools pioneered in the stock market are increasingly driving price movements on Amazon as sellers use them to undercut and outwit each other.
Prices change as often as every 15 minutes as some of the 2m sellers on the site join the online retailer in using computerised tools, often developed by former data miners at banks.
Amazon sellers – using third-party software – can set rules to ensure their prices are always, for example, $1 lower than their rivals’. More complex algorithms can analyse data to set prices most likely to secure a prominent position on the site.
But the tools create the risk of malfunctions similar to the 2010 flash crash, when algo trading was blamed for some US stock prices falling to near zero, then rebounding in 20 minutes.
Last year, out-of-control algorithms inflated the price of The Making of a Fly, a genetics book, to more than $23m, according to Michael Eisen, a biologist who blogged about it. The opposite is possible. Some sellers have created dummy accounts with ultra-low prices to pull down those of rivals so they can corner a market by buying their goods, although this clearly violates Amazon’s rules of conduct.
A group that sells pricing tools to Amazon sellers said: “We took a lot of inspiration from the stock market . . . they have these learning algorithms that calibrate themselves and improve themselves. That is the idea we are following.”
As with all algorithms of this kind, useful as they can be, it is essential that they are built correctly, monitored continuously and fitted with effective risk controls to stop the extreme events mentioned above.
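To make the point concrete, here is a minimal sketch of an undercutting rule of the kind described above, fitted with the floor and ceiling guards such tools need. All names and numbers are invented for illustration; this is not any vendor's actual logic.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical undercutting rule of the "always $1 below the cheapest rival"
// kind, with floor and ceiling guards. All values here are illustrative.
double reprice(double lowestRivalPrice, double floorPrice, double ceilingPrice) {
    double target = lowestRivalPrice - 1.0;               // undercut by $1
    return std::clamp(target, floorPrice, ceilingPrice);  // never below cost, never absurd
}

int main() {
    // A dummy account quoting $0.01 cannot drag the price below the cost floor...
    std::printf("%.2f\n", reprice(0.01, 8.99, 199.99));        // -> 8.99
    // ...and a runaway rival loop cannot push it towards a $23m book.
    std::printf("%.2f\n", reprice(23000000.0, 8.99, 199.99));  // -> 199.99
    return 0;
}
```

Even a guard this simple would have capped the runaway feedback loop behind the $23m genetics book.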
FXCM, a US foreign exchange agency broker, has agreed to take a 50 per cent stake in Lucid Markets, a private UK proprietary trading group, in a deal that pushes the group further into institutional trading.
FXCM, listed on the New York Stock Exchange, said it would pay around $176m, net of Lucid cash, to own half of the London-based group in a move that will diversify it away from its core retail trading business. The remainder of Lucid will be held by its employees.
The pursuit of Lucid – little known to the wider world, but a firm that has been making a big impact over the last 18 months – illustrates how algorithmic and high-frequency trading is beginning to reshape dealing in currency markets. High-speed trading, in which investors using superfast computers make bids and offers and execute trades in thousandths of a second, has dominated trading on equity and some derivatives markets in recent years.
Such techniques have become commonplace in these sectors, but the forex market has often seen itself as an exception, arguing that its structure protects both the market and its traders from algorithmic trading.
However, several of the biggest electronic equity market makers, such as Getco and Citadel, have also moved into forex market-making in recent years, searching for better returns in new asset classes in the face of fierce competition and becalmed equity markets.
Their presence potentially disintermediates banks from one of their lucrative markets, a move likely to be exacerbated by the incoming Volcker rule in the US, which is set to split banks from proprietary trading. As an agency broker, FXCM offers forex trading services to hedge funds and other institutional customers, allowing them to access the best prices offered by banks.
FXCM itself has been moving institutional customers onto its own trading platform over the last 12 months, although it remains dominated by retail trading, which produced revenues of $92.8m in the three months to March 31, up from $77.7m a year earlier. By contrast, revenues from its institutional business fell from $7.4m to $5.8m.
“The acquisition of Lucid – a leader in market making and trading in the institutional FX market – is a natural extension of the evolution of our institutional business,” said Drew Niv, chief executive of FXCM. “This transaction also adds the expertise and talents of one of the best firms in the FX world.”
Lucid, which employs 16 people, trades foreign exchange contracts and had $16m of cash at the end of the year to December 2010, up from $7.2m a year earlier. Lucid sees $35bn of volume a day, and the combination of the two groups will take FXCM’s average daily volume to around $55bn in notional terms.
Mr Niv said the deal would also diversify FXCM. “The biggest risk at the moment to us as a firm is regulatory risk to the retail business,” he told the Financial Times. “It’s a game changer for us. It makes us a big player in this space.”
FXCM also confirmed that Lucid would not provide liquidity to FXCM’s retail agency business.
The US group will receive an option to purchase an additional stake in Lucid in the next four years. If it does not exercise the option, Lucid shareholders will have the option to buy out FXCM’s stake.
In the 12 months to December 31, Lucid had revenues of $148.9m and earnings before interest, taxes, depreciation and amortization of $113.4m. FXCM shares rose 6.2 per cent to $12.30 in morning trade in New York.
Last month saw the start of what could be a hard-fought battle between governing bodies and the institutions they look to govern. With both sides seemingly pursuing competing agendas, it is hard to see how things will play out.
Industry groups have called for a regulatory focus on the techniques used in high-frequency trading, as opposed to regulating its participants, whereas the regulators seem set upon identifying the largest speed-trading firms.
The problem that will, in my opinion, sit at the heart of the debate is simply the difficulty of creating objective definitions for things that are inherently subjective, such as high-frequency trading and even best execution. Depending on your perspective, it is difficult to know what is “high-frequency” and what is “best.” The firm that is last to market may well struggle to class itself as fast, nor will its counterparts bestow that title upon it.
As such, delivering an objective definition of high-frequency trading will require the identification of suitable thresholds that can be held up as both defensible and logical. Candidates commonly include latency, automation, proximity and aggressiveness, but the list is by no means exhaustive.
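Purely as an illustration of the definitional problem, here is one way such thresholds might be combined in code. Every name and cut-off value below is hypothetical; choosing values that are actually defensible is precisely the hard part.

```cpp
#include <cstdio>

// Illustrative only: one way a regulator might combine objective thresholds
// into a working definition of "high-frequency trader".
struct ParticipantStats {
    double medianOrderLatencyMs;  // latency
    double automatedOrderShare;   // automation, 0..1
    bool   colocated;             // proximity
    double orderToTradeRatio;     // aggressiveness proxy
};

bool looksLikeHft(const ParticipantStats& s) {
    int flags = 0;
    if (s.medianOrderLatencyMs < 1.0) ++flags;
    if (s.automatedOrderShare > 0.9)  ++flags;
    if (s.colocated)                  ++flags;
    if (s.orderToTradeRatio > 50.0)   ++flags;
    return flags >= 3;  // say, any three of the four thresholds
}

int main() {
    ParticipantStats bank{5.0, 0.95, true, 10.0};  // automated, but neither fast nor aggressive
    std::printf("%s\n", looksLikeHft(bank) ? "HFT" : "not HFT");  // -> not HFT
    return 0;
}
```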
However, the last of these terms, aggressiveness, is the one that raises some of the greatest concerns, having given rise to the popularity of dark pools, whose sole raison d'être is to protect institutional traders from the predatory behaviour of high-frequency and computer-driven traders.
This is seen as one of the major reasons for defining the industry: concerns have been widely aired that high-frequency traders can use technology to game the system, getting ahead of slower-moving investors and profiting at their expense.
Proponents, of course, argue that high-frequency traders provide valuable liquidity, which benefits the market in many ways: it helps lower the cost of trading, improves the prices available, improves linkage between markets and damps volatility in equity markets.
It remains to be seen which player has the stronger hand, but the outcome and subsequent ripples could be far-reaching for all within the trading space. Should the regulators define the term broadly, groups such as investment banks and traditional asset managers that use automated trading programmes to send and cancel bursts of orders could well be tagged as “high-frequency traders”.
If they define it more narrowly, harsh rules may well be imposed on some proprietary shops, leaving them at a considerable disadvantage.
Traditionally, high-frequency traders seeking the fastest connections between markets have looked to fibre-optic networks. By all accounts it has been a largely successful endeavour, with latency between the markets of Chicago and New York cut as low as 13.3 milliseconds.
With attention focused on this trading corridor between New York and Chicago, the second busiest in the world, trading firms are now busy securing licensed radio frequencies that have the potential to cut as much as 40% from the time taken to transfer data between the two financial centres.
The key advantage of stepping away from fibre is the simple matter of distance. A signal transmitted through the air can take a considerably more direct route than a cable which has to negotiate geographical features such as large bodies of water or built-up metropolitan areas.
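A back-of-the-envelope calculation shows the scale of the effect. The route lengths and refractive index below are assumptions chosen only for illustration, but they reproduce figures of roughly the magnitudes quoted above: light in glass travels at about two-thirds of its speed in air, and fibre routes are longer than near-straight radio paths.

```cpp
#include <cstdio>

int main() {
    const double c = 299792.458;  // speed of light in vacuum, km/s

    // Assumed figures, for illustration only: a fibre route of ~1,330 km with a
    // refractive index of ~1.47, versus a near-straight ~1,150 km microwave path.
    const double fibreKm = 1330.0, fibreSpeed = c / 1.47;  // light slows in glass
    const double radioKm = 1150.0, radioSpeed = c;         // microwave is ~c in air

    const double fibreRtMs = 2.0 * fibreKm / fibreSpeed * 1000.0;  // ~13.0 ms
    const double radioRtMs = 2.0 * radioKm / radioSpeed * 1000.0;  // ~7.7 ms

    std::printf("fibre round trip: %.1f ms\n", fibreRtMs);
    std::printf("radio round trip: %.1f ms\n", radioRtMs);
    std::printf("saving: %.0f%%\n", 100.0 * (1.0 - radioRtMs / fibreRtMs));
    return 0;
}
```

On these assumptions the radio path saves roughly 40 per cent of the round trip, which is exactly the order of improvement being chased.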
High-frequency trading (HFT) companies typically trade a huge volume of contracts on the tiniest of margins, meaning even the smallest of price movements can be make or break for the business, so choosing the right technology is a critical decision.
In an increasingly crowded marketplace, much of the focus for HFT firms is on low latency, which basically means high speed. Algorithmic strategies on derivatives rely on timings down to the millisecond, with orders communicated as quickly and accurately as possible.
A recent research paper, entitled High Frequency Swaps Trading: Market-Making and Arbitrage, argued that additional regulation – not least in the form of MiFID II – is likely to boost trading volumes across Europe as more diverse contracts are made accessible on electronic platforms.
Companies looking to exploit this opportunity will need the lowest possible latency from connections between markets, data providers and counterparties to safeguard against price volatility and ensure trading decisions are executed flawlessly.
However, there is a catch to switching to wireless technology. To ensure interference-free operational reliability, these circuits must use frequencies that are licensed and regulated by government agencies, the availability of which may be limited, particularly on high-interest intercity paths such as between NYC/NJ and Chicago.
Another key consideration is that microwave radios are strictly line-of-sight systems, meaning that long-distance circuits typically must use high towers or mountaintop sites that can be expensive to obtain.
Although such radios now contain sophisticated adaptive systems to ensure signal quality, they can on rare occasions be subject to momentary circuit degradation or even brief outages due to unusual environmental conditions.
We are not saying that the future lies with radio/wireless technology, but the very fact it is being discussed, as with FPGAs and GPUs, shows that HFT funds will explore all angles to gain that critical advantage.
In high-frequency trading (HFT), programmers eke out every last incremental tick of performance to build algorithms that battle other algorithms for computational supremacy and millions in profits, so the rewards are high. The following article gives a brief overview of the industry and some of the technologies being used.
Algorithmic trading reportedly accounts for more than 70% of equity trading these days. It is part of an elite niche of computer science about which little is specifically known, as few in the industry want to talk in specific terms, probably because hundreds of millions of dollars are at stake. Its practitioners are, as a rule, secretive, stealthy, smart and relatively unknown.
There are many different aspects to HFT, but basically it comes down to building algorithms and computer systems that can monitor and digest huge amounts of financial data in order to automatically conduct transactions in much less time than it would take a human trader to glance from his BlackBerry to his Bloomberg Terminal.
Or, as Wikipedia explains:
"In [HFT], programs analyse market data to capture trading opportunities that may open up for only a fraction of a second to several hours. HFT uses computer programs and sometimes specialized hardware to hold short-term positions in equities, options, futures, ETFs, currencies and other financial instruments that possess electronic trading capability. High-frequency traders compete on a basis of speed with other high-frequency traders, not long-term investors." HFT was blamed for the huge drop in the stock market in May 2010, called the "flash crash." The Wall Street Journal reported that, "At one point, HFTs traded more than 27,000 contracts in just 14 seconds."
Tackling latency is a key priority, with programmers using every last tool and trick in the book — and continually making new ones — to squeeze out every last millisecond, or even nanosecond, of performance.
Fixnetix recently announced that its "microchip for ultra-low latency trading is now the world's fastest trading appliance for the financial markets. Customers are now seeing latencies as low as 740 nanoseconds through the stack (wire-to-wire)."
What kinds of language tools and techniques are used to develop such high-performance systems? People rightly assume something close to the metal: assembly language for certain parts, with C++ featuring heavily. However, languages such as Haskell, OCaml and Scala are also starting to feature.
From an HFT platform perspective, C/C++ has long been the language of choice due to the latency requirements: the lower the latency, the more important C/C++ becomes. In reality it varies from firm to firm depending on preferences and specific requirements, and Java and the Microsoft stack are increasingly prevalent.
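To give a flavour of the kind of low-latency idiom that pushes firms towards C/C++, below is a minimal sketch of a lock-free single-producer/single-consumer ring buffer, a structure commonly used to pass ticks between threads without locks or heap allocation. It is a simplified illustration, not production code from any firm.

```cpp
#include <array>
#include <atomic>
#include <cstddef>
#include <optional>

// Minimal lock-free single-producer/single-consumer ring buffer.
// Capacity must be a power of two so that index wrapping is a cheap mask.
template <typename T, std::size_t Capacity>
class SpscQueue {
    static_assert((Capacity & (Capacity - 1)) == 0, "Capacity must be a power of two");
public:
    bool push(const T& item) {
        const auto head = head_.load(std::memory_order_relaxed);
        const auto next = (head + 1) & (Capacity - 1);
        if (next == tail_.load(std::memory_order_acquire))
            return false;                                  // queue full; caller decides
        buffer_[head] = item;
        head_.store(next, std::memory_order_release);      // publish to the consumer
        return true;
    }

    std::optional<T> pop() {
        const auto tail = tail_.load(std::memory_order_relaxed);
        if (tail == head_.load(std::memory_order_acquire))
            return std::nullopt;                           // queue empty
        T item = buffer_[tail];
        tail_.store((tail + 1) & (Capacity - 1), std::memory_order_release);
        return item;
    }

private:
    std::array<T, Capacity> buffer_{};
    std::atomic<std::size_t> head_{0};  // written only by the producer
    std::atomic<std::size_t> tail_{0};  // written only by the consumer
};
```

The point of such a structure is that the hot path involves no locks, no system calls and no allocation; each relaxed/acquire/release pairing is chosen deliberately.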
Functional programming is playing a greater part in the HFT industry because it is easier to verify functional programs, and because languages like Haskell facilitate rapid development. Being able to go from idea to deployment in under an hour is something that Smalltalk and Haskell give you, and that is something the financial industry values highly, as is the ability to check the correctness of code quickly, and in some cases automatically.
There is much more to HFT than the choice of programming language. As for the operating system, the default is Linux, though Windows is still used; and from a back-testing perspective there is always a tick database, where kdb+ or similar is used.
Hardware is also key, and the use of GPU/FPGA [graphics processing unit/field-programmable gate array] cards is not uncommon. CUDA is increasingly used for programming parallel processing algorithms [such as pattern matching] on GPUs, while Impulse C is used for programming FPGAs.
However, the highest-performing systems in the world are worth nothing if the underlying logic is unsound. There is only one winner of a latency arms race, so having low latency but a strategy or model that generates no profit isn't really going to help, and likewise the reverse. Besides a thorough understanding of data structures, algorithms and concurrency issues, successful programmers "need to understand what it is they're building". One of the biggest hurdles for HFT firms is that the programmers, quants and traders don't understand one another, which can cause problems. An HFT programmer who understands trading and some of the theory and maths behind finance will be regarded as extremely valuable and is highly sought after.
There can be a blur between developers and researchers/traders within HFT or systematic trading. If you are a great programmer and also have a head for maths, then you are more than likely to be able to move towards strategy/model implementation over time, rather than developing the platform.
At times the environment in an HFT firm can be quite high-pressured, as programmers tasked with developing high-performance systems work with traders to continually tweak or build new apps as market conditions change. To be successful you have to enjoy this fast-paced environment and the competitive side it brings out in you.
Not all positions and firms are so pressured, and a lot of HFT/systematic trading firms offer a relaxed working environment more akin to a software firm or academic research facility, as they strive to discover the next piece of technology that will keep them ahead of the competition.
Automated Trader presented the findings of their annual survey last week at the London Stock Exchange. We attended an extremely interesting presentation, panel discussion and Q&A session and wanted to summarise the findings before the official results are published later this year.
The overall consensus was that the automation of the trading process is going to continue, and that the pace of change will increase along with competition and technical complexity. Full end-to-end automation is still progressing, but mass uptake is going to take longer than expected, and firms operating fully automated systems will remain in the minority for the foreseeable future. Respondents were predominantly from the buy side (36.58%) and sell side (20.62%), with a further 41.44% made up of infrastructure providers, vendors and others.
This is a brief summary to give an indication of major trends, as soon as the full results are available online we will post the link.
The key challenges people are currently facing include finding alpha in increasingly competitive markets, managing real-time risk and coping with an ever-increasing volume of data. Responses to these challenges include creating smarter rather than simply faster strategies, increased use of machine-learning algorithms and a quicker-than-expected uptake of FPGAs (more popular on the sell side) and GPUs (more likely to be used on the buy side, as they are slightly slower but more flexible).
Latency is still important, but people are moving away from latency arbitrage and concentrating on creating alpha that is profitable as long as you are ‘one of the fastest’; 35% of respondents fell into this category. That said, people are obviously still taking latency seriously, and colocation is expected to continue to increase – both to markets and to data suppliers.
Decreasing latency means increased volumes and more tick data, so everyone is having to deal with more data. Systems will have to change to meet these challenges, but there will also be more data to learn from – swings and roundabouts.
The complexity of systematic trading algorithms is also growing, with the mean number of parameters for an algo now at 70 and some having up to 1,000. With so many parameters it is increasingly hard to recalibrate algorithms ‘manually’, so genetic algorithms and adaptive learning are increasingly commonplace. Execution algos are less complicated, with an average of 20 parameters, though buy-side firms may use up to 25 different execution algos.
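As a toy illustration of what ‘recalibrating by genetic algorithm’ means, the sketch below evolves a population of 70-parameter vectors against a stand-in fitness function. In practice the fitness function would be a full backtest over historical data; everything here, names and numbers alike, is hypothetical.

```cpp
#include <algorithm>
#include <cstdio>
#include <random>
#include <vector>

// Hypothetical stand-in for a backtest: scores one parameter vector.
// In practice this would replay historical ticks through the strategy.
double backtestScore(const std::vector<double>& params) {
    double score = 0.0;
    for (double p : params) score -= (p - 1.0) * (p - 1.0);  // toy objective
    return score;
}

int main() {
    constexpr int kParams = 70;       // the mean parameter count cited above
    constexpr int kPopulation = 50;
    constexpr int kGenerations = 100;

    std::mt19937 rng{42};
    std::uniform_real_distribution<double> init{0.0, 2.0};
    std::normal_distribution<double> mutate{0.0, 0.1};

    // Random initial population of candidate parameter sets.
    std::vector<std::vector<double>> pop(kPopulation, std::vector<double>(kParams));
    for (auto& ind : pop)
        for (auto& p : ind) p = init(rng);

    for (int g = 0; g < kGenerations; ++g) {
        // Rank by fitness, best first.
        std::sort(pop.begin(), pop.end(), [](const auto& a, const auto& b) {
            return backtestScore(a) > backtestScore(b);
        });
        // Replace the worst half with mutated copies of the best half.
        for (int i = kPopulation / 2; i < kPopulation; ++i) {
            pop[i] = pop[i - kPopulation / 2];
            for (auto& p : pop[i]) p += mutate(rng);
        }
    }
    std::printf("best score: %f\n", backtestScore(pop.front()));
    return 0;
}
```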
One of the key areas of discussion was the potential effect of proposed regulation. Plenty of regulations and taxes have been proposed over the last few years, and the general consensus was that increased regulation would be unworkable; up to 62% of respondents said they would relocate if it got too onerous. 79% were against Dodd-Frank, 80% were against the financial transaction tax and 60% said that trading speed limits would be unworkable.
The financial headlines have not exactly been positive recently – the Greeks are broke and look increasingly unlikely to be able to pay back their vast debts, European banks are starting to falter and need to be saved, traders are losing billions of dollars, and UK/US banks are making redundancies and imposing hiring freezes… all in all, probably not a great time to be looking for a job in finance… or is it?
The majority of our clients are systematic/quant-driven hedge funds or proprietary trading firms. The financial volatility over the summer has been a blessing for models that thrive on such markets, and so they are continuing to produce strong returns. These profits are being reinvested in experienced traders (and there are a few people around questioning how good their banking bonuses will be this year), and more traders mean more technology and infrastructure needs.
The bar is still high for all our clients, and has probably been raised a little higher over the last few months as competition from the investment banks has faded. However, for traders with a proven track record, talented PhD graduates and top-percentile technologists (e-trading domain specialists and high-calibre non-finance software engineers), the market is as buoyant as it has been all year.