t3live


Archive for the ‘High Frequency Trading’ Category

WSJ Recognizes T3 in Ongoing HFT Debate


By: Brandon Rowley

Sean Hendelman has been a leading voice in the high frequency trading debate, notably in advocating the idea of an order cancellation tax. Today, The Wall Street Journal added Sean's thoughts on the causes of the Flash Crash to its lead article, "SEC Probes Canceled Trades":

One thing is clear, say traders and regulators: An eye-popping number of the stock quotes entered in the U.S. market’s exchange system are canceled.

For example, on Feb. 18, trading volume on the Nasdaq exchange totaled about 1.247 billion shares, according to data compiled by T3 Capital Management, a New York hedge fund. However, over the course of the same day traders submitted offers to buy or sell stock for roughly 89.704 billion shares. In other words, only 1% of the orders posted on Nasdaq actually traded.

While a portion of cancellations are part of the natural course of trading, Sean Hendelman, chief executive officer at T3, says he believes most of these canceled stock quotes are from traders loading up a stock’s computerized order book with essentially fake bids and offers.

Mr. Hendelman, whose firm employs other high-frequency trading tactics, says the practice creates an inaccurate picture of the true supply and demand for a stock. “People are relying on the [stock quote data] and the data is not real,” he says.

Sean has long argued that market participants are relying on liquidity that is not real, particularly in times of stress. Regulators placed their trust in HFT firms to provide liquidity to the markets but then stopped watching. While many HFT firms have no duty to provide liquidity, designated market makers (DMMs) and supplemental liquidity providers (SLPs) are offered particular exemptions in exchange for maintaining orderly markets. It is fair to say that, coupled with a structural mismatch between the 'slow market' mode initiated by the NYSE and the lightning speeds of ECNs, DMMs and SLPs failed in that duty during the Flash Crash.

High Frequency Trading Regulation Should Promote Deep Markets


By: Brandon Rowley

Senator Ted Kaufman has been a leader in critically assessing the impacts of high frequency trading on today’s financial markets. Kaufman wrote a letter to the SEC on August 5th highlighting key issues the Commission should review. Most notably, Kaufman argues that “regulation should aim not to facilitate narrow spreads with little size or depth of orders, but instead promote deep order books”. A deeper market will create greater stability and confidence for investors.

The purpose of equity markets

Kaufman starts his letter by restating what the Chairman of the SEC, Mary Schapiro, has said are the market's two primary functions. First, "capital formation, so companies can raise capital to invest, create jobs and grow" and second, "attracting and serving long-term investors to help facilitate the capital formation process". While this is clearly the underlying purpose of financial markets, traders have long served the role of making markets and providing liquidity to improve the efficiency of moving capital between stocks and markets.

Rethinking years of regulatory goals

For a long time now, accepted wisdom has been that regulators should aim for lower transaction costs by fostering competition between brokerages for commissions and by improving liquidity to narrow spreads. Yet, is it possible we have reached a point of diminishing returns?

Wider spreads with a large protected quote size on both sides may facilitate certainty of execution with predictable transparent costs. Narrow fluctuating spreads, on the other hand, with small protected size and thin markets, can mean just the opposite — and actual trading costs can be high, hidden and uncertain. Deep stable markets will bring back confidence, facilitate the capital formation function of the markets and diminish the current dependence on the dark pool concept. At a minimum, the Commission must carefully scrutinize and empirically challenge the mantra that investors are best served by narrow spreads. In reality, narrow spreads of small order size may be an illusion that masks a very ‘thin crust’ of liquidity (which leaves market vulnerable to another flash crash when markets fail their price discovery function only next time within the bound of circuit breakers) and difficult-to-measure price impacts (that might be harmful to the average investors and which diminish investor confidence). [emphasis mine]

The narrow spreads fallacy can be witnessed by any active market participant executing orders. Sean Hendelman and I previously proposed an order cancellation tax, which we believe would go a long way toward exposing the real, wider spread and making the bid and offer more dependable, and thus more efficient in the long run.

While a good number of studies purport to show that spreads have narrowed in the most liquid names, it is worth considering whether that narrowing has actually improved the price discovery process. Meanwhile, opponents of HFT argue that the minor narrowing of spreads in already liquid stocks is irrelevant and that the true problem is widening spreads in thinner securities. In that case, we have the worst of both worlds: wider spreads and less dependable quotes.

Explicit versus implicit costs of trading

Kaufman continues in his “Market Structure Solutions” attachment to the letter with nine wide-ranging, comprehensive suggestions, several of which I will discuss in future posts. On the spreads issue he succinctly summarizes his belief:

While some regulations might widen spreads and raise the explicit costs of trading, those outcomes alone should not disqualify such rules from being considered. Indeed, policies designed to protect large quote sizes on the bid and offer and to mandate or incentivize significant resting liquidity be provided at multiple price points would result in wider spreads, but also offer greater certainty of execution and make trading costs more predictable and transparent for investors. Simply put, it may be better for investors to pay the spread they can see than the price impacts they cannot see or effectively measure. [emphasis mine]

It is crucial to understand that the costs of trading include both the explicit upfront fees and the implicit expense of poor execution. Human traders have long recognized the difficulty of executing large orders: illusory bids and offers cancel at a moment's notice, while free-riding high frequency traders snatch up liquidity alongside them in hopes of profiting as the larger trader moves the price. The SEC should examine both costs and design regulation around reducing, or at least making transparent, total trading costs.

Discretionary Traders in the Brave New HFT World


By: Brandon Rowley

Last week's high frequency trading conference by the World Research Group helped me form a much greater understanding of HFT systems and the future of the industry. (My summaries of the event here: Day One & Day Two.) Much of the discussion focused on the race for the lowest latency, which I suspect was partly due to the high number of vendors in attendance and on panels. Yet actual practitioners reiterated multiple times that the clear victors in the end will be those with the most creative and innovative programming, uncovering and exploiting inefficiencies in the financial system. While hardware will always be important, it has received an out-sized share of the focus for the last couple of years, and T3 Capital's Sean Hendelman, in particular, does not believe that will continue. Software will reign supreme in the end, and the brightest computer scientists will garner the largest share of the profits. Jayaram Muthuswamy, Professor of Finance at Kent State University, believes we have only "scratched the surface" of HFT computational complexity and expects quantitative modeling to evolve dramatically in the coming years.

The Brave New HFT World
The takeover by the machines was overlooked by many active traders as HFT grew rapidly during the 2008 crash, because the human trader still had ample opportunity to profit in a wildly volatile equity market. As market conditions have dramatically changed over the last 18 months, the edge computers hold has wreaked havoc on the P&Ls of traders unwilling to adjust their strategies. In my opinion, dedicated scalpers are a dying breed. The most difficult part of the equation for a hyper-scalper is maintaining discipline and emotional control under an impulsive, rapid-fire strategy of buying and selling. Not only is the opponent now an unemotional black box, it is faster, much faster. Trades now happen in microseconds, far faster than the hundreds of milliseconds a human eye needs even to register a bid or offer appearing in the Level II. With that speed comes an inability for the human scalper to control downside risk, as positions move against them far too quickly. This of course causes a high degree of stress and blurs judgment. Scalpers have found themselves outmatched on speed, emotion and risk control as HFT has grown. For the most part, a scalper's edge has been stolen.

There was a decent amount of discussion during the conference about deciding the target latency to achieve through hardware investment. Latency matters only relative to what the strategy requires. HFT practitioners distinguish between strategies needing ultra low latency, those needing merely low latency, and those that are not as concerned. Regular traders often think of HFT simply as fast computers, yet there is a high degree of latency differentiation even within the HFT universe. The thought of beating these computers on speed is almost comical when they are busy discerning among themselves the varying levels of latency.

Also worth noting is the far more sophisticated manner in which a computer can rapidly and accurately assess risk and reward scenarios. Typically a trader will judge risk and reward based on perceived levels while watching a Level II or by gauging trading levels on a chart. Black boxes, by contrast, can instantly calculate risk in percentages rather than dollars, from the interaction of the bid and offer and the frequency with which bids are hit and offers are paid, in a manner far more sophisticated than a human daytrader attempting to measure trades through visual interpretation of the Level II and a chart.
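To make the contrast concrete, below is a minimal Python sketch, entirely illustrative and not any firm's actual system, of the kind of tally a black box can maintain in real time: classifying each print as a bid hit or an offer lifted and reducing the tape to a single aggression ratio. All names and figures are my own.

```python
# Illustrative only: a toy tape-reading tally, not any firm's production risk model.
from dataclasses import dataclass

@dataclass
class Print:
    price: float
    size: int
    bid: float   # best bid at the moment of the trade
    ask: float   # best ask at the moment of the trade

def buy_aggression(prints):
    """Share of traded volume that lifted the offer rather than hit the bid."""
    lifted = sum(p.size for p in prints if p.price >= p.ask)
    hit = sum(p.size for p in prints if p.price <= p.bid)
    total = lifted + hit
    return lifted / total if total else 0.5   # 0.5 means a balanced tape

tape = [
    Print(10.01, 500, 10.00, 10.01),   # offer lifted
    Print(10.00, 300, 10.00, 10.01),   # bid hit
    Print(10.01, 800, 10.00, 10.01),   # offer lifted
]
print(f"buy aggression: {buy_aggression(tape):.0%}")   # ~81% of volume lifted the offer
```

A machine runs this kind of bookkeeping continuously across thousands of symbols; a human watching the Level II can only approximate it.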

As I have stated in the past (here) HFT is requiring traders to become more sophisticated. The computers will win in the very short-term trading game, there is no doubt. That does not preclude discretionary traders from finding a way to be profitable. It simply forces traders to study, learn, adapt and evolve.

How to Survive & Prosper
Ultimately, not being an HFT programmer myself, the question is: how does the discretionary trader live in this brave new world? In a recent interview on Wall St. Cheat Sheet, Joe Schenk, President of First New York Securities, a prominent NYC-based proprietary trading firm, made his business model clear: "Contrary to popular belief, our business is proprietary trading not day trading. Though we may trade intra-day, we are not day traders." This is a very important distinction, and firms ahead of the curve have invested in HFT infrastructure while refocusing manual trading on strategies beyond the very short term.

Later in the interview with First New York came an excellent observation from Donald Motschwiller: "But the guys who truly trade the markets the best — the most talented guys in the firm — they trade the markets intuitively. They've seen it so many times and are so confident in the decision making process that they're not reacting." From my experience, this is absolutely true. The best traders have an unexplainable gut feel that they are in tune with, and they trust their decision-making process. Technical or fundamental analysis does not provide hard and fast rules; the rules only work within the context of the overall direction and movement of the tape. Fundamental buyers of financials in 2008 who ignored the downward momentum would have seen painful losses. Likewise, technicians highlighting a head and shoulders pattern in June 2009 failed to respect the incredibly strong bid that had entered the market. Intuition can certainly trump strictly quantitative strategies.

Simply put, you will not win in the quantitative space; your approach must be different. For traders to succeed in a highly quant-driven tape, they must develop a feel for the overall market and understand the ebb and flow of particular stocks and markets. Feel is very abstract, nearly impossible to teach, and for the most part only comes through years of experience. But there are two daily activities I believe traders can do to significantly shorten the learning curve. First, follow prices. Making a purposeful effort to memorize prices will allow you to contextualize any movement over time. This includes internalizing charts in order to know the history of prices. Second, read, read, read. The only way to understand the prevailing psychology is to gauge price reactions against headlines. There are many great financial blogs out there that help in determining broad sentiment.

In general, traders need to understand trend and volatility. Trading with the trend is more important than ever, as programs often exacerbate moves far beyond anticipated levels of support and resistance. Volatility is crucial in sizing possible reward scenarios. While reward is measured in pure dollar or percentage terms, traders must also appraise the probability of that reward coming to fruition. Lower-volatility periods yield lower returns and therefore require tighter stops.

Beyond developing feel, I believe traders are well served by studying fundamentals. Trading plans must be arranged well before the stock hits the buy or sell points. Most important for me is background research on the underlying companies. My best trades have always occurred when I have the greatest conviction in the idea, and that conviction is only gained by putting in-depth research into it. Holding stocks for longer periods of time will only be consistently profitable if you are correct about the motivating factors behind the buying or selling. While it is probably not necessary to know, for example, the long-term debt-to-capitalization ratio of a given firm, it is important to recognize catalysts and know their impact in order to swing trade effectively. With technical levels more fluid than ever before, the ability to hold through tumultuous volatility is only possible by intertwining fundamentals into the equation to maintain confidence in the trade.

At the end of the day, as argued by Muthuswamy, "so went the pit trader for the electronic trader, so will the quant human trader go for the robo trader." Admitting the inability to compete as a human is the first step; the second is finding a new method. There is huge opportunity in swing trading as volatility remains elevated, currently at 27%. High beta names have huge ranges on a daily basis. The keys to out-performance over the next few years will be staying in tune with the tape and generating fundamental conviction for your trades.
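For a rough sense of what that 27% figure implies day to day, here is a back-of-the-envelope sketch; the arithmetic is the standard square-root-of-time conversion, while the beta and price are hypothetical numbers of my own, not figures from the post.

```python
# Back-of-the-envelope sketch: translating an annualized volatility reading into
# an expected daily move and a volatility-scaled stop. The beta and price are
# hypothetical; this is an illustration, not a trading recommendation.
import math

def expected_daily_move(annual_vol: float, trading_days: int = 252) -> float:
    """Approximate one-sigma daily move implied by an annualized volatility."""
    return annual_vol / math.sqrt(trading_days)

index_vol = 0.27                        # the 27% volatility reading cited above
daily = expected_daily_move(index_vol)  # ~1.7% for the broad market
beta = 2.0                              # hypothetical high-beta name
stock_daily = beta * daily              # ~3.4% expected daily range

price = 50.00
print(f"index one-day move  ~ {daily:.2%}")
print(f"high-beta name move ~ {stock_daily:.2%}")
print(f"stop distance on a ${price:.2f} stock ~ ${price * stock_daily:.2f}")
```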

World Research Group Summit on HFT (Day 2)


By: Brandon Rowley

I attended the second day of the World Research Group’s Summit on “Buy Side Tech: High Frequency Trading” (day one summary here). Thursday was only a half day and I missed the first panel but below is another rundown of what I found to be interesting.

Data Centers
I caught the second panel of the day with Adam Honoré of Aite Group, LLC and Kevin McPartland of TABB Group. There was a good deal of discussion on the topic of co-location. The NYSE has taken the most political approach, as McPartland sees it, promising the same latency for all participants located within its Mahwah data center in New Jersey. The NYSE built the 400,000 square foot, two-story fortress last year to accommodate the burgeoning demand for proximity hosting of high frequency trading firms' servers. The NYSE has promised every firm within the complex a 70 microsecond latency, which it will accomplish by literally cutting all cable runs to the same length for each rack.
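A rough calculation shows why cable length is worth equalizing at these scales. Assuming signals propagate through fiber at roughly two-thirds the speed of light (a common rule of thumb, not an NYSE figure), every meter of fiber adds about five nanoseconds each way:

```python
# Rough illustration of propagation delay inside a data center, assuming
# signals travel through fiber at ~2/3 the speed of light in vacuum.
C_VACUUM = 299_792_458            # meters per second
V_FIBER = C_VACUUM * 2 / 3        # ~2e8 m/s, a common rule of thumb

def one_way_delay_ns(meters: float) -> float:
    return meters / V_FIBER * 1e9

for run in (10, 50, 100):         # hypothetical rack-to-matching-engine cable runs
    print(f"{run:>3} m of fiber ~ {one_way_delay_ns(run):.0f} ns one way")

# Against a promised 70-microsecond (70,000 ns) latency, a 90 m difference in
# cable length would be roughly a 450 ns edge -- small, but measurable.
```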

Neither Honoré nor McPartland sees any game-changing regulation coming on co-location. While many complain of the unfair advantage it provides in terms of ultra low latency, McPartland pointed out that disallowing exchanges from selling the space around the matching engine would only drive up real estate prices across the street, since proximity will continue to matter in terms of the distance data must travel.

While the NYSE has fully dedicated itself to the hardware side of the business with its $250 million investment in Mahwah, other venues such as BATS and Direct Edge do not have such data centers. Honoré saw the NYSE as making this major CAPEX decision in spite of its shareholders, because it marks a venture into an entirely unproven business model at this point. The investment requires significant upfront costs and substantial continuing upgrade and maintenance costs, and only time will tell whether the model is viable. Given current rental rates on space and future expectations, the project should be profitable. Yet the frenzied race for ultra low latency could slow, or even end entirely, should demand plateau.

Honoré postulated that, given the possibilities of co-location, geographic barriers have effectively been erased, and he expects to see high frequency firms arise across the globe to trade US markets.

Muthuswamy on Market Efficiency
The two-day conference ended with a speech by Jayaram Muthuswamy, Professor of Finance at Kent State University, titled “Does High Frequency Trading Make Markets More or Less Efficient?” Muthuswamy’s speech was the real reason I attended the second day and while he’s a PhD and talked over my head to some degree, I enjoyed the speech nonetheless.

Muthuswamy talked quite a bit about computational complexity. He believes the winners in the HFT space will ultimately be those who discover and exploit the most arcane relationships and patterns. He touched on the idea of efficiency in the equity markets and believes we have semi-strong efficiency. Strong-form efficiency does not exist, as is demonstrated explicitly in takeover situations, where a stock often sees a large burst to a higher level on the announcement and then trades flat near the bid price. Under theoretical strong-form efficiency, the information would already be in the price and the announcement of the bid would have no impact.

He briefly touched on the advantages and disadvantages of HFT. The advantages: enhanced connectivity between markets and assets, lower overall transaction costs, the encouragement of high levels of creativity, and more efficient markets as information is assimilated more rapidly than ever before. The disadvantages: glitches such as May 6th, the unfair player advantage by which HFT can hurt traditional participants (such as "flashing," which most venues have now banned), and the exacerbation of volatility.

While his list of advantages seems self-evident for the most part, I wonder how Muthuswamy would answer critics who claim May 6th was not necessarily the result of a glitch. The move away from a mandated liquidity provider, namely the specialist system, to liquidity providers that have no obligation and are often directional has many implications. Tradebot and Tradeworx, two of the largest HFT firms, admitted to shutting down their systems prior to the flash crash. In this scenario, we did not see a glitch but rather a conscious decision by the human operators of the machines to withdraw their liquidity precisely when it was needed most, at a time of elevated volatility. It would appear that designated market makers clearly failed in their role of maintaining orderly markets. The oft-repeated idea that HFT was not a negative influence because stocks bounced back just as fast as they fell is simply hogwash.

Ultimately, the question for the human is: where does the discretionary trader fit into this brave new world? My thoughts on this coming up.

Written by t3live

June 12, 2010 at 11:23 am

Where is HFT Headed? World Research Group Summit


By: Brandon Rowley

I spent the afternoon off the trading floor attending the Buy Side Tech: High Frequency Trading Summit put on by the World Research Group. Our CEO and Managing Partner of T3 Capital Management, Sean Hendelman, was featured on two panels discussing a variety of aspects of high frequency trading development and the future of HFT.

I arrived late after trading the morning, with the panel discussion on "Assessing the Growth of High Frequency Trading in Futures, Options, and FX" already in progress, but heard a few interesting tidbits worth repeating. One panelist noted that speed-based HFT strategies now have so many competitors in US equities and options that major profit potential from the latency game alone is increasingly limited. The future of the industry lies primarily in trading across asset classes and across global markets. Global currencies offer significant arbitrage opportunities, as the landscape is still quite fragmented, with many different exchanges around the world. For example, there are only four fiber-optic lines running from London to New York; being on the right line with the highest speed is crucial, and speed arb is still possible in that space.

Another highlight was the idea of the high frequency trading style becoming a strategy into which investors would want to put their money within a portfolio. Like the desire for exposure to various asset classes within a portfolio, it is feasible to think that in the future there could be a demand for an exchange-traded fund that invests in HFT strategies. There is certainly a natural attraction for portfolio managers to a non-economic, no-risk strategy.

The last interesting comment I heard came from Mark S. Longo of theoptionsinsider.com, who noted the distortive impact of non-economic volume, particularly in the options market. While many market participants use "unusual" trading activity in options as an indicator of possible future movement, many arbitrage strategies in the HFT universe may be creating false signals. Dividend arb and fee arb are two examples that can cause a surge in volume that is entirely non-directional in strategy. He finished by noting that investors would be wise to look at open interest and other readings alongside unusual trading volumes.

The next panel discussion was entitled “Build or Buy? Strategies for Determining the First Step in Implementation” and Sean was one of four participants with another very intriguing guest being Adam Afshar, President of Hyde Park Global Investments. Hyde Park creates robotic artificial intelligence programs designed to self-adjust their trading to optimize results, a fascinating concept to begin with.

The primary thrust of this discussion came from Sean and Afshar telling interested observers to focus on the strategy before considering the build-or-buy decision. The broad consensus among panelists was that buying is the best option when starting out, because of time to market and lack of in-house expertise, but that this will likely grow into a hybrid operation over time as firms demand greater control over their data and execution. By Afshar's estimate, it would take a firm $5-10 million to fully set up low-latency execution in-house, involving direct feeds, co-location, etc. Yet Sean stressed how important it is to have a strategy that works first: despite the seemingly common opinion that HFT shops simply set up and make money, it is actually extraordinarily difficult to find profitable strategies; the vast majority fail, and the ones that do work can go out of favor very rapidly.

Next up was a half-hour presentation by Matt Samelson of Woodbine Associates titled "The Impact of High Frequency Strategies on Spreads and Volatility on Highly Liquid U.S. Equities". The presentation summarized the $3,750 "ground-breaking study" available from the firm, with the basic argument that spreads tightened in two-thirds of the 39 most liquid stocks throughout 2008-09 and that the "traditional" market participant is therefore better off, not worse off, as HFT has grown as a share of trading volume. While this presentation purported to defend HFT against attacks, it accomplished nothing in terms of engaging the contemporary debate. While T3 Capital runs HFT strategies and welcomes a defense against much of the misinformation out there, this presentation was sorely lacking, working only to regurgitate old arguments. It's as if a philosophy professor taught Descartes' Meditations and simply didn't bother to acknowledge the circularity objection to the "I think, therefore I am" argument (even though I don't believe this is a crippling refutation, but that's another discussion). The current debate has moved far beyond his presentation.

The anti-HFT crowd believes traditional market participants are being disadvantaged in illiquid securities, not liquid ones. And, not to join in the stereotyping, most of the "anti-HFT" crowd are not anti-HFT per se. They are against strategies they believe hurt the retail trader and investor, and there is clearly merit to many of their thoughts on the problems with the current market structure. The primary issue within the liquid-security universe is not the spread but the overall true trading cost, as the ECN fee structure encourages an absurd level of liquidity provision in stocks that need none. Samelson also explained a curious "realized spread" statistic in which the firm measured where a stock was trading five minutes after an HFT trade and claimed that the majority of the time the stock had moved in favor of the counterparty to the trade. The five-minute window is highly arbitrary and largely irrelevant in the HFT world, especially for rebate traders who may have been in and out of the stock multiple times before five minutes have elapsed.
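For readers unfamiliar with the metric, here is a minimal sketch of the realized-spread style of measurement described above, as I understand it; the sign convention, horizon and numbers are illustrative and are not Woodbine's actual methodology or data.

```python
# Sketch of a "realized spread" measurement: compare the trade price to the
# quote midpoint some fixed interval later (five minutes in the study above).
# Sign convention and figures are illustrative, not Woodbine's methodology.
def realized_spread(trade_price: float, side: int, mid_later: float) -> float:
    """side = +1 for a buyer-initiated trade, -1 for seller-initiated.
    Positive: the liquidity provider kept part of the quoted spread.
    Negative: the price moved in the liquidity taker's favor."""
    return 2 * side * (trade_price - mid_later)

# Hypothetical buyer-initiated fill at 20.01; midpoint five minutes later is 20.03.
print(round(realized_spread(20.01, +1, 20.03), 2))   # -0.04: adverse to the provider
```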

The second panel discussion with Sean was titled "The Drive for Zero Latency: Optimizing Existing Systems for High Frequency Trading Strategies," moderated by Jayaram Muthuswamy, Professor of Finance at Kent State University. Muthuswamy offered a stimulating academic perspective on the current race for low latency. While the speed of light as the ultimate limiting factor was mentioned, the discussion shifted toward the various areas where latency can be reduced: not simply the distance to the exchange and speed of execution, but also the speed of incoming quotes and their interpretation, the complexity of the algorithm code and its decision-making speed, routing speeds, and regulatory hindrances, among other things. Sean also pointed out the importance of low latency during times of market stress, when inefficiencies are high and HFT can find substantial opportunity. These volatile times, like an FOMC announcement, act as a perfect test of an HFT system's capabilities.

Ultimately, Muthuswamy finds HFT to be a source of incredible progress. He maintains this belief even while citing an email from his friend Eugene Fama, often regarded as the father of the efficient market hypothesis, in which Fama summarized his view of the growth in HFT in one line: "excessive HFT can be deleterious to market efficiency". Yet even with the incredible growth, Muthuswamy believes we are only "scratching the surface" of what is possible, particularly in statistical arbitrage between any and every market and asset class around the world. He hinted that in his speech tomorrow he will hypothesize what the ultra low latency game will ultimately depend on: the complexity of the underlying code.

The World of High Frequency Trading


By: Brandon Rowley

With all of the recent interest in high frequency trading, I put together the chart below explaining what we see as the six primary strategies of buy-side short-term algorithmic traders. This chart excludes the subset of algorithmic trading dedicated to the execution of buy-side funds with longer-term interests. These six strategies are what short-term traders contend with on a daily basis and understanding their methods is useful.

http://d1.scribdassets.com/ScribdViewer.swf?document_id=30295187&access_key=key-226lg4zo036orzmn202h&page=1&viewMode=slideshow

Written by t3live

April 21, 2010 at 5:14 pm

Sen. Ted Kaufman Presents T3Live's Idea to the Senate


Senator Ted Kaufman picked up on T3Live's idea regarding order cancellations. He recognizes the manipulation problems they present and has introduced the idea to the Senate.

http://www.c-spanvideo.org/videoLibrary/assets/swf/CSPANPlayer.swf
Skip to 164:00 for Ted Kaufman’s Speech, 178:00 for mention of T3Live

Concerned about Stock Market Abuses? Join the Club
by: Senator Ted Kaufman

I have spoken on the Senate floor many times about the importance of transparency in our markets. Without transparency, there is little hope for effective regulation. And without effective regulation, the very credibility of our markets is threatened.

But I am concerned recent changes in our markets have outpaced regulatory understanding and, accordingly, pose a threat to the stability and credibility of our equities markets. Chief among these is high frequency trading.

Over the past few years, the daily volume of stocks trading in microseconds — the hallmark of high frequency trading — has exploded from 30 to 70 percent of the U.S. market. Money and talent are surging into a high frequency trading industry that is red hot, expanding daily into other financial markets not just in the United States but in global capital markets as well.

High frequency trading strategies are pervasive on today’s Wall Street, which is fixated on short-term trading profits. Thus far, our regulators have been unable to shed much light on these opaque and dark markets, in part because of their limited understanding of the various types of high frequency trading strategies. Needless to say, I’m very worried about that.

Last year, I felt a little lonely raising these concerns. But this year, I’m starting to have plenty of company.

On January 13th, the Securities and Exchange Commission issued a 74-page concept release to solicit comments on a wide-range of market structure issues. The document raised a number of important questions about the current state of our equities markets, including: “Does implementation of a specific [high frequency trading] strategy benefit or harm market structure performance and the interests of long-term investors?”

The SEC also called attention to trading strategies that are potentially manipulative, including momentum ignition strategies in which “the proprietary firm may initiate a series of orders and trades (along with perhaps spreading false rumors in the marketplace) in an attempt to ignite a rapid price move either up or down.”

The SEC went on to ask, “Does…the speed of trading and ability to generate a large amount of orders across multiple trading centers render this type of strategy more of a problem today?”

The SEC raised many critical questions in its concept release, and I appreciate that the SEC is trying to undertake a baseline review. As its comment period moves forward, I am pleased to report that other regulators and market participants, both at home and abroad, have taken notice of the global equity markets’ recent changes, including the rise in high frequency trading.

In the United States, the Federal Reserve Bank of Chicago, in the March 2010 issue of its Chicago Fed Letter, argued that the rise of high frequency trading constitutes a systemic risk, asserting, “The high frequency trading environment has the potential to generate errors and losses at a speed and magnitude far greater than that in a floor or screen-based trading environment.” In other words, high frequency trading firms are currently locked into a technological arms race that may result in some big disasters.

Citing a number of instances in which trading errors have occurred, the Chicago Fed stated that “a major issue for regulators and policymakers is the extent to which high frequency trading, unfiltered sponsored access, and co-location amplify risks, including systemic risk, by increasing the speed at which trading errors or fraudulent trades can occur.”

Moreover, the letter cautions us about the potential for future high frequency trading errors, arguing, “Although algorithmic trading errors have occurred, we likely have not yet seen the full breadth, magnitude, and speed with which they can be generated.”

There is action internationally as well. On February 4th, Great Britain’s Financial Services Secretary, Paul Myners, announced that British regulators were also conducting an ongoing examination of high frequency trading practices, stating, “People are coming to me, both market users and intermediaries, saying that they have concerns about high frequency trading.”

This development comes on the heels of another British effort targeting so-called “spoofing” or “layering” strategies in which traders feign interest in buying or selling a stock in order to manipulate its price. In order to deter such trading practices, the Financial Services Authority (FSA) announced that it would fine or suspend participants who engage in market manipulation. Noting that some market participants may not be sure that spoofing or layering is wrong, an FSA spokeswoman said, “This is to clarify that it is.”

In Australia, market participants are also requesting clearer definitions of market manipulation, particularly with regard to momentum strategies like spoofing. In a review of algorithmic trading published February 8th, the Australian Securities Exchange called on its regulators to, “Ensure that…market manipulation provisions…are adequately drafted to capture contemporary forms of trading and provide a more granular definition of market manipulation.”

It is critical our regulators understand the risks posed by high frequency trading, both in terms of manipulation and on a systemic level. As the Chicago Fed stated, the threat of an algorithmic trading error wreaking havoc on our equities markets is only magnified by so-called “naked,” or unfiltered sponsored access arrangements, which allow traders to interact on markets directly — without being subject to standard pre-trade filters or risk controls.

Robert Colby, the former deputy director of the SEC’s Division of Trading and Markets, warned last September that naked access leaves the marketplace vulnerable to faulty algorithms. In a speech given at a forum on the future of high frequency trading, which was cited by the Chicago Federal Reserve’s recent letter, Mr. Colby stated that hundreds of thousands of trades representing billions of dollars could occur in the two minutes it could take for a broker-dealer to cancel an erroneous order executed through naked access.

According to a report released December 14th by the research firm Aite Group, naked access now accounts for a staggering 38% of the market’s average daily volume compared to 9% only four years ago.

Let me reiterate that: almost forty percent of the market’s volume is executed by high frequency traders interacting directly on exchanges without being subject to any pre-trade risk monitoring.

In January, the SEC acted to address this ominous trend by proposing mandatory pre-trade risk checks for those participating in sponsored access arrangements. This move would essentially eliminate naked access, and I applaud the SEC for its proposal.

While I am pleased that the SEC has taken on naked access and has issued a concept release on market structure issues, there is much work that still needs to be done in order to gain a better understanding of high frequency trading strategies and the risks of frontrunning and manipulation they may create. In the last few months, several industry studies aimed at defining the benefits and drawbacks of high frequency trading have emerged. While these studies may not be the equivalent of peer-reviewed academic studies, they do have the credibility of real-world market experts. And they begin to shed light on the opaque and largely unregulated high frequency trading strategies that dominate today’s marketplace.

In addition to the Aite Group study, reports by the research firm, Quantitative Services Group (QSG), the investment banking firm, Jefferies Company, and the institutional brokerage firm, Themis Trading, all raise troubling concerns about the costs of high frequency trading to investors and reinforce the need for enhanced regulatory oversight of these trading practices.

Last November, QSG analyzed the degree to which orders placed by institutional investors are vulnerable to high frequency predatory traders who sniff out large orders and trade ahead of them.

Specifically, the study concluded investors placing large orders risk, “leaving a statistical footprint that can be exploited by the ‘tape reading’ HFT algorithms.” While traders have long tried to trade ahead of large institutional orders, they now have the technology and models to make an exact science out of it.

In a study put forth on November 3rd, the Jefferies Company estimated high frequency traders gain a 100 to 200 millisecond advantage by co-locating their computer servers next to exchanges and subscribing directly to market data feeds. As a result, Jefferies concludes, high frequency traders enjoy, “(almost) risk-free arbitrage opportunities.”

A Themis Trading white paper released in December elaborated on Jefferies’ conclusion, asserting that high frequency traders, “know with near certainty what the market will be milliseconds ahead of everybody else.”

The studies and papers I have mentioned underscore the need for the SEC to implement stricter reporting and disclosure requirements for high frequency traders under its “large trader” authority, as Chairman Mary Schapiro promised she would in a letter to me on December 3rd. We need tagging of high frequency trading orders and next day disclosure to the regulators, and we need it now.

For investors to have confidence in the credibility of our markets, regulators must vigorously pursue a robust framework that maintains strong, fair and transparent markets. I would make five points along these lines.

First, the regulators must get back in the business of providing guidance to market participants on acceptable trading practices and strategies. While the formal rule-making process is a critical component of any robust regulatory framework, so too are timely guidelines that bring clarity and stability to the marketplace. Co-location, flash orders and naked access are just a few practices that seem to have entered the market and become fairly widespread before being subject to proper regulatory scrutiny. For our markets to be credible, it is vital that regulators be pro-active, rather than reactive, when future developments arise.

Second, the SEC must gain a better understanding of current trading strategies by using its “large trader” authority to gather data on high frequency trading activity. Just as importantly, this data – once masked – should be made available to the public for others to analyze.

I am concerned that academics and other independent market analysts do not have access to the data they need to conduct empirical studies on the questions raised by the SEC in its concept release. Absent such data, the ongoing market structure review predictably will receive mainly self-serving comments from high frequency traders themselves and from other market participants who compete for high frequency volume and market share.

Evidence-based rule-making should not be a one-way ratchet because all the “evidence” is provided by those whom the SEC is charged with regulating. We need the SEC to require tagging and disclosure of high frequency trades so that objective and independent analysts — at FINRA, in academia or elsewhere — are given the opportunity to study and discern what effects high frequency trading strategies have on long-term investors; they can also help determine which strategies should be considered manipulative.

Third, regulators must better define manipulative activity and provide clear guidance for traders to follow, just as Britain’s regulators have done in the area of spoofing. By providing “rules of the road,” regulators can create a system better able to prevent and prosecute manipulative activity.

Fourth, the SEC must continue to make reducing systemic and operational risk a top regulatory priority. The SEC’s proposal on naked access is a good first step, but exchanges must also be directed to impose universal pre-trade risk checks. If left solely in the hands of individual broker-dealers, a race to the bottom might ensue. We simply must have a level playing field when it comes to risk management that protects our equities markets from fat fingers or faulty algorithms. Regulators must therefore ensure that firms have appropriate operational risk controls to minimize the incidence and magnitude of such errors while also preventing a tidal wave of copycat strategies from potentially wreaking havoc in our equities markets.

Fifth, the SEC should act to address the burgeoning number of order cancellations in the equities markets. While cancellations are not inherently bad – potentially enhancing liquidity by affording automated traders greater flexibility when posting quotes – their use in today’s marketplace is clearly excessive and virtually a prima facie case that battles between competing algorithms, which use cancelled orders as feints and indications of misdirection, have become all-too-commonplace, overloading the system and regulators alike.

According to the high frequency trading firm T3Live, on a recent trading day, only 1.247 billion of the 89.704 billion orders on Nasdaq’s book were executed – meaning a whopping 98.6% of the total bids and offers were not filled. Cancellations by high frequency traders, according to T3Live, were responsible for the bulk of these unfilled orders.

The high frequency traders that create such massive cancellation rates might cause market data costs for investors to rise, make the price discovery process less efficient and complicate the regulators’ understanding of continuously evolving trading strategies. What’s more, some manipulative strategies, including layering, rely on the ability to rapidly cancel orders in order to profit from changes in price.

Perhaps excessive cancellation rates should carry a charge. If traders exceed a specified ratio of cancellations to orders, it’s only fair that they pay a fee. The ratio could be set high enough so that it would not affect long-term investors (even day traders), and should apply to all trading platforms, including dark pools and ATSs as well as exchanges.

The high-frequency traders who rely on massive cancellations are using up more bandwidth and putting more stress on the data centers. Attempts to rein in cancellations or impose charges are not without precedent. In fact, they have already been implemented in derivatives markets, where overall volume is a small fraction of the volume in the cash market for stocks. The Chicago Mercantile Exchange's volume ratio test and the London International Financial Futures and Options Exchange's bandwidth usage policy both represent attempts to rein in excessive cancellations and might provide a helpful model for regulators wishing to do the same.

Finally, the high frequency trading industry must come to the table and play a constructive role in resolving current issues in the marketplace, including preventing manipulation and managing risk. In order to maintain fair and transparent markets and avoid unintended consequences, market participants from across the industry must contribute to the regulatory process. I am pleased that a number of responsible firms are stepping forward in a constructive way, both in educating the SEC and me and my staff. I look forward to continuing to work with these industry players.

We all must work together, in the interests of liquidity, efficiency, transparency and fairness to ensure our markets are the strongest and best-regulated in the world. But we cannot have one without the other – for markets to be strong, they must be well-regulated. So with this reality in mind, I look forward to working with my colleagues, regulatory agencies, and people from across the financial industry to ensure our markets are free, credible and the envy of the world.

Written by t3live

March 4, 2010 at 6:51 pm

Structural Changes in US Equity Markets


By: Brandon Rowley

Knight Capital Group recently commissioned a study entitled "Equity Trading in the 21st Century" that explores the major structural changes the markets have seen so far this century and the impact they have had on investors and traders. I will not offer an opinion or recommendation but rather just summarize the transformations the authors present. The professors who authored the paper offer well-argued perspectives, and it is worth the read if you are interested in the regulatory angle. Based on previous interest in our articles on high frequency trading, it seems many would find it worthwhile to examine the changes that have fostered a system where HFT dominates the tape, for better or worse. So, how has the market changed?

Displayed Depth Has Increased
Within six cents of the national best bid and offer, the depth of the book has increased dramatically. From 2003 to 2009, depth appears to have averaged around 2,000 shares on the bid and offer for S&P 500 companies. In 2009 alone, however, displayed quote volume leaped from a low of roughly 1,500 shares to over 4,500, a 200% increase. The displayed depth, to be sure, is not necessarily indicative of actual executed volume. We argued in an earlier piece that this is largely a result of HFTs posting bids and offers they intend to cancel without a fill, such that the "real" depth is much less than what is displayed. Note: including stocks outside the S&P 500 on the basis of nominal share sizes is misleading; we would rather see those stocks on their own graphs, or on a logarithmic scale, to interpret them properly.

Average NYSE Trade Size Has Consistently Fallen
Average trade size on the New York Stock Exchange has fallen consistently over the last six years. At the beginning of 2004 the average trade was over 700 shares; it is now slightly over 300 shares. The hyper-growth in automated trading strategies has helped cut average trade size by more than half. Large-scale investors have used algorithms to execute orders in an effort to hide their intentions and reduce price impact. Splitting large orders into smaller and smaller pieces minimizes the impact of any single order on the market and reduces other traders' ability to capitalize on inefficient order flow. Many scalp-style active traders have found order flow harder and harder to read, because falling transaction costs have allowed institutions to invest heavily in algorithm development to hide their intentions while executing orders.

Average Quotes per Minute Skyrocket, Then Return to Earth
Average quotes per minute skyrocketed in the Panic of 2008, when markets became the most volatile they had been in decades. Quotes per minute went from below 50 to over 500 in five years before falling back to more normalized levels as volatility subsided. Quotes per minute are now around 200, still roughly a 900% increase over the last six years.

Cancel-to-Execution Ratio Steadily Rising
Early 2002 saw a cancellation-to-execution ratio of ten to one, but that ratio has climbed steadily over the years. Now the Nasdaq sees 30 cancels for every execution, according to Knight Capital Group's research. This rising ratio, together with the upsurge in average quotes per minute, is consistent with a market that has become HFT-dominated, with greater total volumes coupled with lower average trade sizes.

NYSE to Become a Museum
Volume on the New York Stock Exchange has collapsed, and it seems likely that within a decade or so the NYSE floor will be little more than a museum. Regulation National Market System (Reg NMS), adopted in 2005, opened the NYSE up to increased competition. At the beginning of 2003, the NYSE executed 80% of total volume. That share has plummeted as electronic communication networks (ECNs) gobbled up market share: the NYSE excluding ARCA accounted for only about 25% of volume by the end of 2009, roughly a 70% decline in market share. NYSE Euronext has wisely invested in ARCA, so the company is far from doomed, but the physical floor will likely be an exhibit in the not-too-distant future.

All these charts are consistent with the growth in high frequency trading as a means of increasing efficiency, making the execution side of the business largely computer-driven. The days of the floor broker are rapidly disappearing. Automated trading is here to stay and participants need to learn how to adapt.

All charts are excerpted from "Equity Trading in the 21st Century," commissioned by Knight Capital Group, authored by professors James Angel of Georgetown University, Larry Harris of the University of Southern California, and Chester Spatt of Carnegie Mellon University, and published on February 23, 2010.

Written by t3live

March 2, 2010 at 9:00 pm

A Proposal for an Order Cancellation Tax


By: Sean Hendelman and Brandon Rowley

The mechanics of the equity markets are under constant evolution with the consistent goals over time of increasing speed of execution and advancing price discovery. Participants in the stock market have benefited from increased competition among exchanges and brokerage houses that has worked to greatly reduce commission costs and drastically cut execution latency. But, the major changes seen in the last decade have not come without drawbacks. The prevalence of disingenuous quoting on the visible book is extremely high and we propose an order cancellation tax to remedy this detriment to the financial markets.

Recently, there has been a great deal of talk in Congress about instituting a tax on transactions, the so-called "trader tax". The goal is to raise needed revenue for the federal government and to force Wall Street to pay for the losses the government incurred in supporting systemically important financial institutions. Yet there is clearly a problem: a large portion of the tax would fall on everyday investors as transaction costs rise for the mutual funds and pension funds holding the bulk of Main Street's savings. The transaction tax, while impacting Wall Street to some degree, causes massive collateral damage by raising costs on retail investors. We are not necessarily advocating increasing taxes; rather, we are operating under the assumption that the government needs to raise revenue and wants to do it by taxing Wall Street. To that end, there is a more effective alternative to the "trader tax" that also has the positive externality of greater price transparency.

While most retail investors probably have little concept of what high frequency trading is or of its impacts, active equity traders have seen its pronounced imprint on the markets. High frequency trading is an entirely legitimate strategy of using computer algorithms to execute trading strategies with ultra-low latency. Yet the explosion in HFT has led to a major structural flaw in equity markets: the abuse of uncharged bidding and offering for shares. Level II traders know exactly what this is, as they see it day after day in every stock they trade. The book of bids and offers is supposed to be a top-to-bottom list of the prices at which every player in the market is willing to buy and sell a stock. In this idealized world, there is price transparency, as everyone can see who wants to buy and who wants to sell should participants choose to place limit orders. The price at any given second is then an accurate reflection of the current supply and demand for shares (ignoring the use of dark pools, hidden orders, etc.). Limit orders are meant to show an explicit intention to buy or sell shares at a predetermined price. Should a trader not want to show his hand, he can execute market orders or use reserve orders. Yet the book no longer acts in accordance with this idealized world.

Every single listed stock's order book is filled with false bids and false offers. These limit orders are constantly used to manipulate prices back and forth to the HFT's advantage. Nearly every higher-volume, lower-priced stock has a book stacked with offers and bids at nearly every penny increment, but the vast majority of these quotes are fake. The HFTs submitting the bulk of these orders do not have the objective of being filled; the purpose is to manipulate the price in some way. This is clearly a deceptive practice occurring in nearly every stock in the current hybrid and fully electronic markets. The high frequency trader has the explicit goal of tricking other traders into believing there is something real there when there is not. Bidding and offering without the intention of actually filling the order is nothing more than a mechanism to mislead other traders. This game, as played by HFTs, is an obscenely inefficient allocation of resources.

After a quick study with our internal systems, we recorded data on the Nasdaq book for Thursday, February 18, 2010. Volume executed off the visible book was 1.247 billion shares (excludes special block prints, hidden liquidity, opening/closing crossing, etc.). Throughout the day, the Nasdaq book showed a total bidded and offered volume of 89.704 billion shares. That means bids and offers in the amount of 88.457 billion shares were put on the book and subsequently cancelled without being filled. These statistics show that only 1% of the total visible order volume is actually executed. Put another way, 99% of the bids and offers placed on the book go unfilled for one reason or another. We would argue that the largest share of this volume can be attributed to HFT stacking the book to move stock prices for their advantage. This is nothing more than trickery and falls outside the spirit of the laws.
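For reference, the arithmetic behind these figures is simple enough to restate directly; the short sketch below is nothing more than a recomputation of the numbers quoted above.

```python
# Restating the February 18, 2010 Nasdaq figures cited above.
posted = 89.704e9       # shares bid or offered on the visible book during the day
executed = 1.247e9      # shares actually executed off the visible book

cancelled = posted - executed
fill_rate = executed / posted

print(f"cancelled without a fill: {cancelled / 1e9:.3f} billion shares")   # ~88.457
print(f"fill rate: {fill_rate:.2%}")                                       # ~1.39%
print(f"unfilled share of posted volume: {1 - fill_rate:.2%}")             # ~98.61%
```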

Now, it would be far too extreme to ban order cancellation altogether. Traders and investors need the ability to cancel orders should they change their minds, and part of the 99% is precisely that. That flexibility should not be hindered, but there should be a disincentive for cancelling orders given the obvious current abuse. Even a very marginal tax would significantly curtail the activity while hardly impacting Main Street. In a world where taxes must be raised and Wall Street should be taxed for bailout funds, a tax on order cancellations is a clear choice.
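As a purely illustrative sketch of how such a charge could remain marginal for ordinary investors while biting on industrial-scale cancellation, consider a per-cancellation fee with a daily free allowance. The allowance and rate below are hypothetical placeholders, not figures from our proposal.

```python
# Hypothetical sketch of a per-cancellation charge with a daily free allowance.
# The allowance and rate are illustrative placeholders, not proposed figures.
def cancellation_charge(cancels_per_day: int,
                        free_allowance: int = 100,
                        rate_per_cancel: float = 0.0001) -> float:
    """Charge only cancellations beyond the daily free allowance."""
    taxable = max(0, cancels_per_day - free_allowance)
    return taxable * rate_per_cancel

print(f"retail investor, 20 cancels:  ${cancellation_charge(20):,.2f}")         # $0.00
print(f"active trader, 500 cancels:   ${cancellation_charge(500):,.2f}")        # $0.04
print(f"HFT desk, 5,000,000 cancels:  ${cancellation_charge(5_000_000):,.2f}")  # $499.99
```

The point of the structure, whatever the exact parameters, is the asymmetry: the cost is effectively zero for anyone cancelling a handful of orders and grows linearly for anyone cancelling millions per day.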

Taxing order cancellation has various advantages and limited disadvantages. HFT encompasses many valid strategies based primarily on speed of execution and, in general, it has benefited the market in terms of price discovery. But the use of intentional trickery to manipulate prices falls well outside the spirit of current regulation. There is no need to enter and cancel hundreds of millions of orders a day unless the HFT is profiting from the results. Taxing the cancellation of orders will work to curb the deception practiced by HFTs. The goal is not to eliminate cancellations but to make the cost prohibitive for HFTs entering and cancelling millions of quotes every single day. A key positive externality of reducing order cancellations is greater price transparency: the goal of any market is a price that accurately reflects current supply and demand. The tax would also fall largely on Wall Street firms practicing HFT. Retail investors would be charged for cancelling limit orders, but the impact would be negligible compared with a general tax on all transactions. By enacting an order cancellation tax, the government can raise revenue, tax Wall Street primarily without hurting average investors, and advance price transparency.

Sean Hendelman is CEO of T3Live and heads the automated/high frequency trading division. Brandon Rowley is an equity trader with T3Live.

Written by t3live

February 19, 2010 at 4:12 pm

HFT Forcing Traders to Become More Sophisticated


By: Brandon Rowley

In recent years, high frequency trading (HFT) has become a buzzword even in the mainstream media. HFT has been cited as a contributing cause of many developments, including the velocity of the 2008 crash and the duration of the 2009 rally. Active traders perhaps did not see the effects of HFT on their trading results in 2008 because volumes were highly elevated as panic set in. However, 2009 was a year of struggle for many in the active trading industry, and I believe a significant catalyst behind this struggle has been the increased presence of HFT.

While high frequency trading is highly profitable for the companies developing the algorithms, few understand how it works in any detail. HFT shops are filled with mathematics and computer science Ph.D.s writing sophisticated algorithms that few outside the industry can comprehend. A July 2009 report from TABB Group, a financial markets research and strategic advisory firm focused on capital markets, claimed that while HFT firms represent only 2% of the roughly 20,000 trading firms in the US, they account for 73% of all US equity trading volume. HFT firms are a force to be reckoned with and a reality in today's markets.

What is high frequency trading?
High frequency trading, also known as algorithmic or black-box trading, is the use of computer programs to execute trading strategies. The program is written such that all decisions about time of entry, price and quantity are pre-defined and executed without human intervention. Algorithmic trading has long been used by buy-side institutional investors to execute large orders effectively while minimizing their price impact. Rebate algos have also been around for a while, seeking to provide liquidity and capture the rebates paid by ECNs; active traders see these algos constantly creating the bid and offer in high-volume large cap stocks. More recently, though, there has been an explosion in predatory algorithm development. Predatory algos attempt to detect larger players in the market and front-run their orders.
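To make "pre-defined and executed without human intervention" concrete, here is a deliberately toy sketch of what a fully specified rule looks like. Every threshold and size is invented for illustration and bears no relation to any real strategy.

```python
# Toy illustration of a fully pre-defined trading rule: entry, size and exit
# are decided by code alone, with no human in the loop. Purely illustrative.
def decide(last_price: float, moving_avg: float, position: int) -> tuple:
    """Return (action, quantity) from pre-defined conditions alone."""
    edge = (last_price - moving_avg) / moving_avg
    if position == 0 and edge < -0.002:   # price 0.2% below its average: buy
        return ("buy", 100)
    if position > 0 and edge > 0.001:     # price 0.1% above its average: exit
        return ("sell", position)
    return ("hold", 0)

print(decide(last_price=24.95, moving_avg=25.02, position=0))     # ('buy', 100)
print(decide(last_price=25.06, moving_avg=25.02, position=100))   # ('sell', 100)
```

Real HFT systems are vastly more complex, but the principle is the same: every contingency is decided in advance and executed at machine speed.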

While HFT has increased liquidity and tightened spreads in large cap names, small cap companies have actually seen spreads widen, according to NYSE Arca data. Investment Technology Group measured trading costs for small-cap stocks as 40% higher in Q2 2009 than in Q1 2008, reflecting the difficulty of getting trades processed and rising commission costs. Joseph Saluzzi of Themis Trading, a lone voice in bringing HFT to light, has argued that this is because of predatory algos.

With HFTs as dominant players in the market, active traders need to learn as much as they can about them. First, high frequency trading strategies are highly dependent on ultra-low latency; many shops co-locate their servers at the exchanges to achieve the fastest possible execution. Second, the code is under constant evolution because of exceptionally high competition among participants, and the micro-precision of the strategies means any one of them may be effective for only days at a time.

So, what is the active trader to do?
Clearly, any strategy whose edge is based on speed is out the window with the increase in high frequency trading. While the active trader used to front-run the orders of institutional desks that were inefficient in execution, now even the small trader's order is front-run by the computer algorithm. Every human trader is now the inefficiency, with their slower execution. Entering and exiting stocks will also be tougher: any active trader is used to seeing his order front-run immediately as he shows his bid or offer, making it more difficult to get a fill. There is a high likelihood that active traders must get used to paying an added toll to HFTs for entering and exiting positions.

The trading business is forever changing, that much we know for sure. Level II strategies based on speed of execution are certainly in decline, so active human traders must become more sophisticated. First, minimize the impact of HFT by trading "in-play" stocks that have large volume from "real" players. Second, avoid non-volatile stocks trading below average volume. Third, greater anticipation based on sound technical analysis is needed; most of us will have to fight hard for better prices and avoid the temptation to buy highs or short lows, as algos are programmed to manipulate prices around these areas. Fourth, many of us will need to cut down our size and look for larger moves. Scalping very small moves is far less profitable when, hypothetically, a predatory algo scalps 3 cents from you on your buy and another 3 cents on your sell. Levels in stocks are also not as clear-cut, because algos are programmed to push stocks through a level to shake out weak holders. But if you can begin trading for dollar moves on smaller size, you are less likely to notice the 6 cents you paid as a toll and you will be able to give the stock a little extra room around levels.
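The toll arithmetic in that hypothetical is worth spelling out: a fixed six-cent round trip is a large fraction of a scalp but only noise on a dollar move. The numbers below are illustrative only.

```python
# Working out the hypothetical toll above: a fixed 6-cent round-trip cost
# matters far more on a 10-cent scalp than on a one-dollar move.
def toll_share(move_cents: float, toll_cents: float = 6.0) -> float:
    """Fraction of the gross move surrendered to the toll."""
    return toll_cents / move_cents

for move in (10, 25, 100):    # gross move captured, in cents
    print(f"{move:>3}c move: the toll eats {toll_share(move):.0%} of it")
# 10c -> 60%, 25c -> 24%, 100c -> 6%
```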

In order to successfully navigate the choppiness HFT has brought to equity markets, traders must spend more of their after-hours time researching and learning levels. Spend more time analyzing charts on multiple time frames. For traders who focus on very small time frames, now might be the time to take a step back, decrease size, and look for setups and levels on higher time frames: the higher the time frame, the more powerful the setup and level, and the harder it is for an algo to overtly cloud the area. Additionally, familiarity with how particular stocks trade around levels helps provide the confidence necessary to follow through on your ideas. Traders need to develop a universe of familiarity, a core group of "in-play" stocks and sectors, to follow each and every day. The more often we trade a particular vehicle, the more familiar we become with how algos work in that particular stock.

These are not fail-safe rules, but HFT is a reality and it is here to stay. Active traders must adjust and find a new edge beyond speed of execution. Where there's movement, there's opportunity, and the survivors in our business will become more sophisticated in order to continue trading profitably.

Written by t3live

February 10, 2010 at 7:32 pm